CN116743971B - Homologous screen projection method and electronic device


Info

Publication number: CN116743971B
Application number: CN202211340697.8A
Authority: CN (China)
Prior art keywords: screen, application, message, window, display area
Legal status: Active (granted)
Other versions: CN116743971A (application publication)
Other languages: Chinese (zh)
Inventor: 王冬伟 (Wang Dongwei)
Assignee (current and original): Honor Device Co Ltd
Application filed by Honor Device Co Ltd; priority to CN202211340697.8A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 9/452: Remote windowing, e.g. X-Window System, desktop virtualisation
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 9/00: Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L 9/08: Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L 9/0861: Generation of secret information including derivation or calculation of cryptographic keys or passwords
    • H04L 9/0863: Generation of secret information involving passwords or one-time passwords
    • H04L 9/088: Usage controlling of secret information, e.g. techniques for restricting cryptographic keys to pre-authorized uses, different access levels, validity of crypto-period, different key- or password length, or different strong and weak cryptographic algorithms


Abstract

An embodiment of the application provides a homologous screen projection method and an electronic device, and relates to the field of terminal technologies. The method solves the problem that, when a first device is in a locked-screen or screen-off state, it cannot project its screen to other devices normally. The scheme is as follows: the first device forwards a first message to a second device; the second device displays a first control used to display the first message from the first device; the second device sends a first request to the first device; the first device creates a first display area that is not visible on the first device, does not sleep, and on which no lock-screen layer is drawn; the first device starts a first application that includes a first process, and the first process does not sleep; after the first device draws the application interface of the first process in the first display area, the first device sends first data to the second device; in response to the first data, the second device displays a first window showing the display content drawn in the first display area.

Description

Homologous screen projection method and electronic device
Technical Field
The application relates to the field of terminal technologies, and in particular to a homologous screen projection method and an electronic device.
Background
Electronic devices working cooperatively bring convenience to users' daily life and work. Taking message streaming as an example, after device A receives a message notification, device A can stream the notification to device B. In this way, even if the user is not currently using device A, the notification can be viewed on device B. Furthermore, when the user instructs device B to display the message notification from device A, device B can display a homologous screen projection window corresponding to device A, and this window is used to display the application interface corresponding to the message notification. The above process of displaying the homologous screen projection window may be referred to as homologous screen projection.
However, homologous screen projection cannot run independently of device A: the content displayed in the homologous screen projection window must stay synchronized with device A's foreground. As a result, when device A performs homologous screen projection while it is in a locked-screen or screen-off state, the projected display becomes abnormal.
Disclosure of Invention
The embodiments of the application provide a homologous screen projection method and an electronic device, which solve the problem of abnormal homologous screen projection when the electronic device is in a screen-off or locked-screen state.
To achieve the above purpose, the embodiments of the application adopt the following technical solutions:
In a first aspect, an embodiment of the application provides a homologous screen projection method applied to a first device and a second device, where the first device is in a screen-off state or a locked-screen state. The method includes: the first device receives a first message; the first device forwards the first message to the second device, and the second device displays a first control used to display the first message from the first device; in response to the user's operation on the first control, the second device sends a first request to the first device; in response to the first request, the first device creates a first display area that is not visible on the first device, where the first display area does not sleep and no lock-screen layer is drawn on it; the first device starts a first application, where the first message is an application message of the first application, the first application includes a first process, and the first process does not sleep; after the first device draws the application interface of the first process in the first display area, the first device sends first data to the second device; in response to the first data, the second device displays a first window in which the display content drawn in the first display area is shown.
In this way, when the first device is in a locked-screen or screen-off state, the user can operate the second device directly, without performing any operation on the first device: operating the first control pulls up the first application on the first device and projects it onto the second device. The first application is an application installed on the first device; whether the second device has the first application installed does not affect the implementation of the scheme.
In the above embodiment, the first data displayed by the second device in the first window is derived from the content drawn in the first display area, and the first display area neither sleeps nor draws the lock-screen layer. The content projected into the first window is therefore unaffected by the first device being screen-off or locked.
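To make the mechanism concrete, the following sketch shows how such an invisible, non-sleeping display area could be realized on an Android-style system using DisplayManager. This is only an illustrative assumption: the patent does not name a concrete API, and the keep-awake and no-lock-screen-layer behavior is expressed only in comments because no public flag provides it directly.

```java
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;

/** Minimal sketch of creating a "first display area" that is not visible on the
 *  first device. Assumes an Android-like platform; the keep-awake / no-lock-layer
 *  behavior is described here only in comments because the patent does not
 *  disclose concrete flags for it. */
public final class ProjectionDisplayFactory {

    public static VirtualDisplay createProjectionDisplay(Context context, Surface sink) {
        DisplayManager dm = (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        // Rendered frames go to "sink" (for example an encoder surface feeding the
        // second device); nothing is shown on the local screen.
        int flags = DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY
                | DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION;
        // Hypothetical: the system service that owns this display would also be told
        // (via the "second tag" of the patent) not to let it sleep and not to
        // composite the lock-screen layer onto it.
        return dm.createVirtualDisplay("homologous-projection", 1080, 2340, 440, sink, flags);
    }
}
```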
In some embodiments, the first request carries an application identifier of the first application, and before the first device starts the first application, the method further includes: in response to the first request, the first device marks a first process of the first application with a first tag; when it is determined that the first process is marked with the first tag, the first device configures the first process not to sleep during its life cycle and allows the application interface of the first process to be drawn in the first display area during its life cycle.
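As an illustration of this embodiment, the sketch below tracks the "first tag" in a simple in-process registry and keeps the tagged process from sleeping with a partial wake lock. Both the registry and the use of a wake lock are assumptions: the patent only states the behavior (tagging, no sleep during the life cycle), not the mechanism.

```java
import android.content.Context;
import android.os.PowerManager;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

/** Illustrative only: one way the "first tag" could be tracked and the tagged
 *  process kept from sleeping while the screen stays off or locked. */
public final class ProjectedProcessKeeper {
    // Hypothetical bookkeeping for processes marked with the "first tag".
    private static final Set<String> FIRST_TAGGED = ConcurrentHashMap.newKeySet();

    private final PowerManager.WakeLock wakeLock;

    public ProjectedProcessKeeper(Context context, String appId) {
        FIRST_TAGGED.add(appId);   // the tag set in response to the first request
        PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
        // One plausible keep-alive: a partial wake lock keeps the CPU running so the
        // tagged process is not frozen while the display itself stays dark.
        wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "projection:" + appId);
        wakeLock.acquire();
    }

    public static boolean isTagged(String appId) {
        return FIRST_TAGGED.contains(appId);
    }

    public void release() {    // called when the projection window is closed
        wakeLock.release();
    }
}
```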
In some embodiments, before sending the first data to the second device, the method further includes: the first device generates the first data from the application interface drawn in the first display area, where the first data includes the application layer corresponding to the application interface.
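One way to picture how the "first data" could be produced from the first display area is to back the virtual display with an ImageReader, so each composited frame (the application layer) becomes a buffer that can be encoded and sent. The FrameSender callback below is hypothetical; the patent does not specify the capture or transport mechanism.

```java
import android.graphics.PixelFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import android.view.Surface;

/** Sketch of turning what is drawn in the first display area into "first data".
 *  An ImageReader supplies the Surface passed to the virtual display above, so
 *  every composited frame can be handed to a sender for encoding and transmission. */
public final class ProjectionFrameSource {

    public interface FrameSender { void send(Image frame); }   // hypothetical transport hook

    public static Surface start(int width, int height, FrameSender sender, Handler handler) {
        ImageReader reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2);
        reader.setOnImageAvailableListener(r -> {
            Image frame = r.acquireLatestImage();
            if (frame != null) {
                sender.send(frame);   // e.g. encode and transmit over the projection channel
                frame.close();
            }
        }, handler);
        return reader.getSurface();   // hand this surface to createVirtualDisplay(...)
    }
}
```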
In some embodiments, before the second device sends the first request to the first device in response to the user's operation on the first control, the method further includes: the second device displays a second window, where the second window is a distributed authentication window; while the second window is displayed, the second device receives password information entered by the user; the second device sends the password information to the first device; and when the password information matches an unlock key preset in the first device, the first device sends first response information to the second device, where the first response information indicates that the second device has passed distributed authentication.
In this embodiment, distributed authentication improves the information security of screen projection.
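The matching between the password entered on the second device and the unlock key kept on the first device is not spelled out in the patent; one plausible realization is a salted-hash comparison on the first device, sketched below, so the raw unlock key never leaves its secure storage.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Arrays;

/** Assumed realization of the password match: both sides are compared as salted
 *  SHA-256 hashes on the first device. The patent only requires that the password
 *  information "matches" the preset unlock key. */
public final class DistributedAuth {

    public static boolean matches(byte[] storedKeyHash, byte[] salt, String submittedPassword)
            throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        md.update(salt);
        md.update(submittedPassword.getBytes(StandardCharsets.UTF_8));
        byte[] submittedHash = md.digest();
        // MessageDigest.isEqual is constant-time, avoiding timing side channels.
        boolean ok = MessageDigest.isEqual(storedKeyHash, submittedHash);
        Arrays.fill(submittedHash, (byte) 0);
        return ok;   // true -> first device returns the "first response information"
    }
}
```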
In some embodiments, before the second device displays the second window, the method further comprises: the second device determines that an unlock key has been configured in the first device.
In some embodiments, before the second device displays the second window, the method further includes: the second device obtains a first time point, which is the time at which the first response information was most recently received; the second device determines that a first time interval is greater than a first threshold, where the first time interval is the interval between the first time point and the current system time.
In some embodiments, before sending the first request to the first device, the method further includes: the second device determines that no unlock key is configured in the first device, where the unlock key is the key used to release the locked-screen state of the first device.
In some embodiments, before sending the first request to the first device, the method further includes: the second device determines that an unlock key is configured in the first device; the second device obtains a first time point, which is the system time at which the first response information fed back by the first device was most recently received, where the first response information indicates that the password information sent by the second device matches the unlock key set in the first device; and the second device determines that a second time interval is not greater than the first threshold, where the second time interval is the interval between the first time point and the current system time.
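The interval check in the last two embodiments can be summarized in a few lines; the concrete threshold below is an assumed value, since the patent only requires comparing the elapsed time with a "first threshold".

```java
/** Illustrative check of whether a fresh distributed authentication is needed
 *  before sending the first request. */
public final class AuthFreshness {

    private static final long FIRST_THRESHOLD_MS = 5 * 60 * 1000L;  // assumed: 5 minutes

    /** @param lastAuthOkMillis the "first time point": when the first response
     *                          information was last received (epoch millis) */
    public static boolean needsReauthentication(long lastAuthOkMillis) {
        long interval = System.currentTimeMillis() - lastAuthOkMillis;
        // interval > threshold  -> show the second (authentication) window again;
        // otherwise the earlier authentication is still considered valid.
        return interval > FIRST_THRESHOLD_MS;
    }
}
```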
In some embodiments, after creating the first display area, the method further includes: the first device marks the first display area with a second tag; after determining that the first display area is marked with the second tag, the first device configures the first display area not to sleep and configures no lock-screen layer to be drawn on the first display area.
In some embodiments, the method further includes: in response to a user operation indicating to view a message list, the second device displays a third window, where the third window includes a second control that displays a second message from the first device, and the second message is an application message of a second application; when the second application on the first device has an application lock enabled, the second device, in response to the user's operation on the second control, displays first reminder information indicating that the second application is protected by the application lock and cannot be projected.
In some embodiments, the method further includes: in response to a user operation indicating to view the message list, the second device displays a third window, where the third window includes a third control on which a third message from a third device is displayed; when the communication channel used for screen projection is occupied, the second device, in response to the user's operation on the third control, displays second reminder information indicating that there is currently a network connection conflict.
In some embodiments, the method further includes: in response to a user operation indicating to view the message list, the second device displays a third window, where the third window includes a third control on which a third message from the third device is displayed; when the network quality between the second device and the third device does not meet a preset condition, the second device, in response to the user's operation on the third control, displays third reminder information indicating that the network quality affects screen projection.
In some embodiments, while the second device displays the first window, if the first device receives an operation by which the user instructs to unlock or turn off the screen, the first device continues to maintain the locked-screen or screen-off state.
In some embodiments, after the second device displays the first window, the method further includes: in response to an operation by which the user releases the locked-screen or screen-off state, the first device displays the application interface corresponding to the first process.
In a second aspect, an embodiment of the application provides a homologous screen projection method applied to a first device, where the first device is in a screen-off state or a locked-screen state and is communicatively connected to a second device. The method includes: the first device receives a first message; the first device forwards the first message to the second device and instructs the second device to display a first control used to display the first message; in response to a first request, the first device creates a first display area that is not visible on the first device, where the first display area does not sleep and no lock-screen layer is drawn on it, and the first request is a request sent by the second device to the first device in response to the user's operation on the first control; the first device starts a first application, where the first message is an application message of the first application, the first application includes a first process, and the first process does not sleep; after the first device draws the application interface of the first process in the first display area, the first device sends first data to the second device and instructs the second device to display a first window in which the display content drawn in the first display area is shown.
In some embodiments, the first request carries an application identifier of the first application, and before the first device starts the first application, the method further includes: in response to the first request, the first device marks a first process of the first application with a first tag; when it is determined that the first process is marked with the first tag, the first device configures the first process not to sleep during its life cycle and allows the application interface of the first process to be drawn in the first display area during its life cycle.
In some embodiments, before sending the first data to the second device, the method further includes: the first device generates the first data from the application interface drawn in the first display area, where the first data includes the application layer corresponding to the application interface.
In some embodiments, after creating the first display area, the method further includes: the first device marks the first display area with a second tag; after determining that the first display area is marked with the second tag, the first device configures the first display area not to sleep and configures no lock-screen layer to be drawn on the first display area.
In some embodiments, after the second device displays the first window, the method further includes: in response to an operation by which the user releases the locked-screen or screen-off state, the first device displays the application interface corresponding to the first process.
In a third aspect, an embodiment of the application provides a homologous screen projection method applied to a second device, where the second device is communicatively connected to a first device and the first device is in a screen-off state or a locked-screen state. The method includes: the second device receives a first message forwarded by the first device; the second device displays a first control used to display the first message from the first device; in response to the user's operation on the first control, the second device sends a first request to the first device, where the first request instructs the first device to create a first display area that is not visible on the first device, the first display area does not sleep, and no lock-screen layer is drawn on it; the first request further instructs the first device to start a first application, where the first message is an application message of the first application, the first application includes a first process, the first process does not sleep, and the application interface of the first process is drawn in the first display area; the second device receives first data sent by the first device; and in response to the first data, the second device displays a first window in which the display content drawn in the first display area is shown.
In some embodiments, before the second device sends the first request to the first device in response to the user's operation on the first control, the method further includes: the second device displays a second window, where the second window is a distributed authentication window; while the second window is displayed, the second device receives password information entered by the user; the second device sends the password information to the first device; and when the password information matches an unlock key preset in the first device, the first device sends first response information to the second device, where the first response information indicates that the second device has passed distributed authentication.
In some embodiments, before the second device displays the second window, the method further includes: the second device determines that an unlock key has been configured in the first device.
In some embodiments, before the second device displays the second window, the method further includes: the second device obtains a first time point, which is the time at which the first response information was most recently received; the second device determines that a first time interval is greater than a first threshold, where the first time interval is the interval between the first time point and the current system time.
In some embodiments, before sending the first request to the first device, the method further includes: the second device determines that no unlock key is configured in the first device, where the unlock key is the key used to release the locked-screen state of the first device.
In some embodiments, before sending the first request to the first device, the method further includes: the second device determines that an unlock key is configured in the first device; the second device obtains a first time point, which is the system time at which the first response information fed back by the first device was most recently received, where the first response information indicates that the password information sent by the second device matches the unlock key set in the first device; and the second device determines that a second time interval is not greater than the first threshold, where the second time interval is the interval between the first time point and the current system time.
In some embodiments, the method further includes: in response to a user operation indicating to view a message list, the second device displays a third window, where the third window includes a second control that displays a second message from the first device, and the second message is an application message of a second application; when the second application on the first device has an application lock enabled, the second device, in response to the user's operation on the second control, displays first reminder information indicating that the second application is protected by the application lock and cannot be projected.
In some embodiments, the method further includes: in response to a user operation indicating to view the message list, the second device displays a third window, where the third window includes a third control on which a third message from a third device is displayed; when the communication channel used for screen projection is occupied, the second device, in response to the user's operation on the third control, displays second reminder information indicating that there is currently a network connection conflict.
In some embodiments, the method further includes: in response to a user operation indicating to view the message list, the second device displays a third window, where the third window includes a third control on which a third message from the third device is displayed; when the network quality between the second device and the third device does not meet a preset condition, the second device, in response to the user's operation on the third control, displays third reminder information indicating that the network quality affects screen projection.
In some embodiments, while the second device displays the first window, if the first device receives an operation by which the user instructs to unlock or turn off the screen, the first device continues to maintain the locked-screen or screen-off state.
In a fourth aspect, an embodiment of the application provides an electronic device including one or more processors and a memory. The memory is coupled to the processors and stores computer program code comprising computer instructions; when the computer instructions are executed by the one or more processors, the electronic device performs the method of the second aspect, the third aspect, and their possible embodiments.
In a fifth aspect, an embodiment of the application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of the second aspect, the third aspect, and their possible embodiments.
In a sixth aspect, an embodiment of the application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of the second aspect, the third aspect, and their possible embodiments.
It can be understood that the electronic device, the computer storage medium, and the computer program product provided in the above aspects are all used to perform the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to those of the corresponding methods, and details are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a system provided by an embodiment of the application;
Fig. 2 is an exemplary diagram of a message streaming scenario provided by an embodiment of the application;
Fig. 3 is an exemplary diagram of a screen projection scenario provided by an embodiment of the application;
Fig. 4A is an exemplary diagram of message streaming with the mobile phone in the screen-off state, provided by an embodiment of the application;
Fig. 4B is an exemplary diagram of message streaming with the mobile phone in the locked-screen state, provided by an embodiment of the application;
Fig. 5A is an exemplary diagram of a screen projection scenario with the mobile phone in the screen-off state in the related art;
Fig. 5B is an exemplary diagram of a screen projection scenario with the mobile phone in the locked-screen state in the related art;
Fig. 6A is an exemplary diagram of a screen projection scenario with the mobile phone in the screen-off state, provided by an embodiment of the application;
Fig. 6B is an exemplary diagram of a screen projection scenario with the mobile phone in the locked-screen state, provided by an embodiment of the application;
Fig. 7 is the first schematic diagram of a notebook computer interface provided by an embodiment of the application;
Fig. 8 is the second schematic diagram of a notebook computer interface provided by an embodiment of the application;
Fig. 9 is an exemplary flowchart of determining the screen projection environment, provided by an embodiment of the application;
Fig. 10 is the third schematic diagram of a notebook computer interface provided by an embodiment of the application;
Fig. 11 is a schematic diagram of distributed authentication provided by an embodiment of the application;
Fig. 12 is an exemplary diagram of the software architecture of the mobile phone and the notebook computer provided by an embodiment of the application;
Fig. 13 is the first signaling interaction diagram of the homologous screen projection method provided by an embodiment of the application;
Fig. 14 is the second signaling interaction diagram of the homologous screen projection method provided by an embodiment of the application;
Fig. 15 is the third signaling interaction diagram of the homologous screen projection method provided by an embodiment of the application;
Fig. 16 is the fourth signaling interaction diagram of the homologous screen projection method provided by an embodiment of the application;
Fig. 17 is the fifth signaling interaction diagram of the homologous screen projection method provided by an embodiment of the application;
Fig. 18 is the sixth signaling interaction diagram of the homologous screen projection method provided by an embodiment of the application;
Fig. 19 is the seventh signaling interaction diagram of the homologous screen projection method provided by an embodiment of the application;
Fig. 20 is the eighth signaling interaction diagram of the homologous screen projection method provided by an embodiment of the application;
Fig. 21A is the first schematic diagram of compositing application layers provided by an embodiment of the application;
Fig. 21B is the second schematic diagram of compositing application layers provided by an embodiment of the application;
Fig. 22 is an exemplary diagram of a chip system provided by an embodiment of the application.
Detailed Description
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
The implementation of the present embodiment will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a system provided by an embodiment of the application, which includes a plurality of electronic devices, such as device 1, device 2, and device 3. The electronic devices in the system (i.e., device 1, device 2, and device 3) may belong to the same local area network.
Illustratively, the local area network may be a wireless communication network, such as a wireless local area network (WLAN), wireless fidelity point-to-point (Wi-Fi P2P), a Bluetooth network, a ZigBee network, an infrared (IR) network, or a near field communication (NFC) network. Also illustratively, the local area network may be a wired communication network, such as a network established through a video graphics array (VGA) interface, a digital visual interface (DVI), a high-definition multimedia interface (HDMI), or a data transmission line. Further illustratively, the network may also be a remote network.
After device 1, device 2, and device 3 have completed mutual trust authentication, the system composed of device 1, device 2, and device 3 may also be referred to as a trust ring. The process of mutual trust authentication between devices may refer to the related art; for example, it may be determined according to one or more of whether the devices are logged in with the same system account, whether authorization has been performed, and whether a near field communication mode is adopted, which is not described in detail here.
It can be understood that data can be securely shared among multiple electronic devices in the same trust ring. The shared data includes, but is not limited to, message alerts, message records, and other user information of the electronic devices, as well as the device type, the running status of each application, the identifier of each application, the identifiers of recently used applications, and the like.
Thus, within the trust ring, various collaboration services can be realized, such as collaborative calls, collaborative notification, and screen projection.
Taking collaborative notification as an example, in the trust ring shown in Fig. 1, when device 1 receives a message notification, the notification can be displayed by device 1 and can also be streamed to device 2 and/or device 3 and displayed there, which makes it convenient for the user to view messages across devices and improves the efficiency of collaboration between devices. Likewise, device 2 may stream received message notifications to device 1 and/or device 3, and device 3 may stream received message notifications to device 1 and/or device 2.
Of course, merely streaming messages between devices is not enough. If a user views a message notification from device 1 on device 2, the user often needs to process that notification through device 2, which requires device 1 to project its screen to device 2.
A screen projection scenario includes a transmitting end and a receiving end. The transmitting end sends display information to the receiving end for display, that is, the transmitting end projects its screen to the receiving end. For ease of description, the transmitting end is referred to herein as the source device and the receiving end as the target device. Any electronic device in the trust ring can act as a source device, any other device in the trust ring can act as its target device, and one source device corresponds to one target device at a time. It can be understood that, besides "source device" and "target device", the transmitting end and the receiving end may have other names: for example, the transmitting end may be called the active screen projection device and the receiving end the passive screen projection device, or the transmitting end may be called the first electronic device and the receiving end the second electronic device, which is not limited in the application. In an exemplary screen projection scenario, the source device and the target device use a protocol to display the content of the source device on the target device; common protocols include the DLNA (Digital Living Network Alliance) protocol, the Chromecast protocol, the Miracast protocol, the AirPlay protocol, and so on.
Currently, common screen projection techniques include homologous (same-source) screen projection and heterologous (different-source) screen projection.
In a homologous screen projection scenario, the source device projects its foreground display information (for example, application interface A of application A) to the target device. That is, the target device displays a screen projection window corresponding to the source device, and the display information (for example, application interface A) is shown in that window. When the foreground display information of the source device changes, for example switching from application interface A to application interface B of application A, the screen projection window on the target device changes accordingly, i.e., it also switches from application interface A to application interface B. For another example, if the source device switches from application interface A to the main interface, the screen projection window on the target device also switches from application interface A to the main interface of the source device. In other words, in homologous screen projection, while the source device projects to the target device, the display information on the target device is synchronized with the foreground display information of the source device. Homologous screen projection projects the screen in an extended-screen manner, projecting the application interface started by the source device to the target device.
In a heterologous screen projection scenario, when the source device projects application interface C of application B to the target device, the source device displays application interface C of application B and the screen projection window on the target device also displays application interface C of application B. Then, if the source device, in response to a user operation, switches to application interface D of application B, the screen projection window on the target device displays application interface D of application B. However, if the source device, in response to a user operation, switches to application interface E of application C or switches to the main interface, the screen projection window on the target device continues to display the interface of application B. In other words, the source device projects application B as a whole to the target device, i.e., the projection granularity of heterologous screen projection is the application.
It can be appreciated that homologous and heterologous screen projection each have advantages and disadvantages. For example, homologous screen projection ensures the continuity of the application, whereas heterologous screen projection requires restarting the application when switching between different screens. The embodiments of the application are described by taking a homologous screen projection scenario as an example.
As shown in Fig. 2, take the source device being a mobile phone and the target device being a notebook computer as an example. After the mobile phone and the notebook computer join the same local area network and complete mutual trust authentication, they belong to the same trust ring; that is, a communication channel for streaming messages can be established between them.
In some embodiments, multiple types of applications may be installed on the mobile phone, and while the mobile phone is running it may receive application messages pushed by each application, where an application message comes from the message server corresponding to that application.
Illustratively, a social application may be installed on the mobile phone, and the mobile phone may receive an application message pushed by the social application, for example a message sent by the contact Xiao Ming in the social application: "Are you going running today?".
After receiving an application message pushed by an application, if the mobile phone is unlocked and its screen is on, the mobile phone can display prompt information on the current interface. For example, when an application message pushed by the social application is received while the current interface of the mobile phone is the main interface 201, a notification bubble 202 is displayed on the main interface 201, and the notification bubble 202 is used to display the application message received by the mobile phone.
The mobile phone may then stream the application message (for example, "Are you going running today?" pushed by the social application) to the notebook computer through the communication channel used for streaming messages between the two devices.
In some embodiments, after the notebook computer receives the application message from the mobile phone, it may display prompt information for the application message. The prompt information may be displayed on the desktop of the notebook computer in the form of a message notification control, and the message notification control may be, for example, a floating window, a message bubble, or a message card, which is not specifically limited in the embodiments of the application. For example, as shown in Fig. 2, a message card 204 is displayed on the desktop 203, and the message card 204 is used to display the relevant content of the application message. The relevant content may include the fact that the application message comes from the mobile phone, that it comes from the social application, and the main content of the application message. The display position of the message card 204 is not specifically limited in the embodiments of the application.
In the scenario shown in Fig. 2, if the user is using the notebook computer, the message card 204 displayed on the desktop 203 can remind the user that the mobile phone has just received an application message. Therefore, even if the user is not currently using the mobile phone, or the mobile phone is not at hand, the user can learn the gist of the application message and avoid missing it.
In some embodiments, the message card 204 may have a display time limit. For example, the display time limit is preset to duration 1; the notebook computer starts displaying the message card 204 in response to the application message forwarded by the mobile phone, and when the message card 204 has been displayed for longer than duration 1 without the user being detected to select it, the notebook computer may cancel the display of the message card 204.
Illustratively, when the display screen of the notebook computer is a touch screen, if the notebook computer detects that the user touches display area 1 of the touch screen, the notebook computer determines that the user has indicated selection of the message card 204, where display area 1 is the area of the touch screen in which the message card 204 is displayed. As another example, when a mouse is connected to the notebook computer, if the notebook computer detects that the cursor corresponding to the mouse is located in display area 1 and a confirmation instruction is input, the notebook computer determines that the user has indicated selection of the message card 204.
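The display time limit and dismissal logic described above can be sketched as a simple timer on the receiving device; the Card interface and the value chosen for duration 1 are assumptions introduced only for illustration.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

/** Sketch of the message card's display time limit ("duration 1"): if the user has
 *  not selected the card before the limit expires, the card is dismissed. */
public final class MessageCardTimer {

    public interface Card { void show(); void dismiss(); boolean wasSelected(); }  // hypothetical UI hook

    private static final long DURATION_1_MS = 8_000;  // assumed value of duration 1

    public static void showWithTimeout(Card card, ScheduledExecutorService ui) {
        card.show();
        ui.schedule(() -> {
            if (!card.wasSelected()) {
                card.dismiss();   // timeout reached without the user selecting the card
            }
        }, DURATION_1_MS, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) {
        // Usage example with a trivial in-memory card.
        ScheduledExecutorService ui = Executors.newSingleThreadScheduledExecutor();
        showWithTimeout(new Card() {
            public void show() { System.out.println("card shown"); }
            public void dismiss() { System.out.println("card dismissed"); ui.shutdown(); }
            public boolean wasSelected() { return false; }
        }, ui);
    }
}
```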
In some embodiments, after the notebook computer displays the message card 204, the user may process the application message by operating the notebook computer, or may choose to process the application message by operating the mobile phone directly.
For the way of operating the mobile phone directly to process the application message, reference may be made to the related art. For example, in response to the user tapping the notification bubble 202, the mobile phone may run the social application in the foreground and display an application interface containing the social message above, so that the user can reply to the application message by operating on that interface. For another example, in response to a user operation instructing to start the social application, the mobile phone displays the application interface of the social application, so that the user can view or reply to the application message by operating on the interface provided by the social application.
In contrast, choosing the notebook computer to process the application message triggers screen projection between the mobile phone and the notebook computer. Illustratively, the user may select the message card 204 on the notebook computer. After the notebook computer detects that the user has selected the message card 204, it can remotely trigger the mobile phone to run the social application in the foreground. After the social application enters the foreground running state, the mobile phone can display the application interface containing the application message. In addition, the notebook computer may display, on the desktop 203, a screen projection window corresponding to the mobile phone, which synchronously displays the current interface of the mobile phone. For example, when the current interface of the mobile phone is an application interface of the social application, the screen projection window also displays that application interface. After the screen projection window is displayed on the notebook computer, the user can control the mobile phone through the screen projection window, for example, operate the mobile phone to process the application message.
As one implementation, after the notebook computer determines that the user has selected the message card 204, through data interaction between the notebook computer and the mobile phone, the mobile phone renders the corresponding application interface on the visible display area (i.e., the mobile phone's display). The content rendered on the display may be referred to as the foreground display information of the mobile phone. In addition, the mobile phone can mirror the foreground display information onto a virtual interface. It can be understood that the virtual interface is a service facing the notebook computer and is not visible on the mobile phone's display; when the notebook computer calls the virtual interface, the content of the virtual interface can be displayed in the screen projection window on the notebook computer, so that the screen projection window mirrors the mobile phone's display. Thereafter, the content displayed in the screen projection window changes synchronously with the content displayed on the mobile phone's display. In this way, the user can not only send control information to the mobile phone through the notebook computer, but also observe, through the notebook computer, how the mobile phone responds to that control information, achieving real-time remote control.
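For the bright-screen case described here, the mirroring of the phone's visible display into a "virtual interface" can be pictured with Android's MediaProjection API. This is an analogy chosen for illustration, not an implementation disclosed by the patent.

```java
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.projection.MediaProjection;
import android.view.Surface;

/** Analogy for the "virtual interface" in the bright-screen case: the visible
 *  display is mirrored into a surface that only the receiving device consumes. */
public final class ForegroundMirror {

    public static VirtualDisplay mirrorTo(MediaProjection projection, Surface remoteSink,
                                          int width, int height, int dpi) {
        return projection.createVirtualDisplay(
                "mirror-to-laptop",
                width, height, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,  // mirror the default display
                remoteSink,   // frames are consumed by the projection window on the laptop
                null,         // VirtualDisplay.Callback not needed for this sketch
                null);        // deliver callbacks on the caller's looper
    }
}
```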
In some embodiments, after the notebook computer detects that the user has indicated selection of the message card 204, the mobile phone may switch from displaying the main interface 201 to displaying the application interface 301, as shown in Fig. 3. The application interface 301 may be the application interface corresponding to the social application, and it contains the application message, that is, the application message indicated by the message card 204.
As shown in Fig. 3, after the notebook computer detects that the user has indicated selection of the message card 204, a screen projection window 302 may be displayed on the desktop 203 of the notebook computer. While the screen projection window 302 is displayed, its content is consistent with the content displayed by the mobile phone. For example, as shown in Fig. 3, when the application interface 301 is displayed on the mobile phone, the screen projection window 302 also displays the application interface 301.
Thereafter, the user can remotely control the mobile phone through the screen projection window 302 displayed by the notebook computer: for example, remotely control the mobile phone to input reply content and send it to the designated contact (Xiao Ming), remotely control the mobile phone to close the social application, or remotely control the mobile phone to jump to and start other applications.
In addition, because homologous screen projection is used between the notebook computer and the mobile phone, the interface displayed in the screen projection window 302 changes synchronously with the interface shown on the mobile phone's display.
Of course, a close control is provided on the screen projection window 302. The notebook computer may cancel the display of the screen projection window 302 in response to the user selecting the close control; after the notebook computer cancels the display of the screen projection window 302, the mobile phone also stops screen projection.
Fig. 2 and Fig. 3 above describe the process of homologous screen projection from the mobile phone to the notebook computer in the case where the mobile phone receives the message notification while its screen is on and unlocked, streams it to the notebook computer, and the user then operates the message card on the notebook computer to choose to display the message notification.
In the related art, as shown in Fig. 4A, when the mobile phone is in the screen-off state, its display is not lit. At this time, if the mobile phone receives an application message from the social application, it may briefly light up the screen and display a notification bubble, or it may not display the notification bubble and keep the screen dark. It can be understood that the dark-screen state here refers to a state in which the mobile phone's display is not lit or only shows a screen-saver image.
As shown in Fig. 4B, in the locked-screen state, the mobile phone's display may show the lock-screen interface or may also be dark. At this time, if the mobile phone receives an application message from the social application, it may display a notification bubble on the lock-screen interface.
In the scenarios shown in Fig. 4A and Fig. 4B, the mobile phone can also stream the application message to the notebook computer, so the notebook computer can still display the message card 204 on the desktop 203.
Of course, in the scenarios shown in Fig. 4A and Fig. 4B, while the notebook computer displays the message card 204, the mobile phone remains dark or keeps displaying the lock-screen interface even if the notebook computer determines that the user has indicated selection of the message card 204. It can be understood that, while it remains dark or displays the lock-screen interface, the mobile phone cannot run the social application in the foreground, that is, it cannot display the application interface of the social application.
Correspondingly, as shown in Fig. 5A, even if the notebook computer displays the screen projection window corresponding to the mobile phone (e.g., screen projection window 501), the screen projection window 501 is also dark.
Correspondingly, as shown in Fig. 5B, even if the notebook computer displays the screen projection window corresponding to the mobile phone (e.g., screen projection window 501), the screen projection window 501 also displays the lock-screen interface.
It can be seen that, in the scenarios shown in Fig. 5A and Fig. 5B, no matter whether the screen projection window 501 is dark or displays the lock-screen interface, the user cannot remotely control the mobile phone through the notebook computer to process the application message. That is, the user can only choose to unlock the mobile phone and operate it directly to finish processing the application message.
In other words, when the mobile phone is in the screen-off or locked-screen state, the following scenario arises: the user is playing games or working on the notebook computer in the study or living room at home, while the mobile phone is left in another room; when a contact (Xiao Ming) in the social application sends a message to the user's mobile phone, the user immediately learns on the notebook computer that the mobile phone has received a message notification. In this scenario, however, the user is often focused on playing or working on the notebook computer and does not want to get up and go to the other room to fetch the mobile phone to view or reply to the message; instead, the user wants to open, on the notebook computer, the application window that displays the message (i.e., the screen projection window corresponding to the mobile phone) and process the message through the notebook computer. However, because the mobile phone in the other room is locked or screen-off, in the related art the user cannot directly start, on the notebook computer, the projection of the application window that displays the message; the user must first go to the other room, unlock the mobile phone and light up its screen, and only then can the screen projection window corresponding to the mobile phone be started on the notebook computer.
Therefore, when the mobile phone is in a locked-screen or screen-off state, it is very inconvenient and time-consuming for the user to start screen projection on the notebook computer. Of course, in the related art, a similar problem exists when the mobile phone is in the screen-off and locked-screen states at the same time, which is not described again here.
To address the above problems, an embodiment of the application provides a homologous screen projection method that can be applied to electronic devices supporting cross-device screen projection. The electronic device in the embodiments of the application may be a mobile phone, a tablet computer, a desktop computer, a handheld computer, a notebook (laptop) computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), an augmented reality (AR) or virtual reality (VR) device, or the like, on which the above-mentioned applications can be installed; the embodiments of the application do not particularly limit the specific form of the electronic device. In addition, the operating system of the electronic device may be Android, iOS, or another operating system, and the embodiments of the application do not limit the type of operating system.
With the screen projection starting method provided by the application, even in the scenario above, the user does not need to go to the other room to operate the mobile phone. The user can stay in the study or living room, and simply by clicking the message card corresponding to the mobile phone on the notebook computer, the screen projection window corresponding to the mobile phone can be started and displayed on the notebook computer, achieving the mirrored display of homologous screen projection. This makes it convenient for the user to process the message and improves the user experience.
In the following, referring to fig. 6A and fig. 6B, details of implementation of the method for implementing the homologous screen projection method provided by the present application are described by taking a mobile phone as a source device and a notebook computer as a target device.
As shown in fig. 6A, in the screen-off state, if the mobile phone receives an application message from the social application, the mobile phone may briefly lighten the screen and display a notification bubble, or may not display any content, and keep on the screen-off. In addition, the mobile phone can transfer the application message to the notebook computer. In this way, the notebook computer can still display the message card 204 in the desktop 203.
During the display of the message card 204, the notebook computer determines that the user indicates that the message card 204 is selected, as shown in fig. 6A, and the handset continues to remain in the black screen state. The notebook computer may display a drop screen window 601, which drop screen window 601 may display information prompting to wait, such as the word "wait for load". In addition, after the notebook determines that the user indicates to select the message card 204, the notebook may also instruct the cell phone to launch the social application. Of course, after the mobile phone starts the social application, the mobile phone may not display the application interface containing the application message, that is, the mobile phone may keep on keeping the black screen. But the mobile phone will project an application interface containing the application messages described above to the projection window 601 in the notebook computer. In this way, the content displayed in the screen projection window 601 changes from "information waiting for prompt" to an application interface corresponding to the application message. Thus, the user can remotely control the mobile phone to process the application message through the screen-throwing window 601 in the notebook computer.
As shown in fig. 6B, in the screen-locked state, if the mobile phone receives an application message from the social application, the mobile phone may display the notification bubble briefly, or may continue to display only the screen-locked interface. In addition, the mobile phone can transfer the application message to the notebook computer. In this way, the notebook computer can still display the message card 204 in the desktop 203.
During the display of the message card 204, if the notebook computer determines that the user has selected the message card 204, the mobile phone continues to display the lock screen interface, as shown in fig. 6B. The notebook computer may display a screen projection window 601, and the screen projection window 601 may display information prompting the user to wait, such as the words "waiting to load". After the notebook computer determines that the user has selected the message card 204, the notebook computer may also instruct the mobile phone to launch the social application. Of course, after the mobile phone starts the social application, the mobile phone may continue to display the lock screen interface. However, the mobile phone projects the application interface containing the application message to the screen projection window 601 on the notebook computer. In this way, the content displayed in the screen projection window 601 changes from the waiting prompt information to the application interface corresponding to the application message. Thus, the user can remotely control the mobile phone through the screen projection window 601 on the notebook computer to process the application message.
In addition, after the screen projection window 601 has displayed screen projection content from the mobile phone (e.g., an application interface containing the application message), if the user lights up and unlocks the mobile phone, the display of the mobile phone may display the same content as the screen projection window 601.
In other embodiments, the notebook computer may receive one or more application messages transferred by any device in the same trust ring. For example, the notebook computer may display an information list window in response to a user operation, and the information list window may collectively display the received message notifications, or collectively display the received and unprocessed message notifications, such as a first message from a first application in the first device, a second message from a second application in the first device, and a third message from a third device. As an implementation manner, the message notifications may also be displayed in the form of message cards in the information list window, for example, a second control is used to display the second message, and a third control is used to display the third message.
Taking a notebook computer with a touch screen as an example, when the notebook computer detects a specified operation, for example, a slide-down operation of the user on the touch screen as shown in fig. 7, an information list window 701 (third window) may be displayed. The information list window 701 displays an application message (e.g., message card 804) from the device 1, an application message (e.g., message card 805) from the device 3, and the like. It will be appreciated that the message notifications displayed in the information list window 701 are all information recently received by the notebook computer from other devices in the trust ring. The message card 804 corresponds to a message pushed by a social application in the device 1, and the message card 805 corresponds to a message (e.g., a short message) pushed by the information application in the device 3.
Similar to the message card 204 displayed on the desktop, when the notebook computer detects that the user selects any message card in the information list window 701, the device corresponding to the message card can be triggered to perform screen projection. For example, when the notebook computer detects that the user selects the message card 805, the device 3 may be triggered to perform screen projection, so that the device 3 is a source device and the notebook computer is a corresponding target device. For another example, when the notebook computer detects that the user selects the message card 804, the device 1 (i.e. the mobile phone) may be triggered to perform screen projection, so that the mobile phone is a source device and the notebook computer is a corresponding target device.
Of course, after the notebook computer instructs the corresponding device to perform screen projection, the screen projection may also fail. For example, if the application program corresponding to the message card is protected by an application lock, the screen projection fails. For another example, if the channel used for screen projection is occupied by other devices, the screen projection fails. For another example, if the network is unstable, the screen projection may also fail.
Firstly, an application lock means that, in order to protect the security or privacy of an application, the user of the source device sets protection for the application, and a preset password, fingerprint or the like needs to be input to open the application. Therefore, in a scenario where the source device is in a locked screen state, in some possible embodiments, before the source device actually performs screen projection, it may be determined whether an application program A (for example, the application program corresponding to the message card selected by the user on the notebook computer) has been given this layer of protection by the source device. If the source device has set an application lock for the application program A, which means that the owner of the source device wants a password or fingerprint to be input for each access, such an application with strong privacy protection is not suitable for being directly started and projected from the target device side while the source device is in a locked or black screen state, as this would run contrary to the user's original purpose in setting the application lock.
Thus, as shown in fig. 8, after detecting that the user selects the message card 805, the notebook computer transmits query information 1 to the corresponding device (i.e., device 3), the query information 1 being used to query whether the device 3 is configured with an application lock for the information application. If an application lock is configured in the device 3 for the information application, the device 3 sends feedback information 1 to the notebook computer, the feedback information 1 indicating that the information application is currently protected by an application lock. In this way, after receiving the feedback information 1, the notebook computer may display reminding information 801, also called first reminding information. The reminding information 801 is used to remind the user that the application program corresponding to the selected message card is protected by an application lock, for example with the words "the information application is protected by an application lock, please go to device 3 to unlock the application lock". In addition, the reminding information 801 may be displayed on the notebook computer in the form of a floating window or a card, and the display form of the reminding information is not limited in the embodiment of the application.
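Purely as an illustrative Java sketch of the query/feedback exchange described above, the check could be modelled on the target device side as follows; the type names (AppLockQuery, AppLockFeedback, SignallingChannel) and the transport are assumptions of this sketch and are not defined by the present embodiment.

    // Sketch of the application-lock query between the target device and the source device.
    // All names here are illustrative assumptions; the embodiment does not define a wire format.
    public final class AppLockChecker {

        /** Query information 1: asks whether the source device protects the application with an application lock. */
        public record AppLockQuery(String appPackageName) { }

        /** Feedback information 1: the source device's answer. */
        public record AppLockFeedback(String appPackageName, boolean protectedByAppLock) { }

        /** Abstraction over the signalling channel between the two devices (assumed). */
        public interface SignallingChannel {
            AppLockFeedback request(AppLockQuery query);
        }

        private final SignallingChannel channel;

        public AppLockChecker(SignallingChannel channel) {
            this.channel = channel;
        }

        /** Returns true when the first reminding information should be shown instead of starting the projection. */
        public boolean isProtectedByAppLock(String appPackageName) {
            return channel.request(new AppLockQuery(appPackageName)).protectedByAppLock();
        }
    }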
Secondly, the source device and the target device may establish a screen projection connection through a WLAN or WI-FI P2P mode; for the method of establishing the screen projection connection, refer to the method of establishing the communication connection between the source device and the target device, which is not described herein again. For convenience of description, the following embodiments establish the screen projection connection in the WI-FI P2P manner.
In some embodiments of the present application, as shown in fig. 8, after the notebook computer detects that the user selects the message card 805, it is determined whether the screen projection communication channel (Wi-Fi P2P channel) between the source device (i.e., device 3) corresponding to the message card 805 and the target device (i.e., the notebook computer) is already occupied by other services. The source device and the target device establish a screen projection connection through the screen projection channel to perform data transmission, so that the mirroring of content between two or more electronic devices is realized. When the source device and the target device have established other services based on WI-FI P2P, such as PC collaboration or device sharing, the screen projection channel between the source device and the target device is occupied, so that the source device cannot project a screen to the target device based on WI-FI P2P.
For example, as shown in fig. 8, if the notebook computer determines that the screen projection channel (Wi-Fi P2P) between the notebook computer and the device 3 is occupied, starting the screen projection fails, and reminding information 802, namely second reminding information, is displayed. The reminding information 802 exemplarily prompts the user with "screen projection connection conflict, please disconnect the connection with other devices and retry", so as to prompt the user to go to the source device (device 3) or the target device (notebook computer) to disconnect the screen projection connection with the other devices. The reminding information 802 may be displayed on the notebook computer in the form of a floating window or a card, and the display form of the reminding information is not limited in the embodiment of the application.
Again, in some embodiments of the present application, as shown in fig. 8, after the notebook computer detects that the user selects the message card 805, it is determined whether the network state of the source device (i.e., device 3) corresponding to the message card meets the screen projection requirement. The network state includes, but is not limited to, the flow entry matching state, the packet execution port load state, the total number of lost/forwarded/received packets, or a TTR failure of a packet, i.e., any network state that may occur in the network. It will be appreciated that the network state may affect the quality of the screen projection service, and the notebook computer may optionally be configured with a minimum network state threshold that satisfies the screen projection condition. If the notebook computer determines that the current network state is below the minimum network state threshold, the current network state is considered unable to meet the screen projection requirement, so that the screen projection fails.
As shown in fig. 8, when the notebook computer determines that the current network state does not meet the screen projection requirement, starting the screen projection fails, and reminding information 803, also called third reminding information, is displayed. The reminding information 803 can remind the user to try to trigger the source device to project the screen again after the network becomes stable, for example by prompting the user with "network unstable, please retry", so as to prompt the user to adjust and optimize the network state.
It will be appreciated that the above three scenarios are merely examples and are not limiting of the embodiments of the present application. Whether the notebook computer judges that the corresponding application program is protected by an application lock, whether the screen projection channel is occupied, and whether the network state meets the screen projection requirement are not mandatory operations for the embodiments of the application, and the judging order is not limited either. In some embodiments, referring to fig. 9, fig. 9 is a flowchart of one possible judgment mechanism. In response to the operation of the user clicking the message card, the notebook computer sequentially judges at each judgment node whether an application lock is set, whether the screen projection channel is occupied, and whether the network state meets the screen projection requirement; if one judgment result triggers a scenario in which starting the screen projection fails, the current round of judgment is terminated. After the user makes adjustments according to the prompt, the user can click the message card again to trigger a new round of judgment. Of course, in some embodiments, the judging order of the judgment mechanism may be adjusted; after a certain judgment node triggers a failure to start the screen projection, the judgment process stays at that judgment node and continuously detects whether the condition for entering the next judgment node is met. When the user adjusts according to the prompt and the condition for entering the next judgment node is met, the judgment process automatically enters the next judgment node.
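The sequential judgment mechanism of fig. 9 can be summarised, purely as an illustrative Java sketch under the assumption that the three checks are available as boolean probes, as follows; the probe names, interface names and reminder texts are placeholders rather than identifiers of the embodiment.

    // Sketch of the sequential pre-projection checks: application lock detection,
    // channel detection, then network detection; the first failing check ends the round.
    public final class ProjectionPreflight {

        public interface Probes {
            boolean appProtectedByAppLock(String packageName); // application lock detection
            boolean projectionChannelOccupied();               // channel detection (Wi-Fi P2P)
            boolean networkMeetsRequirement();                 // network detection
        }

        public interface Ui {
            void showReminder(String text);
        }

        private final Probes probes;
        private final Ui ui;

        public ProjectionPreflight(Probes probes, Ui ui) {
            this.probes = probes;
            this.ui = ui;
        }

        /** Returns true only when the source device meets the screen projection condition. */
        public boolean mayStartProjection(String packageName) {
            if (probes.appProtectedByAppLock(packageName)) {
                ui.showReminder("Application protected by an application lock, please unlock it on the source device");
                return false; // first reminding information
            }
            if (probes.projectionChannelOccupied()) {
                ui.showReminder("Screen projection connection conflict, please disconnect other devices and retry");
                return false; // second reminding information
            }
            if (!probes.networkMeetsRequirement()) {
                ui.showReminder("Network unstable, please retry");
                return false; // third reminding information
            }
            return true;
        }
    }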
It will be appreciated that, after the notebook computer detects that the user selects the message card 805, the notebook computer may first check whether the corresponding application program is protected by an application lock (which may be referred to as application lock detection), then check whether the communication channel used for screen projection is occupied (which may be referred to as channel detection), and finally check whether the network condition meets the requirement (referred to as network detection for short). In fact, the embodiment of the present application does not limit the order of the above detections; that is, in some embodiments, the notebook computer may also first check whether the communication channel used for screen projection is occupied, then check whether the corresponding application program is protected by an application lock, and finally detect the network condition.
In other possible embodiments, the notebook computer may also perform the application lock detection, the channel detection and the network detection in parallel, and display the corresponding reminding information when the obtained detection results indicate that at least one detection item would cause starting the screen projection to fail. For example, if the obtained detection results indicate that the application program is protected by an application lock and the communication channel is occupied, the notebook computer can display the reminding information 801 and the reminding information 802 at the same time.
In addition, when the desktop 203 displays the message card 204, if the notebook computer determines that the user selects the message card 204, the notebook computer also needs to perform the application lock detection, the channel detection and the network detection for the mobile phone. If the detection results indicate that none of the detection items would cause the screen projection to fail, the notebook computer determines that the mobile phone meets the screen projection condition, so that the notebook computer can display the screen projection window 601, and the screen projection window 601 corresponds to the mobile phone. If starting the screen projection fails, the notebook computer can display the corresponding reminding information.
In other embodiments, the source device may enable a distributed authentication function to ensure the privacy of the locked or off-screen source device. That is, in the case where the source device (for example, the mobile phone) enables the distributed authentication function and is configured with an unlocking password, if the target device (the notebook computer) determines through the above detections (for example, application lock detection, communication channel detection and network detection) that the source device meets the screen projection condition, the target device may display a screen projection window, and the screen projection window at this time is used for displaying an unlocking verification interface. The unlocking password may be the authentication password for releasing the screen-locked state of the mobile phone.
For example, as shown in fig. 10, after the notebook computer detects that the user selects the message card 204, if it is determined that the mobile phone meets the screen projection condition, the notebook computer may display the screen projection window 601. The screen projection window 601 may first display the unlocking verification interface. Then, after the notebook computer receives the mobile phone unlocking password input by the user and the unlocking password passes verification, the screen projection window 601 is controlled to switch to displaying the information prompting the user to wait. While the waiting prompt information is displayed, the mobile phone responds to the indication of the notebook computer, starts the social application, draws the application interface of the social application in a virtual interface, and obtains the drawing parameters of each display object in the virtual interface. The virtual interface serves the notebook computer and is invisible on the display screen of the mobile phone. The above drawing parameters are the relevant parameters for rendering the interface, and may also be referred to as display information in some examples.
After the mobile phone finishes drawing the application interface, the corresponding display information is sent to the notebook computer, and the notebook computer is instructed to render the application interface in the screen projection window 601, so that the screen projection window 601 switches to displaying the application interface of the social application (for example, the application interface containing the application message corresponding to the message card 204); that is, the mobile phone projects the virtual interface into the screen projection window 601 of the notebook computer. In other words, after the notebook computer receives the unlocking password of the mobile phone input by the user and the unlocking password passes verification, the mobile phone starts to perform the homologous screen projection to the notebook computer.
Referring to fig. 11, fig. 11 is a flowchart illustrating a distributed authentication process. As an implementation manner, in a scenario where the source device projects a screen to the target device for the first time, if the source device has not set an unlocking password, the target device does not need to display the unlocking verification interface in the screen projection window, but instructs the source device to directly perform the homologous screen projection. In this scenario, the target device may directly display the information prompting the user to wait. After the source device has drawn the application interface to be projected, the target device controls the screen projection window to display the application interface to be projected.
In the scenario where the source device projects a screen to the target device for the first time, if the source device has set an unlocking password, that is, password authentication is also required before the user directly uses the source device, then after the target device determines that the source device meets the screen projection condition, distributed authentication may be triggered. For a description of distributed authentication, reference is made to the above, and no further description is given here.
In addition, it should be noted that, when the source device performs the homologous screen projection to the target device for the first time and an unlocking password has been set, the distributed authentication mechanism is triggered when the target device instructs the source device to perform the homologous screen projection, regardless of whether the source device is in the screen-locked state or the unlocked state.
It should be noted that the source device starting the screen projection on the target device "for the first time" may refer to the source device starting the homologous screen projection on the same target device for the first time ever, or may refer to the first start of the homologous screen projection in the period from the current power-on of the target device to its shutdown. For the second explanation, specifically assume that the target device is powered on for the first time and, after a period of use, is powered off for the first time; the period from the first power-on to the first power-off is the first use period. The target device is then powered on for the second time and, after a period of use, is powered off for the second time; the period from the second power-on to the second power-off is the second use period. During the first use period, the first time the source device starts the homologous screen projection on the target device belongs to the "first" start in the second explanation; during the second use period, the first time the source device starts the homologous screen projection on the target device also belongs to the "first" start in the second explanation.
As an implementation manner, a scenario in which the source device does not project a screen to the target device for the first time means a scenario in which the source device has already started the homologous screen projection on the same target device, or has already started the homologous screen projection on the target device within the use period from the current power-on of the target device to its shutdown.
Under the second explanation, specifically, during the first use period, after the source device has started the homologous screen projection on the target device for the first time, when the source device starts the homologous screen projection on the target device for the second time, the target device determines that the source device has already started the homologous screen projection on the target device; the same applies to the third, fourth and subsequent times. During the second use period, when the source device starts the homologous screen projection on the target device for the first time, the target device determines that the source device is starting the homologous screen projection on the target device for the first time; when the source device starts the homologous screen projection on the target device for the second time, the target device determines that the source device has already started the homologous screen projection on the target device; the same applies to the third, fourth and subsequent times. If the source device has already started the homologous screen projection on the target device, in order to reduce the number of times the user inputs the password while still ensuring the privacy security of the user, the target device may optionally first judge whether the time since the last pass of the distributed authentication exceeds a preset duration threshold.
For example, when the time since the last pass of the distributed authentication does not exceed the preset duration threshold, the security authentication may not be performed again. It should be noted that the preset duration threshold may be a default value or a value set by the user, and indicates that within this time range the screen projection connection between the source device and the target device still meets the requirement of protecting the privacy security of the user. Therefore, when the target device determines that the time since the last pass of the distributed authentication does not exceed the preset duration threshold, the screen projection is started directly and the unlocking verification interface does not need to be displayed, so that the cumbersome operation of repeatedly requiring the user to input the password can be avoided, the privacy security of the user is ensured, and the convenience of using the homologous screen projection is improved.
Also illustratively, when the time since the last pass of the security authentication exceeds the preset duration threshold, the distributed authentication needs to be performed again. It can be understood that, when the target device determines that the time since the last pass of the distributed authentication exceeds the preset duration threshold, the screen projection connection between the source device and the target device is considered to no longer meet the requirement of protecting the privacy security of the user. The distributed authentication flow in this case is the same as the distributed authentication flow for "the source device starts the homologous screen projection on the target device for the first time" described above, and is not described herein again.
For example, assume that the source device is in a locked, screen-off state, has set an unlocking password, and has not previously started the homologous screen projection on the target device. When the target device determines that the user has instructed the source device to perform screen projection and determines that the source device meets the screen projection condition, the distributed authentication mechanism can be triggered, and after the distributed authentication passes, the first homologous screen projection is successfully started. The first homologous screen projection ends at time T1. Assuming that the preset duration threshold set by the user is 1 hour, at time T2 it is again detected that the user instructs the same source device to perform the homologous screen projection. The target device determines that the source device has already started the homologous screen projection on the target device, and further judges whether the time between T2 and T1 is longer than 1 hour. The target device determines that the value of (T2-T1) is less than 1 hour, and directly starts the second homologous screen projection.
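A minimal Java sketch of this time-threshold decision, assuming the timestamps are plain millisecond values and the field names are illustrative, might look as follows.

    // Sketch of the re-authentication policy: authenticate again only when the time
    // since the last successful distributed authentication exceeds the preset threshold.
    public final class DistributedAuthPolicy {

        private final long thresholdMillis;       // preset duration threshold, e.g. 1 hour
        private long lastAuthPassedAtMillis = -1; // time of the last successful distributed authentication

        public DistributedAuthPolicy(long thresholdMillis) {
            this.thresholdMillis = thresholdMillis;
        }

        public void onDistributedAuthPassed(long nowMillis) {
            lastAuthPassedAtMillis = nowMillis;
        }

        /** True when the unlock verification interface must be shown again. */
        public boolean needsReauthentication(long nowMillis) {
            if (lastAuthPassedAtMillis < 0) {
                return true; // first homologous projection on this target device
            }
            return (nowMillis - lastAuthPassedAtMillis) > thresholdMillis;
        }
    }

In terms of the example above, after onDistributedAuthPassed(T1) a call to needsReauthentication(T2) returns false when (T2-T1) is less than the one-hour threshold, so the second homologous screen projection is started directly.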
The following describes a software structure of an electronic device according to an embodiment of the present application with reference to fig. 12.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. It should be noted that the left half of fig. 12 is a software architecture diagram of the source device, which may be, for example, a mobile phone. The right half of fig. 12 is a software architecture diagram of the target device, which may be, for example, a notebook computer.
In some embodiments, as shown in the left half of fig. 12, the software framework of the source device may include an application layer, a capability service layer, an application framework layer (FWK), an Android runtime and system libraries, and a driver layer.
(1) Application layer
The application layer may include a series of application packages, such as social, information, conversation, camera and other application programs (which may also be referred to as applications). For convenience of description, an application program will hereinafter be referred to simply as an application. The application on the source device may be a native application (for example, an application installed in the source device when the operating system was installed before the source device left the factory), or may be a third-party application (for example, an application installed by the user through an application store), which is not limited in the embodiments of the present application.
(2) Capability service layer
The capability service layer provides capability support for implementing corresponding services, and as shown in fig. 12, may include a screen projection assistant, a virtualization service module, a screen projection service module, a device discovery authentication connection module, and the like.
The screen projection assistant may be a module for exchanging screen projection related information with other electronic devices (such as the target device), so as to implement end-to-end logical control and connection management of the screen projection, and is used for generating a session key to ensure the security of the communication session between the source device and the target device. The screen projection assistant can comprise a screen projection management module, a virtualization service initialization module and other functional modules. The screen projection management module is responsible for managing screen projection related transactions, such as the processing logic for adding and removing a VirtualDisplay. The VirtualDisplay may also be referred to as a virtual display area, a virtual screen, a virtual interface or a virtual display. The virtualization service initialization module performs the initialization settings for starting the homologous screen projection virtualization service.
The virtualization service module is used for realizing end-to-end logical control of the audio and video streams, and is responsible for the audio and video data streams generated by the speaker, microphone, camera and the like that are transmitted between the source device and the target device.
The screen projection service module provides the screen projection capability (namely, projecting the content of the source device to the target device for display) and the basic capability of reverse event control (namely, events triggered on the target device that control the source device). The screen projection service module receives instructions from the screen projection management module and provides the corresponding screen projection service according to the instructions, so that a program running locally is projected to another electronic device (such as the target device). The screen projection service module injects the reverse control and input method into the display of the target device (hereinafter referred to as the external screen), and configures the virtual display flag to Flag1 (e.g., VIRTUAL_DISPLAY_FLAG_PRESENTATION) when the VirtualDisplay is created, so that the external screen mirrors the content of the display of the source device (hereinafter referred to as the main screen), thereby implementing the homologous screen projection.
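For illustration, a presentation-type VirtualDisplay can be created with the public Android DisplayManager API as in the following Java sketch; the display name, geometry and the Surface that receives the projected content are placeholders, and on a real platform such a call is typically issued from a privileged system service because some flag combinations require system permissions.

    import android.content.Context;
    import android.hardware.display.DisplayManager;
    import android.hardware.display.VirtualDisplay;
    import android.view.Surface;

    public final class MirrorDisplayFactory {

        public static VirtualDisplay createProjectionDisplay(Context context, Surface sinkSurface,
                                                             int widthPx, int heightPx, int densityDpi) {
            DisplayManager dm = (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
            // VIRTUAL_DISPLAY_FLAG_PRESENTATION is the flag referred to as Flag1 above.
            return dm.createVirtualDisplay(
                    "homologous-projection",           // display name (placeholder)
                    widthPx, heightPx, densityDpi,     // geometry of the projection window
                    sinkSurface,                       // Surface assumed to feed the encoder/transport
                    DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
        }
    }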
The device discovery authentication connection module is responsible for discovery, authentication, connection and other work, and is used for controlling the driver layer to realize functions such as proximity discovery, authentication and connection between the source device and the target device. It can comprise a communication module and a security authentication module. The communication module is used for providing a signaling transmission channel between the source device and the target device, and for sending to the driver layer the instructions for establishing the communication connection and the screen projection connection between the source device and the target device. Meanwhile, the communication module is also used for starting the screen projection assistant after the transmission channel is established. The security authentication module is used for distributed authentication, and is responsible for owner identification and starting distributed unlocking authentication, so that a trust ring is obtained and the privacy security of the user is ensured.
(3) Application framework layer (FWK)
The application framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 12, the application framework layer may include a SystemUI module, an activity management service (ActivityManagerService, AMS), a window management service (WindowManagerService, WMS), an input event management service (InputManagerService, IMS), a multi-screen framework module, and the like.
The SystemUI module is a set of UI components for providing system-level information display and interaction for the user, and is mainly used for realizing status bar information display (such as battery, Wi-Fi signal, 3G/4G and other icons), the notification panel (such as system messages and third-party application messages), the recent task display panel (such as displaying recently used applications), and the like. In the embodiment of the application, the SystemUI module is responsible for the notification service, and is used for transferring the message notifications of the source device to the target device.
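The embodiment places the notification relay inside the SystemUI module; as a rough public-API analogue (and not the mechanism actually used by the embodiment), a NotificationListenerService could observe posted notifications and hand the message content together with the application package name to a relay component, as in the following Java sketch. The Relay interface and the forwarding format are assumptions.

    import android.app.Notification;
    import android.os.Bundle;
    import android.service.notification.NotificationListenerService;
    import android.service.notification.StatusBarNotification;

    public class MessageRelayListener extends NotificationListenerService {

        /** Abstraction over the communication connection to the target device (assumed). */
        public interface Relay {
            void forward(String packageName, CharSequence title, CharSequence text);
        }

        private Relay relay; // injected by the hosting component in a real implementation

        @Override
        public void onNotificationPosted(StatusBarNotification sbn) {
            Notification n = sbn.getNotification();
            Bundle extras = n.extras;
            CharSequence title = extras.getCharSequence(Notification.EXTRA_TITLE);
            CharSequence text = extras.getCharSequence(Notification.EXTRA_TEXT);
            if (relay != null) {
                // Forward the message content together with the application package name,
                // so the target device can render a message card for it.
                relay.forward(sbn.getPackageName(), title, text);
            }
        }
    }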
The AMS is responsible for managing Activities, and for starting, switching, scheduling, managing and dispatching the application components in the system. For each Activity, there is an application record (ActivityRecord) in the AMS corresponding to it, and this ActivityRecord records the state of the application's Activity. Each Activity creates a corresponding ActivityRecord object in the AMS, and the AMS can schedule the application's Activity process using this ActivityRecord as an identifier. These ActivityRecord objects are managed in their respective task stacks, and each task stack may contain multiple ActivityRecord objects; all task stacks are managed uniformly by the ActivityStack, which is responsible for the ordering, pushing and popping of TaskStacks. Specifically, the AMS defines data classes for saving processes, activities and tasks respectively. The data class corresponding to a process may include process file information, memory state information of the process, the Activities and Services contained in the process, and the like. The Activity information may be stored in the ActivityStack, which is used to uniformly schedule application Activities. The ActivityStack can specifically store information on all running Activities (i.e., final ArrayList mHistory), such as interface configuration information; for example, the running Activities may be saved in a new ArrayList. The ActivityStack may also store information on historically run Activities, such as interface configuration information.

Note that an Activity does not correspond to an application; an ActivityThread corresponds to an application. Android therefore allows multiple applications to run simultaneously, which in fact means allowing multiple ActivityThreads to run simultaneously. In Android, the basic idea of Activity scheduling is as follows: each application process reports to the AMS when it wants to start a new Activity or stop the current Activity. The AMS internally records all application processes, and when the AMS receives a start or stop report, it first updates its internal records and then notifies the corresponding client process to run or stop the specified Activity. Because the AMS has records of all Activities, it can schedule these Activities and automatically close background Activities according to the state of the Activities and of the system memory.
The WMS carries data and attributes related to "interfaces" and is used for managing states related to "interfaces", i.e. for managing the graphical user interface (GUI) resources used on the mobile phone screen, for example managing window programs and event dispatch. Managing window programs means outputting to the physical screen or another display device in an orderly manner, with the assistance of the application server and the WMS, according to the display requests of application programs. Event dispatch means dispatching user events from the keyboard, physical buttons, touch screen, mouse, trackball and the like to the corresponding control or window. The window management program can also obtain the size of the display screen, judge whether there is a status bar, lock the screen, capture the screen and the like, and is responsible for managing the display mode of application windows, including the coordinates and size of the window display, the window display level and the like. This specifically comprises: creating and destroying windows, displaying and hiding windows, laying out windows, focus management, input methods, wallpaper management, and the like. In some embodiments of the present application, the WMS can ensure the visibility of the screen projection window while the mobile phone is in the screen-locked state.
The task stack management function of the AMS is used to create, close and otherwise manage task stacks. The lifecycle management function of the WMS is to manage the lifecycle of windows. The two cooperate with each other: a window can be activated through the callback (onResume) interface, and the activated window is displayed in the foreground; or the window can be paused through the pause (onPause) interface, so that the window switches from foreground to background display. Taking the top Activity of the uppermost window in the task stack as an example, onResume is called on the top Activity to activate it, and the top Activity is then displayed in the foreground; when a new window is pushed onto the task stack above the original top Activity, onPause is called on the original top Activity to pause its window so that it switches from foreground to background display, and the lifecycle of the new window can be managed on a similar principle.
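From the application side, the onResume/onPause callbacks mentioned above are the standard Activity lifecycle hooks; the following Java sketch (the class name is a placeholder) only logs when the window is activated to the foreground or paused to the background.

    import android.app.Activity;
    import android.os.Bundle;
    import android.util.Log;

    public class ChatActivity extends Activity {

        private static final String TAG = "ChatActivity";

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Called when AMS/WMS activate this window; it is now visible in the foreground
            // (for example inside the VirtualDisplay that is projected to the target device).
            Log.d(TAG, "window activated (foreground)");
        }

        @Override
        protected void onPause() {
            super.onPause();
            // Called when a new window is pushed above this one; it moves to the background.
            Log.d(TAG, "window paused (background)");
        }
    }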
The IMS may be used to translate, encapsulate and otherwise process original input events to obtain input events containing more information, and send them to the WMS, in which the clickable areas (such as controls) of each application, the location information of the focus window, and the like are stored. Thus, the WMS can properly distribute an input event to the designated control or focus window. The IMS may respond to received input events. As shown in fig. 12, the IMS receives the user's input event and forwards it to the WMS, the WMS distributes it to the corresponding application, and the application sends a registration request to the AMS, which starts the application.
The multi-screen framework module is responsible for implementing the homologous screen projection logic. It calls the PendingIntentRecord interface to start the Activity corresponding to a PendingIntent (i.e. to send the specified action), and adds a homologous screen projection label (such as Flag2). PendingIntent is a capability provided by Android that allows an external program to invoke components of one's own program; its lifecycle is not tied to the main program, and the specified action is executed when certain conditions are met or certain events are triggered. The multi-screen framework module is further configured to monitor sensed events and interact with the screen projection assistant, for example calling ActivityRecord to monitor whether an Activity is onResume (in the foreground and visible) and notifying the screen projection assistant of the monitored result, and calling Task to monitor the VirtualDisplay to judge whether the stack is empty and notifying the screen projection assistant of the monitored result. In addition, the multi-screen framework module is also used for switching an application that has been projected to the external screen back to the main screen when the user clicks the corresponding recent task card or desktop icon. The multi-screen framework module is also used for removing the security layer while the source device is in the screen-locked state. In some embodiments, the multi-screen framework module can also make related modifications to the power key behavior of the mobile phone in the screen projection scenario.
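As an illustration of the PendingIntent capability described above, the following Java sketch creates a PendingIntent for an application interface and sends it with ActivityOptions aimed at a given display id; this uses only the public PendingIntent/ActivityOptions API rather than the internal PendingIntentRecord path of the framework, and the class and parameter names are placeholders.

    import android.app.ActivityOptions;
    import android.app.PendingIntent;
    import android.content.Context;
    import android.content.Intent;

    public final class ProjectionLauncher {

        public static void launchOnDisplay(Context context, Intent appIntent, int displayId)
                throws PendingIntent.CanceledException {
            appIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            PendingIntent pi = PendingIntent.getActivity(
                    context, 0, appIntent,
                    PendingIntent.FLAG_UPDATE_CURRENT | PendingIntent.FLAG_IMMUTABLE);

            ActivityOptions options = ActivityOptions.makeBasic();
            options.setLaunchDisplayId(displayId); // place the Activity on the chosen (e.g. virtual) display

            // The specified action is executed when the PendingIntent is sent.
            pi.send(context, 0, null, null, null, null, options.toBundle());
        }
    }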
It is to be appreciated that the application framework layer can also include a content provider, a view system, a telephony manager, a resource manager, a notification manager, etc. (not shown in FIG. 12). For specific meaning, reference is made to the related art documents, and description thereof is not given here.
(4) Android runtime and system libraries
The android runtime is responsible for scheduling and management of the system. The android runtime includes a core library and virtual machines. The core library comprises two parts: one part is the function that the programming language (e.g., java language) needs to call, and the other part is the core library of the system. The application layer and the application framework layer run in a virtual machine. The virtual machine executes the programming files (e.g., java files) of the application layer and the application framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface Manager (Surface Manager), media library (Media Libraries), three-dimensional graphics processing library (e.g., openGL ES), two-dimensional graphics engine (e.g., SGL), etc. The surface manager is used to manage the display subsystem and provides a fusion of two-Dimensional (2D) and three-Dimensional (3D) layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing 3D graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
(5) Driver layer
The driver layer is the basis of the Android operating system, and the final functions of the Android operating system are completed through the driver layer. As shown in fig. 12, the driver layer includes the bottom-layer drivers and is responsible for discovery, authentication, connection and other work; for example, the driver layer receives commands issued by the communication module in the application framework layer and performs actions such as connecting and disconnecting. Specifically, the driver layer includes a device discovery module, a device authentication module and a device connection module. The device discovery module is responsible for device discovery, the device authentication module is responsible for device authentication, and the device connection module is responsible for device connection and can establish transmission channels such as WiFi, WI-FI P2P and Bluetooth. Of course, the driver layer may also include hardware drivers such as a display screen driver, a camera driver, an audio driver, a sensor driver and a virtual card driver.
It should be noted that the software structure schematic diagram of the electronic device shown in fig. 12 provided by the present application is only an example, and does not limit the specific module division in the different layers of the Android operating system; for details, reference may be made to the description of the software structure of the Android operating system in the conventional technology. In addition, the method and the device for starting the screen projection provided by the application can also be realized based on other operating systems, which are not enumerated one by one in the present application.
In some embodiments, as shown in the right half of fig. 12, the software framework of the target device may include an application layer, a capability service layer, an operating system and a driver layer. For some of the functional modules of the target device and their roles, refer to the source device; they are not described here again.
Specifically, the application layer of the target device refers to the source device.
The capability service layer of the target device comprises a message center, a virtualization service module, a screen projection service module and a device discovery authentication connection module. The message center is used for exchanging messages and signaling with the source device, and comprises a notification service module. The notification service module is used for receiving message notifications transferred from the source device, and is responsible for initiating a session instruction (openSession) to the source device. Meanwhile, the notification service module is further used for issuing an instruction to start the screen projection service after the session is established. For the virtualization service module, the screen projection service module and the device discovery authentication connection module, refer to the source device; the difference is that the screen projection service module of the target device is used for receiving and parsing the video streams and related information sent by other devices, so that an application running on another device is displayed on the local machine.
The operating system of the target device may be a Windows OS (e.g., a personal computer) or an Android OS (e.g., a tablet computer).
The driving layer of the target device refers to the source device.
Based on the software structure of the electronic device shown in fig. 12, taking a mobile phone as a source device and a notebook computer as a target device as an example, a specific implementation manner of the homologous screen projection method provided by the embodiment of the application is specifically described with reference to fig. 13 to 16.
Referring to fig. 13 to fig. 16, fig. 13 to fig. 16 are timing diagrams of a method for screen projection with homology according to an embodiment of the present application. It will be appreciated that fig. 13-16 are a series of sequential timing diagrams illustrating the flow of interactions between the various functional blocks shown in fig. 12.
As shown in fig. 13 to 16, the mobile phone (the source device, which may also be referred to as a first device) includes a SystemUI module, a first communication module, a screen projection assistant, a first virtualization service module, a first screen projection service module, a first security authentication module, and the FWK; the notebook computer (the target device, also called a second device) comprises a message center, a second screen projection service module, a second communication module and a second security authentication module. It should be noted that other functional modules shown in fig. 12 also participate in the specific implementation of the homologous screen projection method provided by the embodiment of the present application, but are not shown in this timing diagram. As shown in fig. 13, the method may include:
S101, in the case of finding the mobile phone, the message center module of the notebook computer sends a communication connection request 1 to the second communication module, wherein the communication connection request 1 carries the equipment identification code of the mobile phone.
In some embodiments, the target device may discover the trusted devices that are present in the surroundings (i.e., devices belonging to the same trust ring) by means of near field communication. The specific process of finding the mobile phone may refer to related technologies, and will not be described herein.
After the mobile phone is found, the message center module may send a communication connection request 1 to the second communication module of the notebook computer. The communication connection request 1 may be an inter-process communication (inter-process communication, IPC) message, where the communication connection request 1 may carry a device name, a device identification code, and a MAC address of the mobile phone, and is used to instruct the second communication module to establish a communication connection with a communication module (i.e., the first communication module) of the mobile phone.
S102, a first communication connection is established between the second communication module and the first communication module.
The first communication connection may be a communication connection established based on a near field communication manner, and is used for realizing data sharing between the notebook computer and the mobile phone, for example, a message notification received by the mobile phone is transferred to the notebook computer, so as to realize notification collaboration.
S103, after the first communication connection is created, the second communication module sends a connection success notification 1 to the message center module.
In some embodiments, after the first communication connection is successfully created, sharing of data between the mobile phone and the notebook computer may be achieved through the first communication connection, and the shared data may include application messages.
S104, the application in the mobile phone receives an application message.
In some embodiments, the application in the mobile phone may be any application installed in the application layer, i.e., it may be a native system application or a third-party application. The application message (first message) may come from the message push server corresponding to the application program (first application). For example, the application message "do you run today" shown in fig. 2 comes from the message push server of the social application, and the message may carry an application identification (e.g., the application package name) of the social application.
And S105, the application sends the application message to the SystemUI module, wherein the application message comprises message content and a corresponding application package name 1.
As shown in fig. 2, the social application receives an application message, that is, "do you run today" sent by a contact, and the social application then sends this application message to the SystemUI module, so that the SystemUI module obtains an application message whose message content is "do you run today" and whose application package name 1 is "social". That is, after the application layer in the mobile phone receives the corresponding application message, the application message can be forwarded to the SystemUI module.
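As an illustrative Java sketch only, the payload carried in steps S105 to S108 could be modelled as a simple value type; the field names are assumptions and are not defined by the embodiment.

    // Illustrative payload for the application message forwarded from the SystemUI module
    // to the message center module of the notebook computer; field names are assumptions.
    public record ApplicationMessage(
            String appPackageName,   // application package name 1, e.g. "social"
            String messageContent,   // the pushed text, e.g. "do you run today"
            long receivedAtMillis) { // when the message was received by the SystemUI module
    }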
And S106, the SystemUI module sends the application message to the first communication module.
In some embodiments, the SystemUI module forwards the received application message to the first communication module and instructs the first communication module to send the application message to other devices in the trust ring, such as the notebook computer in the same trust ring. When there are a plurality of trusted devices in the trust ring, the application message can be sent by the first communication module to the plurality of trusted devices in turn.
S107, the first communication module sends the application message to the second communication module.
In some embodiments, the first communication module may send an application message to the second communication module through the first communication connection, so as to implement streaming the application message to the notebook computer.
S108, the second communication module sends the application message to the message center module.
In some embodiments, steps S106 to S108 are the signaling interaction steps in which the SystemUI module sends the application message to the message center module based on the first communication connection.
S109, the message center module controls the corresponding display to display the message notification control corresponding to the application message.
The display screen corresponding to the message center module is the display of the notebook computer. In some embodiments, the application message from the handset may be displayed by way of a display message notification control (first control). For example, the message notification control may be the message card 204 of FIG. 2.
That is, when the mobile phone is in an unlocked, bright-screen state, as in the scenario shown in fig. 2, both the mobile phone and the notebook computer can display the application message received by the social application of the mobile phone. When the mobile phone is in the screen-off or screen-locked state, as shown in fig. 4A and fig. 4B, the notebook computer can display the application message received by the social application of the mobile phone.
Therefore, even if the mobile phone is not at the user's side, the user can still check the application message through the notebook computer or another device, notification collaboration is realized, and the problem of messages not being checked in time is alleviated.
As shown in the foregoing embodiments, the notebook computer senses the user's operation on the message notification control, such as a touch operation or a cursor input operation, during the period when the notebook computer displays the application message from the mobile phone, for example while the desktop displays the message card 204 as in fig. 4A and fig. 4B, or while the information list window 701 of fig. 7 is displayed. In some examples, the user may trigger the notebook computer to instruct the mobile phone to perform the homologous screen projection by operating the message notification control.
For example, in a scenario where the mobile phone is in a locked or off-screen state, as shown in fig. 14, the method may further include:
s201, the message center module receives a first operation of a user for a message notification control.
In some embodiments, the first operation may be an operation in which the user selects the message notification control. For example, when the message center module detects that the user performs a click operation in the display area occupied by the message notification control, the message center module determines that the first operation has been received. For another example, when the cursor of the notebook computer moves into the display area occupied by the message notification control and a confirmation input from the mouse is received, the message center module determines that the first operation has been received.
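On an Android-based target device (one of the operating systems mentioned above for the target device), the first operation could simply be captured by the click listener of the message card view, as in the following Java sketch; the callback interface and parameter names are placeholders.

    import android.view.View;

    public final class MessageCardBinder {

        public interface OnCardSelected {
            void onSelected(String sourceDeviceId, String appPackageName);
        }

        public static void bind(View messageCardView, String sourceDeviceId,
                                String appPackageName, OnCardSelected callback) {
            // A click anywhere inside the display area occupied by the message notification
            // control counts as the first operation.
            messageCardView.setOnClickListener(v -> callback.onSelected(sourceDeviceId, appPackageName));
        }
    }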
S202, the message center module sends a communication connection request 2 to the second communication module, wherein the communication connection request carries a Bluetooth channel identifier and a device identifier code of the mobile phone.
For example, the communication connection request 2 may be a openSession instruction, where the openSession instruction carries a bluetooth channel identifier and a device identifier of the mobile phone. That is, the message center module may send openSession an instruction to the second communication module instructing the second communication module to create a bluetooth communication channel to connect to the handset, such as referred to as a second communication connection.
S203, the second communication module establishes second communication connection with the first communication module.
In some embodiments, the second communication module may, in response to openSession instructions, create a second communication connection to the first communication module of the handset, which may be used to transport signaling involved in the distributed authentication process.
In addition, in the process of creating the second communication connection, the signaling interaction process between the second communication module and the first communication module may refer to related technologies, which are not described herein.
S204, the first communication module sends a starting instruction to the screen projection assistant module.
It can be appreciated that the first communication module may also perform inter-process data transfer with other software modules within the mobile phone. In a scenario where the mobile phone does not need to perform screen projection, the screen projection assistant module is in a dormant state and does not occupy the system resources of the mobile phone. When the notebook computer indicates that the second communication connection should be created, it means that the notebook computer is inviting the mobile phone to perform screen projection. In this scenario, the first communication module may instruct the screen projection assistant module to start, that is, wake up the screen projection assistant module by sending a start instruction and start the service corresponding to the screen projection assistant module, where the service is used to provide the functions of the screen projection assistant module.
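As a Java sketch only, the screen projection assistant could be hosted in an Android Service that the first communication module wakes with a start instruction; the class name and the monitoring logic indicated in the comments are assumptions.

    import android.app.Service;
    import android.content.Context;
    import android.content.Intent;
    import android.os.IBinder;

    public class ProjectionAssistantService extends Service {

        /** Called by the first communication module when the target device opens the session. */
        public static void wakeUp(Context context) {
            context.startService(new Intent(context, ProjectionAssistantService.class));
        }

        @Override
        public int onStartCommand(Intent intent, int flags, int startId) {
            // Start monitoring whether the second communication connection (networking) succeeds,
            // and report the result back through the communication module.
            return START_STICKY;
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null; // not a bound service in this sketch
        }
    }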
S205, starting a screen projection assistant module.
In some embodiments, after the screen projection assistant module is started, the service corresponding to the screen projection assistant module is created, and this service can monitor the networking state between the screen projection assistant module and the notebook computer, that is, whether the second communication connection has been successfully established. When successful networking is monitored, the service corresponding to the screen projection assistant can have a notification of successful networking (also called connection success notification 2) transmitted through the second communication connection to the message center module of the notebook computer.
S206, the second communication module forwards a connection success notice 2 to the message center module, and indicates that the second communication connection is established.
S207, the message center module sends a starting instruction to the second screen service module.
The starting instruction is used for indicating to operate the second screen-throwing service module, and enabling the service corresponding to the second screen-throwing service module.
In some embodiments, before S207, the message center module may further negotiate communication parameters with the screen-projection assistant module over the second communication connection between the second communication module and the first communication module; for example, a service set identifier (SSID) may be negotiated. The specific procedure may refer to the related art and is not described here.
S208, the second screen service module starts corresponding service.
S209, the second screen service module sends a starting success notice 1 to the message center module.
The start success notification 1 notifies the message center module that the second screen-projection service module has been started and can provide the corresponding services.
In some embodiments, as shown in fig. 10, after the user selects the message card displayed on the notebook, distributed authentication may also be performed between the notebook and the mobile phone. Implementation details of distributed authentication are described below in conjunction with fig. 15, and illustratively the above method further includes:
S301, the message center module sends a distributed authentication request to the second security authentication module. As noted above, since the mobile phone is in the screen-locked state, the request to start the screen-projection service can trigger the distributed authentication mechanism in order to protect the user's privacy. Because the notebook computer initiates the request to start the screen-projection service, it is the notebook computer that triggers the distributed authentication mechanism. In the distributed authentication process, however, the password authentication itself is performed by the mobile phone, so the security authentication module of the notebook computer needs to send the authentication information it receives to the security authentication module of the mobile phone, which then performs the password authentication.
S302, the second security authentication module controls the display screen to display a distributed authentication window.
In some embodiments, the display screen refers to the display screen of the notebook computer. The message center initiates a distributed authentication request to the second security authentication module of the notebook computer, triggering the second security authentication module to call a view interface. Through the view interface, a distributed authentication window (i.e., a screen-projection window displaying an unlocking authentication interface, which may be called a second window) is displayed on the desktop of the notebook computer, and the displayed distributed authentication window guides the user to input the password information corresponding to the mobile phone (which may be called the unlock password).
S303, during the period of displaying the distributed authentication window, the second security authentication module receives password information input by a user.
For example, the password information may be fingerprint information entered while the distributed authentication window is displayed. It may also be voice authentication information or the like collected during that period. Of course, the password information may also be characters typed by the user; the embodiment of the present application is not limited in this respect.
S304, the second security authentication module sends the password information to the second communication module.
S305, the second communication module sends the password information to the first communication module through the second communication connection.
S306, the first communication module sends password information to the first security authentication module.
Thus, through steps S304 to S306, the second security authentication module of the notebook computer transmits the received password information to the first security authentication module of the mobile phone; this may also be described as the second security authentication module sending the received password information to the first security authentication module of the mobile phone based on the second communication connection. After the first security authentication module receives the password information acquired by the notebook computer, it compares it with the unlock password the user set on the mobile phone. If the password information acquired by the notebook computer matches the password set on the mobile phone, authentication succeeds; otherwise, authentication fails.
In some embodiments, in the authentication failure scenario, the mobile phone may notify the notebook computer that password authentication has failed. Optionally, after the notebook computer determines that password authentication has failed, it may prompt the user in the distributed authentication window that the password is incorrect, guide the user to re-enter the password information, and request the mobile phone to perform password authentication again. Of course, when the number of authentication failures exceeds the password error threshold, authentication is suspended; the threshold may be set by the user or take a default value. If authentication succeeds, the flow proceeds to S307.
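A minimal sketch of the phone-side check and retry accounting is given below, assuming the unlock password is stored as a hash; how the mobile phone actually stores and compares the unlock credential is not specified in the embodiment.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public final class UnlockPasswordVerifier {
    private final byte[] storedHash;  // hash of the unlock password set on the phone (assumed storage)
    private final int maxFailures;    // password error threshold (user-set or default)
    private int failureCount;

    public UnlockPasswordVerifier(byte[] storedHash, int maxFailures) {
        this.storedHash = storedHash;
        this.maxFailures = maxFailures;
    }

    /** Returns true if the received password matches; suspends authentication after too many failures. */
    public boolean verify(String receivedPassword) {
        if (failureCount >= maxFailures) {
            throw new IllegalStateException("Authentication suspended: too many failed attempts");
        }
        boolean ok = MessageDigest.isEqual(storedHash, hash(receivedPassword));
        failureCount = ok ? 0 : failureCount + 1;
        return ok;
    }

    private static byte[] hash(String password) {
        try {
            return MessageDigest.getInstance("SHA-256")
                    .digest(password.getBytes(StandardCharsets.UTF_8));
        } catch (NoSuchAlgorithmException e) {
            throw new AssertionError(e); // SHA-256 is always available
        }
    }
}
```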
S307, the first security authentication module determines that the password information is authenticated.
S308, the first security authentication module sends an authentication success notification to the first communication module.
Wherein the authentication success notification is used for indicating that the password information passes authentication. The authentication success notification is also referred to as first response information.
S309, the first communication module sends an authentication success notification to the second communication module through the second communication connection.
S310, the second communication module sends an authentication success notification to the second security authentication module.
S311, the second security authentication module sends an authentication success notification to the message center module.
The above-mentioned S308 to S311 may also be referred to as the first security authentication module sending an authentication success notification to the second security authentication module based on the second communication connection.
S312, the message center module controls the display to display the transitional loading window.
It can be understood that, once distributed authentication succeeds, the notebook computer can instruct the mobile phone to perform homologous screen projection. However, a certain time interval exists between instructing the screen projection and the screen projection succeeding, so there is a time span between displaying the distributed authentication window and switching to the screen-projection window containing the application interface. Optionally, to improve the user's experience, a transitional loading window may be started after security authentication succeeds, such as the screen-projection window displaying the waiting prompt in fig. 10; the transitional loading window can inform the user that distributed authentication has passed and that homologous screen projection is starting.
In some embodiments, after distributed authentication, the mobile phone may formally perform homologous screen projection to the notebook computer. In other embodiments, distributed authentication may be skipped after determining that the user has selected the message notification control (e.g., message card 204), and the mobile phone may be instructed to perform homologous screen projection directly. For example, distributed authentication may be skipped when the mobile phone is not configured with an unlock password. As another example, distributed authentication may also be skipped when the interval since the time (first time point) at which the mobile phone last reported a successful distributed authentication does not exceed a preset duration threshold (first threshold).
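A minimal sketch of this skip decision is given below, assuming the threshold and the last success time are tracked in milliseconds; the names used here are illustrative only.

```java
public final class AuthenticationPolicy {
    private final long firstThresholdMillis;   // preset duration threshold ("first threshold")
    private long lastAuthSuccessMillis = -1;   // "first time point": last successful authentication

    public AuthenticationPolicy(long firstThresholdMillis) {
        this.firstThresholdMillis = firstThresholdMillis;
    }

    public void recordAuthSuccess(long nowMillis) {
        lastAuthSuccessMillis = nowMillis;
    }

    /** Distributed authentication may be skipped if no unlock password is set,
     *  or if the last success is still within the threshold. */
    public boolean shouldSkipAuthentication(boolean unlockPasswordConfigured, long nowMillis) {
        if (!unlockPasswordConfigured) {
            return true;
        }
        return lastAuthSuccessMillis >= 0
                && (nowMillis - lastAuthSuccessMillis) <= firstThresholdMillis;
    }
}
```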
The following describes the procedure of homologous screen projection with reference to fig. 16 and 17:
S401, the message center module sends a communication connection request 3 to the second communication module, where the communication connection request carries a Wi-Fi P2P channel identifier and the device identifier code of the mobile phone.
For example, the communication connection request 3 may be an openSession instruction, where the openSession instruction carries a Wi-Fi P2P channel identifier and the device identifier of the mobile phone, and is used to instruct the second communication module to create a Wi-Fi P2P communication channel with the mobile phone, referred to as the third communication connection.
In addition, the types of the first, second and third communication connections established between the notebook computer and the mobile phone are not limited to Bluetooth and Wi-Fi P2P; other types of connection may be used as long as they meet the data transmission requirements between the two devices. It can be understood that the Bluetooth and Wi-Fi P2P connections mentioned in the foregoing embodiments are only examples. If another type of communication connection is created, the corresponding communication connection request carries the corresponding service channel name, similar to the way communication connection request 2 carries a Bluetooth channel identifier and communication connection request 3 carries a Wi-Fi P2P channel identifier.
S402, the second communication module establishes third communication connection with the first communication module.
In some embodiments, the second communication module may create a third communication connection to the first communication module in response to the communication connection request 3, which may be used to transmit signaling involved in the homologous screen casting process.
In addition, during the creation of the third communication connection, the signaling interaction procedure between the second communication module and the first communication module may refer to the related art, and will not be described herein.
S403, after the third communication connection is created, the second communication module sends a connection success notification 3 to the message center module.
Wherein the connection success notification 3 is used for notifying that the third communication connection has been created.
S404, based on the third communication connection, the message center and the screen-projection assistant perform parameter negotiation to determine the session key and the IP address of the mobile phone.
In some embodiments, the parameter negotiation includes the service side (the notebook computer) negotiating the SessionKey and obtaining the IP address of the peer device (the mobile phone). The SessionKey, also called a data encryption key or working key, is a randomly generated encryption and decryption key that secures a communication session between two computers, and can be obtained through negotiation between the communicating parties. It is typically generated dynamically, only when session data encryption is required.
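A minimal sketch of dynamically generating such a key is given below, assuming a 256-bit AES key produced with SecureRandom; the actual negotiation protocol and key format are not specified in the embodiment.

```java
import java.security.SecureRandom;
import javax.crypto.SecretKey;
import javax.crypto.spec.SecretKeySpec;

public final class SessionKeys {
    private static final SecureRandom RANDOM = new SecureRandom();

    /** Generates a random 256-bit session key for encrypting data exchanged over the session.
     *  The key length and AES algorithm are assumptions made for this sketch. */
    public static SecretKey newSessionKey() {
        byte[] keyBytes = new byte[32];
        RANDOM.nextBytes(keyBytes);      // generated dynamically, only when encryption is needed
        return new SecretKeySpec(keyBytes, "AES");
    }
}
```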
In addition, S404 includes the message center module sending a message to the second communication module, the second communication module transmitting the message to the first communication module through the third communication connection, and the first communication module passing the message to the screen-projection assistant module. For convenience of description, subsequent references to the "third communication connection" include this data transmission procedure between the first communication module and the second communication module, which is not described again below.
And S405, the message center sends the session key and the IP address of the mobile phone to the second screen-throwing service module.
S406, the second screen service module initializes according to the session key and the IP address of the mobile phone.
For specific implementation procedures, reference may be made to the related art, and details are not repeated here.
S407, the second screen service module returns a message indicating that the initialization is completed to the message center module.
From the foregoing, it can be seen that the screen-projection service module provides the screen-projection capability and the basic reverse event control capability. After the second screen-projection service of the notebook computer is initialized successfully, it can receive and parse the related information sent by the mobile phone, so that an application running on the mobile phone can be displayed on the notebook computer. Meanwhile, the basic reverse event control capability provides support for homologous screen projection. The screen-projection service module therefore needs to be initialized before the homologous screen-projection service is performed, to ensure that the screen-projection capability and reverse event control capability can be provided normally.
S408, the message center module sends a message indicating to start the screen-throwing service to the second screen-throwing service module.
S409, based on the third communication connection, the second screen-throwing service module sends a message indicating to start the screen-throwing service to the screen-throwing assistant module.
It should be noted that, the step S409 includes the second screen-throwing service module sending a message to the second communication module, the second communication module transmitting the message to the first communication module through the third communication connection, and the first communication module sending the message to the screen-throwing assistant module. For convenience of description, the following "third communication connection" includes a data transmission procedure between the first communication module and the second communication module, which is not described in detail below.
S410, the screen projection assistant module binds the screen projection service.
It can be appreciated that the screen-projection management module in the screen-projection assistant module binds the relevant screen-projection service in order to manage screen-projection transactions.
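A minimal sketch of such a binding using the standard Android bindService API is given below; the component name is a hypothetical placeholder, and the embodiment does not state which interface the bound service exposes.

```java
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.os.IBinder;

public final class CastServiceBinder {
    private final ServiceConnection connection = new ServiceConnection() {
        @Override public void onServiceConnected(ComponentName name, IBinder binder) {
            // The bound screen-projection service is now available for managing cast transactions.
        }
        @Override public void onServiceDisconnected(ComponentName name) {
            // Connection lost; the management module may rebind or tear down the cast session.
        }
    };

    /** Binds the cast management module to the screen-projection service (hypothetical component). */
    public void bind(Context context) {
        Intent intent = new Intent();
        intent.setComponent(new ComponentName("com.example.cast",
                                              "com.example.cast.ScreenCastService"));
        context.bindService(intent, connection, Context.BIND_AUTO_CREATE);
    }
}
```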
S411, the screen projection assistant module sends a message indicating initialization to the first virtualization service module.
It can be appreciated that the virtualization service initialization module in the screen-drop helper module is configured to instruct the virtualization service module to initialize.
S412, the first virtualization service module completes initialization.
The first virtualization service module is responsible for audio and video virtualization in the screen-projection service. Interaction between the first virtualization service module of the mobile phone and a second virtualization service module of the notebook computer (not shown in fig. 12) allows a mobile phone application projected onto the notebook computer to use the speaker, microphone and camera of the notebook computer during screen projection, and allows the generated audio and video data streams to be transmitted between the mobile phone and the notebook computer, thereby enabling functions such as audio and video calls while the screen is being projected.
S413, the first virtualization service module sends a message indicating to start the screen-throwing service to the first screen-throwing service module.
S414, starting the first screen service module.
S415, the first screen service module creates a virtual display and adds a virtual display flag thereto.
It should be noted that the screen-projection service module is an important functional module for implementing the homologous screen-projection mode. Specifically, when creating the virtual display area (VirtualDisplay, which may also be called the first display area), the first screen-projection service module of the mobile phone adds Flag1, namely VIRTUAL_DISPLAY_FLAG_PRESENTATION (a virtual display flag, for example called the second tag), which is used so that the screen-projection window on the notebook computer mirrors the content displayed on the mobile phone's display screen.
Therefore, when the mobile phone is in the screen-locked or screen-off state, the lock screen or off-screen interface can be displayed on the mobile phone while the screen-projection window of the notebook computer does not display that interface. Meanwhile, the basic reverse event control capability provided by the screen-projection service module enables the notebook computer to reversely control the mobile phone. Of course, implementing the homologous screen-projection mode also requires the second screen-projection service module of the notebook computer and the first screen-projection service module of the mobile phone to cooperate and interact, which is not shown in fig. 16 and 17.
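A minimal sketch of creating such a virtual display area through the public DisplayManager API with the VIRTUAL_DISPLAY_FLAG_PRESENTATION flag (Flag1) is given below; the first screen-projection service module described in the embodiment may instead use internal framework interfaces.

```java
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;

public final class CastDisplayFactory {
    /** Creates the first display area (VirtualDisplay) tagged with Flag1
     *  (DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION). */
    public static VirtualDisplay create(Context context, Surface outputSurface,
                                        int width, int height, int densityDpi) {
        DisplayManager dm = (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        return dm.createVirtualDisplay(
                "HomologousCastDisplay",                          // display name (assumed)
                width, height, densityDpi,
                outputSurface,                                    // surface that receives rendered frames
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
    }
}
```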
S416, the first screen-throwing service module sends a starting success notice 2 to the screen-throwing assistant module, and indicates that the first screen-throwing service module is started.
S417, based on the third communication connection, the first screen projection assistant sends a start success notification 2 to the message center module.
S418, based on the third communication connection, the message center module sends a homologous screen throwing request to the FWK of the mobile phone, carrying an application package name 1.
In some embodiments, the homologous screen-drop request is a first request, and the application package name 1 is an application package name (i.e., an application identifier) corresponding to a message notification control selected by a user. For example, as shown in fig. 4A and 4B, when the user selects the message card 204, the message card 204 displays an application message from the social application in the mobile phone, so that the corresponding application package name 1 is "social application".
S419, FWK starts a homologous screen projection window.
When the mobile phone is in the screen-locked or screen-off state, in order to ensure that the homologous screen-projection window on the notebook computer can display the application interface normally, the FWK needs to keep the homologous screen-projection window visible, non-dormant and operable (the corresponding Activity is non-dormant, visible, and so on). It can be appreciated that the specific implementation within the FWK involves functional interaction between its built-in classes; the interaction flow is described below and is not repeated here.
S420, the FWK sends display data of the homologous screen-throwing window to the first screen-throwing service module.
S421, based on the third communication connection, the first screen projection service module sends the display data to the second screen projection service module.
S422, the second screen-throwing service module controls the corresponding display to render the display data to the screen-throwing window.
The second screen-throwing service module controls the corresponding display, namely the display screen of the notebook computer.
S423, the second screen-throwing service module sends a screen-throwing starting success notice to the message center.
S424, the message center module controls the corresponding display to cancel displaying the transitional loading window.
The display corresponding to the message center module refers to a display screen of a notebook computer.
S425, the message center module controls the corresponding display screen to display the homologous screen throwing window.
So far, the notebook computer displays a homologous screen throwing window (a first window) corresponding to the application message, and the screen throwing is successful when the mobile phone is in a screen locking and extinguishing state. That is, as shown in fig. 10, in the screen projection window 601 of the notebook computer, the interface displaying "information for prompting waiting" is switched to the application interface displaying the social application.
In some embodiments, the method provided by the embodiment of the present application may not include the steps shown in fig. 14 and 15 without performing distributed authentication between the mobile phone and the notebook computer. In this scenario, after receiving the first operation of the user for the message notification control, the message center module of the notebook computer triggers to establish a third communication connection with the mobile phone, and starts the screen-throwing assistant of the notebook computer. After the session key and the IP address are negotiated between the mobile phone and the notebook computer, the notebook computer is triggered to start the screen-throwing service module.
The specific implementation process of starting the FWK implementation application and displaying the external screen of the homologous projection window in the foregoing embodiment is described below with reference to fig. 18, 19 and 20.
In the related art, when the mobile phone is in the screen-locked or screen-off state, the Activity corresponding to the application message enters a dormant state, and its life cycle moves from the OnResume state (visible to the user and interactive) to the OnPause state (inactive), so the application interface cannot be displayed normally when projected to the notebook computer. To project the application, the Activity of the application must skip the dormant state, and the currently executed Activity must remain in the OnResume state so that the screen-projection window corresponding to the application stays visible.
Referring to fig. 18, 19 and 20, fig. 18, 19 and 20 are specific implementations of step S419 in fig. 17. As shown in fig. 18, 19 and 20, the FWK further includes built-in classes such as SystemServer, PowerManagerService, PendingIntentRecord, ActivityStarter, the window management module (WindowManagerEx), ActivityTaskManagerServiceEx, ActivityTaskSupervisor, the lock screen controller (KeyguardController), ActivityRecord, Task, RootWindowsContainer, SurfaceFlinger, and the like.
S501, the SystemUI module sends a lock screen processing request to SystemServer.
In some embodiments, S418 may be the message center module sending a homologous screen-projection request to the SystemUI module. In some embodiments, the homologous screen-projection request includes an identifier, and after the SystemUI module recognizes the identifier, it sends a lock screen processing request to SystemServer. In other embodiments, the homologous screen-projection request does not include such an identifier; instead, the SystemUI module, in response to the request, detects whether Flag1 (i.e., VIRTUAL_DISPLAY_FLAG_PRESENTATION) is marked on the VirtualDisplay, and if so, sends a lock screen processing request to SystemServer.
S502, SystemServer instructs PowerManagerService to change the sleep state of the virtual interface in response to the lock screen processing request.
SystemServer can start the various services required by the mobile phone system, for example PowerManagerService. PowerManagerService is responsible for power management of the mobile phone system; its common functions include lighting up the screen, turning off the screen, entering the screen saver, and so on. PowerManagerService inherits from SystemService, is started by SystemServer, registers itself as a system service, and interacts with other components through the Binder.
It will be appreciated that the virtual interface, virtualDisplay, is an invisible display area in the phone, and that VirtualDisplay provides services to the notebook computer, where the display in the screen window of the notebook computer comes from the display drawn in VirtualDisplay. Under the scene of the homologous screen throwing, the virtual interface needs to mirror the display screen of the mobile phone. When the mobile phone is in the screen-off state, the display screen of the mobile phone is dormant, and correspondingly, the virtual interface is also in the dormant state. In order to realize the homologous screen throwing under the screen-off condition, the mobile phone needs to control the virtual interface to skip dormancy. In some embodiments, the handset may invoke powermanager service to manage the virtual interface and release the dormant state of the virtual interface.
S503, PowerManagerService controls the virtual interface to keep it in a non-sleep state.
Thus, even if the display screen of the mobile phone goes dormant, the VirtualDisplay in the mobile phone does not, so the VirtualDisplay can project a visible and operable screen-projection window onto the notebook computer.
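A minimal sketch of one way to keep the backing process awake is given below, assuming a partial wake lock is used; the actual mechanism inside PowerManagerService is internal to the framework and is not detailed in the embodiment.

```java
import android.content.Context;
import android.os.PowerManager;

public final class VirtualDisplayKeepAlive {
    private final PowerManager.WakeLock wakeLock;

    /** Holds a partial wake lock so the process backing the virtual interface keeps
     *  running while the phone's physical screen is off. */
    public VirtualDisplayKeepAlive(Context context) {
        PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
        wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "cast:virtualDisplay");
    }

    public void start() { wakeLock.acquire(); }   // keep the virtual interface non-dormant
    public void stop()  { wakeLock.release(); }   // allow normal sleep once casting ends
}
```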
S504, PowerManagerService instructs the window management module to cancel the lock screen policy for the virtual interface.
S505, the window management module cancels the screen locking strategy for the virtual interface.
The above-mentioned lock screen policy may be a timeout lock screen, and after the lock screen policy for the virtual interface is cancelled, the virtual interface will not display the lock screen interface (i.e. the lock screen layer is not drawn in the virtual interface), and will not be locked. After the virtual interface is projected to the notebook computer, the displayed screen projection window is not dormant and the screen cannot be locked.
S506, the SystemUI module sends application package name 1 to PendingIntentRecord.
The application package name 1 is from an application package name carried in a homologous screen throwing request. The application package name 1 may be the name of an application installed in the handset.
S507, PendingIntentRecord marks all the Activities corresponding to application package name 1 with the homologous screen-projection label.
Wherein, all activities corresponding to the application package name 1 refer to all activities (i.e., first process) of the application program corresponding to the application package name 1.
It should be noted that an Activity to which the homologous screen-projection label (first label) has been added is started onto the external screen (the display screen of the notebook computer) for homologous screen projection. Step S507 therefore acts as an Activity filter: it selects all the Activities of the application corresponding to the application package name, so that all the Activities of the application to be started on the external screen are displayed by homologous screen projection.
The homologous screen-projection label may be Flag2: Intent.java --> FLAG_ACTIVITY_PC_CAST_NOTIFYITION = 0x04000000. The Activities that need to be started onto the external screen are marked so that ActivityStarter can identify them.
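A minimal sketch of marking a launch intent with this flag is given below; the flag value is the vendor-specific constant quoted above and is not part of the public Android SDK.

```java
import android.content.Intent;

public final class CastIntentFlags {
    /** Flag2 from the embodiment: marks an Activity for launch onto the external (cast) display.
     *  Vendor-specific value, not a standard Android flag. */
    public static final int FLAG_ACTIVITY_PC_CAST_NOTIFYITION = 0x04000000;

    /** Marks a launch intent so that ActivityStarter can recognize it as a
     *  homologous screen-projection launch. */
    public static Intent markForHomologousCast(Intent launchIntent) {
        launchIntent.addFlags(FLAG_ACTIVITY_PC_CAST_NOTIFYITION);
        return launchIntent;
    }
}
```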
S508, PendingIntentRecord instructs ActivityStarter to start the Activities marked with the homologous screen-projection label.
In some embodiments, PendingIntentRecord may instruct ActivityStarter to start the Activities marked with the homologous screen-projection label by sending a homologous screen-projection instruction. Of course, ActivityStarter cannot complete the start of these Activities on its own and needs to call other software processes, so the flow proceeds to S509.
S509, ActivityStarter sends a request to ActivityTaskSupervisor indicating that homologous screen projection is to be started.
S510, activityTaskSupervisor sends KeyguardController a message indicating that the Activity foreground is visible.
Note that ActivityTaskSupervisor monitors, by calling KeyguardController, whether the Activity is in a foreground visible state. If it is, then after the application screen-projection window corresponding to the Activity is started on the notebook computer, the window runs in the foreground of the notebook computer and is visible to the user (OnResume state). If the Activity is in a stopped state (OnPause state), then after the application screen-projection window is started on the notebook computer, it runs in the background and the user cannot operate it. The OnResume and OnPause states are two states in the life cycle of an Activity. When an Activity is in the OnResume state, it is active, visible, and able to interact with the user, and it holds the focus. When an Activity is in the OnPause state, it is covered by another transparent or dialog-style Activity; it remains connected to the window manager and the system continues to maintain its internal state, so it stays visible, but it has lost focus and cannot be interacted with by the user. When the mobile phone is in the screen-locked or screen-off state, the AMS component starts the Activity flow and the Activity's life cycle switches to OnPause, so the Activity cannot interact with the user and the screen projection behaves abnormally.
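For reference, the two lifecycle states correspond to the standard Activity callbacks shown below; this is a generic illustration, not code from the embodiment.

```java
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;

/** Minimal illustration of the OnResume / OnPause states discussed above. */
public class CastedActivity extends Activity {
    private static final String TAG = "CastedActivity";

    @Override protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
    }

    @Override protected void onResume() {
        super.onResume();
        // OnResume state: visible, focused, interactive — required for the projected window.
        Log.d(TAG, "resumed: foreground visible, user can interact");
    }

    @Override protected void onPause() {
        super.onPause();
        // OnPause state: still attached to the window manager but without focus;
        // entering this state during casting would make the projected window inoperable.
        Log.d(TAG, "paused: lost focus, not interactable");
    }
}
```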
S511, KeyguardController identifies the Activity marked with the homologous screen-projection label and places it in the foreground visible state.
KeyguardController is the lock screen controller. It identifies the Activity marked with the homologous screen-projection label and returns a result indicating that the Activity needs to be placed in the foreground visible state, so as to ensure that the application screen-projection window corresponding to that Activity remains visible on the target device while the source device is in the screen-locked state.
S512, keyguardController sends a message to ActivityRecord indicating that the Activity foreground is kept visible.
S513, activityRecord controls the Activity marked with the homologous screen-throwing label to keep the foreground visible state.
S514, ActivityRecord sends a verification message to the Task, instructing the Task to verify the state of the Activity at the top of the task stack.
S515, the Task determines that the Activity at the top of the Task stack is in a foreground visible state.
It should be noted that an application may correspond to one or more activities, and the one or more activities may be stored in one or more task stacks. The task stack follows the rule of last-in first-out, and the Activity at the top of the task stack is executed first. In order to ensure that the foreground of the application screen-throwing window corresponding to the Activity to be executed is visible, the Activity at the top of the stack needs to be ensured to be in OnResume.
It will be appreciated that steps S514 and S515 correspond to a fault tolerant mechanism, and check whether the Activity at the top of the task stack is OnResume, so that these two steps are optional, i.e. in other embodiments, steps S514 and S515 may not be included.
S516, task instruction RootWindowsContainer places all tasks in a sleep state.
S517, rootWindowsContainer traverses the activities of all tasks, skips the activities of the homologous screen-throwing labels, and places other activities in a dormant state.
It should be noted that, apart from the Activities marked with the homologous screen-projection label, which skip the sleep process, the other Activities may be put to sleep. RootWindowsContainer therefore traverses the Activities of all tasks, filters out those carrying the homologous screen-projection label and returns "false" to indicate that they do not enter the sleep flow, while the other Activities return "true" and enter the sleep flow.
Illustratively, both Activity1 and Activity2 in the same task stack are added with homologous screen marks, rootWindowsContainer making Activity1 and Activity2 skip the sleep flow; and the Activity3 and Activity4 without the homologous screen marker are put into a dormant state.
When the Activity1 is located at the top of the Task stack, the Task checks the state of the Activity1, so that the Activity1 is in a foreground visible state, and then the Activity1 is visible to a user on the PC after being subjected to homologous screen projection and can be operated by the user in the foreground.
After the Activity1 is executed, the non-dormant Activity2 is advanced to the stack top of the Task stack, and the Task checks the state of the Activity2, so that the Activity2 is in a foreground visible state, and the Activity2 is visible to a user on the PC after being subjected to homologous screen projection and can be operated by the user in the foreground. The execution flow of other activities added with the homologous screen marker is similar, and will not be repeated here. The Activity without the homologous screen marker is put into a dormant state by RootWindowsContainer and is not started to the external screen display.
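A minimal sketch of the traversal logic described above is given below, using a simplified stand-in for the framework's activity records; the real RootWindowsContainer implementation is considerably more involved.

```java
import java.util.List;

public final class SleepPolicy {
    /** Simplified record standing in for an ActivityRecord. */
    public static final class ActivityEntry {
        final String name;
        final boolean hasHomologousCastFlag;  // marked with the homologous screen-projection label
        boolean sleeping;

        public ActivityEntry(String name, boolean hasHomologousCastFlag) {
            this.name = name;
            this.hasHomologousCastFlag = hasHomologousCastFlag;
        }
    }

    /** Mirrors the traversal above: labeled activities skip the sleep flow (false),
     *  all others are put to sleep (true). */
    public static boolean shouldSleep(ActivityEntry entry) {
        return !entry.hasHomologousCastFlag;
    }

    public static void sleepAll(List<ActivityEntry> activitiesOfAllTasks) {
        for (ActivityEntry entry : activitiesOfAllTasks) {
            entry.sleeping = shouldSleep(entry);
        }
    }
}
```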
S518, ActivityStarter sends an application start instruction to the application layer, including the application identifier 1.
The application starting instruction is sent to an application layer of the mobile phone and is used for starting an application program indicated by the application identifier 1.
It can be understood that when ActivityStarter sends the request to start homologous screen projection in step S509, a request to start the application is also sent, because the application must be started on the mobile phone side in order to implement homologous screen projection. Step S518 therefore involves interaction between the FWK and the application layer, i.e. ActivityStarter in the FWK invokes the application at the application layer.
S519, the application layer starts the application indicated by the application identifier 1.
And S520, the application layer sends a message of successful starting to ActivityTaskManagerServiceEx.
S521, activityTaskManagerServiceEx sends a message to SurfaceFlinger indicating that layer synthesis is to be initiated.
S522, surfaceFlinger synthesizes the application layer on the virtual display area to obtain the corresponding display data.
SurfaceFlinger receives graphic display data from multiple sources, composes it, and then sends it to the display device. For example, when an application is opened, three layers are commonly displayed: the status bar at the top, the navigation bar at the bottom or side, and the application interface. Each layer is updated and rendered independently, and SurfaceFlinger composes these layers and refreshes the resulting display data to the hardware display.
The application layer may include an application interface corresponding to the currently executed Activity. The display data (first data) may include information such as different display objects in the application layer, display positions and sizes of the display objects in the virtual display screen.
The virtual display area is the virtual display area created in step S415 and marked with Flag1. The content of the virtual display area is invisible to the user; the mobile phone can use it as a canvas that occupies a certain amount of storage space. For example, referring to fig. 21A and 21B, fig. 21A and 21B are schematic diagrams of composing an application layer. When Activity1 of application 1 is executed, as shown in fig. 21A, SurfaceFlinger may compose the corresponding application layer in real time in region 1 of the virtual display area. When Activity2 of application 1 is executed, as shown in fig. 21B, SurfaceFlinger may compose the corresponding application layer in real time in region 2 of the virtual display area. The virtual display area may occupy storage space 1, with region 1 corresponding to one segment of storage addresses in storage space 1 and region 2 corresponding to another segment, i.e. the virtual display area has a location attribute.
In addition, in the homologous screen-projection scenario, the application layer composed on the virtual display area marked with Flag1 can be used as display data and output to the notebook computer, so that the notebook computer presents the display data in the corresponding homologous screen-projection window.
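Below is a public-API approximation of how frames composed into the virtual display area could be captured as display data, using an ImageReader whose surface backs the virtual display; the embodiment's actual path goes through SurfaceFlinger and the screen-projection service modules.

```java
import android.graphics.PixelFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import android.view.Surface;

public final class CastFrameSource {
    public interface FrameSink {
        void onDisplayData(Image frame);  // e.g. encode and send to the second cast service module
    }

    /** The ImageReader's surface is handed to createVirtualDisplay(); each composed frame
     *  then becomes available here as "display data". */
    public static ImageReader createFrameReader(int width, int height, Handler handler,
                                                FrameSink sink) {
        ImageReader reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2);
        reader.setOnImageAvailableListener(r -> {
            Image frame = r.acquireLatestImage();
            if (frame != null) {
                try {
                    sink.onDisplayData(frame);   // hand the composed layer off for transmission
                } finally {
                    frame.close();
                }
            }
        }, handler);
        return reader;
    }

    public static Surface surfaceOf(ImageReader reader) {
        return reader.getSurface();  // pass this surface when creating the virtual display
    }
}
```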
S523, SurfaceFlinger sends the display data to the first screen-projection service module.
After SurfaceFlinger sends the display data to the first screen service module, as shown in fig. 17, the first screen service module may transfer the display data to the second screen service module, that is, perform the steps shown in S422 to S425.
In other embodiments, following S523, ActivityTaskManagerServiceEx may also send a notification of successful screen projection through ActivityStarter, and ActivityStarter then passes this notification to the message center module of the notebook computer over the third communication connection. S424 and S425 are then performed.
The embodiment of the application also provides electronic equipment, which can comprise: a memory and one or more processors. The memory is coupled to the processor. The memory is for storing computer program code, the computer program code comprising computer instructions. The computer instructions, when executed by the processor, cause the electronic device to perform the steps performed by the mobile phone or notebook computer in the above embodiments. Of course, the electronic device includes, but is not limited to, the memory and the one or more processors described above.
The embodiment of the application also provides a chip system which can be applied to the terminal equipment in the embodiment. As shown in fig. 22, the system-on-chip includes at least one processor 2201 and at least one interface circuit 2202. The processor 2201 may be a processor in an electronic device as described above. The processor 2201 and the interface circuit 2202 may be interconnected by wires. The processor 2201 may receive and execute computer instructions from the memory of the electronic device described above through the interface circuit 2202. The computer instructions, when executed by the processor 2201, cause the electronic device to perform the steps performed by the cell phone or notebook computer in the above embodiments. Of course, the system-on-chip may also include other discrete devices, which are not particularly limited in accordance with embodiments of the present application.
It will be clearly understood by those skilled in the art from the foregoing description that, for convenience and brevity, only the division into the above functional modules is illustrated; in practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. For the specific working processes of the systems, devices and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
The functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to perform all or part of the steps of the method described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard disk, read-only memory, random access memory, magnetic or optical disk, and the like.
The foregoing is merely a specific implementation of the embodiment of the present application, but the protection scope of the embodiment of the present application is not limited to this, and any changes or substitutions within the technical scope disclosed in the embodiment of the present application should be covered in the protection scope of the embodiment of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (31)

1. A method for homologous screen projection, which is applied to a first device and a second device, wherein the first device is in a screen-off state or a screen-locking state, and the method comprises:
The first device receives a first message, wherein the first message is an application message of a first application;
The first device forwards the first message to the second device, the second device displays a first control, and the first control is used for displaying the first message from the first device;
The second device responds to the operation of the user on the first control and sends a first request to the first device;
The first device responding to the first request, creating a first display area, wherein the first display area is a display area invisible in the first device; the first display area is not dormant and a screen locking layer is not drawn;
the first device starts the first application, wherein the first application comprises a first process, and the first process is not dormant;
The first device sends first data to the second device after drawing an application interface of the first process in the first display area;
the second device displays a first window in response to the first data, wherein the first window displays display content drawn in the first display area.
2. The method of claim 1, wherein the first request carries an application identifier of the first application, and wherein the method further comprises, prior to the first device launching the first application:
The first device marks a first label to a first process of the first application in response to the first request;
In the case that the first process is marked with the first label, the first device configures the first process not to sleep in a life cycle, and configures an application interface of the first process to allow drawing on the first display area in the life cycle.
3. The method of claim 1, wherein prior to the sending the first data to the second device, the method further comprises:
the first device generates the first data according to an application interface drawn in the first display area, wherein the first data comprises an application layer corresponding to the application interface.
4. The method of claim 1, wherein the second device is responsive to a user operation of the first control, the method further comprising, prior to the sending the first request to the first device:
The second device displays a second window, wherein the second window is a distributed authentication window;
during the display of the second window, the second device receives password information input by a user;
The second device sends the password information to the first device;
And when the password information is matched with an unlocking key preset in the second device, sending first response information to the second device, wherein the first response information indicates that the second device passes distributed authentication.
5. The method of claim 4, wherein prior to the second device displaying a second window, the method further comprises:
the second device determines that the unlock key has been configured in the first device.
6. The method of claim 4 or 5, wherein prior to the second device displaying a second window, the method further comprises:
The second device obtains a first time point adjacent to the last time the first response information is received;
The second device determines that a first time interval is greater than a first threshold, the first time interval being an interval between the first point in time and a current system time.
7. The method of claim 1, wherein prior to said sending the first request to the first device, the method further comprises:
and the second device determines that an unlocking key is not configured in the first device, wherein the unlocking key is a key for releasing the screen locking state of the first device.
8. The method of claim 1, wherein prior to said sending the first request to the first device, the method further comprises:
the second device determines that an unlocking key is configured in the first device;
the second device obtains a first time point, wherein the first time point is the system time adjacent to the last time of receiving first response information fed back by the first device, and the first response information indicates that password information sent by the second device is matched with an unlocking key built in the first device;
The second device determines that a second time interval is not greater than a first threshold, the second time interval being an interval between the first point in time and a current system time.
9. The method of claim 1, wherein after creating the first display area, the method further comprises:
the first device marks the first display area with a second label;
After determining that the first display area is marked with the second label, the first device configures the first display area not to sleep, and configures not to draw a lockscreen layer on the first display area.
10. The method according to claim 1, wherein the method further comprises:
the second device responds to the operation of the user for indicating to view the message list, a third window is displayed, the third window comprises a second control, the second control displays a second message from the first device, and the second message is an application message of a second application;
And under the condition that the second application in the first device starts an application lock, the second device responds to the operation of the user on the second control, and displays first reminding information, wherein the first reminding information indicates that the second application is protected by the application lock and cannot be projected.
11. The method according to claim 1, wherein the method further comprises:
The second device responds to the operation of the user for indicating to view the message list, and a third window is displayed, wherein the third window comprises a third control, and third information from the third device is displayed on the third control;
And under the condition that a communication channel for screen projection is occupied, the second device responds to the operation of the user on the third control, and displays second reminding information which indicates that the problem of network connection conflict exists currently.
12. The method according to claim 1, wherein the method further comprises:
The second device responds to the operation of the user for indicating to view the message list, and a third window is displayed, wherein the third window comprises a third control, and third information from the third device is displayed on the third control;
And under the condition that the network quality between the second equipment and the third equipment does not meet the preset condition, the second equipment responds to the operation of the user on the third control, and third reminding information is displayed, wherein the third reminding information indicates that the network quality influences the screen throwing.
13. The method of claim 1, wherein the first device continues to maintain the lock or off screen state if the first device receives an operation from a user indicating to unlock the lock or off screen state during the display of the first window by the second device.
14. The method of claim 1, wherein after the second device displays the first window, the method further comprises:
and the first equipment responds to the operation of the user for indicating the unlocking or screen-off state, and displays an application interface corresponding to the first process.
15. A method for homologous screen projection, which is applied to a first device, the first device is in communication connection with a second device, and the first device is in a screen-off state or a screen-locking state, the method comprising:
The first device receives a first message;
The first device forwards a first message to the second device, and instructs the second device to display a first control, wherein the first control is used for displaying the first message;
the first device responding to a first request, and creating a first display area, wherein the first display area is a display area invisible in the first device; the first display area is not dormant and a screen locking layer is not drawn; the first request is a request sent to the first device by the second device in response to the operation of the first control by the user;
the first device starts a first application, wherein the first message is an application message of the first application, the first application comprises a first process, and the first process is not dormant;
And after the first device draws the application interface of the first process in the first display area, sending first data to the second device, and indicating the second device to display a first window, wherein the first window displays the display content drawn in the first display area.
16. The method of claim 15, wherein the first request carries an application identifier of the first application, and wherein the method further comprises, prior to the first device launching the first application:
The first device marks a first label to a first process of the first application in response to the first request;
In the case that the first process is marked with the first label, the first device configures the first process not to sleep in a life cycle, and configures an application interface of the first process to allow drawing on the first display area in the life cycle.
17. The method of claim 15, wherein prior to the sending the first data to the second device, the method further comprises:
the first device generates the first data according to an application interface drawn in the first display area, wherein the first data comprises an application layer corresponding to the application interface.
18. The method of claim 15, wherein after creating the first display area, the method further comprises:
the first device marks the first display area with a second label;
After determining that the first display area is marked with the second label, the first device configures the first display area not to sleep, and configures not to draw a lockscreen layer on the first display area.
19. The method of claim 15, wherein after the second device displays the first window, the method further comprises:
and the first equipment responds to the operation of the user for indicating the unlocking or screen-off state, and displays an application interface corresponding to the first process.
20. A method for homologous screen projection, which is applied to a second device, the second device is in communication connection with a first device, and the first device is in a screen-off state or a screen-locking state, the method comprising:
The second device receives a first message forwarded by the first device;
the second device displays a first control, the first control being used to display a first message from the first device;
The second device responds to the operation of the user on the first control and sends a first request to the first device; the first request is used for instructing the first device to create a first display area, and the first display area is a display area invisible in the first device; the first display area is not dormant and a screen locking layer is not drawn; the first request is further used for indicating the first device to start a first application, the first message is an application message of the first application, the first application comprises a first process, and the first process is not dormant; the application interface of the first process is drawn in the first display area;
the second device receives first data sent by the first device;
the second device displays a first window in response to the first data, wherein the first window displays display content drawn in the first display area.
21. The method of claim 20, wherein the second device is responsive to a user operation of the first control, the method further comprising, prior to the sending the first request to the first device:
The second device displays a second window, wherein the second window is a distributed authentication window;
during the display of the second window, the second device receives password information input by a user;
The second device sends the password information to the first device;
And when the password information is matched with an unlocking key preset in the second device, sending first response information to the second device, wherein the first response information indicates that the second device passes distributed authentication.
22. The method of claim 21, wherein prior to the second device displaying a second window, the method further comprises:
the second device determines that the unlock key has been configured in the first device.
23. The method of claim 21 or 22, wherein prior to the second device displaying a second window, the method further comprises:
The second device obtains a first time point adjacent to the last time the first response information is received;
The second device determines that a first time interval is greater than a first threshold, the first time interval being an interval between the first point in time and a current system time.
24. The method of claim 20, wherein prior to said sending the first request to the first device, the method further comprises:
and the second device determines that an unlocking key is not configured in the first device, wherein the unlocking key is a key for releasing the screen locking state of the first device.
25. The method of claim 20, wherein prior to said sending the first request to the first device, the method further comprises:
the second device determines that an unlocking key is configured in the first device;
the second device obtains a first time point, wherein the first time point is the system time adjacent to the last time of receiving first response information fed back by the first device, and the first response information indicates that password information sent by the second device is matched with an unlocking key built in the first device;
The second device determines that a second time interval is not greater than a first threshold, the second time interval being an interval between the first point in time and a current system time.
26. The method of claim 20, wherein the method further comprises:
the second device, in response to an operation of the user for instructing viewing of a message list, displays a third window, wherein the third window comprises a second control, the second control displays a second message from the first device, and the second message is an application message of a second application;
in a case where the second application in the first device has an application lock enabled, the second device, in response to an operation of the user on the second control, displays first reminding information, wherein the first reminding information indicates that the second application is protected by the application lock and cannot be projected.
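(A sketch of the application-lock check of claim 26 on the second device. The callbacks are assumptions; in practice the app-lock state of the second application would be reported by the first device over the projection channel.)

    // Handle a tap on the second control that shows the second message.
    fun onMessageControlTapped(
        packageName: String,
        isAppLockEnabled: (String) -> Boolean,   // state reported by the first device
        startProjection: (String) -> Unit,
        showReminder: (String) -> Unit
    ) {
        if (isAppLockEnabled(packageName)) {
            // "First reminding information" of claim 26.
            showReminder("This application is protected by an application lock and cannot be projected.")
        } else {
            startProjection(packageName)
        }
    }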
27. The method of claim 20, wherein the method further comprises:
the second device, in response to an operation of the user for instructing viewing of the message list, displays a third window, wherein the third window comprises a third control, and third information from a third device is displayed on the third control;
in a case where a communication channel for screen projection is occupied, the second device, in response to an operation of the user on the third control, displays second reminding information, wherein the second reminding information indicates that there is currently a network connection conflict.
28. The method of claim 20, wherein the method further comprises:
the second device, in response to an operation of the user for instructing viewing of the message list, displays a third window, wherein the third window comprises a third control, and third information from a third device is displayed on the third control;
in a case where the network quality between the second device and the third device does not meet a preset condition, the second device, in response to an operation of the user on the third control, displays third reminding information, wherein the third reminding information indicates that the network quality affects screen projection.
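(A combined sketch of the prechecks behind claims 27 and 28: before projecting content related to the third device, the second device verifies that the screen-projection channel is free and that the link quality meets the preset condition. The thresholds and the way round-trip time and packet loss are measured are assumptions.)

    enum class CastPrecheck { OK, CHANNEL_OCCUPIED, POOR_NETWORK }

    fun precheckThirdDeviceCast(
        channelOccupied: Boolean,      // is the communication channel for screen projection in use?
        rttMillis: Int,                // measured round-trip time to the third device
        packetLossPercent: Double,     // measured packet loss to the third device
        maxRttMillis: Int = 300,       // illustrative "preset condition"
        maxLossPercent: Double = 5.0
    ): CastPrecheck = when {
        // Claim 27: channel occupied -> show the second reminding information.
        channelOccupied -> CastPrecheck.CHANNEL_OCCUPIED
        // Claim 28: network quality below the preset condition -> third reminding information.
        rttMillis > maxRttMillis || packetLossPercent > maxLossPercent -> CastPrecheck.POOR_NETWORK
        else -> CastPrecheck.OK
    }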
29. The method of claim 20, wherein, in a case where the first device receives, during the display of the first window by the second device, an operation of the user for instructing release of the screen-locked or screen-off state, the first device continues to maintain the screen-locked or screen-off state.
30. An electronic device, comprising one or more processors and a memory; the memory is coupled to the one or more processors and is configured to store computer program code, the computer program code comprising computer instructions; and when the computer instructions are executed by the one or more processors, the electronic device performs the method of any one of claims 15-29.
31. A computer storage medium, comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 15-29.
CN202211340697.8A 2022-10-29 2022-10-29 Homologous screen projection method and electronic equipment Active CN116743971B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211340697.8A CN116743971B (en) 2022-10-29 2022-10-29 Homologous screen projection method and electronic equipment

Publications (2)

Publication Number Publication Date
CN116743971A (en) 2023-09-12
CN116743971B (en) 2024-04-16

Family

ID=87906674

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211340697.8A Active CN116743971B (en) 2022-10-29 2022-10-29 Homologous screen projection method and electronic equipment

Country Status (1)

Country Link
CN (1) CN116743971B (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant