CN116127540A - Screen sharing method, electronic device and storage medium - Google Patents

Screen sharing method, electronic device and storage medium

Info

Publication number
CN116127540A
CN116127540A (application CN202111341477.2A)
Authority
CN
China
Prior art keywords
page
shared
screen
interface
split
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111341477.2A
Other languages
Chinese (zh)
Inventor
张瑾婷
辛颖
单玉世南
鲁良兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority claimed from CN202111341477.2A
Publication of CN116127540A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82Protecting input, output or interconnection devices
    • G06F21/84Protecting input, output or interconnection devices output devices, e.g. displays or monitors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Human Computer Interaction (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application provide a screen sharing method, an electronic device, and a storage medium, relating to the field of information technology. The method includes: in response to a detected screen sharing operation by a user, establishing a screen sharing connection between a first device and a second device; determining a first pre-shared page; detecting data in the first pre-shared page; if the first device detects that the first pre-shared page contains sensitive data, determining a first sending page, where the first sending page includes shared content determined by the user; and in response to the first sending page being determined, sending the first sending page to the second device for display. The method provided by the embodiments of the present application can protect the privacy of user data while sharing the screen.

Description

Screen sharing method, electronic device and storage medium
Technical Field
The embodiment of the application relates to the technical field of information, in particular to a screen sharing method, electronic equipment and a storage medium.
Background
In a multi-device scenario, a mobile phone may serve as a user's private device, and in some cases also as a semi-office device that stores work-related files. Users therefore often want multi-device interaction in which others view the phone screen, for example by casting the phone screen to a large screen or a computer, or by sharing the screen during a video call. However, the phone is still the user's personal device and stores a great deal of private information; if others watch the phone screen, privacy problems easily arise.
In existing screen casting, the entire phone screen is cast to the large screen. If the user operates improperly, for example by mistakenly entering an interface containing private content while operating the phone, the user's private information is leaked directly to the public screen. A typical example is sharing pictures to a large screen: if the files to be shared have not been sorted in advance, then after sharing the current picture the user must search the album again for the next one, and during that search the private information in the album is revealed on the public screen.
Current screen sharing schemes likewise share the screen completely with others, so that what others watch is exactly the content the user sees on the phone screen. Existing screen sharing scenarios (screen casting/shared screens) are therefore prone to privacy and security issues.
Disclosure of Invention
The embodiments of the present application provide a screen sharing method, an electronic device, and a storage medium, offering a screen sharing mode that can protect the privacy of user data while the screen is shared.
In a first aspect, an embodiment of the present application provides a screen sharing method, which is applied to an electronic device, and includes:
In response to a detected screen sharing operation by a user, establishing a screen sharing connection between a first device and a second device. The first device may be a mobile device such as a mobile phone or a tablet, and the second device may be a mobile device such as a mobile phone or a tablet, or a large-screen device such as a television or a computer.
Determining a first pre-shared page. The pre-shared page is a page waiting for screen sharing, that is, a page not yet displayed on the second device.
Detecting data in the first pre-shared page, where the detection may target sensitive data.
If the first device detects that the first pre-shared page contains sensitive data, determining a first sending page, where the first sending page includes shared content determined by the user. The first sending page is the page that the first device sends to the second device for display; after receiving it, the second device decodes and displays it, so the first sending page may also be regarded as the actual shared page. It will be appreciated that once the second device has received and displayed the first sending page, the first sending page becomes a shared page.
In response to the first sending page being determined, sending the first sending page to the second device for display.
In the embodiments of the present application, private data is detected on the sharing interface during screen sharing, and when sensitive data is detected the user selects the content to share, so that the privacy of user data can be protected while the screen is shared.
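The patent does not specify how sensitive data is detected. As one hedged sketch, the text extracted from a pre-shared page could be screened against patterns for common kinds of private data; the pattern set, regexes, and function name below are illustrative assumptions, not the patent's actual detector:

```python
import re

# Illustrative patterns only; a real device would use its own classifiers.
# The regexes below are assumptions for the sketch.
SENSITIVE_PATTERNS = {
    "phone_number": re.compile(r"\b1[3-9]\d{9}\b"),  # mainland-CN style mobile number
    "bank_card": re.compile(r"\b\d{16,19}\b"),       # bank card number
    "id_number": re.compile(r"\b\d{17}[\dXx]\b"),    # resident ID number
}

def detect_sensitive_data(page_text: str) -> list:
    """Return the kinds of sensitive data found in a pre-shared page's text."""
    return [kind for kind, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(page_text)]
```

If the returned list is non-empty, the device would proceed to let the user determine the first sending page instead of sharing the page as-is.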
In one possible implementation manner, the method further includes:
and if the first device detects that the first pre-shared page does not contain sensitive data, determining the first pre-shared page as a first sending page.
In the embodiments of the present application, to improve screen sharing efficiency, a pre-shared page that does not contain sensitive data may be shared directly.
In one possible implementation manner, the method further includes:
if the first device detects that the first pre-shared page contains sensitive data, the first device displays a shared content selection page, and the shared content selection page is used for determining a first sending page.
In the embodiments of the present application, by selecting the content within the pre-shared page, the user can effectively avoid leaking private data.
In one possible implementation manner, before displaying the shared content selection page, the method further includes:
Sending a preset page to the second device for display as a second sending page, where the second sending page does not contain sensitive data.
In the embodiments of the present application, before the actual shared page is shared, a temporary page may be sent first. This reminds the user that a screen sharing scenario has been entered and that the currently shared data may involve privacy, and thereby helps avoid leaking user privacy.
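The decision described above, sending the pre-shared page directly when it is clean but sending a neutral preset page first while the user selects content, can be sketched as follows. The dictionary page representation and field names are hypothetical simplifications of the patent's pages:

```python
def choose_page_to_send(pre_shared_page: dict, placeholder_page: dict) -> dict:
    """Decide what the first device sends to the second device.

    If the pre-shared page contains sensitive data, a neutral preset
    (placeholder) page is sent while the user picks the shared content;
    otherwise the pre-shared page itself becomes the sending page.
    """
    if pre_shared_page.get("has_sensitive_data"):
        # Temporary page: keeps private content off the public screen
        # and reminds the user that sharing is active.
        return placeholder_page
    return pre_shared_page
```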
In one possible implementation manner, the method further includes:
in response to the detected selection of shared content by the user in the shared content selection page, the first device displays a first split-screen interface, where the first split-screen interface includes a first split-screen area and a second split-screen area;
the first split-screen area is used for determining the first sending page.
In the embodiments of the present application, displaying the content of the first device's display interface in split screens can improve the user's operation experience.
In order to improve the operation experience of the user, in one possible implementation manner, the first split screen area displays details of the content to be shared, and the second split screen area displays a summary of the content to be shared.
In one possible implementation manner, the method further includes:
In response to a detected user operation in the second split-screen area, the display content in the second split-screen area is updated, while the first device does not update the first sending page.
In the embodiments of the present application, during screen sharing the first device displays in split screens, and the user can operate in the split screen that is not being shared without the peer device perceiving it, which improves the user's operation experience.
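The split-screen behavior described above, where only the first split-screen area feeds the page sent to the peer while operations in the second area stay local, can be modeled with a minimal sketch (the class and attribute names are assumptions):

```python
class SplitScreenShare:
    """Hypothetical model of the first split-screen interface."""

    def __init__(self, shared_content: str):
        self.first_area = shared_content   # mirrored to the second device
        self.second_area = shared_content  # local-only working area
        self.sent_page = shared_content    # what the peer currently displays

    def operate_second_area(self, new_content: str) -> None:
        # The peer device never perceives this operation.
        self.second_area = new_content

    def update_shared_content(self, new_content: str) -> None:
        # Only changes to the first area propagate to the sending page.
        self.first_area = new_content
        self.sent_page = new_content
```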
In one possible implementation manner, the method further includes:
in response to a detected change of the shared page, updating the first pre-shared page to a second pre-shared page, where the second pre-shared page is the changed shared page; or
in response to detecting that the shared page is unchanged while the display interface of the first device has changed, displaying, by the first device, a second split-screen interface that includes a third split-screen area and a fourth split-screen area, where the third split-screen area displays the shared page and the fourth split-screen area displays the changed content of the first device's display interface;
where the shared page characterizes a page that the first device has shared with the second device.
In the embodiments of the present application, judging whether the shared page has changed determines whether to display split screens or to re-determine the pre-shared page, which improves the user's operation experience.
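The branching just described can be summarized in a small decision function. The return labels and parameter names are illustrative assumptions, not terms from the patent:

```python
def on_display_event(shared_page: str, new_display: str,
                     shared_page_changed: bool) -> tuple:
    """Sketch of the update decision for a display change during sharing.

    Returns ("update_pre_shared", page) when the shared page itself changed,
    or ("split_screen", (shared, local)) when only the local display changed.
    """
    if shared_page_changed:
        # The changed page becomes the second pre-shared page and would go
        # through sensitive-data detection again before being sent.
        return ("update_pre_shared", new_display)
    # Shared page unchanged but the local interface changed: show a second
    # split-screen interface so local browsing stays off the peer's screen.
    return ("split_screen", (shared_page, new_display))
```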
In one possible implementation manner, the method further includes:
if the first device detects that the second pre-shared page contains sensitive data, the first device does not update the first sending page.
In the embodiments of the present application, after detecting that the updated pre-shared page contains sensitive data, the first device does not update the sending page; that is, the second device keeps displaying the originally shared page, which improves the security of the user's private data.
In one possible implementation manner, the first pre-sharing page includes a first split-screen page and a second split-screen page, and further includes:
if the first device detects that the first pre-shared page contains sensitive data, displaying a split screen selection page, wherein the split screen selection page is used for determining a first sending page;
where the first sending page is determined by the first split-screen page, by the second split-screen page, or by the first pre-shared page.
In the embodiments of the present application, when the pre-shared page is a split-screen page, a split-screen selection page is provided for the user to select a split-screen page, which strengthens the user's control during screen casting and facilitates privacy protection.
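The split-screen selection page described above maps the user's choice to the first sending page. A minimal sketch, where the choice labels are assumptions:

```python
def resolve_send_page(first_split: str, second_split: str,
                      whole_page: str, user_choice: str) -> str:
    """Map a choice on the split-screen selection page to the sending page."""
    options = {
        "first_split": first_split,    # share only the first split-screen page
        "second_split": second_split,  # share only the second split-screen page
        "whole": whole_page,           # share the whole pre-shared page
    }
    return options[user_choice]
```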
In a second aspect, an embodiment of the present application provides a screen sharing apparatus, applied to a first device, including:
the connection module is used for responding to the detected screen sharing operation of the user, and the first equipment and the second equipment establish screen sharing connection;
the first determining module is used for determining a first pre-shared page;
the first detection module is used for detecting data in the first pre-shared page;
the second determining module is used for determining a first sending page if the first device detects that the first pre-shared page contains sensitive data; the first sending page comprises shared content determined by a user;
the first sending module is configured to send the first sending page to the second device for display in response to the first sending page being determined.
In one possible implementation manner, the apparatus further includes:
and the third determining module is used for determining the first pre-shared page as the first sending page if the first device detects that the first pre-shared page does not contain sensitive data.
In one possible implementation manner, the apparatus further includes:
the first display module is configured to display a shared content selection page by the first device if the first device detects that the first pre-shared page includes sensitive data, where the shared content selection page is used to determine a first sending page.
In one possible implementation manner, the apparatus further includes:
the second sending module is configured to send a preset page to the second device for display as a second sending page, where the second sending page does not contain sensitive data.
In one possible implementation manner, the apparatus further includes:
the second display module is used for responding to the detected selection of the shared content by the user in the shared content selection page, and the first equipment displays a first split screen interface which comprises a first split screen area and a second split screen area;
the first split screen area is used for determining a first sending page.
In one possible implementation manner, the first split screen area displays details of the content to be shared, and the second split screen area displays a summary of the content to be shared.
In one possible implementation manner, the apparatus further includes:
and the first updating module is used for responding to the detected operation of the user in the second split screen area, updating the display content in the second split screen area, and the first device does not update the first transmission page.
In one possible implementation manner, the apparatus further includes:
the second detection module is configured to update the first pre-shared page to a second pre-shared page in response to a detected change of the shared page, where the second pre-shared page is the changed shared page; or
to display, in response to detecting that the shared page is unchanged while the display interface of the first device has changed, a second split-screen interface that includes a third split-screen area and a fourth split-screen area, where the third split-screen area displays the shared page and the fourth split-screen area displays the changed content of the first device's display interface;
wherein the shared page is used to characterize a page that the first device has shared with the second device.
In one possible implementation manner, the apparatus further includes:
and the second updating module is used for not updating the first sending page by the first device if the first device detects that the second pre-shared page contains sensitive data.
In one possible implementation manner, the first pre-shared page includes a first split-screen page and a second split-screen page, and the apparatus further includes:
the third display module is used for displaying a split screen selection page if the first device detects that the first pre-shared page contains sensitive data, wherein the split screen selection page is used for determining a first sending page;
where the first sending page is determined by the first split-screen page, by the second split-screen page, or by the first pre-shared page.
In a third aspect, an embodiment of the present application provides a first device, including:
a memory for storing computer program code, the computer program code comprising instructions that, when read from the memory by the first device, cause the first device to perform the steps of:
responding to the detected screen sharing operation of the user, and establishing screen sharing connection between the first equipment and the second equipment;
determining a first pre-shared page;
detecting data in a first pre-shared page;
if the first device detects that the first pre-shared page contains sensitive data, determining a first sending page; the first sending page comprises shared content determined by a user;
and in response to the first sending page being determined, sending the first sending page to the second device for display.
In one possible implementation manner, the instructions, when executed by the first device, cause the first device to further perform the following steps:
and if the first device detects that the first pre-shared page does not contain sensitive data, determining the first pre-shared page as a first sending page.
In one possible implementation manner, the instructions, when executed by the first device, cause the first device to further perform the following steps:
if the first device detects that the first pre-shared page contains sensitive data, the first device displays a shared content selection page, and the shared content selection page is used for determining a first sending page.
In one possible implementation manner, before the step of displaying the shared content selection page is performed by the first device, the instruction further performs the following steps:
sending a preset page to the second device for display as a second sending page, where the second sending page does not contain sensitive data.
In one possible implementation manner, the instructions, when executed by the first device, cause the first device to further perform the following steps:
after the detected user selects the shared content in the shared content selection page, the first device displays a first split screen interface, wherein the first split screen interface comprises a first split screen area and a second split screen area;
the first split screen area is used for determining a first sending page.
In one possible implementation manner, the first split screen area displays details of the content to be shared, and the second split screen area displays a summary of the content to be shared.
In one possible implementation manner, the instructions, when executed by the first device, cause the first device to further perform the following steps:
in response to the detected user operation in the second split screen area, the display content in the second split screen area is updated, and the first device does not update the first transmission page.
In one possible implementation manner, the instructions, when executed by the first device, cause the first device to further perform the following steps:
in response to a detected change of the shared page, updating the first pre-shared page to a second pre-shared page, where the second pre-shared page is the changed shared page; or
in response to detecting that the shared page is unchanged while the display interface of the first device has changed, displaying, by the first device, a second split-screen interface that includes a third split-screen area and a fourth split-screen area, where the third split-screen area displays the shared page and the fourth split-screen area displays the changed content of the first device's display interface;
wherein the shared page is used to characterize a page that the first device has shared with the second device.
In one possible implementation manner, the instructions, when executed by the first device, cause the first device to further perform the following steps:
if the first device detects that the second pre-shared page contains sensitive data, the first device does not update the first sending page.
In one possible implementation manner, the first pre-shared page includes a first split page and a second split page, and when the instruction is executed by the first device, the first device is further caused to perform the following steps:
if the first device detects that the first pre-shared page contains sensitive data, displaying a split screen selection page, wherein the split screen selection page is used for determining a first sending page;
where the first sending page is determined by the first split-screen page, by the second split-screen page, or by the first pre-shared page.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored therein, which when run on a computer, causes the computer to perform the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program which, when executed by a computer, performs the method of the first aspect.
In one possible design, the program in the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of an application scenario provided in an embodiment of the present application;
fig. 3 is a schematic flow chart of a screen sharing method according to an embodiment of the present application;
fig. 4 is a schematic view of a screen-projection effect provided in an embodiment of the present application;
fig. 5a and fig. 5b are schematic views of the effect of a screen-projection scenario provided in an embodiment of the present application;
fig. 6a and fig. 6b are schematic views of the effect of another screen-projection scenario provided in an embodiment of the present application;
fig. 7a and fig. 7b are schematic views of the effect of still another screen-projection scenario provided in an embodiment of the present application;
fig. 8a to fig. 8c are schematic views of the effect of still another screen-projection scenario provided in an embodiment of the present application;
fig. 9a and fig. 9b are schematic views of the effect of still another screen-projection scenario provided in an embodiment of the present application;
fig. 10a and fig. 10b are schematic views of the effect of still another screen-projection scenario provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of a screen sharing device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" indicates an "or" relationship; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone.
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
In a multi-device scenario, a mobile phone may serve as a user's private device, and in some cases also as a semi-office device that stores work-related files. Users therefore often want multi-device interaction in which others view the phone screen, for example by casting the phone screen to a large screen or a computer, or by sharing the screen during a video call. However, the phone is still the user's personal device and stores a great deal of private information; if others watch the phone screen, privacy problems easily arise.
In existing screen casting, the entire phone screen is cast to the large screen. If the user operates improperly, for example by mistakenly entering an interface containing private content while operating the phone, the user's private information is leaked directly to the public screen. A typical example is sharing pictures to a large screen: if the files to be shared have not been sorted in advance, then after sharing the current picture the user must search the album again for the next one, and during that search the private information in the album is revealed on the public screen.
Current screen sharing schemes likewise share the screen completely with others, so that what others watch is exactly the content the user sees on the phone screen. Existing screen sharing scenarios (screen casting/shared screens) are therefore prone to privacy and security issues.
Based on the above-mentioned problems, the embodiment of the present application proposes a screen sharing method, which is applied to the electronic device 100. The electronic device 100 may be a mobile terminal having a display screen. A mobile terminal may also be called a terminal device, user Equipment (UE), access terminal, subscriber unit, subscriber station, mobile station, remote terminal, mobile device, user terminal, wireless communication device, user agent, or User Equipment. The embodiment of the present application does not particularly limit the specific form of the electronic device 100 that performs the technical scheme.
An exemplary electronic device provided in the following embodiments of the present application is first described below in conjunction with fig. 1. Fig. 1 shows a schematic configuration of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present invention is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music or answer a hands-free call through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals. When the electronic device 100 answers a telephone call or a voice message, the voice can be received by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or "mike", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak with the mouth close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the touch location based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
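The intensity-dependent dispatch described above can be modeled as a simple threshold check. The following sketch is purely illustrative (the threshold value, function name, and instruction names are assumptions, not taken from the patent):

```python
# Illustrative sketch: dispatching different instructions for touches at the
# same location based on touch intensity relative to a pressure threshold.

FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized pressure threshold


def handle_message_icon_touch(intensity: float) -> str:
    """Return the instruction executed for a touch on the short message icon."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_message"   # light press: view the short message
    return "new_message"        # firm press: create a new short message
```

A light press (e.g., intensity 0.2) yields the viewing instruction, while a firm press (e.g., 0.9) yields the creation instruction.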
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through reverse motion, so as to implement anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected open or closed state of the holster or of the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It may also be used to recognize the attitude of the electronic device, and is applied to landscape/portrait screen switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 can detect, by using the proximity light sensor 180G, that the user holds the electronic device 100 close to the ear, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
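The reflected-light decision above reduces to a threshold on the photodiode reading. The following toy model is an assumption for illustration only (the threshold value and function names are invented, and real implementations involve calibration and hysteresis):

```python
# Toy model of the proximity detection logic described above: sufficient
# reflected infrared light means an object is near the device.

REFLECTION_THRESHOLD = 0.3  # hypothetical normalized photodiode reading


def object_nearby(reflected_light: float) -> bool:
    """Sufficient reflected light implies an object near the device."""
    return reflected_light >= REFLECTION_THRESHOLD


def screen_should_turn_off(reflected_light: float, in_call: bool) -> bool:
    """Turn the screen off when the device is held to the ear during a call."""
    return in_call and object_nearby(reflected_light)
```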
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
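The layered temperature strategy above can be sketched as a set of threshold rules. This is a minimal model; the numeric thresholds and action names are assumptions, not values from the patent:

```python
# Illustrative sketch of the layered temperature-processing strategy:
# throttle when hot, heat the battery when cold, boost the battery output
# voltage when very cold.

HIGH_TEMP_C = 45.0       # hypothetical throttling threshold
LOW_TEMP_C = 0.0         # hypothetical battery-heating threshold
CRITICAL_LOW_C = -10.0   # hypothetical voltage-boost threshold


def thermal_actions(temp_c: float) -> list:
    """Return the protective actions triggered at the given temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < CRITICAL_LOW_C:
        actions.append("boost_battery_output_voltage")
    return actions
```

Note that the low-temperature rules stack: below the critical threshold, both battery heating and voltage boosting apply.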
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human voice part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate the charging state and changes in battery level, or to indicate messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to achieve contact with and separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The screen sharing method provided in the embodiment of the present application will now be described with reference to fig. 2 to 10 b.
Fig. 2 is a schematic view of an application scenario provided in an embodiment of the present application. As shown in fig. 2, the application scenario may include a first device 21 and a second device 22, where the first device 21 may be the electronic device 100, and the second device 22 may be an electronic device of the same type as the first device 21, for example, a mobile terminal such as a mobile phone or a tablet, or a large-screen electronic device such as a television or a computer. The first device 21 may store a large amount of user privacy data. The user may project the content stored on the first device 21 onto the screen of the second device 22 for viewing by other users; the projection may specifically take the form of screen casting, screen sharing (e.g., sharing during a video call), remote screen control, or the like. The second device 22 may establish a data connection with the first device 21 for receiving the data transmitted by the first device 21, and may display the data transmitted by the first device 21 on the screen of the second device 22, thereby implementing screen sharing.
Fig. 3 is a schematic flow chart of an embodiment of a screen sharing method according to an embodiment of the present application, where the method includes:
In step 301, a user operates a first device 21 to enter a screen sharing scene.
Specifically, the user can perform a screen sharing operation on the interface of the first device 21 for establishing a screen sharing connection between the first device 21 and the second device 22, whereby a screen sharing scene can be entered.
In a specific implementation, when a user wants to cast content on the first device 21 onto the second device 22, a screen-casting connection needs to be established between the first device 21 and the second device 22. The user can operate at least one of the two electronic devices (e.g., the first device 21 and the second device 22) to establish a connection between them, thereby establishing a screen-casting data channel. For example, if the user expects to cast content on a mobile phone onto a large screen (e.g., a television or an electronic whiteboard) in a conference room, the user may perform a screen-casting operation on the mobile phone, and a screen-casting connection relationship is established between the mobile phone and the large screen.
When the first device 21 establishes a screen-casting connection with the second device 22, a screen-casting scene is entered. The process of establishing the screen-casting connection may be consistent with existing screen-casting connection processes, and will not be described herein.
The above-described screen-casting process will now be exemplarily described with reference to fig. 4. As shown in fig. 4, the user may pull down a menu bar on the display interface of the first device 21, and when the screen-casting button is turned on, the first device 21 may search for nearby devices available for screen casting (for example, the second device 22). When the user selects the second device 22 as the target device, a screen-casting connection is established between the first device 21 and the second device 22. At this time, the first device 21 may cast its display interface onto the display screen of the second device 22, and the display interface of the second device 22 may display the content displayed by the first device 21. It will be appreciated that screen casting may be started on the main screen interface (e.g., the desktop) of the first device 21, or may be started on an internal interface of an application, which is not particularly limited in the embodiment of the present application.
In step 302, the first device 21 determines the pre-shared page and the security level of the data in the pre-shared page, and determines the sending page according to the security level of the data in the pre-shared page.
Specifically, the pre-shared page may be a page displayed on the first device 21 that the user intends to share. It will be appreciated that the pre-shared page has not yet been sent to the second device 22 for display. In a specific implementation, the pre-shared page may include the page on which the user starts screen casting, and pages that the first device 21 subsequently displays according to the user's operation instructions during screen casting.
The sending page may be a page that the first device 21 transmits to the second device 22 for display; after the second device 22 receives the sending page transmitted by the first device 21, the sending page may be displayed on the screen of the second device 22. If the first device 21 sends the pre-shared page to the second device 22 for screen-casting sharing, the pre-shared page is the sending page. In existing screen-casting schemes, the pre-shared page is always the sending page. In the embodiment of the present application, before the first device 21 transmits the sending page, security detection may be performed on the content of the pre-shared page so that the privacy data in the pre-shared page may be hidden; therefore, the sending page and the pre-shared page may not be consistent.
After the first device 21 establishes the screen-casting connection with the second device 22, the first device 21 may determine whether to use the pre-shared page as the sending page according to whether the pre-shared page contains highly sensitive data. If no highly sensitive data is present, the pre-shared page is taken as the sending page and is cast directly. If highly sensitive data is present, the first device 21 does not cast the pre-shared page, and determines the sending page according to a preset rule. Determining the sending page according to the preset rule may include: deciding whether to send a page at all, and selecting which specific page to send. Illustratively, if the first device 21 has cast at least one page to the second device 22 prior to the current cast, no new sending page is determined, and the second device 22 maintains the current shared page. If this is the first cast, a temporary page can be determined as the sending page according to the preset rule and sent to the second device 22 for display.
Wherein the shared page may be a page that is sent to the second device 22 for display. The shared page is synchronously displayed on the screens of the first device 21 and the second device 22.
The pre-shared page, the sending page, and the shared page described above are now described in connection with fig. 4. As shown in fig. 4, if the user starts screen casting on the home screen of the first device 21, the home screen interface is the pre-shared page. If the user starts screen casting on the photo preview interface of the gallery, the photo preview interface of the gallery is the pre-shared page. Then, after the user starts screen casting, the home screen interface is sent to the second device 22, at which point the home screen interface is the sending page. When the second device 22 displays the home screen interface sent by the first device 21, the home screen interface becomes a shared page. After the second device 22 displays the home screen interface sent by the first device 21, the user may click on the gallery on the first device 21, and the first device 21 may determine a new display interface (for example, the photo preview interface of the gallery) according to the operation instruction of the user, whereupon the pre-shared page is switched to the photo preview interface of the gallery.
It will be appreciated that in existing screen-casting schemes, when the first device 21 enters the screen-casting scene, the display page of the current screen is sent to the second device 22 for display; thus, the pre-shared page is the sending page. In the embodiment of the present application, the first device 21 may determine whether to use the pre-shared page as the sending page according to whether the pre-shared page contains highly sensitive data. Specifically, the data included in the pre-shared page is first determined, and the security level of the data is judged. Whether the pre-shared page contains highly sensitive data is then judged according to the security level of the data, and finally, whether the pre-shared page is used as the sending page is determined according to whether it contains highly sensitive data. The security level of the data may be determined by the first device 21 according to a preset rule, which may be stored in the first device 21 in advance.
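The security-level judgment described above can be sketched as follows. This is a minimal illustration: the level names and the mapping from data types to levels are hypothetical assumptions, since the application does not specify the preset rules themselves.

```python
# Hypothetical mapping from data type to security level; a real preset
# rule set would be stored on the first device in advance.
SECURITY_LEVELS = {
    "app_icon": "low",
    "wallpaper": "low",
    "photo_thumbnail": "high",
    "chat_message": "high",
}

def page_has_highly_sensitive_data(page_items):
    # A pre-shared page is highly sensitive if any item on it has a "high" level.
    return any(SECURITY_LEVELS.get(item, "low") == "high" for item in page_items)
```

For example, a home screen holding only APP icons would be judged non-sensitive, while a gallery preview page containing photo thumbnails would be judged highly sensitive.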
In a specific implementation, the process of determining to send the page according to the preset rule may include the following sub-steps:
In step 3021, if the pre-shared page does not include highly sensitive data, the pre-shared page is taken as the sending page and is cast directly.
Specifically, the first device 21 may determine whether the pre-shared page includes highly sensitive data; if there is no highly sensitive data, it may use the pre-shared page as the sending page and cast the pre-shared page directly. Referring to fig. 4, when the user starts screen casting on the main screen, the main screen interface is the pre-shared page. At this time, the pre-shared page only includes APP icons, and such data (for example, APP icons) does not involve the privacy of the user, so the security level of the data is low; that is, the current pre-shared page does not include highly sensitive data. The first device 21 may therefore directly take the pre-shared page as the sending page and transmit the pre-shared page (the home screen interface) to the second device 22 for display.
In step 3022, if the pre-shared page includes highly sensitive data, the sending page is determined according to a preset rule.
Specifically, if the pre-shared page includes highly sensitive data, the first device 21 does not cast the pre-shared page, and may determine the sending page according to a preset rule, which includes deciding whether to send a page and selecting which specific page to send.
In a specific implementation, deciding whether to send a page may be as follows: if the first device 21 has successfully performed at least one screen-casting transmission before the current one (each transmission of a page counting as one screen-casting process), at least one page has been cast to the second device 22, which means that the second device 22 is currently displaying a shared page. In this case, the first device 21 does not determine a new sending page, and the second device 22 maintains display of the current shared page.
An exemplary description will now be given with reference to fig. 5a and 5b. As shown in fig. 5a, the interface 500 is the mobile phone home screen interface of the first device 21; that is, the interface 500 is cast onto the second device 22. At this time, the user may click on the icon of the gallery APP on the interface 500 of the first device 21, and the page to be displayed by the first device 21 is then the photo preview interface 510 of the gallery top page of fig. 5b. It will be appreciated that at this point the interface 510 is not actually displayed, and thus its content is shown in phantom. The interface 510 is the current pre-shared page.
The interface 510 includes thumbnails of a large number of photos, and the photo thumbnails are highly sensitive data. Thus, the first device 21 will temporarily interrupt the process of sending the interface 510 (the pre-shared page) to the second device 22. Meanwhile, since the first device 21 has previously cast a page (e.g., the interface 500) to the second device 22, and the second device 22 is displaying the interface 500, after the first device 21 temporarily interrupts sending the interface 510, no new sending page is determined or transmitted to the second device 22; that is, the second device 22 maintains display of the shared page (e.g., the interface 500) of the previous screen-casting process.
In addition, selecting the specific page to send may be as follows: if the current screen-casting process is the first one, a temporary page can be determined as the sending page according to a preset rule and sent to the second device 22 for display.
The temporary page may be a default page in the first device 21; for example, the default page may be the home screen interface of the first device 21. Alternatively, the temporary page may be preset content on the first device 21, where the preset content may be any one of a group of pictures in the first device 21 or a preset picture. For example, a picture may be selected from a wallpaper library as the sending page, or the current wallpaper may be used as the sending page; for another example, a plain (e.g., solid-color) picture may be designated as the sending page. It will be appreciated that the temporary page described above does not contain sensitive data. Through this operation, the user can be reminded that a screen sharing scene has been entered and that the data currently being shared may involve privacy, so that privacy disclosure can be avoided.
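The send-page decision of step 302 can be condensed into a short sketch. The function name, the placeholder page, and the boolean parameters below are illustrative assumptions, not names taken from the application.

```python
TEMPORARY_PAGE = "plain_white_picture"  # hypothetical safe placeholder page

def determine_send_page(pre_shared_page, has_sensitive_data, is_first_cast):
    """Decide what, if anything, the first device sends to the second device."""
    if not has_sensitive_data:
        return pre_shared_page       # cast the pre-shared page directly
    if is_first_cast:
        return TEMPORARY_PAGE        # first cast: send the temporary page instead
    return None                      # send nothing; keep the current shared page
```

The `None` branch corresponds to the case where the second device simply continues displaying the previously cast shared page.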
An exemplary description is first provided with reference to fig. 6a and 6b. As shown in fig. 6a, the user first opens the gallery APP in the first device 21, and the photo preview interface 600 of the gallery top page is displayed on the first device 21. The user may then start screen casting; that is, the photo preview interface 600 serves as the first pre-shared page. At this time, the photo preview interface 600 is a page containing the private content of the user, and the first device 21 may, according to a preset rule, send, for example, a plain white picture to the second device 22 as the sending page for display. When the second device 22 receives the plain white picture sent by the first device 21, the shared page 610 is displayed, and the shared page 610 contains the plain white picture. Thus, the first screen cast can be completed while the privacy of the user is protected.
As shown in fig. 6b, in the screen-casting scenario shown in fig. 4 (e.g., the first device 21 displaying a home screen interface 620, and the second device 22 displaying an interface 630 with the same content as the home screen interface 620 of the first device 21), the user may further click on the gallery APP in the first device 21. At this point, the user will open a photo preview interface for the pictures (e.g., the interface 510 shown in fig. 5b), and this interface 510 may be the second pre-shared page. Since the interface 510 is not the first pre-shared page, the first device 21 no longer determines a new sending page, and the second device 22 continues to display the last shared page (e.g., the interface 630).
In addition, when the temporary page is used as the sending page, prompt information can be added to the temporary page. As shown in fig. 6a, an icon 611 in the shared page 610 may be used to indicate that the second device 22 is linked to the gallery APP of the first device 21. The prompt information may be determined directly by the first device 21 when transmitting the temporary page to the second device 22, or may be determined by the second device 22 according to the received data stream. For example, the second device 22 may determine that the received data stream comes from the gallery APP of the first device 21.
In step 303, the first device 21 detects the security level of the data in the pre-shared page, and if the security level of the data in the pre-shared page meets the preset level, the first device 21 displays the shared content selection page.
Specifically, the first device 21 may also detect whether the data in the pre-shared page contains highly sensitive data, and when highly sensitive data is contained in the pre-shared page, the first device 21 may temporarily interrupt the screen-casting transmission of the pre-shared page. In a specific implementation, the method for detecting whether the data in the pre-shared page contains highly sensitive data may be as follows: the first device 21 may detect the security level of the data in the pre-shared page, and if the security level of the data in the pre-shared page meets the preset level, this indicates that the pre-shared page includes highly sensitive data. At this time, the first device 21 may temporarily interrupt the screen-casting transmission of the pre-shared page and may display the shared content selection page. The shared content selection page may be used by the user to select content to share to the second device 22, and can be obtained by performing preset processing on the pre-shared page.
An exemplary description will now be given with reference to fig. 7a and 7b. Taking the photo preview interface 600 shown in fig. 6a as an example, when the user opens the gallery APP in the first device 21 and starts screen casting, the first device 21 may send a temporary page (e.g., a plain white picture) to the second device 22 for display, because the photo preview interface 600 contains highly sensitive data. At this time, the first device 21 temporarily interrupts the transmission of the pre-shared page, and the interface 700 shown in fig. 7a can be obtained. As shown in fig. 7a, the first device 21 displays the interface 700, the second device 22 displays the interface 710, the interface 710 contains the plain white picture, and the interface 700 is the shared content selection page. On the interface 700, the user may select one or more pictures to cast. For example, the user may select 5 photos, such as photo 1, photo 2, photo 3, photo 4, and photo 5, to be cast.
Next, taking the home screen interface 620 shown in fig. 6b as an example, the first device 21 and the second device 22 have established a screen-casting connection, the second device 22 displays the interface 630, and the content displayed by the interface 630 is consistent with the content of the home screen interface 620. When the user clicks on the icon of the gallery APP in the main screen interface 620, and after the gallery APP starts, the first device 21 will display the photo preview interface of the gallery top page, which is the pre-shared page. However, since the pre-shared page contains highly sensitive data, the first device 21 temporarily interrupts the sending of the pre-shared page, and the interface 720 shown in fig. 7b can be obtained; the interface 720 is the shared content selection page. On the interface 720, the user may select one or more pictures to cast. For example, the user may select 5 photos, such as photo 1, photo 2, photo 3, photo 4, and photo 5, to be cast. It will be appreciated that the second device 22 may maintain display of the last shared page (e.g., the interface 730) until it receives the page (e.g., photo 1, photo 2, photo 3, photo 4, and photo 5) sent by the first device 21, where the content displayed by the interface 730 is consistent with the content of the interface 630.
In step 304, after the user selects shared content from the shared content selection page, an actual shared page is determined according to the selected shared content.
Specifically, on the shared content selection page, the user can select the content to be shared. The shared content selection page may include a sharing control, or the sharing control may be displayed along with the user's selection operation. Through this control, the user may finish selecting the shared content, and the actual shared page actually cast to the second device 22 is determined according to the shared content selected by the user. It will be appreciated that the actual shared page is the current sending page.
In a specific implementation, taking the interface 700 shown in fig. 7a or the interface 720 shown in fig. 7b as an example, when the shared content selection page (for example, the interface 700 or the interface 720) has just been entered, the first device 21 may not display the casting control at first; after the user selects a photo, the casting control ("1 selected, cast to screen") may be displayed at the bottom of the screen. Alternatively, the casting control ("0 selected, cast to screen") may be displayed as soon as the shared content selection page is opened. The number in the casting control may vary with the number of photos selected by the user. For example, when the user selects 5 photos, the casting control may display "5 selected, cast to screen". Then, after the user selects 5 photos and clicks the casting control below the photos, the selection of shared content is completed. Meanwhile, after the user clicks the casting control, the first device 21 can generate the actual shared page according to the 5 photos selected by the user. The user may also select the "cast all" control at the top of the screen of the first device 21. Selecting the "cast all" control indicates that the user considers the current page to contain no highly sensitive data. Therefore, when "cast all" is selected, the actual shared page can be determined according to all the content of the current page; that is, the pre-shared page can be used as the actual shared page this time.
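The behavior of the casting control described above can be sketched as follows; the label text and function names are hypothetical stand-ins for the UI logic.

```python
def cast_button_label(selected_photos):
    # The label tracks the number of currently selected photos.
    return f"{len(selected_photos)} selected, cast to screen"

def determine_actual_shared_page(pre_shared_photos, selected_photos, cast_all=False):
    # "Cast all" uses the whole pre-shared page as the actual shared page;
    # otherwise only the user's selection is cast.
    return list(pre_shared_photos) if cast_all else list(selected_photos)
```

In this model, confirming a 5-photo selection yields an actual shared page of exactly those 5 photos, while "cast all" yields the original pre-shared page.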
Preferably, if the user wants to cast only part of the content in the pre-shared page of the first device 21 to the second device 22, then when determining the actual shared page according to the selected shared content, the first device 21 may perform split-screen display according to the selected shared content; for example, the first device 21 may split its screen into two split screens. The two split screens can display the shared content selected by the user in different modes, the content of one of the split screens can be selected as the actual shared page according to a preset rule, and the content of that split screen can be sent to the second device 22 for display.
The above split-screen mode will now be exemplarily described with reference to fig. 8a and 8b. As shown in fig. 8a, the interface 800 is a shared content selection interface. After the user clicks the casting control 801 (e.g., clicks "5 selected, cast to screen") on the interface 800, the mobile phone may split the screen, thereby obtaining the interface 810 shown in fig. 8b; the interface 810 is divided into two split screens, e.g., an upper split screen 811 and a lower split screen 812. The upper split screen 811 and the lower split screen 812 display the 5 photos selected by the user in different forms: the upper split screen 811 may display a detail interface of one of the photos, and the lower split screen 812 may display a preview of all the selected photos. Then, according to a preset rule, for example, the detail interface of photo 1 in the upper split screen 811 may be selected as the actual shared page for casting, so that the second device 22 also displays photo 1; the casting result is shown in fig. 8c.
The user may control the change of the display content of the second device 22 through either of the two split screens of the first device 21. Taking fig. 8b as an example, the user slides left and right in the upper split screen 811 to view different photos. When viewing photo 1, the user may slide to the left to switch to photo 2, and the upper split screen 811 then displays photo 2. Since the page of the upper split screen 811 is the actual shared page, when the display content of the upper split screen 811 changes, the first device 21 transmits the changed page to the second device 22, and the display content of the second device 22 changes accordingly.
Alternatively, the user may also operate a non-shared page (e.g., the page of the lower split screen 812) to change the actual shared page (e.g., the page of the upper split screen 811), thereby changing the display content of the second device 22. When the user selects the thumbnail of one photo in the lower split screen 812, the upper split screen 811 displays the photo corresponding to that thumbnail; that is, the user controls the display of the upper split screen 811 through the lower split screen 812. For example, in fig. 8b, if the user selects photo 3 in the lower split screen 812, the upper split screen 811 changes from displaying photo 1 to displaying photo 3. When the first device 21 obtains the changed page of the upper split screen 811 (i.e., the page on which photo 3 is displayed) according to the operation of the user, on the one hand, the display page of the upper split screen 811 is changed, and on the other hand, the changed page of the upper split screen 811 is transmitted to the second device 22, so that the second device 22 also displays photo 3.
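The split-screen control flow just described can be modeled in a small sketch: the upper split screen is the detail view (and the actual shared page), the lower split screen holds the thumbnails, and tapping a thumbnail updates both the upper split screen and the content cast to the second device. The class and method names are illustrative, not from the application.

```python
class SplitScreenCaster:
    """Upper split screen = detail view (the actual shared page);
    lower split screen = thumbnails of all selected photos."""

    def __init__(self, selected_photos):
        self.lower = list(selected_photos)
        self.upper = self.lower[0]        # detail view defaults to the first photo
        self.second_device = self.upper   # content currently cast to the second device

    def select_thumbnail(self, photo):
        # Tapping a thumbnail in the lower split screen updates the upper
        # split screen and re-sends the changed page to the second device.
        if photo in self.lower:
            self.upper = photo
            self.second_device = photo
```

Operations that only change the lower split screen (e.g., opening the photo preview via the "+" control) would leave `second_device` untouched, matching the behavior described above.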
In addition, the user may also operate in the non-shared page (e.g., the page of the lower split screen 812) to change the previously selected shared content. For example, in fig. 8b, the first item of the thumbnails in the lower split screen 812 has a "+" control (i.e., a control for changing the photos); the user clicks the "+" control to enter the photo preview interface, where the shared content can be changed. If, while viewing the thumbnails in the lower split screen 812, the user finds that an unsuitable photo has been selected and wants to cancel or replace it, or wants to add a few more photos, the selected photo may be canceled and new photos may be selected on the photo preview interface. When the photo preview interface is invoked through the "+" control, the lower split screen 812 switches from displaying the thumbnails to displaying the photo preview interface. Since only the non-shared page changes, the display content of the second device 22 is unchanged. For example, in fig. 8c, after the user clicks the "+" control, the second device 22 still displays photo 1, and only the display content of the lower split screen 812 changes.
Note that the first device 21 may also not split the screen, but may directly switch the shared content selection page to the actual shared page according to the shared content selected by the user. For example, in fig. 8c, the page is switched directly from the shared content selection page to a page displaying the content of the upper split screen 811.
In step 305, the first device 21 sends the actual shared page to the second device 22 for display.
Specifically, after determining the actual shared page, the first device 21 may send the actual shared page to the second device 22 through the data channel. When the second device 22 receives the data, it decodes and displays the actual shared page. Taking fig. 8c as an example, after the second device 22 receives the data (e.g., photo 1) sent by the first device 21, it displays the same content (e.g., the details of photo 1) as the upper split screen.
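The send-decode-display round trip of step 305 can be sketched as follows, with JSON standing in for the actual screen-casting codec, which the application does not specify; the function names are hypothetical.

```python
import json

def encode_page(page):
    # First device: serialize the actual shared page for the data channel.
    return json.dumps({"page": page}).encode("utf-8")

def decode_page(data):
    # Second device: decode the received data to recover the page to display.
    return json.loads(data.decode("utf-8"))["page"]
```

In a real implementation the payload would be an encoded video or image stream rather than a JSON string, but the channel semantics (encode on the first device, decode and display on the second) are the same.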
In step 306, in response to a detected user operation on the first device 21, a change of the display page on the first device 21 is determined, and processing is performed based on the change of the display page.
Specifically, if the pre-shared page contains highly sensitive data, the first device 21 may provide a shared content selection page for the user to select shared content within the pre-shared page. When the user has selected the shared content, the first device 21 may split the screen, where the display page in one of the split screens serves as the actual shared page; meanwhile, the functions of changing the shared content and changing the cast content are provided through the other split screen. The user may then cast the actual shared page, at which point the actual shared page becomes a shared page.
If the pre-shared page does not contain highly sensitive data, the pre-shared page can be directly used as the sending page and sent to the second device 22 for display, completing one screen-casting process. After the pre-shared page is sent to the second device 22 for display, the pre-shared page becomes a shared page.
In addition, the user may also operate on the shared page of the first device 21, or act on the non-cast portion through other physical keys/gestures. The operation of the user is not particularly limited in the embodiment of the present application. Such an operation may cause a page change on the first device 21.
Changes of the display page can be of two types, for example, a first type and a second type. The first type may be used to characterize a change to the shared page, which may be a partial change or a complete change of the shared page; the second type may be used to characterize that the shared page has not changed. For different types of display page changes, the first device 21 may perform different processing. In particular implementations, the processing may include the following:
If the change of the displayed page is of the first type, that is, the shared page has changed, the first device 21 may re-determine the pre-shared page. Specifically, when the shared page of the first device 21 changes, the first device 21 may start a new screen-casting process and may cast the changed page to the second device 22 for display, where the changed page is the new pre-shared page. If the changed page does not contain highly sensitive data, the first device 21 may cast the new pre-shared page directly. If the changed page contains highly sensitive data, the shared content may be selected by the user for casting, e.g., step 304 is performed.
If the change of the displayed page is of the second type, that is, the shared page is unchanged, the first device 21 may further detect whether the content of the current display interface has changed. It will be appreciated that during screen casting, the user may perform other operations not related to the casting, which may not cause a change in the shared page. For example, displaying a message bubble, clicking on a message bubble, or displaying a received message in the form of a window may not cause a change in the shared page, but may cause a change in the current display interface of the first device 21.
In some cases, the user's operation may cause a change in the current interface of the first device 21, where the changed content is on the same layer as the shared page but does not change the shared page. When the change of the current interface meets a preset criterion, the first device 21 may split the current interface, so that the shared page is displayed separately in one split screen while the other split screen displays the changed content of the current interface. For example, the user opens a new application or piece of software and performs a split-screen operation. At this time, the first device 21 may split the screen, with one split screen displaying the shared page and the other displaying the changed content of the current interface.
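The processing of step 306 can be summarized in a small dispatch sketch. The change-type labels, dictionary keys, and function name below are illustrative assumptions layered on the two change types described above.

```python
def handle_display_change(change_type, shared_page, new_page=None,
                          interface_changed=False):
    # First type: the shared page itself changed -> restart the casting flow
    # with the changed page as the new pre-shared page.
    if change_type == "first":
        return {"action": "recast", "pre_shared_page": new_page}
    # Second type: the shared page is unchanged. If the local interface
    # changed enough (e.g. a newly opened app), split the screen so the
    # shared page stays isolated in its own split screen.
    if interface_changed:
        return {"action": "split_screen", "shared_split": shared_page}
    return {"action": "keep", "shared_page": shared_page}
```

A "recast" result would then re-run the sensitivity check of step 302 on the new pre-shared page before anything is sent.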
The above-described manner of changing the current interface will now be exemplarily described with reference to fig. 9a and 9b. As shown in fig. 9a, the user may invoke a side menu bar 901 in the interface 900 via a screen-edge gesture, at which point the interface 900 may contain a shared page 902 and the side menu bar 901. Then, the user can drag the icon of APP11 to the lower half of the screen, which meets the split-screen requirement. At this time, the first device 21 may display in split-screen mode, thereby obtaining the interface 910 shown in fig. 9b. The newly opened page of APP11 is displayed in the lower half screen 912, and the content of the shared page 902 remains displayed in the upper half screen 911. It will be appreciated that the content of the shared page 902 cannot be displayed in its entirety in the upper half screen 911 due to the screen reduction (split screen), but the content of the shared page 902 is unchanged; that is, the first device 21 can still acquire all the data in the shared page 902, and the data is unchanged. Thus, the content of the shared page 902 cast to the second device 22 is unchanged. The user may further slide on the upper half screen 911 to view content not visible in the shared page 902, e.g., photos at the bottom of the shared page 902. When the user slides the shared page 902 on the first device 21, the shared page 902 cast to the second device 22 will also slide. Because there is no split screen on the second device 22, the entire content of the shared page 902 can be displayed in full on the second device 22, without part of its content being blocked by the lower half screen 912 as in the interface 910.
In the embodiment of the present application, a pre-shared page can be determined in each screen-casting process, and the shared content selection page is displayed based on the pre-shared page, so that the user can actively select the shared content, which is then displayed individually. On one hand, this ensures that the cast content is carefully selected by the user, protecting privacy and security; on the other hand, selecting the shared content before casting also facilitates content sharing after casting, avoiding the situation in which, halfway through sharing, the user has to search for other content to share. This is particularly suitable for cases where different pieces of shared content are in different positions (e.g., sharing a limited number of photos that are scattered across the photo preview page). When highly sensitive data exists, providing the shared content selection page prevents the user from unintentionally casting highly sensitive data onto a public screen and causing privacy leakage, thereby protecting privacy during screen casting. In addition, the shared content is displayed in different modes on different split screens in a split-screen mode: one split screen displays the shared content in detail and serves as the cast page, while the other split screen displays a summary of the shared content. The cast page can be controlled through the non-cast split screen; meanwhile, when the shared content needs to be replaced, added, or canceled, the operation can be performed on the non-cast split screen without being perceived by the sharing target.
It should be noted that, although the embodiments of the present application are illustrated with a screen-casting scenario, the embodiments of the present application are not limited thereto, and are also applicable to application scenarios such as shared screens and remotely controlled screens.
The manner of screen casting has been exemplified above with reference to fig. 1-9 b. In some application scenarios, the user may also select any split-screen page from a split-screen selection page as the pre-shared page for casting. The selection of the split-screen page is exemplarily described below with reference to fig. 10a and 10 b.
In a specific implementation, the pre-shared page may be of two types: a normal type, in which the pre-shared page occupies the entire interface (screen) of the first device 21, and a split type, in which the pre-shared page is displayed in split screen on the interface (screen) of the first device 21. If the current pre-shared page is of the split type, the first device 21 may provide a split-screen selection page, which allows the user to select one split-screen page for casting. The split-screen selection page may include at least two split screens. The user may then select one of the split-screen pages, a new pre-shared page may be determined based on the user's selection, and the cast content on the second device 22 may be updated accordingly.
An exemplary description will now be given with reference to fig. 10a and 10 b. As shown in fig. 10a, the interface 1000 is the display interface of the first device 21 before the screen-casting connection is established. Before casting, the first device 21 may display two apps in split screen, for example, a photo app and APP11. The user may then start casting from the interface shown in fig. 10a; once the screen-casting connection succeeds, the interface shown in fig. 10a becomes the first pre-shared page. At this time, the first device 21 detects that the pre-shared page includes two split-screen areas, and may therefore present an interface 1010 as shown in fig. 10b, where the interface 1010 may be a split-screen selection page used to let the user select the split-screen area to be cast. For example, the user may select the upper split-screen area 1011 and click the confirmation control; the page of the split-screen area selected by the user is then taken as the new pre-shared page and cast to the second device 22.
Alternatively, the user may also cast both split-screen areas directly to the second device 22. Illustratively, in fig. 10b, the user may click the cancel control, that is, the user may forgo selecting a single split-screen area for casting, in which case the current display interface of the first device 21 (including the upper split-screen area 1011 and the lower split-screen area 1012) may be cast to the second device 22.
It can be appreciated that this part of the embodiments of the present application is mainly directed to the pre-shared page of the first screen-casting process; for example, the split-screen selection may be performed between step 301 and step 302, and after the page of the selected split-screen area is determined as the new pre-shared page, step 302 may then be performed.
In the embodiment of the application, the pre-shared page can comprise two split-screen areas. When at least two apps are running in the foreground, a split-screen selection page is provided so that the user can autonomously select the split-screen area to be cast, which strengthens the user's control over the screen-casting process and facilitates privacy protection.
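The split-screen selection flow of figs. 10a-10b can be sketched as follows. The data shapes and function name are hypothetical, used only to illustrate the branching: a split-type pre-shared page triggers the selection page, confirming picks one area as the new pre-shared page, and cancelling casts both areas as-is.

```python
def choose_cast_target(pre_shared_page, chosen_area=None):
    """pre_shared_page: {"type": "normal", ...} or
    {"type": "split", "areas": [...]} -- a hypothetical representation.
    chosen_area: index of the confirmed split-screen area, or None on cancel."""
    if pre_shared_page["type"] != "split":
        return pre_shared_page          # normal type: no selection page needed
    if chosen_area is None:
        return pre_shared_page          # cancel: cast both split-screen areas
    # Confirm: the selected area's page becomes the new pre-shared page.
    return pre_shared_page["areas"][chosen_area]

split_page = {"type": "split", "areas": ["photo_app_page", "app11_page"]}
assert choose_cast_target(split_page, chosen_area=0) == "photo_app_page"  # confirm
assert choose_cast_target(split_page) == split_page                       # cancel
```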
Fig. 11 is a schematic structural diagram of an embodiment of a screen sharing apparatus according to the present application. As shown in fig. 11, the screen sharing apparatus 1100 is applied to a first device and may include: a connection module 1101, a first determining module 1102, a first detection module 1103, a second determining module 1104, and a first sending module 1105; wherein:
a connection module 1101 for establishing a screen sharing connection between the first device and the second device in response to a detected screen sharing operation of the user;
a first determining module 1102, configured to determine a first pre-shared page;
a first detection module 1103, configured to detect data in a first pre-shared page;
a second determining module 1104, configured to determine a first sending page if the first device detects that the first pre-shared page contains sensitive data, where the first sending page comprises shared content determined by the user;
the first sending module 1105 is configured to send, in response to the detected first sending page, the first sending page to the second device for display.
In one possible implementation manner, the apparatus 1100 further includes:
the third determining module 1106 is configured to determine the first pre-shared page as the first sending page if the first device detects that the first pre-shared page does not include sensitive data.
In one possible implementation manner, the apparatus 1100 further includes:
the first display module 1107 is configured to display a shared content selection page by the first device if the first device detects that the first pre-shared page includes sensitive data, where the shared content selection page is used to determine a first sending page.
In one possible implementation manner, the apparatus 1100 further includes:
a second sending module 1108, configured to send a preset page as a second sending page to the second device for display, where the second sending page does not contain sensitive data.
In one possible implementation manner, the apparatus 1100 further includes:
the second display module 1109 is configured to display, in response to the detected selection of the shared content by the user in the shared content selection page, a first split screen interface by using the first device, where the first split screen interface includes a first split screen area and a second split screen area;
the first split screen area is used for determining a first sending page.
In one possible implementation manner, the first split screen area displays details of the content to be shared, and the second split screen area displays a summary of the content to be shared.
In one possible implementation manner, the apparatus 1100 further includes:
the first updating module 1110 is configured to update, in response to a detected operation of the user in the second split screen area, the display content in the second split screen area, without the first device updating the first sending page.
In one possible implementation manner, the apparatus 1100 further includes:
the second detection module 1111 is configured to update the first pre-shared page to a second pre-shared page in response to a detected change of the shared page, where the second pre-shared page is the page after the change of the shared page; or
in response to detecting that the shared page is unchanged while the display interface of the first device has changed, display a second split screen interface on the first device, where the second split screen interface includes a third split screen area and a fourth split screen area; the third split screen area is used for displaying the shared page, and the fourth split screen area is used for displaying the changed content of the display interface of the first device;
wherein the shared page is used to characterize a page that the first device has shared with the second device.
In one possible implementation manner, the apparatus 1100 further includes:
the second updating module 1112 is configured such that, if the first device detects that the second pre-shared page contains sensitive data, the first device does not update the first sending page.
In one possible implementation, the first pre-shared page includes a first split page and a second split page, and the apparatus 1100 further includes:
a third display module 1113, configured to display a split screen selection page if the first device detects that the first pre-shared page includes sensitive data, where the split screen selection page is used to determine a first sending page;
wherein the first sending page is determined by the first split page, or
the first sending page is determined by the second split page, or
the first sending page is determined by the first pre-shared page.
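As one way to read the module list above, the following sketch wires the detection and determining roles together in the order the modules are described. The API is hypothetical, not the disclosed implementation; it only shows how the branch between the second determining module 1104 (user-driven selection) and the third determining module 1106 (direct casting) could compose with the sending role.

```python
class ScreenSharingApparatusSketch:
    """Illustrative composition of the roles played by modules 1103-1106."""
    def __init__(self, detect_sensitive, send):
        self.detect_sensitive = detect_sensitive  # role of first detection module 1103
        self.send = send                          # role of first sending module 1105

    def share(self, pre_shared_page, select_content):
        if self.detect_sensitive(pre_shared_page):
            # role of second determining module 1104: user-driven selection
            send_page = select_content(pre_shared_page)
        else:
            # role of third determining module 1106: cast the page as-is
            send_page = pre_shared_page
        self.send(send_page)
        return send_page

sent = []
app = ScreenSharingApparatusSketch(
    detect_sensitive=lambda page: "bank" in page,  # toy sensitivity check
    send=sent.append)
assert app.share("photo_page", select_content=lambda p: "selection") == "photo_page"
assert app.share("bank_page", select_content=lambda p: "selection") == "selection"
assert sent == ["photo_page", "selection"]
```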
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
It should be appreciated that the electronic device 100 and the like include corresponding hardware structures and/or software modules that perform the functions described above. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The embodiment of the present application may divide the functional modules of the electronic device 100 or the like according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above. The specific working processes of the above-described systems, devices and units may refer to the corresponding processes in the foregoing method embodiments, which are not described herein.
The functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, including several instructions causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disc, and the like.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A screen sharing method applied to a first device, the method comprising:
responding to the detected screen sharing operation of the user, and establishing screen sharing connection between the first device and the second device;
determining a first pre-shared page;
detecting data in the first pre-shared page;
if the first device detects that the first pre-shared page contains sensitive data, determining a first sending page; the first sending page comprises shared content determined by a user;
and responding to the detected first sending page, and sending the first sending page to the second equipment for display.
2. The method according to claim 1, wherein the method further comprises:
and if the first device detects that the first pre-shared page does not contain sensitive data, determining the first pre-shared page as the first sending page.
3. The method according to claim 1, wherein the method further comprises:
and if the first device detects that the first pre-shared page contains sensitive data, the first device displays a shared content selection page, wherein the shared content selection page is used for determining the first sending page.
4. The method of claim 3, wherein prior to displaying the shared content selection page, the method further comprises:
sending a preset page as a second sending page to the second device for display; wherein the second sending page does not contain sensitive data.
5. The method according to claim 3 or 4, characterized in that the method further comprises:
after a detected selection of the shared content by the user in the shared content selection page, the first device displays a first split screen interface, wherein the first split screen interface comprises a first split screen area and a second split screen area;
the first split screen area is used for determining the first sending page.
6. The method of claim 5, wherein the first split screen area displays details of the content to be shared and the second split screen area displays a summary of the content to be shared.
7. The method of claim 6, wherein the method further comprises:
and in response to a detected operation of the user in the second split screen area, updating the display content in the second split screen area, without the first device updating the first sending page.
8. The method according to any one of claims 1-7, further comprising:
in response to a detected change of the shared page, updating the first pre-shared page to a second pre-shared page, wherein the second pre-shared page is the page after the change of the shared page; or
in response to detecting that the shared page is unchanged while the display interface of the first device has changed, displaying, by the first device, a second split screen interface, wherein the second split screen interface comprises a third split screen area and a fourth split screen area; the third split screen area is used for displaying the shared page, and the fourth split screen area is used for displaying the changed content of the display interface of the first device;
wherein the shared page is used to characterize a page that the first device has shared with the second device.
9. The method of claim 8, wherein the method further comprises:
and if the first device detects that the second pre-shared page contains sensitive data, the first device does not update the first sending page.
10. The method of claim 1, wherein the first pre-shared page comprises a first split page and a second split page, the method further comprising:
if the first device detects that the first pre-shared page contains sensitive data, displaying a split screen selection page, wherein the split screen selection page is used for determining the first sending page;
wherein the first sending page is determined by the first split page, or
the first sending page is determined by the second split page, or
the first sending page is determined by the first pre-shared page.
11. An electronic device, comprising: a memory for storing computer program code comprising instructions that, when read from the memory by the electronic device, cause the electronic device to perform the method of any of claims 1-10.
12. A computer readable storage medium comprising computer instructions which, when run on the electronic device, cause the electronic device to perform the method of any of claims 1-10.
CN202111341477.2A 2021-11-12 2021-11-12 Screen sharing method, electronic device and storage medium Pending CN116127540A (en)

Publication Number: CN116127540A; Publication Date: 2023-05-16
Application Number: CN202111341477.2A; Priority/Filing Date: 2021-11-12


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination