WO2022267644A1 - Screen sharing method, system, and virtual display device - Google Patents

Screen sharing method, system, and virtual display device

Info

Publication number
WO2022267644A1
WO2022267644A1 · PCT/CN2022/087103 · CN2022087103W
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
screen
file
target file
mobile phone
Prior art date
Application number
PCT/CN2022/087103
Other languages
English (en)
French (fr)
Inventor
黄炳洁
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022267644A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72484: User interfaces specially adapted for cordless or mobile telephones wherein functions are triggered by incoming communication events
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H04N 13/167: Synchronising or controlling image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/194: Transmission of image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 2013/0074: Stereoscopic image analysis
    • H04N 2013/0096: Synchronisation or controlling aspects

Definitions

  • the present application relates to the technical field of terminals, in particular to a screen sharing method, system and virtual display device.
  • mobile phone 1 can display the screen content of mobile phone 2 on the local screen.
  • mobile phone 1, as the device receiving the shared content, can only display the screen content of one party (that is, the screen content of mobile phone 1 or of mobile phone 2); if user 1 wants to watch the other party's screen, screen switching is required, and the user experience is poor.
  • the present application provides a screen sharing method, virtual display device and system, so that during the screen sharing process, a screen receiving device can simultaneously display its own screen and screens shared by other devices through the virtual display device.
  • the embodiment of the present application provides a screen sharing method applied to a virtual display device, where the virtual display device is connected to a first electronic device, and the first electronic device is connected to a second electronic device; the first electronic device is configured to display the first screen content, the second electronic device is configured to display the second screen content, and the first electronic device receives the second screen content sent by the second electronic device.
  • the method includes: acquiring the first screen content and the second screen content from the first electronic device; displaying the first screen content in a first area of the display interface of the virtual display device, and displaying the second screen content in a second area of the display interface.
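Purely as an illustration (not part of the application's text), the split of the virtual display interface into a first area and a second area might be sketched as follows; the function name, pixel sizes, and side-by-side layout are all assumptions.

```python
# Illustrative sketch: divide a virtual display surface into a first area
# (for the first electronic device's screen content) and a second area
# (for the second electronic device's shared content).
def layout_areas(display_w, display_h, gap=40):
    """Return (first_area, second_area) as (x, y, w, h) rectangles."""
    area_w = (display_w - 3 * gap) // 2   # two areas plus three gaps
    area_h = display_h - 2 * gap
    first_area = (gap, gap, area_w, area_h)
    second_area = (2 * gap + area_w, gap, area_w, area_h)
    return first_area, second_area
```

Each area would then be fed by the corresponding screen-content stream obtained from the first electronic device.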
  • the virtual display device may be a virtual reality (VR) device or a mixed reality (MR) device.
  • the first electronic device is a receiving device for sharing a screen (ie, a screen receiving device)
  • the second electronic device is a device that initiates screen sharing (ie, a screen sharing device).
  • the first electronic device and the second electronic device may be terminal devices such as mobile phones, notebook computers, and tablet computers.
  • the virtual display device may be VR glasses, the first electronic device may be mobile phone 1, and the second electronic device may be mobile phone 2.
  • in the process of the second electronic device sharing its screen with the first electronic device, the virtual display device can make full use of the virtual space it generates to simultaneously display the screen content of the first electronic device and of the second electronic device. This enables the user of the first electronic device to watch and operate his own screen content (that is, the screen content of the first electronic device) while viewing the friend's screen content (that is, the screen content of the second electronic device), providing a better user experience.
  • the method further includes: after the virtual display device detects the file sending operation, controlling the first electronic device to send the target file to the second electronic device, where the file sending operation is an operation instructing the virtual display device to send the target file displayed in the first area to the second area.
  • the file sending operation includes: dragging and dropping the target file from the first area to the second area. For example, through the user's virtual touch operation, the icon of the target file is dragged from the first area to the second area.
  • the file sending operation includes: dragging the icon of the target file from the first area to the second area through the control handle of the virtual display device.
  • the file sending operation includes: copying the target file from the first area to the second area.
  • the target file is copied from the first area through the control handle of the virtual display device, and then the target file is pasted in the second area.
  • in the screen sharing mode, the user can quickly control his own device (that is, the first electronic device) to send the target file to a friend's device (that is, the second electronic device) by dragging or by copying and pasting.
  • controlling the first electronic device to send the target file to the second electronic device includes: after the virtual display device detects the first control event in the first area and detects the second control event in the second area, controlling the first electronic device to send the target file to the second electronic device.
  • the first control event is used to select the target file, and the second control event is used to release the selected target file; or, the first control event is used to copy the target file, and the second control event is used to paste the target file.
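As a hedged sketch of the control-event pairing described above (the event names, tuple format, and rectangle representation are assumptions for illustration), a detector might look like:

```python
# Illustrative sketch: a "select" (first control event) inside the first
# area followed by a "release" (second control event) inside the second
# area triggers sending the target file.
def in_area(point, area):
    x, y = point
    ax, ay, aw, ah = area
    return ax <= x < ax + aw and ay <= y < ay + ah

def detect_file_send(events, first_area, second_area):
    """events: list of (kind, position) tuples in arrival order.
    Returns True if a select in the first area is later released
    in the second area."""
    selected = False
    for kind, pos in events:
        if kind == "select" and in_area(pos, first_area):
            selected = True
        elif kind == "release":
            if selected and in_area(pos, second_area):
                return True
            selected = False
    return False
```

The copy/paste variant in the claim would pair a "copy" event in the first area with a "paste" event in the second area in the same way.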
  • the method further includes: when the fast file sending switch of the virtual display device is on and the target file supports being shared, the virtual display device controls the first electronic device to send the target file to the second electronic device.
  • the method further includes: after the virtual display device enters the virtual screen sharing mode, the virtual display device automatically turns on the fast file sending switch of the virtual display device.
  • or, after detecting the second control event in the second area, if the fast file sending switch is not on, the virtual display device displays first prompt information, where the first prompt information is used to prompt the user to turn on the fast file sending switch; and, in response to an operation of turning on the switch, the virtual display device controls the fast file sending switch to be in an on state.
  • the method further includes: if the target file does not support being sent, the virtual display device displays second prompt information, where the second prompt information is used to prompt that the target file does not support being sent. For example, if the file type of the target file is not in the whitelist, the second prompt information is displayed.
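For illustration only, the whitelist check mentioned above might be sketched as below; the switch flag, the extension list, and the return format are assumptions, not taken from the application.

```python
# Illustrative sketch: a target file may be sent only when the fast file
# sending switch is on and its file type appears in a whitelist.
SHAREABLE_TYPES = {"doc", "pdf", "jpg", "png", "mp4", "mp3", "zip", "apk"}

def can_send(filename, fast_send_on):
    """Return (allowed, reason) for a candidate target file."""
    ext = filename.rsplit(".", 1)[-1].lower()
    if not fast_send_on:
        return False, "fast file sending switch is off"
    if ext not in SHAREABLE_TYPES:
        return False, "target file does not support sending"
    return True, "ok"
```

The second branch corresponds to the case where the second prompt information would be displayed.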
  • the virtual display device controls the first electronic device to send the target file to the second electronic device, including: the virtual display device displays third prompt information, where the third prompt information is used to ask the user whether to send the target file; and, in response to receiving an instruction indicating to send the target file, the virtual display device controls the first electronic device to send the target file to the second electronic device.
  • the method further includes: the virtual display device displays fourth prompt information in the second area, where the fourth prompt information is used to indicate that the second electronic device receives the target file from the first electronic device.
  • the method further includes: when multiple second electronic devices send the second screen content to the first electronic device, the virtual display device determines multiple second areas around the first area and displays the second screen content of each second electronic device in the corresponding second area.
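As an illustrative sketch of arranging multiple second areas around the first area (the ring arrangement and the function signature are assumptions; the application does not specify a particular layout):

```python
# Illustrative sketch: place the centers of n "second areas" evenly on a
# ring around the central first area of the virtual display.
import math

def ring_positions(n, center=(0.0, 0.0), radius=1.0):
    """Return n (x, y) centers evenly spaced around `center`."""
    cx, cy = center
    return [(cx + radius * math.cos(2 * math.pi * k / n),
             cy + radius * math.sin(2 * math.pi * k / n))
            for k in range(n)]
```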
  • the method further includes: the virtual display device acquiring the first control track on the second area; displaying the first track line on the second area according to the first control track; and sending, to the second electronic device through the first electronic device, first line information of the first track line, where the first line information is used by the second electronic device to display the first track line based on the second screen content.
  • the first control track may be a virtual touch track of the user on the second area, or a motion track of a control handle of a virtual display device.
  • the first track line can be called graffiti content.
  • when the user watches the screen content of the first electronic device and the second electronic device through the virtual display device at the same time, the user can also use the virtual display device to do graffiti on the second area and send the graffiti content to the second electronic device, which provides a better user experience.
  • the method further includes: receiving, through the first electronic device, second line information sent by the second electronic device, where the second line information is used to determine the second track line, and the second track line is determined based on the second control track on the screen of the second electronic device; and displaying the second track line on the second area according to the second line information.
  • the virtual display device can display, in the second area, the graffiti content made on the second electronic device, which provides a better user experience.
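A hedged sketch of the "line information" exchanged for graffiti: a track line serialized as color, line style, and point positions so the peer can redraw it. The JSON field names and normalized coordinates are assumptions for illustration.

```python
# Illustrative sketch: serialize/deserialize a graffiti track line so it
# can be relayed through the first electronic device to the peer.
import json

def encode_track_line(points, color="#FF0000", style="solid"):
    """points: list of (x, y) normalized to [0, 1] of the shared screen."""
    return json.dumps({"color": color, "style": style,
                       "points": [list(p) for p in points]})

def decode_track_line(payload):
    info = json.loads(payload)
    points = [(x, y) for x, y in info["points"]]
    return points, info["color"], info["style"]
```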
  • the embodiment of the present application provides a screen sharing system, including: a virtual display device, a first electronic device and a second electronic device, where the virtual display device is connected to the first electronic device, and the first electronic device is connected to the second electronic device; the first electronic device is configured to display the first screen content, and the second electronic device is configured to display the second screen content.
  • the first electronic device is configured to: receive the second screen content sent by the second electronic device.
  • the virtual display device is configured to: obtain the first screen content and the second screen content from the first electronic device; display the first screen content in the first area of the display interface of the virtual display device, and display the second screen content in the second area of the display interface.
  • the virtual display device is further configured to: after detecting the file sending operation, control the first electronic device to send the target file to the second electronic device, where the file sending operation is an operation instructing the virtual display device to send the target file displayed in the first area to the second area.
  • the file sending operation includes: dragging or copying the target file from the first area to the second area.
  • the virtual display device is further configured to: after detecting the first control event in the first area and detecting the second control event in the second area, control the first electronic device to send the target file to the second electronic device.
  • the first control event is used to select the target file, and the second control event is used to release the selected target file; or, the first control event is used to copy the target file, and the second control event is used to paste the target file.
  • the virtual display device is further configured to: control the first electronic device to send the target file to the second electronic device when the fast file sending switch of the virtual display device is on and the target file supports being shared.
  • the virtual display device is further configured to: automatically turn on the fast file sending switch after entering the virtual screen sharing mode; or, after detecting the second control event in the second area, if the fast file sending switch of the virtual display device is not turned on, display the first prompt information, where the first prompt information is used to prompt the user to turn on the fast file sending switch; and, in response to the operation of turning on the fast file sending switch, control the fast file sending switch to be in an on state.
  • the virtual display device is further configured to: if the target file does not support sending, display a second prompt message, where the second prompt message is used to prompt that the target file does not support sending.
  • the virtual display device is further configured to: display third prompt information, where the third prompt information is used to ask the user whether to send the target file; and, in response to receiving an instruction indicating to send the target file, control the first electronic device to send the target file to the second electronic device.
  • the second electronic device is further configured to: display fourth prompt information, where the fourth prompt information is used to indicate that the second electronic device receives the target file from the first electronic device.
  • the virtual display device is further configured to: when multiple second electronic devices send multiple second screen contents to the first electronic device, determine multiple second areas around the first area, and display the multiple second screen contents in the multiple second areas respectively.
  • the virtual display device is further configured to: obtain the first control track on the second area; display the first track line on the second area according to the first control track; and send, to the second electronic device through the first electronic device, the first line information of the first track line.
  • the second electronic device is further configured to: display the first track line on the screen of the second electronic device according to the first line information.
  • the second electronic device is further configured to: obtain the second control track on the screen of the second electronic device; display the second track line according to the second control track; and send, to the virtual display device through the first electronic device, the second line information of the second track line.
  • the virtual display device is further configured to: display the second track line on the second area according to the second line information.
  • the embodiment of the present application further provides a virtual display device configured to execute the screen sharing method shown in the first aspect above.
  • an embodiment of the present application further provides a chip system, the chip system includes a processor, and the processor executes a computer program stored in a memory, so as to implement the screen sharing method shown in the first aspect above.
  • the embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the screen sharing method shown in the above-mentioned first aspect is implemented.
  • the embodiment of the present application further provides a computer program product, where the program product includes a program, and when the program is run by an electronic device, the electronic device is caused to perform the screen sharing method shown in the first aspect above.
  • FIG. 1 is a schematic diagram of a screen sharing control process provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a screen sharing control process provided by another embodiment of the present application.
  • FIG. 3 is a schematic diagram of a screen graffiti process provided by an embodiment of the present application.
  • FIG. 4A is a schematic diagram of a file sending control process provided by an embodiment of the present application.
  • FIG. 4B is a schematic diagram of a file sending control process provided by another embodiment of the present application.
  • FIG. 5 is a schematic diagram of a control process for exiting the screen sharing mode provided by an embodiment of the present application.
  • FIG. 6 is a schematic architecture diagram of a screen sharing system provided by an embodiment of the present application.
  • FIG. 7A is a schematic structural diagram of a wearable device provided by an embodiment of the present application.
  • FIG. 7B is a schematic composition diagram of an optical display module provided by an embodiment of the present application.
  • FIG. 8 is a control flow diagram of the virtual screen sharing mode provided by an embodiment of the present application.
  • FIG. 9 is a first schematic diagram of electronic device interaction provided by an embodiment of the present application.
  • FIG. 10 is a first schematic diagram of the virtual display interface provided by an embodiment of the present application.
  • FIG. 11 is a second schematic diagram of the virtual display interface provided by an embodiment of the present application.
  • FIG. 12A is a schematic flowchart of a graffiti content sharing process provided by an embodiment of the present application.
  • FIG. 12B is a first schematic diagram of a graffiti scene in the virtual screen sharing mode provided by an embodiment of the present application.
  • FIG. 13A is a schematic flowchart of a graffiti content sharing process provided by another embodiment of the present application.
  • FIG. 13B is a second schematic diagram of a graffiti scene in the virtual screen sharing mode provided by an embodiment of the present application.
  • FIG. 14 is a first schematic flowchart of a file sending control method provided by an embodiment of the present application.
  • FIG. 15 is a third schematic diagram of the virtual display interface provided by an embodiment of the present application.
  • FIG. 16A is a fourth schematic diagram of the virtual display interface provided by an embodiment of the present application.
  • FIG. 16B is a fifth schematic diagram of the virtual display interface provided by an embodiment of the present application.
  • FIG. 17 is a second schematic diagram of electronic device interaction provided by an embodiment of the present application.
  • FIG. 18 is a sixth schematic diagram of the virtual display interface provided by an embodiment of the present application.
  • FIG. 19 is a seventh schematic diagram of the virtual display interface provided by an embodiment of the present application.
  • FIG. 20 is a second schematic flowchart of a file sending control method provided by an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of these features. In the description of this embodiment, unless otherwise specified, "plurality" means two or more.
  • some communication applications support the function of sharing screens between electronic devices of application friends.
  • take mobile phone 2 (belonging to user 2) sharing a screen with mobile phone 1 (belonging to user 1) as an example.
  • mobile phone 1 can display the screen content of mobile phone 2 on the local screen.
  • the user 1 can view the screen content of the mobile phone 2 in real time through the mobile phone 1 , for example, browse the photo album and application interface of the mobile phone 2 .
  • the electronic device that initiates screen content sharing is called a screen sharing device, such as mobile phone 2; and the electronic device that receives and displays screen content shared by other electronic devices is called a screen receiving device, such as mobile phone 1.
  • FIG. 1 is a schematic diagram of a screen sharing control process provided by an embodiment of the present application.
  • FIG. 1 illustrates the process of mobile phone 2 sharing a screen with mobile phone 1, taking the Changlian application as an example.
  • the call interface of mobile phone 2 usually displays the "switch camera" control, the "hang up" control, the "more" control, and the like.
  • after mobile phone 2 detects the user's operation on the "more" control, it displays the "mute" control, the "speaker" control, the "switch voice" control, the "share screen" control, and other content.
  • the mobile phone 2 detects the user's operation on the "share screen” control, it displays prompt information, a "cancel” control and a "confirm” control.
  • the prompt information may be "The other party who shares my screen can see your screen. Are you sure to share?".
  • the mobile phone 2 sends a screen sharing request to the mobile phone 1, and displays a prompt message "Sharing the screen, waiting for confirmation from the other party".
  • the mobile phone 2 enters the screen sharing mode after detecting that the mobile phone 1 confirms to accept the screen sharing, so as to send the screen content of the mobile phone 2 to the mobile phone 1 in real time.
  • the content of the screen shared by mobile phone 2 is determined according to the needs of user 2.
  • user 2 can control mobile phone 2 to display a shopping interface for backpacks.
  • the mobile phone 2 can send the shopping interface to the mobile phone 1, so that the mobile phone 1 can display the shopping interface.
  • FIG. 2 is a schematic diagram of a screen sharing control process provided by another embodiment of the present application, involving a process in which mobile phone 1 accepts a screen sharing request from mobile phone 2 and enters a screen sharing mode.
  • after receiving the screen sharing request, mobile phone 1 displays prompt information, a "cancel" control and a "confirm" control.
  • the prompt information may be "the other party shares the screen with you and you can see the other party's screen. Are you sure to accept?".
  • after mobile phone 1 detects the user's operation on the "confirm" control, it temporarily stops displaying the local screen content of mobile phone 1 and instead displays the screen content shared by mobile phone 2, such as the backpack shopping interface.
  • mobile phone 1 may also display an invitation control for actively inviting mobile phone 2 to share screen content with mobile phone 1 .
  • mobile phone 2 shares the screen content of mobile phone 2 with mobile phone 1 in real time; mobile phone 1 displays the screen content of mobile phone 2 in real time, but temporarily does not display the local screen content of mobile phone 1.
  • in the screen sharing mode, users can not only do graffiti on mobile phone 1 and mobile phone 2 and share the graffiti information with each other, but also control mobile phone 1 and mobile phone 2 to send files to each other. Each of these is described below.
  • graffiti refers to a process in which an electronic device displays track lines according to a user's control operation (such as a touch operation) on its screen.
  • this track line is the graffiti content. The line style (such as solid line, dotted line, dash-dot line) and the color of the track line can be determined according to the user's configuration, or according to the pre-configuration of the system; this embodiment does not limit this.
  • the color, line style, and position on the screen of the track lines displayed by the electronic device are referred to as track information.
  • FIG. 3 is a schematic diagram of a screen graffiti process provided by an embodiment of the present application, which involves a process in which an electronic device (mobile phone 1 or mobile phone 2) performs graffiti according to a user's instruction.
  • The details are as follows.
  • when the electronic device displays the screen shared by mobile phone 2, it usually displays a graffiti control, the sharing duration, an exit control, and the like.
  • the graffiti control is used to control turning on or off the screen graffiti function of the electronic device.
  • the sharing duration is used to represent the duration of this screen sharing process.
  • the exit control is used to control the electronic device to exit screen sharing. User 1 and user 2 can each do graffiti on their respective electronic devices.
  • after the electronic device detects the user's operation on the graffiti control, it can display operation prompt information about screen graffiti, which prompts the user how to perform graffiti operations on the screen. After the electronic device closes the operation prompt information according to the user's instruction, it enters the graffiti mode.
  • after the electronic device enters the graffiti mode, it can display track lines according to the user's control track on the screen.
  • the electronic device also sends the track information of the local track line to the other electronic device participating in screen sharing, so that the other party can display the track line on its screen.
  • mobile phone 1 can also send the local graffiti content of mobile phone 1 to mobile phone 2, so that mobile phone 2 can display the track lines.
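As a rough illustration of turning a sampled control track into a displayed track line (the midpoint-smoothing approach and all names are assumptions, not taken from the application):

```python
# Illustrative sketch: smooth a raw control track (sampled touch points)
# into a polyline suitable for rendering as a graffiti track line.
def smooth_track(points, passes=1):
    """Simple midpoint smoothing; points: list of (x, y) tuples."""
    for _ in range(passes):
        if len(points) < 3:
            return points
        smoothed = [points[0]]                      # keep first endpoint
        for a, b in zip(points, points[1:]):
            smoothed.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
        smoothed.append(points[-1])                 # keep last endpoint
        points = smoothed
    return points
```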
  • the files sent between the two electronic devices may be documents, pictures, videos, audio, compressed packages, installation packages, etc., and this embodiment does not limit the types thereof.
  • FIG. 4A is a schematic diagram of a file sending control process provided by an embodiment of the present application, involving a process in which a screen receiving device (such as mobile phone 1) sends a file to a screen sharing device (such as mobile phone 2) according to user instructions.
  • after mobile phone 1 stops displaying the screen content shared by mobile phone 2, it selects a target file (for example, file 1) according to the user's instruction.
  • stopping the display of the screen content shared by mobile phone 2 may mean that mobile phone 1 exits the screen sharing mode and no longer receives the screen content shared by mobile phone 2, so as to display the local screen content; it may also mean that mobile phone 1 remains in the screen sharing mode but hides the screen content shared by mobile phone 2, so as to display the local screen content.
  • mobile phone 1 displays a variety of share sub-controls, such as the "Huawei Share", "Bluetooth", "Changlian", "Memorandum", "Send by email", "Cast screen", "Send to computer", "Upload to cloud disk", "Information" and other controls.
  • after mobile phone 1 detects the user's operation on the "Changlian" control, it displays a contact selection interface, which includes "select contacts" and "group chat" controls.
  • after mobile phone 1 detects the user's operation on the "select contacts" control, it displays contact information, such as the contact names "user 2", "user 3", "user 4", and so on.
  • after mobile phone 1 detects that the user selects the target contact (for example, user 2), it displays prompt information, a "cancel" control and a "send" control.
  • the prompt information is used to indicate that mobile phone 1 is about to send the target file (for example, file 1) to user 2.
  • mobile phone 1 sends file 1 to mobile phone 2 after detecting the user's operation on the "send" control.
  • FIG. 4B is a schematic diagram of a file sending control process provided by another embodiment of the present application, involving a process in which a screen receiving device (such as mobile phone 1) sends pictures to a screen sharing device (such as mobile phone 2) according to user instructions. The control process of (a) to (f) in FIG. 4B is similar to that described for FIG. 4A and will not be repeated here.
  • the control process shown in FIG. 4A or FIG. 4B can also be used to control mobile phone 2 to send a file to mobile phone 1, which will not be repeated in this embodiment.
  • FIG. 5 is a schematic diagram of a control process for exiting the screen sharing mode provided by an embodiment of the present application, involving the screen receiving device (such as mobile phone 1).
  • the mobile phone 1 may display prompt information, a "cancel" control and a "confirm” control.
  • This prompt message is used to ask the user whether to exit the screen sharing mode.
  • the prompt information may be "you will not be able to see the screen of the other party after you exit the shared screen. Are you sure to exit?".
  • the "Cancel” control is used to cancel displaying the prompt message, and continue to display the screen content shared by the mobile phone 2 .
  • the "confirm" control is used to control the mobile phone 1 to exit the current screen sharing mode.
  • the mobile phone 2 can also exit the screen sharing mode according to the user's instruction.
  • for the specific control process, refer to FIG. 5, which will not be repeated in this embodiment.
  • the shared screen receiving device (such as mobile phone 1) can only display the screen content of one party (that is, the local screen content of mobile phone 1, or the screen content shared by mobile phone 2). If mobile phone 1 wants to display the screen content of the other party, the screen content needs to be switched, which is inconvenient for the user and results in a poor user experience.
  • the embodiment of the present application provides a screen sharing method, so that during the screen sharing process, the user of the screen receiving device can watch the local screen content and the screen content shared by other electronic devices at the same time, so as to improve user experience.
  • the user operation process of file sending can be simplified, and the efficiency of file sending can be improved.
  • Fig. 6 is a schematic architecture diagram of a screen sharing system provided by an embodiment of the present application.
  • the screen sharing system includes a screen receiving device, a screen sharing device and at least one virtual display device.
  • the screen receiving device and the screen sharing device are connected through a communication application (such as a Changlian application).
  • the screen receiving device and the virtual display device can be wired or wirelessly connected.
  • the wired connection includes connecting through a universal serial bus (USB) cable.
  • the wireless connection includes connection through wireless communication technologies such as Bluetooth (BT), wireless fidelity (Wi-Fi), and near field communication (NFC).
  • the screen receiving device and the screen sharing device may also be respectively connected to the first virtual display device and the second virtual display device.
  • the screen receiving device and the screen sharing device may be terminal devices such as mobile phones, tablet computers, notebook computers, desktop computers, smart TVs, wearable devices (such as smart watches), vehicle-mounted devices, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); this embodiment does not limit their specific types.
  • the virtual display device may be a virtual reality (VR) device, such as VR glasses, or another virtual display device, such as a mixed reality (MR) device.
  • FIG. 7A shows a schematic structural diagram of a wearable device provided by an embodiment of the present application, taking a wearable device as an example of the virtual display device.
  • the wearable device 100 may include a processor 110, a memory 120, a sensor module 130 (which may be used to acquire the user's posture), a microphone 140, a button 150, an input and output interface 160, a communication module 170, a camera 180, a battery 190, an optical display module 1100, an eye tracking module 1200, and so on.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the wearable device 100 .
  • the wearable device 100 may include more or fewer components than shown in the illustration, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 is generally used to control the overall operation of the wearable device 100, and may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a video processing unit (VPU) controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can call it directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 may be used to control the optical power of the wearable device 100 .
  • the processor 110 may be used to control the optical power of the optical display module 1100 to realize the function of adjusting the optical power of the wearable device 100 .
  • the processor 110 can adjust the relative positions of the various optical devices (such as lenses) in the optical display module 1100, so that the position of the virtual image plane formed for the human eye can be adjusted. In this way, the effect of controlling the optical power of the wearable device 100 is achieved.
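The relation between lens position and virtual image plane can be illustrated with the thin-lens approximation. This is a sketch under an assumed optical model (the embodiment does not name one): for an eyepiece of focal length f and a display placed at distance u < f, the virtual image appears at distance v = u·f/(f − u), so small movements of the optics shift the perceived image plane substantially.

```python
def virtual_image_distance(focal_length_mm: float, object_distance_mm: float) -> float:
    """Magnitude of the virtual image distance formed by a converging
    eyepiece when the display lies inside the focal length: v = u*f/(f - u).
    Thin-lens approximation; illustrative only."""
    if object_distance_mm >= focal_length_mm:
        raise ValueError("display must lie inside the focal length for a virtual image")
    return object_distance_mm * focal_length_mm / (focal_length_mm - object_distance_mm)

# A 50 mm eyepiece with the display 45 mm away places the virtual image
# plane at 450 mm: a 5 mm change in optics position, a large image shift.
print(virtual_image_distance(50.0, 45.0))  # 450.0
```

The hypothetical function name and numbers are chosen for illustration; a real optical display module would account for multiple elements and aberrations.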
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, a universal serial bus (USB) interface, and/or a serial peripheral interface (SPI), etc.
  • the processor 110 may render different objects based on different frame rates, for example, use a high frame rate for rendering for nearby objects, and use a low frame rate for rendering for distant objects.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus, which converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the communication module 170 .
  • the processor 110 communicates with the Bluetooth module in the communication module 170 through the UART interface to realize the Bluetooth function.
  • the MIPI interface can be used to connect the processor 110 with the display screen in the optical display module 1100 , the camera 180 and other peripheral devices.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 180 , the display screen in the optical display module 1100 , the communication module 170 , the sensor module 130 , the microphone 140 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the camera 180 can collect images including real objects, and the processor 110 can fuse the images collected by the camera with virtual objects and display the fused images through the optical display module 1100.
  • the USB interface is an interface that conforms to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface can be used to connect a charger to charge the wearable device 100, and can also be used to transmit data between the wearable device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as mobile phones.
  • the USB interface may be USB 3.0, which is compatible with high-speed DisplayPort (DP) signal transmission and can transmit high-speed video and audio data.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the wearable device 100 .
  • the wearable device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the wearable device 100 may include a wireless communication function.
  • the wearable device 100 may receive rendered images from other electronic devices (such as a VR host or a VR server) for display, or receive an unrendered image, which the processor 110 then renders and displays.
  • the communication module 170 may include a wireless communication module and a mobile communication module.
  • the wireless communication function can be realized by an antenna (not shown), a mobile communication module (not shown), a modem processor (not shown), and a baseband processor (not shown).
  • Antennas are used to transmit and receive electromagnetic wave signals. Multiple antennas may be included in the wearable device 100, and each antenna may be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module can provide wireless communication solutions applied on the wearable device 100, including second generation (2G)/third generation (3G)/fourth generation (4G)/fifth generation (5G) networks, etc.
  • the mobile communication module may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module can receive electromagnetic waves through the antenna, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave and radiate it through the antenna.
  • at least part of the functional modules of the mobile communication module may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speakers, etc.), or displays images or videos through the display screen in the optical display module 1100 .
  • the modem processor may be a stand-alone device. In some other embodiments, the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module or other functional modules.
  • the wireless communication module can provide wireless communication solutions applied on the wearable device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module may be one or more devices integrating at least one communication processing module.
  • the wireless communication module receives electromagnetic waves through the antenna, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module can also receive the signal to be sent from the processor 110, frequency-modulate it, amplify it, and convert it into electromagnetic wave to radiate through the antenna.
  • the antenna of the wearable device 100 is coupled to the mobile communication module, so that the wearable device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • GNSS can include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the wearable device 100 realizes the display function through the GPU, the optical display module 1100 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the optical display module 1100 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the memory 120 may be used to store computer-executable program code, including instructions.
  • the processor 110 executes various functional applications and data processing of the wearable device 100 by executing instructions stored in the memory 120 .
  • the memory 120 may include a program storage area and a data storage area.
  • the program storage area can store an operating system, at least one application program required by a function (such as a sound playing function or an image playing function), and the like.
  • the data storage area can store data created during use of the wearable device 100 (such as audio data, a phonebook, etc.) and the like.
  • the memory 120 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the wearable device 100 can implement audio functions, such as music playback and recording, through an audio module, a speaker, a microphone 140, an earphone interface, and an application processor.
  • the audio module is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module can also be used to encode and decode audio signals.
  • the audio module may be set in the processor 110 , or some functional modules of the audio module may be set in the processor 110 .
  • The loudspeaker, also called a "horn", is used to convert audio electrical signals into sound signals. The wearable device 100 can play music or hands-free calls through the speaker.
  • the microphone 140, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • the wearable device 100 may be provided with at least one microphone 140 .
  • the wearable device 100 may be provided with two microphones 140, which may also implement a noise reduction function in addition to collecting sound signals.
  • the wearable device 100 can also be provided with three, four or more microphones 140 to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions, etc.
  • the headphone jack is used to connect wired headphones.
  • the headphone interface can be a USB interface, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the wearable device 100 may include one or more buttons 150 that may control the wearable device and provide the user with access to functions on the wearable device 100 .
  • Keys 150 may be in the form of buttons, switches, dials, and touch or near-touch sensing devices such as touch sensors. Specifically, for example, the user can turn on the optical display module 1100 of the wearable device 100 by pressing a button.
  • the keys 150 include a power key, a volume key and the like.
  • the key 150 may be a mechanical key or a touch key.
  • the wearable device 100 can receive key input and generate key signal input related to user settings and function control of the wearable device 100 .
  • the wearable device 100 may include an input-output interface 160, and the input-output interface 160 may connect other devices to the wearable device 100 through suitable components.
  • Components may include, for example, audio/video jacks, data connectors, and the like.
  • the optical display module 1100 is used for presenting images to the user under the control of the processor.
  • the optical display module 1100 can convert a real pixel image display into a near-eye projected virtual image display through one or several optical devices such as mirrors, transmission mirrors, or optical waveguides, so as to realize a virtual interactive experience, or an interactive experience combining virtual and real content.
  • the optical display module 1100 receives image data information sent by the processor, and presents corresponding images to the user.
  • the wearable device 100 may further include an eye tracking module 1200, which is used to track the movement of human eyes, and then determine the point of gaze of the human eyes.
  • the position of the pupil can be located by image processing technology, the coordinates of the center of the pupil can be obtained, and then the gaze point of the person can be calculated.
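The pupil-localization step described above can be sketched as a dark-pixel centroid followed by a pre-calibrated mapping to a gaze point. The threshold-and-centroid approach and all names below are illustrative assumptions; the embodiment only says image processing technology is used.

```python
import numpy as np

def pupil_center(eye_image, dark_threshold=50):
    """Locate the pupil as the centroid of dark pixels; a simplified
    stand-in for the image-processing step mentioned above."""
    ys, xs = np.nonzero(eye_image < dark_threshold)
    if xs.size == 0:
        return None  # no pupil found in this frame
    return float(xs.mean()), float(ys.mean())

def gaze_point(center, scale=(1.0, 1.0), offset=(0.0, 0.0)):
    """Map the pupil center to a gaze point with a pre-calibrated affine
    map; the calibration procedure itself is outside this sketch."""
    cx, cy = center
    return cx * scale[0] + offset[0], cy * scale[1] + offset[1]

# Synthetic eye frame: bright sclera with a dark 10x10 "pupil".
frame = np.full((60, 80), 255, dtype=np.uint8)
frame[20:30, 40:50] = 0
center = pupil_center(frame)
print(center)  # (44.5, 24.5)
```

A production eye tracker would use more robust pupil detection (ellipse fitting, glint compensation); the centroid is the minimal version of "obtain the coordinates of the center of the pupil".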
  • the virtual display device provided by the embodiment of the present application has the function of automatically adjusting the optical power. In some embodiments, this function can be realized by the optical display module 1100 .
  • FIG. 7B is a schematic composition diagram of an optical display module provided by an embodiment of the present application.
  • the optical display module 1100 includes an eyepiece 701 , a zoom module 702 and a display screen 703 .
  • the virtual display device can be used to support VR or MR technology to provide a virtual display function.
  • the virtual display device may be a head-mounted display (HMD) device, such as VR or MR glasses, a VR or MR helmet, or a VR or MR all-in-one machine.
  • the virtual display device may also be a component included in the head-mounted virtual display devices exemplified above.
  • the virtual display device may also be used to support the implementation of mixed reality technology.
  • Fig. 8 shows a control flow chart of the screen sharing mode provided by an embodiment of the present application, which involves a process in which the mobile phone 2 shares its screen with the mobile phone 1, and the VR glasses display the screen content of the mobile phone 1 and the screen content shared by the mobile phone 2 at the same time. Specifically, the following steps S801-S812 are included.
  • after the mobile phone 1 establishes a connection with the VR glasses, the mobile phone 1 enters the VR display mode.
  • the virtual display interface of the VR glasses can display some video resources, VR game resources, VR controls, etc. that support VR mode playback.
  • the VR control may include a "VR projection" control, which is used to control the VR glasses to enter the projection mode through the VR projection application, and display the screen content of the mobile phone 1 .
  • the VR glasses open the VR projection application and enter the VR projection mode.
  • the VR glasses display the first screen content of the mobile phone 1 on the first area of the virtual display interface.
  • when the VR glasses detect the user's operation on the "VR projection" control, they open the VR projection application and enter the VR projection mode.
  • the mobile phone 1 sends the local screen content of the mobile phone 1 to the VR glasses through the VR projection application, that is, the content of the first screen.
  • for example, as shown in FIG. 10, the VR glasses determine the first area in the virtual display interface, and display the content of the first screen of the mobile phone 1 in the first area.
  • the user can issue a virtual touch operation through gestures or control the content on the first screen through a control handle (also called a VR handle) matched with the VR glasses.
  • control the content of the first screen to slide up and down or left and right, select files, icons or controls in the content of the first screen, operate communication applications (such as the Changlian application) to make voice or video calls with friends (such as user 2), or answer audio/video calls initiated by friends, etc.
  • when the mobile phone 1 enters the VR projection mode, the mobile phone 1 can display the content of the first screen locally, or can turn off its screen to save power and reduce the system operation consumption of the mobile phone 1.
  • the user when the mobile phone 1 is in the VR projection mode and the mobile phone 1 is in the off-screen state, the user can also use the mobile phone 1 as a control handle of the VR glasses.
  • the VR glasses determine whether the content on the second screen of the mobile phone 2 is received.
  • the VR glasses determine the second area, and set the second area not to respond to the user's control operation.
  • the VR glasses After receiving the second screen content, the VR glasses determine a second area while displaying the first area, for displaying the received second screen content.
  • the VR glasses can set the second area not to respond to the user's control operation, that is, the user cannot control the display content in the second area, but can only watch it, so as to ensure the information security of the mobile phone 2 .
  • the VR glasses record the location information of the first area and the second area, for file sending control.
  • the position information of the first area is used to indicate the display position of the first area in the virtual display interface.
  • the location information of the first area may include the vertex coordinates of the upper left corner of the first area and the width and height information of the first area.
  • the position information of the second area is used to indicate the display position of the second area in the virtual display interface.
  • the location information of the second area may include the vertex coordinates of the upper left corner of the second area and the width and height information of the second area.
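The area bookkeeping just described (an upper-left vertex plus width and height per area) can be sketched as follows. This is a minimal Python illustration with hypothetical names; that file sending is triggered by hit-testing a point against the recorded areas, and that the second area's non-responsiveness is a per-area flag, are assumptions drawn from the surrounding description rather than stated implementation details.

```python
from dataclasses import dataclass

@dataclass
class Region:
    x: int          # x coordinate of the upper-left vertex
    y: int          # y coordinate of the upper-left vertex
    width: int
    height: int
    responds_to_input: bool = True  # the second area is set not to respond

    def contains(self, px: int, py: int) -> bool:
        """Hit-test a point in virtual-interface coordinates against this area."""
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

def drop_target(point, areas):
    """Return the recorded area a pointer event falls into, if any."""
    px, py = point
    for area in areas:
        if area.contains(px, py):
            return area
    return None

# Example layout: first area (mobile phone 1's screen) on the left,
# second area (screen shared by mobile phone 2) on the right, view-only.
first_area = Region(0, 0, 540, 960)
second_area = Region(560, 0, 540, 960, responds_to_input=False)
```

With this bookkeeping, dragging a file to a point inside `second_area` could be interpreted as an instruction to send that file to the sharing device, while input events landing in a region with `responds_to_input=False` are displayed but never forwarded as control operations.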
  • the VR glasses After the VR glasses record the location information of the first area and the second area, they can control the mobile phone 1 to send files to the mobile phone 2 according to the location information.
  • the fast file sending switch is used to control whether the VR glasses control the mobile phone 1 to send files to the mobile phone 2 in response to user operations.
  • the VR glasses display the second screen content in real time on the second area.
  • since the mobile phone 2 sends the second screen content to the mobile phone 1 in real time, the VR glasses display the received second screen content on the second area in real time.
  • the VR glasses judge whether the mobile phone 2 stops screen sharing.
  • both the mobile phone 1 and the mobile phone 2 can control the mobile phone 2 to stop screen sharing according to user instructions. After the VR glasses detect that the mobile phone 2 has stopped sharing the screen, the next step S810 is executed.
  • During the voice call or video call between the mobile phone 1 and the mobile phone 2, if the mobile phone 2 stops sharing the screen with the mobile phone 1, the mobile phone 1 notifies the VR glasses that the screen sharing has stopped. The VR glasses then stop displaying the content of the second screen in the second area and turn off the quick file sending switch, but continue to display the content of the first screen of the mobile phone 1 in the first area.
  • in this case, the VR glasses will neither display the content of the first screen nor the content of the second screen, and will switch the current screen sharing mode to the normal screen sharing mode.
  • the VR glasses determine whether to exit the VR screen projection application.
  • the VR glasses exit the VR screen projection application.
  • the virtual display interface can simultaneously display the content of the first screen of the mobile phone 1 and the content of the second screen shared by the mobile phone 2.
  • the content of the first screen of the mobile phone 1 and the content of the second screen shared by the mobile phone 2 are jointly displayed on the virtual display interface of the VR glasses, so that the user can see the screens of two electronic devices at the same time, which provides a better user experience.
  • the mode of simultaneously displaying the content of the first screen of the mobile phone 1 and the content of the second screen of at least one other mobile phone (for example, mobile phone 2) on the virtual display device is called a virtual screen sharing mode.
  • the mode of displaying the content of the second screen of the mobile phone 2 on the screen of the mobile phone 1 shown above is called a normal screen sharing mode.
  • the first area in the VR glasses can display different content according to user control; for example, it can display the call interface of the Changlian application, the main screen interface of the mobile phone 1, or application interfaces such as a video playback interface, a text interface, a game interface, and a shopping interface, which is not limited in this embodiment.
  • the virtual display interface also displays content such as background controls, rotation controls, and more functional controls.
  • the background control is used to control whether the VR glasses display background content behind the first area and the second area.
  • the rotation control is used to control the rotation of the first area. For example, the control rotates the first area from landscape to portrait, or from portrait to landscape. More functional controls are used to provide more control functions, such as adjusting the brightness, contrast, and aspect ratio of the virtual display interface.
  • the VR glasses can also receive and simultaneously display the screen content shared by multiple electronic devices. For example, when the mobile phone 1 is making a group call with the mobile phone 2 and the mobile phone 3, if the mobile phone 2 and the mobile phone 3 share their screens with the mobile phone 1 at the same time, the VR glasses will simultaneously display the content of the first screen of the mobile phone 1, the second screen content shared by the mobile phone 2, and the screen content shared by the mobile phone 3, which provides a better user experience.
  • the screen content shared by the plurality of electronic devices may be displayed around the first area.
  • the VR glasses can make full use of the virtual space they generate to display the screen content of the mobile phone 1 and the mobile phone 2 at the same time, so that user 1 can watch and operate his own screen content (that is, the screen content of the mobile phone 1) while watching his friend's screen content (the screen content of the mobile phone 2), which provides a better user experience.
  • user 1 and user 2 can not only do screen graffiti, but also quickly control mobile phone 1 and mobile phone 2 to send files. Each of them will be described below.
  • the second area displayed by the VR glasses not only includes the screen content shared by the mobile phone 2, but also includes graffiti controls, sharing duration and exit controls. Normally, the VR glasses do not respond to the user's control operations on the screen content. However, when the user controls the VR glasses to enter the graffiti mode through the graffiti control, they can graffiti on the shared screen.
  • Fig. 12A is a schematic flow chart of the graffiti process provided by an embodiment of the present application, which relates to the process of displaying and sharing graffiti content by VR glasses according to user instructions in the virtual screen sharing mode. Specifically, the following steps S1201-S1205 are included.
  • the VR glasses determine a first control track of a user in a second area.
  • the VR glasses start to detect the first control track of the user in the second area after controlling the second area to enter the graffiti mode.
  • the first control track may be the user's virtual touch track in the second area, or it may be the moving track of the cursor under the control of the VR handle when the cursor of the VR handle is in the second area.
  • the VR glasses display the first track line in the second area according to the first control track.
  • the VR glasses can display the first track line according to the user's virtual touch track, for example, a track drawn around backpack 1 in the shared shopping interface.
  • the VR glasses send the first trajectory information of the first trajectory line to the mobile phone 1 .
  • the first track information includes the color, line style and position of the first track line in the second area.
  • the mobile phone 1 sends the first trajectory information of the first trajectory line to the mobile phone 2.
  • the mobile phone 2 displays the first trajectory line according to the first trajectory information.
  • the first track line displayed by the mobile phone 2 may be shown in FIG. 12B . It should be understood that the position of the first trajectory line on the screen of the mobile phone 2 is the same as its position in the second area of the VR glasses.
  • the user of the mobile phone 1 can browse the screen of the mobile phone 1 through the VR glasses, and at the same time do graffiti on the screen shared by the mobile phone 2, and share the graffiti content with the mobile phone 2, which has a better user experience.
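Steps S1203-S1205 relay the first trajectory information (color, line style, and position in the second area) from the VR glasses through mobile phone 1 to mobile phone 2. A minimal sketch of how such trajectory information could be serialized for that relay (the field names and JSON encoding are assumptions, not the Changlian protocol):

```python
import json
from dataclasses import dataclass, asdict
from typing import List, Tuple

@dataclass
class TrajectoryInfo:
    """Trajectory information of a track line: its color, line style,
    and point positions within the second area."""
    color: str                          # e.g. "#FF0000"
    line_style: str                     # e.g. "solid", "dashed"
    points: List[Tuple[float, float]]   # positions in the second area

def encode_trajectory(info: TrajectoryInfo) -> str:
    """VR glasses side (S1203): serialize for sending to mobile phone 1."""
    return json.dumps(asdict(info))

def decode_trajectory(payload: str) -> TrajectoryInfo:
    """Mobile phone 2 side (S1205): recover the track line for display."""
    data = json.loads(payload)
    data["points"] = [tuple(p) for p in data["points"]]
    return TrajectoryInfo(**data)
```

Mobile phone 1 only forwards the payload; it does not need to interpret it.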
  • FIG. 13A is a schematic flow chart of the graffiti process provided by another embodiment of the present application, which relates to the process of displaying and sharing graffiti content on the mobile phone 2 according to user instructions in the virtual screen sharing mode. Specifically, the following steps S1301-S1305 are included.
  • the mobile phone 2 determines a second control track of the user on the screen.
  • the second control track on the mobile phone 2 may be the user's touch track on the screen of the mobile phone 2.
  • the mobile phone 2 displays the second track line according to the second control track.
  • the mobile phone 2 can display the second track line according to the user's touch track, for example, a track drawn around backpack 3 on its screen.
  • the mobile phone 2 sends the second trajectory information of the second trajectory line to the mobile phone 1.
  • the second trajectory information includes the color, style and position of the second trajectory line on the screen of the mobile phone 2 .
  • the mobile phone 1 sends the second trajectory information of the second trajectory line to the VR glasses.
  • the VR glasses display the second track line in the second area according to the second track information.
  • the second trajectory displayed by the VR glasses can be shown in FIG. 13B . It should be understood that the position of the second track line in the second area is the same as its position on the screen of the mobile phone 2 .
  • while the VR glasses display the screen content of the mobile phone 2 in the second area, they can also display in the second area the second track line drawn by the user on the mobile phone 2, which provides a better user experience.
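Both graffiti directions rely on the track line occupying the same relative position in the second area and on the screen of mobile phone 2, even though the two surfaces have different pixel dimensions. One way to sketch this, assuming trajectory positions are exchanged as normalized coordinates (an assumption; the patent does not specify the coordinate encoding):

```python
def to_device(point_norm, width, height):
    """Map a normalized trajectory point (each axis in 0..1) to the pixel
    grid of a display surface, so the same point lands at the same
    relative position on the phone screen and in the second area."""
    x, y = point_norm
    return (round(x * width), round(y * height))
```

For example, `to_device((0.5, 0.5), 1080, 2340)` gives the center of a 1080x2340 phone screen, and the same normalized point gives the center of the second area under its own dimensions.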
  • when user 1 wants to control the screen receiving device (such as mobile phone 1) to send a target file (for example, file 1) to the screen sharing device (such as mobile phone 2), he can select the target file through the VR handle or a virtual touch operation, and drag the file icon of the target file from the first area of the virtual display interface to the second area of the virtual display interface, thereby controlling the mobile phone 1 to send the target file to the mobile phone 2.
  • Fig. 14 is a schematic flow chart of the screen sharing method provided by an embodiment of the present application, involving the process in which, in the virtual screen sharing mode, the mobile phone 1 as the screen receiving device sends a file to the mobile phone 2 as the screen sharing device according to the user's drag operation. Specifically, the following steps S1401-S1407 are included.
  • the VR glasses monitor whether a first control event occurs in a first area.
  • the first control event refers to selecting a target file and dragging a corresponding file icon in the virtual display interface.
  • this includes the VR glasses selecting the target file and dragging the file icon according to instructions issued by the VR handle, or selecting the target file and dragging the file icon according to the user's virtual touch operation.
  • the VR glasses record the first control event.
  • recording the first control event includes recording a moving track of the file icon in the virtual display interface.
  • the VR glasses render and display the file icon corresponding to the first control event.
  • the icon of the file 1 is rendered and displayed in real time along the track of the user dragging the file 1 .
  • the VR glasses drag the icon of file 1 according to the movement track of the VR handle.
  • the VR glasses move the icon of the file 1 according to the movement track of the user's touch finger.
  • the VR glasses determine whether a second control event occurs in the monitored second area.
  • the second control event refers to an event of releasing the selected target file. For example, during the process of dragging the target file through the virtual touch operation, the finger releases the target file. Or, during the process of selecting and dragging the target file through the VR handle, cancel the control of the VR handle on the target file, etc.
  • the icon of the target file moves on the virtual display interface following the dragging track.
  • when the user stops the dragging operation, if the file icon of the target file is located in the second area, it means that the target position of the dragging operation is the second area and the second control event occurs in the second area, and the electronic device executes the following step S1405. If the file icon of the target file is not in the second area, it is determined that the second control event does not occur in the second area, and the next step S1405 is not executed.
  • the user can also adjust the display position of the file icon of the target file in the first area by dragging and dropping within the first area. For example, when the icon of the target file is at position A of the first area, if the electronic device detects that the user drags the file icon of the target file to position B of the first area, the electronic device changes the display position of the file icon from position A to position B.
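The decision in S1404 reduces to a hit test: the second control event occurs only if the icon's release position falls inside the bounds of the second area. A minimal sketch, assuming the second area can be represented as an axis-aligned rectangle (the representation and names are hypothetical):

```python
from typing import NamedTuple

class Rect(NamedTuple):
    """Axis-aligned bounds of an area on the virtual display interface."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def second_control_event_occurred(second_area: Rect, release_pos) -> bool:
    """S1404: releasing the dragged icon counts as a second control
    event only when the release happens inside the second area."""
    px, py = release_pos
    return second_area.contains(px, py)
```

A release anywhere else (including back inside the first area) simply repositions the icon instead of triggering a send.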
  • the VR glasses can be set with a quick file sending switch, such as a drag and drop switch, which is used to control whether the VR glasses respond to the user's drag operation, and control the screen receiving device to send the target file to the screen sharing device.
  • the VR glasses can also control whether to enable the fast file sending switch according to user instructions.
  • if the electronic device detects that the quick file sending switch has been turned on, it continues to execute the next step S1406. If the electronic device detects that the quick file sending switch is not turned on, it can display a first prompt message on the virtual display interface, which is used to remind the user that the quick file sending switch of the VR glasses is not turned on.
  • the first prompt information may include the text "The switch for quick file sending is not enabled", a "Cancel" control and an "Enable" control. The "Cancel" control is used to cancel the display of the current first prompt message, and the "Enable" control is used to guide the user to enable the quick file sending switch of the VR glasses.
  • the VR glasses determine whether the file type of the target file supports sending.
  • the file types supported to be sent between electronic devices include files in jpg, gif, doc, docx, ppt, pdf, wma, wav and other formats.
  • files in some formats do not support sending, such as files with extensions of bin, exe, dll, pem, lgb, etc.
  • a white list may be maintained in the VR glasses, and the white list includes file types that are supported to be sent. Before the electronic device sends the target file, it needs to judge whether the file type of the target file supports sending according to the white list.
  • the VR glasses may also display a second prompt message in the first area.
  • the second prompt information may be "file type not supported" or the like.
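The white-list check in S1406 can be sketched as a simple extension lookup. The set contents mirror the formats listed above; the extension-based check itself is an assumption, since the patent does not specify how file types are determined:

```python
# File types listed above as supported for sending between devices.
SUPPORTED_EXTENSIONS = {"jpg", "gif", "doc", "docx", "ppt", "pdf", "wma", "wav"}

def sending_supported(filename: str) -> bool:
    """Return True if the target file's type is on the white list;
    otherwise the second prompt ("file type not supported") is shown."""
    if "." not in filename:
        return False
    ext = filename.rsplit(".", 1)[-1].lower()
    return ext in SUPPORTED_EXTENSIONS
```

Files with extensions such as bin, exe, dll, pem or lgb fall outside the white list and are rejected before any send is attempted.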
  • the VR glasses send the target file to the electronic device corresponding to the second area.
  • after the VR glasses receive the user's file sending operation (for example, after the user drags the icon of the target file from the first area to the second area), the VR glasses need to first send a file sending instruction to the mobile phone 1, and the mobile phone 1 sends the target file to the mobile phone 2 according to the file sending instruction.
  • after the VR glasses determine that the target file can be sent, they directly send a file sending instruction to the mobile phone 1, so that the mobile phone 1 sends the target file to the mobile phone 2.
  • a third prompt message may be displayed in a pop-up box in the first area to ask the user whether to confirm sending the target file.
  • the third prompt information may include the text "Are you sure to send "file 1" to user 2?", and a "Cancel” control and a "Confirm” control.
  • the "Cancel" control is used to cancel the display of the third prompt message and terminate the current file sending process.
  • the "Confirm” control is used to control the VR glasses to send a file sending instruction to the mobile phone 1, so that the mobile phone 1 sends the target file to the mobile phone 2 according to the file sending instruction.
  • the mobile phone 1 can send the target file to the mobile phone 2 based on the current communication channel with the mobile phone 2 (such as the Changlian channel).
  • after receiving the file sending instruction, the mobile phone 1 directly sends the target file (such as file 1) to the mobile phone 2, and the mobile phone 2 automatically receives the target file.
  • after the mobile phone 2 receives the target file, it may display the file icon of the target file in the dialog box with the user 1.
  • after receiving the file sending instruction, the mobile phone 1 first sends a file sending request to the mobile phone 2, and temporarily stores the target file in the application server (such as the server of the Changlian application). After the mobile phone 2 receives the file sending request, it displays a fourth prompt message, a "Cancel" control and a "Receive" control.
  • the fourth prompt information is used to ask the user 2 whether to receive the target file. In an example, as shown in FIG. 19 , the fourth prompt information is "Do you want to receive the "file 1" sent by user 1?".
  • the "Cancel" control is used to refuse to receive the target file.
  • the “Receive” control is used to control the receiving of target files. After the mobile phone 2 detects the user's operation on the "receive” control, it controls the mobile phone 2 to download the target file from the application server to the local, and displays the file icon of the target file in the dialog box with the user 1.
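The second delivery mode above stages the target file on the application server and completes the download only after user 2 taps "Receive" (or discards it on "Cancel"). A minimal in-memory sketch of that flow (the class and method names are hypothetical, not the Changlian server API):

```python
class StagedTransfer:
    """Sketch of server-staged delivery: phone 1 stages the file,
    phone 2 either downloads it ("Receive") or discards it ("Cancel")."""

    def __init__(self):
        self.staged = {}  # transfer_id -> temporarily stored payload

    def stage(self, transfer_id: str, payload: bytes) -> None:
        # Phone 1: temporarily store the target file on the server
        # while the file sending request is shown on phone 2.
        self.staged[transfer_id] = payload

    def receive(self, transfer_id: str) -> bytes:
        # Phone 2 tapped "Receive": download the file to the local device.
        return self.staged.pop(transfer_id)

    def cancel(self, transfer_id: str) -> None:
        # Phone 2 tapped "Cancel": refuse the file and drop the staged copy.
        self.staged.pop(transfer_id, None)
```

Compared with the direct-send mode, this variant gives user 2 a chance to refuse before any file data reaches the mobile phone 2.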
  • the user can also control the mobile phone 1 to send the target file to the mobile phone 2 by copying the target file in the first area and pasting the target file in the second area (copy and paste operation for short).
  • the file sending control process can be referred to as shown in S1401-S1407 in FIG. 14 , and details are not described here in this embodiment.
  • the difference from the embodiment shown in FIG. 14 is that in this embodiment, the first control event is the operation of copying the target file in the first area, and the second control event is the operation of pasting the target file in the second area.
  • the virtual display interface of the VR glasses simultaneously displays the screen content of the mobile phone 1 and the screen content shared by the multiple electronic devices.
  • the mobile phone 1 can send the target file to the electronic device through the methods provided in the above-mentioned embodiments.
  • the virtual display device can control the mobile phone 1 to send the target file to the target device according to the file sending operation (such as a drag operation, or a copy and paste operation) input by the user on the virtual display interface.
  • the user can control the mobile phone 1 to send the target file to the target device by performing only one or two control operations on the virtual display interface; the user operation is very simple and provides a better user experience.
  • both the local screen content of the mobile phone 1 and the shared screen content of the mobile phone 2 can be displayed on the virtual display interface of the VR glasses.
  • the mobile phone 2 since the mobile phone 2 is not connected to the VR glasses, it can only display the local screen of the mobile phone 2 on the mobile phone 2. Based on this, when the user needs to control the mobile phone 2 to send a file to the mobile phone 1, the user can send the file according to the file sending control process shown in FIG. 20 .
  • Fig. 20 is a schematic diagram of a file sending control process provided by another embodiment of the present application.
  • after the mobile phone 2 detects that the user selects the target file, it displays the "Share" control.
  • the electronic device displays a shortcut share control, such as a "share with VR friends" control.
  • the VR friend refers to a friend who uses a virtual display device during a current call, such as user 1.
  • the electronic device displays a file sending prompt message, a "cancel" control and a "send” control.
  • the file sending prompt information is used to prompt that the electronic device is about to send the file 1 to the user 1.
  • the mobile phone 2 sends the file 1 to the mobile phone 1 after detecting the user's operation on the "send” control.
  • the mobile phone 1 is connected to the first VR glasses, and the mobile phone 2 is connected to the second VR glasses.
  • the first VR glasses are displaying the content of the first screen of mobile phone 1 and the content of the second screen shared by mobile phone 2.
  • the second VR glasses only display the second screen content of the mobile phone 2 (that is, the screen content shared with the mobile phone 1). Therefore, when the user 2 controls the mobile phone 2 to send the target file to the mobile phone 1, the file sending control operation can be performed on the screen displayed by the second VR glasses, with reference to the file sending control process shown in FIG. 14, to send the target file to the mobile phone 1.
  • with the file sending method provided in this embodiment, when the screen receiving device is in the virtual screen sharing mode, the user only needs four steps of operation to control the screen sharing device to send the target file to the screen receiving device.
  • the method provided by this embodiment is easier for users to operate and has better user experience.
  • This embodiment provides a virtual display device, the virtual display device is connected to a screen receiving device, the screen receiving device is connected to a screen sharing device, and the virtual display device is configured to execute the methods shown in the foregoing embodiments.
  • the virtual display device is VR glasses.
  • This embodiment provides a computer program product, the program product includes a program, and when the program is run by the electronic device, the electronic device performs the methods shown in the foregoing embodiments.
  • An embodiment of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the methods shown in the foregoing embodiments are implemented.
  • An embodiment of the present application provides a chip, the chip includes a memory and a processor, and the processor executes a computer program stored in the memory, so as to control the above-mentioned electronic device to execute the methods shown in the above-mentioned embodiments.
  • the processor mentioned in the embodiments of the present application may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the memory mentioned in the embodiments of the present application may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories.
  • the non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM) or a flash memory.
  • Volatile memory can be random access memory (RAM), which acts as external cache memory.
  • By way of example but not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM) and direct rambus random access memory (DR RAM).
  • references to "one embodiment” or “some embodiments” or the like in the specification of the present application means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application relates to the field of terminal technologies, and provides a screen sharing method, system and virtual display device, which can solve the problem that, during screen sharing, a screen receiving device cannot display its own screen and the screen shared by another electronic device at the same time. The method is applied to a virtual display device; the virtual display device is connected to a first electronic device, and the first electronic device is connected to a second electronic device; the first electronic device is used to display first screen content, the second electronic device is used to display second screen content, and the first electronic device receives the second screen content sent by the second electronic device. The method includes: obtaining the first screen content and the second screen content from the first electronic device; and displaying the first screen content in a first area of the display interface of the virtual display device, and displaying the second screen content in a second area of the display interface of the virtual display device.

Description

Screen sharing method, system and virtual display device

This application claims priority to Chinese patent application No. 202110701084.1, entitled "Screen sharing method, system and virtual display device", filed with the China National Intellectual Property Administration on June 23, 2021, the entire contents of which are incorporated herein by reference.

Technical field

This application relates to the field of terminal technologies, and in particular, to a screen sharing method, system and virtual display device.

Background

At present, some communication applications support a function of sharing a screen between the electronic devices of application friends. Taking mobile phone 2 (belonging to user 2) sharing its screen with mobile phone 1 (belonging to user 1) as an example, mobile phone 1 can display the screen content of mobile phone 2 on its local screen. Usually, during screen sharing, mobile phone 1, as the device receiving the shared content, can only display the screen content of one party (that is, the screen content of mobile phone 1 or of mobile phone 2). If user 1 wants to view the other party's screen, a screen switch is required, resulting in a poor user experience.
Summary

This application provides a screen sharing method, virtual display device and system, so that during screen sharing, a screen receiving device can display both its own screen and the screen shared by another device through a virtual display device.

To achieve the above purpose, this application adopts the following technical solutions:

In a first aspect, an embodiment of this application provides a screen sharing method applied to a virtual display device; the virtual display device is connected to a first electronic device, and the first electronic device is connected to a second electronic device; the first electronic device is used to display first screen content, the second electronic device is used to display second screen content, and the first electronic device receives the second screen content sent by the second electronic device. The method includes: obtaining the first screen content and the second screen content from the first electronic device; and displaying the first screen content in a first area of the display interface of the virtual display device, and displaying the second screen content in a second area of the display interface of the virtual display device.

In the various aspects of this application, the virtual display device may be a VR device or an MR device; the first electronic device is the device receiving the shared screen (that is, the screen receiving device), and the second electronic device is the device initiating the screen sharing (that is, the screen sharing device). The first electronic device and the second electronic device may be terminal devices such as mobile phones, laptops and tablets. In one example, with reference to the detailed description of this application, the virtual display device may be VR glasses, the first electronic device may be mobile phone 1, and the second electronic device may be mobile phone 2.

With the screen sharing method provided by the embodiments of this application, while the second electronic device shares its screen with the first electronic device, the virtual display device can make full use of the virtual space it generates and display the screen contents of the first electronic device and the second electronic device at the same time, so that the user of the first electronic device can view and operate his own screen content (that is, the screen content of the first electronic device) while viewing the friend's screen content (that is, the screen content of the second electronic device), which provides a better user experience.

In some embodiments, the method further includes: after detecting a file sending operation, the virtual display device controls the first electronic device to send a target file to the second electronic device, where the file sending operation is an operation instructing the virtual display device to send the target file displayed in the first area to the second area.

Exemplarily, the file sending operation includes: dragging the target file from the first area to the second area, for example, dragging the icon of the target file from the first area into the second area through the user's virtual touch operation.

Alternatively, the file sending operation includes: dragging the icon of the target file from the first area into the second area through the control handle of the virtual display device.

Alternatively, the file sending operation includes: copying the target file from the first area to the second area, for example, copying the target file from the first area through the control handle of the virtual display device and then pasting it in the second area.

With the method provided by this embodiment, in the screen sharing mode, the user can quickly control his own device (that is, the first electronic device) to send the target file to the friend's device (that is, the second electronic device) through a drag or copy-and-paste operation, and the operation flow is simple.

In some embodiments, controlling, by the virtual display device after detecting the file sending operation, the first electronic device to send the target file to the second electronic device includes: after detecting a first control event in the first area and a second control event in the second area, the virtual display device controls the first electronic device to send the target file to the second electronic device. The first control event is used to select the target file, and the second control event is used to release the selected target file; or, the first control event is used to copy the target file, and the second control event is used to paste the target file.

In some embodiments, the method further includes: when the quick file sending switch of the virtual display device is on and the target file supports being shared, the virtual display device controls the first electronic device to send the target file to the second electronic device.

In some embodiments, the method further includes: after the virtual display device enters the virtual screen sharing mode, the virtual display device automatically turns on its quick file sending switch.

Alternatively, after the second control event is detected in the second area, if the quick file sending switch of the virtual display device is not on, the virtual display device displays first prompt information, which is used to prompt the user to turn on the quick file sending switch; in response to the operation of turning on the quick file sending switch, the virtual display device controls the quick file sending switch to be on.

In some embodiments, after the second control event is detected in the second area, the method further includes: if the target file does not support sending, the virtual display device displays second prompt information, which is used to prompt that the target file does not support sending. For example, if the file type of the target file is not in the white list, the second prompt information is displayed.

In some embodiments, controlling, by the virtual display device, the first electronic device to send the target file to the second electronic device includes: displaying third prompt information, which is used to ask the user whether to send the target file; and in response to a received instruction indicating to send the target file, controlling the first electronic device to send the target file to the second electronic device.

In some embodiments, the method further includes: the virtual display device displays fourth prompt information in the second area, and the fourth prompt information is used to prompt the second electronic device to receive the target file from the first electronic device.

In some embodiments, the method further includes: when multiple second electronic devices send second screen contents to the first electronic device, the virtual display device determines multiple second areas around the first area, and displays the second screen content of each second electronic device in the corresponding second area.

In some embodiments, the method further includes: the virtual display device obtains a first control track on the second area; displays a first track line on the second area according to the first control track; and sends first line information of the first track line to the second electronic device through the first electronic device, where the first line information is used by the second electronic device to display the first track line on top of the second screen content. The first control track may be the user's virtual touch track on the second area, or the movement track of the control handle of the virtual display device. The first track line may be called graffiti content.

With the method provided by this embodiment, while viewing the screen contents of the first electronic device and the second electronic device at the same time through the virtual display device, the user can also graffiti on the second area through the virtual display device and send the graffiti content to the second electronic device, which provides a better user experience.

In some embodiments, the method further includes: receiving, through the first electronic device, second line information sent by the second electronic device, where the second line information is used to determine a second track line, and the second track line is determined according to a second control track on the second electronic device; and displaying the second track line on the second area according to the second line information.

With the method provided by this embodiment, the virtual display device can display the graffiti content from the second electronic device on the second area, which provides a better user experience.

In a second aspect, an embodiment of this application provides a screen sharing system, including a virtual display device, a first electronic device and a second electronic device; the virtual display device is connected to the first electronic device, and the first electronic device is connected to the second electronic device; the first electronic device is used to display first screen content, and the second electronic device is used to display second screen content.

The first electronic device is configured to receive the second screen content sent by the second electronic device.

The virtual display device is configured to: obtain the first screen content and the second screen content from the first electronic device; and display the first screen content in a first area of the display interface of the virtual display device, and display the second screen content in a second area of the display interface of the virtual display device.

In some embodiments, the virtual display device is further configured to: after detecting a file sending operation, control the first electronic device to send a target file to the second electronic device, where the file sending operation is an operation instructing the virtual display device to send the target file displayed in the first area to the second area.

In some embodiments, the file sending operation includes: dragging or copying the target file from the first area to the second area.

In some embodiments, the virtual display device is further configured to: after detecting a first control event in the first area and a second control event in the second area, control the first electronic device to send the target file to the second electronic device. The first control event is used to select the target file, and the second control event is used to release the selected target file; or, the first control event is used to copy the target file, and the second control event is used to paste the target file.

In some embodiments, the virtual display device is further configured to: when the quick file sending switch of the virtual display device is on and the target file supports being shared, control the first electronic device to send the target file to the second electronic device.

In some embodiments, the virtual display device is further configured to: automatically turn on the quick file sending switch after entering the virtual screen sharing mode; or, after the second control event is detected in the second area, the virtual display device is further configured to: if its quick file sending switch is not on, display first prompt information, which is used to prompt the user to turn on the quick file sending switch; and in response to the operation of turning on the quick file sending switch, control the quick file sending switch to be on.

In some embodiments, the virtual display device is further configured to: if the target file does not support sending, display second prompt information, which is used to prompt that the target file does not support sending.

In some embodiments, the virtual display device is further configured to: display third prompt information, which is used to ask the user whether to send the target file; and in response to a received instruction indicating to send the target file, control the first electronic device to send the target file to the second electronic device.

In some embodiments, the second electronic device is further configured to: display fourth prompt information, which is used to prompt the second electronic device to receive the target file from the first electronic device.

In some embodiments, the virtual display device is further configured to: when multiple second electronic devices send multiple second screen contents to the first electronic device, determine multiple second areas around the first area, and display the multiple second screen contents in the multiple second areas respectively.

In some embodiments, the virtual display device is further configured to: obtain a first control track on the second area; display a first track line on the second area according to the first control track; and send first line information of the first track line to the second electronic device through the first electronic device. The second electronic device is further configured to display the first track line on the screen of the second electronic device according to the first line information.

In some embodiments, the second electronic device is further configured to: obtain a second control track on the screen of the second electronic device; display a second track line according to the second control track; and send second line information of the second track line to the virtual display device through the first electronic device. The virtual display device is further configured to display the second track line on the second area according to the second line information.

In a third aspect, an embodiment of this application further provides a virtual display device configured to execute the screen sharing method shown in the first aspect above.

In a fourth aspect, an embodiment of this application further provides a chip system, which includes a processor that executes a computer program stored in a memory to implement the screen sharing method shown in the first aspect above.

In a fifth aspect, an embodiment of this application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the screen sharing method shown in the first aspect above.

In a sixth aspect, an embodiment of this application further provides a computer program product including a program which, when run by an electronic device, causes the electronic device to execute the screen sharing method shown in the first aspect above.

It can be understood that, for the beneficial effects of the second to sixth aspects above, reference may be made to the relevant description in the first aspect, and details are not repeated here.
Brief description of the drawings

FIG. 1 is a schematic diagram of a screen sharing control process provided by an embodiment of this application;

FIG. 2 is a schematic diagram of a screen sharing control process provided by another embodiment of this application;

FIG. 3 is a schematic diagram of a screen graffiti process provided by an embodiment of this application;

FIG. 4A is a schematic diagram of a file sending control process provided by an embodiment of this application;

FIG. 4B is a schematic diagram of a file sending control process provided by another embodiment of this application;

FIG. 5 is a schematic diagram of a control process for exiting the screen sharing mode provided by an embodiment of this application;

FIG. 6 is a schematic architecture diagram of a screen sharing system provided by an embodiment of this application;

FIG. 7A is a schematic structural diagram of a wearable device provided by an embodiment of this application;

FIG. 7B is a schematic composition diagram of an optical display module provided by an embodiment of this application;

FIG. 8 is a control flowchart of the virtual screen sharing mode provided by an embodiment of this application;

FIG. 9 is a first schematic diagram of electronic device interaction provided by an embodiment of this application;

FIG. 10 is a first schematic diagram of a virtual display interface provided by an embodiment of this application;

FIG. 11 is a second schematic diagram of a virtual display interface provided by an embodiment of this application;

FIG. 12A is a schematic flowchart of a graffiti content sharing process provided by an embodiment of this application;

FIG. 12B is a first schematic diagram of a graffiti scenario in the virtual screen sharing mode provided by an embodiment of this application;

FIG. 13A is a schematic flowchart of a graffiti content sharing process provided by another embodiment of this application;

FIG. 13B is a second schematic diagram of a graffiti scenario in the virtual screen sharing mode provided by an embodiment of this application;

FIG. 14 is a first schematic flowchart of a file sending control method provided by an embodiment of this application;

FIG. 15 is a third schematic diagram of a virtual display interface provided by an embodiment of this application;

FIG. 16A is a fourth schematic diagram of a virtual display interface provided by an embodiment of this application;

FIG. 16B is a fifth schematic diagram of a virtual display interface provided by an embodiment of this application;

FIG. 17 is a second schematic diagram of electronic device interaction provided by an embodiment of this application;

FIG. 18 is a sixth schematic diagram of a virtual display interface provided by an embodiment of this application;

FIG. 19 is a seventh schematic diagram of a virtual display interface provided by an embodiment of this application;

FIG. 20 is a second schematic flowchart of a file sending control method provided by an embodiment of this application.
具体实施方式
下面结合附图,对本申请实施例提供的技术方案进行示例性说明。
在本实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
目前,一些通信应用(例如畅连应用)支持在应用好友的电子设备之间共享屏幕的功能。以手机2(属于用户2)向手机1(属于用户1)共享屏幕为例,手机1可以在本地的屏幕上显示手机2的屏幕内容。基于此,用户1通过手机1即可实时观看手机2的屏幕内容,例如,浏览手机2的相册、应用界面等。
需要说明的是,本实施例将发起共享屏幕内容的电子设备称为屏幕共享设备,例如手机2;将接收并显示其他电子设备共享的屏幕内容的电子设备称为屏幕接收设备,例如手机1。
图1是本申请的一个实施例提供的屏幕共享控制过程示意图。图1以畅连应用为例,对手机2向手机1共享屏幕的过程进行示例性说明。
参见图1中的(a)所示,手机2和手机1在基于畅连应用通话的过程中,手机2的通话界面上通常会显示“切换摄像头”控件、“挂断”控件和“更多”控件等内容。参见图1中的(b)所示,手机2在检测到用户对“更多”控件的操作之后,显示“静音”控件、“扬声器”控件、“切换语音”控件和“共享屏幕”控件等内容。参见图1中的(c)所示,手机2在检测到用户对“共享屏幕”控件的操作之后,显示提示信息、“取消”控件和“确定”控件。示例性的,该提示信息可以为“共享我的屏幕对方可以看到您的屏幕。确定共享?”。参见图1中的(d)所示,手机2在检测到用户对“确定”控件的操作之后,向手机1发送屏幕共享请求,并显示提示信息“共享屏幕中,等待对方确认……”。参见图1中的(e)所示,手机2在检测到手机1确认接受屏幕共享之后,进入屏幕共享模式,从而实时向手机1发送手机2的屏幕内容。
手机2所共享的屏幕内容具体根据用户2的需求确定。例如,参见图1中的(e)所示,当用户2想和用户1讨论购买双肩包时,用户2可控制手机2显示双肩包的购物界面。此时,手机2即可将该购物界面发送给手机1,以由手机1显示该购物界面。
图2是本申请另一个实施例提供的屏幕共享控制过程示意图,涉及手机1接受手机2的屏幕共享请求,进入屏幕共享模式的过程。参见图2中的(a)所示,手机1在接收到手机2发送的屏幕共享请求之后,显示另一提示信息、“取消”控件和“确定”控件。示例性的,该提示信息可以为“对方向您共享屏幕您可以看到对方屏幕。确定接受?”。参见图2中的(b)所示,手机1在检测到用户对“确定”控件的操作之后,将暂不显示手机1本地的屏幕内容,而显示手机2共享的屏幕内容,例如双肩包的购物界面。
在另一些实施例中,手机1在与手机2基于畅连应用通话的过程中,手机1还可以显示一个邀请控件,用于主动邀请手机2向手机1共享屏幕内容。在手机1同意手机2的邀请之后,手机2实时向手机1共享手机2的屏幕内容;手机1实时显示手机2的屏幕内容,而暂时不显示手机1本地的屏幕内容。
在屏幕共享模式下,用户不仅可以在手机1和手机2上进行涂鸦、相互共享涂鸦信息,还可以控制手机1和手机2相互发送文件。下面分别对其进行说明。
在本实施例中,涂鸦是指电子设备根据用户在其屏幕上的控制操作(例如触控操作)显示轨迹线条的过程。该轨迹线条即为涂鸦内容,该轨迹线条的线条样式(如实线、虚线、点划线等)、颜色可以根据用户的配置确定,也可以根据系统预先的配置确定,本实施例对此不进行限制。本实施例将电子设备所显示的轨迹线条的颜色、线条样式以及在屏幕上的位置,称为轨迹信息。
图3为本申请实施例提供的屏幕涂鸦过程的示意图,涉及电子设备(手机1或者手机2)根据用户指令进行涂鸦的过程。具体如下所示。
参见图3中的(a)所示,电子设备在显示手机2共享的屏幕时,通常会显示涂鸦控件、共享时长和退出控件等内容。其中,涂鸦控件用于控制开启或者关闭电子设备的屏幕涂鸦功能。该共享时长用于表示本次屏幕共享过程的持续时长。该退出控件用于控制电子设备退出屏幕共享。用户1和用户2可以分别在各自的电子设备上进行涂鸦。
参见图3中的(b)所示,电子设备在检测到用户对涂鸦控件的操作之后,可以显示关于屏幕涂鸦的操作提示信息,用于提示用户如何在屏幕上进行涂鸦操作。电子设备根据用户指令关闭该操作提示信息之后,即进入涂鸦模式。
参见图3中的(c)所示,电子设备在进入涂鸦模式之后,即可根据用户在屏幕上的控制轨迹显示轨迹线条。可选的,电子设备在根据用户指令涂鸦的过程中,还向参与屏幕共享的另一个电子设备发送本地轨迹线条的轨迹信息,以便对方在其屏幕上显示该轨迹线条。例如,手机1在根据用户指令涂鸦的过程中,还可以向手机2发送手机1本地的涂鸦内容,以便手机2显示该轨迹线条。
在本实施例中,两个电子设备之间相互发送的文件可以是文档、图片、视频、音频、压缩包、安装包等,本实施例对其类型不进行限制。
图4A是申请的一个实施例提供的文件发送控制过程的示意图,涉及屏幕接收设备(如手机1)根据用户指令向屏幕共享设备(如手机2)发送文档的过程。
参见图4A中的(a)所示,手机1在退出显示手机2共享的屏幕内容之后,根据用户指令选中目标文件(例如文件1)。其中,手机1退出手机2共享的屏幕内容可以是手机退出屏幕共享模式,不接收手机2共享的屏幕内容,从而显示本地屏幕内容;也可以是继续维持屏幕共享模式,但隐藏显示手机2共享的屏幕内容,从而显示本地屏幕内容。
参见图4A中的(b)所示,手机1在检测到用户选中目标文件之后,显示“分享”、“复制”、“移动”、“删除”和“更多”等控件。
参见图4A中的(c)所示,手机1在检测用户对“分享”控件的操作之后,显示多种分享子控件,例如“华为分享”、“蓝牙”、“畅连”、“备忘录”、“通过邮件发送”、“投屏”、“发送到电脑”、“上传到云盘”、“信息”等控件。
参见图4A中的(d)所示,手机1在检测到用户对“畅连”控件的操作之后,显示联系人选择界面,该联系人选择界面中包括“选择联系人”和“群聊”控件。
参见图4A中的(e)所示,手机1在检测到用户对“选择联系人”控件的操作之后,显示联系人信息,例如联系人名称“用户2”、“用户3”、“用户4”等。
参见图4A中的(f)所示,手机1在检测到用户选择目标联系人(例如用户2)之后,显示提示信息、“取消”控件和“发送”控件。其中,该提示信息用于提示手机1即将向用户2发送目标文件(例如文件1)。手机2在检测到用户对“发送”控件的操作之后,向手机2发送文件1。
图4B是申请另一个实施例提供的文件发送控制过程的示意图,涉及屏幕接收设备(如手机1)根据用户指令向屏幕共享设备(手机2)发送图片的过程。图4B中的(a)至图4B中的(f)示出的图片发送过程,可参见图4A中的(a)至图4A中的(f)示出的文档发送过程,本实施例在此不再赘述。
此外,当用户2向用户1发送文件时,也可以采用图4A或者图4B示出的控制过程,控制手机2向手机1发送文件,本实施例在此不再赘述。
图5是本申请实施例提供的退出屏幕共享模式控制过程示意图。以控制退出屏幕共享模式的电子设备是屏幕接收设备(例如手机1)为例,参见图5中的(a)至图5中的(b)所示,当手机1检测到用户对“退出”控件的操作之后,手机1可以显示提示信息、“取消”控件和“确定”控件。该提示信息用于询问用户是否退出屏幕共享模式。示例性的,该询问信息可以是“退出共享屏幕您将看不到对方的屏幕。确定退出?”。该“取消”控件用于取消显示该提示信息,并继续显示手机2共享的屏幕内容。“确定”控件用于控制手机1退出当前的屏幕共享模式。
应理解,尽管本申请实施例未示出,手机2也可以根据用户指令退出屏幕共享模式,具体控制过程可参见图5所示,本实施例在此不再赘述。
综上可知,目前,在屏幕共享模式下,共享屏幕接收设备(如手机1)只能显示一方的屏幕内容(即手机1本地的屏幕内容,或者手机2共享的屏幕内容。若手机1要显示另一方的屏幕内容,则需要进行屏幕内容切换,用户操作不便,体验不佳。
此外,在上述屏幕共享模式下,当手机1显示手机2共享的屏幕内容时,若用户1需要控制手机1向手机2发送文件,则需要先将手机1当前的显示内容切换至手机1的屏幕内容,随后再通过一系列的操作过程(例如图4A或者图4B所示的6个步骤的控制操作),才能控制手机1向手机2发送文件。在文件发送完成之后,若用户1还想继续观看手机2共享的屏幕内容,还需要重新控制手机1切换屏幕内容。由此可见,该文件发送控制方法涉及的用户操作繁琐,用户体验不佳。
为此,本申请实施例提供一种屏幕共享方法,以使得在共享屏幕的过程中,屏幕接收设备的用户能够同时观看本地的屏幕,和其他电子设备共享的屏幕,提高用户体验。此外, 还可以简化文件发送的用户操作过程,提高文件发送效率。
图6是本申请实施例提供的屏幕共享系统的示意性架构图。参见图6所示,该控制系统包括屏幕接收设备、屏幕共享设备和至少一个虚拟显示设备。屏幕接收设备和屏幕共享设备之间通过通信应用(例如畅连应用)连接。屏幕接收设备与虚拟显示设备之间可以有线连接,也可以无线连接。其中,有线连接包括通过通用串行总线(universal serial bus,USB)线连接。无线连接包括通过蓝牙(bluetooth,BT)技术、无线保真(wireless fidelity,WiFi)、近场通信(near field communication,NFC)等无线通信技术连接。
需要说明的是,在另一些实施例中,屏幕接收设备和屏幕共享设备之间通过通信应用(例如畅连应用)连接。屏幕接收设备和屏幕共享设备也可以分别连接第一虚拟显示设备和第二虚拟显示设备。
在本实施例中,屏幕接收设备和屏幕共享设备可以是手机、平板电脑、笔记本电脑、台式电脑、智能电视、可穿戴设备(如智能手表)、车载设备、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等终端设备,本实施例对其具体类型不进行限制。该虚拟显示设备可以是虚拟现实(virtual reality,VR)设备,例如VR眼镜,也可以是其他虚拟显示设备,例如混合现实(Mixed reality,MR)设备等。
示例性的,请参考图7A,以虚拟显示设备为可穿戴设备为例,示出了本申请实施例提供的一种可穿戴设备的结构示意图。如图7A所示,可穿戴设备100可以包括处理器110,存储器120,传感器模块130(可以用于获取用户的姿态),麦克风140,按键150,输入输出接口160,通信模块170,摄像头180,电池190、光学显示模组1100以及眼动追踪模组1200等。
可以理解的是,本申请实施例示意的结构并不构成对可穿戴设备100的具体限定。在本申请另一些实施例中,可穿戴设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110通常用于控制可穿戴设备100的整体操作,可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),视频处理单元(video processing unit,VPU)控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在本申请的一些实施例中,处理器110可以用于控制可穿戴设备100的光焦度。示例性的,处理器110可以用于控制光学显示模组1100的光焦度,实现对可穿戴设备100的光焦度的调整的功能。例如,处理器110可以通过调整光学显示模组1100中各个光学器 件(如透镜等)之间的相对位置,使得光学显示模组1100的光焦度得到调整,进而使得光学显示模组1100在向人眼成像时,对应的虚像面的位置可以得到调整。从而达到控制可穿戴设备100的光焦度的效果。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口,串行外设接口(serial peripheral interface,SPI)接口等。
在一些实施例中,处理器110可以基于不同帧率对不同对象进行渲染,比如,对近景对象使用高帧率渲染,对远景对象使用低帧率进行渲染。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与通信模块170。例如:处理器110通过UART接口与通信模块170中的蓝牙模块通信,实现蓝牙功能。
MIPI接口可以被用于连接处理器110与光学显示模组1100中的显示屏,摄像头180等外围器件。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头180,光学显示模组1100中的显示屏,通信模块170,传感器模块130,麦克风140等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。可选的,摄像头180可以采集包括真实对象的图像,处理器110可以将摄像头采集的图像与虚拟对象融合,通过光学显示模组1100现实融合得到的图像,该示例可以参见图10所示的应用场景,在此不重复赘述。
USB接口是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口可以用于连接充电器为可穿戴设备100充电,也可以用于可穿戴设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如手机等。USB接口可以是USB3.0,用于兼容高速显示接口(display port,DP)信号传输,可以传输视音频高速数据。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对可穿戴设备100的结构限定。在本申请另一些实施例中,可穿戴设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
另外,可穿戴设备100可以包含无线通信功能,比如,可穿戴设备100可以从其它电子设备(比如VR主机或VR服务器)接收渲染后的图像进行显示,或者,接收未渲染的图像然后处理器110对图像进行渲染并显示。通信模块170可以包含无线通信模块和移动通信模块。无线通信功能可以通过天线(未示出)、移动通信模块(未示出),调制解调处理器(未示出)以及基带处理器(未示出)等实现。
天线用于发射和接收电磁波信号。可穿戴设备100中可以包含多个天线,每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块可以提供应用在可穿戴设备100上的包括第二代(2nd generation,2G)网络/第三代(3rd generation,3G)网络/第四代(4th generation,4G)网络/第五代(5th generation,5G)网络等无线通信的解决方案。移动通信模块可以包括至少一个滤波器，开关，功率放大器，低噪声放大器(low noise amplifier,LNA)等。移动通信模块可以由天线接收电磁波，并对接收的电磁波进行滤波，放大等处理，传送至调制解调处理器进行解调。移动通信模块还可以对经调制解调处理器调制后的信号放大，经天线转为电磁波辐射出去。在一些实施例中，移动通信模块的至少部分功能模块可以被设置于处理器110中。在一些实施例中，移动通信模块的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器等)输出声音信号,或通过光学显示模组1100中的显示屏显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块或其他功能模块设置在同一个器件中。
无线通信模块可以提供应用在可穿戴设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块经由天线接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线转为电磁波辐射出去。
在一些实施例中，可穿戴设备100的天线和移动通信模块耦合，使得可穿戴设备100可以通过无线通信技术与网络以及其他设备通信。该无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM)，通用分组无线服务(general packet radio service,GPRS)，码分多址接入(code division multiple access,CDMA)，宽带码分多址(wideband code division multiple access,WCDMA)，时分同步码分多址(time-division synchronous code division multiple access,TD-SCDMA)，长期演进(long term evolution,LTE)，BT，GNSS，WLAN，NFC，FM，和/或IR技术等。GNSS可以包括全球卫星定位系统(global positioning system,GPS)，格洛纳斯卫星导航系统(global navigation satellite system,GLONASS)，北斗卫星导航系统(beidou navigation satellite system,BDS)，准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
可穿戴设备100通过GPU，光学显示模组1100，以及应用处理器等实现显示功能。GPU为图像处理的微处理器，连接光学显示模组1100和应用处理器。GPU用于执行数学和几何计算，用于图形渲染。处理器110可包括一个或多个GPU，其执行程序指令以生成或改变显示信息。
存储器120可以用于存储计算机可执行程序代码,该可执行程序代码包括指令。处理器110通过运行存储在存储器120的指令,从而执行可穿戴设备100的各种功能应用以及数据处理。存储器120可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储可穿戴设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,存储器120可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
可穿戴设备100可以通过音频模块,扬声器,麦克风140,耳机接口,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块还可以用于对音频信号编码和解码。在一些实施例中,音频模块可以设置于处理器110中,或将音频模块的部分功能模块设置于处理器110中。
扬声器,也称“喇叭”,用于将音频电信号转换为声音信号。可穿戴设备100可以通过扬声器收听音乐,或收听免提通话。
麦克风140,也称“话筒”,“传声器”,用于将声音信号转换为电信号。可穿戴设备100可以设置至少一个麦克风140。在另一些实施例中,可穿戴设备100可以设置两个麦克风140,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,可穿戴设备100还可以设置三个,四个或更多麦克风140,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口用于连接有线耳机。耳机接口可以是USB接口，也可以是3.5毫米(mm)的开放移动终端平台(open mobile terminal platform,OMTP)标准接口，美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
在一些实施例中，可穿戴设备100可以包括一个或多个按键150，这些按键可以用于控制可穿戴设备100，为用户提供访问可穿戴设备100上的功能的途径。按键150的形式可以是按钮、开关、刻度盘和触摸或近触摸传感设备(如触摸传感器)。具体的，例如，用户可以通过按下按钮来打开可穿戴设备100的光学显示模组1100。按键150包括开机键，音量键等。按键150可以是机械按键，也可以是触摸式按键。可穿戴设备100可以接收按键输入，产生与可穿戴设备100的用户设置以及功能控制有关的键信号输入。
在一些实施例中,可穿戴设备100可以包括输入输出接口160,输入输出接口160可以通过合适的组件将其他装置连接到可穿戴设备100。组件例如可以包括音频/视频插孔,数据连接器等。
光学显示模组1100用于在处理器的控制下,为用户呈现图像。光学显示模组1100可以通过反射镜、透射镜或光波导等中的一种或几种光学器件,将实像素图像显示转化为近眼投影的虚拟图像显示,实现虚拟的交互体验,或实现虚拟与现实相结合的交互体验。例如,光学显示模组1100接收处理器发送的图像数据信息,并向用户呈现对应的图像。
在一些实施例中,可穿戴设备100还可以包括眼动跟踪模组1200,眼动跟踪模组1200用于跟踪人眼的运动,进而确定人眼的注视点。如,可以通过图像处理技术,定位瞳孔位置,获取瞳孔中心坐标,进而计算人的注视点。
结合对图7A所示的可穿戴设备100的说明,本申请实施例提供的虚拟显示设备具有自动调节光焦度的功能。在一些实施例中,该功能可以通过光学显示模组1100实现。
示例性的,请参考图7B,为本申请实施例提供的一种光学显示模组的组成示意图。光学显示模组1100包括目镜701,变焦模块702和显示屏703。
该虚拟显示设备能够用于支持VR或MR技术提供虚拟显示功能。在具体实现中,该虚拟显示设备可以为头戴式显示(Head-mounted display,HMD)设备,比如VR或MR眼镜,VR或MR头盔,或者VR或MR一体机。或者该虚拟显示设备也可以包括在以上举例头戴式虚拟显示设备中。需要说明的是,在一些实施例中,该虚拟显示设备还可以用于支持混合现实技术的实现。
下面结合图6示出的屏幕共享系统,以屏幕接收设备是手机1,屏幕共享设备是手机2,虚拟显示设备是VR眼镜(VR Glass)为例,对本申请实施例提供的技术方案进行说明。
图8示出了本申请实施例提供的屏幕共享模式的控制流程图,涉及手机2向手机1共享屏幕的过程中,VR眼镜同时显示手机1的屏幕内容和手机2共享的屏幕内容的过程。具体包括如下步骤S801-S812。
S801,VR眼镜连接手机1,进入VR显示模式。
在本实施例中,当手机1与VR眼镜建立连接之后,手机1进入VR显示模式。在VR显示模式下,VR眼镜的虚拟显示界面可以显示一些支持VR模式播放的影视资源、VR游戏资源、VR控件等。VR控件中可以包括“VR投屏”控件,用于通过VR投屏应用控制VR眼镜进入投屏模式,显示手机1的屏幕内容。
S802,VR眼镜打开VR投屏应用,进入VR投屏模式。
S803,VR眼镜在虚拟显示界面的第一区域上显示手机1的第一屏幕内容。
当VR眼镜检测到用户对“VR投屏”控件的操作之后，打开VR投屏应用，并进入VR投屏模式。在VR投屏模式下，参见图9所示，手机1通过VR投屏应用向VR眼镜发送手机1的本地屏幕内容，即第一屏幕内容。VR眼镜则如图10所示，在虚拟显示界面中确定第一区域，并在第一区域中显示手机1的第一屏幕内容。
用户可以通过手势发出虚拟触控操作或者通过VR眼镜匹配的控制手柄(也可称作VR手柄)控制第一屏幕内容。例如,控制第一屏幕内容上下或者左右滑动,选择第一屏幕内容中的文件、图标或者控件等,操作通信应用(例如畅连应用)与好友(例如用户2)进行语音或者视频通话,或者接听好友发送的音/视频通话等。
需要说明的是,当手机1进入VR投屏模式之后,手机1本地可以显示第一屏幕内容,也可以息屏节省电量,减少手机1的系统运行消耗。
在一些实施例中,当手机1处于VR投屏模式,且手机1处于息屏状态时,用户也可以将手机1作为VR眼镜的控制手柄使用。
S804,VR眼镜判断是否接收到手机2的第二屏幕内容。
在手机1和手机2进行语音通话或者视频通话的过程中，若手机2向手机1共享屏幕，则参见图9所示，手机2会不断向手机1发送手机2的第二屏幕内容。手机1在接收到第二屏幕内容之后，会不断地向VR眼镜发送第二屏幕内容。因此，VR眼镜需要判断是否接收到手机2的第二屏幕内容。若VR眼镜接收到手机2的第二屏幕内容，则执行下一步骤S805。
S805,VR眼镜确定第二区域,并设置第二区域不响应用户的控制操作。
VR眼镜在接收到第二屏幕内容之后,在显示第一区域的同时确定第二区域,用于显示接收到的第二屏幕内容。可选的,VR眼镜可以设置第二区域不响应用户的控制操作,即用户无法控制第二区域的显示内容,只能进行观看,以保证手机2的信息安全。
S806,VR眼镜记录第一区域和第二区域的位置信息,用于文件发送控制。
在本实施例中,第一区域的位置信息用于表示第一区域在虚拟显示界面中的显示位置。示例性的,第一区域的位置信息可以包括第一区域左上角的顶点坐标和第一区域的宽高信息。第二区域的位置信息用于表示第二区域在虚拟显示界面中的显示位置。示例性的,第二区域的位置信息可以包括第二区域左上角的顶点坐标和第二区域的宽高信息。
VR眼镜在记录第一区域和第二区域的位置信息之后,可以根据该位置信息控制手机1向手机2发送文件。具体参见下文描述,本实施例在此不进行赘述。
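示例性的，第一区域和第二区域的位置信息及其用途可以用如下Python代码示意。其中Region类及其字段名均为说明性假设，contains方法对应后文根据位置信息判断文件图标是否落入某一区域的逻辑：

```python
from dataclasses import dataclass

# 示意：用左上角顶点坐标和宽高描述虚拟显示界面中的一个区域
@dataclass
class Region:
    x: float       # 左上角顶点横坐标
    y: float       # 左上角顶点纵坐标
    width: float   # 区域宽度
    height: float  # 区域高度

    def contains(self, px: float, py: float) -> bool:
        """判断点 (px, py) 是否落在本区域内。"""
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

# 记录第一区域和第二区域的位置信息
first_region = Region(x=0, y=0, width=800, height=600)
second_region = Region(x=900, y=0, width=400, height=600)
print(second_region.contains(1000, 300))  # True：该点位于第二区域内
```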
S807,VR眼镜激活文件快速发送开关。
在本实施例中,文件快速发送开关,用于控制VR眼镜是否响应用户操作控制手机1向手机2发送文件。
S808,VR眼镜在第二区域上实时显示第二屏幕内容。
由于手机2是实时向手机1发送第二屏幕内容的，因此，参见图11所示，手机1在接收到第二屏幕内容之后，需要再将其发送给VR眼镜，以由VR眼镜实时将其显示在第二区域上。
S809,VR眼镜判断手机2是否停止屏幕共享。
在屏幕共享过程中,手机1和手机2均可以根据用户指令控制手机2停止屏幕共享。当VR眼镜检测到手机2停止屏幕共享后,将执行下一步骤S810。
S810,若VR眼镜检测到手机2停止屏幕共享,则停止显示第二屏幕内容,并关闭文件快速发送开关。
在手机1和手机2进行语音通话或者视频通话的过程中,若手机2停止向手机1共享屏幕,则手机1通知VR眼镜屏幕共享已停止,VR眼镜停止在第二区域显示第二屏幕内容,并关闭文件快速发送开关,但仍继续在第一区域显示手机1的第一屏幕内容。
在手机1和手机2进行语音通话或者视频通话的过程中，若手机2未停止向手机1共享屏幕，但手机1退出了VR投屏应用，则VR眼镜既不显示第一屏幕内容，也不显示第二屏幕内容，并将当前的屏幕共享模式切换至普通屏幕共享模式。
S811,VR眼镜判断是否退出VR投屏应用。
在VR投屏过程中,若用户向VR眼镜发出VR投屏应用退出指令,或者VR眼镜与手机1的连接断开,则VR眼镜退出VR投屏应用。
S812,若VR眼镜退出VR投屏应用,则VR眼镜停止显示第一屏幕内容。
基于上述S801-S812示出的方法可知，与前文中图1-图3、图4A、图4B以及图5示出的屏幕共享模式不同，在本实施例提供的屏幕共享模式中，VR眼镜的虚拟显示界面中能够同时显示手机1的第一屏幕内容，以及手机2共享的第二屏幕内容。换言之，手机1的第一屏幕内容以及手机2共享的第二屏幕内容被共同显示在VR眼镜的虚拟显示界面中，以使得用户能够同时看到两个电子设备的屏幕，具有较好的用户体验。
需要说明的是，本实施例将在虚拟显示设备上同时显示手机1的第一屏幕内容和至少一个其他手机(例如手机2)的第二屏幕内容的模式称为虚拟屏幕共享模式。而将前文中示出的，在手机1的屏幕上显示手机2的第二屏幕内容的模式称为普通屏幕共享模式。
在虚拟屏幕共享模式下,VR眼镜中的第一区域可以根据用户控制显示不同的内容,例如可以显示畅连应用的通话界面、手机1的主屏幕界面、视频播放界面、文本界面、游戏界面、购物界面等应用界面,本实施例对此不进行限制。
可选的,参见图10所示,虚拟显示界面中还显示有背景控件、旋转控件和更多功能控件等内容。其中,背景控件用于控制VR眼镜是否在第一区域和第二区域之后显示背景内容。旋转控件用于控制旋转第一区域。例如,控制将第一区域由横屏旋转为竖屏,或者由竖屏旋转为横屏。更多功能控件,用于提供更多的控制功能,例如调整虚拟显示界面亮度、对比度、画面比例等。
此外,在手机1和多个电子设备同时通话的过程中,VR眼镜也可以接收并同时显示该多个电子设备共享的屏幕内容。例如,当手机1和手机2、手机3进行群组通话的过程中,若手机2和手机3同时向手机1共享屏幕,则VR眼镜在显示手机1的第一屏幕内容的同时,还显示手机2共享的第二屏幕内容,以及手机3共享的屏幕内容,具有较好的用户体验。在一些实施例中,该多个电子设备共享的屏幕内容可以环绕显示在第一区域周围。
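示例性的，“环绕显示在第一区域周围”的一种可能布局可以用如下Python代码草拟。在以第一区域中心为圆心的圆周上均匀分布各个第二区域的中心点，仅为假设的几何方案，本申请对具体布局方式不作限定：

```python
import math

# 示意：在以第一区域中心为圆心的圆周上，均匀布置 n 个共享屏幕区域的中心点
def surround_layout(center, radius, n):
    cx, cy = center
    points = []
    for i in range(n):
        angle = 2 * math.pi * i / n
        points.append((cx + radius * math.cos(angle),
                       cy + radius * math.sin(angle)))
    return points

# 手机2和手机3同时共享屏幕时，两个第二区域分别位于第一区域两侧
centers = surround_layout(center=(0, 0), radius=100, n=2)
```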
综上可知,通过本申请实施例提供的屏幕共享方法,在手机2向手机1共享屏幕的过程中,VR眼镜能够充分利用其产生的虚拟空间,同时显示手机1和手机2的屏幕内容,使得用户1在观看好友屏幕内容(手机2的屏幕内容)的过程中,能够观看并操作自己的屏幕内容(即手机1的屏幕内容),具有较好的用户体验。
在上述虚拟屏幕共享模式下,用户1和用户2之间不仅可以进行屏幕涂鸦,还可以快速控制手机1和手机2发送文件。下面分别对其进行说明。
(一)屏幕涂鸦
参见图11所示,在虚拟屏幕共享模式下,VR眼镜显示的第二区域中不仅包括手机2共享的屏幕内容,还包括涂鸦控件、共享时长和退出控件。通常情况下,VR眼镜是不响应用户对该屏幕内容的控制操作的。但是,当用户通过涂鸦控件控制VR眼镜进入涂鸦模式之后,便可以在共享屏幕上进行涂鸦。
图12A是本申请的一个实施例提供的涂鸦过程的示意性流程图,涉及在虚拟屏幕共享模式下,VR眼镜根据用户指令显示并分享涂鸦内容的过程。具体包括如下步骤S1201-S1205。
S1201,VR眼镜确定用户在第二区域中的第一控制轨迹。
在虚拟屏幕共享模式下，VR眼镜在控制第二区域进入涂鸦模式之后开始检测用户在第二区域中的第一控制轨迹。该第一控制轨迹可以是用户在第二区域内的虚拟触控轨迹，也可以是当VR手柄的光标处于第二区域内时，光标在VR手柄控制下的移动轨迹。
S1202,VR眼镜根据第一控制轨迹,在第二区域中显示第一轨迹线条。
参见图12B中的第二区域,以该第二屏幕内容是双肩包的购物界面为例,VR眼镜可以根据用户在双肩包1上的虚拟触控轨迹显示第一轨迹线条。
S1203,VR眼镜向手机1发送第一轨迹线条的第一轨迹信息。该第一轨迹信息包括第一轨迹线条的颜色、线条样式和在第二区域中的位置。
S1204,手机1向手机2发送第一轨迹线条的第一轨迹信息。
S1205,手机2根据第一轨迹信息,显示第一轨迹线条。
示例性的,手机2显示的第一轨迹线条可以参见图12B所示。应理解,第一轨迹线条在手机2屏幕中的位置,与其在VR眼镜第二区域中的位置相同。
通过本实施例提供的方法,手机1的用户通过VR眼镜可以一边浏览手机1的屏幕,一边在手机2共享的屏幕上进行涂鸦,并向手机2分享涂鸦内容,具有较好的用户体验。
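示例性的，要使第一轨迹线条在手机2屏幕中的位置与其在VR眼镜第二区域中的位置相同，一种可能的做法是在第一轨迹信息中传输归一化坐标。如下Python代码为该换算的假设性草图，轨迹信息的具体编码格式本申请未作限定：

```python
# 示意：第二区域与手机2屏幕之间的轨迹坐标换算（归一化方案）
def to_normalized(points, region_w, region_h):
    """把第二区域内的轨迹点换算为 0~1 之间的归一化坐标。"""
    return [(x / region_w, y / region_h) for x, y in points]

def to_screen(norm_points, screen_w, screen_h):
    """把归一化坐标还原为接收方屏幕上的像素坐标。"""
    return [(x * screen_w, y * screen_h) for x, y in norm_points]

# 第二区域为 400x600，手机2屏幕为 1080x2340
norm = to_normalized([(200, 300)], 400, 600)   # [(0.5, 0.5)]
print(to_screen(norm, 1080, 2340))             # [(540.0, 1170.0)]
```

同一套换算也适用于后文手机2向VR眼镜方向发送第二轨迹线条的场景，只需交换发送方与接收方的宽高参数。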
图13A是本申请的另一个实施例提供的涂鸦过程的示意性流程图,涉及在虚拟屏幕共享模式下,手机2根据用户指令显示并分享涂鸦内容的过程。具体包括如下步骤S1301-S1305。
S1301,手机2确定用户在屏幕中的第二控制轨迹。
以图6示出的屏幕共享系统为例,在手机2进入涂鸦模式之后,由于手机2未连接虚拟显示设备,因此,手机2上的第二控制轨迹可以是用户在手机2屏幕上的触控轨迹。
S1302,手机2根据第二控制轨迹,显示第二轨迹线条。
参见图13B中的手机2的屏幕,以该屏幕显示双肩包的购物界面为例,手机2可以根据用户在双肩包3上的触控轨迹显示第二轨迹线条。
S1303,手机2向手机1发送第二轨迹线条的第二轨迹信息。该第二轨迹信息包括第二轨迹线条的颜色、线条样式和在手机2的屏幕中的位置。
S1304,手机1向VR眼镜发送第二轨迹线条的第二轨迹信息。
S1305，VR眼镜根据第二轨迹信息，在第二区域中显示第二轨迹线条。
示例性的,VR眼镜显示的第二轨迹线条可以参见图13B所示。应理解,第二轨迹线条在第二区域中的位置,与其在手机2的屏幕中的位置相同。
通过本申请实施例提供的方法,VR眼镜在第二区域中显示手机2的屏幕内容的同时,也可以在第二区域中显示手机2上用户涂鸦的第二轨迹线条,具有较好的用户体验。
(二)文件发送
在虚拟屏幕共享模式下,屏幕接收设备(例如手机1)和屏幕共享设备(手机2)之间,也可以相互快速发送文件。下面对其进行具体的说明。
(1)手机1向手机2发送文件
在虚拟屏幕共享模式下，当用户1向用户2发送文件时，用户1在手机1中找到目标文件(例如文件1)之后，可以通过VR手柄或者虚拟触控操作选中目标文件，并将目标文件的图标从虚拟显示界面的第一区域拖拽到第二区域内，从而控制手机1向手机2发送该目标文件。具体如下所示。
图14是本申请实施例提供的屏幕共享方法的示意性流程图,涉及在虚拟屏幕共享模式下,作为屏幕接收设备的手机1根据用户的拖拽操作,向作为屏幕共享设备的手机2发送文件的过程。具体包括如下步骤S1401-S1407。
S1401,VR眼镜监测第一区域内是否发生第一控制事件。
在本实施例中，第一控制事件是指选中目标文件，并在虚拟显示界面中拖拽对应的文件图标。其中包括VR眼镜根据VR手柄发出的指令选中目标文件并拖拽文件图标，或者根据用户的虚拟触控操作选中目标文件并拖拽文件图标。
S1402,若发生第一控制事件,VR眼镜则记录第一控制事件。
在本实施例中,记录第一控制事件包括记录文件图标在虚拟显示界面中的移动轨迹。
S1403，VR眼镜实时渲染并显示第一控制事件对应的文件图标。
例如,参见图15所示,VR眼镜在根据用户指令选中文件1之后,沿着用户拖拽文件1的轨迹实时渲染并显示文件1的图标。需要说明的是,在本实施例中,在VR眼镜根据VR手柄的控制指令选中目标文件的情况下,VR眼镜根据VR手柄的移动轨迹拖拽文件1的图标。而在VR眼镜根据用户的虚拟触控操作选中目标文件的情况下,VR眼镜则根据用户触控手指的移动轨迹移动文件1的图标。
S1404，VR眼镜监测第二区域内是否发生第二控制事件。
在本实施例中,第二控制事件是指释放选中的目标文件的事件。例如,在通过虚拟触控操作拖拽目标文件的过程中,手指松开目标文件。或者,在通过VR手柄选择拖拽目标文件的过程中,取消VR手柄对目标文件的控制等。
在用户拖拽目标文件的过程中，目标文件的图标跟随拖拽轨迹在虚拟显示界面上移动。当用户停止拖拽操作时，若目标文件的文件图标位于第二区域内，则表示本次拖拽操作的目标位置是第二区域，第二区域内发生第二控制事件，VR眼镜执行下一步骤S1405。若目标文件的文件图标不在第二区域内，则确定第二区域内未发生第二控制事件，不执行下一步骤S1405。
需要说明的是，在一些实施例中，在该虚拟显示界面中，用户也可以在第一区域内通过拖拽操作来调整目标文件的文件图标在第一区域上的显示位置。例如，当目标文件的图标在第一区域的A位置时，若VR眼镜检测到用户将目标文件的文件图标拖拽到第一区域的B位置，则VR眼镜将该文件图标的显示位置由A位置变更为B位置。
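结合S1404以及后续步骤S1405-S1407，文件发送的判定流程可以用如下Python代码概括。其中函数参数与返回的提示字符串均为说明性假设：

```python
# 示意：S1404-S1407 的判定流程
def handle_drag_release(icon_in_second_region, quick_send_enabled, file_type_supported):
    if not icon_in_second_region:     # S1404：图标未在第二区域内释放
        return "结束，不发送"
    if not quick_send_enabled:        # S1405：文件快速发送开关未开启
        return "显示第一提示信息"
    if not file_type_supported:      # S1406：文件类型不支持发送
        return "显示第二提示信息"
    return "发送目标文件"             # S1407：向第二区域对应的电子设备发送

print(handle_drag_release(True, True, True))   # 发送目标文件
print(handle_drag_release(True, False, True))  # 显示第一提示信息
```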
S1405,VR眼镜在检测到第二控制事件之后,检测文件快速发送开关是否开启。
可选的,VR眼镜可以设置一个文件快速发送开关,例如拖拽发送开关,用于控制VR眼镜是否响应用户的拖拽操作,控制屏幕接收设备向屏幕共享设备发送目标文件。通常情况下,VR眼镜在进入虚拟屏幕共享模式之后,会自动开启该文件快速发送开关。此外,在一些实施例中,VR眼镜也可以根据用户指令,控制是否开启该文件快速发送开关。
在S1405中，若VR眼镜检测到文件快速发送开关已开启，则继续执行下一步骤S1406。若VR眼镜检测到文件快速发送开关未开启，可以在虚拟显示界面上显示第一提示信息，用于提示用户VR眼镜的文件快速发送开关未开启。示例性的，参见图16A所示，该第一提示信息可以包括文字信息：“文件快速发送开关未开启”，以及“取消”控件和“去开启”控件。其中，该“取消”控件用于取消显示当前的第一提示信息，该“去开启”控件用于引导用户开启VR眼镜的文件快速发送开关。
S1406,VR眼镜确定目标文件的文件类型是否支持发送。
通常情况下,电子设备之间支持发送的文件类型包括jpg,gif,doc,docx,ppt,pdf,wma,wav等格式的文件。但是,有一些格式的文件则不支持发送,例如扩展名为bin、exe、dll、pem、lgb等的文件。为此,VR眼镜中可以维护一个白名单,该白名单中包括支持发送的文件类型。在电子设备发送目标文件之前,需要先根据该白名单判断该目标文件的文件类型是否支持发送。
若目标文件的文件类型支持发送，则执行下一步骤S1407。若目标文件的文件类型不支持发送，则不执行S1407，文件发送操作结束。可选的，若目标文件的文件类型不支持发送，VR眼镜还可以在第一区域显示第二提示信息。示例性的，参见图16B所示，该第二提示信息可以为“文件类型不支持”等。
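示例性的，基于白名单的文件类型判断可以用如下Python代码示意。白名单取值来自上文列举的格式，实际内容可以按需维护：

```python
import os

# 示意：支持发送的文件类型白名单
SEND_WHITELIST = {"jpg", "gif", "doc", "docx", "ppt", "pdf", "wma", "wav"}

def can_send(filename: str) -> bool:
    """扩展名在白名单内则支持发送；bin、exe、dll、pem、lgb 等不在白名单内，因而不支持。"""
    ext = os.path.splitext(filename)[1].lstrip(".").lower()
    return ext in SEND_WHITELIST

print(can_send("文件1.pdf"))   # True
print(can_send("安装包.exe"))  # False
```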
S1407,若目标文件的文件类型支持发送,VR眼镜则向第二区域对应的电子设备发送目标文件。
在本实施例中,参见图17所示,由于与手机2直接连接的电子设备是手机1,并且目标文件是存储于手机1中的,因此,VR眼镜在接收到用户的文件发送操作(如用户将目标文件的图标从第一区域拖拽至第二区域的操作)之后,需要先向手机1发送文件发送指令,手机1再根据文件发送指令向手机2发送目标文件。
对于VR眼镜向手机1发送文件发送指令:
在一些实施例中,VR眼镜在确定目标文件能够被发送之后,直接向手机1发送文件发送指令,以便手机1向手机2发送目标文件。
在另一些实施例中，VR眼镜在确定目标文件能够被发送之后，可以在第一区域弹框显示第三提示信息，用于询问用户是否确认发送目标文件。示例性的，参见图18所示，该第三提示信息可以包括文字“确认向用户2发送“文件1”？”，以及“取消”控件和“确认”控件。其中，“取消”控件用于取消显示该第三提示信息，并且终止本次文件发送过程。“确认”控件用于控制VR眼镜向手机1发送文件发送指令，以便手机1根据文件发送指令向手机2发送目标文件。
对于手机1向手机2发送目标文件:
由于手机1和手机2之间已通过通信应用(例如畅连应用)建立连接,因此,手机1可以基于与手机2当前的通信通道(如畅连通道)向手机2发送目标文件。
在一些实施例中,手机1在接收到文件发送指令之后,直接向手机2发送目标文件(例如文件1),手机2自动接收目标文件。可选的,手机2接收目标文件之后,可以将目标文件的文件图标显示在与用户1的对话框中。
在另一些实施例中,手机1在接收到文件发送指令之后,先向手机2发送文件发送请求,并将目标文件临时存储在应用服务器(例如畅连应用的服务器)中。手机2在接收到文件发送请求之后,显示第四提示信息、“取消”控件和“接收”控件。该第四提示信息用于询问用户2是否接收目标文件。在一个示例中,参见图19所示,该第四提示信息为“是否接收用户1发送的“文件1”?”。该“取消”控件用于拒绝接收目标文件。该“接收”控件用于控制接收目标文件。手机2在检测到用户对“接收”控件的操作之后,控制手机2从应用服务器将目标文件下载至本地,并将目标文件的文件图标显示在与用户1的对话框中。
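示例性的，上述经应用服务器中转、待用户2确认后再下载的文件发送方式可以用如下Python代码草拟。其中的数据结构与字段名均为说明性假设，畅连应用服务器的实际接口并未在本文中给出：

```python
# 示意：手机1先将目标文件暂存到应用服务器，手机2确认后再下载
def relay_send(server, receiver, file_name, auto_accept=False):
    server.setdefault("staged", []).append(file_name)        # 手机1 暂存目标文件
    if auto_accept or receiver.get("user_tapped_accept"):    # 自动接收，或用户点击“接收”控件
        server["staged"].remove(file_name)                   # 从服务器下载至本地
        receiver.setdefault("dialog_files", []).append(file_name)  # 图标显示在对话框中
        return "已接收"
    return "等待用户确认"                                    # 显示第四提示信息，等待用户操作

print(relay_send({}, {"user_tapped_accept": True}, "文件1"))  # 已接收
print(relay_send({}, {}, "文件1"))                            # 等待用户确认
```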
应理解,上述实施例中各步骤的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
在一些实施例中,用户还可以通过在第一区域复制目标文件,在第二区域粘贴目标文件的操作(简称复制粘贴操作),控制手机1向手机2发送目标文件。该文件发送控制过程可参见图14中的S1401-S1407所示,本实施例在此不进行赘述。与图14示出的实施例不同的是,在本实施例中,第一控制事件为在第一区域复制目标文件的操作,第二控制事件为在第二区域粘贴目标文件的操作。
在另一些实施例中，在多个电子设备向手机1同时共享屏幕的过程中，VR眼镜的虚拟显示界面上同时显示手机1和该多个电子设备共享的屏幕内容。对于任意一个电子设备，手机1均可通过上述各个实施例提供的方法向该电子设备发送目标文件。
综上可知，在屏幕共享模式下，通过本申请实施例提供的方法，虚拟显示设备根据用户在虚拟显示界面输入的文件发送操作(例如拖拽操作，或者复制粘贴操作)，即可控制屏幕接收设备向目标设备发送目标文件，无需来回切换屏幕显示内容，文件发送过程简便，文件发送效率较高。
此外，对于用户而言，相比于图4A和图4B示出的文件发送方法，在本实施例中，用户通过在虚拟显示界面上对目标文件执行1-2个步骤的控制操作，即可控制手机1向目标设备发送目标文件，用户操作非常简便，具有更好的用户体验。
(2)手机2向手机1发送文件
在一些实施例中,结合图6所示的系统,由于手机1与VR眼镜连接,因此手机1本地的屏幕内容和手机2共享的屏幕内容均可显示在VR眼镜的虚拟显示界面中。但由于手机2未连接VR眼镜,因此其仅能在手机2上显示手机2的本地屏幕。基于此,当用户需要控制手机2向手机1发送文件时,用户可以根据如图20所示的文件发送控制过程发送文件。
图20是本申请又一实施例提供的文件发送控制过程示意图。参见图20中的(a)至图20中的(d)所示,手机2在检测到用户选中目标文件之后,显示“分享”控件。响应于用户对“分享”控件的操作,电子设备显示快捷分享控件,例如“分享给VR好友”控件。该VR好友是指当前通话过程中使用虚拟显示设备的好友,例如用户1。响应于用户对快捷分享控件的操作,电子设备显示文件发送提示信息、“取消”控件和“发送”控件。该文件发送提示信息用于提示电子设备即将向用户1发送文件1。手机2在检测到用户对“发送”控件的操作之后,向手机1发送文件1。
在另一些实施例中，手机1连接第一VR眼镜，手机2连接第二VR眼镜。在手机2向手机1共享屏幕的过程中，第一VR眼镜同时显示手机1的第一屏幕内容和手机2共享的第二屏幕内容，但第二VR眼镜仅显示手机2的第二屏幕内容(也就是向手机1共享的屏幕内容)。因此，当用户2控制手机2向手机1发送目标文件时，可以参照图20所示的文件发送控制过程，在第二VR眼镜显示的屏幕上进行文件发送控制操作，通过第二VR眼镜控制手机2向手机1发送目标文件。
综上所述,通过本实施例提供的文件发送方法,在屏幕接收设备处于虚拟屏幕共享模式的情况下,用户仅需要通过4个步骤的操作,即可控制屏幕共享设备向屏幕接收设备发送目标文件。相对于图4A和图4B示出的文件发送方法,本实施例提供的方法的用户操作更为简便,具有更好的用户体验。
本实施例提供了一种虚拟显示设备,该虚拟显示设备与屏幕接收设备连接,该屏幕接收设备与屏幕共享设备连接,该虚拟显示设备被配置为执行上述各个实施例中示出的方法。在一些实施例中,该虚拟显示设备为VR眼镜。
本实施例提供了一种计算机程序产品，该程序产品包括程序，当该程序被电子设备运行时，使得电子设备执行上述各实施例中示出的方法。
本申请实施例提供一种计算机可读存储介质，该计算机可读存储介质存储有计算机程序，该计算机程序被处理器执行时实现上述各个实施例中示出的方法。
本申请实施例提供一种芯片，该芯片包括存储器和处理器，该处理器执行存储器中存储的计算机程序，以控制上述电子设备执行上述各个实施例中示出的方法。
应理解,本申请实施例中提及的处理器可以是中央处理单元(central processing unit,CPU),还可以是其他通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
还应理解,本申请实施例中提及的存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(dynamic RAM,DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
以上所述实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围,均应包含在本申请的保护范围之内。

Claims (27)

  1. 一种屏幕共享方法,其特征在于,所述方法应用于虚拟显示设备,所述虚拟显示设备与第一电子设备连接,所述第一电子设备与第二电子设备连接;所述第一电子设备用于显示第一屏幕内容,所述第二电子设备用于显示第二屏幕内容,所述第一电子设备接收所述第二电子设备发送的所述第二屏幕内容;所述方法包括:
    从所述第一电子设备处获取所述第一屏幕内容和所述第二屏幕内容;
    在所述虚拟显示设备显示界面的第一区域显示所述第一屏幕内容,在所述虚拟显示设备显示界面的第二区域显示所述第二屏幕内容。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    在检测到文件发送操作之后,控制所述第一电子设备向所述第二电子设备发送目标文件,所述文件发送操作为:指示所述虚拟显示设备将所述第一区域上显示的所述目标文件向所述第二区域发送的操作。
  3. 根据权利要求2所述的方法,其特征在于,所述文件发送操作包括:从所述第一区域向所述第二区域拖拽或者复制所述目标文件。
  4. 根据权利要求3所述的方法,其特征在于,检测到文件发送操作之后,控制所述第一电子设备向所述第二电子设备发送目标文件,包括:
    在所述第一区域内检测到第一控制事件,且在所述第二区域内检测到第二控制事件之后,控制所述第一电子设备向所述第二电子设备发送所述目标文件;
    其中,所述第一控制事件用于选中所述目标文件,所述第二控制事件用于释放选中的所述目标文件;或者,所述第一控制事件用于复制所述目标文件,所述第二控制事件用于粘贴所述目标文件。
  5. 根据权利要求2-4任一项所述的方法,其特征在于,在所述虚拟显示设备的文件快速发送开关处于开启状态,且所述目标文件支持被分享的情况下,控制所述第一电子设备向所述第二电子设备发送所述目标文件。
  6. 根据权利要求5所述的方法,其特征在于,所述方法还包括:
    在所述虚拟显示设备进入虚拟屏幕共享模式之后,自动开启所述虚拟显示设备的文件快速发送开关;
    或者,
    在所述第二区域内检测到第二控制事件之后,所述方法还包括:
    若所述虚拟显示设备的文件快速发送开关未开启,则显示第一提示信息,所述第一提示信息用于提示开启所述文件快速发送开关;
    响应于开启所述文件快速发送开关的操作,控制所述文件快速发送开关处于开启状态。
  7. 根据权利要求6所述的方法,其特征在于,在所述第二区域内检测到第二控制事件之后,所述方法还包括:
    若所述目标文件不支持发送,则显示第二提示信息,所述第二提示信息用于提示所述目标文件不支持发送。
  8. 根据权利要求2-7任一项所述的方法,其特征在于,所述控制所述第一电子设备向所述第二电子设备发送所述目标文件,包括:
    显示第三提示信息,所述第三提示信息用于询问用户是否发送所述目标文件;
    响应于接收到的指示发送所述目标文件的指令,控制所述第一电子设备向所述第二电子设备发送所述目标文件。
  9. 根据权利要求2-8任一项所述的方法,其特征在于,所述方法还包括:
    在第二区域显示第四提示信息,所述第四提示信息用于提示所述第二电子设备接收来自所述第一电子设备的所述目标文件。
  10. 根据权利要求2-9任一项所述的方法,其特征在于,所述方法还包括:
    当多个第二电子设备向所述第一电子设备发送第二屏幕内容时,所述虚拟显示设备在所述第一区域的周围确定多个第二区域,并在所述多个第二区域中分别显示各个第二电子设备的第二屏幕内容。
  11. 根据权利要求1-10任一项所述的方法,其特征在于,所述方法还包括:
    获取所述第二区域上的第一控制轨迹;
    根据所述第一控制轨迹在所述第二区域上显示第一轨迹线条;
    通过所述第一电子设备向所述第二电子设备发送所述第一轨迹线条的第一线条信息,所述第一线条信息用于所述第二电子设备在所述第二屏幕内容的基础上显示所述第一轨迹线条。
  12. 根据权利要求1-11任一项所述的方法,其特征在于,所述方法还包括:
    通过所述第一电子设备接收所述第二电子设备发送的第二线条信息,所述第二线条信息用于确定第二轨迹线条,所述第二轨迹线条是根据所述第二电子设备上的第二控制轨迹确定的;以及,
    根据所述第二线条信息,在所述第二区域上显示所述第二轨迹线条。
  13. 一种屏幕共享系统,其特征在于,包括:虚拟显示设备、第一电子设备和第二电子设备,所述虚拟显示设备与第一电子设备连接,所述第一电子设备与所述第二电子设备连接;所述第一电子设备用于显示第一屏幕内容,所述第二电子设备用于显示第二屏幕内容;
    所述第一电子设备被配置为:接收所述第二电子设备发送的所述第二屏幕内容;
    所述虚拟显示设备被配置为:
    从所述第一电子设备处获取所述第一屏幕内容和所述第二屏幕内容;
    在所述虚拟显示设备显示界面的第一区域显示所述第一屏幕内容,在所述虚拟显示设备显示界面的第二区域显示所述第二屏幕内容。
  14. 根据权利要求13所述的系统,其特征在于,所述虚拟显示设备还被配置为:
    在检测到文件发送操作之后,控制所述第一电子设备向所述第二电子设备发送目标文件,所述文件发送操作为:指示所述虚拟显示设备将所述第一区域上显示的所述目标文件向所述第二区域发送的操作。
  15. 根据权利要求14所述的系统,其特征在于,所述文件发送操作包括:从所述第一区域向所述第二区域拖拽或者复制所述目标文件。
  16. 根据权利要求15所述的系统,其特征在于,所述虚拟显示设备还被配置为:
    在所述第一区域内检测到第一控制事件,且在所述第二区域内检测到第二控制事件之后,控制所述第一电子设备向所述第二电子设备发送所述目标文件;
    其中，所述第一控制事件用于选中所述目标文件，所述第二控制事件用于释放选中的所述目标文件；或者，所述第一控制事件用于复制所述目标文件，所述第二控制事件用于粘贴所述目标文件。
  17. 根据权利要求14-16任一项所述的系统,其特征在于,所述虚拟显示设备还被配置为:
    在所述虚拟显示设备的文件快速发送开关处于开启状态,且所述目标文件支持被分享的情况下,控制所述第一电子设备向所述第二电子设备发送所述目标文件。
  18. 根据权利要求17所述的系统,其特征在于,所述虚拟显示设备还被配置为:
    在进入虚拟屏幕共享模式之后,自动开启所述文件快速发送开关;
    或者,在所述第二区域内检测到第二控制事件之后,所述虚拟显示设备还被配置为:
    若所述虚拟显示设备的文件快速发送开关未开启,则显示第一提示信息,所述第一提示信息用于提示开启所述文件快速发送开关;
    响应于开启所述文件快速发送开关的操作,控制所述文件快速发送开关处于开启状态。
  19. 根据权利要求17或18所述的系统,其特征在于,所述虚拟显示设备还被配置为:
    若所述目标文件不支持发送,则显示第二提示信息,所述第二提示信息用于提示所述目标文件不支持发送。
  20. 根据权利要求14-19任一项所述的系统,其特征在于,所述虚拟显示设备还被配置为:
    显示第三提示信息,所述第三提示信息用于询问用户是否发送所述目标文件;
    响应于接收到的指示发送所述目标文件的指令,控制所述第一电子设备向所述第二电子设备发送所述目标文件。
  21. 根据权利要求14-20任一项所述的系统,其特征在于,所述第二电子设备还被配置为:
    显示第四提示信息,所述第四提示信息用于提示所述第二电子设备接收来自所述第一电子设备的所述目标文件。
  22. 根据权利要求13-21任一项所述的系统,其特征在于,所述虚拟显示设备还被配置为:
    当多个所述第二电子设备向所述第一电子设备发送多个第二屏幕内容时,所述虚拟显示设备在所述第一区域的周围确定多个第二区域,并在所述多个第二区域中分别显示所述多个第二屏幕内容。
  23. 根据权利要求13-22任一项所述的系统,其特征在于,
    所述虚拟显示设备还被配置为:
    获取所述第二区域上的第一控制轨迹;
    根据所述第一控制轨迹在所述第二区域上显示第一轨迹线条;
    通过所述第一电子设备向所述第二电子设备发送所述第一轨迹线条的第一线条信息;
    所述第二电子设备还被配置为:
    根据所述第一线条信息在所述第二电子设备的屏幕上显示所述第一轨迹线条。
  24. 根据权利要求13-23任一项所述的系统,其特征在于,
    所述第二电子设备,还被配置为:
    获取所述第二电子设备屏幕上的第二控制轨迹;
    根据所述第二控制轨迹显示第二轨迹线条;
    通过所述第一电子设备向所述虚拟显示设备发送所述第二轨迹线条的第二线条信息;
    所述虚拟显示设备还被配置为:
    根据所述第二线条信息,在所述第二区域上显示所述第二轨迹线条。
  25. 一种虚拟显示设备，其特征在于，所述虚拟显示设备被配置为执行如权利要求1-12任一项所述的屏幕共享方法。
  26. 一种芯片系统,其特征在于,所述芯片系统包括处理器,所述处理器执行存储器中存储的计算机程序,以实现如权利要求1-12任一项所述的屏幕共享方法。
  27. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1-12任一项所述的屏幕共享方法。
PCT/CN2022/087103 2021-06-23 2022-04-15 一种屏幕共享方法、系统和虚拟显示设备 WO2022267644A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110701084.1 2021-06-23
CN202110701084.1A CN115509476A (zh) 2021-06-23 2021-06-23 一种屏幕共享方法、系统和虚拟显示设备

Publications (1)

Publication Number Publication Date
WO2022267644A1 true WO2022267644A1 (zh) 2022-12-29

Family

ID=84499119

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/087103 WO2022267644A1 (zh) 2021-06-23 2022-04-15 一种屏幕共享方法、系统和虚拟显示设备

Country Status (2)

Country Link
CN (1) CN115509476A (zh)
WO (1) WO2022267644A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117056869A (zh) * 2023-10-11 2023-11-14 轩创(广州)网络科技有限公司 一种基于人工智能的电子信息数据关联方法及系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110162283A (zh) * 2018-02-13 2019-08-23 北京三星通信技术研究有限公司 共享外接显示设备的方法、共享信息的方法及用户设备
CN111327769A (zh) * 2020-02-25 2020-06-23 北京小米移动软件有限公司 多屏互动方法及装置、存储介质
CN112527221A (zh) * 2019-09-18 2021-03-19 华为技术有限公司 一种数据传输的方法及相关设备
US20210099565A1 (en) * 2018-01-18 2021-04-01 Samsung Electronics Co., Ltd. Electronic device and method of operating electronic device in virtual reality
CN112947825A (zh) * 2021-01-28 2021-06-11 维沃移动通信有限公司 显示控制方法、装置、电子设备及介质
CN112988102A (zh) * 2021-05-11 2021-06-18 荣耀终端有限公司 投屏方法和装置


Also Published As

Publication number Publication date
CN115509476A (zh) 2022-12-23

