CN112416284B - Method, apparatus, device and storage medium for sharing screen - Google Patents


Info

Publication number
CN112416284B
CN112416284B (application CN202011432446.3A)
Authority
CN
China
Prior art keywords
screen
terminal
casting
face
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011432446.3A
Other languages
Chinese (zh)
Other versions
CN112416284A (en)
Inventor
孟宪宇
赵瑞
马聪
张松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center, Samsung Electronics Co Ltd filed Critical Samsung Electronics China R&D Center
Priority to CN202011432446.3A priority Critical patent/CN112416284B/en
Publication of CN112416284A publication Critical patent/CN112416284A/en
Application granted granted Critical
Publication of CN112416284B publication Critical patent/CN112416284B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements

Abstract

The application discloses a method, apparatus, device, and storage medium for sharing a screen, relating to the technical field of screen casting. The scheme is implemented as follows: in response to detecting a screen-casting start instruction triggered by a user, a screen-casting preparation message is sent to terminals meeting a preset condition; in response to receiving a screen-casting ready message sent by a target terminal, face recognition is performed on the face image carried in that message to determine a target face object, where the ready message is sent after the target terminal captures video through a connected image-capture device and determines that the gaze duration of a face object in the video exceeds a preset duration; and in response to determining that the target face object belongs to a preset face object set, the screen-cast content is sent to the target terminal to share the screen. This implementation simplifies user operations during screen casting and achieves fast screen casting.

Description

Method, apparatus, device and storage medium for sharing screen
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for sharing a screen.
Background
In the prior art, information can be shared quickly by screen casting. During screen casting, a user of the sending device transmits the content to be shared to a large-screen receiving device such as a projector, a liquid-crystal display, or a video wall, and the receiving device displays the content so that other users can view it together. Screen casting arises in application scenarios such as enterprise meetings, teaching demonstrations, and group presentations. At present, the sending device and the receiving device are mainly connected wirelessly across platforms based on the Wi-Fi Display (WFD) technology to achieve shared screen casting.
Existing screen-casting methods generally require the user to select the device to cast to from a list of multiple devices, which is cumbersome to operate.
Disclosure of Invention
A method, apparatus, device, and storage medium for sharing a screen are provided.
According to a first aspect, there is provided a method for sharing a screen, comprising: in response to detecting a screen-casting start instruction triggered by a user, sending a screen-casting preparation message to terminals meeting a preset condition; in response to receiving a screen-casting ready message sent by a target terminal, performing face recognition on the face image in the ready message to determine a target face object, wherein the ready message is sent after the target terminal captures video through a connected image-capture device and determines that the gaze duration of the face object in the video exceeds a preset duration; and in response to determining that the target face object belongs to a preset face object set, sending screen-cast content to the target terminal to share the screen.
According to a second aspect, there is provided a method for sharing a screen, comprising: in response to receiving a screen-casting preparation message sent by a screen-casting terminal, capturing video through a connected image-capture device, wherein the preparation message is sent by the screen-casting terminal after it detects a screen-casting start instruction triggered by a user; in response to determining that the gaze duration of a face object in the video exceeds a preset duration, generating a screen-casting ready message from the face image in the video and sending it to the screen-casting terminal; and receiving the screen-cast content sent by the screen-casting terminal to share the screen, wherein the content is sent after the screen-casting terminal performs face recognition on the face image in the ready message, determines a target face object, and determines that the target face object belongs to a preset face object set.
According to a third aspect, there is provided an apparatus for sharing a screen, comprising: a screen-casting preparation message sending unit configured to, in response to detecting a screen-casting start instruction triggered by a user, send a screen-casting preparation message to terminals meeting a preset condition; a face recognition unit configured to, in response to receiving a screen-casting ready message sent by a target terminal, perform face recognition on the face image in the ready message to determine a target face object, wherein the ready message is sent after the target terminal captures video through a connected image-capture device and determines that the gaze duration of the face object in the video exceeds a preset duration; and a screen-cast content sending unit configured to, in response to determining that the target face object belongs to a preset face object set, send screen-cast content to the target terminal to share the screen.
According to a fourth aspect, there is provided an apparatus for sharing a screen, comprising: a video capture unit configured to, in response to receiving a screen-casting preparation message sent by a screen-casting terminal, capture video through a connected image-capture device, wherein the preparation message is sent by the screen-casting terminal after it detects a screen-casting start instruction triggered by a user; a screen-casting ready message sending unit configured to, in response to determining that the gaze duration of a face object in the video exceeds a preset duration, generate a screen-casting ready message from the face image in the video and send it to the screen-casting terminal; and a screen-cast content receiving unit configured to receive the screen-cast content sent by the screen-casting terminal to share the screen, wherein the content is sent after the screen-casting terminal performs face recognition on the face image in the ready message, determines a target face object, and determines that the target face object belongs to a preset face object set.
According to a fifth aspect, there is provided an electronic device for sharing a screen, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described in the first or second aspect.
According to a sixth aspect, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method as described in the first or second aspect.
The screen-casting method and apparatus of the present application solve the technical problem that existing screen-casting methods are cumbersome to operate, simplify user operations during screen casting, and achieve fast screen casting.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. In the drawings:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for sharing a screen according to the present application;
FIG. 3 is a flow diagram of another embodiment of a method for sharing a screen according to the present application;
FIG. 4 is a flow diagram of one embodiment of a method for sharing a screen according to the present application;
FIG. 5 is a flow diagram of another embodiment of a method for sharing a screen according to the present application;
FIG. 6 is a schematic diagram of an application scenario for a method for sharing a screen according to the present application;
FIG. 7 is a schematic diagram illustrating the structure of one embodiment of an apparatus for sharing a screen according to the present application;
FIG. 8 is a schematic diagram of an embodiment of an apparatus for sharing a screen according to the present application;
FIG. 9 is a block diagram of an electronic device for implementing a method for sharing a screen according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of those embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the present method for sharing a screen or apparatus for sharing a screen may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. Network 104 is the medium used to provide communication links between terminal devices 101, 102, 103 and server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. Similarly, the server 105 may interact with the terminal devices 101, 102, and 103 via the network 104, for example, to transmit an instruction transmitted by one of the terminal devices to another terminal device. Various communication client applications, such as a sharing application, an audio/video playing application, etc., may be installed on the terminal devices 101, 102, and 103.
The terminal devices 101, 102, 103 may be hardware or software. When they are hardware, they may be various electronic devices, including but not limited to smartphones, tablet computers, e-book readers, in-car computers, laptop computers, desktop computers, and the like. When they are software, they may be installed in the electronic devices listed above and implemented either as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
The server 105 may be a server providing various services, such as a background server forwarding instructions sent by the terminal devices 101, 102, 103. The background server can receive the instruction of the screen projection device and send the instruction to the receiving device.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed cluster of multiple servers or as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
It is understood that in some specific cases (for example, the terminal devices are located in the same local area network or communication connections are established between the terminal devices), the system architecture may not include the server 105, and the terminal devices may interact directly through the network 104.
It should be noted that the method for sharing a screen provided in the embodiments of the present application may be executed by the terminal devices 101, 102, and 103. Accordingly, means for sharing a screen may be provided in the terminal devices 101, 102, 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for an implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for sharing a screen according to the present application is shown. The method may be applied to a screen-casting terminal, i.e., the terminal that casts content to another terminal. The method of this embodiment may specifically include the following steps:
Step 201, in response to detecting a screen-casting start instruction triggered by a user, sending a screen-casting preparation message to terminals meeting a preset condition.
In this embodiment, the user can operate the screen-casting terminal to start the screen-casting mode. Specifically, the user may issue a screen-casting start instruction to the terminal, for example by dragging a button on its screen. After receiving the instruction, the screen-casting terminal sends a screen-casting preparation message to every terminal meeting a preset condition. The preset condition may include, but is not limited to: being connected to the same wireless network, having previously established a connection, and so on. The screen-casting terminal may send the preparation message to the terminals by broadcast. The message informs the other terminals that screen casting is about to begin; any terminal that intends to receive the screen-cast content may respond by sending a screen-casting ready message back to the screen-casting terminal.
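As an illustrative sketch only (the patent specifies no code; the message format, field names, and eligibility rules below are assumptions), step 201 might select eligible terminals and build the preparation message as follows:

```python
import json

def eligible_terminals(devices, ssid):
    """Select terminals meeting the assumed preset condition: connected to
    the same wireless network, or previously paired with this terminal."""
    return [d["id"] for d in devices
            if d.get("ssid") == ssid or d.get("paired", False)]

def preparation_message(sender_id):
    """Serialize a screen-casting preparation message (format is hypothetical)."""
    return json.dumps({"type": "CAST_PREPARE", "sender": sender_id})

devices = [
    {"id": "tv-livingroom", "ssid": "office-5g"},
    {"id": "tablet-guest", "ssid": "guest"},
    {"id": "projector-a", "ssid": "guest", "paired": True},
]
targets = eligible_terminals(devices, "office-5g")
print(targets)  # ['tv-livingroom', 'projector-a']
```

In practice the serialized message would then be broadcast over the local network; the transport is omitted here.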
Step 202, in response to receiving a screen-casting ready message sent by a target terminal, performing face recognition on a face image in the screen-casting ready message to determine a target face object.
A terminal that intends to receive the screen-cast content may send a screen-casting ready message to the screen-casting terminal; such a terminal is referred to as a target terminal. Specifically, after receiving the preparation message, the target terminal may capture video through a connected image-capture device and analyze the gaze duration of the face object in the video. If the gaze duration exceeds a preset duration, the user watching the target terminal is considered to intend to receive the screen-cast content. The target terminal then packages the face image into a screen-casting ready message and sends it to the screen-casting terminal. After receiving the ready message, the screen-casting terminal performs face recognition on the face image it carries to determine the target face object.
Step 203, in response to determining that the target face object belongs to the preset face object set, sending screen-casting content to the target terminal to share the screen.
After determining the target face object, the screen-casting terminal checks whether it belongs to a preset face object set. The set may contain multiple face objects, each authorized by the user of the screen-casting terminal. If the screen-casting terminal determines that the target face object belongs to the set, i.e., it is an authorized face object, it sends the screen-cast content to the target terminal. After receiving the content, the target terminal displays it on its screen to achieve screen sharing. It can be understood that the target terminal may display the screen-cast content either full-screen or in split-screen form.
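The recognition-and-authorization check of steps 202-203 might be sketched as follows. This is illustrative only: the embeddings and the cosine-similarity threshold are assumptions standing in for whatever face recognition algorithm an implementation actually uses.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recognize(probe, enrolled, threshold=0.9):
    """Return the enrolled identity best matching the probe embedding,
    or None if no similarity clears the threshold."""
    best_id, best_sim = None, threshold
    for user_id, emb in enrolled.items():
        sim = cosine(probe, emb)
        if sim >= best_sim:
            best_id, best_sim = user_id, sim
    return best_id

# Hypothetical preset face object set, keyed by user, valued by embedding.
enrolled = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.9, 0.3]}
probe = [0.88, 0.12, 0.21]          # embedding of the face image in the ready message
target = recognize(probe, enrolled)
authorized = target in enrolled      # membership in the preset face object set
print(target, authorized)            # alice True
```

Only when `authorized` is true would the screen-cast content be sent to the target terminal.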
The method for sharing a screen provided by this embodiment, applied to the screen-casting terminal, analyzes the target face object in the video captured by the target terminal after the user triggers the screen-casting start instruction, and sends the screen-cast content to the target terminal only when the target face object belongs to the preset face object set. The method simplifies the screen-casting operation and, by ensuring that the face object receiving the content is in the preset set, improves the security of screen sharing.
With continued reference to FIG. 3, a flow 300 of another embodiment of a method for sharing a screen according to the present application is shown. As shown in fig. 3, the method for sharing a screen of the present embodiment may include the steps of:
Step 301, in response to detecting a screen-casting start instruction triggered by a user, sending a screen-casting preparation message to terminals meeting a preset condition.
In some optional implementations of this embodiment, the screen-casting terminal may send the screen-casting preparation message by broadcast and, during the broadcast, discover the terminals that meet the preset condition. The screen-casting terminal may also add the identifier of each such terminal (for example, its physical address) to a maintained device list.
In this implementation, once the device list has been updated, the screen-casting terminal can message each terminal in the list directly. Likewise, the next time the user triggers a screen-casting instruction, the preparation message can be sent directly to every terminal in the list.
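The maintained device list can be sketched as a small registry; the class and method names are illustrative, not part of the patent:

```python
class DeviceRegistry:
    """Maintained device list: identifiers (e.g. physical/MAC addresses)
    of terminals that have met the preset condition."""
    def __init__(self):
        self._ids = []

    def add(self, device_id):
        if device_id not in self._ids:  # keep the list duplicate-free
            self._ids.append(device_id)

    def targets(self):
        """Terminals to message directly on the next cast request."""
        return list(self._ids)

reg = DeviceRegistry()
for dev in ["aa:bb:cc:01", "aa:bb:cc:02", "aa:bb:cc:01"]:
    reg.add(dev)
print(reg.targets())  # ['aa:bb:cc:01', 'aa:bb:cc:02']
```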
Step 302, acquiring a preset shared user information list; and carrying out face recognition on the face images in the shared user information list to obtain a face object set.
In this embodiment, the screen-casting terminal may further obtain a preset shared user information list. The list contains at least one piece of shared user information, which may include, but is not limited to, a face image and a user identification (for example, a nickname, or a label the screen-casting terminal's user assigns to the shared user). The screen-casting terminal performs face recognition on the face images in the list to obtain a face object set; an existing face recognition algorithm may be used for this.
In some specific applications, the face recognition step may be performed by another electronic device, which then transmits the resulting face object set to the screen-casting terminal.
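Building the face object set from the shared user information list might look like the following sketch. The `embed()` function here is a toy stand-in, not real face recognition; a real implementation would call an actual face-embedding model.

```python
def embed(face_image_bytes):
    """Toy stand-in for a face-embedding model (NOT real recognition):
    derives a small deterministic vector from the image bytes."""
    h = sum(face_image_bytes)
    return [h % 7, h % 11, h % 13]

def build_face_object_set(shared_users):
    """Map each shared user's face image to a face object (embedding)."""
    return {u["user_id"]: embed(u["face_image"]) for u in shared_users}

# Hypothetical shared user information list.
shared_list = [
    {"user_id": "alice", "face_image": b"\x01\x02"},
    {"user_id": "bob",   "face_image": b"\x03\x04"},
]
face_set = build_face_object_set(shared_list)
print(sorted(face_set))  # ['alice', 'bob']
```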
In some optional implementations of this embodiment, the method may further include the following steps not shown in fig. 3: receiving modification information of a user on a shared user information list; and updating the shared user information list according to the modification information.
In this implementation, the user may modify the shared user information list through the screen-casting terminal. Modifications may include adding, deleting, or editing shared user information. After receiving the modification information, the screen-casting terminal updates the list accordingly. In this way, the user can manage the shared users, improving the security of the shared content.
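The three modification operations can be sketched as one update function; the modification-record format is an assumption for illustration:

```python
def apply_modification(user_list, mod):
    """Apply one modification record to the shared user information list."""
    op = mod["op"]
    if op == "add":
        user_list.append(mod["entry"])
    elif op == "delete":
        user_list[:] = [u for u in user_list if u["user_id"] != mod["user_id"]]
    elif op == "modify":
        for u in user_list:
            if u["user_id"] == mod["entry"]["user_id"]:
                u.update(mod["entry"])
    return user_list

users = [{"user_id": "alice", "label": "manager"}]
apply_modification(users, {"op": "add", "entry": {"user_id": "bob", "label": "guest"}})
apply_modification(users, {"op": "modify", "entry": {"user_id": "alice", "label": "director"}})
apply_modification(users, {"op": "delete", "user_id": "bob"})
print(users)  # [{'user_id': 'alice', 'label': 'director'}]
```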
Step 303, in response to receiving the screen-casting ready message sent by the target terminal, performing face recognition on the face image in the screen-casting ready message to determine a target face object.
The screen-casting ready message is sent after the target terminal captures video through the connected image-capture device and determines that the gaze duration of the face object in the video exceeds the preset duration.
Step 304, determining the screen-cast content according to a content determination instruction from the user.
In this embodiment, the user may select the screen-cast content through the screen-casting terminal; the user's operation of the terminal generates a content determination instruction, from which the screen-casting terminal determines the screen-cast content. Here, the screen-cast content is the content to be displayed on the target terminal; it may be what the screen-casting terminal is currently displaying, or images, video, music, or other content selected by the user.
In step 305, in response to determining that the target face object belongs to the preset face object set, screen-casting content is sent to the target terminal to share the screen.
It can be understood that, after determining that the target face object belongs to the preset face object set, the screen-casting terminal needs to maintain its communication connection with the target terminal so that the screen-cast content can be sent in real time. In some specific applications, the screen-casting terminal may check whether the connection is still available by sending heartbeat packets to the target terminal.
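The heartbeat-based liveness check might be sketched as follows; the timeout value and class interface are assumptions, and the actual packet transport is omitted:

```python
class HeartbeatMonitor:
    """Track the last heartbeat acknowledgement from the target terminal
    and decide whether the connection is still usable."""
    def __init__(self, timeout_s=5.0):
        self.timeout_s = timeout_s
        self.last_ack = None

    def on_ack(self, now):
        """Record the timestamp of a heartbeat acknowledgement."""
        self.last_ack = now

    def alive(self, now):
        """Connection is alive if an ack arrived within the timeout window."""
        return self.last_ack is not None and (now - self.last_ack) <= self.timeout_s

mon = HeartbeatMonitor(timeout_s=5.0)
mon.on_ack(now=100.0)
print(mon.alive(now=103.0))  # True
print(mon.alive(now=106.5))  # False
```

Timestamps are passed in explicitly so the logic is testable; a real implementation would use a monotonic clock.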
Step 306, receiving a user control instruction for the screen-cast content, and sending the control instruction to the target terminal to control the content.
In this embodiment, the user may also control the screen-cast content through the screen-casting terminal, i.e., generate a control instruction for the content. Control operations may include fast-forward, rewind, playback at a different speed, and so on. The screen-casting terminal not only applies the control instruction to its own playback but also forwards it to the target terminal, which applies the same control to the screen-cast content after receiving it.
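Interpreting such a control instruction on the playback state might look like this sketch; the command names and state fields are illustrative assumptions:

```python
def apply_control(state, instruction):
    """Apply one playback control instruction to a player state dict.
    The same function would run on both the casting and target terminals."""
    cmd = instruction["cmd"]
    if cmd == "fast_forward":
        state["position_s"] += instruction.get("seconds", 10)
    elif cmd == "rewind":
        state["position_s"] = max(0, state["position_s"] - instruction.get("seconds", 10))
    elif cmd == "set_speed":
        state["speed"] = instruction["value"]
    return state

state = {"position_s": 30, "speed": 1.0}
apply_control(state, {"cmd": "fast_forward", "seconds": 15})
apply_control(state, {"cmd": "set_speed", "value": 2.0})
print(state)  # {'position_s': 45, 'speed': 2.0}
```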
Step 307, in response to receiving a screen-casting end message sent by the target terminal, stopping sharing the screen with the target terminal.
In this embodiment, if the image-capture device connected to the target terminal no longer captures a face image, the user watching the target terminal is considered to have stopped viewing the screen-cast content, and the target terminal may send a screen-casting end message to the screen-casting terminal. Upon receiving this message, the screen-casting terminal stops sharing the screen with the target terminal.
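On the target-terminal side, the decision to send the end message might be sketched as a per-frame absence counter; the frame-count threshold is an assumption:

```python
def should_end_cast(face_seen_flags, max_absent_frames=30):
    """End screen casting once the camera has failed to detect a face
    for max_absent_frames consecutive frames."""
    absent = 0
    for seen in face_seen_flags:
        absent = 0 if seen else absent + 1
        if absent >= max_absent_frames:
            return True
    return False

print(should_end_cast([True] * 5 + [False] * 30))  # True
print(should_end_cast([True, False] * 20))         # False
```

Requiring consecutive absent frames avoids ending the cast on a single missed detection.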
The method for sharing a screen provided by this embodiment allows the user to control the screen-cast content through the screen-casting terminal, and thereby the content displayed on the target terminal; the user can also update the shared user information list to improve the security of the shared content; and screen sharing can be ended when a screen-casting end message is received, simplifying the operation and improving security.
With continued reference to FIG. 4, a flow 400 of yet another embodiment of a method for sharing a screen according to the present application is shown. The method for sharing a screen of the present embodiment may be applied to a target terminal, i.e., to a terminal that receives screen-cast content. As shown in fig. 4, the method of the present embodiment may include the following steps:
step 401, in response to receiving a screen-casting preparation message sent by a screen-casting terminal, capturing a video through a connected image capturing device.
In this embodiment, after receiving a screen-casting preparation message sent by a screen-casting terminal, a target terminal may collect a video through a connected image acquisition device. Specifically, the image capturing device may be disposed in front of the target terminal to capture a video in front of a display of the target terminal.
Step 402, in response to the fact that the watching duration of the face object in the video exceeds a preset duration, generating a screen-casting ready message according to the face image in the video, and sending the screen-casting ready message to a screen-casting terminal.
The target terminal analyzes the captured video. Specifically, it may locate the face object in the video, then analyze the eyes of the face object to determine its gaze duration. If the gaze duration exceeds the preset duration, the face object is considered to intend to watch the screen-cast content. In some specific applications, the target terminal may further analyze the gaze direction of the face object; if it is directed at the target terminal, the face object is considered to intend to watch the screen-cast content.
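The gaze-duration analysis might be sketched as follows. The per-frame "looking" flags would come from a real gaze estimator; the frame rate and threshold here are assumptions:

```python
def gaze_duration(frames, fps=30.0):
    """Longest continuous run, in seconds, during which the face object
    is looking at the terminal (one boolean flag per video frame)."""
    best = run = 0
    for looking in frames:
        run = run + 1 if looking else 0
        best = max(best, run)
    return best / fps

frames = [True] * 90 + [False] * 10 + [True] * 30
print(gaze_duration(frames))        # 3.0
print(gaze_duration(frames) > 2.0)  # True -> generate the ready message
```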
The target terminal can package the face image in the screen-casting ready message and send the generated screen-casting ready message to the screen-casting terminal. After receiving the screen-casting ready message, the screen-casting terminal can analyze the face image therein to determine the target face object. And if the target face object belongs to the preset face object set, screen projection content can be sent to the target terminal.
Step 403, receiving the screen-cast content sent by the screen-casting terminal to share the screen.
And after receiving the screen projection content sent by the screen projection terminal, the target terminal can display the screen projection content to realize screen sharing.
The method for sharing a screen provided by this embodiment of the application responds to the screen-casting terminal and realizes screen sharing only when it is confirmed that the user wants to watch the screen-casting content, thereby improving the user's viewing experience.
With continued reference to FIG. 5, a flow 500 of another embodiment of a method for sharing a screen according to the present application is shown. As shown in fig. 5, the method of the present embodiment may include the following steps:
step 501, responding to a received screen-casting preparation message sent by a screen-casting terminal, and outputting a preset prompt message; and in response to the fact that the preset prompt information output time is longer than the preset time, acquiring the video through the connected image acquisition device.
In this embodiment, if the target terminal receives the screen-casting preparation message sent by the screen-casting terminal, it may first output a preset prompt message. The preset prompt message is used to prompt the user that, to watch the screen-casting content, the user needs to look at the target terminal. The preset prompt message may be presented in various forms so that a user viewing the target terminal notices it; for example, it may be displayed in a pop-up window or played as a voice announcement. The target terminal may also time the output of the preset prompt message; if the output time is longer than the preset time, it determines that the user has received the preset prompt message, and may then collect the video through the connected image acquisition device.
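The prompt-then-capture timing of step 501 can be sketched as below. `show_prompt` and `capture_video` are illustrative stand-ins for the terminal's UI and camera APIs, which the patent does not name.

```python
import time

def capture_after_prompt(show_prompt, capture_video, preset_seconds=2.0):
    """Step 501 sketch: output the preset prompt, keep it on screen
    until its output time exceeds the preset time, then start video
    capture. The callables and the 2-second default are assumptions."""
    show_prompt("Look at this screen to watch the cast content")
    start = time.monotonic()
    # Time the prompt's output duration, as the embodiment describes.
    while time.monotonic() - start <= preset_seconds:
        time.sleep(0.005)
    return capture_video()
```

In practice the prompt display and the timer would run on the terminal's UI event loop rather than a blocking wait; the busy-wait here only makes the ordering explicit.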
Step 502, in response to determining that the gazing duration of the face object in the video exceeds the preset duration, generating a screen-casting ready message according to the face image in the video, and sending the screen-casting ready message to the screen-casting terminal.
Step 503, receiving the screen-casting content sent by the screen-casting terminal to share the screen.
And the screen projection content is sent after the screen projection terminal carries out face recognition on the face image in the screen projection ready message, determines a target face object and determines that the target face object belongs to a preset face object set.
Step 504, in response to receiving a control instruction for the screen-casting content sent by the screen-casting terminal, controlling the screen-casting content according to the control instruction.
In this embodiment, the target terminal may further receive a control instruction for the screen-projecting content sent by the screen-projecting terminal, and analyze the control instruction, so as to control the screen-projecting content.
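Parsing and applying the control instruction in step 504 can be sketched as follows. The instruction vocabulary (pause/resume/seek) and the player interface are assumptions for illustration; the patent only states that the content is controlled according to the instruction.

```python
class RecordingPlayer:
    """Minimal stand-in for the target terminal's local playback component."""
    def __init__(self):
        self.events = []
    def pause(self):
        self.events.append("pause")
    def resume(self):
        self.events.append("resume")
    def seek(self, position):
        self.events.append(("seek", position))

def apply_control(instruction, player):
    """Step 504 sketch: dispatch a control instruction from the
    casting terminal onto the target terminal's player."""
    actions = {
        "pause": player.pause,
        "resume": player.resume,
        "seek": lambda: player.seek(instruction.get("position", 0)),
    }
    action = actions.get(instruction["op"])
    if action is None:
        raise ValueError("unknown control op: %r" % instruction["op"])
    action()
```

A table-driven dispatch like this keeps the target terminal's parsing logic independent of whatever transport carries the instruction.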
Step 505, in response to determining that the connected image acquisition device does not acquire a face image, sending a screen-casting end message to the screen-casting terminal to stop sharing the screen.
In this embodiment, the target terminal may collect video in real time through the connected image acquisition device and analyze it in real time. If the target terminal finds that the collected video no longer includes a face image, it determines that the user is no longer watching the screen-casting content, or that the user is not watching it through this target terminal and wants to watch it through another terminal. At this time, the target terminal may send a screen-casting end message to the screen-casting terminal, informing it that the user is no longer watching; the screen-casting terminal may then stop sharing the screen.
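The monitoring loop of step 505 can be sketched as below. The per-frame face-presence flags and the message format are assumptions; `send_end_message` stands in for the network send, which the patent leaves unspecified.

```python
def monitor_for_absence(face_present_frames, send_end_message):
    """Step 505 sketch: analyze the video in real time; once the
    capture device no longer acquires a face image, send a
    screen-casting end message so the casting terminal stops sharing.
    `face_present_frames` is an iterable of per-frame booleans."""
    for has_face in face_present_frames:
        if not has_face:
            send_end_message({"type": "cast_end"})
            return True    # sharing stopped
    return False           # faces seen throughout; keep sharing
```

A production version would debounce over several frames before ending the cast, so a single missed detection does not interrupt viewing.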
The method for sharing the screen provided by the embodiment of the application can receive the control instruction of the screen projection terminal to control the screen projection content, so that the screen projection convenience is improved; and a screen projection end message can be sent to the screen projection terminal when the face is not acquired, so that the safety of the screen projection content is improved.
With continued reference to fig. 6, a schematic diagram of one application scenario of a method for sharing a screen according to the present application is shown. In the application scenario of fig. 6, the user a triggers a screen-projection start instruction to the screen-projection terminal 601, and the screen-projection terminal 601 broadcasts a screen-projection preparation message to the target terminals 602 and 603 accessing the same wireless network. Among them, the target terminal 602 is installed in room 1, and the target terminal 603 is installed in room 2.
The user B watches the camera installed on the target terminal 602 for 3 seconds in the room 1, and the camera on the target terminal 602 acquires a face image of the user B and then sends the face image to the screen projection terminal 601. The screen-casting terminal 601 determines that the user B is located in the pre-stored shared user information list, and then sends the currently displayed content to the target terminal 602, and the user B can view the screen-casting content through the target terminal 602 in the room 1. The screen sharing mode does not need the user A and the user B to perform complicated operation, and the user experience is improved.
After watching the target terminal 602 for a certain period of time, the user B moves to the room 2 and looks at the camera installed on the target terminal 603 for 3 seconds; the camera on the target terminal 603 acquires the face image of the user B and sends it to the screen-casting terminal 601. The screen-casting terminal 601 transmits the currently displayed content to the target terminal 603, and the user B can view the screen-casting content through the target terminal 603 in room 2. Meanwhile, the camera installed on the target terminal 602 no longer acquires the face image of the user B, so the target terminal 602 sends a screen-casting end message to the screen-casting terminal 601, and the screen-casting terminal 601 stops sending screen-casting content to the target terminal 602, thereby stopping screen sharing. By this screen sharing mode, not only can one-to-many screen casting be realized, but a largely seamless handover of the screen-casting content is also achieved.
It is understood that if the user C in the room 2 looks at the camera on the target terminal 603 for 3 seconds while the user B is in the room 1 looking at the target terminal 602, and the user C is also in the pre-stored shared user information list, the screen projection terminal 601 may also transmit the currently displayed content to the target terminal 603 at the same time. In this way, user C can simultaneously view the screen-cast content through the target terminal 603 in room 2.
It will also be appreciated that if the user D enters room 1 when the user B leaves room 1, and the user D is not in the pre-stored shared user information list, the screen-casting terminal 601 may also stop transmitting the currently displayed content to the target terminal 602. That is, the user D cannot view the screen-casting content alone in room 1.
With further reference to fig. 7, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an apparatus for sharing a screen, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable to various electronic devices.
As shown in fig. 7, the device 700 for sharing a screen of the present embodiment includes: a screen-casting preparation message transmitting unit 701, a face recognition unit 702, and a screen-casting content transmitting unit 703.
A screen-casting preparation message sending unit 701 configured to send a screen-casting preparation message to a terminal satisfying a preset condition in response to detecting a screen-casting start instruction triggered by a user.
The face recognition unit 702 is configured to, in response to receiving a screen-on ready message sent by a target terminal, perform face recognition on a face image in the screen-on ready message, and determine a target face object. The screen-casting ready message is sent after the target terminal collects a video through a connected image collecting device and determines that the watching time of the face object in the video exceeds a preset time.
A screen-shot content sending unit 703 configured to send screen-shot content to the target terminal to share a screen in response to determining that the target face object belongs to a preset face object set.
In some optional implementations of this embodiment, the apparatus 700 may further include a screen projection content determining unit, not shown in fig. 7, configured to: and determining the screen projection content according to the screen projection content determination instruction of the user.
In some optional implementations of this embodiment, the apparatus 700 may further include a first screen-projection content control unit, not shown in fig. 7, configured to: receiving a control instruction of a user to the screen projection content; and sending the control instruction to the target terminal for controlling the screen projection content.
In some optional implementations of this embodiment, the apparatus 700 may further include a screen-projection end control unit, not shown in fig. 7, configured to: and in response to receiving a screen projection ending message sent by the target terminal, stopping sharing a screen to the target terminal, wherein the screen projection ending message is sent when an image acquisition device connected with the target terminal does not acquire a face image.
In some optional implementations of this embodiment, the apparatus 700 may further include a shared user information maintenance unit, not shown in fig. 7, configured to obtain the set of face objects by: acquiring a preset shared user information list, wherein the shared user information comprises a face image; and carrying out face recognition on the face images in the shared user information list to obtain the face object set.
In some optional implementations of this embodiment, the shared user information maintenance unit is further configured to: receiving modification information of the shared user information list from a user; and updating the shared user information list according to the modification information.
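The two behaviors of the shared user information maintenance unit — deriving the face object set from the stored list, and updating the list per a user's modification — can be sketched as follows. Field names, the modification format, and the `recognize` callable are illustrative assumptions.

```python
def build_face_object_set(shared_user_list, recognize):
    """Derive the preset face object set by running face recognition
    over each stored face image. `recognize` is a hypothetical
    recognizer mapping image bytes to a face-object identifier."""
    return {recognize(user["face_image"]) for user in shared_user_list}

def update_shared_user_list(shared_user_list, modification):
    """Apply a user's modification ('add' or 'remove') to the shared
    user information list, as in the unit's second behavior."""
    if modification["op"] == "add":
        return shared_user_list + [modification["user"]]
    return [u for u in shared_user_list
            if u["name"] != modification["user"]["name"]]
```

Rebuilding the face object set after each list update keeps the casting decision in step 203 consistent with the current list.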
It should be understood that the units 701 to 703 recited in the apparatus 700 for sharing a screen correspond to respective steps in the method described with reference to fig. 2, respectively. Thus, the operations and features described above for the method for sharing a screen are also applicable to the apparatus 700 and the units included therein, and are not described herein again.
With further reference to fig. 8, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for sharing a screen, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable to various electronic devices.
As shown in fig. 8, the device 800 for sharing a screen of the present embodiment includes: a video capture unit 801, a screen-shot ready message sending unit 802, and a screen-shot content receiving unit 803.
A video capture unit 801 configured to capture a video by a connected image capture device in response to receiving a screen-projection preparation message sent by a screen-projection terminal. The screen projection preparation message is sent by the screen projection terminal after detecting a screen projection starting instruction triggered by a user.
A screen-ready message sending unit 802, configured to generate a screen-ready message according to a face image in the video and send the screen-ready message to the screen-throwing terminal in response to determining that the gazing duration of the face object in the video exceeds a preset duration.
A screen-casting content receiving unit 803 configured to receive the screen-casting content sent by the screen-casting terminal to share the screen. The screen-casting content is sent after the screen-casting terminal performs face recognition on the face image in the screen-casting ready message, determines a target face object, and determines that the target face object belongs to a preset face object set.
In some optional implementations of this embodiment, the video capture unit 801 may be further configured to: responding to a received screen-casting preparation message sent by a screen-casting terminal, and outputting preset prompt information; and in response to the fact that the preset prompt information output time is longer than the preset time, acquiring a video through the connected image acquisition device.
In some optional implementations of this embodiment, the apparatus 800 may further include a second screen-projection content control unit, not shown in fig. 8, configured to: and responding to a received control instruction of the screen projecting content sent by the screen projecting terminal, and controlling the screen projecting content according to the control instruction.
In some optional implementations of this embodiment, the apparatus 800 may further include a second screen-projection-end message sending unit, not shown in fig. 8, configured to: and responding to the fact that the connected image acquisition device does not acquire the face image, and sending a screen projection ending message to the screen projection terminal for stopping the screen sharing.
It should be understood that units 801 to 803 recited in the apparatus 800 for sharing a screen correspond to respective steps in the method described with reference to fig. 4, respectively. Thus, the operations and features described above for the method for sharing a screen are equally applicable to the apparatus 800 and the units included therein, and are not described in detail here.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 9 is a block diagram of an electronic device that performs a method for sharing a screen according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the applications described and/or claimed herein.
As shown in fig. 9, the electronic apparatus includes: one or more processors 901, memory 902, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing some of the necessary operations (e.g., as an array of servers, a group of blade servers, or a multi-processor system). Fig. 9 illustrates an example of a processor 901.
Memory 902 is a non-transitory computer readable storage medium as provided herein. Wherein the memory stores instructions executable by at least one processor to cause the at least one processor to perform the method for sharing a screen provided herein. A non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the method for sharing a screen provided by the present application.
The memory 902, which is a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules corresponding to the execution of the method for sharing a screen in the embodiments of the present application (e.g., the screen-casting preparation message sending unit 701, the face recognition unit 702, and the screen-casting content sending unit 703 shown in fig. 7, or the video capture unit 801, the screen-casting ready message sending unit 802, and the screen-casting content receiving unit 803 shown in fig. 8). The processor 901 executes various functional applications of the server and data processing, i.e., implements the method for sharing a screen performed in the above-described method embodiments, by executing non-transitory software programs, instructions, and modules stored in the memory 902.
The memory 902 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the electronic device performed for sharing the screen, and the like. Further, the memory 902 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 902 may optionally include memory remotely located from the processor 901, which may be connected over a network to an electronic device executing instructions for sharing screens. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device performing the method for sharing a screen may further include: an input device 903 and an output device 904. The processor 901, the memory 902, the input device 903 and the output device 904 may be connected by a bus or other means, and fig. 9 illustrates the connection by a bus as an example.
The input device 903 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus for sharing a screen; input devices include a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, and the like. The output devices 904 may include a display device, auxiliary lighting devices (e.g., LEDs), and tactile feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system that addresses the drawbacks of difficult management and weak service scalability found in traditional physical hosts and VPS ("Virtual Private Server") services.
According to the technical scheme of the embodiment of the application, the operation of a user in the screen projection process is simplified, and the rapid screen projection is realized.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (22)

1. A method for sharing a screen, comprising:
sending a screen-throwing preparation message to a terminal meeting a preset condition in response to detecting a screen-throwing starting instruction triggered by a user;
responding to a received screen-throwing ready message sent by a target terminal, carrying out face recognition on a face image in the screen-throwing ready message, and determining a target face object, wherein the screen-throwing ready message is sent after the target terminal acquires a video through a connected image acquisition device and determines that the watching time of the face object in the video exceeds a preset time;
and in response to the fact that the target face object belongs to a preset face object set, screen projection content is sent to the target terminal to share a screen.
2. The method of claim 1, wherein the method further comprises:
and determining the screen projection content according to the screen projection content determination instruction of the user.
3. The method of claim 1, wherein the method further comprises:
receiving a control instruction of a user to the screen projection content;
and sending the control instruction to the target terminal for controlling the screen projection content.
4. The method of claim 1, wherein the method further comprises:
and in response to receiving a screen projection ending message sent by the target terminal, stopping sharing a screen to the target terminal, wherein the screen projection ending message is sent when an image acquisition device connected with the target terminal does not acquire a face image.
5. The method of claim 1, wherein the set of face objects is obtained by:
acquiring a preset shared user information list, wherein the shared user information comprises a face image;
and carrying out face recognition on the face images in the shared user information list to obtain the face object set.
6. The method of claim 5, wherein the method further comprises:
receiving modification information of the shared user information list from a user;
and updating the shared user information list according to the modification information.
7. A method for sharing a screen, comprising:
in response to receiving a screen projection preparation message sent by a screen projection terminal, acquiring a video through a connected image acquisition device, wherein the screen projection preparation message is sent by the screen projection terminal after detecting a screen projection starting instruction triggered by a user;
in response to the fact that the watching duration of the face object in the video exceeds the preset duration, generating a screen-casting ready message according to the face image in the video and sending the screen-casting ready message to the screen-casting terminal;
and receiving a screen-casting content sharing screen sent by the screen-casting terminal, wherein the screen-casting content is sent after the screen-casting terminal carries out face recognition on a face image in the screen-casting ready message, determines a target face object and determines that the target face object belongs to a preset face object set.
8. The method according to claim 7, wherein the capturing of the video by the connected image capturing device in response to receiving the screen-shot preparation message sent by the screen-shot terminal comprises:
responding to a received screen-casting preparation message sent by a screen-casting terminal, and outputting preset prompt information;
and in response to the fact that the preset prompt information output time is longer than the preset time, acquiring a video through the connected image acquisition device.
9. The method of claim 7, wherein the method further comprises:
and responding to a received control instruction of the screen projecting content sent by the screen projecting terminal, and controlling the screen projecting content according to the control instruction.
10. The method of claim 7, wherein the method further comprises:
and responding to the fact that the connected image acquisition device does not acquire the face image, and sending a screen projection ending message to the screen projection terminal for stopping the screen sharing.
11. An apparatus for sharing a screen, comprising:
the screen-throwing preparation message sending unit is configured to respond to a screen-throwing starting instruction triggered by a user and send a screen-throwing preparation message to the terminal meeting a preset condition;
the face recognition unit is configured to respond to a received screen-throwing ready message sent by a target terminal, perform face recognition on a face image in the screen-throwing ready message, and determine a target face object, wherein the screen-throwing ready message is sent after the target terminal collects a video through a connected image collection device and determines that the watching duration of the face object in the video exceeds a preset duration;
a screen-shot content transmitting unit configured to transmit screen-shot content to the target terminal to share a screen in response to determining that the target face object belongs to a preset face object set.
12. The apparatus of claim 11, wherein the apparatus further comprises a screen-casting content determination unit configured to:
and determining the screen projection content according to the screen projection content determination instruction of the user.
13. The apparatus of claim 11, wherein the apparatus further comprises a first screen-casting content control unit configured to:
receiving a control instruction of a user to the screen projection content;
and sending the control instruction to the target terminal for controlling the screen projection content.
14. The apparatus of claim 11, wherein the apparatus further comprises an end-of-screen-projection control unit configured to:
and in response to receiving a screen projection ending message sent by the target terminal, stopping sharing a screen to the target terminal, wherein the screen projection ending message is sent when an image acquisition device connected with the target terminal does not acquire a face image.
15. The apparatus of claim 11, wherein the apparatus further comprises a shared user information maintenance unit configured to derive the set of face objects by:
acquiring a preset shared user information list, wherein the shared user information comprises a face image;
and carrying out face recognition on the face images in the shared user information list to obtain the face object set.
16. The apparatus of claim 15, wherein the shared user information maintenance unit is further configured to:
receiving modification information of the shared user information list from a user;
and updating the shared user information list according to the modification information.
17. An apparatus for sharing a screen, comprising:
the video acquisition unit is configured to respond to the fact that a screen projection preparation message sent by a screen projection terminal is received, and acquire a video through a connected image acquisition device, wherein the screen projection preparation message is sent by the screen projection terminal after a screen projection starting instruction triggered by a user is detected;
the screen-casting ready message sending unit is configured to respond to the fact that the watching duration of a human face object in the video exceeds a preset duration, generate a screen-casting ready message according to a human face image in the video and send the screen-casting ready message to the screen-casting terminal;
and the screen-casting content receiving unit is configured to receive a screen-casting content sharing screen sent by the screen-casting terminal, wherein the screen-casting content is sent after the screen-casting terminal performs face recognition on the face image in the screen-casting ready message, determines a target face object and determines that the target face object belongs to a preset face object set.
18. The apparatus of claim 17, wherein the video capture unit is further configured to:
in response to receiving the screen-casting preparation message sent by the screen-casting terminal, output preset prompt information;
and in response to the preset prompt information having been output for longer than a preset duration, acquire a video through the connected image acquisition device.
19. The apparatus of claim 17, wherein the apparatus further comprises a second screen-casting content control unit configured to:
in response to receiving a control instruction for the screen-casting content sent by the screen-casting terminal, control the screen-casting content according to the control instruction.
20. The apparatus of claim 17, wherein the apparatus further comprises a second screen-casting end message sending unit configured to:
in response to the connected image acquisition device no longer capturing a face image, send a screen-casting end message to the screen-casting terminal to stop the screen sharing.
21. An electronic device for sharing a screen, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-6 or to perform the method of any one of claims 7-10.
22. A non-transitory computer-readable storage medium having computer instructions stored thereon, wherein the computer instructions cause a computer to perform the method of any one of claims 1-6 or the method of any one of claims 7-10.
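As a reading aid (not part of the patent text), the message flow claimed in claims 17-20 — preparation message, gaze-duration check, ready message carrying a face image, and recognition-gated content delivery — can be sketched as follows. All class, method, and variable names are illustrative assumptions; the claims do not prescribe any particular implementation.

```python
# Hypothetical sketch of the claimed screen-casting handshake.
PRESET_GAZE_SECONDS = 3.0  # preset watching-duration threshold


class CastingTerminal:
    """Device that shares its screen (the screen-casting terminal)."""

    def __init__(self, allowed_faces):
        self.allowed_faces = set(allowed_faces)  # preset face object set

    def start_casting(self, target):
        # Send the screen-casting preparation message after a
        # user-triggered start instruction.
        target.on_prepare(self)

    def on_ready(self, target, face_object):
        # Face recognition on the face image in the ready message yields a
        # target face object; content is sent only if it is in the preset set.
        if face_object in self.allowed_faces:
            target.on_content("shared-screen-frames")


class TargetTerminal:
    """Device that displays the shared screen (the target terminal)."""

    def __init__(self, observed_face, gaze_seconds):
        self.observed_face = observed_face  # face seen by the connected camera
        self.gaze_seconds = gaze_seconds    # how long that face has watched
        self.received = None

    def on_prepare(self, caster):
        # If a face has watched for longer than the preset duration,
        # reply with a ready message carrying the captured face image.
        if self.gaze_seconds > PRESET_GAZE_SECONDS:
            caster.on_ready(self, self.observed_face)

    def on_content(self, content):
        self.received = content


caster = CastingTerminal(allowed_faces={"alice", "bob"})

tv = TargetTerminal(observed_face="alice", gaze_seconds=4.2)
caster.start_casting(tv)
assert tv.received == "shared-screen-frames"  # recognized face: screen shared

stranger_tv = TargetTerminal(observed_face="mallory", gaze_seconds=5.0)
caster.start_casting(stranger_tv)
assert stranger_tv.received is None  # unrecognized face: nothing is shared
```

The sketch shows the access-control property of the claims: even when the gaze-duration condition is met, content flows only after the casting terminal matches the reported face against its preset face object set.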
CN202011432446.3A 2020-12-10 2020-12-10 Method, apparatus, device and storage medium for sharing screen Active CN112416284B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011432446.3A CN112416284B (en) 2020-12-10 2020-12-10 Method, apparatus, device and storage medium for sharing screen

Publications (2)

Publication Number Publication Date
CN112416284A CN112416284A (en) 2021-02-26
CN112416284B true CN112416284B (en) 2022-09-23

Family

ID=74776454

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011432446.3A Active CN112416284B (en) 2020-12-10 2020-12-10 Method, apparatus, device and storage medium for sharing screen

Country Status (1)

Country Link
CN (1) CN112416284B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112994997B * 2021-05-21 2022-07-29 Shenzhen Transsion Holdings Co., Ltd. Processing method, processing apparatus, and storage medium
CN113691849A * 2021-08-05 2021-11-23 Shenzhen Konka Electronic Technology Co., Ltd. Screen projection control method and device, terminal equipment and storage medium
CN113938742B * 2021-10-11 2023-07-18 Shenzhen Skyworth-RGB Electronics Co., Ltd. Control method, system, equipment and storage medium for automatic screen-casting content playing
CN113965496B * 2021-10-15 2023-11-17 SAIC-GM-Wuling Automobile Co., Ltd. Method for optimizing screen-casting process response
CN114442986A * 2022-01-30 2022-05-06 Shenzhen Skyworth-RGB Electronics Co., Ltd. Information transmission method and device, screen projector and storage medium
CN115037979B * 2022-07-13 2023-09-01 Beijing Zitiao Network Technology Co., Ltd. Screen projection method and related equipment
CN115529490A * 2022-09-16 2022-12-27 Guangzhou Baolun Electronics Co., Ltd. Many-to-many screen projection method, device and system

Citations (3)

Publication number Priority date Publication date Assignee Title
CN107948281A (en) * 2017-11-24 2018-04-20 Vivo Mobile Communication Co., Ltd. Photo sharing method, mobile terminal and cloud server
CN110458564A (en) * 2019-08-12 2019-11-15 Tencent Technology (Shenzhen) Co., Ltd. Face recognition-based payment method, apparatus, terminal, system and storage medium
CN111258414A (en) * 2018-11-30 2020-06-09 Baidu Online Network Technology (Beijing) Co., Ltd. Method and device for adjusting screen

Similar Documents

Publication Publication Date Title
CN112416284B (en) Method, apparatus, device and storage medium for sharing screen
US10218651B2 (en) Virtual assistance for chat agents
CN107533417B (en) Presenting messages in a communication session
CN111669438B (en) Live broadcast message transmission method and device, electronic equipment and medium
US20200007944A1 (en) Method and apparatus for displaying interactive attributes during multimedia playback
US9600152B2 (en) Providing feedback for screen sharing
US20190051147A1 (en) Remote control method, apparatus, terminal device, and computer readable storage medium
CN112016068A (en) Account control method, device, equipment and computer readable storage medium
CN109819268B (en) Live broadcast room play control method, device, medium and equipment in video live broadcast
US11900006B2 (en) Synchronizing local room and remote sharing
CN111695516A (en) Heat map generation method, device and equipment
WO2022205784A1 (en) Interface information control method and apparatus
CN111901671B (en) Video connection method and device
KR102012049B1 (en) Method of screen sharing in terminal and terminal for screen sharing
CN111865630B (en) Topological information acquisition method, device, terminal and storage medium
CN111368184A (en) Screen saver release method and device for intelligent voice device and storage medium
CN110704151A (en) Information processing method and device and electronic equipment
EP4274237A1 (en) Information display method and apparatus, and device and medium
US11656834B2 (en) Information processing device, non-transitory recording medium, and information processing system
CN112752323B (en) Method and device for changing hot spot access state
US11456981B2 (en) System and method for capturing, storing, and transmitting presentations
CN112528052A (en) Multimedia content output method, device, electronic equipment and storage medium
CN109729410B (en) Live broadcast room interactive event processing method, device, equipment and storage medium
CN113589988A (en) Team activity implementation method and device, storage medium and electronic equipment
CN109726026B (en) Interactive data processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant