CN111338721A - Online interaction method, system, electronic device and storage medium - Google Patents


Info

Publication number
CN111338721A
CN111338721A (application CN202010065891.4A)
Authority
CN
China
Prior art keywords
terminal
data
display data
display
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010065891.4A
Other languages
Chinese (zh)
Inventor
苏浩
季东悦
郝虹阳
Current Assignee
Beijing Dami Future Technology Co ltd
Original Assignee
Beijing Dami Future Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dami Future Technology Co ltd
Priority to CN202010065891.4A
Publication of CN111338721A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/452Remote windowing, e.g. X-Window System, desktop virtualisation

Abstract

An online interaction method, system, electronic device and storage medium are disclosed. The method determines first display data according to second display data of a second terminal and sends the first display data to a first terminal; at the same time, it obtains first operation data of the first terminal and sends the first operation data to the second terminal, so that the second terminal synchronizes the first input operation according to the first operation data. The user receiving the screen share can therefore operate on the shared screen data as needed, which facilitates communication between users.

Description

Online interaction method, system, electronic device and storage medium
Technical Field
The present invention relates to the field of internet technologies, and in particular, to an online interaction method, system, electronic device, and storage medium.
Background
Online education is network-based learning: students and teachers carry out teaching activities over the network, so that students can study anywhere and at any time by means of online courseware.
In some scenarios, a platform operator (e.g., a course advisor or teacher) needs to communicate with a user (e.g., a student or parent), for example during teaching explanations, after-class summaries or course-sales sessions. In such communication, screen sharing can improve the user experience. However, during screen sharing in the prior art, the user receiving the screen data can usually only view it and cannot operate on it. For example, when a parent wants the course advisor to explain a certain area of the screen in detail, the parent cannot directly point out that area, which hinders communication between users.
Disclosure of Invention
In view of this, embodiments of the present invention provide an online interaction method, system, electronic device and storage medium, so that a user receiving screen sharing can operate screen data according to a requirement, thereby facilitating user communication.
In a first aspect, an embodiment of the present invention provides an online interaction method, where the method includes:
acquiring second display data from at least one second terminal, wherein the second display data are real-time display data of at least one part of a display device of the second terminal;
determining first display data according to the second display data;
sending the first display data to the first terminal; and
acquiring first operation data of the first terminal, wherein the first operation data is used for representing a first input operation received by the first terminal, the first operation data comprises a first operation command and a first operation coordinate, and the first operation coordinate is used for representing position information of the first input operation in the first display data and/or position information on a display screen of the first terminal; and
and sending the first operation data to the second terminal so that the second terminal synchronizes the first input operation according to the first operation data.
Preferably, the first input operation comprises dragging, line drawing, mouse clicking and/or keyboard input;
wherein, in response to the first input operation being line drawing, the first operation data further comprises a line color and/or a line width.
Preferably, determining the first display data from the second display data comprises:
identifying sensitive data in the second display data;
carrying out protection processing on sensitive data; and
obtaining the first display data based on the second display data after protection processing;
wherein the protection processing causes all or part of the sensitive data to be displayed in an obscured state or made invisible.
Preferably, identifying sensitive data in the second display data comprises:
identifying sensitive data in the second display data based on a first mapping relation and/or a second mapping relation, wherein the first mapping relation is a mapping relation between page address information in the second display data and a sensitive label, and the second mapping relation is a mapping relation between identification information corresponding to each display area in the second display data and the sensitive label; and/or
Identifying the sensitive data in the second display data based on a selection instruction, wherein the selection instruction is an acquired input instruction of the second terminal.
Preferably, the method further comprises:
acquiring second operation data of the second terminal, wherein the second operation data is used for representing a second input operation received by the second terminal, the second operation data comprises a second operation command and a second operation coordinate, and the second operation coordinate is used for representing position information of the second input operation in the second display data and/or position information on a display screen of the second terminal; and
and sending the second operation data to the first terminal so that the first terminal synchronizes the second input operation according to the second operation data.
Preferably, the method further comprises:
receiving second video data from the second terminal, wherein the second video data is video data recorded by a camera device of the second terminal; and
and sending the second video data to the first terminal.
Preferably, the method further comprises:
receiving first video data from the first terminal, wherein the first video data is video data recorded by a camera device of the first terminal; and
and sending the first video data to the second terminal.
In a second aspect, an embodiment of the present invention provides an online interaction system, where the system includes at least one first terminal, at least one server, and at least one second terminal;
the first terminal sends first operation data to the server, wherein the first operation data are used for representing a first input operation received by the first terminal, the first operation data comprise a first operation command and a first operation coordinate, and the first operation coordinate is used for representing position information of the first input operation in the first display data and/or position information on a display screen of the first terminal;
the second terminal sends second display data to the server, wherein the second display data are real-time display data of at least one part of a display device of the second terminal;
the server receives the second display data, determines first display data according to the second display data, and sends the first display data to the first terminal, and the server receives the first operation data and sends the first operation data to the second terminal, so that the second terminal synchronizes the first input operation according to the first operation data.
In a third aspect, an embodiment of the present invention provides an electronic device, including a memory and a processor, where the memory is used to store one or more computer program instructions, where the one or more computer program instructions are executed by the processor to implement the method according to the first aspect.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium on which computer program instructions are stored, which when executed by a processor implement the method according to the first aspect.
According to the technical solution of the embodiments of the present invention, first display data is determined according to second display data of a second terminal and sent to a first terminal; at the same time, first operation data of the first terminal is obtained and sent to the second terminal, so that the second terminal synchronizes the first input operation according to the first operation data. The user receiving the screen share can therefore operate on the shared screen data as needed, which facilitates communication between users.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following description of the embodiments of the present invention with reference to the accompanying drawings, in which:
FIG. 1 is a schematic diagram of an online interaction system of an embodiment of the invention;
FIG. 2 is a flow chart of an online interaction method of an embodiment of the invention;
FIG. 3 is a flow chart of determining first display data according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating second display data of a second terminal according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating first display data of a first terminal according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating first display data of a first terminal according to an embodiment of the present invention;
fig. 7 is a diagram illustrating second display data of a second terminal according to an embodiment of the present invention;
fig. 8 is a flowchart illustrating sharing of second operation data by the second terminal according to the embodiment of the present invention;
FIG. 9 is a flow chart of a video call of an embodiment of the present invention;
FIG. 10 is a flow chart of a method of online interaction of a server according to an embodiment of the invention;
fig. 11 is a schematic diagram of an electronic device of an embodiment of the invention.
Detailed Description
The present invention will be described below based on examples, but the present invention is not limited to only these examples. In the following detailed description of the present invention, certain specific details are set forth. It will be apparent to one skilled in the art that the present invention may be practiced without these specific details. Well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present invention.
Further, those of ordinary skill in the art will appreciate that the drawings provided herein are for illustrative purposes and are not necessarily drawn to scale.
Meanwhile, it should be understood that, in the following description, a "circuit" refers to a conductive loop formed by at least one element or sub-circuit through electrical or electromagnetic connection. When an element or circuit is referred to as being "connected to" another element, or when an element/circuit is referred to as being "connected between" two nodes, it may be directly coupled or connected to the other element, or intervening elements may be present; the connection between the elements may be physical, logical, or a combination thereof. In contrast, when an element is referred to as being "directly coupled" or "directly connected" to another element, there are no intervening elements present.
Unless the context clearly requires otherwise, throughout the description, the words "comprise", "comprising", and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is, what is meant is "including, but not limited to".
In the description of the present invention, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
FIG. 1 is a schematic diagram of an online interaction system of an embodiment of the invention. As shown in fig. 1, the online interactive system according to the embodiment of the present invention includes at least one first terminal 1, at least one server 3, and at least one second terminal 2. The first terminal 1 sends first operation data to the server 3, where the first operation data is used to represent a first input operation received by the first terminal 1, the first operation data includes a first operation command and a first operation coordinate, and the first operation coordinate is used to represent position information of the first input operation in the first display data and/or position information on a display screen of the first terminal 1. The second terminal 2 sends second display data to the server 3, where the second display data is real-time display data of at least a part of a display device of the second terminal 2. The server 3 receives the second display data, determines first display data according to the second display data, and sends the first display data to the first terminal 1, and the server 3 receives the first operation data, and sends the first operation data to the second terminal 2, so that the second terminal 2 synchronizes the first input operation according to the first operation data.
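The first operation data described above can be sketched as a small message object. This is a hypothetical illustration: the patent only specifies that the data carries a first operation command and a first operation coordinate (plus line color/width for line drawing); the field names below are not from the original.

```javascript
// Hypothetical shape of the first operation data sent from the first
// terminal 1 to the server 3; field names are illustrative.
function makeFirstOperationData(command, x, y, extras = {}) {
  return {
    command,              // first operation command, e.g. 'draw-line' or 'click'
    coordinate: { x, y }, // first operation coordinate within the first display data
    ...extras,            // optional parameters, e.g. lineColor, lineWidth
  };
}

// A line-drawing operation as the first terminal might report it:
const op = makeFirstOperationData('draw-line', 120, 80, { lineColor: '#ff0000', lineWidth: 2 });
```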
Further, the first input operation includes dragging, line drawing, mouse clicking, keyboard input and the like. In response to the first input operation being line drawing, the first operation data further includes a line color and/or a line width.
Thereby, the user can operate the screen data through the first terminal 1.
Further, before the server 3 acquires the second display data of the second terminal 2, it needs to establish a communication connection with the first terminal 1 and the second terminal 2. Specifically, the first terminal 1 or the second terminal 2 sends a connection request to the server 3, where the connection request includes identification information of the first terminal 1 and/or the second terminal 2. The server 3 is configured to establish a first communication connection with the first terminal 1 according to the connection request, and establish a second communication connection with the second terminal 2.
In this embodiment, the at least one first terminal 1 is a terminal that receives display data. Preferably, the first terminal 1 is a terminal used by a student or parent. It should be understood that although the embodiment describes the screen sharing system with two first terminals as an example, the number of first terminals 1 is not limited; there may be one or more.
In this embodiment, the second terminal 2 is a terminal that shares its display data. Preferably, the second terminal 2 is a terminal used by a platform operator, for example a course advisor, salesperson or teacher. It should be understood that although the embodiment describes the screen sharing system with one second terminal as an example, the number of second terminals is not limited; there may be one or more.
In this embodiment, the first terminal 1 and the second terminal 2 may each be a mobile phone, a tablet computer, a notebook computer, a desktop computer, or another electronic device with wireless communication functions. The first terminal 1 and the second terminal 2 comprise at least a processor capable of processing image data and a display device capable of displaying image data. In this embodiment, the second terminal has the function of sharing display data with other terminals, and the first terminal has at least the function of displaying screen data.
In this embodiment, the description takes one second terminal sharing display data with a plurality of first terminals as an example. It should be understood that the scheme provided in this embodiment is equally applicable to scenarios in which a plurality of second terminals simultaneously share display data with a plurality of first terminals, or a plurality of second terminals share display data with one first terminal, and the like.
Further, the server 3 may be implemented by an independent server or by a server cluster composed of a plurality of servers. In an optional implementation, the server 3 includes a front-end server and a back-end server, where the back-end server is configured to obtain the second display data of the second terminal 2, and the front-end server is configured to send the first display data to the first terminal 1 so that it is displayed there. Preferably, the front-end server and the back-end server establish a connection through WebSocket, a full-duplex communication protocol over TCP (Transmission Control Protocol).
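The front-end/back-end split can be sketched in process with plain callbacks standing in for the WebSocket transport. This is a minimal illustration under stated assumptions: the derivation step and all names are invented here, not taken from the patent.

```javascript
// In-process sketch of the relay: the back-end ingests second display
// data and derives first display data (via an injected function, e.g.
// sensitive-data masking), and the front-end fans it out to connected
// first terminals. Callbacks stand in for WebSocket connections.
class RelayServer {
  constructor(deriveFirstDisplayData) {
    this.derive = deriveFirstDisplayData; // back-end processing step
    this.firstTerminals = [];             // front-end connections
  }
  subscribe(onFirstDisplayData) {
    this.firstTerminals.push(onFirstDisplayData);
  }
  onSecondDisplayData(secondDisplayData) {
    const firstDisplayData = this.derive(secondDisplayData);
    for (const push of this.firstTerminals) push(firstDisplayData);
  }
}

// Usage: mask the word "secret" before fan-out.
const relay = new RelayServer(frame => frame.replace('secret', '******'));
const received = [];
relay.subscribe(frame => received.push(frame));
relay.onSecondDisplayData('order page: secret');
```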
FIG. 2 is a flow chart of an online interaction method of an embodiment of the invention. As shown in fig. 2, the online interaction method according to the embodiment of the present invention includes the following steps:
step S201, the first terminal sends a connection request to the server.
In this embodiment, the first terminal 1 sends a connection request to the server 3 to request a connection to be established with the second terminal. Wherein the connection request comprises identification information of the second terminal 2.
In an alternative implementation, the first terminal 1 may send the connection request by scanning a two-dimensional code of the second terminal 2. Specifically, when the user of the second terminal 2 needs to share the second display data with the first terminal 1, that user inputs an operation command to obtain the two-dimensional code of the second terminal 2; optionally, the user may click a predetermined function area on the screen to obtain it. After obtaining the two-dimensional code, the user of the second terminal 2 may send it to the user of the first terminal 1 through an instant messaging application (e.g., WeChat or QQ). The user then scans the two-dimensional code with the first terminal 1 to send a connection request to the server 3. Optionally, the first terminal 1 sends the connection request to the server directly upon obtaining the scanning result, or it first asks the user whether to send the connection request and sends it to the server 3 only after receiving the user's confirmation.
In another optional implementation manner, the first terminal 1 may input identification information of the second terminal 2 through an operation interface to send the connection request to the server 3, where the identification information of the second terminal 2 may be a user account, a user name, or the like.
It should be understood that, in this embodiment, the connection request may instead be sent by the second terminal 2 to the server 3, in which case the connection request includes identification information of at least one first terminal 1.
Step S202, the server establishes communication connection.
In this embodiment, after receiving a connection request of a first terminal, the server 3 parses the connection request to obtain identification information of a second terminal 2, establishes a first communication connection with the first terminal 1 according to the identification information, and establishes a second communication connection with the second terminal 2.
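The pairing step can be sketched as a small registry that resolves the identification information in a connection request to a session linking the two communication connections. Real connection objects are reduced to plain ids here, and all names are illustrative assumptions.

```javascript
// Sketch of step S202: record which first terminals are attached to
// which second terminal after their connection requests are parsed.
class SessionRegistry {
  constructor() {
    this.sessions = new Map(); // second terminal id -> set of first terminal ids
  }
  handleConnectionRequest({ firstTerminalId, secondTerminalId }) {
    if (!this.sessions.has(secondTerminalId)) {
      this.sessions.set(secondTerminalId, new Set());
    }
    this.sessions.get(secondTerminalId).add(firstTerminalId);
    return { firstTerminalId, secondTerminalId, status: 'connected' };
  }
  firstTerminalsOf(secondTerminalId) {
    return [...(this.sessions.get(secondTerminalId) ?? [])];
  }
}

// Two first terminals attach to the same second terminal.
const registry = new SessionRegistry();
registry.handleConnectionRequest({ firstTerminalId: 'parent-phone', secondTerminalId: 'advisor-pc' });
registry.handleConnectionRequest({ firstTerminalId: 'student-tablet', secondTerminalId: 'advisor-pc' });
```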
Step S203, the server sends a connection establishment notification.
In this embodiment, after establishing the communication connection with the first terminal 1 and the second terminal 2, the server 3 sends a notification of successful establishment of the communication connection to the second terminal 2.
Optionally, after establishing the communication connection with the first terminal 1 and the second terminal 2, the server 3 sends a notification of successful establishment of the communication connection to the first terminal 1.
And step S204, the second terminal sends second display data.
In this embodiment, after receiving the notification message of successful establishment of communication connection from the server, the second terminal 2 obtains second display data of the display device in real time, and sends the second display data to the server 3. The second display data is real-time display data of at least one part of a display device of the second terminal.
Further, the second display data includes display interface data, and the display interface data is screen data of a display device of the second terminal.
In an alternative implementation, the second display data sent by the second terminal 2 to the server may be all real-time display data of the display device of the second terminal 2.
In another alternative implementation, the second display data sent by the second terminal 2 to the server may be real-time display data of a portion of the display device of the second terminal 2. Specifically, the real-time display data of the portion of the display device of the second terminal 2 may be a region designated by the user or a predetermined region. When the area is designated by the user, the user can select an area to be transmitted by a frame selection or the like. In the case of a preset area, the area to be transmitted may be selected by setting a domain name of display data or a designated application or page in advance.
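Sharing only part of the display amounts to cutting the frame-selected (or preset) rectangle out of the full frame before sending. The sketch below models the frame as an array of text rows purely for illustration; a real implementation would crop pixel data.

```javascript
// Sketch: extract the user-selected sub-rectangle of the display device.
function cropRegion(frame, { left, top, width, height }) {
  return frame
    .slice(top, top + height)
    .map(row => row.slice(left, left + width));
}

const fullFrame = [
  'abcdef',
  'ghijkl',
  'mnopqr',
  'stuvwx',
];
// Share only the 2x2 region starting at column 2, row 1.
const shared = cropRegion(fullFrame, { left: 2, top: 1, width: 2, height: 2 });
```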
Step S205, the server determines the first display data.
In this embodiment, the server 3 identifies sensitive data in the second display data and performs protection processing on the second display data according to the sensitive data to obtain the first display data, where the protection processing causes all or part of the sensitive data to be displayed in a blurred state or made invisible.
Further, the protection process may be any one or more of scratch-out, replacement, blocking (e.g., mosaic), hiding, blurring, and the like on the sensitive data.
Specifically, the step of the server 3 determining the presentation data may refer to fig. 3, and includes the following steps:
and step S2051, identifying sensitive data in the second display data.
In an optional implementation manner, the second terminal 2 sends a selection instruction to the server, and the server identifies the sensitive data in the second display data according to the selection instruction, where the selection instruction is an acquired input instruction of the second terminal. Specifically, when the user needs to protect all data or part of data in the second display data of the second terminal 2, the user may select a sensitive area in the second display data in a frame selection manner or the like to generate a selection instruction, and send the selection instruction to the server 3.
In another alternative implementation, the server 3 may identify sensitive data based on a predetermined mapping relationship.
For example, the server 3 may identify the sensitive data in the second display data of the second terminal based on a first mapping relationship, where the first mapping relationship is a mapping relationship between page address information and a sensitive tag in the second display data. Specifically, a sensitive tag of each page address information may be preset, where the page address information may be a domain name or an IP (Internet Protocol) address, and the like. And after receiving the second display data, the server 3 acquires page address information of the second display data, and identifies the sensitive data in the second display data according to the mapping relation between the page address information and the sensitive label.
For another example, the server 3 may identify the sensitive data in the second display data of the second terminal based on a second mapping relationship, where the second mapping relationship is a mapping relationship between the identification information corresponding to each display area in the second display data and the sensitive label. Specifically, one page often includes different functional areas, the page may be divided into a plurality of display areas according to the different functional areas, and the sensitive tag corresponding to the identification information of each display area in the page is preset. And after receiving the second display data, the server 3 acquires the identification information of each display area of the second display data, and identifies the sensitive data in the second display data according to the mapping relation between the identification information and the sensitive label.
For another example, the server 3 may further identify sensitive data in the second display data of the second terminal based on the first mapping relationship and the second mapping relationship. For example, the server 3 may first detect whether the received second display data is a sensitive page through the first mapping relationship, and determine that the second display data does not include the sensitive data in response to that the second display data is not a sensitive page. And in response to that the second display data is the sensitive page, identifying whether each display area in the second display data is the sensitive data or not through the second mapping relation.
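The combined two-level lookup in step S2051 can be sketched as follows. The mapping contents (page addresses, area ids) are invented for illustration; only the first-then-second lookup order comes from the text above.

```javascript
// First mapping: page address -> sensitive label (illustrative entries).
const firstMapping = new Map([
  ['pay.example.com/order', true],
  ['lesson.example.com/slide', false],
]);
// Second mapping: display area id -> sensitive label (illustrative entries).
const secondMapping = new Map([
  ['phone-number-area', true],
  ['courseware-area', false],
]);

function findSensitiveAreas(pageAddress, areaIds) {
  // Pages not flagged by the first mapping contain no sensitive data.
  if (!firstMapping.get(pageAddress)) return [];
  // On a sensitive page, check each display area via the second mapping.
  return areaIds.filter(id => secondMapping.get(id) === true);
}
```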
And step S2052, protection processing is carried out on the sensitive data.
Step S2053 is to obtain the first display data based on the second display data after the protection processing.
In this embodiment, after the server 3 identifies the sensitive data through the above steps, it performs protection processing on the screen data according to the sensitive data to obtain the first display data; the protection processing causes all or part of the sensitive data to be displayed in a blurred state or made invisible.
Further, the protection process may be any one or more of scratch-out, replacement, blocking (e.g., mosaic), hiding, blurring, and the like on the sensitive data.
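The protection processing of steps S2052/S2053 can be sketched as overwriting the identified sensitive spans before the first display data leaves the server. A real implementation would blur or mosaic pixel regions; text and block characters stand in for that here, and the function name is illustrative.

```javascript
// Sketch: overwrite sensitive character ranges so the content is
// unrecognizable at the first terminal.
function protectSensitive(text, sensitiveRanges) {
  const chars = [...text];
  for (const [start, end] of sensitiveRanges) {
    for (let i = start; i < end && i < chars.length; i++) chars[i] = '█';
  }
  return chars.join('');
}

// Mask the phone number in one line of a shared order page.
const firstDisplayData = protectSensitive('Contact: 13800001111', [[9, 20]]);
```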
Step S206, the server sends the first display data.
In this embodiment, the server 3 sends the first display data to the first terminal 1 through the first communication connection after determining the first display data.
And step S207, the first terminal displays the first display data.
In this embodiment, after receiving the first display data sent by the server 3, the first terminal 1 displays it through its display device.
Specifically, screen data sharing between the first terminal 1 and the second terminal 2 may refer to fig. 4 and 5. Fig. 4 is a schematic diagram of second display data of the second terminal according to the embodiment of the present invention, and fig. 5 is a schematic diagram of first display data of the first terminal according to the embodiment of the present invention. In this embodiment, the first terminal 1 is a mobile phone and the second terminal 2 is a computer by way of example; it should be understood that both can be implemented by a mobile phone, a tablet computer, a notebook computer, a desktop computer, or another electronic device with wireless communication functions. As shown in fig. 4, the second display data of the second terminal 2 may be divided into a plurality of regions, including region a1, region a2 and region a3, and controls k1, k2 and k3. Taking the data in region a2 as an example, the first display data received by the first terminal, as shown in fig. 5, includes region b1, region b2, region b3 and controls k1, k2 and k3. Since the data of region a2 is sensitive data, the server 3 performs protection processing on it so that the data is not visible at the first terminal.
Step S208, the first terminal acquires first operation data.
In this embodiment, a first terminal 1 obtains first operation data input by a user, where the first operation data is used to represent a first input operation received by the first terminal, the first operation data includes a first operation command and a first operation coordinate, and the first operation coordinate is used to represent position information of the first input operation in first display data and/or position information on a display screen of the first terminal.
In this embodiment, the first input operation includes operations such as line drawing, in which case the first operation data further includes a line color and/or a line width.
Further, a drawing tool, such as a canvas, may be provided on the first terminal 1, and various figures or lines may be drawn on the screen through the API (Application Programming Interface) provided by the canvas. Thus, when the user of the first terminal needs to select a certain region, the corresponding region is selected by drawing a corresponding shape with the drawing tool. The first terminal 1 captures the user operation and generates a first screen operation command and the corresponding operation parameters, where the first screen operation command represents the operation the user intends to input on the first terminal, and the operation parameters include operation coordinates, line color and the like, the operation coordinates representing the position of the first screen operation command in the display data.
In this embodiment, the first input operation further includes dragging, mouse clicking, keyboard input and the like. Specifically, the user may input an operation through an input device such as a mouse, keyboard or touch screen, for example clicking a certain control on the screen. Mouse-and-keyboard response software may be provided on the first terminal so that, when the user inputs an operation through the mouse or keyboard, the corresponding response event, first screen operation command and operation parameters are captured. Thus, the user can perform operations such as dragging, mouse clicking and keyboard input.
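Since the first operation coordinate may describe a position either in the first display data or on the first terminal's screen, one plausible approach is to normalize screen positions into the display data's coordinate space so the second terminal can locate the operation regardless of the two screens' resolutions. This is a sketch under the assumption of simple linear scaling per axis; the patent does not prescribe a particular conversion.

```javascript
// Sketch: convert a point on the first terminal's screen into the
// first display data's coordinate space by linear scaling.
function toDisplayDataCoordinate(screenPoint, screenSize, displayDataSize) {
  return {
    x: Math.round(screenPoint.x * displayDataSize.width / screenSize.width),
    y: Math.round(screenPoint.y * displayDataSize.height / screenSize.height),
  };
}

const mapped = toDisplayDataCoordinate(
  { x: 375, y: 667 },             // tap position on the phone screen
  { width: 750, height: 1334 },   // first terminal screen size
  { width: 1500, height: 2668 },  // first display data size
);
```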
Step S209, the first terminal sends the first operation data.
In the present embodiment, the first terminal 1 generates first operation data according to an operation by a user, and transmits the first operation data to the server 3 through the first communication connection.
Step S210, the server sends the first operation data.
In this embodiment, the server 3 transmits the first operation data to the second terminal 2 through the second communication connection.
And step S211, the second terminal executes the first operation.
In this embodiment, when the first operation data represents a line drawing, a drawing tool such as a canvas may be provided at the second terminal 2. After receiving the first operation data, the second terminal executes the first operation command at the position indicated by the first operation coordinate to draw the corresponding shape.
Further, when the first operation data represents a drag, a mouse click, a keyboard input, or the like, a GUI (Graphical User Interface) automation tool such as RobotJS may be provided at the second terminal 2; RobotJS is a GUI automation library that can control the mouse and keyboard and read the screen. After receiving the first operation data, the second terminal performs the corresponding mouse or keyboard input in the region indicated by the first operation coordinate, for example, controlling the mouse to click a certain control on the screen. In this way, the user of the first terminal can operate on the shared screen data as needed, which facilitates communication between the users.
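A minimal sketch of the replay logic on the second terminal. The backend interface imitates calls that a GUI automation tool such as RobotJS exposes (`moveMouse`, `mouseClick`, `typeString`), but here it is injected as a recording stub so the dispatch logic can be shown without the real library; the operation-data field names are assumptions for illustration:

```javascript
// Sketch: replaying received operation data through a GUI automation backend.
// Normalized coordinates are scaled back to the second terminal's resolution.
function replayOperation(op, screen, backend) {
  const px = Math.round(op.x * screen.width);
  const py = Math.round(op.y * screen.height);
  switch (op.command) {
    case 'click':
      backend.moveMouse(px, py);
      backend.mouseClick();
      break;
    case 'key':
      backend.typeString(op.text);
      break;
    default:
      throw new Error(`unknown command: ${op.command}`);
  }
}

// A recording stub stands in for a real automation tool here.
const calls = [];
const stub = {
  moveMouse: (x, y) => calls.push(['moveMouse', x, y]),
  mouseClick: () => calls.push(['mouseClick']),
  typeString: (s) => calls.push(['typeString', s]),
};
replayOperation({ command: 'click', x: 0.5, y: 0.25 },
                { width: 1920, height: 1080 }, stub);
console.log(calls); // [['moveMouse', 960, 270], ['mouseClick']]
```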
Specifically, for the screen data operation of the first terminal 1, reference may be made to fig. 6 and 7. Fig. 6 is a schematic diagram of first display data of a first terminal according to an embodiment of the present invention, and fig. 7 is a schematic diagram of second display data of a second terminal according to an embodiment of the present invention. As before, the first terminal 1 is taken to be a mobile phone and the second terminal 2 a computer by way of example; it should be understood that both the first terminal 1 and the second terminal 2 may be implemented by a mobile phone, a tablet computer, a notebook computer, a desktop computer, or another electronic device with wireless communication functions. As shown in fig. 6, assuming that the user of the first terminal wishes to point the user of the second terminal to the contents of the second row in region b1, the corresponding section is selected with the line drawing tool, as shown at c1 in fig. 6. After receiving the first operation data, the second terminal 2 draws the corresponding shape in the corresponding area of the screen according to the first operation parameters, such as the first operation coordinate and the line color, as shown at d1 in fig. 7. For another example, if the user of the first terminal 1 needs to operate the control k1, the corresponding region is clicked through the touch screen or the mouse; after receiving the first operation data, the second terminal 2 generates a click event according to the first operation coordinate and clicks the corresponding region.
According to the embodiment of the invention, first display data is determined according to the second display data of the second terminal and sent to the first terminal; meanwhile, first operation data of the first terminal is obtained and sent to the second terminal, so that the second terminal synchronizes the first input operation according to the first operation data. Thus, the user receiving the shared screen can operate on the screen data as needed, which facilitates communication between the users.
Further, the first terminal 1 may also synchronize the input operation of the second terminal. As shown in fig. 8, the method includes the following steps:
step S801, the second terminal acquires second operation data.
In this embodiment, the second operation data is used to represent a second input operation received by the second terminal, the second operation data includes a second operation command and a second operation coordinate, and the second operation coordinate is used to represent position information of the second input operation in the second display data and/or position information on a display screen of the second terminal.
Further, the second input operation includes dragging, line drawing, mouse clicking and/or keyboard input. When the second input operation is a line drawing, the second operation data further comprises a line color and/or a line width.
And step S802, the second terminal sends second operation data to the server.
In this embodiment, the second terminal 2 transmits the second operation data to the server 3 through the second communication connection.
Step S803, the server sends the second operation data to the first terminal through the first communication connection.
And step S804, the first terminal displays the second operation data.
In this embodiment, the first terminal displays the corresponding operation in the corresponding area according to the second operation coordinate in the second operation data.
Therefore, the second operation command and the screen data can be displayed on the first terminal synchronously, so that the user of the first terminal can more intuitively see the operation of the user of the second terminal.
Optionally, in order to further facilitate user communication, the method according to the embodiment of the present invention may further provide a video call function for the user, and specifically as shown in fig. 9, the method includes the following steps:
step S901, the second terminal acquires the second video data.
In the present embodiment, the second terminal 2 acquires the second video data by the camera.
And step S902, the second terminal sends the second video data.
In this embodiment, after the second video data is recorded by the camera device of the second terminal 2, the second video data is sent to the server through the second communication connection.
And step S903, the server sends the second video data.
In this embodiment, the server 3 transmits the second video data to the first terminal 1 via a first communication connection.
And step S904, displaying by the first terminal.
In this embodiment, after receiving the second video data sent by the server, the first terminal 1 displays the second video data in a predetermined area of the screen.
Alternatively, the user of the first terminal may change the display area and size of the second video data by dragging or the like.
Thereby, the user of the first terminal can acquire the video data of the second terminal.
Optionally, in order to further facilitate user communication, the method according to the embodiment of the present invention further includes:
step S905, the first terminal obtains the first video data.
In the present embodiment, the first terminal 1 acquires the first video data by the image pickup device.
And step S906, the first terminal sends the first video data.
In this embodiment, after the first video data is recorded by the camera device of the first terminal 1, the first video data is sent to the server through the first communication connection.
Step S907, the server transmits the first video data.
In this embodiment, the server 3 transmits the first video data to the second terminal 2 via a second communication connection.
And step S908, displaying by the second terminal.
In this embodiment, after receiving the first video data sent by the server, the second terminal 2 displays the first video data in a predetermined area of the screen.
Alternatively, the user of the second terminal may change the display area and size of the first video data by dragging or the like.
Thus, the video call between the first terminal and the second terminal can be realized through the above steps S901 to S908.
Fig. 10 is a flowchart of an online interaction method of a server according to an embodiment of the present invention. As shown in fig. 10, the method of the embodiment of the present invention includes the steps of:
and step S1010, acquiring second display data from at least one second terminal.
In this embodiment, before the server 3 acquires the second display data of the second terminal 2, it needs to establish a communication connection with the first terminal 1 and the second terminal 2. Specifically, the first terminal 1 or the second terminal 2 sends a connection request to the server 3, where the connection request includes identification information of the first terminal 1 and/or the second terminal 2. The server 3 is configured to establish a first communication connection with the first terminal 1 according to the connection request, and establish a second communication connection with the second terminal 2.
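The pairing of the two communication connections might be sketched as follows; the `role` field, the `terminalId` naming, and the two-party session model are assumptions for illustration, not details from this embodiment:

```javascript
// Sketch: the server pairs the first and second communication connections
// using the identification information carried in each connection request.
class Session {
  constructor() {
    this.terminals = new Map(); // role ('first' | 'second') -> terminal id
  }
  connect(request) {
    // request: { terminalId, role } — field names assumed for illustration
    this.terminals.set(request.role, request.terminalId);
    return this.terminals.size === 2; // true once both connections exist
  }
}

const s = new Session();
console.log(s.connect({ terminalId: 'phone-1', role: 'first' }));  // false
console.log(s.connect({ terminalId: 'pc-2', role: 'second' }));    // true
```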
The second display data is real-time display data of at least a part of a display device of the second terminal. After receiving the notification from the server that the communication connections have been established successfully, the second terminal 2 acquires the second display data of its display device in real time and sends the second display data to the server 3.
Further, the second display data includes display interface data, and the display interface data is screen data of a display device of the second terminal.
In an alternative implementation, the second display data sent by the second terminal 2 to the server may be all real-time display data of the display device of the second terminal 2.
In another alternative implementation, the second display data sent by the second terminal 2 to the server may be real-time display data of a portion of the display device of the second terminal 2. Specifically, the portion may be a region designated by the user or a predetermined region. When the region is designated by the user, the user can select the region to be transmitted by frame selection or the like. When the region is preset, the region to be transmitted may be determined by presetting the domain name of the second display data to be presented or a designated application program.
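Extracting a user-designated portion of the display data can be illustrated with a simple rectangle crop over a frame represented as rows of pixels; the `frame`/`rect` representation is an assumption for illustration:

```javascript
// Sketch: keep only the user-selected rectangle of the full frame,
// so that just that region is sent to the server for sharing.
function cropRegion(frame, rect) {
  return frame
    .slice(rect.top, rect.top + rect.height)       // keep selected rows
    .map(row => row.slice(rect.left, rect.left + rect.width)); // and columns
}

const frame = [
  [1, 2, 3, 4],
  [5, 6, 7, 8],
  [9, 10, 11, 12],
];
const shared = cropRegion(frame, { left: 1, top: 0, width: 2, height: 2 });
console.log(shared); // [[2, 3], [6, 7]]
```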
Further, the second display data further includes input operation data, which represents the marks left by the user of the second terminal 2 when operating on the screen. For example, when the user of the second terminal 2 performs operations such as dragging, drawing, and clicking, input operation data is formed on the screen. The second terminal obtains the input operation data and sends it to the server 3 through the second communication connection, and the server sends it to the first terminal through the first communication connection, so that the first terminal synchronously displays the operations of the second terminal when displaying the interface data. Thus, the input operation data can be displayed on the first terminal, so that the user of the first terminal can see the operations of the user of the second terminal more intuitively.
Step S1020, determining first display data according to the second display data;
in this embodiment, the server 3 detects the sensitive data in the screen data of the second terminal 2 according to a predetermined mapping relationship.
In an alternative implementation manner, the server 3 may identify the sensitive data in the second display data of the second terminal based on a first mapping relationship, where the first mapping relationship is a mapping relationship between page address information in the second display data and the sensitive tag. Specifically, a sensitive tag of each page address information may be preset, where the page address information may be a domain name or an IP (Internet Protocol) address, and the like. And after receiving the second display data, the server 3 acquires page address information of the second display data, and identifies the sensitive data in the second display data according to the mapping relation between the page address information and the sensitive label.
In another optional implementation manner, the server 3 may identify the sensitive data in the second display data of the second terminal based on a second mapping relationship, where the second mapping relationship is a mapping relationship between the identification information and the sensitive label corresponding to each display area in the second display data. Specifically, one page often includes different functional areas, the page may be divided into a plurality of display areas according to the different functional areas, and the sensitive tag corresponding to the identification information of each display area in the page is preset. And after receiving the second display data, the server 3 acquires the identification information of each display area of the second display data, and identifies the sensitive data in the second display data according to the mapping relation between the identification information and the sensitive label.
In yet another alternative implementation, the server 3 may further identify sensitive data in the second display data of the second terminal based on the first mapping relationship and the second mapping relationship. For example, the server 3 may first detect whether the received second display data is a sensitive page through the first mapping relationship, and determine that the second display data does not include the sensitive data in response to that the second display data is not a sensitive page. And in response to that the second display data is the sensitive page, identifying whether each display area in the second display data is the sensitive data or not through the second mapping relation.
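The two mapping relationships can be sketched as lookup tables, applied in the order just described (first the page-level check, then the region-level check); the concrete domain names, region identifiers, and field names below are invented for illustration:

```javascript
// Sketch: first mapping (page address -> sensitive flag) and
// second mapping (display-area id -> sensitive flag).
const pageSensitiveMap = { 'pay.example.com': true, 'docs.example.com': false };
const regionSensitiveMap = { 'order-list': false, 'card-number': true };

function findSensitiveRegions(display) {
  // First mapping: pages not marked sensitive need no region check.
  if (!pageSensitiveMap[display.pageAddress]) return [];
  // Second mapping: flag only the display areas marked sensitive.
  return display.regions.filter(r => regionSensitiveMap[r.id]);
}

const display = {
  pageAddress: 'pay.example.com',
  regions: [{ id: 'order-list' }, { id: 'card-number' }],
};
console.log(findSensitiveRegions(display).map(r => r.id)); // ['card-number']
```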
The embodiment of the present invention may further identify the sensitive data in the second display data based on a selection instruction, specifically, the server obtains the selection instruction sent by the second terminal, where the selection instruction is an input instruction of the second terminal obtained through the second communication connection, and further, the selection instruction includes location information of the sensitive data in the second display data.
In this embodiment, the user of the second terminal 2 may select the sensitive data in the second display data by a frame selection or the like to generate a selection instruction, and send the selection instruction to the server 3. And the server 3 carries out protection processing on the corresponding display area according to the selection instruction. Therefore, sensitive data can be effectively protected in a mode designated by a user, and leakage is prevented.
Thus, the server can acquire the sensitive data in the second display data.
Further, the server 3 performs protection processing on the sensitive data to obtain first display data.
In this embodiment, the protection process is used to make all or part of the sensitive data either displayed in an obscured state or invisible.
Further, the protection process may be any one or more of deleting, replacing, blocking (e.g., applying a mosaic), hiding, and blurring the sensitive data.
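One possible protection process, replacement with mask characters, can be sketched as follows; the keep-last-four policy is an illustrative assumption, not a detail of this embodiment:

```javascript
// Sketch: replace the leading characters of a sensitive string so that
// it appears obscured in the first display data sent to the first terminal.
function maskSensitive(text, visibleTail = 4) {
  const hidden = Math.max(0, text.length - visibleTail);
  return '*'.repeat(hidden) + text.slice(hidden);
}

console.log(maskSensitive('6222020200112233')); // '************2233'
```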
Step S1030, sending the first display data to the first terminal.
In this embodiment, the server 3 sends the first display data to the first terminal 1 through the first communication connection after determining the first display data. After receiving the first display data sent by the server 3, the first terminal 1 displays the first display data through the display device. Thus, screen sharing can be achieved.
Step S1040, obtaining first operation data of the first terminal.
In this embodiment, the first operation data is used to represent a first input operation received by the first terminal, the first operation data includes a first operation command and a first operation coordinate, and the first operation coordinate is used to represent position information of the first input operation in the first display data and/or position information on a display screen of the first terminal.
Step S1050, sending the first operation data to the second terminal, so that the second terminal synchronizes the first input operation according to the first operation data.
In this embodiment, a first terminal 1 obtains first operation data input by a user, where the first operation data is used to represent a first input operation received by the first terminal, the first operation data includes a first operation command and a first operation coordinate, and the first operation coordinate is used to represent position information of the first input operation in first display data and/or position information on a display screen of the first terminal.
In this embodiment, the first input operation includes operations such as line drawing (scribing), in which case the first operation data further includes a line color and/or a line width.
Further, a drawing tool such as an HTML canvas may be provided on the first terminal 1, and various figures or lines may be drawn on the screen through the API (Application Programming Interface) provided by the canvas. Thus, when the user of the first terminal needs to designate a certain region, the corresponding region is selected by drawing a corresponding shape with the line drawing tool. The first terminal 1 captures the user operation and generates a first screen operation command and corresponding operation parameters, where the first screen operation command represents the operation the user of the first terminal intends to perform, and the operation parameters include the operation coordinates, the line color, and the like, the operation coordinates representing the position of the first screen operation command within the display data.
In this embodiment, the first input operation may also include dragging, mouse clicking, keyboard input, and the like. Specifically, the user may input the desired operation through an input device such as a mouse, a keyboard, or a touch screen, for example, clicking a certain control on the screen. Mouse and keyboard event-listening software may be provided at the first terminal, so that when the user inputs an operation through the mouse or the keyboard, the corresponding response event, the first screen operation command, and the corresponding operation parameters are captured. In this way, the user can perform operations such as dragging, mouse clicking, and keyboard input.
In the present embodiment, the first terminal 1 generates first operation data according to an operation by a user, and transmits the first operation data to the server 3 through the first communication connection.
In this embodiment, the server 3 transmits said first operation data to the second terminal 2 via the second communication connection.
In this embodiment, when the first operation data represents a line drawing, a drawing tool such as a canvas may be provided at the second terminal 2. After receiving the first operation data, the second terminal executes the first operation command at the position indicated by the first operation coordinate to draw the corresponding shape.
Further, when the first operation data represents a drag, a mouse click, a keyboard input, or the like, a GUI (Graphical User Interface) automation tool such as RobotJS may be provided at the second terminal 2; RobotJS is a GUI automation library that can control the mouse and keyboard and read the screen. After receiving the first operation data, the second terminal performs the corresponding mouse or keyboard input in the region indicated by the first operation coordinate, for example, controlling the mouse to click a certain control on the screen. In this way, the user of the first terminal can operate on the shared screen data as needed, which facilitates communication between the users.
Optionally, in order to further facilitate user communication, the server according to the embodiment of the present invention may further provide a video call for the user, and specifically includes the following steps:
step S1061, receiving second video data from the second terminal through the second communication connection, where the second video data is video data recorded by a camera of the second terminal.
Step S1062, sending the second video data to the first terminal through the first communication connection.
Step S1063, receiving first video data from the first terminal through the first communication connection, where the first video data is video data recorded by a camera of the first terminal.
Step S1064, sending the first video data to the second terminal through the second communication connection.
It should be understood that steps S1061-S1062 and steps S1063-S1064 are not limited to the execution order described above. In this way, a video call between the users of the first terminal and the second terminal can be realized.
Fig. 11 is a schematic diagram of an electronic device according to an embodiment of the invention. The electronic device shown in fig. 11 is a general-purpose data processing apparatus comprising a general-purpose computer hardware structure that includes at least a processor 111 and a memory 112, connected by a bus 113. The memory 112 is adapted to store instructions or programs executable by the processor 111. The processor 111 may be a stand-alone microprocessor or a collection of one or more microprocessors; it implements the processing of data and the control of other devices by executing the instructions stored in the memory 112, thereby performing the method flows of the embodiments of the present invention described above. The bus 113 connects these components together and also connects them to a display controller 114, a display device, and input/output (I/O) devices 115. The input/output (I/O) devices 115 may be a mouse, keyboard, modem, network interface, touch input device, motion sensing input device, printer, or other devices known in the art. Typically, the input/output devices 115 are coupled to the system through input/output (I/O) controllers 116.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus (device) or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-readable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations of methods, apparatus (devices) and computer program products according to embodiments of the invention. It will be understood that each flow in the flow diagrams can be implemented by computer program instructions.
These computer program instructions may be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows.
These computer program instructions may also be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An online interaction method, the method comprising:
acquiring second display data from at least one second terminal, wherein the second display data are real-time display data of at least one part of a display device of the second terminal;
determining first display data according to the second display data;
sending the first display data to the first terminal; and
acquiring first operation data of the first terminal, wherein the first operation data is used for representing a first input operation received by the first terminal, the first operation data comprises a first operation command and a first operation coordinate, and the first operation coordinate is used for representing position information of the first input operation in the first display data and/or position information on a display screen of the first terminal; and
and sending the first operation data to the second terminal so that the second terminal synchronizes the first input operation according to the first operation data.
2. The method of claim 1, wherein the first input operation comprises a drag, a line, a mouse click, and/or a keyboard input;
wherein in response to the first input operation being a scribe line, the first operation data further comprises a line color and/or a line width.
3. The method of claim 1, wherein determining first display data from the second display data comprises:
identifying sensitive data in the second display data;
carrying out protection processing on sensitive data; and
obtaining the first display data based on the second display data after protection processing;
wherein the protection process is to cause all or part of the sensitive data to be displayed or invisible in an obscured state.
4. The method of claim 3, wherein identifying sensitive data in the second display data comprises:
identifying sensitive data in the second display data based on a first mapping relation and/or a second mapping relation, wherein the first mapping relation is a mapping relation between page address information in the second display data and a sensitive label, and the second mapping relation is a mapping relation between identification information corresponding to each display area in the second display data and the sensitive label; and/or
Identifying the sensitive data in the second display data based on a selection instruction, wherein the selection instruction is an acquired input instruction of the second terminal.
5. The method of claim 1, further comprising:
acquiring second operation data of the second terminal, wherein the second operation data is used for representing a second input operation received by the second terminal, the second operation data comprises a second operation command and a second operation coordinate, and the second operation coordinate is used for representing position information of the second input operation in the second display data and/or position information on a display screen of the second terminal; and
and sending the second operation data to the first terminal so that the first terminal synchronizes the second input operation according to the second operation data.
6. The method of claim 1, further comprising:
receiving second video data from the second terminal, wherein the second video data is video data recorded by a camera device of the second terminal; and
and sending the second video data to the first terminal.
7. The method of claim 1, further comprising:
receiving first video data from the first terminal, wherein the first video data is video data recorded by a camera device of the first terminal; and
and sending the first video data to the second terminal.
8. An online interactive system, characterized in that the system comprises at least one first terminal, at least one server and at least one second terminal;
the first terminal sends first operation data to the server, wherein the first operation data are used for representing a first input operation received by the first terminal, the first operation data comprise a first operation command and a first operation coordinate, and the first operation coordinate is used for representing position information of the first input operation in the first display data and/or position information on a display screen of the first terminal;
the second terminal sends second display data to the server, wherein the second display data are real-time display data of at least one part of a display device of the second terminal;
the server receives the second display data, determines first display data according to the second display data, and sends the first display data to the first terminal, and the server receives the first operation data and sends the first operation data to the second terminal, so that the second terminal synchronizes the first input operation according to the first operation data.
9. An electronic device comprising a memory and a processor, wherein the memory is configured to store one or more computer program instructions, wherein the one or more computer program instructions are executed by the processor to implement the method of any of claims 1-7.
10. A computer-readable storage medium on which computer program instructions are stored, which, when executed by a processor, implement the method of any one of claims 1-7.
CN202010065891.4A 2020-01-20 2020-01-20 Online interaction method, system, electronic device and storage medium Pending CN111338721A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010065891.4A CN111338721A (en) 2020-01-20 2020-01-20 Online interaction method, system, electronic device and storage medium


Publications (1)

Publication Number Publication Date
CN111338721A true CN111338721A (en) 2020-06-26

Family

ID=71181471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010065891.4A Pending CN111338721A (en) 2020-01-20 2020-01-20 Online interaction method, system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN111338721A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105656982A (en) * 2015-03-23 2016-06-08 深圳酷派技术有限公司 IMS (IP Multimedia Subsystem) technology based remote control method and wireless terminal
CN105867779A (en) * 2016-03-29 2016-08-17 北京金山安全软件有限公司 Picture transmission method and device and electronic equipment
CN106101457A (en) * 2016-08-23 2016-11-09 努比亚技术有限公司 A kind of information screen apparatus and method
CN108933965A (en) * 2017-05-26 2018-12-04 腾讯科技(深圳)有限公司 Screen content sharing method, device and storage medium
CN109976617A (en) * 2019-04-03 2019-07-05 腾讯科技(深圳)有限公司 Document display method and apparatus
CN110378145A (en) * 2019-06-10 2019-10-25 华为技术有限公司 A kind of method and electronic equipment of sharing contents

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112153619A (en) * 2020-09-08 2020-12-29 Oppo广东移动通信有限公司 Track display method, first terminal, second terminal and storage medium
WO2022052649A1 (en) * 2020-09-08 2022-03-17 Oppo广东移动通信有限公司 Trajectory display method, first terminal, second terminal, and storage medium
CN113419693A (en) * 2021-05-17 2021-09-21 广州佰锐网络科技有限公司 Multi-user track synchronous display method and system
CN113704824A (en) * 2021-08-31 2021-11-26 平安普惠企业管理有限公司 Synchronous generation method, device and equipment of page guide mark and storage medium
CN113890945A (en) * 2021-08-31 2022-01-04 江苏微皓智能科技有限公司 Data sharing method, device and equipment

Similar Documents

Publication Publication Date Title
CN111338721A (en) Online interaction method, system, electronic device and storage medium
CN110716645A (en) Augmented reality data presentation method and device, electronic equipment and storage medium
US10545658B2 (en) Object processing and selection gestures for forming relationships among objects in a collaboration system
CN107636584B (en) Follow mode and position tagging of virtual workspace viewports in a collaborative system
US7996776B2 (en) Shared telepointer
US20160142471A1 (en) Systems and methods for facilitating collaboration among multiple computing devices and an interactive display device
CN107666987A (en) Robotic process automates
KR20150043344A (en) Integrating co-browsing with other forms of information sharing
WO2017140242A1 (en) Information processing method and client
CN111290722A (en) Screen sharing method, device and system, electronic equipment and storage medium
CN109587031A (en) Data processing method
US11921983B2 (en) Method and apparatus for visualization of public welfare activities
JP2020516983A (en) Live ink for real-time collaboration
JP6339550B2 (en) Terminal program, terminal device, and terminal control method
CN108920230B (en) Response method, device, equipment and storage medium for mouse suspension operation
CN111290721A (en) Online interaction control method, system, electronic device and storage medium
CN111651102B (en) Online teaching interaction method and device, storage medium and electronic equipment
CN113963355A (en) OCR character recognition method, device, electronic equipment and storage medium
US10867445B1 (en) Content segmentation and navigation
JP7002164B1 (en) A system that allows the sharer to detect the identity when sharing the screen and a system that allows the sharer to detect information about the input answer field.
CN115516867B (en) Method and system for reducing latency on collaboration platforms
US20230177831A1 (en) Dynamic User Interface and Data Communications Via Extended Reality Environment
CN115700450A (en) Control method and device for whiteboard application and intelligent interactive panel
US20230177855A1 (en) Notifications in Extended Reality Environments
JP5002498B2 (en) Information processing apparatus, display control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2020-06-26