CN113419693A - Multi-user track synchronous display method and system - Google Patents

Multi-user track synchronous display method and system

Info

Publication number
CN113419693A
Authority
CN
China
Prior art keywords
user
image data
screen
terminal
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110534341.7A
Other languages
Chinese (zh)
Other versions
CN113419693B (en)
Inventor
Inventor not publicized (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Bairui Network Technology Co ltd
Original Assignee
Guangzhou Bairui Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Bairui Network Technology Co ltd
Priority to CN202110534341.7A
Publication of CN113419693A
Application granted
Publication of CN113419693B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 - Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147 - Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 - Data switching networks
    • H04L 12/02 - Details
    • H04L 12/16 - Arrangements for providing special services to substations
    • H04L 12/18 - Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/76 - Television signal recording
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a method and a system for synchronously displaying the tracks of multiple users, in which the multiple terminals are Windows, iOS, Android, Web, H5 and applet terminals provided with an electronic whiteboard. The data can be stored during transmission, so that when a user quits midway for some reason and joins again, the previously drawn data can be recovered and viewed again. In addition, the invention provides multiple drawing modes; the drawn track data volume is small, the track data is synchronized in real time, the data transmission resists packet loss well, and application scenarios such as online collaborative office work, online teaching and remote assistance can be supported.

Description

Multi-user track synchronous display method and system
Technical Field
The invention relates to the technical field of electronic whiteboards, in particular to a method and a system for synchronously displaying multi-user tracks.
Background
An electronic whiteboard (also called an interactive whiteboard, abbreviated as whiteboard) is a high-tech product that integrates several technologies such as electronics and software. Based on the electromagnetic induction principle and combined with a computer and a projector, an electronic whiteboard can record and store a user's writing traces on it in real time, enabling paperless office work and teaching.
With the rapid development of internet technology, the demand for online communication and information sharing keeps growing. To meet this demand, a currently common method is to receive a sliding operation from a user at one of the clients, convert the operation into operation data after the user has finished, and then transmit the operation data to each terminal for sharing, so that the users at the other terminals can view it.
However, this conventional method has the following problems. The terminal can only record and transmit data after the user has finished the operation, which prolongs the terminal's waiting time; when the user has doodled a large amount of content, the recorded data volume is large, the transmission takes long, and the transmission efficiency is low. Moreover, users can only watch while the interactive data is being transmitted; a user who leaves for a while cannot view the interactive content afterwards, which degrades the interactive experience.
Disclosure of Invention
The invention provides a method and a system for synchronously displaying a multi-user track, to solve the technical problems of slow data transmission, low interaction efficiency and poor interaction experience caused by the long waiting time and large data volume of current terminals, whose interactive content can only be watched live.
A first aspect of an embodiment of the present invention provides a method for synchronously displaying multiple user tracks, which is applied to a first terminal provided with a first screen, where the first terminal is connected to an SDK server, and the SDK server is in communication connection with multiple second terminals, and the method includes:
when a first user touches the first screen, continuously acquiring a plurality of first coordinate points of the first screen touched by the first user within a preset time length;
generating and displaying first image data according to the plurality of first coordinate points and a drawing mode selected when a user touches the first screen;
sending the first image data to the SDK server so that the SDK server stores and sends the first image data to the plurality of second terminals in a broadcast mode, and the plurality of second terminals can synchronously display the first image data;
receiving second image data sent by the SDK server, wherein the second image data is generated by the second terminal and is sent to the SDK server;
rendering and displaying the second image data on the first screen.
In a possible implementation manner of the first aspect, the method further includes:
receiving a recording instruction of a user;
responding to the recording instruction, starting a preset first camera to record the captured image of the first user and the interactive image displayed on the first screen, wherein the first operation comprises an operation of the user touching the first screen, and the interactive image is the image of the first image data or the second image data rendered and displayed on the first screen.
In a possible implementation manner of the first aspect, after the step of rendering and displaying the second image data on the first screen, the method further includes:
continuously intercepting the interface of the first screen change in the first screen rendering process to obtain a plurality of first change images;
adding the plurality of first variation images to the video of the first user.
In a possible implementation manner of the first aspect, the generating first image data according to the plurality of first coordinate points includes:
connecting the plurality of first coordinate points into a graffiti image;
and generating the first image data according to the graffiti image.
In a possible implementation manner of the first aspect, the second terminal is provided with a second screen;
the second image data is specifically generated according to a plurality of second coordinate points and sent to the SDK server after the second terminal obtains the plurality of second coordinate points of the second screen continuously touched by the second user.
In a possible implementation manner of the first aspect, the enabling the plurality of second terminals to synchronously display the first image data specifically includes:
and the plurality of second terminals receive the first graphic data at the same time and render and display the first image data on the second screen.
In a possible implementation manner of the first aspect, the method further includes:
and in the second screen rendering process, the second terminal continuously intercepts the interface of the second screen change to obtain a plurality of second change images, and adds the plurality of second change images to the image of the second user recorded by the preset camera of the second terminal.
A second aspect of the embodiments of the present invention provides a system for synchronously displaying multiple user tracks, which is applied to a first terminal having a first screen, the first terminal is connected to an SDK server, the SDK server is in communication connection with multiple second terminals, and the system includes:
the acquisition module is used for continuously acquiring a plurality of first coordinate points of a first screen touched by a first user within a preset time length when the first screen is touched by the first user;
the generating module is used for generating and displaying first image data according to the plurality of first coordinate points and a drawing mode selected when a user touches the first screen;
the sending module is used for sending the first image data to the SDK server so that the SDK server stores the first image data and sends the first image data to the plurality of second terminals in a broadcast mode, and the plurality of second terminals can synchronously display the first image data;
the receiving module is used for receiving second image data sent by the SDK server, and the second image data is generated by the second terminal and is sent to the SDK server;
and the display module is used for rendering and displaying the second image data on the first screen.
In a possible implementation manner of the second aspect, the system further includes:
the recording instruction module is used for receiving a recording instruction of a user;
the first recording module is used for responding to the recording instruction and starting a preset first camera to record the captured image of the first user and the interactive image displayed on the first screen, wherein the first operation comprises an operation of the user touching the first screen, and the interactive image is the image of the first image data or the second image data rendered and displayed on the first screen.
In a possible implementation manner of the second aspect, the system further includes:
the intercepting module is used for continuously intercepting the interface of the first screen change in the first screen rendering process to obtain a plurality of first change images;
and the adding module is used for adding the first change images to the image of the first user.
In a possible implementation manner of the second aspect, the generating module is further configured to:
connecting the plurality of first coordinate points into a graffiti image;
and generating the first image data according to the graffiti image.
In a possible implementation manner of the second aspect, the second terminal is provided with a second screen;
the second image data is specifically generated according to a plurality of second coordinate points and sent to the SDK server after the second terminal obtains the plurality of second coordinate points of the second screen continuously touched by the second user.
In a possible implementation manner of the second aspect, the enabling the plurality of second terminals to synchronously display the first image data specifically includes:
and the plurality of second terminals receive the first graphic data at the same time and render and display the first image data on the second screen.
In a possible implementation manner of the second aspect, the system further includes:
and in the second screen rendering process, the second terminal continuously intercepts the interface of the second screen change to obtain a plurality of second change images, and adds the plurality of second change images to the image of the second user recorded by the preset camera of the second terminal.
Compared with the prior art, the method and the system for synchronously displaying the multi-user track provided by the embodiments of the invention have the following beneficial effects: the invention acquires the data of the user's doodling operation within a preset time interval, which shortens the waiting time of the terminal and reduces the volume of the doodle data, thereby reducing the time the terminal spends sending data and improving transmission efficiency; moreover, data transmission between terminals can be broadcast through the SDK server, which reduces the transmission pressure on the terminals, further improves data transmission efficiency and improves the user experience.
Drawings
Fig. 1 is a schematic flowchart of a method for synchronously displaying multiple user tracks according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a system for synchronously displaying multiple user tracks according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the currently common data sharing method, a sliding operation from a user is received at one of the clients; after the user finishes the operation, it is converted into operation data, which is then transmitted to each terminal for sharing so that the users of the other terminals can view it.
However, the terminal can only record and transmit data after the user has finished the operation, which prolongs the terminal's waiting time; when the user has doodled a large amount of content, the recorded data volume is large, the transmission takes long, and the transmission efficiency is low. Moreover, users can only watch while the interactive data is being transmitted; a user who leaves for a while cannot view the interactive content afterwards, which degrades the interactive experience.
In order to solve the above problem, a method for synchronously displaying multiple user tracks provided by the embodiments of the present application will be described and explained in detail by the following specific embodiments.
Referring to fig. 1, a flowchart of a method for synchronously displaying multiple user tracks according to an embodiment of the present invention is shown. The method involves an SDK (Software Development Kit) server, a first terminal and a plurality of second terminals. The first terminal and the second terminals are clients provided with electronic whiteboard software; the first terminal is provided with a first screen and the second terminals with second screens, and both screens can display the electronic whiteboard for users to doodle on or modify. The SDK server is specifically an AnyChat SDK server, that is, a server on which the audio and video development kit AnyChat SDK is deployed. The SDK server may be connected with the first terminal and the second terminals, respectively.
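As an illustrative, non-limiting sketch (the patent does not describe the AnyChat SDK interface), the following TypeScript shows how a whiteboard terminal might connect to a track-synchronization server over a plain WebSocket; the URL, message names and field layout are assumptions introduced only for illustration and are not the AnyChat SDK API.
```typescript
// Minimal sketch of a whiteboard terminal's connection to the track-sync server.
// Message names and fields are illustrative assumptions; the patent only states
// that terminals connect to an AnyChat SDK server.
type TrackMessage = {
  kind: "image-data" | "join" | "leave";
  sessionId: string;   // whiteboard room shared by all terminals
  senderId: string;    // first terminal or one of the second terminals
  payload?: unknown;   // e.g. serialized first/second image data
  sentAt: number;      // ms timestamp, used to order strokes on replay
};

class WhiteboardConnection {
  private socket: WebSocket;

  constructor(url: string, private sessionId: string, private senderId: string) {
    this.socket = new WebSocket(url);
    this.socket.onopen = () =>
      this.send({ kind: "join", sessionId, senderId, sentAt: Date.now() });
  }

  send(message: TrackMessage): void {
    this.socket.send(JSON.stringify(message));
  }

  onMessage(handler: (message: TrackMessage) => void): void {
    this.socket.onmessage = (event) => handler(JSON.parse(event.data as string));
  }
}
```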
In addition, the electronic whiteboard software can support a variety of terminals, such as iOS, Android, Web and H5.
In actual operation, the first terminal and the second terminals may be the same type of client or different types of clients. For convenience of explanation, in this embodiment the first terminal and the second terminals are the same type of client. Specifically, the first terminal may be a mobile phone, a tablet, a smart watch or another portable terminal device.
In this embodiment, the method for synchronously displaying the multi-user trajectory may be applied to a first terminal, where the method for synchronously displaying the multi-user trajectory may include, as an example:
s11, when the first user touches the first screen, continuously acquiring a plurality of first coordinate points of the first screen touched by the first user within a preset time length.
During recording, the user may need to doodle on, or mark modifications to, the document or content displayed on the electronic whiteboard. Therefore, a toolbar option can be provided in the electronic whiteboard interface with a button for selecting a brush: the user clicks the toolbar, the brush tools appear, the user clicks the desired brush tool, and can then touch and move a finger on the screen to draw content on the drawing board.
When a user touches the first screen and starts drawing content, the first terminal can receive the coordinate points of the screen touched by the user in real time, and the content or image data drawn by the user is determined by recording these coordinate points.
If the user touches the screen for a long time or draws a large amount of content, and the first terminal simply keeps waiting, the waiting time becomes too long and the data to be transmitted becomes very large, which increases the transmission time and reduces the data transmission efficiency. In this embodiment, the first terminal therefore acquires the coordinate points touched by the user within a preset duration; if the user's touch lasts longer than the preset duration, the coordinate points are acquired again starting from the current moment, so that each acquisition is limited to the content of one preset duration and the waiting time of the first terminal is shortened.
For example, if the preset duration is 30 seconds, the first terminal continuously acquires the coordinate points touched by the user on the first screen within 30 seconds. If the user's continuous touch lasts less than 30 seconds, the points touched within those 30 seconds are acquired; if it exceeds 30 seconds, the coordinate points are acquired again from the 31st second until the 60th second, and so on.
In order to further improve data transmission efficiency, the preset duration may be shortened as appropriate, for example to 1 second or even 30 milliseconds.
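As an example of step S11 on a Web or H5 terminal, the following sketch collects touch coordinate points and flushes them once per preset duration; the point structure and the flush callback are illustrative assumptions rather than part of the claimed method.
```typescript
// Sketch of step S11 on a Web/H5 terminal: collect the first coordinate points
// while the first user touches the first screen, and flush them every
// `presetMs` so no batch waits longer than the preset duration.
interface TrackPoint { x: number; y: number; t: number; }

function collectTouchPoints(
  canvas: HTMLCanvasElement,
  presetMs: number,                       // e.g. 30 ms, as suggested above
  flush: (points: TrackPoint[]) => void,  // hands a batch to step S12
): void {
  let batch: TrackPoint[] = [];
  let timer: number | undefined;

  const emit = () => {
    if (batch.length > 0) flush(batch);
    batch = [];
    timer = undefined;
  };

  canvas.addEventListener("touchmove", (event: TouchEvent) => {
    const rect = canvas.getBoundingClientRect();
    for (const touch of Array.from(event.touches)) {
      batch.push({
        x: touch.clientX - rect.left,
        y: touch.clientY - rect.top,
        t: Date.now(),
      });
    }
    // Start a window when the first point arrives, then emit after presetMs.
    if (timer === undefined) timer = window.setTimeout(emit, presetMs);
  });

  // When the finger lifts, send whatever has accumulated immediately.
  canvas.addEventListener("touchend", emit);
}
```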
And S12, generating and displaying first image data according to the plurality of first coordinate points and the drawing mode selected when the user touches the first screen.
The drawing mode may specify the shape drawn by the doodle, such as a circle, a square, a straight line or free drawing; it may also specify the drawing color and the thickness of the brush used for the drawn figure.
In a specific implementation, when the user touches the first screen, the user can select the required drawing mode, draw after the selection, and adjust the selection in the middle of the doodle according to personal preference or actual needs.
The user selects a brush tool before drawing and can select a display color at the same time. While the user continuously touches the first screen, the content drawn by the user is displayed in the electronic whiteboard interface of the first terminal; after acquiring the plurality of first coordinate points, the first terminal can also generate the corresponding first image from these coordinate points and the color selected with the brush, and obtain the first image data of this first image.
Because the first image data is the image data of the user's doodle on the first screen, the first terminal can display the first image data on the first screen synchronously in real time once it has been acquired.
In addition, in order to clearly and accurately acquire and present the image drawn by the user, step S12 may include the following sub-steps, as an example:
And a substep S121 of connecting the plurality of first coordinate points into a graffiti image.
And a substep S122 of generating first image data according to the graffiti image.
The first terminal may connect the plurality of first coordinate points in the time order in which they were acquired, obtaining the track along which the user continuously touched the first screen, and then determine the graffiti image from this track and the color previously selected by the user.
After the graffiti image is obtained, the first terminal may perform data conversion on it, thereby generating the first image data. The first image data is the data of the image drawn by the user within the preset duration.
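As an example of sub-steps S121 and S122 on a Web or H5 terminal, the following sketch connects the ordered coordinate points into a stroke on the local canvas and packages them, together with the brush settings of the selected drawing mode, as first image data; representing the first image data as points plus brush settings (rather than pixels) is an assumption consistent with the small track data volume described in the abstract.
```typescript
// Sketch of sub-steps S121–S122: connect the ordered first coordinate points
// into a graffiti stroke on the local canvas, then package them (with the
// brush colour and width chosen in the drawing mode) as the first image data.
interface TrackPoint { x: number; y: number; t: number; }  // as in the earlier sketch

interface FirstImageData {
  points: TrackPoint[];   // ordered by touch time
  color: string;          // brush colour selected by the first user
  lineWidth: number;      // brush thickness from the drawing mode
}

function drawAndPackStroke(
  ctx: CanvasRenderingContext2D,
  points: TrackPoint[],
  color: string,
  lineWidth: number,
): FirstImageData {
  if (points.length > 0) {
    ctx.strokeStyle = color;
    ctx.lineWidth = lineWidth;
    ctx.beginPath();
    ctx.moveTo(points[0].x, points[0].y);
    for (const p of points.slice(1)) ctx.lineTo(p.x, p.y);
    ctx.stroke();                      // local display of the graffiti image
  }
  return { points, color, lineWidth }; // first image data for step S13
}
```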
S13, sending the first image data to the SDK server so that the SDK server stores and sends the first image data to the plurality of second terminals in a broadcast mode for the plurality of second terminals to synchronously display the first image data.
When the first terminal has acquired the first image data, it can send the first image data to the SDK server. Since the SDK server is connected to the plurality of second terminals, it can send the first image data to all of them simultaneously in broadcast mode, so that the plurality of second terminals display the first image data synchronously. The broadcast mode of the SDK server also shortens the data transmission time and thus improves the data transmission efficiency.
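As an illustrative sketch of the server side of step S13 (the patent only states that the SDK server stores and broadcasts the data), the session store and the per-terminal send callback below are assumptions, not the AnyChat SDK implementation.
```typescript
// Sketch of the server side of step S13: keep every received image-data
// message for the session, then broadcast it to all other connected terminals.
interface StoredImageData { senderId: string; data: unknown; sentAt: number; }

class WhiteboardSession {
  private history: StoredImageData[] = [];
  private terminals = new Map<string, (data: StoredImageData) => void>();

  join(terminalId: string, send: (data: StoredImageData) => void): void {
    this.terminals.set(terminalId, send);
  }

  leave(terminalId: string): void {
    this.terminals.delete(terminalId);
  }

  // Called when the first terminal (or any terminal) uploads image data.
  publish(message: StoredImageData): void {
    this.history.push(message);                    // store for later recovery
    for (const [id, send] of this.terminals) {
      if (id !== message.senderId) send(message);  // broadcast to the others
    }
  }
}
```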
Each second terminal may also be provided with a screen, namely the second screen. The plurality of second terminals can open the electronic whiteboard software and likewise receive the doodling or drawing operations of their own users, so that the content drawn by a second terminal user is displayed on that second terminal's electronic whiteboard.
The rendering may proceed as follows: the second terminal obtains the plurality of first coordinate points from the first image data, together with their time sequence and positions, and gradually renders them onto the screen of the second terminal according to that time sequence and those positions.
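As an example of this progressive rendering, the following sketch replays the received coordinate points on the second screen in their original time order; reusing the gaps between point timestamps as rendering delays is an assumption, since the patent only requires rendering according to the time sequence.
```typescript
// Sketch of the progressive rendering described above: replay the received
// first coordinate points in their original time order on the second screen.
interface TrackPoint { x: number; y: number; t: number; }

async function replayStroke(
  ctx: CanvasRenderingContext2D,
  points: TrackPoint[],
  color: string,
  lineWidth: number,
): Promise<void> {
  if (points.length === 0) return;
  ctx.strokeStyle = color;
  ctx.lineWidth = lineWidth;

  let previous = points[0];
  for (const point of points.slice(1)) {
    // Wait roughly as long as the gap between the original touch samples.
    await new Promise((resolve) => setTimeout(resolve, point.t - previous.t));
    ctx.beginPath();
    ctx.moveTo(previous.x, previous.y);
    ctx.lineTo(point.x, point.y);
    ctx.stroke();                       // the stroke grows segment by segment
    previous = point;
  }
}
```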
In addition, after the electronic whiteboard is opened, the user of the second terminal may trigger the second terminal to record video, so that the recorded picture can be combined with the received first image data for sharing. Specifically, during the second screen rendering process, the second terminal continuously intercepts the changing interface of the second screen to obtain a plurality of second change images, and adds the plurality of second change images to the picture of the second user recorded by the preset camera of the second terminal.
During rendering, the second screen of the second terminal gradually adds, in the order in which the first coordinate points were acquired, the color selected by the first terminal user at the positions of the first coordinate points of the first image data, forming the rendering effect. Each such addition changes the interface displayed on the second screen once, and each time the second terminal detects that the displayed image has changed, it intercepts the interface image once, obtaining a second change image. As the additions continue, the second terminal keeps intercepting images and thus obtains a plurality of second change images.
Each time the second terminal obtains a second change image, it can add the second change image to the picture currently being recorded by the second terminal; when enough second change images have been captured, a coherent video picture is formed, achieving the effect of recording the terminal's camera picture and the electronic whiteboard picture at the same time.
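On a Web or H5 terminal, one possible way to combine the changing whiteboard interface with the camera picture is to composite both onto an off-screen canvas and record that canvas, as sketched below; the inset layout, frame rate and container format are assumptions not fixed by the patent.
```typescript
// Sketch of adding the changed whiteboard images to the recorded picture on a
// Web/H5 terminal: draw the camera feed and the whiteboard canvas together on
// a compositing canvas and record that canvas.
function recordWhiteboardWithCamera(
  whiteboard: HTMLCanvasElement,
  cameraVideo: HTMLVideoElement,          // <video> element playing the camera stream
  onFinished: (clip: Blob) => void,
): MediaRecorder {
  const composite = document.createElement("canvas");
  composite.width = whiteboard.width;
  composite.height = whiteboard.height;
  const ctx = composite.getContext("2d")!;

  const drawFrame = () => {
    ctx.drawImage(whiteboard, 0, 0);                                // changed interface
    ctx.drawImage(cameraVideo, composite.width - 320, 0, 320, 240); // camera inset
    requestAnimationFrame(drawFrame);
  };
  requestAnimationFrame(drawFrame);

  const recorder = new MediaRecorder(composite.captureStream(15));
  const chunks: Blob[] = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => onFinished(new Blob(chunks, { type: "video/webm" }));
  recorder.start();
  return recorder;
}
```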
In actual operation, the SDK server may store the first image data of the first terminal and the second image data of the second terminals. If a user exits midway for some reason and later rejoins the interactive connection, that user's terminal can obtain the doodle data drawn by the other terminals from the SDK server, so that the previous drawing data can be restored and viewed again.
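Continuing the session-store sketch above, the recovery behaviour could be realized by replaying the stored history to a rejoining terminal, as in the following illustrative sketch; the rejoin flow itself is an assumption, since the patent only states that the stored data can be recovered and viewed again.
```typescript
// Sketch of the recovery behaviour: when a terminal rejoins, the server replays
// the stored image-data history so the earlier graffiti can be restored.
interface StoredImageData { senderId: string; data: unknown; sentAt: number; }

function rejoin(
  history: StoredImageData[],             // what the server stored so far
  terminalId: string,
  send: (data: StoredImageData) => void,  // delivery channel to the rejoining terminal
): void {
  // Replay everything drawn by the other terminals, in its original order.
  for (const message of history) {
    if (message.senderId !== terminalId) send(message);
  }
}
```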
And S14, receiving second image data sent by the SDK server, wherein the second image data is generated by the second terminal and sent to the SDK server.
Because the user of a second terminal may also perform drawing and doodling operations during use, when the second terminal receives such operations it can likewise generate second image data and send it to the SDK server. The SDK server may then broadcast the second image data to the first terminal and the other second terminals, that is, the second terminals that did not send this second image data.
Specifically, the second image data is generated and sent to the SDK server according to a plurality of second coordinate points after the second terminal obtains the plurality of second coordinate points at which the second user continuously touches the second screen.
In this embodiment, since the second terminals and the first terminal are the same type of terminal, the operation of generating the second image data by a second terminal is the same as the operation of generating the first image data by the first terminal; see the description of how the first image data is generated.
And S15, rendering and displaying the second image data on the first screen.
When the first terminal receives the second image data, it may obtain the plurality of second coordinate points of the second image data, the order in which they were acquired, and the color used for the image drawn by the second terminal user.
The first terminal can then add that color at the positions of the corresponding second coordinate points in the order in which the second coordinate points were acquired, thereby realizing the rendering effect.
And S16, continuously intercepting the interface of the first screen change in the first screen rendering process to obtain a plurality of first change images.
And S17, adding the first change images to the video of the first user.
Similarly, since the user of the first terminal has started the first terminal's recording function, when the first terminal receives the second image data, each addition made while rendering the second image data changes the interface displayed on the first screen once, and each time the first terminal detects that the displayed image has changed it intercepts the interface image once, obtaining a first change image. As the adding and rendering continue, the first terminal keeps intercepting images and thus obtains a plurality of first change images.
The first terminal can then add the plurality of first change images to the picture being recorded by the first terminal, so that the first change images form a coherent video picture, achieving the effect of recording the terminal camera picture and the electronic whiteboard picture at the same time.
During operation, a user may leave midway and be unable to watch. To enable the user to review the data changes of the interactive operation afterwards or after the interaction has finished, in this embodiment the method may further include the following steps:
and S18, receiving a recording instruction of the user.
The recording instruction is generated after the user opens the electronic whiteboard software of the first terminal and touches the interface of the electronic whiteboard software displayed on the first screen.
Before using the electronic whiteboard, the user can open the electronic whiteboard software on the first terminal, or open a document on the electronic whiteboard, and then start using it.
The recording instruction may also be received by the first terminal after the user clicks the first screen during operation, for example while doodling the first image data, or it may be sent to the first terminal after the user clicks the first screen upon viewing second image data sent by another second terminal.
During recording, the first terminal may be connected to a plurality of second terminals, all of which may send second image data, so the user may select which objects to record. For example, with three second terminals A, B and C, the user may choose to record the second image data of second terminals B and C but not that of second terminal A.
Specifically, after receiving the user's recording instruction, the first terminal may also receive a selection instruction from the user and determine, according to the selection instruction, which second terminals are to be recorded.
S19, responding to the recording instruction, starting a preset first camera to record the captured image of the first user and the interactive image displayed on the first screen, wherein the first operation comprises an operation of the user touching the first screen, and the interactive image is the image of the first image data or the second image data rendered and displayed on the first screen.
It should be noted that the first terminal is provided with a first camera, which may be a front camera or a rear camera of the first terminal. After receiving the user's recording instruction, the first terminal starts the first camera to record the image captured by the camera, together with the sound produced during the user's operation, and also records the interactive image displayed on the first screen, which includes the electronic whiteboard content, the first image data and the second image data, the second image data here being that of the second terminals the user has selected for recording. After the user completes the interaction and closes the electronic whiteboard, a video file can be generated and stored on the first terminal for the user to watch later.
In a specific implementation, the first camera may be a front-facing camera, so that the first terminal may record a user of the current first terminal and a touch operation of the user on the first terminal. The touch operation may be a sliding or clicking operation of the user in the first screen.
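On a Web or H5 first terminal, step S19 could use the standard getUserMedia and MediaRecorder browser APIs, as sketched below; selecting the front camera via facingMode and the webm container are assumptions consistent with the description, and how the clip is later merged with the whiteboard frames is left to the compositing sketch above.
```typescript
// Sketch of step S19 on a Web/H5 terminal: open the front-facing (first)
// camera with audio and start recording when the recording instruction arrives.
async function startUserRecording(): Promise<MediaRecorder> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" },   // front camera, to capture the first user
    audio: true,                     // also keep the sound produced while drawing
  });

  const recorder = new MediaRecorder(stream);
  const chunks: Blob[] = [];
  recorder.ondataavailable = (event) => chunks.push(event.data);
  recorder.onstop = () => {
    // A video file that can be stored on the first terminal and watched later.
    const file = new Blob(chunks, { type: "video/webm" });
    console.log("recorded clip size:", file.size);
  };
  recorder.start();
  return recorder;
}
```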
By recording video on the first terminal, the person's actions, the person's voice and the real-time track picture of the drawing board can all be recorded at the same time, which also makes the picture easy to archive. If a user cannot watch midway, or in some cases cannot watch or participate in the sharing at all, the content shared on the electronic whiteboard can be reviewed afterwards, improving the user experience.
In addition, during drawing the terminals can also carry out audio and video communication with each other; in particular, voice data can be transmitted in real time through the SDK server. The electronic whiteboard software also supports multiple recording modes: for example, a user can choose to record only the drawing track, record audio and video together with the drawing track, or switch between these recording modes.
Because the electronic whiteboard software supports common terminal devices, no special hardware support is needed; ordinary mobile phones, computers and pads can be used, which can greatly reduce the interaction cost.
In summary, the embodiment of the present invention provides a method for synchronously displaying multiple user tracks, with the following beneficial effects: the invention acquires the data of the user's doodling operation within a preset duration, which shortens the waiting time of the terminal and reduces the volume of the doodle data, thereby reducing the time the terminal spends sending data and improving transmission efficiency; moreover, data transmission between terminals can be broadcast through the SDK server, which reduces the transmission pressure on the terminals, further improves data transmission efficiency and improves the user experience.
An embodiment of the present invention further provides a system for synchronously displaying multiple user tracks, and referring to fig. 2, a schematic structural diagram of the system for synchronously displaying multiple user tracks according to an embodiment of the present invention is shown.
The multi-user track synchronous display system can be applied to a first terminal provided with a first screen, the first terminal is connected with an SDK server, and the SDK server is connected with a plurality of second terminals in a communication manner, wherein the multi-user track synchronous display system can include, as an example:
the acquiring module 201 is configured to continuously acquire, within a preset time period, a plurality of first coordinate points at which a first user touches a first screen when the first user touches the first screen;
the generating module 202 is configured to generate and display first image data according to the plurality of first coordinate points and the drawing mode selected when the user touches the first screen;
a sending module 203, configured to send the first image data to the SDK server, so that the SDK server stores and sends the first image data to the plurality of second terminals in a broadcast manner, and the plurality of second terminals synchronously display the first image data;
a receiving module 204, configured to receive second image data sent by the SDK server, where the second image data is generated by the second terminal and is sent to the SDK server;
a display module 205, configured to render and display the second image data on the first screen.
In a possible implementation manner of the second aspect, the system further includes:
the recording instruction module is used for receiving a recording instruction of a user;
the first recording module is used for responding to the recording instruction and starting a preset first camera to record the captured image of the first user and the interactive image displayed on the first screen, wherein the first operation comprises an operation of the user touching the first screen, and the interactive image is the image of the first image data or the second image data rendered and displayed on the first screen.
In a possible implementation manner of the second aspect, the system further includes:
the intercepting module is used for continuously intercepting the interface of the first screen change in the first screen rendering process to obtain a plurality of first change images;
and the adding module is used for adding the first change images to the image of the first user.
In a possible implementation manner of the second aspect, the generating module is further configured to:
connecting the plurality of first coordinate points into a graffiti image;
and generating the first image data according to the graffiti image.
In a possible implementation manner of the second aspect, the second terminal is provided with a second screen;
the second image data is specifically generated according to a plurality of second coordinate points and sent to the SDK server after the second terminal obtains the plurality of second coordinate points of the second screen continuously touched by the second user.
In a possible implementation manner of the second aspect, the enabling the plurality of second terminals to synchronously display the first image data specifically includes:
and the plurality of second terminals receive the first graphic data at the same time and render and display the first image data on the second screen.
In a possible implementation manner of the second aspect, the system further includes:
and in the second screen rendering process, the second terminal continuously intercepts the interface of the second screen change to obtain a plurality of second change images, and adds the plurality of second change images to the image of the second user recorded by the preset camera of the second terminal.
Further, an embodiment of the present application further provides an electronic device, including: the multi-user trajectory display device comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the program to realize the multi-user trajectory synchronous display method according to the embodiment.
Further, an embodiment of the present application also provides a computer-readable storage medium, where computer-executable instructions are stored, and the computer-executable instructions are configured to enable a computer to perform the method for synchronously displaying the multi-user trajectory according to the embodiment.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention.

Claims (10)

1. A method for synchronously displaying a multi-user track is applied to a first terminal provided with a first screen, the first terminal is connected with an SDK server, and the SDK server is in communication connection with a plurality of second terminals, and the method comprises the following steps:
when a first user touches the first screen, continuously acquiring a plurality of first coordinate points of the first screen touched by the first user within a preset time length;
generating and displaying first image data according to the plurality of first coordinate points and a drawing mode selected when a user touches the first screen;
sending the first image data to the SDK server so that the SDK server stores and sends the first image data to the plurality of second terminals in a broadcast mode, and the plurality of second terminals can synchronously display the first image data;
receiving second image data sent by the SDK server, wherein the second image data is generated by the second terminal and is sent to the SDK server;
rendering and displaying the second image data on the first screen.
2. The method for synchronized display of multiple user trajectories of claim 1, further comprising:
receiving a recording instruction of a user;
responding to the recording instruction, starting a preset first camera to record the captured image of the first user and the interactive image displayed on the first screen, wherein the first operation comprises an operation of the user touching the first screen, and the interactive image is the image of the first image data or the second image data rendered and displayed on the first screen.
3. The method for synchronously displaying multiple user trajectories according to claim 2, wherein after the step of rendering and displaying the second image data on the first screen, the method further comprises:
continuously intercepting the interface of the first screen change in the first screen rendering process to obtain a plurality of first change images;
adding the plurality of first variation images to the video of the first user.
4. The method for synchronously displaying the multi-user trajectory according to claim 1, wherein the generating the first image data according to the plurality of first coordinate points comprises:
connecting the plurality of first coordinate points into a graffiti image;
and generating first image data according to the graffiti image.
5. The method for synchronously displaying the multi-user track according to claim 1, wherein the second terminal is provided with a second screen;
the second image data is specifically generated according to a plurality of second coordinate points and sent to the SDK server after the second terminal obtains the plurality of second coordinate points of the second screen continuously touched by the second user.
6. The method according to claim 5, wherein the step of allowing the plurality of second terminals to synchronously display the first image data comprises:
and the plurality of second terminals receive the first graphic data at the same time and render and display the first image data on the second screen.
7. The method for synchronized display of multiple user trajectories of claim 6, further comprising:
and in the second screen rendering process, the second terminal continuously intercepts the interface of the second screen change to obtain a plurality of second change images, and adds the plurality of second change images to the image of the second user recorded by the preset camera of the second terminal.
8. A multi-user track synchronous display system is applied to a first terminal provided with a first screen, the first terminal is connected with an SDK server, and the SDK server is in communication connection with a plurality of second terminals, and the system comprises:
the acquisition module is used for continuously acquiring a plurality of first coordinate points of a first screen touched by a first user within a preset time length when the first screen is touched by the first user;
the generating module is used for generating and displaying first image data according to the plurality of first coordinate points;
the sending module is used for sending the first image data to the SDK server so that the SDK server sends the first image data to the plurality of second terminals in a broadcast mode for the plurality of second terminals to synchronously display the first image data;
the receiving module is used for receiving second image data sent by the SDK server, and the second image data is generated by the second terminal and is sent to the SDK server;
and the display module is used for rendering and displaying the second image data on the first screen.
9. An electronic device, comprising: memory, processor and computer program stored on the memory and executable on the processor, characterized in that the processor implements a method for synchronized display of a multi-user trajectory according to any one of claims 1 to 7 when executing the program.
10. A computer-readable storage medium having stored thereon computer-executable instructions for causing a computer to perform the method for synchronized display of multi-user trajectories as set forth in any one of claims 1-7.
CN202110534341.7A 2021-05-17 2021-05-17 Multi-user track synchronous display method and system Active CN113419693B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110534341.7A CN113419693B (en) 2021-05-17 2021-05-17 Multi-user track synchronous display method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110534341.7A CN113419693B (en) 2021-05-17 2021-05-17 Multi-user track synchronous display method and system

Publications (2)

Publication Number Publication Date
CN113419693A true CN113419693A (en) 2021-09-21
CN113419693B CN113419693B (en) 2023-04-18

Family

ID=77712432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110534341.7A Active CN113419693B (en) 2021-05-17 2021-05-17 Multi-user track synchronous display method and system

Country Status (1)

Country Link
CN (1) CN113419693B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103309520A (en) * 2013-05-20 2013-09-18 南京恒知讯科技有限公司 Screen operation trace and sound input synchronous storage and processing method, system and terminal
CN105491414A (en) * 2015-11-19 2016-04-13 深圳市时尚德源文化传播有限公司 Synchronous display method and device of images
WO2017222258A1 (en) * 2016-06-21 2017-12-28 (주)해든브릿지 Multilateral video communication system and method using 3d depth camera
CN106559696A (en) * 2016-12-01 2017-04-05 北京小米移动软件有限公司 Method for sending information and device
CN107454433A (en) * 2017-08-09 2017-12-08 广州视源电子科技股份有限公司 Live annotation method and device, terminal and live broadcast system
CN108108091A (en) * 2017-11-28 2018-06-01 贵阳语玩科技有限公司 The refreshing display methods and system of sliding trace
CN109710165A (en) * 2018-12-25 2019-05-03 维沃移动通信有限公司 A kind of drawing processing method and mobile terminal
CN111352599A (en) * 2019-05-07 2020-06-30 鸿合科技股份有限公司 Data processing method and device for remote whiteboard and electronic equipment
CN111338721A (en) * 2020-01-20 2020-06-26 北京大米未来科技有限公司 Online interaction method, system, electronic device and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114168098A (en) * 2021-12-10 2022-03-11 天津洪恩完美未来教育科技有限公司 Data synchronization method, device, equipment and storage medium of electronic whiteboard
CN115134318A (en) * 2022-06-16 2022-09-30 王蕾茜 Application program and chatting method capable of realizing synchronous same-screen real-time hand drawing by user
CN117395231A (en) * 2023-08-31 2024-01-12 国联人寿保险股份有限公司 Multi-terminal same-screen interactive display method

Also Published As

Publication number Publication date
CN113419693B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN113419693B (en) Multi-user track synchronous display method and system
JP5472882B2 (en) CONFERENCE TERMINAL, CONFERENCE SERVER, CONFERENCE SYSTEM, AND DATA PROCESSING METHOD
US11533354B1 (en) Storage and retrieval of video conference state based upon participants
CN110266992A (en) A kind of long-distance video interactive system and method based on augmented reality
CN110597774A (en) File sharing method, system, device, computing equipment and terminal equipment
CN104106037A (en) Projector, graphical input/display device, portable terminal and program
CN110727361A (en) Information interaction method, interaction system and application
CN104780423A (en) A method and system for synchronizing mobile terminal instant messaging to smart TV set
US20230091539A1 (en) Information processing method, device, system, storage medium, and computer program product
JP2001313915A (en) Video conference equipment
CN104777991A (en) Remote interactive projection system based on mobile phone
CN111309226B (en) Terminal control method and device based on communication quality, terminal and computer equipment
CN113691829B (en) Virtual object interaction method, device, storage medium and computer program product
CN104932814A (en) Data transmission method and system and electronic terminal
CN102510469A (en) Intelligent conference table integrated system
CN110168630B (en) Augmented video reality
CN105407313A (en) Video calling method, equipment and system
Kachach et al. The owl: Immersive telepresence communication for hybrid conferences
CN103607632A (en) Previewing method and device based on desktop live broadcast
CN113672087A (en) Remote interaction method, device, system, electronic equipment and storage medium
US20230195403A1 (en) Information processing method and electronic device
CN112148182A (en) Interaction control method, terminal and storage medium
CN115529498A (en) Live broadcast interaction method and related equipment
CN114827686A (en) Recording data processing method and device and electronic equipment
CN115022573A (en) Desktop video conference system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant