CN111610946A - Data processing method, system, device, storage medium and processor - Google Patents


Info

Publication number
CN111610946A
Authority
CN
China
Prior art keywords
terminal
target
data
information
target information
Prior art date
Legal status
Granted
Application number
CN202010457901.9A
Other languages
Chinese (zh)
Other versions
CN111610946B (en)
Inventor
苏宁博
范志刚
卢涛
Current Assignee
Xian Wanxiang Electronics Technology Co Ltd
Original Assignee
Xian Wanxiang Electronics Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Xian Wanxiang Electronics Technology Co Ltd
Priority to CN202010457901.9A
Publication of CN111610946A
Application granted
Publication of CN111610946B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Abstract

The invention discloses a data processing method, system, device, storage medium and processor. The method includes: a first terminal obtains first target information, where the first target information indicates the process in which a first object inputs first target data in a floating layer of the first terminal; the first terminal sends the first target information to at least one second terminal, where the first target information enables the second terminal to display the process of the first object inputting the first target data. The method and device solve the technical problem in the prior art that the process of an object inputting data cannot be determined.

Description

Data processing method, system, device, storage medium and processor
Technical Field
The present invention relates to the field of computers, and in particular, to a data processing method, system, apparatus, storage medium, and processor.
Background
Currently, when remote data transmission is performed, one terminal device can perform voice and video communication with another terminal device to simulate a field communication scene.
However, one terminal device can only send the data that its object finally submits to the other terminal device, so the other terminal device can display that final data but cannot display the process by which the object input it, which degrades the communication effect.
For the technical problem in the prior art that the process of an object inputting data cannot be determined, no effective solution has been proposed so far.
Disclosure of Invention
Embodiments of the present invention provide a data processing method, system, device, storage medium, and processor, so as to at least solve the technical problem in the prior art that the process of an object inputting data cannot be determined.
According to an aspect of an embodiment of the present invention, there is provided a data processing method. The method may include: a first terminal obtains first target information, where the first target information is used to indicate a process in which a first object inputs first target data in a floating layer of the first terminal; the first terminal sends the first target information to at least one second terminal, where the first target information is used to enable the second terminal to display the process of the first object inputting the first target data.
Optionally, before the first terminal acquires the first target information, the method further includes: the first terminal acquires second target data sent by the target terminal and, in response to a first operation instruction, generates a floating layer on the upper layer of the screen of the first terminal, where the at least one second terminal includes the target terminal and the first target data is reply data of the second target data.
Optionally, after responding to the first operation instruction, the method further includes: the first terminal sends a start input message to the server, where the start input message carries first identification information of the first terminal and second identification information of the second target data and is used to indicate that the first object has started inputting the first target data, and the first identification information and the second identification information are stored by the server.
Optionally, the obtaining, by the first terminal, of the first target information includes: the first terminal periodically acquires image data of the floating layer and generates the first target information, where the first target information includes at least one of the following: the first identification information, the second identification information, and the image data, and the image data includes an input track of the first target data and/or ground color image data of the floating layer.
Optionally, after generating the first target information, the method further includes: the first terminal periodically sends the first target information to the server, or the first terminal sends the first target information to the server when the first object has finished inputting the first target data, where the image data in the first target information is used to establish an association with the first identification information and the second identification information stored by the server.
Optionally, the method further comprises: the first terminal responds to a second operation instruction acting on the floating layer to generate an input track; the first terminal displays an input track on the floating layer.
Optionally, after the first terminal acquires the first target information, the method further includes: and the first terminal responds to the third operation instruction, and sends an ending input message to the server, and/or deletes the floating image layer from the screen, wherein the ending input message carries the first identification information and the second identification information and is used for indicating the server to send the received first target information to the second terminal based on the first identification information and the second identification information.
Optionally, after the first terminal sends the first target information to the at least one second terminal, the method further includes: the first terminal acquires second target information, where the second target information is used to indicate a process in which a second object modifies the first target data on the target terminal, the at least one second terminal includes the target terminal, and the second terminals other than the target terminal are prohibited from modifying the first target data.
According to another aspect of the embodiments of the present invention, another data processing method is also provided. The method can comprise the following steps: the method comprises the steps that a target terminal obtains first target information, wherein the first target information is used for indicating the process that a first object inputs first target data in a floating layer of the first terminal; the target terminal displays a process of inputting first target data by the first object based on the first target information.
Optionally, before the target terminal acquires the first target information, the method further includes: and the target terminal sends the second target data to the first terminal, wherein the first target data is reply data of the second target data.
Optionally, the obtaining, by the target terminal, the first target information includes: the target terminal acquires first target information sent by the server, wherein the first target information comprises at least one of the following information: the first identification information of the first terminal, the second identification information of the second target data and the image data of the floating layer, wherein the image data comprises an input track of the first target data and/or ground color image data of the floating layer.
Optionally, the method further includes: the target terminal displays the floating layer on the upper layer of a screen of the target terminal and displays the first identification information and the second identification information; and/or the target terminal generates a target video from the image data of multiple floating layers, where the target video is used to show the process in which the first object inputs the first target data.
Optionally, after the target terminal displays, based on the first target information, the process of the first object inputting the first target data, the method further includes: the target terminal sends second target information to the first terminal, where the second target information is used to indicate the process in which the second object modifies the first target data on the target terminal.
According to another aspect of the embodiments of the present invention, another data processing method is also provided. The method can comprise the following steps: the method comprises the steps that a server obtains first target information, wherein the first target information is used for indicating a process that a first object inputs first target data in a floating layer of a first terminal; the server forwards the first target information to at least one second terminal, wherein the first target information is used for enabling the second terminal to display the process that the first object inputs the first target data.
According to another aspect of the embodiment of the invention, a data processing system is also provided. The system may include: the first terminal, configured to acquire first target information, where the first target information is used to indicate a process in which a first object inputs first target data in a floating layer of the first terminal; the server, configured to forward the first target information; and at least one second terminal, configured to display the process of the first object inputting the first target data based on the received first target information.
According to another aspect of the embodiment of the invention, a data processing device is also provided. The apparatus may include: a first obtaining unit, configured to enable the first terminal to obtain first target information, where the first target information is used to indicate a process in which the first object inputs first target data in a floating layer of the first terminal; and a first sending unit, configured to enable the first terminal to send the first target information to at least one second terminal, where the first target information is used to enable the second terminal to display the process of the first object inputting the first target data.
According to another aspect of the embodiments of the present invention, there is provided another data processing apparatus. The apparatus may include: a second acquisition unit, configured to enable the target terminal to acquire first target information, where the first target information is used to indicate a process in which the first object inputs first target data in a floating layer of the first terminal; and a display unit, configured to enable the target terminal to display, based on the first target information, the process of the first object inputting the first target data.
According to another aspect of the embodiments of the present invention, there is provided another data processing apparatus. The apparatus may include: a third obtaining unit, configured to enable the server to obtain first target information, where the first target information is used to indicate a process in which the first object inputs first target data in a floating layer of the first terminal; and the forwarding unit is used for enabling the server to forward the first target information to at least one second terminal, wherein the first target information is used for enabling the second terminal to display the process that the first object inputs the first target data.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium. The storage medium includes a stored program, wherein the apparatus in which the storage medium is located is controlled to execute the data processing method of the embodiment of the present invention when the program is executed by the processor.
According to another aspect of the embodiments of the present invention, there is also provided a processor. The processor is used for running a program, wherein the program executes the data processing method of the embodiment of the invention when running.
In the embodiment of the invention, a first terminal obtains first target information, where the first target information is used to indicate the process in which a first object inputs first target data in a floating layer of the first terminal; the first terminal sends the first target information to at least one second terminal, where the first target information is used to enable the second terminal to display the process of the first object inputting the first target data. That is to say, the first terminal of the present application sends the first target information, which indicates the process of the first object inputting the first target data in the floating layer of the first terminal, to the at least one second terminal, so that this process is displayed on the second terminal. This avoids the situation in which only the final result of the input data can be displayed while the input process cannot, thereby solving the technical problem that the process of an object inputting data cannot be determined and achieving the technical effect of determining the process of an object inputting data.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a data processing system according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of data processing according to an embodiment of the present invention;
FIG. 3 is a flow diagram of another data processing method according to an embodiment of the invention;
FIG. 4 is a flow diagram of another data processing method according to an embodiment of the invention;
FIG. 5 is a schematic diagram of a distance education system according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a floating layer according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a response track displayed on a floating layer according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a data processing apparatus according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of another data processing apparatus according to an embodiment of the present invention; and
FIG. 10 is a schematic diagram of another data processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
The embodiment of the invention provides a data processing system.
FIG. 1 is a schematic diagram of a data processing system according to an embodiment of the present invention. As shown in FIG. 1, data processing system 10 may include: a first terminal 11, a server 12 and at least one second terminal 13.
The first terminal 11 is configured to obtain first target information, where the first target information is used to indicate a process in which the first object inputs first target data in a floating layer of the first terminal.
In this embodiment, the first terminal 11 is the terminal used to input the first target data, where the first target data is the data input by the first object in the floating layer. For example, if the first terminal 11 is a student terminal device, that is, the terminal device on the student side, and the first object is a student user, the first target data may be the answer data that the student user inputs for a question. The first target information of this embodiment may be used to completely indicate the process in which the first object inputs the first target data in the floating layer of the first terminal, and may include first identification information of the first terminal 11, second identification information of the second target data, and image data of the floating layer, where the first target data is reply data of the second target data, and the image data may include ground color image data and answering track data.
And the server 12 is used for forwarding the first target information.
The server 12, i.e. the processing server, of this embodiment establishes a communication connection with the first terminal 11, and may be configured to forward the first target information, for example, when the first target data is input by the first object, the server 12 forwards the first identification information of the first terminal 11, the second identification information of the second target data, and the image data of the floating layer to at least one second terminal 13.
At least one second terminal 13 is used for displaying the process of the first object inputting the first target data based on the received first target information.
In this embodiment, the at least one second terminal 13 may include a target terminal, which may be a teacher terminal device, that is, the terminal device on the teacher side, and the at least one second terminal 13 may also include terminals having the same properties as the first terminal, for example, other student user terminals. The target terminal in this embodiment may display the process of the first object inputting the first target data; the second object may also correct and annotate the received first target data on the target terminal and return, through the server 12, second target information indicating the process of correcting and annotating the first target data to the first terminal 11, so that the first object views that process on the first terminal 11. This helps to correct errors and fill in gaps, and achieves the purpose of determining the process of an object inputting data.
The above-described data processing system of this embodiment is further described below.
In this embodiment, the first terminal and the second terminal may be the same, for example, both may be intelligent terminal devices such as a computer, a tablet computer, and a smart phone, or may be different. The first terminal and the second terminal are connected with the server.
In this embodiment, the second object transmits the second target data to the first terminal through the second terminal. Alternatively, the second object may present the second target data to the first object by means of voice or courseware on the second terminal. The first object receives the second target data through the first terminal. When the first object is ready to reply to the second target data, the first object triggers a first operation instruction on the first terminal. After receiving the first operation instruction, the processing module of the first terminal starts a data input mode. Optionally, the data input mode refers to generating a floating layer on the screen of the display of the first terminal, where the floating layer is disposed on the top layer of the screen; the floating layer may be opaque or translucent, and may be set according to the habit of the user. Alternatively, in order to reduce the amount of image data, the ground color of the floating layer of this embodiment may be a solid color image. Optionally, the first object may set the size and position of the floating layer on the display of the first terminal according to the size of the display and personal habit, and the size of the floating layer may be smaller than the screen size of the display of the first terminal.
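The floating layer described above can be pictured with a short sketch. This is not the patent's implementation; the class name, fields, and defaults are illustrative assumptions about a solid-color, optionally translucent overlay that is created in response to the first operation instruction and may be smaller than the screen.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FloatingLayer:
    """Illustrative model of the floating layer placed on the top layer of the screen."""
    width: int                                   # may be smaller than the screen
    height: int
    opacity: float = 1.0                         # 1.0 = opaque, < 1.0 = translucent
    ground_color: str = "#FFFFFF"                # solid ground color keeps the image data small
    strokes: List[Tuple[int, int]] = field(default_factory=list)  # input trajectory

def create_floating_layer(screen_w: int, screen_h: int,
                          scale: float = 0.8, opacity: float = 0.9) -> FloatingLayer:
    """Generate a floating layer in response to the first operation instruction
    ('start answering'); size and position would normally follow user preference."""
    return FloatingLayer(width=int(screen_w * scale), height=int(screen_h * scale),
                         opacity=opacity)
```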
After the first terminal acquires the first operation instruction triggered by the first object, the first terminal of this embodiment sends, to the server, a start input message, where the start input message carries first identification information of the first terminal and second identification information of the second target data, for example, the first identification information may be a device ID, and the second identification information may be a topic ID.
After receiving the start input message sent by the first terminal, the server of this embodiment may store the first identification information and the second identification information carried by the start input message.
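A minimal sketch of this exchange, assuming a JSON message format and illustrative field names (device_id, topic_id) that the patent does not specify:

```python
import json

def build_start_input_message(device_id: str, topic_id: str) -> str:
    """Start-input message: carries the first identification information (e.g. the
    student device ID) and the second identification information (e.g. the topic ID)."""
    return json.dumps({"type": "start_input", "device_id": device_id, "topic_id": topic_id})

# Server side: remember the identifier pair so that later floating-layer image
# data can be associated with it.
stored_sessions = {}

def on_start_input(message: str) -> None:
    msg = json.loads(message)
    stored_sessions[(msg["device_id"], msg["topic_id"])] = []   # frames will be appended here
```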
In this embodiment, when the first object inputs the first target data on the first terminal, the first target data may be input by triggering a keystroke event or a touch event. The first terminal of this embodiment may generate an input trace, for example, a reply trace, on the floating layer after receiving a keystroke event or a touch event for the floating layer.
Optionally, in this embodiment the first object may use an input device such as a mouse, keyboard, or touch screen to input the first target data on the floating layer of the first terminal, generating the input track. It should be noted that this embodiment may treat a keystroke event or touch event occurring within the floating layer as a keystroke event or touch event for the floating layer. After receiving a keyboard-and-mouse event or touch event for the floating layer, the processing module of the first terminal displays the input track corresponding to that event on the floating layer. The input track may be the input data displayed at a specific position, for example, "a=1, b=2, c=3".
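A sketch of how such events could extend the input track, under the assumption that the track is an ordered list of points and that only events falling inside the floating layer count as events for the layer (the function name and signature are hypothetical):

```python
from typing import List, Tuple

Point = Tuple[int, int]

def handle_pointer_event(strokes: List[Point], layer_w: int, layer_h: int,
                         x: int, y: int) -> bool:
    """Append a touch/mouse point to the input track if it lies inside the floating
    layer; points outside the layer are ignored rather than drawn."""
    if 0 <= x < layer_w and 0 <= y < layer_h:
        strokes.append((x, y))   # the input track is the ordered sequence of points
        return True              # consumed by the floating layer
    return False
```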
It should be noted that the first object of this embodiment may modify the input track of the first target data on the first terminal.
In this embodiment, in order to reduce the data amount of the image coding on the floating layer, the font or image of the first target data may be solid color.
In this embodiment, the first terminal may periodically acquire image data of a floating layer, and generate first target information according to the acquired image data of the floating layer, where the first target information may include the image data of the floating layer, second identification information of the second target data, and identification information of the first terminal, and the image data of the floating layer may include ground color image data and an input track used for indicating the first target data. And the first terminal of this embodiment may periodically transmit the first target information to the server.
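A sketch of the periodic acquisition, assuming each floating-layer frame is available as raw bytes and the first target information is serialized as JSON with illustrative field names; grab_frame and send stand in for platform-specific capture and transport:

```python
import base64
import json
import time

def build_first_target_info(device_id: str, topic_id: str, frame: bytes) -> str:
    """One sample of the first target information: the two identifiers plus the
    floating-layer image data (ground color and/or input track)."""
    return json.dumps({"device_id": device_id, "topic_id": topic_id,
                       "image": base64.b64encode(frame).decode("ascii")})

def periodic_capture(grab_frame, send, device_id: str, topic_id: str,
                     period_s: float = 1.0, samples: int = 5) -> None:
    """Periodically grab the floating layer and hand each sample to `send`, which
    may post it to the server immediately or buffer it locally until input ends."""
    for _ in range(samples):
        send(build_first_target_info(device_id, topic_id, grab_frame()))
        time.sleep(period_s)
```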
After the server receives the first target information periodically sent by the first terminal, the second identification information of the second target data and the image data of the floating image layer may be parsed from the first target information. The server may establish an association relationship between the image data of the floating image layer and the second identification information and the first identification information that have been previously stored, and store the image data of the floating image layer.
As another alternative example, the processing module of the first terminal of this embodiment may store the first target information that is periodically obtained in the local storage module without sending the first target information to the processing server. After the first target data is input on the first terminal, first target information used for indicating the input process of the first target data is sent to the server, and then the server forwards the first target information to the second terminal.
In this embodiment, after the first object completes inputting the first target data, an end input message is triggered to the first terminal, and the end input message carries the second identification information and the first identification information. After receiving the end input message, the processor of the first terminal sends the end input message to the server, so that the server sends the previously received first target information to the second terminal in a unified manner, and meanwhile, the floating layer is deleted, and at this time, the floating layer does not appear on the screen of the display of the first terminal any more.
In this embodiment, the processing server has already stored multiple frames of the floating-layer image data associated with the second identification information of the second target data. Therefore, after receiving the end input message, the server generates response data from the first identification information and the second identification information carried in the end input message together with the stored floating-layer image data associated with those two identifiers, and sends the response data to the at least one second terminal, which may include the teacher terminal device and other student terminal devices.
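A sketch of the server's association-and-bundling role described above; the class, message handling, and field names are assumptions rather than the patent's protocol:

```python
import json
from collections import defaultdict

class AnswerRelayServer:
    """Associates periodically received floating-layer frames with the stored
    (device ID, topic ID) pair and bundles them into response data when the
    end-input message arrives."""
    def __init__(self, forward):
        self.frames = defaultdict(list)   # (device_id, topic_id) -> ordered frames
        self.forward = forward            # delivers response data to the second terminal(s)

    def on_first_target_info(self, message: str) -> None:
        msg = json.loads(message)
        self.frames[(msg["device_id"], msg["topic_id"])].append(msg["image"])

    def on_end_input(self, message: str) -> None:
        msg = json.loads(message)
        key = (msg["device_id"], msg["topic_id"])
        response = {"device_id": key[0], "topic_id": key[1],
                    "frames": self.frames.pop(key, [])}   # every frame of the input process
        self.forward(json.dumps(response))
```

Keying the frames on the identifier pair mirrors the association relationship the server is described as maintaining between the image data and the stored identification information.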
It is to be understood that the image data of the floating layer in the response data may include image data of a plurality of floating layers of the first object at a time of inputting the first target data.
After the second terminal receives the response data, the image data of the floating layer can be obtained from the response data, wherein the image data of the floating layer comprises ground color image data and response track data.
The second terminal displays the floating layer on the uppermost layer of a screen of a display of the second terminal according to the image data of the floating layer, and can also display second identification information of second target data and first identification information of the first terminal.
In this embodiment, the second object may view the images of the multiple floating layers in the response data on the display of the second terminal; it is understood that these images may constitute a video showing the whole process of the first object inputting the first target data.
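A sketch of this playback idea, assuming the frames arrive in capture order and a caller-supplied show callback draws each one; the frame rate is arbitrary:

```python
import time
from typing import Callable, Iterable

def play_input_process(frames: Iterable[bytes], show: Callable[[bytes], None],
                       fps: float = 2.0) -> None:
    """Show the floating-layer frames in capture order, so the sequence reads as a
    video of the first object entering the first target data."""
    for frame in frames:
        show(frame)            # e.g. decode and draw on the uppermost layer of the screen
        time.sleep(1.0 / fps)
```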
Optionally, before the second terminal displays the floating layer on the uppermost layer of the screen of the display of the second terminal according to the image data of the floating layer, the second terminal may further obtain a preset bottom image, and cover an input track in the image data of the floating layer on the preset bottom image, so as to generate the floating layer. The second terminal of this embodiment may further display the floating layer on an uppermost layer of a screen of a display of the second terminal, and display second identification information of the second target data and first identification information of the first terminal.
The image data of the floating image in this embodiment may be encoded in the transmission process, and then the processing module of the second terminal in this embodiment may decode the image data of the floating image layer, so as to obtain the floating image layer, and display the floating image layer on the uppermost layer of the screen of the display of the second terminal.
In addition, this embodiment may help the second object distinguish, among the processes displayed on the second terminal, which first object on which first terminal produced the displayed input in reply to which second target data.
Optionally, the processing module of the second terminal in this embodiment may preset a correspondence between the attribute information of the first object and the first identification information of the first terminal; for example, the correspondence may map personal information of the first object, such as the name and the class to which the first object belongs, to the ID of the student end device. The processing module of the second terminal may then determine the attribute information of the first object from the first identification information of the first terminal by querying the preset correspondence, and may further display the attribute information of the first object on the ground color image of the floating layer.
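A sketch of such a preset correspondence; every ID, name, and class below is made up for illustration:

```python
# Hypothetical preset correspondence between first-terminal IDs and the
# attribute information of the first object.
DEVICE_ATTRIBUTES = {
    "student-dev-001": {"name": "Zhang San", "class": "Class 2, Grade 3"},
    "student-dev-002": {"name": "Li Si", "class": "Class 2, Grade 3"},
}

def lookup_attributes(device_id: str) -> dict:
    """Resolve the first identification information to the first object's attribute
    information, which can then be drawn on the ground-color image of the layer."""
    return DEVICE_ATTRIBUTES.get(device_id, {"name": "unknown", "class": "unknown"})
```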
Optionally, in this embodiment, when the second object corresponding to the target terminal views the input process of the first object embodied in the image data of the multiple floating layers, the second object may find an error or a place that needs additional modification. In that case, the second object may modify the image data of the floating layers by operating an input device connected to the teacher-side device, such as a mouse, keyboard, or touch screen. After receiving a keyboard-and-mouse event or touch event for the floating layer, the processing module of the second terminal may apply the correction to the input track of the first target data in the frame of floating-layer image data currently displayed on the display of the second terminal, and replace the original input track in that frame's image data with the corrected one. The modification may include addition, deletion, and alteration.
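A sketch of replacing a frame's input track with the corrected one, assuming the track is a list of points and the correction is expressed as points to add and points to remove (function name and signature are hypothetical):

```python
from typing import List, Tuple

Point = Tuple[int, int]

def apply_correction(original_track: List[Point],
                     added: List[Point], removed: List[Point]) -> List[Point]:
    """Build the corrected input track for the currently displayed frame: the second
    object's deletions are dropped and additions appended, and the result replaces
    the original track in that frame's image data."""
    removed_set = set(removed)
    corrected = [p for p in original_track if p not in removed_set]
    corrected.extend(added)
    return corrected
```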
It should be noted that, after receiving the response data, the target terminal of this embodiment may modify and indicate the input process of the first target data therein, and after receiving the response data, the second terminal of the at least one second terminal except the target terminal may not need to modify the input process of the first target data.
In this embodiment, the target terminal may generate, according to the corrected and approved image data of the floating layer, second target information used for indicating a process in which the second object modifies the first target data on the target terminal, for example, modified and approved answer data, where the second target information may include the corrected and approved image data of the floating layer, the first identification information of the first terminal, and the second identification information of the second target data, and then send the second target information to the server.
After receiving the second target information, the server may forward the second target information to the first terminal corresponding to the first identification information.
After receiving the second target information, the first terminal parses the corrected and annotated image data of the floating layer, displays it on the display of the first terminal, and displays the second identification information of the second target data.
In this embodiment, only the input trajectory may be included in the image data of the floating image periodically acquired by the first terminal, without including the ground color image data. Correspondingly, after the second terminal receives the response data, the image data of the floating layer obtained from the response data may only include the input track of the first target data, and does not include the background image data.
With the data processing system of this embodiment, the target terminal among the at least one second terminal and the other second terminals can see the process in which the first object inputs the first target data at the first terminal, and the target terminal can also correct and annotate that process and return the corrected and annotated data to the first object. In this way, the terminals other than the target terminal (the first terminal and the other second terminals) can see not only the process in which the objects of the other terminals input data but also the target object's corrections and annotations of the input data; the second object corresponding to the target terminal can see the process in which the first terminal and the other second terminals input data, which helps the second object understand how the first object responded to the second target data and facilitates targeted guidance. This solves the technical problem that the process of an object inputting data cannot be determined in the prior art, achieves the technical effect of determining the process of an object inputting data, and can further improve the quality of remote tutoring.
Example 2
In accordance with an embodiment of the present invention, an embodiment of a data processing method is provided. It should be noted that the steps illustrated in the flowcharts of the accompanying drawings may be performed in a computer system such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order different from that described herein.
Fig. 2 is a flow chart of a data processing method according to an embodiment of the present invention. As shown in fig. 2, the method may include the steps of:
step S202, the first terminal obtains first target information, wherein the first target information is used for indicating a process that the first object inputs first target data in a floating layer of the first terminal.
In the technical solution provided in step S202 of the present invention, the first terminal is the terminal used to input the first target data, where the first target data is the data input by the first object in the floating layer; for example, if the first terminal is a student terminal device and the first object is a student user, the first target data may be the answer data that the student user inputs for a question. The first target information of this embodiment may be used to completely indicate the process of the first object inputting the first target data in the floating layer of the first terminal, and may include first identification information of the first terminal, second identification information of the second target data, and image data of the floating layer, where the first target data is reply data of the second target data, and the image data may include ground color image data and answering track data.
Step S204, the first terminal sends first target information to at least one second terminal, wherein the first target information is used for enabling the second terminal to display a process that the first object inputs first target data.
In the technical solution provided in step S204 of the present invention, after the first terminal acquires the first target information, the first terminal sends the first target information to at least one second terminal, where the first target information is used to enable the second terminal to display a process of inputting the first target data by the first object.
In this embodiment, the first terminal may send the first target information to the at least one second terminal through the server, where the at least one second terminal may include a target terminal, which may be a teacher terminal device, and may also include terminals having the same properties as the first terminal, for example, other student user terminals. The target terminal of this embodiment may display the process of the first object inputting the first target data; the second object may also correct and annotate the received first target data on the target terminal to obtain second target information and return it to the first terminal through the server, so that the first object can view the process of correcting the first target data on the first terminal. This helps to correct errors and fill in gaps, and achieves the purpose of determining the process of an object inputting data.
Through the above steps S202 to S204 in the present application, the first terminal obtains first target information, where the first target information is used to indicate the process in which the first object inputs first target data in a floating layer of the first terminal; the first terminal sends the first target information to at least one second terminal, where the first target information is used to enable the second terminal to display the process of the first object inputting the first target data. That is to say, the first terminal of this embodiment sends the first target information, which indicates the process of the first object inputting the first target data in the floating layer of the first terminal, to the at least one second terminal, so that this process is displayed on the second terminal and the situation in which only the final result of the input data can be displayed while the input process cannot is avoided, thereby solving the technical problem that the process of an object inputting data cannot be determined and achieving the technical effect of determining the process of an object inputting data.
The above-described method of this embodiment is further described below.
As an optional implementation manner, in step S202, before the first terminal acquires the first target information, the method further includes: the first terminal acquires second target data sent by the target terminal, responds to a first operation instruction, and generates a floating layer on a screen upper layer of the first terminal, wherein at least one second terminal comprises the target terminal, and the first target data is reply data of the second target data.
In this embodiment, the second object sends the second target data to the first terminal through the target terminal, and the second target data may be a practice question in a student answering scenario. Alternatively, the second object may present the second target data to the first object by means of speech, courseware, or the like. The first object receives the second target data through the first terminal. When the first object starts to answer the second target data, for example, when the student is ready to start answering, the first terminal responds to a first operation instruction triggered by the first object, where the first operation instruction may be an instruction to start answering. After the first terminal receives the first operation instruction, a data input mode, for example, an answer mode, may be started.
Optionally, the input data mode of this embodiment is to generate a floating layer on the screen of the first terminal in response to the first operation instruction, where the floating layer may be disposed on the uppermost layer of the screen of the first terminal. The floating layer can be opaque or semitransparent, and can be specifically set according to user habits.
Alternatively, this embodiment may reduce the amount of image data, and the ground color of the floating layer may be a solid color image.
Optionally, the first object may set the size and the position of the floating layer on the display of the first terminal according to the size of the display and personal habit. The size of the floating layer may be smaller than the screen size of the display of the first terminal.
As an optional implementation manner, after responding to the first operation instruction, the method further includes: the first terminal sends a start input message to the server, wherein the start input message carries first identification information of the first terminal and second identification information of second target data and is used for indicating the first object to start inputting the first target data, and the first identification information and the second identification information are stored by the server.
In this embodiment, after responding to the first operation instruction, the first terminal may send a start input message to the server, where the start input message is used to indicate that the first target data is started to be input, and may be answer start information, and the first identification information of the first terminal and the second identification information of the second target data may be carried, for example, an ID of the student end device and an ID of the subject are carried. After the server acquires the start input information, the server may store the first identification information and the second identification information.
As an optional implementation manner, in step S202, the obtaining, by the first terminal, of the first target information includes: the first terminal periodically obtains image data of the floating layer and generates the first target information, where the first target information includes at least one of the following: the first identification information, the second identification information, and the image data, and the image data includes an input track of the first target data and/or ground color image data of the floating layer.
In this embodiment, when the first terminal acquires the first target information, the first terminal may periodically acquire the image data of the floating layer and generate the first target information, for example, the student end device periodically acquires the image data of the floating layer and generates the answer data according to the acquired image data of the floating layer. The first target information of this embodiment may include the first identification information of the first terminal, the second identification information, and the image data described above, and the image data may further include an input trajectory of the first target data and/or ground color image data of the floating layer. Alternatively, the image data may include only the input trajectory, without including the ground color image data.
As an optional implementation manner, after generating the first target information, the method further includes: the first terminal sends the first target information to the server periodically, or the first terminal sends the first target information to the server under the condition that the first target data input by the first object is finished, wherein the image data in the first target information is used for establishing an association relation with the first identification information and the second identification information which are stored by the server.
The first terminal of this embodiment may send the first target information to the server periodically, or may first store the periodically acquired first target information in a local storage module without sending it to the server; when the first object has finished inputting the first target data, the first identification information, the second identification information, and the image data of the floating layer are sent to the server together, and the server then forwards them to the second terminal. After receiving the first target information periodically sent by the first terminal, the server may store the floating-layer image data in the first target information and associate it with the first identification information and the second identification information that were stored in advance.
As an optional implementation, the method further comprises: the first terminal responds to a second operation instruction acting on the floating layer to generate an input track; the first terminal displays an input track on the floating layer.
In this embodiment, the second operation instruction may be generated by a keyboard-and-mouse event or a touch event for the floating layer. After receiving a keystroke event or touch event for the floating layer, the first terminal may generate an input track of the first target data on the floating layer, for example, an answer track. Alternatively, the first object may write the input track on the floating layer using an input device such as a mouse, keyboard, or touch screen. It should be noted that this embodiment may treat a keystroke event or touch event occurring within the floating layer as a keystroke event or touch event for the floating layer. After receiving the keyboard-and-mouse event or touch event for the floating layer, the first terminal may display the input track corresponding to that event on the floating layer; for example, "a=1, b=2, c=3" displayed on the floating layer is such an input track.
It should be noted that the first object of this embodiment may also modify the input trajectory of the first target data. To reduce the amount of data encoded by the image, the font or image of the input track may be solid.
As an optional implementation manner, after the first terminal acquires the first target information in step S202, the method further includes: and the first terminal responds to the third operation instruction, and sends an ending input message to the server, and/or deletes the floating image layer from the screen, wherein the ending input message carries the first identification information and the second identification information and is used for indicating the server to send the received first target information to the second terminal based on the first identification information and the second identification information.
In this embodiment, the first object may respond to the third operation instruction after completing the input of the first target data, for example, after the student user completes the answer of the question, and the third operation instruction may be an answer ending instruction triggered by the student end device, which may carry a question ID and an ID of the student end device in an answer ending message. And the first terminal responds to the third operation instruction and sends an ending input message to the server so that the processing server sends the first target information received before to the second terminal in a unified manner, and meanwhile, a floating layer of the first terminal can be deleted, and at the moment, the floating layer does not appear on the screen of the first terminal.
As an optional implementation manner, after the first terminal sends the first target information to the at least one second terminal, the method further includes: the first terminal acquires second target information, where the second target information is used to indicate a process in which a second object modifies the first target data on the target terminal, the at least one second terminal includes the target terminal, and the second terminals other than the target terminal are prohibited from modifying the first target data.
In this embodiment, the target terminal may generate, according to the corrected and approved image data of the floating layer, second target information used for indicating a process in which the second object modifies the first target data on the target terminal, for example, modified and approved answer data, where the second target information may include the corrected and approved image data of the floating layer, the first identification information of the first terminal, and the second identification information of the second target data, and then send the second target information to the server.
It should be noted that, after receiving the first target information of the at least one second terminal, other second terminals than the target terminal do not need to modify the input track of the first target data therein.
In this embodiment, the server may store multiple frames of the floating-layer image data associated with the second identification information of the second target data. After receiving the end input message corresponding to the third operation instruction, the server generates response data from the first identification information and the second identification information carried in that message together with the stored floating-layer image data associated with those identifiers in the first target information, and then sends the response data to the second terminal.
It is to be understood that the image data of the floating layer in the response data may include image data of a plurality of floating layers of the first object during the time of inputting the first target data.
After receiving the response data, the second terminal obtains the image data of the floating layer from the response data, that is, obtains the ground color image data in the image data of the floating layer and the input track of the first target data.
The data processing method of this embodiment is further described below from the target terminal side.
Fig. 3 is a flow chart of another data processing method according to an embodiment of the present invention. As shown in fig. 3, the method may include the steps of:
step S302, a target terminal acquires first target information, wherein the first target information is used for indicating a process that a first object inputs first target data in a floating layer of the first terminal.
In the technical solution provided by step S302 of the present invention, the target terminal may be a teacher-side device, and may obtain the first target information sent by the server, where the first target information in this embodiment may be used to completely indicate a process in which the first object inputs the first target data in a floating layer of the first terminal, and may include first identification information of the first terminal, second identification information of the second target data, and image data of the floating layer, where the first target data is reply data of the second target data, and the image data may include ground color image data and answering track data.
In step S304, the target terminal displays a process of inputting first target data of the first object based on the first target information.
In the technical solution provided by step S304 of the present invention, after the target terminal acquires the first target information, a process of inputting the first target data by the first object may be displayed based on the first target information.
The target terminal of this embodiment may display the process of the first object inputting the first target data; the second object may also correct and annotate the received first target data on the target terminal to obtain second target information and return it to the first terminal through the server, so that the first object can view the process of correcting the first target data on the first terminal. This helps to correct errors and fill in gaps, and achieves the purpose of determining the process of an object inputting data.
Through steps S302 to S304, the target terminal obtains first target information, where the first target information is used to indicate the process of a first object inputting first target data in a floating layer of the first terminal, and the target terminal displays the process of the first object inputting the first target data based on the first target information. That is to say, the first terminal of this embodiment sends the first target information, which indicates the process of the first object inputting the first target data in the floating layer of the first terminal, to the at least one second terminal, so that this process is displayed on the second terminal and the situation in which only the final result of the input data can be displayed while the input process cannot is avoided, thereby solving the technical problem that the process of an object inputting data cannot be determined and achieving the technical effect of determining the process of an object inputting data.
The above-described method of this embodiment is further described below.
As an optional implementation manner, before the target terminal acquires the first target information in step S302, the method further includes: the target terminal sends second target data to the first terminal, wherein the first target data is reply data of the second target data.
In this embodiment, the second object sends the second target data to the first terminal through the target terminal, and the second target data may be practice questions in a student answering scene. Alternatively, the second object may present the second target data to the first object by means of speech or courseware or the like. The first object receives the second target data through the first terminal.
As an optional implementation manner, in step S302, the target terminal acquires first target information, including: the target terminal acquires first target information sent by the server, wherein the first target information comprises at least one of the following information: the first identification information of the first terminal, the second identification information of the second target data and the image data of the floating layer, wherein the image data comprises an input track of the first target data and/or ground color image data of the floating layer.
In this embodiment, the target terminal obtains the first target information sent by the server, where the first target information may include the first identification information of the first terminal, the second identification information, and the image data, for example, an ID of the student-side device, an ID of the question, an input track of the first target data, and/or ground color image data of the floating layer. Alternatively, the image data may include only the input track, without including the ground color image data.
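For illustration only, the first target information described above can be modeled as a small message structure. The following is a minimal Python sketch; the field names (device_id, topic_id, trajectory_png, ground_color_png) and the JSON/base64 encoding are assumptions made for this sketch and are not prescribed by the embodiment.

from dataclasses import dataclass
from typing import Optional
import base64
import json

@dataclass
class FirstTargetInfo:
    """One periodically captured sample of the floating layer."""
    device_id: str                     # first identification information (first terminal)
    topic_id: str                      # second identification information (second target data)
    trajectory_png: bytes              # input track of the first target data
    ground_color_png: Optional[bytes]  # ground color image data; may be omitted

    def to_json(self) -> str:
        # Binary image data is base64-encoded so the message can travel as text.
        return json.dumps({
            "device_id": self.device_id,
            "topic_id": self.topic_id,
            "trajectory_png": base64.b64encode(self.trajectory_png).decode("ascii"),
            "ground_color_png": (base64.b64encode(self.ground_color_png).decode("ascii")
                                 if self.ground_color_png else None),
        })

Keeping ground_color_png optional mirrors the case noted above in which only the input track is transmitted.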
As an optional implementation, the method further comprises: the target terminal displays the floating layer on the upper layer of a screen of the target terminal and displays the first identification information and the second identification information; and/or the target terminal generates a target video from the image data of multiple floating layers, where the target video is used for showing the process in which the first object inputs the first target data.
In this embodiment, the target terminal displays the floating layer on the upper layer of the screen of the target terminal according to the image data of the floating layer, for example, the teacher device displays the floating layer on the uppermost layer of the screen of the display of the target terminal according to the image data of the floating layer. The embodiment may also display the first identification information and the second identification information, for example, display the title ID and the ID of the student side device.
In this embodiment, the second object may view, on the display of the target terminal, the image data of multiple floating layers in the first target information. It can be understood that the image data of the multiple floating layers may form a video, which shows the entire process of the first object inputting the first target data at the first terminal.
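As a non-authoritative sketch of how the multiple floating-layer frames could be assembled into such a target video, the following Python example uses OpenCV; the frame rate, codec and output file name are assumptions, and codec availability depends on the local OpenCV build.

import cv2
import numpy as np

def frames_to_video(frame_bytes_list, out_path="target_video.mp4", fps=5):
    """Assemble encoded floating-layer images (in capture order) into a playable video."""
    first = cv2.imdecode(np.frombuffer(frame_bytes_list[0], np.uint8), cv2.IMREAD_COLOR)
    height, width = first.shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    for data in frame_bytes_list:
        frame = cv2.imdecode(np.frombuffer(data, np.uint8), cv2.IMREAD_COLOR)
        if frame.shape[:2] != (height, width):   # keep a uniform frame size
            frame = cv2.resize(frame, (width, height))
        writer.write(frame)
    writer.release()

The frame rate would normally match the capture period used by the first terminal, so that playback reflects the real input speed.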
In this embodiment, the image data of the floating layer needs to be encoded during transmission, and in this embodiment, the target terminal may decode the image data of the floating layer to obtain the floating layer, and display the floating layer on the uppermost layer of the screen of the display of the target terminal.
In addition, the first identification information of the first terminal and the second identification information of the second target data displayed on the uppermost layer of the screen of the display of the target terminal may help the second object determine which first object, corresponding to which first terminal, is replying to which second target data in the currently displayed process.
Optionally, in this embodiment, a corresponding relationship between the attribute information of the first object and the first identification information of the first terminal may be preset on the target terminal, where the attribute information may include personal information such as a name of the first object and a class to which the first object belongs, so that the target terminal may determine the attribute information of the first object according to the identification information of the first terminal by querying the preset corresponding relationship, and further display the attribute information of the first object on the ground color image of the floating layer.
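A minimal sketch of this lookup and display is given below, assuming the correspondence is kept in an in-memory dictionary; the device ID, names and drawing position are hypothetical values used only for illustration.

from PIL import Image, ImageDraw

# Hypothetical preset correspondence: first identification information -> attribute information.
DEVICE_ATTRIBUTES = {
    "student-dev-001": {"name": "Zhang San", "class": "Grade 3, Class 2"},
}

def annotate_ground_color(ground_color_path: str, device_id: str, out_path: str) -> None:
    """Draw the first object's attribute information onto the ground color image of the floating layer."""
    attrs = DEVICE_ATTRIBUTES.get(device_id)
    image = Image.open(ground_color_path).convert("RGB")
    if attrs:
        draw = ImageDraw.Draw(image)
        label = f'{attrs["name"]}  {attrs["class"]}  ({device_id})'
        draw.text((10, 10), label, fill=(0, 0, 0))  # default font; the position is arbitrary
    image.save(out_path)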
As an optional implementation, after the process of the target terminal displaying the first object to input the first target data based on the first target information, the method further includes: and the target terminal sends second target information to the first terminal, wherein the second target information is used for indicating the process that the second object modifies the first target data on the target terminal.
In this embodiment, the target terminal may generate, according to the corrected image data of the floating layer, second target information used for indicating the process in which the second object modifies the first target data on the target terminal, for example, corrected answer data. The second target information may include the corrected image data of the floating layer, the first identification information of the first terminal, and the second identification information of the second target data. The target terminal then sends the second target information to the server, and the server forwards the second target information to the first terminal corresponding to the first identification information.
In this embodiment, when the second object corresponding to the target terminal views the process of the first object inputting the first target data, as represented by the image data of the plurality of floating layers, the second object may find errors or places that need additional modification. After receiving a keyboard and mouse event or a touch event for the floating layer, a processing module of the target terminal may modify the input trajectory of the first target data in the image data of the frame of the floating layer currently displayed on the display, and replace the original input trajectory in the image data of that frame with the modified input trajectory. The modification may include addition, deletion, and alteration.
It should be noted that, after receiving the first target information, the other second terminals in the at least one second terminal other than the target terminal do not need to modify the input track of the first target data therein.
Another data processing method according to an embodiment of the present invention is described below from the server side.
Fig. 4 is a flow chart of another data processing method according to an embodiment of the present invention. As shown in fig. 4, the method may include the steps of:
step S402, the server acquires first target information, wherein the first target information is used for indicating a process that the first object inputs first target data in a floating layer of the first terminal.
In the technical solution provided in step S402 of the present invention, the first terminal is a terminal for inputting first target data, where the first target data is data input by the first object in the floating layer, for example, if the first terminal is a student terminal device, and the first object is a student user, the first target data may be subject data input by the student user. The server obtains first target information sent by the first terminal, where the first target information may be used to completely indicate a process in which the first object inputs first target data in a floating layer of the first terminal, and may include first identification information of the first terminal, second identification information of second target data, and image data of the floating layer, where the first target data is reply data of the second target data, and the image data may include ground color image data and answering track data.
Step S404, the server forwards the first target information to at least one second terminal, wherein the first target information is used for enabling the second terminal to display the process that the first object inputs the first target data.
In the technical solution provided by step S404 of the present invention, both the first terminal and the second terminal are connected to the server. After acquiring the first target information, the server may forward the first target information to at least one second terminal, where the first target information is used to enable the second terminal to display the process of the first object inputting the first target data. The at least one second terminal may include a target terminal, which may be a teacher-side device, and may also include terminals of the same nature as the first terminal, for example, terminals of other student users. The target terminal of this embodiment may display the process of the first object inputting the first target data; the second object may also correct the received first target data on the target terminal, display the correction, and return the corrected data to the first terminal through the server, so that the first object can check the process of correcting the first target data on the first terminal, thereby achieving the purposes of correcting errors, filling in gaps, and determining the process of inputting data by the object.
Optionally, in this embodiment, after the first terminal acquires the first operation instruction triggered by the first object, the server acquires a start input message sent by the first terminal, where the start input message carries first identification information of the first terminal and second identification information of the second target data, for example, the first identification information may be a device ID, and the second identification information may be a topic ID.
After receiving the start input message sent by the first terminal, the server may store the first identification information and the second identification information carried in the start input message.
The server of this embodiment may periodically acquire the first target information transmitted by the first terminal. After the server receives the first target information periodically sent by the first terminal, the second identification information of the second target data and the image data of the floating image layer may be parsed from the first target information. The server may establish an association relationship between the image data of the floating image layer and the second identification information and the first identification information that have been previously stored, and store the image data of the floating image layer.
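A minimal sketch of this server-side association might look as follows, assuming an in-memory store keyed by the two identifiers; a real server would add persistence and its own transport.

class AnswerStore:
    """Associates floating-layer frames with (device_id, topic_id)."""

    def __init__(self):
        self._sessions = {}  # (device_id, topic_id) -> list of encoded frames

    def start(self, device_id, topic_id):
        # Called when the start input message arrives.
        self._sessions[(device_id, topic_id)] = []

    def append_frame(self, device_id, topic_id, frame_bytes):
        # Called for each periodically received first target information.
        self._sessions.setdefault((device_id, topic_id), []).append(frame_bytes)

    def frames(self, device_id, topic_id):
        return list(self._sessions.get((device_id, topic_id), []))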
Optionally, the processing module of the first terminal of this embodiment may store the first target information that is periodically obtained in the local storage module, without sending the first target information to the server. After the first target data is input, the server acquires first target information which is sent by the first terminal and used for indicating the input process of the first target data, and then forwards the first target information to the second terminal.
In this embodiment, after the first object completes inputting the first target data, an end input message is triggered to the first terminal, and the end input message carries the second identification information and the first identification information. After receiving the end input message, the processor of the first terminal sends the end input message to the server, so that the server sends the previously received first target information to the second terminal in a unified manner, and meanwhile, the floating layer is deleted, and at this time, the floating layer does not appear on the screen of the display of the first terminal any more.
In this embodiment, the server has already stored multiple pieces of image data of the floating layer associated with the second identification information of the second target data. Therefore, after receiving the end input message, the server generates response data from the first identification information and the second identification information carried in the end input message together with the stored image data of the floating layer associated with those identifiers, and sends the response data to the at least one second terminal, which may include a teacher-side device and other student-side devices.
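Continuing the AnswerStore sketch above, the response data could be assembled and fanned out roughly as follows; the JSON layout and the send callback are placeholders for the embodiment's unspecified message format and transport.

import base64
import json

def build_response_data(store, device_id, topic_id):
    """Build the response data sent out after the end input message is received."""
    frames = store.frames(device_id, topic_id)
    return json.dumps({
        "device_id": device_id,  # first identification information
        "topic_id": topic_id,    # second identification information
        "frames": [base64.b64encode(f).decode("ascii") for f in frames],
    })

def fan_out(response_json, second_terminals, send):
    """send(terminal, payload) stands in for the server's actual delivery mechanism."""
    for terminal in second_terminals:
        send(terminal, response_json)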
In this embodiment, the target terminal may obtain, according to the corrected and approved image data of the floating layer, second target information used to indicate a process in which the second object modifies the first target data on the target terminal, and may forward the second target information to the first terminal corresponding to the first identification information.
With the data processing method of this embodiment, the target terminal in the at least one second terminal and the other second terminals in the at least one second terminal can all see the process in which the first object inputs the first target data at the first terminal; the target terminal can further correct the process in which the first object inputs the first target data and return the corrected data to the first object. In this way, the terminals other than the target terminal (the first terminal and the other second terminals) can see not only the process in which the object corresponding to another terminal inputs data, but also the correction made by the second object to the input data; the second object corresponding to the target terminal can see the processes in which data is input at the first terminal and at the other second terminals, which helps the second object understand how the first object responds to the second target data and provide targeted guidance. This solves the technical problem in the prior art that the process of inputting data by an object cannot be determined, achieves the technical effect of determining the process of inputting data by an object, and can further improve the quality of remote communication.
Example 3
The technical solutions of the embodiments of the present invention are illustrated below with reference to preferred embodiments. Specifically, a student answering scenario is taken as an example, in which the first terminal is a student-side device and the second terminal is a teacher-side device.
In the current remote education system, students can see pictures of on-site teaching of a teacher or courseware pages of the teacher in terminal equipment at home, and the teacher and the students can communicate with each other through voice and video through the terminal equipment arranged at the teacher and the terminal equipment arranged at the students respectively, so that a real teaching scene is simulated.
However, in a student answering scenario, students can only send the final answers to the questions to the teacher-side device through the student-side device, so that the teacher can check them on the teacher-side device. The teacher cannot learn the answering process of the students, which adversely affects the teaching effect.
The method performed by the distance education system of this embodiment can solve the above-mentioned problems. As described further below.
Fig. 5 is a schematic diagram of a distance education system according to an embodiment of the present invention. As shown in fig. 5, the distance education system may include: teacher end device 51, processing server 52 and student end device 53. The teacher-side device 51 may include: processing module 1 and display 1, student side device 53 may include: a processing module 2 and a display 2.
In this embodiment, the teacher-side device 51 and the student-side device 53 may be the same, and are intelligent terminal devices such as a computer, a tablet computer, and a smart phone. Teacher end device 51 and student end device 53 are both connected to processing server 52.
The processing steps of the distance education system of this embodiment may be as follows:
step 1, after obtaining an instruction for starting answering issued by a student user, a student end device 53 generates a floating layer on a screen of a display 2, wherein the floating layer is arranged on the uppermost layer of the screen; and sending a message for starting answering to the processing server 52, wherein the message for starting answering carries the ID of the student end device 53 and the ID of the topic.
In this step, the teacher sends practice questions to the students through the teacher-side device 51, and optionally, the teacher may show the practice questions to the students through voice or courseware; the student receives the exercise questions through the student end device 53, and when the student prepares to answer the questions, the student issues an instruction for answering the questions at the student end device 53; the processing module 2 of the student-side device 53 starts the answering mode after receiving the instruction to start answering.
Optionally, the answer mode in this embodiment is to generate a floating layer on the screen of the display 2 of the student-side device 53, where the floating layer is disposed on the uppermost layer of the screen of the display 2. The floating layer can be opaque or semitransparent, and can be specifically set according to user habits.
Alternatively, in order to reduce the amount of image data, the ground color of the floating image layer may be a solid color image.
The student user can set the size and position of the floating layer according to the size of the display 2 and personal habits. The size of the floating layer may be smaller than the screen size of the display 2. Reference may be made in particular to fig. 6. FIG. 6 is a schematic diagram of a floating layer according to an embodiment of the invention. As shown in fig. 6, the thick frame drawn with vertical bars is the screen of the display 2, and layer a on the screen is the floating layer, which may be a blue semi-transparent floating layer.
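A minimal sketch of such a floating layer, written with Python's tkinter, is shown below; the size, position, color and transparency values are arbitrary examples, and the -topmost and -alpha window attributes behave in a platform-dependent way.

import tkinter as tk

def create_floating_layer(width=800, height=500, alpha=0.5, color="#2a6fd6"):
    """Create a solid-color, semi-transparent window kept above other windows."""
    root = tk.Tk()
    root.title("answer layer")
    root.geometry(f"{width}x{height}+100+100")  # size and position of the floating layer
    root.attributes("-topmost", True)           # keep the layer on the uppermost layer
    root.attributes("-alpha", alpha)            # semi-transparent; 1.0 would be opaque
    canvas = tk.Canvas(root, bg=color, highlightthickness=0)
    canvas.pack(fill="both", expand=True)
    return root, canvas

Calling root.mainloop() after create_floating_layer() would keep the layer on screen until the answering ends.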
Step 2, after receiving the message of starting answering, the processing server 52 stores the ID of the student end device 53 and the ID of the subject carried in the message of starting answering.
And 3, after receiving the keyboard and mouse event or the touch event aiming at the floating layer, the student end device 53 generates a response track on the floating layer.
In this step, the student answers the question through the student-side device 53. Optionally, the student may write an answering track on the floating layer by using an input device such as a mouse, a keyboard, or a touch screen. It should be noted that, in this embodiment, a keyboard and mouse event or a touch event that occurs within the floating layer may be treated as a keyboard and mouse event or a touch event for the floating layer. After receiving the keyboard and mouse event or the touch event for the floating layer, the processing module 2 of the student-side device 53 displays the answering track corresponding to that event on the floating layer.
FIG. 7 is a diagram illustrating an answering track displayed on a floating layer according to an embodiment of the present invention. As shown in fig. 7, the handwriting "a = 1, b = 2, c = 3" is the answering track displayed on the floating layer.
It should be noted that the student user of this embodiment can modify the answering track on the student-side device 53. To reduce the amount of encoded image data, the font or image of the answering track may be a solid color.
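Building on the floating-layer sketch above, the answering track could be drawn from mouse events roughly as follows; the stroke color and width are arbitrary choices for illustration.

def enable_answer_track(canvas, color="black", width=3):
    """Draw the answering track on the floating-layer canvas from mouse drag events."""
    state = {"last": None}

    def on_press(event):
        state["last"] = (event.x, event.y)

    def on_drag(event):
        if state["last"] is not None:
            x0, y0 = state["last"]
            canvas.create_line(x0, y0, event.x, event.y, fill=color, width=width,
                               capstyle="round", smooth=True)
        state["last"] = (event.x, event.y)

    canvas.bind("<ButtonPress-1>", on_press)  # pen down
    canvas.bind("<B1-Motion>", on_drag)       # pen move: extend the answering track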
Step 4, the student end device 53 periodically obtains the image data of the floating layer, and generates answer data according to the obtained image data of the floating layer, wherein the answer data comprises the image data of the floating layer, the ID of the question and the ID of the student end device 53, and the image data of the floating layer can comprise ground color image data and answer track data; and periodically transmits the answer data to the processing server 52.
In this step, the processing module 2 of the student side device 53 may periodically acquire the image data of the floating layer, and generate answer data, where the answer data includes the image data of the floating layer and the ID of the question, the floating layer data includes ground color image data and answer track data, and periodically send the answer data to the processing server 52.
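A rough sketch of this periodic acquisition is given below; it reuses the tkinter event loop from the earlier floating-layer sketch and Pillow's ImageGrab to read back the pixels of the layer region. ImageGrab is platform-dependent, and the one-second period as well as the send_answer_data callback are assumptions standing in for the embodiment's unspecified capture cycle and transport.

import io
from PIL import ImageGrab

def start_periodic_capture(root, send_answer_data, period_ms=1000):
    """Periodically grab the floating-layer region and hand the PNG bytes to send_answer_data."""
    def tick():
        x, y = root.winfo_rootx(), root.winfo_rooty()
        w, h = root.winfo_width(), root.winfo_height()
        image = ImageGrab.grab(bbox=(x, y, x + w, y + h))  # pixels of the floating layer
        buf = io.BytesIO()
        image.save(buf, format="PNG")
        send_answer_data(buf.getvalue())                   # e.g. wrap into answer data and upload
        root.after(period_ms, tick)                        # reschedule on the Tk event loop
    root.after(period_ms, tick)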
Step 5, after receiving the answer data, the processing server 52 analyzes the ID of the question and the image data of the floating layer from the answer data; and establishing an association relation between the image data of the floating layer and the stored ID of the subject and the ID of the student-side device 53, and storing the image data of the floating layer.
In this step, the processing server 52, after receiving the answer data periodically transmitted by the student side device 53, saves the image data of the floating layer in the answer data, and associates the ID of the subject and the ID of the student side device 53, which have been saved in advance, with the image data of the floating layer.
Step 6, after obtaining an answer ending instruction issued by the user, the student end device 53 sends an answer ending message to the processing server 52, wherein the answer ending message carries the question ID and the ID of the student end device 53; and deleting the floating image layer from the screen.
In this step, after completing answering the question, the student user issues an instruction to end answering to the student side device 53; after receiving the answer ending instruction, the processing module 2 of the student side device 53 sends an answer ending message to the processing server 52, so that the processing server 52 sends the previously received answer data corresponding to the question to the teacher side device 51 in a unified manner; at the same time, the floating layer is deleted, at which point the floating layer no longer appears on the screen of the display 2.
Step 7, after receiving the message of answering, the processing server 52 generates answering data according to the question ID and the ID of the student end device 53 carried in the message of answering and the image data of the floating layer which is stored and is associated with the ID of the question carried in the message of answering and the ID of the student end device 53; the response data is transmitted to the teacher's side device 51.
In this step, since the processing server 52 has already saved multiple pieces of image data of the floating layer associated with the ID of the question, after receiving the message of ending answering, the processing server 52 can send the answer data, which includes the image data of the floating layer associated with the ID of the question and the ID of the student-side device 53 as well as the ID of the question and the ID of the student-side device 53, to the teacher-side device 51 and the other student-side devices.
It is understood that the image data of the floating layers in the response data may include images of a plurality of floating layers of the student user during the answering time.
Step 8, after receiving the response data, the teacher device 51 obtains image data of the floating layer from the response data, where the image data of the floating layer includes ground color image data and response trajectory data.
In this step, the processing module 1 of the tutor-side device 51 obtains the image data of the floating layer from the response data after receiving the response data.
Step 9, the teacher-side device 51 displays the floating layer on the uppermost layer of the screen of the display 1 according to the image data of the floating layer, and displays the question ID and the ID of the student-side device 53.
In this step, the teacher user may view the images of the multiple floating layers in the response data on the display 1 of the teacher-side device 51, and it can be understood that the images of the multiple floating layers in the response data may form a video, which shows the whole process of answering questions of the student user.
In a normal case, image data needs to be encoded during transmission, and in this step, the processing module 1 decodes the image data of the floating layer, acquires the floating layer, and displays the floating layer on the uppermost layer of the screen of the display 1.
In addition, displaying the question ID and the ID of the student-side device 53 can help the teacher user distinguish which student's process of answering which question is being displayed.
As a preferred embodiment, the processing module 1 of the teacher-side device 51 may preset a corresponding relationship between personal information such as names and classes of students and the ID of the student-side device 53, so that the processing module 1 may determine the personal information of the students according to the ID of the student-side device 53 by querying the preset corresponding relationship, and display the personal information of the students on the ground color image of the floating layer.
It should be noted that, after receiving the response data, the other student side devices process the same steps as steps 8 to 9. Since other students do not need to modify the answering process in the answering data, the follow-up steps do not need to be executed for other student-side devices.
Step 10, after receiving a keyboard and mouse event or a touch event for a floating layer, teacher end device 51 corrects the answering trajectory data in the image data of a floating layer currently displayed on display 1; and replacing the response track data in the image data of the frame floating layer with the corrected response track data.
Wherein, the modification includes addition, deletion and modification.
In this step, when viewing the answering process of the student user represented by the images of the plurality of floating layers, the teacher user may find places where there are errors or where supplementary modification is needed. At this time, the teacher user may modify the image of the floating layer by operating an input device connected to the teacher-side device 51, such as a mouse, a keyboard, or a touch screen. After receiving a keyboard and mouse event or a touch event for the floating layer, the processing module 1 of the teacher-side device 51 may modify the answering trajectory data in the image data of the frame of the floating layer currently displayed on the display 1, and replace the answering trajectory data in the image data of that frame with the corrected answering trajectory data.
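The frame replacement described in this step could be sketched as follows, assuming the received frames have been decoded into Pillow images and the teacher's correction strokes have already been collected by the UI layer as lists of (x, y) points; the red color and stroke width are arbitrary.

from PIL import Image, ImageDraw

def apply_correction(frames, index, strokes, color=(255, 0, 0)):
    """Overlay correction strokes on frame `index` and replace that frame in place."""
    corrected = frames[index].convert("RGB")
    draw = ImageDraw.Draw(corrected)
    for points in strokes:
        if len(points) > 1:
            draw.line(points, fill=color, width=3)  # the teacher's correction trace
    frames[index] = corrected                       # replace the original frame
    return corrected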
Step 11, the teacher end device 51 generates correction answer data according to the corrected image data of the floating layer, wherein the correction answer data comprises the corrected image data of the floating layer, the ID of the student end device 53 and the ID of the subject; the modified answer data is sent to the processing server 52.
In step 12, after receiving the corrected answer data, the processing server 52 forwards the corrected answer data to the student side device 53 corresponding to the ID of the student side device 53.
Step 13, after receiving the corrected answer data, the student side device 53 analyzes the corrected image data of the floating layer, displays the corrected image of the floating layer on the display 2, and displays the ID of the question.
In another implementation, the processing module 2 of the student-side device 53 may store the periodically acquired answer data in a local storage module without sending the answer data to the processing server 52. After completion of the answering, the answer data corresponding to the ID of the question is transmitted to the processing server 52, and the processing server 52 forwards the answer data to the tutor side device 51.
In another implementation, the image data of the floating layer periodically acquired by the student-side device 53 in step 4 may include only the answering track data, without the ground color image data. Correspondingly, in step 8, after receiving the answer data, the teacher-side device 51 obtains from the answer data the image data of the floating layer, which includes only the answering track data and does not include the ground color image data.
Accordingly, step 801 and step 802 are added before step 9.
Step 801, the teacher-side device 51 obtains a preset base image, and overlays the answering track data in the image data of the floating layer on the preset base image to generate the floating layer.
In step 802, the teacher-side device 51 displays the floating layer on the uppermost layer of the screen of the display 1, and displays the question ID and the ID of the student-side device 53.
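A minimal sketch of the compositing in step 801 is given below, assuming the answering-track data arrives as a PNG with a transparent background; the resize is only a fallback for mismatched sizes.

import io
from PIL import Image

def compose_floating_layer(base_image_path, trajectory_png_bytes):
    """Overlay the answering-track image on a preset base image to rebuild the floating layer."""
    base = Image.open(base_image_path).convert("RGBA")
    track = Image.open(io.BytesIO(trajectory_png_bytes)).convert("RGBA")
    if track.size != base.size:
        track = track.resize(base.size)
    return Image.alpha_composite(base, track)  # the recovered floating layer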
In this embodiment, the teacher user and the other student users can see the answering process of the student user, and the teacher user can also correct the answering process of the student user and return the corrected answer to the student user. Therefore, students can see not only the answering processes of other students, but also the teacher's corrections to those answering processes; the teacher can see the answering process of the students, which helps the teacher understand how well the students have mastered the knowledge points and provide targeted guidance. All of these points help improve the teaching quality of distance education.

According to this embodiment, a floating layer can be arranged on the screen of the display of the student-side device used by the student user, and the student can input the answering track in the floating layer. When the student user finishes answering, the answer data that reflects the complete answering process and includes the image data of the floating layer can be sent to the teacher-side device used by the teacher user, so that the teacher user can check the answering process of the student user on the display of the teacher-side device. In addition, the teacher user can correct the received answer data on the teacher-side device and return the corrected answer data to the student-side device, so that the student user can check the answering process corrected by the teacher user on the student-side device, correct wrong answers, and fill in gaps, thereby improving the teaching quality of distance education.
It should be noted that this embodiment is described by taking a student answering scenario as an example, but the method of the embodiment of the present invention is not limited to student answering scenarios; any scenario that requires determining the process of inputting data by an object is applicable, and the examples are not listed here one by one.
Example 4
The embodiment of the invention also provides a data processing device. It should be noted that the data processing apparatus of this embodiment can be used to execute the data processing method shown in fig. 2 according to the embodiment of the present invention.
Fig. 8 is a schematic diagram of a data processing apparatus according to an embodiment of the present invention. As shown in fig. 8, the data processing apparatus 80 may include: a first acquisition unit 81 and a first transmission unit 82.
The first obtaining unit 81 is configured to enable the first terminal to obtain first target information, where the first target information is used to indicate a process in which the first object inputs first target data in a floating layer of the first terminal.
A first sending unit 82, configured to enable the first terminal to send first target information to at least one second terminal, where the first target information is used to enable the second terminal to display a process of the first object inputting the first target data.
The embodiment of the invention also provides another data processing device. It should be noted that the data processing apparatus of this embodiment can be used to execute the data processing method shown in fig. 3 according to the embodiment of the present invention.
FIG. 9 is a schematic diagram of another data processing apparatus according to an embodiment of the present invention. As shown in fig. 9, the data processing apparatus 90 may include: a second acquisition unit 91 and a display unit 92.
The second obtaining unit 91 is configured to enable the target terminal to obtain first target information, where the first target information is used to indicate a process in which the first object inputs first target data in a floating layer of the first terminal.
A display unit 92, configured to enable the target terminal to display the process of the first object inputting the first target data based on the first target information.
The embodiment of the invention also provides another data processing device. It should be noted that the data processing apparatus of this embodiment can be used to execute the data processing method shown in fig. 4 according to the embodiment of the present invention.
Fig. 10 is a schematic diagram of another data processing apparatus according to an embodiment of the present invention. As shown in fig. 10, the data processing apparatus 100 may include: a third obtaining unit 101 and a forwarding unit 102.
A third obtaining unit 101, configured to enable the server to obtain first target information, where the first target information is used to instruct a process in which the first object inputs first target data in a floating layer of the first terminal.
A forwarding unit 102, configured to enable the server to forward the first target information to at least one second terminal, where the first target information is used to enable the second terminal to display a process of inputting the first target data by the first object.
In the data processing apparatus of this embodiment, the first terminal sends the first target information indicating the process in which the first object inputs the first target data in the floating layer of the first terminal to the at least one second terminal, so that the process in which the first object inputs the first target data is displayed on the second terminal. This avoids the situation in which only the final result of the input data can be displayed while the input process cannot, thereby solving the technical problem that the process of inputting data by an object cannot be determined, and achieving the technical effect of determining the process of inputting data by an object.
Example 5
According to an embodiment of the present invention, there is also provided a storage medium including a stored program, wherein when the program is executed by a processor, the apparatus on which the storage medium is located is controlled to execute the data processing method according to the embodiment of the present invention.
Example 6
According to an embodiment of the present invention, there is also provided a processor, configured to execute a program, where the program executes the data processing method described in embodiment 1.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (19)

1. A data processing method, comprising:
a first terminal acquires first target information, wherein the first target information is used for indicating a process that a first object inputs first target data in a floating layer of the first terminal;
and the first terminal sends the first target information to at least one second terminal, wherein the first target information is used for enabling the second terminal to display the process of inputting the first target data by the first object.
2. The method of claim 1, wherein before the first terminal obtains the first target information, the method further comprises:
the first terminal acquires second target data sent by a target terminal, responds to a first operation instruction, and generates the floating layer on the upper layer of the screen of the first terminal, wherein the at least one second terminal comprises the target terminal, and the first target data is reply data of the second target data.
3. The method of claim 2, wherein after responding to the first operational instruction, the method further comprises:
the first terminal sends a start input message to a server, wherein the start input message carries first identification information of the first terminal and second identification information of the second target data and is used for indicating the first object to start inputting the first target data, and the first identification information and the second identification information are stored by the server.
4. The method of claim 3, wherein the obtaining of the first target information by the first terminal comprises:
the first terminal periodically acquires the image data of the floating image layer and generates the first target information, wherein the first target information comprises at least one of the following information: the first identification information, the second identification information and the image data, wherein the image data includes an input track of the first target data and/or ground color image data of the floating image layer.
5. The method of claim 4, wherein after generating the first target information, the method further comprises:
the first terminal sends the first target information to a server periodically, or the first terminal sends the first target information to the server under the condition that the first target data is input by the first object, wherein image data in the first target information is used for establishing an association relationship with the first identification information and the second identification information which are stored by the server.
6. The method of claim 4, further comprising:
the first terminal responds to a second operation instruction acting on the floating layer to generate the input track;
and the first terminal displays the input track on the floating layer.
7. The method of claim 3, wherein after the first terminal obtains the first target information, the method further comprises:
and the first terminal responds to a third operation instruction, sends an ending input message to a server, and/or deletes the floating image layer from the screen, wherein the ending input message carries the first identification information and the second identification information and is used for indicating the server to send the received first target information to the second terminal based on the first identification information and the second identification information.
8. The method according to any of claims 1 to 7, wherein after the first terminal sends the first target information to at least one second terminal, the method further comprises:
the first terminal acquires second target information, wherein the second target information is used for indicating a process that a second object modifies the first target data on a target terminal, and other second terminals except the target terminal in the at least one second terminal forbid modifying the first target data.
9. A data processing method, comprising:
the method comprises the steps that a target terminal obtains first target information, wherein the first target information is used for indicating the process that a first object inputs first target data in a floating layer of the first terminal;
and the target terminal displays the process of inputting the first target data by the first object based on the first target information.
10. The method of claim 9, wherein before the target terminal obtains the first target information, the method further comprises:
and the target terminal sends second target data to the first terminal, wherein the first target data is reply data of the second target data.
11. The method of claim 10, wherein the target terminal obtaining the first target information comprises:
the target terminal acquires the first target information sent by the server, wherein the first target information comprises at least one of the following: the first identification information of the first terminal, the second identification information of the second target data and the image data of the floating layer, wherein the image data comprises an input track of the first target data and/or ground color image data of the floating layer.
12. The method of claim 11, further comprising:
the target terminal displays the floating layer on the upper layer of a screen of the target terminal and displays the first identification information and the second identification information; and/or
And the target terminal generates a target video from the image data of the plurality of floating image layers, wherein the target video is used for showing the process of inputting the first target data by the first object.
13. A data processing method, comprising:
the method comprises the steps that a server obtains first target information, wherein the first target information is used for indicating a process that a first object inputs first target data in a floating layer of a first terminal;
the server forwards the first target information to at least one second terminal, wherein the first target information is used for enabling the second terminal to display the process of inputting the first target data by the first object.
14. A data processing system, comprising:
the first terminal is used for acquiring first target information, wherein the first target information is used for indicating a process that a first object inputs first target data in a floating layer of the first terminal;
the server is used for forwarding the first target information;
and the at least one second terminal is used for displaying the process of inputting the first target data by the first object based on the received first target information.
15. A data processing apparatus, comprising:
the first obtaining unit is used for enabling a first terminal to obtain first target information, wherein the first target information is used for indicating a process that a first object inputs first target data in a floating layer of the first terminal;
and the first sending unit is used for enabling the first terminal to send the first target information to at least one second terminal, wherein the first target information is used for enabling the second terminal to display the process that the first target data is input by the first object.
16. A data processing apparatus, comprising:
the second obtaining unit is used for enabling the target terminal to obtain first target information, wherein the first target information is used for indicating a process that the first object inputs first target data in a floating layer of the first terminal;
and the display unit is used for displaying the process of inputting the first target data by the first object based on the first target information by the target terminal.
17. A data processing apparatus, comprising:
a third obtaining unit, configured to enable a server to obtain first target information, where the first target information is used to indicate a process in which a first object inputs first target data in a floating layer of a first terminal;
and the forwarding unit is used for enabling the server to forward the first target information to at least one second terminal, wherein the first target information is used for enabling the second terminal to display the process that the first object inputs the first target data.
18. A storage medium comprising a stored program, wherein the program, when executed by a processor, controls an apparatus in which the storage medium is located to perform the method of any one of claims 1 to 13.
19. A processor, characterized in that the processor is configured to run a program, wherein the program when running performs the method of any of claims 1 to 13.
CN202010457901.9A 2020-05-26 2020-05-26 Data processing method, system, device, storage medium and processor Active CN111610946B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010457901.9A CN111610946B (en) 2020-05-26 2020-05-26 Data processing method, system, device, storage medium and processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010457901.9A CN111610946B (en) 2020-05-26 2020-05-26 Data processing method, system, device, storage medium and processor

Publications (2)

Publication Number Publication Date
CN111610946A true CN111610946A (en) 2020-09-01
CN111610946B CN111610946B (en) 2024-03-05

Family

ID=72194555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010457901.9A Active CN111610946B (en) 2020-05-26 2020-05-26 Data processing method, system, device, storage medium and processor

Country Status (1)

Country Link
CN (1) CN111610946B (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120021828A1 (en) * 2010-02-24 2012-01-26 Valve Corporation Graphical user interface for modification of animation data using preset animation samples
CN102708716A (en) * 2012-05-16 2012-10-03 深圳市海云天科技股份有限公司 Answering device and examination data processing method thereof
CN105931514A (en) * 2016-07-08 2016-09-07 北京圣合软件科技有限公司 Method and device used for collecting interactive teaching data and teacher terminal
CN107273002A (en) * 2017-05-15 2017-10-20 深圳市助天使软件技术有限公司 Handwriting input answer method, terminal and computer-readable recording medium
CN107219935A (en) * 2017-05-25 2017-09-29 哈尔滨工业大学 It is a kind of towards continuous writing Chinese character, support interaction Chinese character input system and method
CN107123329A (en) * 2017-06-09 2017-09-01 浙江新盛蓝科技有限公司 A kind of webpage operation import system implementation method
CN109299859A (en) * 2018-08-31 2019-02-01 深圳市天英联合教育股份有限公司 Evaluating method, device, equipment and the storage medium of data
CN109359272A (en) * 2018-09-11 2019-02-19 宁波思骏科技有限公司 A kind of display methods, device and the equipment of electronics rough draft
CN109922352A (en) * 2019-02-26 2019-06-21 李钢江 A kind of data processing method, device, electronic equipment and readable storage medium storing program for executing
CN111192171A (en) * 2019-12-27 2020-05-22 创而新(北京)教育科技有限公司 Teaching assistance method, teaching assistance device, teaching assistance equipment and storage medium
CN111047934A (en) * 2020-01-07 2020-04-21 上海奇初教育科技有限公司 Examination paper making and automatic correcting system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SUMBUL GHULAMANI等: "Educating students in remote areas using augmented reality", 2018 INTERNATIONAL CONFERENCE ON COMPUTING, MATHEMATICS AND ENGINEERING TECHNOLOGIES (ICOMET), 26 April 2018 (2018-04-26) *
GAO Hongbo; NIU Huiyuan: "Research and Practice on Countermeasures to Several Problems in Pro/E Teaching", Journal of Liaoning Provincial College of Communications, no. 04, 15 August 2010 (2010-08-15) *
LONG Junyang: "Application of Computer Interactive System Design in Classroom Teaching", Education Modernization, no. 42, 16 October 2017 (2017-10-16) *

Also Published As

Publication number Publication date
CN111610946B (en) 2024-03-05

Similar Documents

Publication Publication Date Title
CN110570698B (en) Online teaching control method and device, storage medium and terminal
CN106971635B (en) Teaching training method and system
JP6190122B2 (en) Problem selection server, learning support system, problem selection method, and problem selection program
CN108628940B (en) Information display device, control method thereof, and recording medium
CN102881191A (en) Independent lesson preparation and interactive teaching system supporting intelligent terminal equipment
JP2016177306A (en) E-learning system
CN109637221B (en) Electronic courseware for online learning and generation method
CN111066075A (en) Classroom teaching interaction method, terminal and system
CN116349230A (en) Teaching live broadcast method and display device
KR102225443B1 (en) System for solving learning problem and method thereof
CN111610946A (en) Data processing method, system, device, storage medium and processor
US20200005833A1 (en) Information processing apparatus, information processing method, and non-transitory computer readable medium
CN111796846A (en) Information updating method and device, terminal equipment and readable storage medium
CN108108138B (en) Intelligent pen box, image display method thereof and computer readable storage medium
CN112215973A (en) Data display method, multimedia platform and electronic equipment
JP2021162790A (en) Learning support system and learning support program
CN105118341A (en) Network classroom teaching method and system
CN113703765B (en) Course data generation method, course data generation device, computer equipment and storage medium
US20240039876A1 (en) Interactive messaging system and the method
KR101533760B1 (en) System and method for generating quiz
CN112261431B (en) Image processing method and device and electronic equipment
JP2000321968A (en) Remote learning system and educational video contents producing method
CN116384617A (en) Teaching management method, device, electronic equipment and computer readable storage medium
KR20240037462A (en) Method for Operating a Standalone Education Site using General-Purpose Streaming Video
CN115129206A (en) Interaction method and system designed for discussion questions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant