CN115686281A - Data processing method, data display method, data processing device, data display device, computer equipment and storage medium - Google Patents


Info

Publication number
CN115686281A
CN115686281A (application CN202110869991.7A)
Authority
CN
China
Prior art keywords
target area
area image
displaying
interactive
file
Prior art date
Legal status
Pending
Application number
CN202110869991.7A
Other languages
Chinese (zh)
Inventor
王玉晓
陈晓芬
张洋
赵元红
唐振华
Current Assignee
Beijing Youzhuju Network Technology Co Ltd
Original Assignee
Beijing Youzhuju Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Youzhuju Network Technology Co Ltd filed Critical Beijing Youzhuju Network Technology Co Ltd
Priority to CN202110869991.7A priority Critical patent/CN115686281A/en
Publication of CN115686281A publication Critical patent/CN115686281A/en
Pending legal-status Critical Current

Abstract

The present disclosure provides a data processing method, a data display method, a data processing apparatus, a data display apparatus, a computer device, and a storage medium, applied to a first user side. The method includes: acquiring and displaying an interactive file provided with buried points; in response to the first user side satisfying a buried point trigger condition, determining a target area in the interactive file corresponding to the buried point trigger component, the target area comprising an answer result display area; and acquiring a target area image corresponding to the target area and sending the target area image to a second user side for display.

Description

Data processing method, data display method, data processing device, data display device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a data processing method, a data display method, a data processing apparatus, a data display apparatus, a computer device, and a storage medium.
Background
At present, a teacher generally needs to review students' classroom homework during class in order to learn about their progress and adjust the teaching content in time.
Traditionally, a teacher invites students to share their classroom-homework answers orally, or has them write their answers on a blackboard. However, with the development of network technology, more and more classes have moved from offline to online, and in an online classroom, how a teacher reviews students' classroom homework has become an urgent problem to be solved.
Disclosure of Invention
The embodiment of the disclosure at least provides a data processing method, a data display method, a data processing device, a data display device, computer equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a data processing method, including:
acquiring and displaying an interactive file with buried points;
determining, in response to the first user side satisfying a buried point trigger condition, a target area in the interactive file corresponding to the buried point trigger component; the target area comprises an answer result display area;
and acquiring a target area image corresponding to the target area, and sending the target area image to a second user end for displaying.
In a possible implementation manner, the acquiring and displaying an interactive file provided with a buried point includes:
acquiring the interactive file;
and after receiving an interactive instruction sent by the second user terminal, displaying the interactive file based on an interactive content identifier carried in the interactive instruction.
In one possible embodiment, the determining, in response to the first user side satisfying the buried point trigger condition, a target area in the interaction file corresponding to the buried point trigger component includes:
and determining a preset target area corresponding to a target button in response to the target button in the interactive file being triggered.
In a possible embodiment, after the interactive file with the buried point is obtained and displayed, the method further comprises the following steps:
and responding to the editing operation aiming at the interactive file, and editing the interactive file.
In a possible implementation manner, after obtaining the target area image corresponding to the target area, before sending the target area image to the second user for display, the method further includes:
and marking the target area image based on the first identification information of the first user side and the second identification information of the target button.
In a possible embodiment, the method further comprises:
and after receiving an interaction stopping instruction sent by the second user side, stopping displaying the interaction file, and displaying a shared page sent by the second user side through a server.
In a second aspect, an embodiment of the present disclosure provides a data display method, including:
responding to a buried point setting instruction, adding buried points to the interactive file, and sending the interactive file added with the buried points to a first user terminal;
responding to target trigger operation, and sending an interaction instruction to the first user side to indicate the first user side to interact with the displayed interaction file;
and acquiring a target area image which is sent by the first user end and carries a response result, and displaying the target area image.
In a possible implementation manner, the display interface of the second user side also displays the identification information of the first user side;
the displaying the target area image comprises:
responding to a trigger operation for target identification information among the identification information of the first user side, displaying the target area image sent by the first user side corresponding to the target identification information; alternatively,
and displaying the target area image in a preset position area of the display interface.
In a possible embodiment, the presenting the target area image includes:
for any target area image, identifying the answer result carried by the target area image, and determining a grading result corresponding to the target area image based on the identified answer result;
and displaying the target area image and the grading result corresponding to the target area image.
In a third aspect, an embodiment of the present disclosure further provides a data processing apparatus, including:
the receiving module is used for acquiring and displaying the interactive file with the embedded points;
the determining module is used for determining, in response to the first user side satisfying the buried point trigger condition, a target area in the interactive file corresponding to the buried point trigger component; the target area comprises an answer result display area;
and the first acquisition module is used for acquiring a target area image corresponding to the target area and sending the target area image to a second user terminal for displaying.
In a possible implementation manner, the receiving module, when acquiring and displaying the interactive file with the embedded point, is configured to:
acquiring the interactive file;
and after receiving the interactive instruction sent by the second user terminal, displaying the interactive file based on the interactive content identification carried in the interactive instruction.
In a possible embodiment, the determining module, when determining the target area corresponding to the buried point trigger component in the interaction file in response to the first user side satisfying the buried point trigger condition, is configured to:
and determining a preset target area corresponding to a target button in response to the target button in the interactive file being triggered.
In a possible implementation manner, after acquiring and displaying the interaction file provided with the embedded point, the receiving module is further configured to:
and responding to the editing operation aiming at the interactive file, and editing the interactive file.
In a possible implementation manner, the first obtaining module, after obtaining the target area image corresponding to the target area, is further configured to, before sending the target area image to a second user for display:
and marking the target area image based on the first identification information of the first user terminal and the second identification information of the target button.
In a possible implementation, the receiving module is further configured to:
and after receiving an interaction stopping instruction sent by the second user side, stopping displaying the interaction file, and displaying a shared page sent by the second user side through a server.
In a fourth aspect, an embodiment of the present disclosure provides a data display apparatus, including:
the embedded point module is used for responding to an embedded point setting instruction, adding embedded points to the interactive file and sending the interactive file with the embedded points added to the first user terminal;
the interaction module is used for responding to target trigger operation and sending an interaction instruction to the first user terminal so as to indicate the first user terminal to interact with the displayed interaction file;
and the second acquisition module is used for acquiring the target area image which is sent by the first user end and carries the answer result and displaying the target area image.
In a possible implementation manner, the display interface of the second user side also displays the identification information of the first user side;
the second obtaining module, when displaying the target area image, is configured to:
responding to a trigger operation for target identification information among the identification information of the first user side, displaying the target area image sent by the first user side corresponding to the target identification information; alternatively,
and displaying the target area image in a preset position area of the display interface.
In a possible implementation manner, the second obtaining module, when presenting the target area image, is configured to:
for any target area image, recognizing the answer result carried by the target area image, and determining a grading result corresponding to the target area image based on the recognized answer result;
and displaying the target area image and the grading result corresponding to the target area image.
In a fifth aspect, an embodiment of the present disclosure further provides a computer device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of the first aspect, or any one of the possible implementations of the first aspect, or the second aspect, or any one of the possible implementations of the second aspect.
In a sixth aspect, this disclosed embodiment also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor performs the steps in the first aspect, or any one of the possible embodiments of the first aspect, or performs the steps in the second aspect, or any one of the possible embodiments of the second aspect.
According to the data processing method and apparatus, the data display method and apparatus, the computer device, and the storage medium described above, after an interactive file provided with buried points is acquired and displayed, a target area in the interactive file corresponding to the buried point trigger component is determined in response to the first user side satisfying the buried point trigger condition; a target area image corresponding to the target area is then acquired and sent to the second user side for display. In this way, the second user side can learn the answer result of the first user side in time and adjust the teaching in real time accordingly, so that the teaching quality can be improved.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required in the embodiments are briefly described below. The drawings are incorporated in and form a part of the specification; they illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its technical solutions. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive additional related drawings from them without inventive effort.
Fig. 1 shows a flow chart of a data processing method provided by an embodiment of the present disclosure;
FIG. 2 is a diagram illustrating a screen interface of a second user terminal provided by an embodiment of the present disclosure;
FIG. 3 is a diagram illustrating another second user-side screen interface provided by an embodiment of the present disclosure;
FIG. 4 is a flow chart illustrating a data presentation method provided by an embodiment of the present disclosure;
FIG. 5 is a block diagram illustrating an architecture of a data processing apparatus provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating an architecture of a data presentation apparatus provided in an embodiment of the present disclosure;
FIG. 7 shows a schematic structural diagram of a computer device 700 provided by an embodiment of the present disclosure;
fig. 8 shows a schematic structural diagram of another computer device 800 provided by the embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
At present, a teacher generally needs to review students' classroom homework during class in order to learn about their progress and adjust the teaching content in time.
Traditionally, a teacher invites students to share their classroom-homework answers orally, or has them write their answers on a blackboard. However, with the development of network technology, more and more classes have moved from offline to online, and in an online classroom, how a teacher reviews students' classroom homework has become an urgent problem to be solved.
In the related art, in an online-classroom scenario, students generally send their classroom-homework answers to the teacher individually, and the teacher then shares a student's answers with the other students by manually capturing a screenshot and sharing the captured picture.
Based on this, the present disclosure provides a data processing method, a data display method, a data processing apparatus, a data display apparatus, a computer device, and a storage medium. After an interactive file provided with buried points is acquired and displayed, a target area in the interactive file corresponding to the buried point trigger component can be determined in response to the first user side satisfying the buried point trigger condition; a target area image corresponding to the target area is then acquired and sent to the second user side for display. In this way, the second user side can learn the answer result of the first user side in time and adjust the teaching in real time accordingly, so that the teaching quality can be improved. The drawbacks described above were identified by the inventors through practical and careful study; therefore, both the discovery of these problems and the solutions proposed below should be regarded as the inventors' contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined or explained in subsequent figures.
To facilitate understanding of the present embodiment, a data processing method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the data processing method provided by the embodiments of the present disclosure is generally a terminal device, which may be a smartphone, a tablet computer, a smart television, a personal computer, or the like.
Referring to fig. 1, which is a flowchart of a data processing method provided by an embodiment of the present disclosure, the method is applied to a first user side and includes steps 101 to 103:
step 101, obtaining and displaying an interactive file with buried points.
Step 102, in response to the first user side meeting a buried point triggering condition, determining a target area corresponding to the buried point triggering component in the interactive file; the target area is used for displaying the answer result.
And 103, acquiring a target area image corresponding to the target area, and sending the target area image to a second user end for displaying.
The following is a detailed description of the above steps 101 to 103.
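As an illustration only, steps 101 to 103 on the first user side might be sketched as follows. All class and function names here are assumptions for illustration; the disclosure does not prescribe any particular implementation.

```python
# Hypothetical sketch of steps 101-103 on the first user side (student client).
# BuriedPoint, InteractiveFile, and the function names are illustrative, not from the patent.
from dataclasses import dataclass


@dataclass
class BuriedPoint:
    button_id: str        # identification of the target button set with a buried point
    target_region: tuple  # (left, top, right, bottom) of the answer result display area


@dataclass
class InteractiveFile:
    pages: list
    buried_points: list   # BuriedPoint instances preset by the second user side


def on_buried_point_triggered(file: InteractiveFile, button_id: str):
    """Step 102: map the triggered buried point to its preset target region."""
    for point in file.buried_points:
        if point.button_id == button_id:
            return point.target_region
    return None


def capture_and_send(region, screenshot, send):
    """Step 103: crop the target-region image from the screen and send it for display."""
    left, top, right, bottom = region
    image = [row[left:right] for row in screenshot[top:bottom]]
    send(image)  # e.g. upload to the server for the second user side
    return image
```

The `screenshot` here is modeled as a simple grid of pixels; a real client would crop an actual screen capture of the first user side's interface.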
For step 101,
The first user side may be a student side in an online classroom, and the second user side may be a teacher side in the online classroom. Through the student side, a student can interact both with the teacher side and with the interactive courseware. The student side and the teacher side can transmit data through a server.
The interactive file is a file with an interactive function that is produced by the second user side and sent to the first user side. It may include information resources such as text, pictures, and videos, and may be, for example, interactive courseware, an interactive video, or an interactive document.
After the second user end finishes making the interactive file, the second user end can upload the interactive file to a server, and when the first user end enters an online classroom, the interactive file can be automatically obtained from the server.
In a possible implementation, after acquiring the interactive file, the first user side may not display it immediately; after receiving an interaction instruction sent by the second user side, it displays the interactive file based on the interactive content identifier carried in the interaction instruction.
The interactive instruction is used for indicating that the first user side can interact with the interactive file, and before the interactive instruction sent by the second user side is received, the content displayed on the screen interface of the first user side can be the content of the screen page shared by the second user side.
Specifically, in an online-classroom scenario, both the first user side and the second user side store the interactive file. Before sending the interaction instruction, the second user side can display the interactive file and interact with it, and can share, through the server, the first region interface that displays the interactive file in its screen interface with the first user side. The first user side can thus view the interactive file displayed by the second user side and the second user side's interaction with it.
The interaction between the second user side and the interactive file can exemplarily comprise video playing, question answering, page turning, doodling and the like.
In a possible implementation manner, after receiving an interaction stopping instruction sent by the second user, the displaying of the interaction file may be stopped, and a shared page sent by the second user through a server is displayed, where the shared page is a screen page of the second user.
The interactive content identifier may include, for example, the page number of courseware, the time progress of a video, a text paragraph, or a question number. Displaying the interactive file based on the interactive content identifier carried in the interaction instruction may mean determining the display progress of the interactive file from the identifier and displaying the file at that progress.
In a possible implementation, the second user side may automatically generate the interaction instruction after detecting that the interact button is triggered; the interactive content identifier carried in the instruction may be obtained by the second user side by recognizing its own screen interface after the interact button is triggered.
Illustratively, the content displayed in the preset position area may be identified, and the identified content may be used as the interactive content identifier, where the preset position area may include, for example, an area for displaying the number of pages, an area for displaying the time progress of the video, an area for displaying the title number, and the like.
For example, the screen interface of the second user end may be as shown in fig. 2, where the screen interface includes an interactive button, and after the interactive button is triggered, the number of pages "5" displayed in the preset position region may be identified, and then an interactive instruction carrying the number of pages "5" is generated, where the number of pages is an interactive content identifier, and after the interactive instruction is received by the first user end, the content of the page 5 may be displayed.
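As a hedged sketch of this exchange, the second user side might wrap the recognized page number into an interaction instruction that the first user side then applies; the message shape and function names below are assumptions:

```python
# Illustrative sketch (assumed names/shapes) of an interaction instruction carrying
# an interactive content identifier, as in the page-"5" example above.
def build_interaction_instruction(recognized_page: str) -> dict:
    """Second user side: after the interact button is triggered, wrap the page
    number recognized from its own screen interface as the content identifier."""
    return {"type": "interact", "content_id": {"page": int(recognized_page)}}


def apply_interaction_instruction(instruction: dict, current_page: int) -> int:
    """First user side: jump the displayed interactive file to the indicated page;
    ignore instructions that are not interaction instructions."""
    if instruction.get("type") != "interact":
        return current_page
    return instruction["content_id"]["page"]
```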
With respect to step 102,
In a possible implementation, determining the target area in the interactive file corresponding to the buried point trigger component in response to the first user side satisfying the buried point trigger condition may, for example, be determining a preset target area corresponding to a target button in the interactive file in response to that target button being triggered. The target button may be a button provided with a buried point, and the trigger condition may be clicking that button.
The area location information of the target area may be previously set by the second user terminal. For example, when the target area is a circular shape, the area location information of the target area may include coordinates of a center of the circle and a radius; when the target area is rectangular, the area position information of the target area may include coordinates of four vertices.
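Both encodings of area location information mentioned above can be reduced to an axis-aligned bounding box before cropping the screenshot; a minimal sketch with assumed function names:

```python
# Reduce the two region encodings described above to a (left, top, right, bottom)
# bounding box suitable for cropping. Names are illustrative assumptions.
def circle_bbox(cx: float, cy: float, r: float):
    """Circular target area stored as circle-center coordinates plus radius."""
    return (cx - r, cy - r, cx + r, cy + r)


def rect_bbox(vertices):
    """Rectangular target area stored as the coordinates of its four vertices."""
    xs = [x for x, _ in vertices]
    ys = [y for _, y in vertices]
    return (min(xs), min(ys), max(xs), max(ys))
```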
The target area may be the answer result display area itself, or a larger area containing it; the answer result display area is where the answer is entered. For example, the screen interface of the second user side may be as shown in fig. 3, where the target area includes question 2 on page 5 of the PPT and, below it, the answer result display area in which the student fills out the answer.
After receiving the interaction instruction and displaying the interactive file, the first user side can respond to editing operations on the interactive file; editing may include sliding, clicking, long-pressing, dragging, typing, and the like. For example, a student may long-press and slide with a mouse to perform a vertical calculation, or input an answer with a keyboard.
It should be noted that the editing operation for the interactive file described herein is generally an editing operation performed in the answer result display area.
In a possible implementation, after it is detected that the first user side satisfies the buried point trigger condition, a prompt message may be displayed on the first user side so that the user can confirm the answer result. The prompting method includes, but is not limited to, popping up a prompt box with prompt text near the target area. For example, after the user clicks the target button, a text box reading "answer result submitted" may pop up 1 cm above the target area.
For step 103,
In a possible implementation, the purpose of acquiring the target area image and sending it to the second user side for display is that the user (teacher) can check the answer result shown in the image through the second user side. If the user (student) has not answered in the answer result display area on the first user side, the target area image the teacher sees contains no answer result, and transmitting the image is meaningless. Therefore, before acquiring the target area image corresponding to the target area, it can be detected whether the student has answered in the answer result display area.
Whether a student has answered in the answer result display area can be detected by detecting whether the student has edited in that area. If it is detected that the student has answered, the target area image corresponding to the target area can be acquired; if not, a null instruction can be sent to the second user side to indicate that the student has not answered on the first user side.
Because transmitting the target area image consumes more transmission resources than transmitting the null instruction, this approach reduces the waste of transmission resources and avoids meaningless data transmission.
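The transmission-saving check above can be sketched as follows; the message shapes and function name are illustrative assumptions, not defined by the disclosure:

```python
# Only capture and transmit the target area image when the answer result display
# area was actually edited; otherwise send a lightweight null instruction.
def build_upload(answer_area_edited: bool, capture):
    if not answer_area_edited:
        # Null instruction: tells the second user side the student has not answered.
        return {"type": "null"}
    # Capture is deferred until we know the image is worth transmitting.
    return {"type": "image", "data": capture()}
```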
To acquire the target area image corresponding to the target area, the target area in the screen interface of the first user side may first be determined according to the area location information of the target area, and the image corresponding to the determined area may then be captured.
In a possible implementation, after the target area image is acquired, it may be marked based on first identification information of the first user side and second identification information of the target button, so that when the second user side displays the target area image, it can be shown in correspondence with the identification of each first user side displayed in the second user side's screen interface. The first identification information may include, for example, the serial number of the first user side, a user name, and the like. The second identification information may be an identification of the area where the target button is located, for example the number of the button.
In a possible implementation, if the user repeatedly triggers the target button, the first user side may capture and upload the target area image to the server multiple times. Since the marks of the captured pictures are identical, the server may overwrite the earlier upload and keep only the most recent picture, thereby saving server storage space and preventing the second user side from receiving duplicate pictures.
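The overwrite behaviour described here might look like the following on the server side; keying by (first-user identification, button identification) mirrors the mark described above, but the concrete store is an assumption:

```python
# Sketch of server-side storage in which a repeated trigger of the same target
# button replaces the earlier upload instead of accumulating duplicates.
class ImageStore:
    def __init__(self):
        self._images = {}

    def upload(self, user_id: str, button_id: str, image: bytes):
        # Same mark -> later upload overwrites the earlier picture.
        self._images[(user_id, button_id)] = image

    def latest(self, user_id: str, button_id: str):
        return self._images.get((user_id, button_id))

    def count(self) -> int:
        return len(self._images)
```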
According to the data processing method provided by the embodiments of the present disclosure, after an interactive file provided with buried points is acquired and displayed, a target area in the interactive file corresponding to the buried point trigger component is determined in response to the first user side satisfying the buried point trigger condition; a target area image corresponding to the target area is then acquired and sent to the second user side for display.
Based on the same concept, the embodiment of the present disclosure further provides a data display method, which is shown in fig. 4 and is a flowchart of the data display method provided by the embodiment of the present disclosure, and the method is applied to a second user end, and includes the following steps:
step 401, responding to a buried point setting instruction, adding buried points to an interactive file, and sending the interactive file with the buried points added to a first user end;
step 402, responding to a target trigger operation, and sending an interaction instruction to the first user end to indicate the first user end to interact with the displayed interaction file;
and 403, acquiring a target area image which is sent by the first user and carries the response result, and displaying the target area image.
For step 401,
Adding the buried points to the interactive file may include adding the buried points to the target button and setting a target region corresponding to the target button.
Here, sending the interaction file with the embedded points added to the first user side may be understood as uploading the interaction file with the embedded points added to a server corresponding to an online classroom, and when any first user side enters the online classroom, the interaction file with the embedded points added to the server may be obtained from the server.
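A minimal sketch of step 401 under the assumption that the interactive file and the buried point are plain records (the field names are illustrative, not from the disclosure):

```python
# Attach a buried point -- a target button plus its corresponding target region --
# to the interactive file before uploading it to the classroom server.
def add_buried_point(file: dict, button_id: str, target_region: tuple) -> dict:
    file.setdefault("buried_points", []).append(
        {"button_id": button_id, "target_region": target_region}
    )
    return file
```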
For step 402,
In a possible implementation manner, responding to the target trigger operation may be implemented as responding to a trigger operation on an interactive button; when the interaction instruction is sent to the first user side, an interaction instruction carrying an interactive content identifier may be sent. For the specific method of determining the interactive content identifier, reference may be made to the description in the foregoing embodiments, which is not repeated here.
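The message exchange in step 402 can be sketched as follows: the second user side sends an instruction carrying a content identifier, and the first user side uses that identifier to select which content of the interactive file to display. The message shapes below are hypothetical, for illustration only.

```python
def make_interaction_instruction(content_id: str) -> dict:
    """Second user side: build an interaction instruction carrying a content identifier."""
    return {"type": "interact", "content_id": content_id}


def handle_instruction(interactive_file: dict, instruction: dict) -> dict:
    """First user side: look up the content to display by its identifier."""
    return interactive_file["contents"][instruction["content_id"]]


doc = {"contents": {"q1": {"title": "Question 1"}, "q2": {"title": "Question 2"}}}
msg = make_interaction_instruction("q2")
assert handle_instruction(doc, msg)["title"] == "Question 2"
```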
For step 403,
In a possible implementation manner, when the target area images are displayed, for any target area image, the response result carried by the target area image may first be recognized, and a reviewing result corresponding to the target area image is determined based on the recognized response result; the target area image and its corresponding reviewing result are then displayed.
Specifically, the response result carried by the target area image may be determined by a server. After receiving the target area image uploaded by the first user side, the server may recognize the target area image to determine the answer information in it, and then review that answer information against pre-stored answer information to determine the reviewing result corresponding to the target area image.
For example, the reviewing result may include a score, question analysis, error analysis, and the like.
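The automatic reviewing described above amounts to comparing the recognized answers with pre-stored answer information and emitting a score plus an error analysis. Below is a hedged sketch of that comparison step only; the recognition (OCR) of the image itself is out of scope, and the data shapes are assumptions.

```python
def review(recognized: dict, answer_key: dict) -> dict:
    """Compare recognized answers with a pre-stored answer key (illustrative)."""
    result = {"score": 0, "errors": []}
    for question, correct in answer_key.items():
        given = recognized.get(question)
        if given == correct:
            result["score"] += 1
        else:
            # Record the mismatch as a simple error analysis entry.
            result["errors"].append(
                {"question": question, "given": given, "expected": correct}
            )
    return result


outcome = review({"q1": "A", "q2": "C"}, {"q1": "A", "q2": "B"})
assert outcome["score"] == 1
assert outcome["errors"][0]["question"] == "q2"
```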
In a possible implementation, the target area image may also be reviewed manually. Specifically, the user inputs the reviewing result into the second user side; the input mode includes, but is not limited to, keyboard input and mouse input.
In a possible implementation manner, the display interface of the second user side also displays the identification information of the first user side. The identification information may include a user side serial number, a user name, a user image, and the like.
When the target area image is displayed, in response to a trigger operation on target identification information among the identification information of the first user sides, the target area image sent by the first user side corresponding to the target identification information may be displayed. For example, by clicking a user name, the target area image uploaded by the corresponding first user side may be displayed in the display interface, which facilitates comparison.
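Selecting the images of one first user side by its identification information is, in essence, a filter over the received uploads. A minimal sketch, with hypothetical field names:

```python
def images_for_user(images: list, user_id: str) -> list:
    """Return the target area images uploaded by one first user side (illustrative)."""
    return [img for img in images if img["user_id"] == user_id]


uploads = [
    {"user_id": "u1", "picture": b"p1"},
    {"user_id": "u2", "picture": b"p2"},
    {"user_id": "u1", "picture": b"p3"},
]
# Clicking the identification info of user "u1" shows only u1's uploads.
assert [i["picture"] for i in images_for_user(uploads, "u1")] == [b"p1", b"p3"]
```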
In another possible implementation manner, after the second user side receives a target area image, the target area image is displayed in a preset position area of the display interface. After the target area image is displayed, in response to a trigger operation on any target area image, the triggered target area image may be enlarged or reduced; the specific enlarged or reduced size may be set according to actual conditions.
According to the data display method provided by the embodiment of the present disclosure, after the interactive file with the buried point is acquired and displayed, the first user side, in response to satisfying the buried point trigger condition, determines a target area corresponding to the buried point trigger component in the interactive file. In this way, the second user side can learn the response result of the first user side in time and adjust the teaching in real time according to that result, thereby improving teaching quality.
It will be understood by those skilled in the art that, in the above methods of the specific embodiments, the order in which the steps are written does not imply a strict execution order or constitute any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible inherent logic.
Based on the same inventive concept, a data processing apparatus corresponding to the data processing method is also provided in the embodiments of the present disclosure, and because the principle of the apparatus in the embodiments of the present disclosure for solving the problem is similar to the data processing method described above in the embodiments of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 5, which is a schematic diagram of an architecture of a data processing apparatus provided in an embodiment of the present disclosure, the apparatus includes a receiving module 501, a determining module 502, and a first obtaining module 503; wherein:
the receiving module 501 is configured to acquire and display an interactive file with a buried point;
a determining module 502, configured to determine, in response to that a first user side meets a buried point trigger condition, a target area corresponding to the buried point trigger component in the interaction file; the target area comprises a response result display area;
the first obtaining module 503 is configured to obtain a target area image corresponding to the target area, and send the target area image to a second user side for display.
In a possible implementation manner, the receiving module 501, when acquiring and displaying an interactive file with a buried point, is configured to:
acquiring the interactive file;
and after receiving the interactive instruction sent by the second user terminal, displaying the interactive file based on the interactive content identification carried in the interactive instruction.
In a possible implementation, the determining module 502, when determining, in response to the first user side satisfying the buried-point triggering condition, a target area corresponding to the buried-point triggering component in the interactive file, is configured to:
and determining a preset target area corresponding to a target button in response to the target button in the interactive file being triggered.
In a possible implementation, after acquiring and displaying the interaction file with the embedded point, the receiving module 501 is further configured to:
and responding to the editing operation aiming at the interactive file, and editing the interactive file.
In a possible implementation manner, after obtaining the target area image corresponding to the target area and before sending the target area image to the second user side for display, the first obtaining module 503 is further configured to:
and marking the target area image based on the first identification information of the first user side and the second identification information of the target button.
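The marking described here, based on the first identification information of the first user side and the second identification information of the target button, can be sketched as composing a mark from the two identifiers. The composition rule below is an assumption for illustration; any scheme that yields the same mark for the same user and button would behave the same way.

```python
def mark_image(first_id: str, second_id: str) -> str:
    """Compose the mark for a target area image from user and button identifiers."""
    return f"{first_id}-{second_id}"


assert mark_image("user42", "button7") == "user42-button7"
# Repeated triggers of the same button by the same user yield the same mark,
# which is what allows the server to overwrite earlier uploads.
assert mark_image("user42", "button7") == mark_image("user42", "button7")
```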
In a possible implementation, the receiving module 501 is further configured to:
and after receiving an interaction stopping instruction sent by the second user side, stopping displaying the interaction file, and displaying a shared page sent by the second user side through a server.
Referring to fig. 6, which is a schematic diagram of an architecture of a data display apparatus provided in an embodiment of the present disclosure, the apparatus includes a buried point module 601, an interaction module 602, and a second obtaining module 603; wherein:
the embedded point module 601 is configured to respond to an embedded point setting instruction, add an embedded point to an interactive file, and send the interactive file after the embedded point is added to a first user end;
an interaction module 602, configured to send an interaction instruction to the first user end in response to a target trigger operation, so as to instruct the first user end to interact with the displayed interaction file;
a second obtaining module 603, configured to obtain a target area image that is sent by the first user side and carries a response result, and display the target area image.
In a possible implementation manner, the display interface of the second user side also displays the identification information of the first user side;
the second obtaining module 603, when displaying the target area image, is configured to:
in response to a trigger operation on target identification information among the identification information of the first user side, displaying a target area image sent by the first user side corresponding to the target identification information; or,
and displaying the target area image in a preset position area of the display interface.
In a possible implementation, the second obtaining module 603, when presenting the target area image, is configured to:
for any target area image, recognizing the response result carried by the target area image, and determining a reviewing result corresponding to the target area image based on the recognized response result;
and displaying the target area image and the reviewing result corresponding to the target area image.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Based on the same technical concept, an embodiment of the present disclosure further provides a computer device. Referring to fig. 7, a schematic structural diagram of a computer device 700 provided in an embodiment of the present disclosure includes a processor 701, a memory 702, and a bus 703. The memory 702 is used for storing execution instructions and includes an internal memory 7021 and an external memory 7022; the internal memory 7021 is used to temporarily store operation data in the processor 701 and data exchanged with the external memory 7022 such as a hard disk. The processor 701 exchanges data with the external memory 7022 through the internal memory 7021. When the computer device 700 runs, the processor 701 communicates with the memory 702 through the bus 703, so that the processor 701 executes the following instructions:
acquiring and displaying an interactive file with buried points;
determining, in response to the first user side satisfying the buried point trigger condition, a target area corresponding to the buried point trigger component in the interactive file; the target area comprises a response result display area;
and acquiring a target area image corresponding to the target area, and sending the target area image to a second user end for displaying.
In a possible embodiment, the instructions executed by the processor 701 for obtaining and displaying the interaction file with the embedded point include:
acquiring the interactive file;
and after receiving the interactive instruction sent by the second user terminal, displaying the interactive file based on the interactive content identification carried in the interactive instruction.
In a possible embodiment, the determining, in the instructions executed by the processor 701, a target area corresponding to the buried point trigger component in the interaction file in response to the first user side satisfying the buried point trigger condition includes:
and determining a preset target area corresponding to a target button in response to the target button in the interactive file being triggered.
In a possible implementation, the processor 701 executes instructions, and after acquiring and displaying the interaction file with the embedded point, the method further includes:
and responding to the editing operation aiming at the interactive file, and editing the interactive file.
In a possible embodiment, in the instructions executed by the processor 701, after obtaining the target area image corresponding to the target area, before sending the target area image to the second user side for displaying, the method further includes:
and marking the target area image based on the first identification information of the first user side and the second identification information of the target button.
In a possible implementation, in the instructions executed by the processor 701, the method further includes: and after receiving an interaction stopping instruction sent by the second user side, stopping displaying the interaction file, and displaying a shared page sent by the second user side through a server.
Based on the same technical concept, an embodiment of the present disclosure further provides another computer device. Referring to fig. 8, a schematic structural diagram of a computer device 800 provided in an embodiment of the present disclosure includes a processor 801, a memory 802, and a bus 803. The memory 802 is used for storing execution instructions and includes an internal memory 8021 and an external memory 8022; the internal memory 8021 is used to temporarily store operation data in the processor 801 and data exchanged with the external memory 8022 such as a hard disk. The processor 801 exchanges data with the external memory 8022 through the internal memory 8021. When the computer device 800 runs, the processor 801 communicates with the memory 802 through the bus 803, so that the processor 801 executes the following instructions:
responding to a buried point setting instruction, adding buried points to the interactive file, and sending the interactive file with the buried points added to the first user terminal;
responding to target trigger operation, and sending an interaction instruction to the first user end to indicate the first user end to interact with the displayed interaction file;
and acquiring a target area image which is sent by the first user end and carries a response result, and displaying the target area image.
In a possible implementation manner, in the instructions executed by the processor 801, the display interface of the second user side also displays the identification information of the first user side;
the displaying the target area image comprises:
in response to a trigger operation on target identification information among the identification information of the first user side, displaying a target area image sent by the first user side corresponding to the target identification information; or,
and displaying the target area image in a preset position area of the display interface.
In a possible implementation, the instructions executed by the processor 801 to present the target area image include:
for any target area image, recognizing the response result carried by the target area image, and determining a reviewing result corresponding to the target area image based on the recognized response result; and displaying the target area image and the reviewing result corresponding to the target area image.
The embodiments of the present disclosure also provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the data processing method and the data presentation method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
An embodiment of the present disclosure further provides a computer program product, where the computer program product carries a program code, and instructions included in the program code may be used to execute steps of the data processing method and the data displaying method in the foregoing method embodiments, which may be specifically referred to in the foregoing method embodiments and are not described herein again.
The computer program product may be implemented by hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium; in another alternative embodiment, the computer program product is embodied in a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be through some communication interfaces, indirect coupling or communication connection between devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used to illustrate the technical solutions of the present disclosure, but not to limit the technical solutions, and the scope of the present disclosure is not limited thereto, and although the present disclosure is described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that: any person skilled in the art can modify or easily conceive of the technical solutions described in the foregoing embodiments or equivalent technical features thereof within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (13)

1. A data processing method is applied to a first user end and comprises the following steps:
acquiring and displaying an interactive file with buried points;
determining, in response to the first user side satisfying the buried point trigger condition, a target area corresponding to the buried point trigger component in the interactive file; the target area comprises a response result display area;
and acquiring a target area image corresponding to the target area, and sending the target area image to a second user end for displaying.
2. The method according to claim 1, wherein the obtaining and displaying of the interaction file provided with the embedded point comprises:
acquiring the interactive file;
and after receiving an interactive instruction sent by the second user terminal, displaying the interactive file based on an interactive content identifier carried in the interactive instruction.
3. The method of claim 1, wherein the determining a target region in the interactive file corresponding to the buried-point trigger component in response to the first user terminal satisfying the buried-point trigger condition comprises:
and determining a preset target area corresponding to a target button in response to the target button in the interactive file being triggered.
4. The method of claim 1, wherein after obtaining and displaying the interactive file with the buried points, the method further comprises:
and responding to the editing operation aiming at the interactive file, and editing the interactive file.
5. The method according to claim 3, wherein after acquiring the target area image corresponding to the target area, before sending the target area image to a second user side for display, the method further comprises:
and marking the target area image based on the first identification information of the first user side and the second identification information of the target button.
6. The method of claim 2, further comprising:
and after receiving an interaction stopping instruction sent by the second user side, stopping displaying the interaction file, and displaying a shared page sent by the second user side through a server.
7. A data display method is applied to a second user end and comprises the following steps:
responding to a buried point setting instruction, adding buried points to the interactive file, and sending the interactive file added with the buried points to a first user terminal;
responding to target trigger operation, and sending an interaction instruction to the first user side to indicate the first user side to interact with the displayed interaction file;
and acquiring a target area image which is sent by the first user end and carries a response result, and displaying the target area image.
8. The method according to claim 7, wherein the display interface of the second user terminal further displays the identification information of the first user terminal;
the displaying the target area image comprises:
responding to a trigger operation on target identification information among the identification information of the first user side, and displaying a target area image sent by the first user side corresponding to the target identification information; or,
and displaying the target area image in a preset position area of the display interface.
9. The method of claim 7, wherein said presenting the target area image comprises:
for any target area image, recognizing the response result carried by the target area image, and determining a reviewing result corresponding to the target area image based on the recognized response result;
and displaying the target area image and the reviewing result corresponding to the target area image.
10. A data processing apparatus, characterized by comprising:
the receiving module is used for acquiring and displaying the interactive file with the embedded points;
the determining module is used for determining, in response to the first user side satisfying the buried point trigger condition, a target area corresponding to the buried point trigger component in the interactive file; the target area comprises a response result display area;
and the first acquisition module is used for acquiring a target area image corresponding to the target area and sending the target area image to a second user terminal for displaying.
11. A data presentation device, comprising:
the embedded point module is used for responding to an embedded point setting instruction, adding embedded points to the interactive file and sending the interactive file after the embedded points are added to the first user terminal;
the interaction module is used for responding to target trigger operation and sending an interaction instruction to the first user end so as to indicate the first user end to interact with the displayed interaction file;
and the second acquisition module is used for acquiring the target area image which is sent by the first user end and carries the answering result and displaying the target area image.
12. A computer device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when a computer device is running, the machine-readable instructions when executed by the processor performing the data processing method of any one of claims 1 to 6 or the steps of the data presentation method of any one of claims 7 to 9.
13. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the data processing method of one of the claims 1 to 6 or the steps of the data presentation method of one of the claims 7 to 9.
CN202110869991.7A 2021-07-30 2021-07-30 Data processing method, data display method, data processing device, data display device, computer equipment and storage medium Pending CN115686281A (en)

Priority Applications (1)

Application Number: CN202110869991.7A | Priority Date: 2021-07-30 | Filing Date: 2021-07-30 | Title: Data processing method, data display method, data processing device, data display device, computer equipment and storage medium


Publications (1)

Publication Number: CN115686281A | Publication Date: 2023-02-03

Family ID: 85057888


Country Status (1): CN CN115686281A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination