CN112488890A - Interactive learning auxiliary method based on remote display


Info

Publication number: CN112488890A
Application number: CN202110158747.XA
Authority: CN (China)
Prior art keywords: display, display terminal, identification, equipment, user
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN112488890B (granted publication)
Inventor: 叶玲 (Ye Ling)
Current assignee: Nanjing Xiongda Future Window Intelligent Technology Co ltd
Original assignee: Nanjing Xiongda Future Window Intelligent Technology Co ltd
Application filed by Nanjing Xiongda Future Window Intelligent Technology Co ltd
Priority to CN202110158747.XA
Publication of CN112488890A; application granted; publication of CN112488890B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G06Q50/205 Education administration or guidance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/3003 Monitoring arrangements specially adapted to the computing system or computing system component being monitored
    • G06F11/3006 Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system is distributed, e.g. networked systems, clusters, multiprocessor systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Quality & Reliability (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The invention discloses an interactive learning auxiliary method based on remote display, belonging to the field of interactive learning equipment. In the method, a first user controls a remotely displayed display terminal device with a camera device and a control device in order to carry out learning activities; a second user monitors the content shown on the display terminal device with a monitoring device in order to supervise the first user's learning activities; and/or a third user, after receiving information transmitted by the display terminal device on a remote device, feeds solution information back to the display terminal device with the remote device in order to tutor the first user's learning activities. Because the display terminal device is placed in an exposed position, the second user can easily observe the displayed content, which keeps the first user alert and encourages proper use of the display terminal device at all times, thereby improving the first user's learning efficiency.

Description

Interactive learning auxiliary method based on remote display
Technical Field
The invention relates to the technical field of interactive learning equipment, in particular to an interactive learning auxiliary system and method based on remote display.
Background
Influenced by factors such as health, commuting efficiency, and safety, education and training will gradually shift to online formats, particularly in the field of home education. In the prior art, remote home education usually relies on handheld terminal devices such as mobile phones and tablet computers, or desktop terminal devices such as computer monitors and smart desk lamps. Both kinds of device are convenient, but both are close-range display devices.
For example, Chinese patent application No. 2017109693907 discloses an exercise-search method based on a handheld terminal device: on receiving a search instruction, the device starts a camera to capture a picture, analyzes the picture, and searches for the exercises corresponding to it; the exercises are then classified and stored according to rules preset by the user and displayed or played back. In this process the intelligent terminal starts the camera with a single key press in response to the user's search instruction and then retrieves the matching exercises from the captured picture. This is a human-computer interactive learning method; in essence, it recognizes the problem to be solved, compares it with problems in a database, and returns the closest matches for the user's reference.
However, minors have limited self-control, and when close-range display devices such as handheld and desktop terminals are entirely in a minor's hands, parents find it difficult to control what the devices are actually used for.
On the other hand, installing the terminal device far away from the minor brings inconvenience of its own. In some learning scenarios the two parties in remote home education need to interact: when working on a particular problem, the minor needs to show the problem to the teacher, the teacher presents the solution idea and process to the minor, and the minor notes down the idea and completes the answer. Desktop and handheld terminal devices make this process convenient, but if the terminal device is far away from the minor, how to strengthen the interaction between the minor and the terminal device becomes an urgent problem.
Disclosure of Invention
1. Technical problem to be solved by the invention
The invention aims to overcome the low learning efficiency that occurs in the prior art when a user studies with a close-range display terminal device. It provides an interactive learning auxiliary method based on remote display in which the display terminal device is placed far away from the user so as to improve the user's learning efficiency.
2. Technical scheme
In the interactive learning auxiliary method based on remote display of the invention,
a first user controls a remotely displayed display terminal device with a camera device and a control device in order to carry out learning activities; and,
a second user monitors the content shown on the display terminal device with a monitoring device in order to supervise the first user's learning activities; and/or,
a third user, after receiving information transmitted by the display terminal device on a remote device, feeds solution information back to the display terminal device with the remote device in order to tutor the first user's learning activities.
Further, the second user tutors the first user's learning activities using the touch function of the display terminal device.
Further, the first user's control of the display terminal device with the camera device and the control device to carry out learning activities specifically comprises the following steps:
step one, the display terminal device receives and displays a homework image transmitted by the camera device, and identification frames are formed on the homework image;
step two, the display terminal device receives operation instructions from the control device and adjusts the size and position of an identification frame to determine a target area;
step three, the display terminal device recognizes the text information in the target area and displays it in a first area of a display card; at the same time, the display terminal device transmits the text information and the image of the target area to the remote device;
step four, the display terminal device receives the solution information fed back by the remote device and synchronously displays it in a second area of the display card;
step five, the display terminal device receives the answer information transmitted by the control device and displays it in a third area of the display card.
Further, in step two, an identification frame has two identification styles: it is shown in a first identification style when it is formed, and in a second identification style once the display terminal device receives an operation instruction from the control device;
while the identification frame is shown in the first identification style, if no operation instruction from the control device is received within a time t1, the target area is determined;
while the identification frame is shown in the second identification style, if no operation instruction from the control device is received within a time t2, the identification frame reverts to the first identification style.
Further, when there are at least two target areas, all target areas are marked, forming several identification frames; step two then specifically comprises:
step 2.1, show all identification frames in a third identification style;
step 2.2, select one of the identification frames shown in the third identification style and show it in the first identification style;
step 2.3, detect whether the display terminal device receives an operation instruction from the control device; if no operation instruction is received within time t1, determine the target area and execute step 2.5; if an operation instruction is received within time t1, show the identification frame selected in step 2.2 in the second identification style;
step 2.4, if no operation instruction from the control device is received within time t2, show the identification frame selected in step 2.2 in the first identification style and execute step 2.3;
step 2.5, if any identification frame is still shown in the third identification style, execute step 2.2; if all identification frames are shown in the first identification style, step two ends.
Further, in step 2.3, if the display terminal device receives a delete instruction from the control device within time t1, the identification frame selected in step 2.2 is deleted and step 2.5 is executed.
Further, in step three, several display cards are created, each divided into the first area, the second area and the third area, and the first area of each display card displays the text information of only one identification frame.
Further, in step four, the received solution information is identified and displayed in the second area of the display card that shows the text information to which the solution corresponds.
Further, only one of the several display cards is presented on the display terminal device at a time, and the display cards are switched onto the display terminal device in response to operation instructions from the control device; the display card presented on the display terminal device responds to the answer-information transmission instruction from the control device and displays the answer information in its third area.
Further, the display card presented on the display terminal device is deleted in response to a delete instruction from the control device, and another of the several display cards is presented on the display terminal device.
3. Advantageous effects
Compared with the prior art, the technical solution provided by the invention has the following beneficial effects:
(1) When the display terminal device is far away from the user, it can be a large-screen display terminal device, which overcomes the limited display area of small-screen handheld and desktop terminal devices. Moreover, long-term use of close-range handheld and desktop terminal devices can damage the user's eyesight, and maintaining a desk-bound posture for long periods is bad for the user's health, especially for adolescents who are still growing; placing the display terminal device far away from the user therefore also helps protect the eyes.
(2) When the display terminal device is far away from the first user, it is in an exposed position and the second user can easily observe the displayed content. This keeps the first user alert and encourages proper use of the display terminal device at all times, thereby improving the first user's learning efficiency.
(3) In the method of the invention, the paper is photographed and the image displayed on the display terminal device, and the position and size of an identification frame are adjusted to locate the target area in the homework image precisely; the text information in the target area is then recognized and transmitted to the remote device, so that the questions on the paper are presented clearly and accurately to the user in front of the remote device. In addition, the solution information fed back by the remote device is displayed on the display terminal device, which effectively improves the efficiency of both parties taking part in the learning activity.
(4) When there are two or more target areas, several identification frames can be formed on the homework image; their positions and sizes are adjusted in turn through operation instructions sent by the control device to determine the target areas precisely, and the text information in each target area is recognized and sent to the remote device, further improving the efficiency of both parties taking part in the learning activity.
Drawings
FIG. 1 is a schematic diagram of the system of the present invention;
FIG. 2 is a schematic diagram of the steps of the method of the present invention;
FIG. 3 is a logic diagram for adjusting several identification frames in the present invention;
FIG. 4 is a schematic diagram of the process of determining target areas in a homework image marked with several identification frames according to the present invention;
FIG. 5 is a schematic diagram of the process of switching which of several display cards is presented on the display screen according to the present invention.
Detailed Description
For a further understanding of the invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings and examples.
The structures, proportions, and sizes shown in the drawings are provided only to illustrate the content disclosed in the specification for those skilled in the art; they do not limit the conditions under which the invention can be implemented. Modifications of structure, changes of proportion, or adjustments of size that do not affect the effects and objectives achievable by the invention still fall within the scope of the technical content disclosed herein. In addition, the terms "upper", "lower", "left", "right" and "middle" used in this specification are for clarity of description only and do not limit the scope of the invention; changes or adjustments of these relative relationships, without substantive technical changes, are likewise considered within the scope of the invention.
In the various embodiments, the display terminal device may be any large display device that can be placed far away from the user, such as an LCD, LED, or OLED display screen; it may also be a projection device, in which case the projected, visible display surface is located far from the user. The display terminal device integrates a computer module, so the user can run application programs on it and use the application management features of an operating system, which may be Windows, Android, iOS, or any other operating system capable of human-computer interaction. A key program and an application management program are installed in the storage module of the display terminal device.
In addition, a touch module may be provided on the display screen of the display terminal device to enable close-range interaction with it; for example, the second user can use the touch module to perform touch operations that guide the first user's learning activities. The touch module may be an infrared, capacitive, or resistive touch controller commonly used in the field; the specific structures and working principles of these controllers can be implemented with reference to the prior art and are not described in detail in this embodiment.
The camera device may be any device capable of capturing an image and transmitting the captured image to the display terminal device, such as a handheld computer, a smartphone, a tablet computer, a document camera, or a camera. The control device may be any device that can issue operation instructions even when far from the display terminal device, such as a touch pad, a page-turning pen, a mouse, a keyboard, a handwriting tablet, a joystick, or a voice control device 4; of these, the keyboard and the handwriting tablet can also be used to input text directly.
The display terminal device can transmit the homework image captured by the camera device, together with the text information recognized from it, to the remote device; a third user can then use the remote device to feed the corresponding solution information back to the display terminal device, thereby tutoring the first user's learning activities. The remote device may be a handheld computer, a smartphone, a tablet computer, and the like.
The monitoring arrangement involves a monitoring program, the key program, and the application management program: the monitoring program runs in connection with the monitoring device, the key program receives the data of the monitoring program in real time, and the content shown on the display screen of the display terminal device responds to the monitoring result of the key program. The monitoring device may be a handheld terminal device such as a handheld computer, a smartphone, or a tablet computer. The second user can monitor the first user's learning activities with the monitoring device. It should also be noted that specific implementations of the monitoring program, the key program, and the application management program can refer to related programs commonly used in the art, such as, but not limited to, the solution disclosed in Chinese patent application No. 2020210844604.
Referring to FIG. 1, in one embodiment of the interactive learning auxiliary system, the display terminal device is a display screen 1 with an integrated OPS module; the display screen 1 is an LCD display and runs the Windows operating system. The camera device is a document camera 2 (high-speed photographing instrument), which is provided with a wireless data transmission unit for transmitting captured image information to the display screen 1 in real time. The control device is a handwriting tablet 3, which may be provided with buttons for sending corresponding operation instructions; the wireless data transmission unit of the handwriting tablet 3 transmits the operation instructions to the display screen 1.
Specifically, the display screen 1 displays the homework image transmitted by the document camera 2 and forms identification frames on it; the display screen 1 responds to operation instructions from the handwriting tablet 3 by adjusting the size and position of an identification frame; and the display screen 1 responds to the answer-information transmission instruction of the handwriting tablet 3.
The display screen 1 may include a processor, a wireless transmission module, a screen, and the modules needed to keep the display screen 1 running normally, such as a power supply module, a storage module, and a display driving module. The wireless transmission module and the screen are both coupled to the processor and execute the corresponding instructions it issues.
Specifically, the processor receives the image information transmitted by the camera device as well as the operation instructions and answer information transmitted by the control device, and runs the system in one of two display modes. In this embodiment the two display modes are a recognition mode and an answering mode: in the recognition mode the display screen 1 displays the homework image and adjusts the identification frames; in the answering mode the display screen 1 displays the text information, the solution information, and the answer information.
In the recognition mode, the processor presents on the screen an image generated from the image information transmitted by the camera device and at the same time generates the identification frames; it moves an identification frame on the screen, scales it, and changes its identification style in response to the operation instructions transmitted by the control device.
In the answering mode, the wireless transmission module transmits the text information recognized from the image to the remote device and receives the solution information transmitted by the remote device; the processor presents the text information on the screen, presents the solution information in response to the wireless transmission module, and presents the answer information entered through the control device.
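The two display modes can be pictured as a small dispatch on the processor of display screen 1. The following minimal sketch is illustrative only and is not taken from the patent; the class and event names (DisplayController, "frames_confirmed", and so on) are assumptions made for the example.

```python
from enum import Enum, auto

class DisplayMode(Enum):
    RECOGNITION = auto()  # show homework image, adjust identification frames
    ANSWERING = auto()    # show text, solution and answer information

class DisplayController:
    """Hypothetical controller mirroring the two display modes of display screen 1."""

    def __init__(self):
        self.mode = DisplayMode.RECOGNITION

    def on_event(self, event: dict):
        # Route incoming events to the handler for the current display mode.
        if self.mode is DisplayMode.RECOGNITION:
            self._handle_recognition(event)
        else:
            self._handle_answering(event)

    def _handle_recognition(self, event):
        if event["type"] == "operation":          # from the control device
            print("adjust identification frame:", event["payload"])
        elif event["type"] == "frames_confirmed":
            self.mode = DisplayMode.ANSWERING     # switch to the answering mode

    def _handle_answering(self, event):
        if event["type"] == "solution":           # fed back by the remote device
            print("show solution in second area:", event["payload"])
        elif event["type"] == "answer":           # entered via the control device
            print("show answer in third area:", event["payload"])

if __name__ == "__main__":
    c = DisplayController()
    c.on_event({"type": "operation", "payload": "move frame left"})
    c.on_event({"type": "frames_confirmed"})
    c.on_event({"type": "solution", "payload": "x = 3"})
```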
Referring to FIG. 2, a method of assisted learning using the interactive learning auxiliary system of the above embodiment may specifically include the following steps.
Step one, the display screen 1 receives and displays the homework image transmitted by the document camera 2, and identification frames are formed on the homework image.
Specifically, in step one, the sheet of paper carrying the homework questions may be placed under the camera of the document camera 2 and the camera's focal length adjusted so that the paper is photographed clearly; the wireless data transmission unit of the document camera 2 then transmits the homework image captured by the camera to the display screen 1, and the display screen 1 displays it. Preferably, the camera photographs the paper while the display screen 1 shows the homework image in real time, which makes it easy to adjust which area of the paper the camera of the document camera 2 covers.
Once the homework image is displayed on the display screen 1, the display screen 1 can generate identification frames from the text information in the homework image. The identification frames may be formed from the homework image by any method commonly used in the field, for example the methods disclosed in patent documents CN104573675A, CN103268481A, CN103456195A, and US2011222773A1.
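One common way to form candidate frames around blocks of printed text is to binarize the image, dilate it so the characters of one question merge into a single blob, and take the bounding boxes of the resulting contours. The sketch below uses OpenCV and is offered only as an illustrative assumption; it is not the method used by the patent or by the cited documents.

```python
import cv2

def form_identification_frames(homework_image_bgr, min_width=80, min_height=20):
    """Return candidate identification frames as (x, y, w, h) boxes."""
    gray = cv2.cvtColor(homework_image_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu threshold, inverted so that ink becomes white on a black background.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Dilate horizontally so the characters of one question merge together.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (25, 5))
    merged = cv2.dilate(binary, kernel, iterations=2)
    # OpenCV >= 4 returns (contours, hierarchy).
    contours, _ = cv2.findContours(merged, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours]
    # Discard specks that are too small to hold a printed question.
    return [b for b in boxes if b[2] >= min_width and b[3] >= min_height]

# Usage (assumed file name): boxes = form_identification_frames(cv2.imread("homework.jpg"))
```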
Step two, the display screen 1 receives operation instructions from the handwriting tablet 3 and adjusts the size and position of an identification frame to determine a target area.
Specifically, in step two an identification frame has two identification styles: it is shown in the first identification style when it is formed, and in the second identification style once the display screen 1 receives an operation instruction from the handwriting tablet 3. While the frame is shown in the first identification style, if no operation instruction from the handwriting tablet 3 is received within time t1, the target area is determined; while the frame is shown in the second identification style, if no operation instruction from the handwriting tablet 3 is received within time t2, the frame reverts to the first identification style.
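The t1/t2 behaviour for a single frame amounts to a two-state timeout machine. The minimal sketch below is an illustration of that logic, not code from the patent; the t1/t2 values, the poll_instruction callback, and the assumption that each new instruction restarts the t2 window are all assumptions made for the example.

```python
import time
from enum import Enum, auto

class FrameStyle(Enum):
    FIRST = auto()   # e.g. double solid frame: waiting; t1 timeout confirms the area
    SECOND = auto()  # e.g. dashed frame: being adjusted; t2 timeout reverts to FIRST

def adjust_single_frame(poll_instruction, t1=3.0, t2=5.0):
    """Run the t1/t2 timeout logic for one identification frame.

    poll_instruction() returns an operation instruction or None.
    Returns once the target area has been determined.
    """
    style = FrameStyle.FIRST
    deadline = time.monotonic() + t1
    while True:
        instruction = poll_instruction()
        if style is FrameStyle.FIRST:
            if instruction is not None:
                style = FrameStyle.SECOND              # start adjusting
                deadline = time.monotonic() + t2
            elif time.monotonic() >= deadline:
                return "target area determined"        # t1 elapsed with no input
        else:  # FrameStyle.SECOND
            if instruction is not None:
                print("apply adjustment:", instruction)
                deadline = time.monotonic() + t2       # assumed: each instruction restarts t2
            elif time.monotonic() >= deadline:
                style = FrameStyle.FIRST               # t2 elapsed, revert
                deadline = time.monotonic() + t1
        time.sleep(0.05)
```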
In addition, when there are at least two target areas, all target areas are marked to form several identification frames; referring to FIG. 3, the specific steps for marking and determining the target areas are:
step 2.1, show all identification frames in the third identification style;
step 2.2, select one of the identification frames shown in the third identification style and show it in the first identification style;
step 2.3, detect whether the display screen 1 receives an operation instruction from the handwriting tablet 3; if the display screen 1 receives no operation instruction from the handwriting tablet 3 within time t1, determine the target area and execute step 2.5; if the display screen 1 receives an operation instruction from the handwriting tablet 3 within time t1, show the identification frame selected in step 2.2 in the second identification style;
step 2.4, if the display screen 1 receives no operation instruction from the handwriting tablet 3 within time t2, show the identification frame selected in step 2.2 in the first identification style and execute step 2.3;
step 2.5, if any identification frame is still shown in the third identification style, execute step 2.2; if all identification frames are shown in the first identification style, step two ends.
When the homework image captured by the document camera 2 contains several questions and not all of them require tutoring, the handwriting tablet 3 can be used to delete the identification frames of the questions that do not require tutoring. Specifically, in step 2.3, if the display screen 1 receives a delete instruction from the handwriting tablet 3 within time t1, the identification frame selected in step 2.2 is deleted and step 2.5 is executed.
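Extending the single-frame timeout sketch above to several frames, steps 2.1 to 2.5 plus the delete instruction can be illustrated as the loop below. It is a sketch under assumptions: the frame object and its apply() method, the ("adjust", ...) and ("delete", ...) instruction tuples, and the t1/t2 values are all hypothetical.

```python
import time

def adjust_all_frames(frames, poll_instruction, t1=3.0, t2=5.0):
    """Illustrative loop over steps 2.1-2.5 for several identification frames.

    frames: list of frame objects with a hypothetical apply() method.
    poll_instruction() returns ("adjust", data), ("delete", None) or None.
    """
    pending = list(frames)            # step 2.1: all frames in the third style
    confirmed = []
    while pending:                    # step 2.5: repeat until no frame is left pending
        frame = pending.pop(0)        # step 2.2: select one frame, first style
        style = "first"
        deadline = time.monotonic() + t1
        while True:
            instruction = poll_instruction()
            if style == "first":
                if instruction is None:
                    if time.monotonic() >= deadline:   # t1 with no input
                        confirmed.append(frame)        # target area determined
                        break
                elif instruction[0] == "delete":       # question needs no tutoring
                    break                              # drop this frame entirely
                else:
                    style = "second"                   # step 2.3: start adjusting
                    deadline = time.monotonic() + t2
            else:  # second identification style (being adjusted)
                if instruction is None:
                    if time.monotonic() >= deadline:   # step 2.4: t2 elapsed
                        style = "first"
                        deadline = time.monotonic() + t1
                elif instruction[0] == "adjust":
                    frame.apply(instruction[1])        # hypothetical frame API
                    deadline = time.monotonic() + t2
            time.sleep(0.05)
    return confirmed
```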
Step three, the display screen 1 recognizes the text information in the target area, creates a display card, and displays the text information in the first area of the display card; at the same time the display screen 1 transmits the text information and the image of the target area to the remote device.
It should be noted that, for simplicity, this embodiment calls the screen regions used to display the text information, the answer information, and the solution information a "display card". A "display card" is in some respects similar to a window in the field of desktop user interfaces.
Specifically, in step three, if there are at least two target areas, several display cards may be created, each divided into a first area, a second area, and a third area, with the first area of each display card showing the text information of only one identification frame. The user can then move, switch, delete, and save the display cards through the handwriting tablet 3, which makes them easy to manage. For example, the display card presented on the display terminal device is deleted in response to a delete instruction from the control device, and another of the display cards is presented on the display terminal device.
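A display card with its three areas can be represented by a small data structure. The sketch below is illustrative only; the class and field names (DisplayCard, CardDeck, question_text, and so on) are assumptions, not terms from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class DisplayCard:
    """One display card with the three areas described in step three (illustrative)."""
    question_text: str        # first area: text recognized from one identification frame
    solution_text: str = ""   # second area: solution fed back by the remote device
    answer_text: str = ""     # third area: answer entered via the control device

@dataclass
class CardDeck:
    """Holds every display card; only cards[current] is presented on screen."""
    cards: list = field(default_factory=list)
    current: int = 0

    def add(self, question_text: str):
        self.cards.append(DisplayCard(question_text))

# Usage: deck = CardDeck(); deck.add("Question 17 ..."); deck.add("Question 18 ...")
```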
Step four, the display screen 1 receives the solution information fed back by the remote device and synchronously displays it in the second area of the display card.
Specifically, in step four, if there are at least two target areas, the received solution information is identified and shown in the second area of the display card that displays the text information to which the solution corresponds. For example, when there are two display cards, if the received solution information is judged to belong to the question on the first display card, it is shown in the second area of the first display card; if it is judged to belong to the question on the second display card, it is shown in the second area of the second display card.
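The patent does not say how this judgement is made. One plausible way, sketched below, is to match the question text accompanying the solution against the first area of each card; the assumption that the remote device returns the question text alongside its solution is mine, and the sketch reuses the DisplayCard/CardDeck classes from the previous example.

```python
import difflib

def route_solution(deck, solution_text: str, question_hint: str):
    """Attach solution_text to the card whose question best matches question_hint."""
    best_card, best_score = None, 0.0
    for card in deck.cards:
        score = difflib.SequenceMatcher(None, card.question_text, question_hint).ratio()
        if score > best_score:
            best_card, best_score = card, score
    if best_card is not None:
        best_card.solution_text = solution_text   # shown in the second area of that card
    return best_card
```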
Step five, the display screen 1 receives the answer information transmitted by the handwriting tablet 3 and displays it in the third area of the display card.
Specifically, in step five, if there are at least two target areas, then to keep the display screen 1 uncluttered and improve the user's efficiency, only one of the display cards is presented on the display screen 1 at a time, and the display cards are switched onto the display screen 1 in response to operation instructions from the handwriting tablet 3; the display card presented on the display screen 1 responds to the answer-information transmission instruction of the handwriting tablet 3 and shows the answer information in its third area.
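Card switching, answer entry, and card deletion can be sketched as three small operations on the CardDeck structure from the earlier example. Again this is an assumed illustration, not the patent's implementation; function names are hypothetical.

```python
def next_card(deck):
    """'Next page' operation: switch which display card is presented."""
    if deck.cards:
        deck.current = (deck.current + 1) % len(deck.cards)
        return deck.cards[deck.current]
    return None

def record_answer(deck, answer_text: str):
    """Answer transmitted by the handwriting tablet goes to the current card's third area."""
    card = deck.cards[deck.current]
    card.answer_text = answer_text
    return card

def delete_current_card(deck):
    """Delete instruction: remove the presented card and present another one."""
    if deck.cards:
        deck.cards.pop(deck.current)
        deck.current = 0
    return deck.cards[deck.current] if deck.cards else None
```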
As a concrete example of the learning assistance method of this embodiment, image a in FIG. 4 is the homework image received by the display screen 1 and shown on its screen; three identification frames are formed on the homework image, all shown in the third identification style, i.e., as single solid frames 103.
Referring to image b in FIG. 4, the first identification frame, which frames question 17, is shown in the first identification style, i.e., the single solid frame 103 becomes the double solid frame 101, and the display screen 1 waits to receive an operation instruction from the handwriting tablet 3; the second identification frame framing question 18 and the third identification frame framing question 19 are still shown in the third identification style.
Referring to image c in FIG. 4, when the display screen 1 receives an operation instruction from the handwriting tablet 3 within time t1, it switches the first identification frame from the first identification style to the second identification style in response to that instruction, i.e., the double solid frame 101 becomes the dashed frame 102. The dashed frame 102 can then be adjusted according to the specific operation instructions of the handwriting tablet 3, for example moved, enlarged or reduced, or rotated.
After the size and position of the dashed frame 102 have been adjusted, the handwriting tablet 3 issues no further instructions. If the display screen 1 then receives no operation instruction within time t2, referring to image d in FIG. 4, the first identification frame switches from the second identification style back to the first identification style; and while the first identification frame is shown in the first identification style, if the display screen 1 receives no operation instruction from the handwriting tablet 3 within time t1, the first target area is determined.
The second and third identification frames are then adjusted in turn. Because the third identification frame, which frames question 19, does not frame the whole of question 19, a delete instruction can be issued from the handwriting tablet 3 to delete the third identification frame while it is shown in the first identification style.
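For illustration only, the three identification styles of FIG. 4 (single solid frame 103, double solid frame 101, dashed frame 102) could be rendered roughly as in the OpenCV sketch below; this rendering choice is an assumption, not part of the patent.

```python
import cv2

def draw_frame(image, box, style):
    """Draw an identification frame in one of the three styles of FIG. 4 (illustrative)."""
    x, y, w, h = box
    color = (0, 0, 255)  # red in BGR
    if style == "third":      # single solid frame 103
        cv2.rectangle(image, (x, y), (x + w, y + h), color, 2)
    elif style == "first":    # double solid frame 101: two nested rectangles
        cv2.rectangle(image, (x, y), (x + w, y + h), color, 2)
        cv2.rectangle(image, (x - 6, y - 6), (x + w + 6, y + h + 6), color, 2)
    elif style == "second":   # dashed frame 102: short segments along each edge
        for x0 in range(x, x + w, 20):
            cv2.line(image, (x0, y), (min(x0 + 10, x + w), y), color, 2)
            cv2.line(image, (x0, y + h), (min(x0 + 10, x + w), y + h), color, 2)
        for y0 in range(y, y + h, 20):
            cv2.line(image, (x, y0), (x, min(y0 + 10, y + h)), color, 2)
            cv2.line(image, (x + w, y0), (x + w, min(y0 + 10, y + h)), color, 2)
    return image
```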
Referring to FIG. 5, after the first and second identification frames have been adjusted, question 17 framed by the first identification frame and question 18 framed by the second identification frame are shown on the first display card 111 and the second display card 112 respectively. At this point the first display card 111 is presented on the screen of the display screen 1, with the text information of the first identification frame shown in the first area 121; when the display screen 1 receives the solution information for question 17 from the remote device, it shows that information in the second area 122 of the first display card 111, and when it receives the answer information from the handwriting tablet 3, it shows that information in the third area 123 of the first display card 111.
Of course, the display screen 1 can also present the second display card 112 on its screen in response to an operation instruction from the handwriting tablet 3, for example the operation instruction corresponding to a "next page" button.
The invention and its embodiments have been described above schematically, and the description is not limiting; what is shown in the drawings is only one embodiment of the invention, and the actual structure is not limited to it. If, enlightened by the above, a person of ordinary skill in the art devises, without inventive effort and without departing from the spirit of the invention, structures or embodiments similar to this technical solution, they shall all fall within the scope of protection of the invention.

Claims (10)

1. An interactive learning auxiliary method based on remote display, characterized in that:
a first user controls a remotely displayed display terminal device with a camera device and a control device in order to carry out learning activities; and,
a second user monitors the content shown on the display terminal device with a monitoring device in order to supervise the first user's learning activities; and/or,
a third user, after receiving information transmitted by the display terminal device on a remote device, feeds solution information back to the display terminal device with the remote device in order to tutor the first user's learning activities.
2. The method of claim 1, wherein: the second user tutors the first user's learning activities using the touch function of the display terminal device.
3. The method of claim 1, wherein: the first user's control of the display terminal device with the camera device and the control device to carry out learning activities specifically comprises the following steps:
step one, the display terminal device receives and displays a homework image transmitted by the camera device, and identification frames are formed on the homework image;
step two, the display terminal device receives operation instructions from the control device and adjusts the size and position of an identification frame to determine a target area;
step three, the display terminal device recognizes the text information in the target area and displays it in a first area of a display card; at the same time, the display terminal device transmits the text information and the image of the target area to the remote device;
step four, the display terminal device receives the solution information fed back by the remote device and synchronously displays it in a second area of the display card;
step five, the display terminal device receives the answer information transmitted by the control device and displays it in a third area of the display card.
4. The method of claim 3, wherein: in step two, an identification frame has two identification styles: it is shown in a first identification style when it is formed, and in a second identification style once the display terminal device receives an operation instruction from the control device;
while the identification frame is shown in the first identification style, if no operation instruction from the control device is received within a time t1, the target area is determined;
while the identification frame is shown in the second identification style, if no operation instruction from the control device is received within a time t2, the identification frame reverts to the first identification style.
5. The method of claim 4, wherein: when there are at least two target areas, all target areas are marked, forming several identification frames; step two then specifically comprises:
step 2.1, showing all identification frames in a third identification style;
step 2.2, selecting one of the identification frames shown in the third identification style and showing it in the first identification style;
step 2.3, detecting whether the display terminal device receives an operation instruction from the control device; if no operation instruction is received within time t1, determining the target area and executing step 2.5; if an operation instruction is received within time t1, showing the identification frame selected in step 2.2 in the second identification style;
step 2.4, if no operation instruction from the control device is received within time t2, showing the identification frame selected in step 2.2 in the first identification style and executing step 2.3;
step 2.5, if any identification frame is still shown in the third identification style, executing step 2.2; if all identification frames are shown in the first identification style, ending step two.
6. The method of claim 5, wherein: in step 2.3, if the display terminal device receives a delete instruction from the control device within time t1, the identification frame selected in step 2.2 is deleted and step 2.5 is executed.
7. The method of claim 6, wherein: in step three, several display cards are created, each divided into the first area, the second area and the third area, and the first area of each display card displays the text information of only one identification frame.
8. The method of claim 7, wherein: in step four, the received solution information is identified and displayed in the second area of the display card that shows the text information to which the solution corresponds.
9. The method of claim 8, wherein: only one of the several display cards is presented on the display terminal device at a time, and the display cards are switched onto the display terminal device in response to operation instructions from the control device; the display card presented on the display terminal device responds to the answer-information transmission instruction from the control device and displays the answer information in its third area.
10. The method of claim 9, wherein: the display card presented on the display terminal device is deleted in response to a delete instruction from the control device, and another of the several display cards is presented on the display terminal device.
CN202110158747.XA 2021-02-05 2021-02-05 Interactive learning auxiliary method based on remote display Active CN112488890B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110158747.XA CN112488890B (en) 2021-02-05 2021-02-05 Interactive learning auxiliary method based on remote display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110158747.XA CN112488890B (en) 2021-02-05 2021-02-05 Interactive learning auxiliary method based on remote display

Publications (2)

Publication Number Publication Date
CN112488890A true CN112488890A (en) 2021-03-12
CN112488890B CN112488890B (en) 2021-05-07

Family

ID=74912369

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110158747.XA Active CN112488890B (en) 2021-02-05 2021-02-05 Interactive learning auxiliary method based on remote display

Country Status (1)

Country Link
CN (1) CN112488890B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104575138A (en) * 2015-01-19 2015-04-29 肖龙英 Scene interaction type multimedia system and application thereof
CN104573675A (en) * 2015-01-29 2015-04-29 百度在线网络技术(北京)有限公司 Operating image displaying method and device
CN106803427A (en) * 2016-12-15 2017-06-06 网易(杭州)网络有限公司 Method and system and computer-readable recording medium for information to be presented
CN106991198A (en) * 2017-06-01 2017-07-28 江苏学正教育科技有限公司 A kind of subjective problem database system participated at many levels based on characteristics of image collection and student
CN107832742A (en) * 2017-11-28 2018-03-23 上海与德科技有限公司 Measure of supervision and robot applied to robot
CN209543189U (en) * 2018-12-24 2019-10-25 辜茂军 A kind of split screen mobile terminal that can be dressed
CN110837833A (en) * 2019-11-14 2020-02-25 广东小天才科技有限公司 Question selection method and device, terminal equipment and readable storage medium

Also Published As

Publication number Publication date
CN112488890B (en) 2021-05-07

Similar Documents

Publication Title
CN107273002B (en) Handwriting input answering method, terminal and computer readable storage medium
CN110827596A (en) Question answering method based on intelligent pen
CN106371593A (en) Projection interaction calligraphy practice system and implementation method thereof
US20130183651A1 (en) Server, learning terminal apparatus, and learning content managing method
JP5099595B2 (en) Electronic pen system and program thereof
CN102880360A (en) Infrared multipoint interactive electronic whiteboard system and whiteboard projection calibration method
US20130266923A1 (en) Interactive Multimedia Instructional System and Device
CN113950822A (en) Virtualization of a physical active surface
CN208861501U (en) A kind of intelligent school table desktop and tutoring system
KR101700317B1 (en) System, method and computer readable recording medium for managing an education using a smart pen based on a dot code
CN111580903A (en) Real-time voting method, device, terminal equipment and storage medium
CN113434081A (en) Teaching system capable of implementing man-machine interaction and facilitating teaching information exchange and use method
KR101166303B1 (en) Online education system and communication terminal therfor
CN112488890B (en) Interactive learning auxiliary method based on remote display
CN113485570A (en) Multi-user writing method and device, computer readable storage medium and terminal equipment
CN112835505B (en) Remote display interaction system for assisting learning
JP2014145893A (en) Information processor, information processing method and program
TW202016904A (en) Object teaching projection system and method thereof
JP2003187242A (en) System and method for inputting data
CN101493636A (en) Interactive projecting system and interactive input method thereof
JP5141997B2 (en) Computer, display system using the same, and program thereof
JP5288340B2 (en) Display system and program thereof
JP2012155739A (en) Electronic pen system, terminal device and program therefor
CN112799626A (en) Multi-screen remote real-time writing and presenting method
CN206741741U (en) A kind of intelligent operation answering system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant