CN112732076A - Real-time teaching guidance method and system in virtual reality environment - Google Patents


Info

Publication number
CN112732076A
Authority
CN
China
Prior art keywords: virtual reality, student user, student, viewpoint, current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011610529.7A
Other languages
Chinese (zh)
Inventor
王晓敏
柴贵山
张琨
Current Assignee
Jiangxi Gelingruke Technology Co ltd
Original Assignee
Jiangxi Gelingruke Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangxi Gelingruke Technology Co ltd filed Critical Jiangxi Gelingruke Technology Co ltd
Priority claimed from CN202011610529.7A
Publication of CN112732076A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 - Eye tracking input arrangements
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 - Electrically-operated educational appliances
    • G09B5/02 - Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a real-time teaching guidance method and system in a virtual reality environment. The method comprises the following steps: acquiring the screen coordinates of a marking point input by a teacher; converting those screen coordinates into world coordinates of the 3D scene in real time, and broadcasting the world coordinate data over a network protocol to the virtual reality helmets at the student end; after receiving the world coordinate data, each virtual reality helmet resolves it into local world coordinates and generates a marking point M; acquiring the viewpoint information of the current student user; and determining, from the marking point M and that viewpoint information, whether to send a guidance prompt to the student user. The method prevents students from missing key knowledge points and demonstrations, and from falling behind the teacher's demonstration; at the same time it lets the teacher follow the students' gaze in real time, improving teaching efficiency and quality.

Description

Real-time teaching guidance method and system in virtual reality environment
Technical Field
The invention relates to the technical field of virtual reality education, in particular to a real-time teaching guidance method and a real-time teaching guidance system in a virtual reality environment.
Background
Virtual reality devices (VR helmets) are now widely used in classroom teaching, with positive effect. However, once a student puts on a VR helmet, what the student sees is hidden from the teacher by the very nature of virtual reality. While courseware is playing, the teacher therefore cannot follow the students' gaze in real time or give guidance. In particular, when a key knowledge point is being demonstrated in the VR scene, a student whose gaze has drifted away from the demonstration may be unable to return quickly to its focus, will miss the key content, and the teaching effect is greatly reduced.
Disclosure of Invention
In view of the above, the present invention aims to overcome the defects of the prior art by providing a real-time teaching guidance method and system in a virtual reality environment. Even when a student's gaze drifts away from the current teaching demonstration while using virtual reality equipment, the student receives a guidance prompt from the system and can return his or her gaze to the focus of the demonstration. This further improves the effectiveness of virtual reality courseware and avoids the problems of students being unable to find the focus in the courseware or to follow the teacher's explanation.
To achieve this object, the invention adopts the following technical solution: a method of real-time teaching guidance in a virtual reality environment, comprising:
acquiring screen coordinates of a marking point input by a teacher;
converting the screen coordinates of the marked points into world coordinates of a 3D scene in real time, and sending the world coordinate data of the marked points to a virtual reality helmet of a student end through network protocol broadcasting;
after receiving the world coordinate data, the virtual reality helmet resolves it, converts it into local world coordinates, and generates a marking point M;
acquiring viewpoint information of a current student user;
and determining whether to send a guidance prompt to the student user according to the marking point M and the viewpoint information of the current student user.
Optionally, the viewpoint information includes: a direction of sight;
the step of determining whether to send a guidance prompt to the student user according to the marking point M and the sight direction of the current student user comprises the following steps:
acquiring a 'marking point M and a helmet connecting line direction';
determining an included angle between the sight line direction of the current student user and the connecting line direction of the marking point M and the helmet;
if the included angle between the sight line direction of the current student user and the connecting line direction of the labeling point M and the helmet is smaller than a specific angle, no guide prompt is sent to the student user; otherwise, sending out guidance prompt to the student user.
Optionally, the method further includes:
and transmitting the viewpoint information of the student users to a teacher end in real time through a network protocol, and generating the current sight point conditions of all virtual reality helmets at the teacher end.
Optionally, the step of transmitting the viewpoint information of the student user to the teacher end in real time through a network protocol, and generating the current sight point conditions of all virtual reality helmets at the teacher end includes:
performing coordinate conversion on the viewpoint information of the student users, and transmitting the converted coordinates to the teacher end in real time through a network protocol;
the teacher end analyzes the received coordinate data into world coordinates;
generating a viewpoint icon by the world coordinates;
and displaying the viewpoint icon data of all the virtual reality helmets to the teacher.
Optionally, the sending of the guidance prompt to the student user includes:
sending out a prompt to the student user in an animation guidance mode;
the animation guidance mode specifically includes:
and generating animation guide from the sight point of the current virtual reality helmet to the marking point M.
The invention also provides a real-time teaching guidance system in the virtual reality environment, which comprises:
the system comprises a screen coordinate acquisition module, a label editing module and a label message broadcasting module which are positioned at a teacher end, and a label analysis module, a viewpoint acquisition module and a control module which are positioned at a student end;
the screen coordinate acquisition module is used for acquiring the screen coordinates of the annotation points input by the teacher;
the annotation editing module is used for converting the screen coordinates of the annotation points into world coordinates of the 3D scene in real time;
the annotation message broadcasting module is used for sending the world coordinate data of the annotation point to a virtual reality helmet of a student end through network protocol broadcasting;
the marking analysis module is used for resolving and converting the world coordinate data received by the virtual reality helmet into local world coordinates and generating marking points M;
the viewpoint acquisition module is used for acquiring viewpoint information of a current student user;
and the control module is used for determining whether to send a guidance prompt to the student user according to the marking point M and the viewpoint information of the current student user.
Optionally, the viewpoint information includes: a direction of sight;
the step of determining whether to send a guidance prompt to the student user according to the marking point M and the viewpoint information of the current student user comprises the following steps:
acquiring the direction of the line connecting the helmet and the marking point M; determining the angle between the current student user's sight direction and that helmet-to-M direction; if the angle is smaller than a specified threshold, sending no guidance prompt to the student user; otherwise, sending a guidance prompt to the student user.
Optionally, the method further includes:
the system comprises a viewpoint message broadcasting module positioned at a student end, a viewpoint icon generating module and a display module positioned at a teacher end;
the viewpoint message broadcasting module is used for transmitting the viewpoint information of the student users to the teacher end in real time through a network protocol;
the viewpoint icon generating module is used for generating current viewpoint icons of all virtual reality helmets at a teacher end;
and the display module is used for displaying the viewpoint icon data of all the virtual reality helmets to a teacher.
Optionally, the sending of the guidance prompt to the student user includes:
sending out a prompt to the student user in an animation guidance mode;
the animation guidance mode specifically includes:
and generating animation guide from the sight point of the current virtual reality helmet to the marking point M.
In the real-time teaching guidance method in a virtual reality environment provided by the invention, the teacher end offers an annotation function for virtual reality courseware. The annotated screen coordinates are converted in real time into world coordinates of the 3D scene and broadcast over a network protocol to the VR helmets at the student end. After receiving the annotated coordinates, each helmet resolves them into local world coordinates and generates a marking point M, then determines from the marking point M and the current student user's viewpoint information whether to issue a guidance prompt. At the same time, the sight point of each student helmet can be transmitted in real time to the teacher end, where the current sight points of all helmets are displayed, so that the teacher can follow the state of the class and adjust the teaching rhythm. This real-time guidance prevents students from missing key knowledge points and demonstrations, and from falling behind the teacher's demonstration; it also lets the teacher follow the students' gaze in real time, improving teaching efficiency and quality.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a real-time teaching guidance method in a virtual reality environment according to the present invention;
FIG. 2 is a schematic diagram of a real-time teaching guidance system in a virtual reality environment according to the present invention;
FIG. 3 is a data flow diagram provided by a real-time tutorial guidance system in a virtual reality environment in accordance with the present invention.
In the figure: 11. a screen coordinate acquisition module; 12. a label editing module; 13. a label message broadcasting module; 14. a viewpoint icon generating module; 15. a display module; 21. a label analysis module; 22. a viewpoint acquisition module; 23. a control module; 24. and a viewpoint message broadcasting module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be described in detail below. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the examples given herein without any inventive step, are within the scope of the present invention.
FIG. 1 is a flow chart diagram provided by a real-time teaching guidance method in a virtual reality environment according to the present invention.
As shown in fig. 1, the method for real-time teaching guidance in a virtual reality environment according to the present invention includes:
S11: acquiring screen coordinates of a marking point input by a teacher;
S12: converting the screen coordinates of the marking point into world coordinates of the 3D scene in real time, and broadcasting the world coordinate data over a network protocol to the virtual reality helmets at the student end;
S13: after receiving the world coordinate data, the virtual reality helmet resolves it, converts it into local world coordinates, and generates a marking point M;
S14: acquiring viewpoint information of the current student user;
S15: determining whether to send a guidance prompt to the student user according to the marking point M and the viewpoint information of the current student user.
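The patent gives no formulas for step S12. The sketch below illustrates one common way a 2D screen coordinate can become a 3D world coordinate: cast a ray from the camera through the clicked pixel and intersect it with scene geometry (here, simply the plane z = 0). The pinhole-camera model, the z = 0 target plane, and all names are assumptions for illustration, not details from the patent.

```python
import math

def screen_to_ray(x, y, width, height, fov_y_deg, cam_pos):
    """Return (origin, direction) of the view ray through pixel (x, y)."""
    ndc_x = 2.0 * x / width - 1.0            # [-1, 1], right positive
    ndc_y = 1.0 - 2.0 * y / height           # [-1, 1], up positive
    tan_half = math.tan(math.radians(fov_y_deg) / 2.0)
    aspect = width / height
    # Camera looks down -z in its own frame (a common convention).
    d = (ndc_x * tan_half * aspect, ndc_y * tan_half, -1.0)
    norm = math.sqrt(sum(c * c for c in d))
    return cam_pos, tuple(c / norm for c in d)

def intersect_plane_z0(origin, direction):
    """World coordinate where the ray hits the plane z = 0, or None."""
    if abs(direction[2]) < 1e-9:
        return None                          # ray parallel to the plane
    t = -origin[2] / direction[2]
    if t < 0:
        return None                          # plane is behind the camera
    return tuple(o + t * c for o, c in zip(origin, direction))

# A click at the screen centre maps to the point straight ahead of the camera.
origin, direction = screen_to_ray(640, 360, 1280, 720, 60.0, (0.0, 0.0, 5.0))
world = intersect_plane_z0(origin, direction)
```

In a real engine this step would be done by the engine's own screen-to-world raycast against the courseware geometry rather than a fixed plane.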
Further, the viewpoint information includes: a direction of sight;
the step of determining whether to send a guidance prompt to the student user according to the marking point M and the viewpoint information of the current student user comprises:
acquiring the direction of the line connecting the helmet and the marking point M;
determining the angle between the current student user's sight direction and that helmet-to-M direction;
if the angle is smaller than a specified threshold, sending no guidance prompt to the student user; otherwise, sending a guidance prompt to the student user.
In practical use, if the angle between the user's sight direction and the helmet-to-M direction is less than 5 degrees (5 degrees to either side, i.e. a protected cone of 10 degrees in total), no guidance prompt is issued. If the angle exceeds 5 degrees, a perpendicular is dropped from the marking point M onto the user's line of sight, with foot P. The system then generates an animation prompt pointing from the coordinates of P toward the coordinates of M, reminding the student user where to look.
The angle threshold can be set according to actual requirements; the value above is only an example and is not limiting.
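As a concrete illustration of the check above, the sketch below computes the angle between the student's sight direction and the helmet-to-M direction, and the foot P of the perpendicular from M onto the line of sight that serves as the start of the animation prompt. The vector representation and function names are assumptions; the 5-degree threshold is the example value from the text.

```python
import math

def angle_deg(u, v):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def guidance_needed(helmet_pos, mark_m, sight_dir, threshold_deg=5.0):
    """True when the gaze deviates from the helmet-to-M direction by
    at least the threshold, so a guidance prompt should be issued."""
    to_m = tuple(m - h for m, h in zip(mark_m, helmet_pos))
    return angle_deg(sight_dir, to_m) >= threshold_deg

def foot_of_perpendicular(helmet_pos, sight_dir, mark_m):
    """Foot P of the perpendicular from M onto the student's line of
    sight; the animation prompt then points from P toward M."""
    d2 = sum(c * c for c in sight_dir)
    t = sum((m - h) * c for m, h, c in zip(mark_m, helmet_pos, sight_dir)) / d2
    return tuple(h + t * c for h, c in zip(helmet_pos, sight_dir))

# Gaze straight ahead while M sits well off to the side: prompt needed.
needed = guidance_needed((0.0, 0.0, 0.0), (1.0, 0.0, -3.0), (0.0, 0.0, -1.0))
```

The clamp inside `angle_deg` guards against floating-point values of the cosine slightly outside [-1, 1].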
Further, the method further comprises:
and transmitting the viewpoint information of the student users to a teacher end in real time through a network protocol, and generating the current sight point conditions of all virtual reality helmets at the teacher end.
Further, the step of transmitting the viewpoint information of the student user to the teacher end in real time through a network protocol, and generating the current sight point conditions of all virtual reality helmets at the teacher end includes:
performing coordinate conversion on the viewpoint information of the student users, and transmitting the converted coordinates to the teacher end in real time through a network protocol;
the teacher end analyzes the received coordinate data into world coordinates;
generating a viewpoint icon by the world coordinates;
and displaying the viewpoint icon data of all the virtual reality helmets to the teacher.
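The four teacher-end steps above can be sketched as follows. The message format 'helmet_id;x;y;z' and all names are invented for illustration; the patent specifies only that received coordinate data is parsed into world coordinates and turned into viewpoint icons, one per helmet.

```python
def parse_viewpoint(message):
    """Parse 'helmet_id;x;y;z' into (helmet_id, world_coordinate).
    The semicolon-separated wire format is an assumption."""
    helmet_id, x, y, z = message.split(";")
    return helmet_id, (float(x), float(y), float(z))

icons = {}  # helmet_id -> latest viewpoint world coordinate for display
for msg in ["helmet-01;0.5;1.2;-3.0",
            "helmet-02;0.0;1.0;-2.5",
            "helmet-01;0.6;1.2;-3.1"]:   # a newer report replaces the older one
    hid, world = parse_viewpoint(msg)
    icons[hid] = world
```

Keeping one record per helmet means the teacher's display always shows the current sight point of every student, however often reports arrive.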
Further, the sending of the guidance prompt to the student user includes:
sending out a prompt to the student user in an animation guidance mode;
the animation guidance mode specifically includes:
and generating animation guide from the sight point of the current virtual reality helmet to the marking point M.
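One simple way to realize the animation guide described above is to interpolate a pointer icon from the current sight point toward the marking point M across the animation's frames. This is a hedged sketch with linear easing; the frame count and all names are assumptions.

```python
def lerp(a, b, t):
    """Linear interpolation between 3D points a and b, t in [0, 1]."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

sight_point = (0.0, 1.0, -2.0)   # current sight point of the helmet
mark_m = (2.0, 1.0, -4.0)        # marking point M

# Positions of the guide icon over an 11-frame animation, from the
# student's current sight point to the marking point M.
frames = [lerp(sight_point, mark_m, i / 10) for i in range(11)]
```

A real implementation would likely ease the motion and re-anchor the start point if the student's gaze moves during the animation.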
In this real-time teaching guidance method, the teacher end offers an annotation function for virtual reality courseware. The annotated screen coordinates are converted in real time into world coordinates of the 3D scene and broadcast over a network protocol to the VR helmets at the student end. After receiving the annotated coordinates, each helmet resolves them into local world coordinates and generates a marking point M, then determines from the marking point M and the current student user's viewpoint information whether to issue a guidance prompt. At the same time, the sight point of each student helmet can be transmitted in real time to the teacher end, where the current sight points of all helmets are displayed, so that the teacher can follow the state of the class and adjust the teaching rhythm.
By the real-time teaching guidance method, students can be prevented from missing key knowledge points and demonstration, and the problem that the students cannot keep up with the demonstration progress of a teacher is avoided; meanwhile, teachers can master the sight conditions of students in real time, so that the teachers can conveniently master the teaching conditions, the teaching rhythm is adjusted, and the teaching efficiency and quality are improved.
FIG. 2 is a schematic diagram of a real-time teaching guidance system in a virtual reality environment according to the present invention.
As shown in fig. 2, the real-time teaching guidance system in a virtual reality environment according to the present invention includes:
the system comprises a screen coordinate acquisition module 11, a label editing module 12 and a label message broadcasting module 13 which are positioned at a teacher end, and a label analysis module 21, a viewpoint acquisition module 22 and a control module 23 which are positioned at a student end;
the screen coordinate acquisition module 11 is used for acquiring the screen coordinates of the annotation points input by the teacher;
the annotation editing module 12 is configured to convert the screen coordinates of the annotation point into world coordinates of the 3D scene in real time;
the annotation message broadcasting module 13 is configured to send the world coordinate data of the annotation point to a virtual reality helmet of the student end through network protocol broadcasting;
the annotation analyzing module 21 is configured to perform resolving processing after the virtual reality helmet receives the world coordinate data, convert the world coordinate data into a local world coordinate, and generate an annotation point M;
the viewpoint collecting module 22 is configured to obtain viewpoint information of a current student user;
and the control module 23 is configured to determine whether to send a guidance prompt to the student user according to the annotation point M and the viewpoint information of the current student user.
Further, the viewpoint information includes: a direction of sight;
the step of determining whether to send a guidance prompt to the student user according to the marking point M and the viewpoint information of the current student user comprises:
acquiring the direction of the line connecting the helmet and the marking point M; determining the angle between the current student user's sight direction and that helmet-to-M direction; if the angle is smaller than a specified threshold, sending no guidance prompt to the student user; otherwise, sending a guidance prompt to the student user.
Further, the system further comprises:
the viewpoint message broadcasting module 24 is located at the student end, and the viewpoint icon generating module 14 and the display module 15 are located at the teacher end;
the viewpoint message broadcasting module 24 is used for transmitting the viewpoint information of the student users to the teacher end in real time through a network protocol;
the viewpoint icon generating module 14 is configured to generate current viewpoint icons of all virtual reality helmets at a teacher end;
and the display module 15 is configured to display the viewpoint icon data of all the virtual reality helmets to a teacher.
Further, the sending of the guidance prompt to the student user includes:
sending out a prompt to the student user in an animation guidance mode;
the animation guidance mode specifically includes:
and generating animation guide from the sight point of the current virtual reality helmet to the marking point M.
The hardware of the system comprises a virtual reality helmet and a handle at the student end, a PC with a mouse, a keyboard, and/or a touch screen at the teacher end, and a network connecting them.
It should be noted that the system is composed of modules deployed on the student-end virtual reality helmet and modules deployed on the teacher-end PC.
It should be noted that the viewpoint collecting module 22 may be an eye tracking module.
In actual use, the teacher marks the coordinates of important operation nodes through the screen coordinate acquisition module 11 on the PC. The screen coordinate data is passed to the annotation editing module, which generates the world coordinate data of the annotated point in the 3D scene and passes it to the annotation message broadcasting module 13; after data assembly, the world coordinate data is broadcast and distributed over a network protocol to the virtual reality helmets at the student end. After receiving the world coordinate data, each student-end virtual reality helmet resolves it into local world coordinates and generates a marking point M. The student-end viewpoint collecting module 22 collects the viewpoint information of the current student user, and the student-end control module 23 determines from the marking point M and that viewpoint information whether to send a guidance prompt to the student user.
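The broadcast step in this data flow can be sketched as below. The patent says only that the world coordinate data is assembled and sent "through network protocol broadcasting"; the UDP transport, the port number, and the three-packed-float packet layout are assumptions for illustration.

```python
import socket
import struct

def pack_mark(x, y, z):
    """Assemble a marking point's world coordinates into a packet
    (three IEEE-754 floats in network byte order; layout is assumed)."""
    return struct.pack("!3f", x, y, z)

def unpack_mark(data):
    """Student end: recover (x, y, z) from a received packet, ready for
    the local resolving step that produces the marking point M."""
    return struct.unpack("!3f", data)

def broadcast_mark(x, y, z, port=50007):
    """Teacher end: send the packet to every helmet on the local network."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(pack_mark(x, y, z), ("<broadcast>", port))
    sock.close()

# Packing and unpacking are inverse operations (values chosen to be
# exactly representable as 32-bit floats).
roundtrip = unpack_mark(pack_mark(1.5, 2.0, -3.25))
```

UDP broadcast fits the one-teacher-to-many-helmets pattern; a production system would add a helmet identifier and handle packet loss.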
For a specific determination process of whether to issue a guidance prompt to the student user, please refer to the corresponding description above.
It should be noted that as soon as the student-end modules are started, the student's viewpoint information is resolved, coordinate-converted, and sent to the teacher end in real time. The teacher-end module parses the data back into world coordinates and generates viewpoint icons, so that the gaze data of all student users is fed back, making it easy for the teacher to follow the state of the class and adjust the teaching rhythm. Fig. 3 is a data flow diagram of the system.
The working principle of the real-time teaching guidance system in the virtual reality environment is the same as that of the real-time teaching guidance method in the virtual reality environment, and the details are not repeated herein.
The invention provides a real-time teaching guidance system in a virtual reality environment, which can be convenient for students to obtain a guidance prompt of the system even if the sight line is separated from the current teaching demonstration when watching virtual reality equipment, and return the sight line to the teaching demonstration focus according to the prompt, thereby further improving the use effect of virtual reality courseware and avoiding the problems that the focus cannot be found in the courseware, the explanation of teachers cannot be followed and the like; the system can also enable a teacher to master the sight condition of students in real time, so that the teacher can conveniently master the teaching condition, the teaching rhythm is adjusted, and the teaching efficiency and the teaching quality are improved.
It is understood that the same or similar parts in the above embodiments may be mutually referred to, and the same or similar parts in other embodiments may be referred to for the content which is not described in detail in some embodiments.
It should be noted that the terms "first," "second," and the like in the description of the present invention are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Further, in the description of the present invention, the meaning of "a plurality" means at least two unless otherwise specified.
Any process or method description in a flow chart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. Alternate implementations are included within the scope of the preferred embodiments of the present invention, in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (9)

1. A real-time teaching guidance method in a virtual reality environment, characterized by comprising the following steps:
acquiring the screen coordinates of an annotation point input by a teacher;
converting the screen coordinates of the annotation point into world coordinates of the 3D scene in real time, and broadcasting the world coordinate data of the annotation point to the virtual reality helmets at the student end over a network protocol;
after receiving the world coordinate data, the virtual reality helmet resolves the data, converts them into local world coordinates, and generates an annotation point M;
acquiring viewpoint information of the current student user; and
determining whether to send a guidance prompt to the student user according to the annotation point M and the viewpoint information of the current student user.
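The screen-to-world conversion and broadcast described above can be sketched in Python. This is an illustrative sketch only: it assumes the teacher's screen shows an orthographic top-down view of the 3D scene and that annotations are serialized as JSON for a UDP broadcast; the function names and message format are hypothetical, not taken from the patent.

```python
import json

def screen_to_world(sx, sy, screen_w, screen_h, world_min, world_max):
    """Map a 2D screen click to 3D world coordinates.

    Assumes an orthographic top-down view: screen x/y map linearly onto
    the world-space x/z bounds, with height fixed at ground level.
    """
    nx, ny = sx / screen_w, sy / screen_h  # normalize to [0, 1]
    wx = world_min[0] + nx * (world_max[0] - world_min[0])
    wz = world_min[1] + ny * (world_max[1] - world_min[1])
    return (wx, 0.0, wz)

def encode_annotation(world_pos):
    """Serialize the annotation point for broadcast to the student helmets."""
    return json.dumps({"type": "annotation", "pos": list(world_pos)}).encode("utf-8")

# The teacher end could then broadcast the payload over UDP, e.g.:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
#   sock.sendto(encode_annotation(p), ("255.255.255.255", 9999))
```

Each student helmet would decode the JSON payload and transform the world position into its own local scene coordinates before spawning the annotation point M.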
2. The method of claim 1, wherein the viewpoint information comprises a gaze direction;
and wherein determining whether to send a guidance prompt to the student user according to the annotation point M and the gaze direction of the current student user comprises:
acquiring the direction of the line connecting the annotation point M and the helmet;
determining the angle between the gaze direction of the current student user and the direction of the line connecting the annotation point M and the helmet; and
if the angle is smaller than a specified angle, sending no guidance prompt to the student user; otherwise, sending a guidance prompt to the student user.
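The angle test in claim 2 is a standard dot-product computation between the gaze direction and the helmet-to-M vector. A minimal Python sketch follows; the function name and the 30° default threshold are illustrative assumptions, not values specified in the claim.

```python
import math

def needs_prompt(gaze_dir, helmet_pos, marker_pos, threshold_deg=30.0):
    """Return True when a guidance prompt should be sent, i.e. when the angle
    between the gaze direction and the helmet-to-M line meets or exceeds
    the threshold angle."""
    # Vector from the helmet to the annotation point M.
    to_marker = [m - h for m, h in zip(marker_pos, helmet_pos)]
    dot = sum(g * t for g, t in zip(gaze_dir, to_marker))
    norm = math.hypot(*gaze_dir) * math.hypot(*to_marker)
    # Clamp to guard against floating-point drift outside [-1, 1].
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle >= threshold_deg
```

A student looking straight at M produces a 0° angle (no prompt); a student looking perpendicular to the helmet-to-M line produces 90° (prompt).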
3. The method of claim 1, further comprising:
transmitting the viewpoint information of the student users to the teacher end in real time over a network protocol, and generating the current gaze-point status of all virtual reality helmets at the teacher end.
4. The method of claim 3, wherein transmitting the viewpoint information of the student users to the teacher end in real time over a network protocol and generating the current gaze-point status of all virtual reality helmets at the teacher end comprises:
performing coordinate conversion on the viewpoint information of the student users, and transmitting the converted data to the teacher end in real time over a network protocol;
parsing, at the teacher end, the received coordinate data into world coordinates;
generating a viewpoint icon from the world coordinates; and
displaying the viewpoint icon data of all virtual reality helmets to the teacher.
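At the teacher end, one plausible way to turn each helmet's parsed world coordinates into an on-screen viewpoint icon is the inverse of the annotation mapping, again assuming an orthographic top-down overview. The following sketch is hypothetical: the function names, the icon record layout, and the default screen/world bounds are assumptions for illustration.

```python
def world_to_screen(world_pos, screen_w, screen_h, world_min, world_max):
    """Project a helmet's 3D gaze point onto the teacher's 2D overview."""
    nx = (world_pos[0] - world_min[0]) / (world_max[0] - world_min[0])
    ny = (world_pos[2] - world_min[1]) / (world_max[1] - world_min[1])
    return (nx * screen_w, ny * screen_h)

def make_viewpoint_icon(student_id, world_pos, screen_w=800, screen_h=600,
                        world_min=(-10.0, -10.0), world_max=(10.0, 10.0)):
    """Build the icon record the teacher-end UI would render for one student."""
    x, y = world_to_screen(world_pos, screen_w, screen_h, world_min, world_max)
    return {"student": student_id, "x": x, "y": y}
```

The teacher-end display module would rebuild and redraw one such record per connected helmet each time a viewpoint update arrives.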
5. The method of any one of claims 1 to 4, wherein sending a guidance prompt to the student user comprises:
prompting the student user by means of animated guidance;
wherein the animated guidance specifically comprises:
generating an animated guide from the gaze point of the current virtual reality helmet to the annotation point M.
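The animated guide from the current gaze point to the annotation point M could be driven by simple linear-interpolation waypoints, along which an arrow or particle effect is moved frame by frame. A sketch under that assumption (the step count and function name are illustrative, not from the claim):

```python
def guidance_waypoints(gaze_point, marker, steps=10):
    """Waypoints for an animation leading the student's eye from the current
    gaze point to the annotation point M, via linear interpolation."""
    return [
        tuple(g + (m - g) * t / steps for g, m in zip(gaze_point, marker))
        for t in range(steps + 1)
    ]
```

A smoother effect could substitute an easing curve for the linear parameter `t / steps` without changing the endpoints.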
6. A real-time teaching guidance system in a virtual reality environment, characterized by comprising:
a screen coordinate acquisition module, an annotation editing module, and an annotation message broadcasting module located at the teacher end, and an annotation parsing module, a viewpoint acquisition module, and a control module located at the student end; wherein
the screen coordinate acquisition module is configured to acquire the screen coordinates of an annotation point input by a teacher;
the annotation editing module is configured to convert the screen coordinates of the annotation point into world coordinates of the 3D scene in real time;
the annotation message broadcasting module is configured to broadcast the world coordinate data of the annotation point to the virtual reality helmets at the student end over a network protocol;
the annotation parsing module is configured to resolve the world coordinate data received by the virtual reality helmet, convert the data into local world coordinates, and generate an annotation point M;
the viewpoint acquisition module is configured to acquire viewpoint information of the current student user; and
the control module is configured to determine whether to send a guidance prompt to the student user according to the annotation point M and the viewpoint information of the current student user.
7. The system of claim 6, wherein the viewpoint information comprises a gaze direction;
and wherein determining whether to send a guidance prompt to the student user according to the annotation point M and the viewpoint information of the current student user comprises:
acquiring the direction of the line connecting the annotation point M and the helmet;
determining the angle between the gaze direction of the current student user and the direction of the line connecting the annotation point M and the helmet; and
if the angle is smaller than a specified angle, sending no guidance prompt to the student user; otherwise, sending a guidance prompt to the student user.
8. The system of claim 6, further comprising:
a viewpoint message broadcasting module located at the student end, and a viewpoint icon generating module and a display module located at the teacher end; wherein
the viewpoint message broadcasting module is configured to transmit the viewpoint information of the student users to the teacher end in real time over a network protocol;
the viewpoint icon generating module is configured to generate the current viewpoint icons of all virtual reality helmets at the teacher end; and
the display module is configured to display the viewpoint icon data of all virtual reality helmets to the teacher.
9. The system of any one of claims 6 to 8, wherein sending a guidance prompt to the student user comprises:
prompting the student user by means of animated guidance;
wherein the animated guidance specifically comprises:
generating an animated guide from the gaze point of the current virtual reality helmet to the annotation point M.
CN202011610529.7A 2020-12-30 2020-12-30 Real-time teaching guidance method and system in virtual reality environment Pending CN112732076A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011610529.7A CN112732076A (en) 2020-12-30 2020-12-30 Real-time teaching guidance method and system in virtual reality environment


Publications (1)

Publication Number Publication Date
CN112732076A true CN112732076A (en) 2021-04-30

Family

ID=75610963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011610529.7A Pending CN112732076A (en) 2020-12-30 2020-12-30 Real-time teaching guidance method and system in virtual reality environment

Country Status (1)

Country Link
CN (1) CN112732076A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113641246A (en) * 2021-08-25 2021-11-12 兰州乐智教育科技有限责任公司 Method and device for determining user concentration degree, VR equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106205250A (en) * 2016-09-06 2016-12-07 广州视源电子科技股份有限公司 Lecture system and teaching methods
CN209118529U (en) * 2018-08-29 2019-07-16 重庆和贯科技有限公司 Wisdom education Environment-Ecosystem
CN110728756A (en) * 2019-09-30 2020-01-24 亮风台(上海)信息科技有限公司 Remote guidance method and device based on augmented reality
CN112070641A (en) * 2020-09-16 2020-12-11 东莞市东全智能科技有限公司 Teaching quality evaluation method, device and system based on eye movement tracking



Similar Documents

Publication Publication Date Title
CN104408983B (en) Intelligent tutoring information processing system based on recorded broadcast equipment
CN109509375B (en) Recording and broadcasting integrated intelligent comprehensive screen blackboard system and control method thereof
Scurati et al. Converting maintenance actions into standard symbols for Augmented Reality applications in Industry 4.0
AU2021261950B2 (en) Virtual and augmented reality instruction system
CN202758328U (en) Nanometer interactive electronic white board
CN113129661A (en) VR-based multi-user remote teaching system and teaching method thereof
CN112732076A (en) Real-time teaching guidance method and system in virtual reality environment
KR20210134614A (en) Data processing methods and devices, electronic devices and storage media
CN111709362A (en) Method, device, equipment and storage medium for determining key learning content
CN112052800A (en) Intelligent teaching auxiliary system for foreign language teaching based on Internet of things
CN113391745A (en) Method, device, equipment and storage medium for processing key contents of network courses
JP2005234368A (en) Remote lecture system
US9747813B2 (en) Braille mirroring
CN115691267A (en) Driving simulation control method, system, driving simulator and storage medium
Draxler et al. An Environment-Triggered Augmented-Reality Application for Learning Case Grammar
KR100598939B1 (en) Presentation system of image identification and method thereof
Herheim et al. Scratch programming and student’s explanations
CN112269612A (en) Interactive interactive teaching system supporting same screen
CN114205640B (en) VR scene control system is used in teaching
CN114283638B (en) Online teaching method and device and online teaching cloud platform
CN112261431B (en) Image processing method and device and electronic equipment
CN205160718U (en) Use long -range training set in emergent rehearsal and training system
Hildebrandt et al. Report for 2.2. 1 Task 2: Provide execution support for the human factors studies conducted for the LWRS Control Room Modernization project for PVNGS
CN117151943A (en) Building teaching training system based on virtual reality
Dotsenko Interactive Posters as a Learning Tool for Practical Tasks in the Context of Electrical Engineering Education

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210430