CN116301368A - Teaching method, system and medium based on immersion type XR teaching management platform - Google Patents


Info

Publication number
CN116301368A
Authority
CN
China
Prior art keywords
teaching
scene
management platform
user
objects
Prior art date
Legal status
Granted
Application number
CN202310258381.2A
Other languages
Chinese (zh)
Other versions
CN116301368B (en)
Inventor
蔡铁峰 (Cai Tiefeng)
Current Assignee
Shenzhen Polytechnic
Original Assignee
Shenzhen Polytechnic
Priority date
Filing date
Publication date
Application filed by Shenzhen Polytechnic
Priority to CN202310258381.2A
Publication of CN116301368A
Application granted
Publication of CN116301368B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/20 Education
    • G06Q 50/205 Education administration or guidance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • Educational Technology (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a teaching method, system and medium based on an immersive XR teaching management platform. The method comprises the following steps: the management platform system generates and initializes a management platform scene instance for the user; the scene instance renders an immersive experience picture according to the user's pose information and transmits and displays it to the user, so that the user enters the management platform scene; the management platform scene instance presents teaching object information in response to the user's interactive operations for viewing that information; the management platform system instantiates a teaching application scene, renders an immersive experience picture of that scene according to the user's pose information, transmits and displays it to the user, and stops displaying the platform scene picture, so that the user moves from the teaching platform scene into the teaching application scene. With the invention, the user can interact immersively with teaching objects inside the management platform scene, so that the management platform fully presents the information of every XR teaching object and provides stronger interactive management functions.

Description

Teaching method, system and medium based on an immersive XR teaching management platform
Technical Field
The invention relates to the technical field of immersive teaching based on XR technology, and in particular to a teaching method, system and medium based on an immersive XR teaching management platform.
Background
Virtual simulation, virtual reality (VR), augmented reality (AR) and mixed reality (MR) technologies are mutually compatible and are collectively referred to as extended reality (XR). XR teaching builds virtual or virtual-real combined immersive teaching environments, in which knowledge is presented to students three-dimensionally and immersively through virtual objects, and students can interact with those virtual objects to carry out virtual and real operations. In short, XR teaching is profoundly reshaping the form of teaching.
Existing teaching management platforms present information on a two-dimensional plane through web pages or mobile-phone and PC clients; they cannot effectively present the three-dimensional teaching content of XR teaching or the three-dimensional information of immersive teaching activities, and cannot provide immersive, interactive teaching management capabilities.
Disclosure of Invention
The main purpose of the invention is to provide a teaching method, system and medium based on an immersive XR teaching management platform, which aims to construct a three-dimensional teaching management platform scene with XR technology, use information carriers such as three-dimensional models and scenes to present the three-dimensional information of teaching objects such as personnel, courses and activities, enable a user with an XR terminal to view that three-dimensional information immersively, and also allow immersive interaction with teaching objects inside the management platform scene, so that the management platform fully presents the information of every XR teaching object and provides stronger interactive management functions.
In order to achieve the above purpose, the invention provides a teaching method based on an immersive XR teaching management platform, which comprises the following steps:
step S10, after receiving a user's application to enter the immersive XR teaching management platform scene, the management platform system generates and initializes a management platform scene instance for the user; the scene instance renders an immersive experience picture according to the user's pose information and transmits and displays it to the user, so that the user enters the management platform scene;
step S20, the management platform scene instance presents teaching object information in response to the user's interactive operations for viewing that information;
and step S30, when the user selects a teaching object in the management platform scene and, through interactive operation, issues an instruction to enter the teaching application scene associated with the selected object, the management platform system instantiates that teaching application scene, renders an immersive experience picture of it according to the user's pose information, transmits and displays the picture to the user, and stops displaying the platform scene picture, so that the user moves from the teaching platform scene into the teaching application scene.
In a further technical scheme, the management platform scene instance generated for the user in step S10 presents three-dimensional models, three-dimensional scenes and animations associated with teaching objects.
In a further technical scheme, the management platform scene instance generated for the user in step S10 can also present the live three-dimensional situation of a teaching application scene that a teaching object has in progress.
In a further technical scheme, step S20 further comprises: the user performs interactive operations, within the management platform scene, on the in-progress three-dimensional teaching application scene of a teaching object, and that scene responds, thereby realizing the user's management intervention, from within the teaching management platform, in teaching applications that are in progress.
In a further technical scheme, presenting teaching object information in the management platform scene in step S20 includes presenting the information of all users, including the teaching application any online user is currently experiencing; and in step S30, when the user selects in the management platform an online user object that is experiencing a teaching application and, through interactive operation, issues a command to join the selected user's teaching application experience collaboratively, the system generates a collaborative experience picture of the application scene that the selected user is currently experiencing and displays it to the user, so that the user enters the selected user's scene for multi-person collaborative teaching.
In a further technical scheme, the interactive operations for viewing teaching object information in step S20 include searching for objects, displaying managed objects in batches page by page, and showing/hiding object details; in step S20 the platform can also create, delete and edit teaching objects according to the user's interactive operations.
In a further technical scheme of the invention, the teaching management platform scene instance generated in step S10 consists of a personal module and a public module, and step S10 further comprises:
enabling multiple users to collaborate in real time in the immersive XR teaching management platform.
In a further technical scheme of the invention, the public module of the teaching management platform scene instance generated in step S10 comprises personnel objects, organization objects, announcement objects, resource objects, course objects, activity process record objects and teaching situation data objects, while the personal module comprises personnel objects, organization objects, resource objects, course objects, activity process record objects, activity archive objects, message objects, announcement objects and teaching situation data objects; the teaching situation data generated by teaching applications is aggregated in real time by the management platform system and displayed in the scene instance, so that when multiple users collaborate in real time in the immersive XR teaching management platform, the public module is shared and collaborative while the personal module is neither shared nor collaborative.
To achieve the above object, the invention also proposes a teaching system based on an immersive XR teaching management platform. The system comprises a memory, a processor, and a teaching program based on the immersive XR teaching management platform stored in the memory; when executed by the processor, the program performs the steps of the method described above.
To achieve the above object, the invention also proposes a computer-readable storage medium storing a teaching program based on an immersive XR teaching management platform; when executed by a processor, the program performs the steps of the method described above.
The teaching method, system and medium based on the immersive XR teaching management platform have the following beneficial effects:
According to the technical scheme, after receiving a user's application to enter the immersive XR teaching management platform scene, the management platform system generates and initializes a management platform scene instance for the user; the scene instance renders an immersive experience picture according to the user's pose information and transmits and displays it to the user, so that the user enters the management platform scene. The scene instance presents teaching object information in response to the user's interactive viewing operations. When the user selects a teaching object in the management platform scene and, through interactive operation, issues an instruction to enter the associated teaching application scene, the management platform system instantiates that scene, renders its immersive experience picture according to the user's pose information, transmits and displays it to the user, and stops displaying the platform scene picture, so that the user moves from the teaching platform scene into the teaching application scene. A three-dimensional teaching management platform scene is thus built with XR technology; information carriers such as three-dimensional models and scenes present the three-dimensional information of teaching objects such as personnel, courses and activities; the user can view that information immersively with an XR terminal and can also interact immersively with teaching objects inside the management platform scene. The management platform therefore fully presents the information of every XR teaching object and provides a more powerful interactive management function.
Drawings
FIG. 1 is a flow diagram of a first embodiment of the teaching method of the present invention based on an immersive XR teaching management platform;
FIG. 2 is a system architecture diagram based on an immersive XR teaching management platform;
FIG. 3 is a schematic diagram of a personnel sand table;
FIG. 4 is a schematic view of a management platform scene composition;
FIG. 5 is a simplified schematic diagram of a management platform scenario;
FIG. 6 is a schematic diagram of an example composition of a management platform scenario;
FIG. 7 is a schematic diagram of the refinement flow of step S10;
FIG. 8 is a schematic diagram of a multi-person collaborative management scenario construction;
FIG. 9 is a schematic diagram of a multi-person collaborative management platform scene construction with distributed rendering.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the invention is described in further detail below with reference to the drawings and embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1 to 9, the present invention proposes a teaching method based on an immersive XR teaching management platform, and a first embodiment of the teaching method based on the immersive XR teaching management platform includes the following steps:
Step S10: after receiving the user's application to enter the immersive XR teaching management platform scene, the management platform system generates and initializes a management platform scene instance for the user; the scene instance renders an immersive experience picture according to the user's pose information and transmits and displays it to the user, so that the user enters the management platform scene.
In this embodiment, the teaching method based on the immersive XR teaching management platform can be applied to an XR terminal. The XR terminal can define a user coordinate system by taking a point on the ground plane where the user stands as the origin, defining two coordinate axes on the ground plane and one coordinate axis in the direction perpendicular to the ground plane. The XR terminal has a user positioning function and can generate the user's pose information in this user coordinate system.
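As a concrete illustration of the user coordinate system just described, the following sketch shows one way pose information might be represented. The `Pose` class, its field names and the degree units are assumptions for illustration, not part of the invention.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Pose of the user in the user coordinate system.

    The origin is a point on the ground plane where the user stands;
    x and y span the ground plane, z points vertically upward.
    """
    x: float
    y: float
    z: float
    yaw: float = 0.0    # attitude angles, in degrees (assumed convention)
    pitch: float = 0.0
    roll: float = 0.0

def height_above_ground(pose: Pose) -> float:
    # With the vertical axis defined as z, the tracked device's height
    # above the ground plane is simply the z coordinate.
    return pose.z
```

A scene instance would consume such pose updates every frame to render the user's immersive experience picture.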
The concepts of scene and scene instance used in this embodiment are explained below.
A scene defines the objects contained in a three-dimensional space, their states, each object's own running logic, and the logic of interactions between objects. A scene instance is a program process or thread that runs in real time on computing resources such as processors, memory and graphics cards according to the scene definition; it computes the states of all objects in the scene in real time, renders pictures, and responds to user interactions. When a single scene is experienced by multiple users at the same time and the computing resources available to a single scene instance cannot generate experience pictures for all users in real time, multiple scene instances must be generated for the scene and distributed among the users; object states are synchronized by establishing communication connections among the scene instances, and each instance generates experience pictures in real time for its own users, so that all users share the experienced scene.
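The multi-instance state synchronization just described can be sketched as a toy model. The sketch below uses in-memory peer lists instead of real network connections, purely to illustrate how instances might keep object states consistent; the class and method names are assumptions.

```python
class SceneInstance:
    """Toy scene instance: holds object states and syncs with peer instances."""

    def __init__(self, name):
        self.name = name
        self.objects = {}   # object id -> state dict
        self.peers = []     # connected scene instances (stand-in for network links)

    def connect(self, other):
        # Establish a (symmetric) communication connection between instances.
        self.peers.append(other)
        other.peers.append(self)

    def update_object(self, obj_id, state):
        # Apply a local state change, then broadcast it to every peer so
        # all users sharing the scene observe a consistent object state.
        self.objects[obj_id] = state
        for peer in self.peers:
            peer.objects[obj_id] = state
```

A production system would replace the broadcast loop with the XR engine's networking layer, but the invariant is the same: after an update, every instance holds the same state for the object.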
The system composition of the immersive XR teaching management platform used by the teaching method is explained below.
As shown in FIG. 2, the immersive XR teaching management platform system of the invention is composed of user interfaces 50, management platform scene instances 10, a background service module 20, a database 30, a scene instance generation module 40 and teaching application scene instances 60, and all components of the system can communicate with one another. The user interfaces 50 comprise a 1st user interface (50-1), a 2nd user interface (50-2), ..., a K-th user interface (50-K), where K is a natural number and 1 ≤ K ≤ 100000000. The management platform scene instances comprise a 1st platform scene instance (10-1), a 2nd platform scene instance (10-2), ..., an N-th platform scene instance (10-N), where N is a natural number and 1 ≤ N ≤ 100000000. The scene instance generation module 40 comprises a 1st scene instance generator (40-1), a 2nd scene instance generator (40-2), ..., a J-th scene instance generator (40-J), where J is a natural number and 1 ≤ J ≤ 100000000. The teaching application scene instances 60 comprise a 1st teaching application scene instance (60-1), a 2nd teaching application scene instance (60-2), ..., an M-th teaching application scene instance (60-M), where M is a non-negative integer and 0 ≤ M ≤ 100000000. The background service module 20 comprises an object add/delete service 201, an information read/write service 202, a teaching situation data statistics service 203, a scene instance management service 204, a multi-person activity service 205, an object retrieval service 206 and a file read/write service 207. The database 30 includes object information data 301, scene instance information 302, teaching resource files 303, teaching application files 304, management platform scene program files 305 and teaching situation data 306.
The teaching object types in the object information data 301 stored in the database 30 are: personnel, organization, announcement, message, resource, and XR teaching application class objects. The XR teaching application class further comprises: course, task, activity, activity archive and activity process record objects. The information of each object includes basic information such as an object ID (unique identification number) and an object name. The information of a personnel object also comprises gender, profession, management authority, self-introduction voice, the person's 3D model and action/expression animations. The information of an organization object also includes the member IDs of the organization. The information of a message object also includes the sender's name and ID, the recipient's name and ID, the sending time and the message content. Announcement information also includes the announcement content, release time and publisher. Resource information also includes the resource type, uploader, number of downloads, upload time, an introduction to the resource's function and the resource file address. The information of a course object also includes the course creator's name and ID, creation time, course type, number of users, an introduction to the course content and the course application file address. The information of an activity object also includes the organizer's name and ID, the place, the activity type, participants' names and IDs, the activity status (ready/in progress), the scheduled start and end times, an introduction to the activity content and the activity application file address. Task information includes the task publisher's name and ID, the names and IDs of the personnel who must complete the task, the publication time and completion deadline, an introduction to the task content, the task application file address, and the IDs of personnel who have completed the task.
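The object schema above (shared basic information plus per-type extensions) can be sketched with dataclasses. The field names below are illustrative renderings of the attributes listed, not the patent's actual storage layout.

```python
from dataclasses import dataclass, field

@dataclass
class TeachingObject:
    """Basic information shared by every teaching object."""
    object_id: str   # unique identification number
    name: str

@dataclass
class PersonObject(TeachingObject):
    """Personnel object: basic information plus person-specific fields."""
    gender: str = ""
    profession: str = ""
    permissions: list = field(default_factory=list)   # management authority
    model_3d_path: str = ""      # address of the person's 3D model asset
    intro_voice_path: str = ""   # address of the self-introduction audio

@dataclass
class MessageObject(TeachingObject):
    """Message object: sender/recipient identity, time and content."""
    sender_id: str = ""
    recipient_id: str = ""
    sent_at: str = ""
    content: str = ""
```

The other object types (organization, resource, course, activity, task, archive, record) would extend `TeachingObject` in the same way with the extra fields listed above.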
Activity archives are used to preserve activity state; after an activity is interrupted, it can be restored from the archive. The information of an activity archive object further includes the archiving time, participant IDs, an introduction to the archived content and the archive file address. The archive file stores state values such as the pose (position and attitude angles) of each object in the activity scene. The information of an activity process record object also includes the activity start time, end time, participant IDs, an introduction to the recorded content and the record file address. The activity process record file samples states such as the pose of each object in the scene, moment by moment at a certain frequency, so that the course of the activity can be played back through the XR engine. The message content, announcement content, and content introductions of teaching applications such as courses, tasks, activities, activity archives and activity process records can comprise text, three-dimensional models/scenes and animations. These models/scenes all come in two size versions: miniature models/scenes and large-size models/scenes. Compared with the large-size version, a miniature model/scene omits much detail, so the storage and rendering computation it requires are small and it can be used for object introductions; a large-size model/scene presents rich information at the cost of more storage and rendering computation, and is used to present object details.
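The fixed-frequency sampling of object states into an activity process record can be sketched as follows. Function and parameter names are illustrative, and a real recorder would sample against the XR engine's clock rather than a synthetic loop.

```python
def record_activity(get_scene_state, duration_s, freq_hz):
    """Sample a snapshot of every object's state at a fixed frequency.

    get_scene_state() returns {object_id: state} for the current instant;
    the resulting list of (timestamp, snapshot) pairs can later be replayed
    frame by frame through the XR engine to play back the activity.
    """
    n_samples = int(duration_s * freq_hz)
    # Integer-indexed timestamps avoid floating-point drift from repeated addition.
    return [(i / freq_hz, get_scene_state()) for i in range(n_samples)]

def replay(record):
    """Yield snapshots in time order, as a playback loop would consume them."""
    for timestamp, snapshot in record:
        yield timestamp, snapshot
```

An activity archive, by contrast, would store only the final snapshot, which suffices to restore an interrupted activity.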
The scene instance information 302 stored in the database 30 includes participant IDs, object types, object IDs and the communication addresses of the scene instances. The object type uses enumerated values (0: course, 1: task, 2: activity, 3: management platform). The teaching resource files 303 contain various types of resource files such as pictures, videos, three-dimensional models and scenes. The teaching application files 304 contain XR course, task and activity application files. The teaching situation data 306 includes each student's learning time, each student's degree of mastery of each course, the average mastery of each course's knowledge per class/college/whole school, and the like.
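The enumerated object-type values can be captured directly in code; the Python enum below simply mirrors the mapping given above (0: course, 1: task, 2: activity, 3: management platform).

```python
from enum import IntEnum

class ObjectType(IntEnum):
    """Object-type enumeration as stored in scene instance information 302."""
    COURSE = 0
    TASK = 1
    ACTIVITY = 2
    MANAGEMENT_PLATFORM = 3
```

Using an `IntEnum` keeps the stored integer values interchangeable with the symbolic names when reading or writing scene instance records.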
The management platform scene program file 305 stored in the database 30 is developed on an XR engine such as Unity 3D, using the engine's functions for three-dimensional scene construction, UI development and interactive script development. After the computer system runs the management platform scene program, the generated management platform scene instance has the following functions. The scene instance contains prefabs for each object type: personnel, organization, announcement, message, course, task, activity, activity archive and activity process record. The prefabs present object information with a certain layout and style; the presentation comprises an introduction and details, where the introduction presents part of the key information together with a miniature three-dimensional model/scene, and the details present all or most of the information together with a large-size three-dimensional model/scene. The prefabs also carry interactive response functions. When a scene instance obtains the information of some object, it only needs to instantiate one instance of that object type's prefab, assign the object information to the instance, and set the instance's pose and size in the management platform scene instance; this realizes the loading of the object by the scene instance. The management platform scene instance also has functions for generating/destroying objects, showing/hiding them, displaying each type of object in batches, and automatically setting the pose and size of each object in the scene; it can also present teaching situation data through 2D/3D charts.
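The prefab-based object loading just described can be sketched with a toy stand-in for an XR-engine prefab; this is not Unity code, and the names and dictionary layout are assumptions for illustration.

```python
class Prefab:
    """Toy stand-in for an XR-engine object prefab: one per object type."""

    def __init__(self, object_type):
        self.object_type = object_type

    def instantiate(self, info, pose, scale):
        # Each loaded object is an instance of its type's prefab, carrying
        # its own object information, pose and size within the scene.
        return {"type": self.object_type, "info": info,
                "pose": pose, "scale": scale}

def load_object(scene_objects, prefab, info, pose, scale=1.0):
    """Load one teaching object into a scene instance's object list."""
    instance = prefab.instantiate(info, pose, scale)
    scene_objects.append(instance)
    return instance
```

The key point matches the text: a single prefab per object type is enough, because every concrete object is an instance of the prefab with object information, pose and size assigned after instantiation.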
The above functions can be implemented with development features common to XR engines such as Unity 3D, and are not described further in this specification.
To facilitate the centralized presentation and management of each type of object, in an embodiment of the invention the management platform scene instance generated from the management platform scene program file 305 additionally works as follows. The scene instance includes a number of sand tables: a personnel sand table, organization sand table, announcement sand table, message sand table, resource sand table, course sand table, task sand table, activity process record sand table and activity archive sand table. Each sand table can present several teaching objects of the same type; for example, the personnel sand table can present the information of several people at the same time. There is also a teaching situation data sand table, which centralizes the presentation and management of teaching situation data. To avoid presenting too many objects in a sand table at once, teaching objects can be loaded in batches, and the user can switch which batch of teaching objects is loaded. Each sand table also provides functions for adding and deleting teaching objects: a new-object UI control is provided, and after it is clicked an object-editing UI appears, through which object information is entered or associated with information in the database, completing the creation or editing of a teaching object. The object prefabs in the management platform scene provide a delete-teaching-object UI control. These functions can likewise be implemented with development features common to XR engines such as Unity 3D and are not described further in this specification. The object prefabs and sand tables are described below, taking personnel and activities as examples.
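Batch loading of sand-table objects amounts to paging. A minimal sketch (function names assumed):

```python
def load_batch(objects, batch_index, batch_size):
    """Return one batch of teaching objects for display in a sand table.

    Loading in batches keeps a sand table from presenting too many objects
    at once; the user's previous/next-page controls change batch_index.
    """
    start = batch_index * batch_size
    return objects[start:start + batch_size]

def batch_count(objects, batch_size):
    # Number of batches (pages) needed to show every object.
    return (len(objects) + batch_size - 1) // batch_size
```

When the user pages forward, the sand table unloads the current batch's prefab instances and loads the next batch's, so memory and rendering cost stay bounded by the batch size.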
As shown in fig. 3, the personnel object prefab comprises a personnel introduction and personnel details. The introduction includes the person's miniature model with action/expression animations, name, profession and class; the details include the person's large-size model with action/expression animations, name, gender, profession, college, class, current status, the XR teaching activity currently in progress, self-introduction voice, and so on. The person's 3D model has different display states corresponding to the person's current status. For example: when the person is offline, the 3D model is static and its texture turns gray; when the person is on the immersive XR teaching management platform, the model is static but the texture is colored; when the person is in an XR teaching application experience, the model texture is colored and the walking animation of the three-dimensional model is played. Clicking a personnel introduction displays the personnel details; clicking it again hides them. When the user has clicked one personnel introduction to display that person's details and then clicks another, the sand table hides the previous person's details and displays the current person's. The personnel object prefab also provides a UI control through which other users can collaboratively join the teaching application experience the person is currently in, as well as UI controls for paging through objects in batches, searching, and adding/deleting objects.
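The status-dependent display of the person's 3D model described above maps cleanly to a small lookup. The status strings below are assumptions for illustration; the display rules follow the example in the text.

```python
def model_display_state(person_status):
    """How the personnel 3D model is shown for each current status."""
    states = {
        # Offline: model static, texture turned gray.
        "offline": {"texture": "gray", "animation": None},
        # On the immersive XR teaching management platform: static, colored.
        "on_platform": {"texture": "color", "animation": None},
        # Inside an XR teaching application experience: colored, walking.
        "in_xr_application": {"texture": "color", "animation": "walking"},
    }
    return states[person_status]
```

The personnel prefab would query this mapping whenever the person's status changes and update the model's texture and animation accordingly.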
The profile of the activity object preform contains a miniature three-dimensional scene of the activity, the activity name, the reserved time, and the creator's name. The activity details comprise a large-size three-dimensional scene of the activity, the activity name, the reserved start and end times, the creator's name, the participants' names, the activity venue, and a text introduction of the activity content. The activity details are displayed when the activity profile is clicked and hidden when the profile is clicked again. The activity object preform also provides UI controls for the user to participate in the activity, go to the next/previous page, add, delete, search, and so on.
As shown in fig. 4, any management platform scene instance among the management platform scene instances 10 is composed of a public module 101 and a personal module 102. Both the public module 101 and the personal module 102 are used to present platform teaching object information and respond to interactive operations, but the personal module 102 is limited to presenting platform teaching object information that is relevant to the individual user. As shown in fig. 4, the public module 101 includes: personnel sand table 1011, organization sand table 1012, announcement sand table 1013, resource sand table 1014, course sand table 1015, activity sand table 1016, activity process record sand table 1017, and teaching situation data sand table 1018. As shown in fig. 4, the personal module 102 includes: personnel sand table 1021, organization sand table 1022, resource sand table 1023, course sand table 1024, activity sand table 1025, task sand table 1026, activity process record sand table 1027, activity archive sand table 1028, teaching situation data sand table 1029, message sand table 1030, and announcement sand table 1031.
In the personal module 102, the personnel objects contained in the personnel sand table 1021 are the persons who participate in teaching activities together with the individual user or belong to the same teaching organizations; the organization objects of the organization sand table 1022 are the organizations the individual user participates in or has formed; the resource objects of the resource sand table 1023 are resources created or used by the individual user; the course objects of the course sand table 1024 are courses created, participated in, or collected by the individual user; the activity objects of the activity sand table 1025 are activities created, participated in, or followed by the individual user; the task objects of the task sand table 1026 are tasks created and participated in by the individual user; the activity process record objects of the activity process record sand table 1027 and the archive objects of the activity archive sand table 1028 are the recorded processes and archives of the individual user's participation in XR teaching activities; the teaching situation data contained in the teaching situation data sand table 1029 are teaching situation data generated by the individual user's teaching application experiences or by organizations related to the individual user; the messages of the message sand table 1030 are messages sent or received by the individual user; and the announcements of the announcement sand table 1031 are announcements issued by the individual user. Fig. 5 is a simplified schematic diagram of a management platform scene in which only the personnel sand table, the course sand table, and the teaching situation data sand table are presented.
In order to provide real-time collaborative management of XR teaching by multiple persons, as shown in fig. 6, the personal module 102 in the management platform scene instance 10 further includes a 1st personal module (102-1), a 2nd personal module (102-2), ..., and an Xth personal module (102-X), where X is a natural number and 1 ≤ X ≤ 100000000. The number of personal modules in the management platform scene instance corresponds to the number of users performing real-time collaborative management. Each of them contains all of the types of sand tables in the personal module 102 of fig. 4.
In this embodiment, in step S10, the management platform scene instance generated for the user presents the three-dimensional models, three-dimensional scenes, and animations associated with the teaching objects, and also presents the live three-dimensional situation of the teaching application scene while a teaching object's application is in progress.
As shown in fig. 7, in this embodiment, the step S10 specifically includes the following steps:
step S101, generating a management platform scene instance for a user.
Let any user p_i have the ID P_i^0, and let the communication address of its user interface (50-i) be D_i^0 (for example, this address may be the IP address and port number of the user terminal). The user interface (50-i) sends the "enter immersive management platform scene" instruction [M_0, P_i^0, D_i^0] to the background service module 20, where M_0 is the ID of the instruction "enter immersive management platform scene". After receiving [M_0, P_i^0, D_i^0], the background service module 20 activates the response corresponding to ID M_0: it calls the scene instance management module 204 to send a "generate management platform scene instance" instruction to the scene instance generation module 40, the first element of which is the instruction ID. After the scene instance generation module 40 receives this instruction, it activates the response corresponding to that instruction ID: it invokes 1 scene instance generator with sufficient computing resources, which reads the management platform scene program file 305 from the database 30 and starts this program with the parameters (P_i^0, D_i^0), generating the management platform scene instance (10-i).
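The message flow of step S101 is an instruction tuple whose first element is an instruction ID, dispatched to the handler registered for that ID. A minimal sketch (hypothetical names; the real modules 20 and 40 are networked services rather than in-process functions):

```python
def make_dispatcher():
    """Registry of responses keyed by instruction ID, as in the background
    service module: receiving a message activates the matching response."""
    handlers = {}

    def register(instruction_id, handler):
        handlers[instruction_id] = handler

    def dispatch(message):
        # message = [instruction_id, *args], e.g. [M0, user_id, user_address]
        instruction_id, *args = message
        return handlers[instruction_id](*args)

    return register, dispatch

register, dispatch = make_dispatcher()

# On M0 ("enter immersive management platform scene"), the background service
# would ask the scene instance generation module to start a platform scene.
M0 = "enter_platform_scene"
register(M0, lambda user_id, addr: f"platform scene instance for {user_id} at {addr}")
```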
Step S102, the user interface establishes communication with the management platform scene instance.
After the management platform scene instance (10-i) is generated, it sends an application to establish communication (for example, a socket of the TCP/UDP protocol) to the communication address D_i^0 of the user interface (50-i); after receiving the application, the user interface (50-i) establishes communication with the management platform scene instance (10-i). Once communication is established, the user interface (50-i) sends the user's pose and interactive operation information under the user coordinate system to the management platform scene instance (10-i) in real time; the pose and interactive operation information are converted from the user coordinate system to the management platform scene instance coordinate system, and the management platform scene instance (10-i) responds to the interactive operations, renders a user experience picture in real time according to the user's pose, and sends it to the user interface (50-i), which displays the picture for user p_i to view.
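The coordinate conversion in step S102 is not specified in detail; a minimal sketch, assuming the user coordinate system differs from the scene instance coordinate system only by the spawn translation and heading assigned to the user in the scene (an assumption for illustration):

```python
def user_to_scene(pose, spawn):
    """Convert a pose (x, y, z, yaw_degrees) reported in the user coordinate
    system into the management-platform-scene-instance coordinate system,
    assuming the two frames differ only by the user's spawn offset/heading."""
    x, y, z, yaw = pose
    sx, sy, sz, syaw = spawn
    return (x + sx, y + sy, z + sz, (yaw + syaw) % 360.0)
```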
Step S103, the management platform scene instance reads teaching object information.
The management platform scene instance (10-i) sends a "read all teaching objects" message M_1 to the background service module 20; the background service module 20 responds by calling the information read/write service 202, reading the information of all teaching objects in the object information data 301 from the database 30, generating the various teaching object sequences after sorting by a certain rule (for example, object creation time), and also reading the teaching situation data 306 from the database 30. When reading the information of all teaching objects, large files in the teaching object information, such as pictures, videos, and three-dimensional models/scenes, may be skipped for the time being and read from the database when the management platform scene loads the objects. The generated teaching object sequences of the various types are as follows: personnel object sequence O_0, organization object sequence O_1, message object sequence O_2, announcement object sequence O_3, resource object sequence O_4, course object sequence O_5, activity object sequence O_6, task object sequence O_7, activity archive sequence O_8, and activity process record object sequence O_9.
The management platform scene instance traverses O_0, O_1, O_3, O_4, O_5, O_6, O_7, O_8, and O_9, filters out all objects belonging to user p_i's personal module, and generates the personal module's personnel object sequence O_0^i, organization object sequence O_1^i, message object sequence O_2^i, announcement object sequence O_3^i, resource object sequence O_4^i, course object sequence O_5^i, activity object sequence O_6^i, task object sequence O_7^i, activity archive sequence O_8^i, and activity process record object sequence O_9^i. User p_i's data is also copied from the teaching situation data to the teaching situation data sand table in p_i's personal module.
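Step S103's split into public and personal sequences can be sketched as a filter over each typed object sequence (assumed data shapes; the patent does not give a concrete data model):

```python
def personal_sequences(sequences, user_id, relates_to):
    """From each global teaching object sequence, keep only the objects that
    relate to the given user (created by, participated in, etc.), producing
    the personal-module sequences."""
    return {name: [o for o in objs if relates_to(o, user_id)]
            for name, objs in sequences.items()}

# Example global sequences (hypothetical records).
sequences = {
    "courses":    [{"id": "c1", "members": {"p1", "p2"}},
                   {"id": "c2", "members": {"p3"}}],
    "activities": [{"id": "a1", "members": {"p1"}}],
}
mine = personal_sequences(sequences, "p1",
                          lambda obj, uid: uid in obj["members"])
```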
Step S104, the management platform scene instance loads the teaching object into the management platform scene.
The personnel sand table of the public module 101 of the management platform scene instance (10-i) traverses the personnel object sequence O_0 and, for each personnel object in O_0 in turn, generates 1 personnel object preform instance, assigns the personnel object's name, ID, character model, sex, major, current state, and other information to the preform instance, and sets the preform instance's pose and size in the sand table and its show/hide state, thereby realizing the loading of personnel objects by the public module 101 of the management platform scene instance (10-i). Following the same method, the public module loads organization objects, announcement objects, resource objects, course objects, activity objects, task objects, activity process record objects, and teaching situation data, and the management platform scene instance (10-i) likewise loads, in the personal module 102, personnel objects, organization objects, message objects, announcement objects, resource objects, course objects, activity objects, task objects, activity archives, activity process record objects, and teaching situation data, thereby completing the initialization of the management platform scene. Because a sand table can display teaching objects of each type in batches, the management platform scene can load the teaching objects in batches.
Step S105, updating teaching objects and information thereof in the management platform scene instance.
For any user p_i, the management platform scene instance (10-i) sends requests to the background service module 20 at a certain frequency and re-reads the object information from the object information data 301 of the database 30, so as to update the various teaching object sequences and object information, load newly added teaching objects or unload deleted teaching objects according to the updated teaching object sequences, object information, and teaching situation data, and reassign the changed object information to the teaching objects in the management platform scene instance. Only the information of teaching objects that has changed needs to be read again; the changeable information includes: the state values of personnel objects, the name of the activity a person is currently performing, the activity ID, the activity state, the activity participants, the IDs of persons who have completed tasks, and the like.
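The periodic refresh of step S105 can be sketched as a diff between the previously loaded object table and a fresh read, yielding the objects to load, unload, and reassign (hypothetical dict-of-dicts representation):

```python
def diff_objects(loaded, fresh):
    """Compare the objects currently loaded in the scene with a fresh read of
    the object information data. Returns (to_load, to_unload, to_update),
    where to_update holds the IDs whose info (e.g. a person's state value or
    current activity) changed and must be reassigned to the scene objects."""
    to_load   = [oid for oid in fresh if oid not in loaded]
    to_unload = [oid for oid in loaded if oid not in fresh]
    to_update = [oid for oid in fresh
                 if oid in loaded and fresh[oid] != loaded[oid]]
    return to_load, to_unload, to_update
```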
In addition, in the profile and details of an activity object in the management platform scene instance, a three-dimensional activity scene is used to represent the activity content. When the activity is in progress, the activity state is synchronized in real time to the three-dimensional scene of the activity object in the management platform scene instance, so that the user can immersively view the live situation of the activity in real time in the management scene.
Let any users p_{i_1}, p_{i_2}, ..., p_{i_n} participate together in a collaborative teaching activity whose activity ID is denoted Γ_2. The system generates activity Γ_2 scene instances for these users; depending on the number of users and the computing resources available to a single scene instance, 1 or more scene instances may be generated, with state synchronization between the scene instances. These scene instances are assigned to the users for generating the immersive experience of the XR teaching activity. By calling the multi-person activity service of the background service module 20, these scene instances write the activity ID Γ_2, the scene instance communication addresses, and the users' ID numbers into the scene instance information 302 of the database 30. The following describes, by way of example, a method for implementing real-time updating of the three-dimensional scene in an activity object's profile and details:
For any user p_i, if the management platform scene instance (10-i) detects that the activity object with ID Γ_2 is in the "activity in progress" state and the activity is not yet being presented in the platform scene, it sends the "query the activity instance address according to activity ID" instruction [M_2, Γ_2] to the background service module 20. After receiving the instruction, the background service module 20 activates the response corresponding to instruction ID M_2: it calls the multi-person activity service 205 to query the activity instance communication address d_0 from the scene instance information 302 of the database 30 (the communication address of any instance of the activity may be selected) and returns it to the management platform scene. The platform scene instance (10-i) sends an application to establish communication to the communication address d_0, and the activity scene instance s_0 at address d_0 establishes a communication connection with the platform scene instance (10-i) after receiving the application. The activity scene instance s_0 transmits the states of all objects in the activity scene to the platform scene instance (10-i) in real time, and the platform scene instance (10-i) maps the received real-time state values of all objects onto the three-dimensional scene objects of the activity object's profile and details in the following way: let s̃_0 be the three-dimensional scene in the profile or details of the activity object with ID Γ_2, and let the scale from scene instance s_0 to s̃_0 be λ_0; that is, if any object u_0 in s_0 has a corresponding miniature object ũ_0 in scene s̃_0, then the size ratio of ũ_0 to u_0 is λ_0. If at any moment the pose of u_0 in the activity scene has coordinate value (c_x, c_y, c_z) and angle value (θ_x, θ_y, θ_z), then the coordinate value λ_0·(c_x, c_y, c_z) and the angle value (θ_x, θ_y, θ_z) are set as the pose of ũ_0 in the coordinate system of scene s̃_0. In addition, the state values of ũ_0 and u_0, such as animation states, are mapped one to one. The three-dimensional scene in the profile and details of the activity object in the management platform scene is thereby updated in real time, so that the user can watch the live situation of the activity in real time in the management platform scene.
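The pose mapping just described, with coordinates scaled by the scene scale and angles copied unchanged, can be sketched directly:

```python
def map_to_miniature(coord, angle, scale):
    """Map the pose of an object u0 in the full-size activity scene s0 onto
    its miniature counterpart in the activity object's profile/detail scene:
    the position is multiplied by the scene scale (lambda_0), while the
    orientation angles are kept as-is."""
    cx, cy, cz = coord
    return (scale * cx, scale * cy, scale * cz), angle
```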
By adopting the same implementation mode, when other XR teaching applications such as courses, tasks and the like are in progress, the state of the teaching application scene is synchronized to the three-dimensional scene of the teaching object in the management platform scene instance in real time, so that the management platform scene instance presents the live condition of the three-dimensional scene of the teaching application in progress.
In step S20, the management platform scene presents teaching object information according to the user's interactive operations for viewing teaching object information, and according to interactive operations it can also manage and intervene in ongoing teaching applications and create, delete, and edit teaching objects.
In this embodiment, the step S20 includes receiving an interactive input and responding to the interactive input by the management platform scene instance.
Specifically, the user performs interactive operations on the three-dimensional scene, within the management platform scene, of a teaching object's teaching application in progress, and that three-dimensional scene responds, realizing the user's management intervention, from within the teaching management platform, in the teaching application in progress.
In step S20, the teaching object information presented by the management platform scene includes the information of all users, including the information of the teaching application any online user is currently experiencing.
In step S20, the user's interactive operations for viewing teaching object information include searching for objects, displaying objects in batches via previous page/next page, and showing/hiding object details; step S20 further includes the platform creating, deleting, and editing teaching objects according to the user's interactive operations.
In this embodiment, the management platform scene provides interactive response functions such as creating/deleting teaching objects, searching for objects, displaying teaching objects in batches via previous page/next page, showing/hiding object details, and entering XR teaching application scenes such as courses/tasks/activities; when the management platform scene instance receives the user's interactive input, the corresponding interactive response function of the management platform scene is activated. The interactive functions realized through UI controls, such as creating/deleting teaching objects, searching for objects, displaying teaching objects in batches via previous page/next page, and showing/hiding object details, can be implemented with common technical means and are not described here. In the management platform scene instance, the user can interact with virtual objects constructed from the three-dimensional models in object profiles or details, can interact with virtual objects in the three-dimensional scenes in object profiles or details, and can also manage and intervene in ongoing activities through interaction with the three-dimensional scene in an activity object's profile or details. The implementation of three-dimensional model/three-dimensional scene interaction in object details is described in detail below.
For virtual objects constructed from the three-dimensional models in the profiles or details of persons, courses, activities, and the like, and virtual objects in the three-dimensional scenes in the profiles or details of courses, activities, and the like, the user can interact with the virtual objects directly in the teaching management platform scene. The interactive operations include selecting, translating, and enlarging/shrinking a virtual object, and interaction activates response functions of the virtual object such as animations, for example: when the user touches the character model in a personnel object, the character model waves its hand and plays a voice.
When the user interacts, in the teaching management platform scene, with the profile or detail three-dimensional scene of an activity object whose activity is in progress, management intervention can be performed on the ongoing activity. The specific implementation is as follows: in step S20, for any activity in progress, let the activity ID be denoted Γ_2. Any user p_i's management platform scene instance (10-i) establishes communication with 1 activity scene instance of activity Γ_2, denoted s_0. s_0 sends the pose, animation, and other states of all virtual objects in the scene to the management platform scene instance (10-i) in real time, and the management platform scene instance (10-i) maps the received object pose, animation, and other states onto each virtual object of the three-dimensional scene in the profile or details of the activity object with ID Γ_2. Therefore, when the user interacts with the three-dimensional scene in the activity object's profile and details, the resulting changes in virtual object states such as pose and animation are inversely mapped back to the activity scene instance s_0, realizing user p_i's management intervention in activity Γ_2 from within the management platform scene instance (10-i). This implementation requires that the three-dimensional scene of the activity object in the management platform scene instance contain the corresponding interactive response logic.
Another implementation does not require the three-dimensional scene of the activity object in the management platform scene instance to contain interactive response logic, thus simplifying the activity scene object. The implementation method is as follows: the management platform scene sets a three-dimensional response interval for each activity object. For any user p_i performing an interactive operation in the management platform scene instance (10-i), when the position parameter of the interactive operation falls within the three-dimensional response interval of activity object Γ_2, the pose parameters of the interactive operation are first converted into the pose in the coordinate system of the three-dimensional scene s̃_0 in the activity object's profile or details and then, according to the scale from s_0 to s̃_0, converted into the pose in the coordinate system of the activity scene instance s_0. The interactive operation after pose conversion is sent to the activity scene instance s_0; the activity scene instance s_0 responds to the interactive operation and updates the resulting scene state changes to the management platform scene's activity object in real time.
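The conversion chain of this second implementation inverts the miniature mapping: a position in the miniature scene's coordinate system is divided by the scale λ_0 to land in the full-size activity scene. A minimal sketch under the same uniform-scale assumption:

```python
def interaction_to_activity(pos_in_miniature, scale):
    """Invert the miniature mapping: divide a position expressed in the
    miniature profile/detail scene's coordinate system by lambda_0 to obtain
    the corresponding position in the full-size activity scene instance s0,
    so s0 can respond to the interaction itself."""
    x, y, z = pos_in_miniature
    return (x / scale, y / scale, z / scale)
```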
Similarly, management intervention of teaching applications such as teaching tasks, courses and the like in progress can be realized in the management platform.
In step S30, when the user selects a teaching object in the management platform scene, and issues a teaching application scene instruction associated with entering the selected teaching object through interactive operation, the management platform system instantiates a teaching application scene, renders an immersive experience picture of the teaching application scene according to the pose information of the user, transmits and displays the immersive experience picture to the user, stops displaying the platform scene picture, and enables the user to enter the teaching application scene from the teaching platform scene.
In this embodiment, step S30 includes the user entering a teaching application scene from the management platform scene. Specifically, XR teaching application class objects such as courses, activities, and tasks all provide an enter-application UI: the UI of a course object is the "enter course learning" button, the UI of an activity object is the "participate in activity" button, and the UI of a task object is the "do task" button. In addition, personnel objects are provided with a "collaborate" button: when the user selects, in the management platform, any user object that is experiencing a teaching application online and, through interactive operation, issues an instruction to join the collaborative experience of the selected user object's teaching application, the system generates for the user a collaborative experience picture of the application scene the selected user is currently experiencing, so that the user enters the selected user's scene for multi-person collaborative teaching.
Any user p_i clicks the "enter course learning" button of the course object with ID c_j on the management platform scene instance (10-i). The management platform scene instance (10-i) responds by sending the "open course c_j learning for user p_i" instruction [M_3, c_j, P_i^0, D_i^0] to the background service module 20, where P_i^0 is user p_i's ID and D_i^0 is the communication address of the user interface (50-i). After receiving the instruction, the background service module 20 activates the corresponding response according to instruction ID M_3: it calls the scene instance management service 204 and sends a "generate course application c_j scene instance for user p_i" instruction to the scene instance generation module 40. When the scene instance generation module 40 receives this instruction, it selects 1 scene instance generator with sufficient free computing resources; this generator reads the program file of the course application with ID c_j from the teaching application file 304 of the database 30 and starts the course application program with the parameters (P_i^0, D_i^0), generating the course application scene instance s_i. s_i is included among the teaching application scene instances 60. s_i sends an application to establish a communication connection to the communication address D_i^0; p_i's user interface (50-i) receives the application and establishes a communication connection with s_i (at the same time it stops communicating with the platform scene instance 10-i and stops displaying the platform scene experience picture to the user). On this basis, the user interface (50-i) sends p_i's real-time pose information and interactive operation information to s_i; s_i converts the pose information into the user's real-time pose in the course scene instance s_i, renders user p_i's experience picture in s_i accordingly, and transmits it to the user interface (50-i) for user p_i to view. s_i writes the course ID c_j of this scene instance, the user ID P_i^0, the scene instance communication address, and other information into the database 30 as scene instance information 302.
Any user p_i clicks the "do task" button of the task object with ID b_j on the management platform scene instance (10-i); the system generates a task scene instance s_k, and the process by which the user interface (50-i) and s_k establish a communication connection is consistent with the "enter course learning" response process above. The communication address of scene instance s_k, the user ID P_i^0, the task ID b_j, and other information are saved to the scene instance information 302. However, a task may be a multi-person collaborative task, and the scene instances of the same task need to be synchronized. Therefore the task scene instance s_k also sends a "query the communication addresses of the other scene instances of the task object with ID b_j" instruction to the background service module 20; the background service module 20 calls the multi-person activity service 205 to query the scene instance information 302 of the database 30. If scene instances of task b_j other than s_k are successfully found, s_k sends applications to establish communication connections to the communication addresses of the queried scene instances, and then establishes communication connections and synchronizes scene states with the task scene instances at those addresses.
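The peer lookup a task scene instance performs, finding the other instances of the same task while excluding itself, can be sketched as a query over the scene instance records (hypothetical record fields):

```python
def peer_addresses(records, task_id, self_address):
    """From the scene instance information table, return the communication
    addresses of the other scene instances of the same task, so this instance
    can connect to them and synchronize scene state."""
    return [r["address"] for r in records
            if r["app_id"] == task_id and r["address"] != self_address]

# Example scene instance information rows.
records = [
    {"app_id": "b1", "address": "host1:9000"},
    {"app_id": "b1", "address": "host2:9000"},
    {"app_id": "b2", "address": "host3:9000"},
]
```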
After any user p_i clicks the "participate in activity" button of an activity object on the management platform scene instance (10-i), the system's response process is the same as the response process after clicking the "do task" button.
When a user ends a teaching application experience, the system clears the scene instance information related to that teaching application experience from the scene instance information 302. For any user p_i, the management platform scene instance (10-i) accesses the scene instance information 302 of the management platform through the background service module 20 at a certain time interval; when it queries a scene instance belonging to any user p_k, the scene instance information read includes [P_k^0, δ, b_j, ...], where P_k^0 is user p_k's ID, δ is the teaching application experience type (δ takes the values 0, 1, and 2, corresponding to courses, tasks, and activities respectively), and b_j is the ID of the course, task, or activity. When user p_i selects the person object corresponding to p_k in the management platform scene and clicks the "collaborate" button, the management platform scene instance (10-i) responds and sends the "open teaching application collaboration for user p_i" instruction [M_5, δ, b_j, P_i^0, D_i^0] to the background service module 20, where P_i^0 is user p_i's ID and D_i^0 is the communication address of p_i's user interface (50-i). After receiving the instruction, the background service module 20 activates the corresponding response according to instruction ID M_5: it calls the scene instance management service 204 and sends a "generate teaching application scene instance for the user" instruction to the scene instance generation module 40. When the scene instance generation module 40 receives this instruction, it selects 1 scene instance generator with sufficient free computing resources; this generator reads the program file of the teaching application with type δ and ID b_j from the teaching application file 304 of the database 30 and starts the application program, generating a new teaching application scene instance. The new scene instance sends an application to establish a communication connection to the communication address D_i^0 and establishes a communication connection with the user interface (50-i), and the user interface (50-i) stops receiving the experience pictures of the platform scene instance; at the same time, the new scene instance sends an application to establish a communication connection to the communication address of user p_k's teaching application scene instance, establishes the connection, and synchronizes scene states, realizing collaboration in the teaching application experience; the information of the new scene instance is saved to the scene instance information 302. Thus, when a user selects, in the management platform, any user object that is experiencing a teaching application online and issues, through interactive operation, an instruction to join the collaborative experience of the selected user object's teaching application, the system generates for the user a collaborative experience picture of the application scene the selected user is currently experiencing, so that the user enters the selected user's scene for multi-person collaborative teaching.
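The collaboration flow above reads a peer's scene instance record and interprets the experience type δ (0 = course, 1 = task, 2 = activity). A minimal sketch of that record lookup (hypothetical field names):

```python
# delta -> experience type, as defined in the text.
EXPERIENCE_TYPES = {0: "course", 1: "task", 2: "activity"}

def find_collaboration_target(records, peer_user_id):
    """Return (experience_type, app_id, address) for the teaching application
    the selected online user is currently experiencing, or None if that user
    has no active teaching application scene instance."""
    for r in records:
        if r["user_id"] == peer_user_id and r["delta"] in EXPERIENCE_TYPES:
            return EXPERIENCE_TYPES[r["delta"]], r["app_id"], r["address"]
    return None
```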
Further, based on the first embodiment shown in fig. 1, a second embodiment of the teaching method based on the immersive XR teaching management platform of the present invention is proposed, and the difference between this embodiment and the first embodiment shown in fig. 1 is that, in this embodiment, the teaching management platform scene instance generated in the step S10 is composed of a personal module and a public module, and the step S10 further includes:
And step S106, enabling multiple persons to collaborate in real time in the immersive XR teaching management platform, wherein during collaboration the public modules can be shared and collaborated on, while the personal modules cannot.
As shown in FIG. 8, this embodiment also constructs a multi-person immersive management scene, so that the platform provides a real-time collaborative management function. Specifically, in step S106, the user's ID number and the communication address of the user's management platform scene instance are registered in the scene instance information 302. For any user p i, its ID (formula omitted) and the communication address (formula omitted) of its management platform scene instance (10-i) are written as a scene instance information record (formula omitted) into the scene instance information 302 of the database 30, where the value 3 in the record indicates that this scene instance is a management platform scene, and the value 0 in the record is meaningless here.
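Under the description above, a scene instance information record could be pictured as the following minimal sketch. The field names are assumptions; only the type value 3 (management platform scene) and the meaningless 0 come from the text.

```python
# Hypothetical shape of a scene-instance information record; field names are
# illustrative, only the 3/0 semantics follow the description above.

def make_management_scene_record(user_id, instance_address):
    return {
        "user_id": user_id,            # the user's ID number
        "address": instance_address,   # communication address of instance (10-i)
        "scene_type": 3,               # 3 = management platform scene
        "reserved": 0,                 # value 0, meaningless for this scene type
    }

record = make_management_scene_record("id_p_i", "addr_10_i")
print(record["scene_type"])  # 3
```
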
In this embodiment, step S40 further includes the following. When user p i, in the management platform scene instance (10-i), selects any user object p k whose ID number is (formula omitted) and whose state is "in management platform scene", and clicks the "collaboration" button, the management platform scene instance (10-i) queries, via the background service module 20, the communication address D k 2 of user p k's management platform scene instance (10-k) from the scene instance information 302. The instance (10-i) then sends a communication connection establishment application to the instance (10-k); after receiving the application, the instance (10-k) establishes a communication connection with the instance (10-i). Based on this connection, the two instances synchronize the public module state in real time while the personal modules are not synchronized, thereby realizing collaboration between users p i and p k in the management platform scene. During collaboration, the public modules are shared and collaborative while the personal modules are neither shared nor collaborative, which protects personal privacy.
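The selective state synchronization described above — only the public module state crosses the connection, while the personal module stays local — can be sketched as follows. The dictionary layout of the scene state is an assumption for illustration.

```python
# Sketch of selective synchronization: merge only the "public" portion of two
# scene states; the "personal" portion is never shared, preserving privacy.
# The state layout is an illustrative assumption.

def sync_states(local_state, remote_state):
    merged_public = {**local_state["public"], **remote_state["public"]}
    local_state["public"] = merged_public
    remote_state["public"] = dict(merged_public)
    # Personal modules are intentionally left untouched.

state_i = {"public": {"bulletin": "v1"}, "personal": {"messages": ["hi"]}}
state_k = {"public": {"courses": ["XR 101"]}, "personal": {"messages": []}}
sync_states(state_i, state_k)
print(state_i["public"])  # {'bulletin': 'v1', 'courses': ['XR 101']}
```
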
In this embodiment, the public module of the teaching management platform scene instance generated in step S10 includes personnel objects, organization objects, bulletin objects, resource objects, course objects, activity objects, activity process record objects, and teaching situation data objects; the personal module includes personnel objects, organization objects, resource objects, course objects, activity process record objects, activity archive objects, message objects, bulletin objects, and teaching situation data objects. The teaching situation data generated by teaching applications is counted in real time by the management platform system and displayed in the management platform scene instance.
Furthermore, the immersive XR teaching management platform aims to cover all teachers and students of a school, most courses, and all links of teaching; it therefore contains teaching objects of many types and in large numbers, and constructing the management platform scene requires substantial computing resources. Thus, based on the first and second embodiments described above, a third embodiment of the teaching method based on the immersive XR teaching management platform of the present invention is proposed.
This embodiment improves on the first and second embodiments: the three-dimensional scenes in the briefs and details of XR teaching applications such as courses, tasks, and activities adopt three-dimensional voxel maps or three-dimensional voxel videos. When the management platform scene is updated, the activity/task objects synchronize the states of virtual objects as follows: the activity/task scene instances generate scene voxel maps in real time and push them to the management platform scene instance, where they are loaded in real time into the briefs and details of the activity/task objects. In step S40, when the user interacts with the three-dimensional scene voxel map or voxel video of an activity/task scene object, and the position parameter of the interaction falls within the three-dimensional interval occupied by the voxel map, the interaction operation is forwarded to the application scene instance.
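The dispatch rule in step S40 — forward an interaction to the application scene instance only when its position parameter lies within the three-dimensional interval occupied by the voxel map — might look like this sketch. The axis-aligned bounds and handler signatures are assumptions.

```python
# Sketch of the interaction dispatch rule: forward to the application scene
# instance only when the interaction position is inside the voxel map's
# three-dimensional interval. Bounds and handler names are assumptions.

def dispatch_interaction(pos, voxel_bounds, forward, handle_locally):
    (xmin, ymin, zmin), (xmax, ymax, zmax) = voxel_bounds
    x, y, z = pos
    inside = xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax
    return forward(pos) if inside else handle_locally(pos)

bounds = ((0, 0, 0), (1, 1, 1))
result = dispatch_interaction(
    (0.5, 0.5, 0.5), bounds,
    forward=lambda p: "forwarded to app instance",
    handle_locally=lambda p: "handled by platform scene",
)
print(result)  # forwarded to app instance
```
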
This has the beneficial effect that the models and logic in a teaching application need not be directly exposed to the management platform scene, which makes it convenient to protect the copyright of the teaching application.
It should be noted that, in this embodiment, the implementation scheme adopted for generating and loading the voxel map may refer to the invention patent application with application number 202210427227.9, entitled "Immersive class construction method, system and medium based on XR technology".
Based on the first and second embodiments, the present invention also provides a fourth embodiment based on the immersive XR teaching management platform. In this embodiment, when the live three-dimensional scene of a teaching application in progress is presented three-dimensionally for a teaching object in the management platform scene, a mode of receiving and displaying stereoscopic live broadcast pictures of the teaching application is adopted.
This embodiment has the beneficial effect that presenting the live situation of a teaching application scene with stereoscopic live broadcast pictures slightly reduces the sense of immersion, but one group of stereoscopic live broadcast pictures of an application scene can stereoscopically present the live situation of that scene to a large number of users, which greatly reduces the number of active scene instances, lowers the demand on rendering hardware resources, and also makes copyright protection convenient.
It should be noted that, for the specific implementation method, please refer to patent application 202210906282.6, "Immersive interactive live broadcast construction method, system and medium based on XR technology".
Based on the first to fourth embodiments, a fifth embodiment of the present invention is proposed, in which a distributed rendering method is adopted for rendering the management platform scene in step S30. The management platform scene is split into several sub-scenes, and a single management platform scene instance is composed of several sub-scene instances; for example, in the management platform scene, one or several sand tables form one sub-scene. Each sub-scene instance receives the user's pose and interaction information and renders the user's sub-scene experience picture according to the user's real-time pose in the management platform scene; the sub-scene experience pictures are then composited, according to their occlusion relations, into a complete experience picture of the management platform scene and transmitted to the user for viewing. Taking FIG. 9 as an example, in a multi-user collaborative management platform scene, 3 users participate in collaborative management, and the management platform scene is split into 4 sub-scenes: the public module scene, the 1st personal module scene, the 2nd personal module scene, and the 3rd personal module scene.
The system generates an instance of each sub-scene. Each personal module belongs to a single user only, while the public module belongs to all users who enter the scene to collaborate. The user interface of user 1 therefore establishes communication with the 1st personal module scene instance and the public module scene instance and sends its pose and interaction operation information to both. Both instances render immersive experience pictures according to the user's pose in the management platform scene; the 1st picture splicing module then composites these, according to their occlusion relations, into a complete experience picture and transmits it to the 1st user interface for user 1 to view. The other users likewise establish communication with the public module scene instance and their own personal module scene instances, send pose and interaction operation information, and receive experience pictures from their picture splicing modules. When the computing resources owned by a single public module scene instance cannot generate experience pictures for all collaborating users, several public module scene instances must be generated, communication is established among them to synchronize their states, and each is assigned users for whom it renders experience pictures.
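The occlusion-based splicing of sub-scene pictures can be sketched per pixel: among the sub-scene samples for a pixel, keep the one closest to the viewer. This assumes each sub-scene instance outputs a color and a depth per pixel, which the patent does not specify.

```python
# Per-pixel sketch of occlusion-based picture splicing: keep the sub-scene
# sample with the smallest depth (nearest to the viewer). The (color, depth)
# per-pixel output format is an illustrative assumption.

def splice(frames):
    # frames: list of 2D grids of (color, depth) tuples, one grid per sub-scene
    h, w = len(frames[0]), len(frames[0][0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            color, _ = min((f[y][x] for f in frames), key=lambda cd: cd[1])
            out[y][x] = color
    return out

public_frame = [[("wall", 5.0), ("wall", 5.0)]]     # public module sub-scene
personal_frame = [[("memo", 2.0), ("far", 9.0)]]    # personal module sub-scene
print(splice([public_frame, personal_frame]))  # [['memo', 'wall']]
```
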
Furthermore, if the computing resources owned by a single personal module or public module scene instance cannot render a high-quality experience picture for one user in real time, the scene can be further split into multiple sub-scenes. The method of rendering pictures of multiple sub-scenes separately and splicing them into an immersive experience picture of the complete scene is described in the specification of patent application 202210022608.9, "New dimension space construction method, system and platform based on XR technology".
The teaching method based on the immersion type XR teaching management platform has the beneficial effects that:
according to the technical scheme of the present invention, after receiving a user's application to enter the immersive XR teaching management platform scene, the management platform system generates and initializes a management platform scene instance for the user; the instance renders an immersive experience picture according to the user's pose information and transmits and displays it to the user, so that the user enters the management platform scene. The management platform scene instance presents teaching object information according to the user's interactive operations for viewing it. When the user selects a teaching object in the management platform scene and, through interactive operation, sends an instruction for the teaching application scene associated with the selected teaching object, the management platform system instantiates the teaching application scene, renders an immersive experience picture of it according to the user's pose information, transmits and displays it to the user, and stops displaying the platform scene picture, so that the user enters the teaching application scene from the teaching platform scene. The three-dimensional scene of the teaching management platform is built with XR technology, and information carriers such as three-dimensional models and scenes present the three-dimensional information of teaching objects such as personnel, courses, and activities, so that the user can use an XR terminal to immersively view this information and can also immersively interact with the teaching objects in the management platform scene. The management platform thus fully presents the information of all teaching objects and provides a more powerful interactive management function.
In order to achieve the above objective, the present invention further provides a teaching system based on an immersive XR teaching management platform. The system includes a memory, a processor, and a teaching program based on the immersive XR teaching management platform stored in the memory; when the program is executed by the processor, the steps of the method described in the above embodiments are performed, which is not repeated here.
To achieve the above objective, the present invention further provides a computer-readable storage medium storing a teaching program based on an immersive XR teaching management platform; when the program is run by a processor, the steps of the method described in the above embodiments are performed, which is not repeated here.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the invention, and all equivalent structures or modifications in the structures or processes described in the specification and drawings, or the direct or indirect application of the present invention to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An immersive XR teaching management platform-based teaching method, which is characterized by comprising the following steps:
Step S10, after receiving an application of a user to enter an immersive XR teaching management platform scene, a management platform scene instance is generated and initialized for the user, and the management platform scene instance renders an immersive experience picture according to the user's pose information and transmits and displays it to the user for viewing, so that the user enters the management platform scene;
step S20, the management platform scene instance presents teaching object information according to interactive operation of viewing the teaching object information by a user;
and step S30, when a user selects a teaching object in the management platform scene and sends, through interactive operation, an instruction for the teaching application scene associated with the selected teaching object, the management platform system instantiates the teaching application scene, renders an immersive experience picture of the teaching application scene according to the user's pose information, transmits and displays it to the user, and stops displaying the platform scene picture, so that the user enters the teaching application scene from the teaching platform scene.
2. The teaching method based on the immersive XR teaching management platform according to claim 1, wherein the management platform scene instance generated for the user in step S10 presents the three-dimensional models, three-dimensional scenes and animations associated with the teaching objects.
3. The teaching method based on the immersive XR teaching management platform according to claim 2, wherein the management platform scene instance generated for the user in step S10 three-dimensionally presents the teaching application scenes in progress.
4. The teaching method based on the immersive XR teaching management platform according to claim 3, wherein step S20 further comprises: the user performs interactive operations on the three-dimensional scene of a teaching object's teaching application in progress in the management platform scene, and that three-dimensional scene responds, implementing the user's management intervention in the teaching application in progress on the teaching management platform.
5. The teaching method based on the immersive XR teaching management platform according to claim 4, wherein presenting teaching object information in the management platform scene in step S20 includes presenting information of all users, including the teaching application currently experienced by any online user; and in step S30, when the user selects any user object that is experiencing a teaching application online in the management platform and sends, through interactive operation, an instruction to join the selected user object's teaching application experience collaboration, the system generates for the user a collaborative experience picture of the application scene currently experienced by the selected user, so that the user enters the selected user's scene for multi-person collaborative teaching.
6. The teaching method based on the immersive XR teaching management platform according to claim 5, wherein the interactive operations for viewing teaching object information in step S20 include searching for objects, browsing the managed objects in batches via previous page/next page, and showing/hiding object details, and in step S20 the platform can create, delete and edit teaching objects according to the user's interactive operations.
7. The teaching method based on the immersive XR teaching management platform according to any one of claims 1-6, wherein the teaching management platform scene instance generated in step S10 is composed of a personal module and a public module, and step S10 further includes:
and enabling multiple persons to cooperate in real time in the immersive XR teaching management platform.
8. The teaching method based on the immersive XR teaching management platform according to claim 7, wherein the public module of the teaching management platform scene instance generated in step S10 includes personnel objects, organization objects, announcement objects, resource objects, course objects, activity process record objects and teaching situation data objects, and the personal module includes personnel objects, organization objects, resource objects, course objects, activity process record objects, activity archive objects, message objects, announcement objects and teaching situation data objects; the teaching situation data generated by teaching applications is counted in real time by the management platform system and displayed in the management platform scene instance; and when multiple persons collaborate in real time in the immersive XR teaching management platform, the public modules are shared and collaborative while the personal modules are neither shared nor collaborative.
9. An immersive XR teaching management platform based teaching system comprising a memory, a processor and an immersive XR teaching management platform based teaching program stored in the memory, which is executed by the processor to perform the steps of the method of any of claims 1 to 8.
10. A computer readable storage medium storing an immersive XR teaching management platform based teaching program which when run by a processor performs the steps of the method of any of claims 1 to 8.
CN202310258381.2A 2023-03-10 2023-03-10 Teaching method, system and medium based on immersion type XR teaching management platform Active CN116301368B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310258381.2A CN116301368B (en) 2023-03-10 2023-03-10 Teaching method, system and medium based on immersion type XR teaching management platform


Publications (2)

Publication Number Publication Date
CN116301368A true CN116301368A (en) 2023-06-23
CN116301368B CN116301368B (en) 2023-12-01

Family

ID=86792093

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310258381.2A Active CN116301368B (en) 2023-03-10 2023-03-10 Teaching method, system and medium based on immersion type XR teaching management platform

Country Status (1)

Country Link
CN (1) CN116301368B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106128196A (en) * 2016-08-11 2016-11-16 四川华迪信息技术有限公司 E-Learning system based on augmented reality and virtual reality and its implementation
US20190057620A1 (en) * 2017-08-16 2019-02-21 Gaumard Scientific Company, Inc. Augmented reality system for teaching patient care
US20190096274A1 (en) * 2017-09-07 2019-03-28 Thebeamer Llc Educational teaching methods and systems providing games and lessons for measuring properties of virtual objects utilizing virtual measurement instruments
CN110494196A (en) * 2016-12-08 2019-11-22 数字脉冲私人有限公司 System and method for using virtual reality to study in coordination
US20190392725A1 (en) * 2018-06-26 2019-12-26 TAMM Innovations, Inc. System and method for virtual experiential immersive learning platform
CN111459286A (en) * 2020-04-16 2020-07-28 黄河水利职业技术学院 Web-based VR interactive learning education system and method
CN112870706A (en) * 2021-03-19 2021-06-01 腾讯科技(深圳)有限公司 Teaching content display method, device, equipment and storage medium
CN114998063A (en) * 2022-04-22 2022-09-02 深圳职业技术学院 XR (X-ray fluorescence) technology-based immersive class construction method and system and storage medium


Also Published As

Publication number Publication date
CN116301368B (en) 2023-12-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant