CN114706513A - VR chemical laboratory implementation method and system based on unity3D and hand motion capture - Google Patents


Info

Publication number
CN114706513A
CN114706513A (application CN202210413350.5A; granted as CN114706513B)
Authority
CN
China
Prior art keywords
user
hand
unity3d
experimental
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210413350.5A
Other languages
Chinese (zh)
Other versions
CN114706513B (en)
Inventor
漆舒汉
卢登震
李恒毅
刘思媛
仇傅宇
李林浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology (Shenzhen); Shenzhen Institute of Science and Technology Innovation, Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology (Shenzhen); Shenzhen Institute of Science and Technology Innovation, Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology (Shenzhen) and Shenzhen Institute of Science and Technology Innovation, Harbin Institute of Technology
Priority to CN202210413350.5A
Publication of CN114706513A
Application granted
Publication of CN114706513B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/10Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes


Abstract

The invention discloses a VR chemical laboratory implementation method and system based on unity3D and hand motion capture. The method comprises the following steps: building the object models and the chemical laboratory model in a VR experimental scene through modeling software; constructing the virtual environment required by an experiment in the VR experimental scene through the unity3D engine; recognizing the initial position of the user's hand, inputting its initial coordinates to match the hand model in the VR experimental scene, and mapping hand motion onto that hand model in real time; and popping up corresponding text and voice prompts according to the user's operations to guide the experimenter. By constructing a virtual chemical laboratory with VR equipment and the unity3D engine, students are placed in scenes that cannot be realized in an ordinary classroom, promoting a deeper understanding of the knowledge. An operator can manipulate the experimental equipment in the virtual laboratory through hardware interaction, which improves the training effect and deepens the understanding of the experiments while reducing cost and improving safety.

Description

VR chemical laboratory implementation method and system based on unity3D and hand motion capture
Technical Field
The invention relates to the technical field of computers, in particular to a VR chemical laboratory implementation method and system based on unity3D and hand motion capture.
Background
In practice, educational resources are unevenly distributed and educational competition is widespread. Some schools lack the equipment to provide hardware support for experiments. When large numbers of high-school students take part in academic competitions without hands-on experimental training, they fail to achieve the expected results. Likewise, although experiments fall within the examination scope of middle and high schools, students rarely get to operate the apparatus themselves, so disciplinary thinking cannot reach every detail. The underlying reason is limited funding. To reduce cost, the invention provides a VR chemical laboratory implementation method and system based on unity3D and hand motion capture.
Chemical experiments play a vital role in helping middle- and high-school students master basic chemical knowledge and cultivate basic experimental skills. However, constraints of site, instruments, safety and other factors make it difficult to perform chemical experiments anytime and anywhere. The invention aims to realize a chemical experiment environment based on Virtual Reality (VR) technology: a real environment is simulated on VR equipment, and a virtual space is created to provide a highly realistic experimental experience in which the user can follow an experiment course. By sensing the user's hand motion and providing visual feedback, an on-site experience closely matching a real experiment is achieved.
Accordingly, the prior art is yet to be improved and developed.
Disclosure of Invention
In view of the above disadvantages of the prior art, the present invention aims to provide a VR chemical laboratory implementation method and system based on unity3D and hand motion capture, so as to solve the problem that the high cost of existing chemical experiments leaves users without experimental training and without a deep understanding of chemical knowledge.
In order to solve the above technical problems, the technical solution adopted by the present invention is as follows:
a VR chemistry lab implementation method based on unity3D and hand motion capture, comprising:
building a module and a chemical laboratory model in a VR experimental scene through modeling software;
constructing a virtual environment required by an experiment in the VR experiment scene through a unity3D engine;
recognizing the initial position of the user's hand, inputting its initial coordinates to match the hand model in the VR experimental scene, and mapping hand motion onto the hand model in the VR experimental scene in real time;
and popping up corresponding text and voice prompts according to the operation of the user to guide the user to operate.
The VR chemical laboratory implementation method based on unity3D and hand motion capture, wherein a module in the VR experimental scene comprises the three-dimensional position data and material information of an object.
The VR chemical laboratory implementation method based on unity3D and hand motion capture is characterized in that a chemical laboratory model in a VR experimental scenario comprises a living area and a laboratory area, wherein the living area comprises a user station and a file cabinet, and the laboratory area comprises a test bench, a heater and a laboratory instrument.
The VR chemical laboratory implementation method based on unity3D and hand motion capture, wherein the step of constructing a virtual environment required for an experiment in the VR experiment scenario through unity3D engine comprises:
providing, by a unity3D engine, a physical computing system in which attributes of a laboratory instrument are defined, the attributes of the laboratory instrument including one or more of mass, volume, material, density, surface roughness, and elasticity;
attaching a collision detection module to the experimental instruments and reagents using the collider function of unity3D; when the position parameters of two or more reagents partially overlap, assigning different trigger scripts to the chemical reagents according to the different phenomena of the experiment and the different chemical and physical properties of the reagents, so as to play independently designed animations and special effects.
The VR chemical laboratory implementation method based on unity3D and hand motion capture, wherein the step of identifying the initial hand position of the user, inputting the initial position coordinates of the hand of the user to match with the hand model in the VR experimental scene, and mapping the hand motion to the hand model in the VR experimental scene in real time comprises the steps of:
detecting that the user's hand is held in the air above the device, and identifying the initial position of the user's hand after the device initializes successfully;
inputting initial position coordinates of a user hand to be matched with a hand model in a VR experiment scene;
and establishing a mapping relation between a physical space and an information space, and mapping the hand action to the hand model in the VR experimental scene in real time.
The VR chemical laboratory implementation method based on unity3D and hand motion capture is characterized in that the step of popping up corresponding text and voice prompts according to the operation of a user and guiding the user to operate comprises the following steps:
exporting the correct operation sequence as a topological operation sequence, popping up corresponding text and voice prompts according to the operation of a user, and guiding the user to operate;
if the experimental operation of the user meets the specification and meets the correct operation sequence, no error is reported;
and if the experimental operation sequence of the user is wrong or not in accordance with the specification, triggering an error reporting system and popping up reminding operation of voice and characters.
The VR chemical laboratory implementation method based on unity3D and hand motion capture further comprises the following steps:
detecting an account number logged in by a user and a selected mode, wherein the modes comprise a teacher mode and a student mode.
The VR chemical laboratory implementation method based on unity3D and hand motion capture, wherein,
when detecting that the user selects a teacher mode, giving the user the authority to create a room and share the current picture;
and when the user is detected to select the student mode, giving the user a search permission, and enabling the user to find a target room through room number search to watch teaching.
The VR chemical laboratory implementation method based on unity3D and hand motion capture, wherein,
the user selecting the teacher mode communicates with the user selecting the student mode through video, audio, and chat frames.
A VR chemistry laboratory system based on unity3D and hand motion capture, comprising:
the VR scene construction module is used for constructing a module and a chemical laboratory model in a VR experimental scene through modeling software;
the hardware interaction module is used for constructing a virtual environment required by an experiment in the VR experiment scene through a unity3D engine;
the hand recognition module is used for recognizing the initial hand position of the user, inputting the initial position coordinates of the hand of the user to be matched with a hand model in the VR experimental scene, and mapping the hand action to the hand model in the VR experimental scene in real time;
and the text and voice prompt module is used for popping up corresponding text and voice prompts according to the operation of the user and guiding the user to operate.
Advantageous effects: the invention discloses a VR chemical laboratory implementation method and system based on unity3D and hand motion capture. The method comprises: building the object models and the chemical laboratory model in a VR experimental scene through modeling software; constructing the virtual environment required by an experiment in the VR experimental scene through the unity3D engine; recognizing the initial position of the user's hand, inputting its initial coordinates to match the hand model in the VR experimental scene, and mapping hand motion onto the hand model in real time; and popping up corresponding text and voice prompts according to the user's operations to guide the experimenter. By constructing a virtual chemical laboratory with VR equipment and the unity3D engine, students are placed in scenes that cannot be realized in an ordinary classroom and that are closer to the teaching content; their real perception of specific situations and special content is deepened, and their understanding of the knowledge is promoted. Operators can manipulate the experimental equipment in the virtual laboratory through hardware interaction and train repeatedly, gradually improving operational proficiency. This improves the training effect and deepens the understanding of the experiments while reducing cost and improving safety.
Drawings
Fig. 1 is a flowchart of an embodiment of a VR chemistry lab implementation method based on unity3D and hand motion capture according to an embodiment of the present invention.
Fig. 2 is a functional block diagram of a VR chemical laboratory system based on unity3D and hand motion capture according to an embodiment of the present invention.
FIG. 3 is a flow chart of the steps of the preferred embodiment 1 of the present invention.
Fig. 4 is a flow chart of the steps of the preferred embodiment 2 of the present invention.
Fig. 5 is a flow chart of the steps of the preferred embodiment 3 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein merely illustrate the invention and are not intended to limit it.
In order to solve the problems in the prior art, as shown in fig. 1, the present embodiment provides a VR chemical laboratory implementation method based on unity3D and hand motion capture, which includes the following steps:
s100, building a module and a chemical laboratory model in a VR experimental scene through modeling software;
s200, constructing a virtual environment required by an experiment in the VR experiment scene through a unity3D engine;
s300, recognizing an initial hand position of a user, inputting an initial position coordinate of the hand of the user to match with a hand model in a VR experimental scene, and mapping hand motion to the hand model in the VR experimental scene in real time;
s400, popping up corresponding text and voice prompts according to the operation of the user, and guiding the user to operate.
By constructing the virtual chemical laboratory with VR equipment and the unity3D engine, students are placed in scenes that cannot be realized in an ordinary classroom and that are closer to the teaching content, and they can personally operate the experimental instruments. This deepens their real perception of specific situations and special content and promotes a deeper understanding of the knowledge. Operators can manipulate the experimental equipment in the virtual laboratory through hardware interaction and train repeatedly, gradually improving operational proficiency. The training effect is improved and the understanding of the experiments is deepened, while cost is reduced and safety is improved.
In some embodiments, the models in the VR experimental scene include the three-dimensional position data and material information of the objects.
The models in the VR experimental scene are designed in the modeling software Blender and then exported. After field investigation and a thorough study of the structure of a real laboratory, combined with existing laboratory design drawings, a complete virtual laboratory model was independently designed and modeled, restored at a 1:1 scale to the actual dimensions so that the operator feels fully immersed.
In some embodiments, the chemical laboratory model in the VR experimental scenario includes a living area including, but not limited to, a user's workstation and a document cabinet, and a laboratory area including, but not limited to, a test stand, a heater, and laboratory instruments.
Specifically, the chemical laboratory model in the VR experimental scene is divided into a living area and an experimental area. The living area can be provided with user workstations, convenient for demonstration and explanation, and a document cabinet from which users can retrieve data or store experiment records. The experimental area is equipped with a laboratory bench, a heater and the various experimental instruments that are needed, such as an alcohol lamp, beakers, test tubes, flasks, glass vessels, glass rods, wide-mouth bottles, an iron stand and matches; these instruments were likewise designed from independent investigation and modeled in Blender.
In this embodiment, the step of constructing, by the unity3D engine, a virtual environment required for the experiment in the VR experiment scenario includes:
s201, providing a physical computing system through a unity3D engine, and defining the attribute of the experimental instrument in the physical computing system, wherein the attribute of the experimental instrument comprises one or more of mass, volume, material, density, surface roughness and elasticity;
s202, attaching a collision detection module to an experimental instrument and a reagent by using the function of a collision body of unity3D, and endowing different trigger scripts to the chemical reagent according to different phenomena of an experiment and different chemical and physical properties of different chemical reagents when position parameters of two or more reagents are partially overlapped, thereby deducing animation and special effect which are independently designed.
Specifically, the model set can be used under the unity3D physics engine, so the virtual environment required by an experiment can be constructed through the unity3D engine. The engine provides a real and reliable physical computing system in which objects can be given the attributes they possess in nature: the real mass, volume, material, density, surface roughness and elasticity (rigid body) of each experimental instrument can be defined, laying the foundation for the collision reactions of the chemical reagents described below.
The collision detection module is then physically attached to the experimental instruments and reagents using the collider function of unity3D. After this, when the position parameters of two reagents partially overlap, animations and special effects are triggered, comprehensively simulating a real chemical experiment. Using the distinctive animation and particle system of the unity3D engine (a particle is the smallest unit cell used to render a special effect), different trigger scripts (written in C#, the scripting language of the unity3D engine) are attached to the chemical reagents according to the different phenomena of each experiment and the different chemical and physical properties of the reagents. Independently designed animations and special effects are then played, faithfully reproducing the effect of the experiment, and the user can observe the corresponding experimental phenomena, including visual, auditory and olfactory ones (the latter can be presented to the user in text form).
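The collision-triggered reaction dispatch described above can be sketched in a language-agnostic way. The patent implements it with Unity colliders and C# trigger scripts; the reagent names, reaction table and overlap radius below are illustrative assumptions only.

```python
# Hypothetical sketch of reagent-collision dispatch: when two reagent
# volumes overlap, look up which animation/special effect to play.
# The reaction table and radius are made-up examples, not from the patent.

REACTIONS = {
    # frozenset keys make the pairing order-independent
    frozenset({"HCl", "NaOH"}): "neutralization_heat",
    frozenset({"Na", "H2O"}): "sodium_flame_and_fizz",
}

def overlap(a_pos, b_pos, radius=0.05):
    """Rough collision test: (x, y, z) positions closer than `radius`."""
    return sum((p - q) ** 2 for p, q in zip(a_pos, b_pos)) < radius ** 2

def on_trigger(reagent_a, a_pos, reagent_b, b_pos):
    """Return the effect to play when two reagents collide, else None."""
    if not overlap(a_pos, b_pos):
        return None
    return REACTIONS.get(frozenset({reagent_a, reagent_b}), "no_visible_reaction")
```

In Unity the `overlap` test would be replaced by the engine's own `OnTriggerEnter` callback; only the table lookup logic carries over.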
In this embodiment, the step of identifying the initial hand position of the user, inputting the initial position coordinates of the hand of the user to match with the hand model in the VR experimental scene, and mapping the hand motion to the hand model in the VR experimental scene in real time includes:
s301, detecting that the hand of a user is placed on the equipment and suspended, and identifying the initial position of the hand of the user after the equipment is initialized successfully;
s302, inputting initial position coordinates of the user hand to be matched with a hand model in a VR experiment scene;
s303, establishing a mapping relation between a physical space and an information space, and mapping the hand motion to the hand model in the VR experimental scene in real time.
The method adopts gesture-recognition capture hardware as the input device, and the user's gestures are understood by recognizing hand motion. Specifically, when the user's hand is detected hovering above the device, the device begins to initialize; after successful initialization, the initial position of the user's hand is recognized. The gesture-recognition hardware inputs the initial hand coordinates into the software, where physical calculations are performed on the hand model on the computer to realize interaction, matching the real hand with the hand model in the VR experimental scene. Then, by dividing the sensor's working area into a far field and a near field, a reasonable mapping relation between physical space and information space can be established: hand motion is mapped onto the hand model in the VR experimental scene in real time, and objects in the laboratory are operated by driving the hand model.
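The calibration and physical-to-information-space mapping above can be sketched as follows. The near/far-field threshold and gains are assumptions; the patent gives no numeric values, and the real mapping runs inside the Unity hand model.

```python
# Hedged sketch: match the initial hand position to the VR hand model,
# then map live sensor coordinates into the scene. The field split at
# 0.3 m and the gain values are illustrative assumptions.

NEAR_FIELD_Z = 0.3            # metres from the sensor (assumed threshold)
NEAR_GAIN, FAR_GAIN = 1.0, 1.5  # amplify far-field motion (assumption)

def calibrate(initial_sensor_pos, model_origin):
    """Offset that matches the user's initial hand position to the model."""
    return tuple(m - s for s, m in zip(initial_sensor_pos, model_origin))

def map_hand(sensor_pos, offset):
    """Map a live sensor coordinate into the VR experimental scene."""
    gain = NEAR_GAIN if sensor_pos[2] < NEAR_FIELD_Z else FAR_GAIN
    return tuple(gain * p + o for p, o in zip(sensor_pos, offset))
```

Applying `map_hand` each frame realizes the real-time mapping of hand motion onto the hand model; the gain split models the near/far-field division of the sensor's working area.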
In this embodiment, the step of popping up corresponding text and voice prompts according to the operation of the user and guiding the user to operate includes:
s401, exporting a correct operation sequence to be a topological operation sequence, popping up corresponding characters and voice prompts according to the operation of a user, and guiding the user to operate;
s402, if the experimental operation of the user meets the specification and meets the correct operation sequence, no error is reported;
and S403, if the user's experimental operation order is wrong or non-compliant, triggering the error-reporting system and popping up voice and text reminders.
Specifically, during initial development the developers already exported the correct operation order as a topological operation sequence. Corresponding text and voice prompts pop up according to the user's operations to guide the experimenter; the carrier of the text prompts is built with the NGUI plug-in of the unity3D engine. While the user operates, the system also checks whether the operation order is correct and conforms to the specification: if the experimental operations conform to the specification and follow the correct order, no error is reported; if the operation order is wrong or non-compliant, the error-reporting system is triggered and voice and text reminders pop up.
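Checking a user's operations against a "topological operation sequence" amounts to verifying a set of precedence constraints. The sketch below is an assumption about how such a check could work; the constraint pairs are a made-up example, and the real system is a Unity C# error-reporting module.

```python
# Illustrative operation-order check. Each pair (a, b) means step a must
# happen before step b; the example constraints are invented for this
# sketch and are not taken from the patent.

CONSTRAINTS = [
    ("check_airtightness", "add_reagent"),
    ("add_reagent", "light_alcohol_lamp"),
    ("light_alcohol_lamp", "heat_test_tube"),
]

def check_sequence(ops):
    """Return None if ops respects every constraint, else an error message."""
    index = {op: i for i, op in enumerate(ops)}
    for before, after in CONSTRAINTS:
        if before in index and after in index and index[before] > index[after]:
            return f"'{before}' must be performed before '{after}'"
    return None
```

A `None` result corresponds to "no error is reported"; a message would trigger the voice and text reminder pop-up.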
In some embodiments, the VR chemical laboratory implementation method based on unity3D and hand motion capture further comprises:
s500, detecting a login account number of a user and a selected mode, wherein the modes comprise a teacher mode and a student mode.
In some embodiments, when the user is detected to select the teacher mode, the user is given permission to create a room and share the current picture, and a room number is generated after the room is created; when the user is detected to select the student mode, the user is given search permission and can find the target room by room-number search to watch the teaching; users in teacher mode communicate with users in student mode through video, audio and the chat box.
Specifically, the VR chemical laboratory based on unity3D and hand motion capture constructed in the invention can also support remote teaching. After a user logs in, the system detects the account; different accounts correspond to different permissions. When the user is detected to be a teacher and selects the teacher mode, the user is given permission to create a room and share the current picture: the user can create a room for teaching and share the picture they see for others to watch. When the user is detected to be a student and selects the student mode, the user is given search permission and can find the target room by room-number search to watch the teaching; by watching the teacher's operations together with the text and voice prompts, students can further understand the details of the experimental operations. Within a room, teacher and students can communicate through video, audio and the chat box to complete question answering. Students can also watch the teacher's instruction and then put on the VR headset for hands-on operation while the teacher guides them, realizing efficient remote teaching and making up for the shortcomings of remote network teaching.
In some embodiments, the specific steps of step S500 are as follows:
s501, establishing a user login and registration module.
Specifically, user registration and login are opened so that different users can have different permissions and users' operation records can be conveniently monitored. The procedure is as follows:
s5011, in the client, the client is connected with the server through a socket class (used for network connection and communication) in the C # language;
s5012, recording the account information of the user by adopting a Mysql database at the server side. The client and the server adopt self-carrying methods send (data sending) and receive (data receiving) in the socket class to transmit data;
s5013, when the user registers, the user side sends a user name and a password to the server through a send method, after the server checks the legality of the user name and the password, the server assigns values to the existing parameterized SQL sentences generated through MySqlCommand types (used for parameterized SQL sentences) by taking the user name and the password as parameters, then executes the SQL sentences, modifies the contents of the database, completes the user registration, completes the identity verification of the user, and gives different authorities to the users with different identities;
s5014, when the user logs in, the user transmits the user name and the password to the server, the server conducts query through the parameterized SQL statement, the queried line number is sent back to the client, and if the line number is 1, the client logs in successfully.
S502, implementing the room functions, which specifically comprises the following steps:
s5021, when a teacher mode is selected by a user to create a room, a server stores a room number and an IP address of a user creating the room in a MySql database, calls a command line instruction through a Process class (used for adjusting a Process) in a C # language, calls ffmpeg (an interface for coding, decoding and transmitting audio), records a screen of a house owner through ffmpeg, converts the screen into a video stream and sends the video stream to the server;
s5022, acquiring Microphone input of a homeowner through a Microphone of unity3D, converting the audio input into audio stream through ffmpeg, and sending the audio stream to a server, wherein the sending function is realized by adopting a stream pushing instruction in the ffmpeg;
s5023, when a user inquires a room, the server inquires the room number in the database, and when the inquiry is successful, the video stream and the audio stream corresponding to the room number are transmitted to the client, so that the function of watching the teacher operation in the student mode is realized.
S503, establishing a teacher-student feedback system, specifically comprising the following steps:
s5031, creating an extremely thin bulletin board in the unity3D, and fixing the bulletin board at a certain position, wherein the display content of the bulletin board is a character stream sent by a server;
s5032, the speech of the student in the room is mainly displayed through a chat box, the student inputs a character string and then sends the character string to a server through a send of a socket class, and the server returns the character string to the teacher end;
S5033, the bulletin board is captured as part of the teacher's output video stream and broadcast by the server to all students, completing an online question-and-answer process.
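The chat path of S5032 (student sends a string over a socket; the server returns it to the teacher end) can be sketched with a loopback round trip. The length-prefixed framing is an assumption — the patent states only that the socket class's send is used.

```python
import socket
import struct
import threading

def frame_message(text: str) -> bytes:
    """Prefix a UTF-8 chat string with its 4-byte big-endian length,
    so the receiver knows where one message ends and the next begins."""
    data = text.encode("utf-8")
    return struct.pack(">I", len(data)) + data

def unframe_message(payload: bytes) -> str:
    """Inverse of frame_message: read the length prefix, decode the body."""
    (length,) = struct.unpack(">I", payload[:4])
    return payload[4 : 4 + length].decode("utf-8")

# Loopback round trip: 'student' sends, 'server' relays to 'teacher'.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)

def relay():
    conn, _ = server.accept()
    data = conn.recv(4096)
    conn.sendall(data)  # the server returns the string to the teacher end
    conn.close()

t = threading.Thread(target=relay)
t.start()
student = socket.create_connection(server.getsockname())
student.sendall(frame_message("teacher, why did the flame turn green?"))
echoed = unframe_message(student.recv(4096))
t.join()
print(echoed)
student.close()
server.close()
```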
S504, enabling the starter plate and the player plate to operate independently, and specifically comprising the following steps:
S5041, in this embodiment the program is split along the user login/registration and room-creation flow: the part that needs keyboard and mouse input is packaged and called the starter; the code for live broadcasting and watching live broadcasts is packaged and called the player;
S5042, after the starter is launched, the user enters the user name, password, mode selection and room number in order; once the form is filled in, this information is recorded in a configuration file lab.ini by using FileStream (a class for reading and writing files), and the player is started automatically at the same time;
s5043, the player reads the configuration file according to the path and runs according to the configuration file.
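The starter/player handoff through lab.ini (S5042–S5043) can be sketched with an INI reader and writer. The section and key names are assumptions — the patent does not specify the file's internal layout, and the actual implementation uses C#'s FileStream rather than Python.

```python
import configparser
import os
import tempfile

def write_launch_config(path, username, password, mode, room):
    """Starter side: record the login-form fields in lab.ini."""
    cfg = configparser.ConfigParser()
    cfg["launch"] = {
        "username": username,
        "password": password,
        "mode": mode,   # "teacher" or "student"
        "room": room,
    }
    with open(path, "w") as f:
        cfg.write(f)

def read_launch_config(path):
    """Player side: read lab.ini and run according to its contents."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    return dict(cfg["launch"])

path = os.path.join(tempfile.mkdtemp(), "lab.ini")
write_launch_config(path, "alice", "pw123", "teacher", "1001")
print(read_launch_config(path))
```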
According to the invention, by constructing an account system, user registration and login are made available, and the server relays the video and audio streams, so a user can teach remotely and answer questions during the lesson. This helps students understand and consolidate the related chemical-experiment knowledge more quickly and learn the corresponding safe-operation requirements. The system can be widely applied to chemistry classes and extracurricular tutoring in junior and senior high school, raises students' interest in learning chemistry, and provides effective assistance for mastering basic chemical knowledge.
The invention also provides a VR chemical laboratory system based on unity3D and hand motion capture, as shown in FIG. 2, comprising:
the VR scene construction module 10 is used for constructing a module and a chemical laboratory model in a VR experimental scene through modeling software;
the hardware interaction module 20 is used for constructing a virtual environment required by an experiment in the VR experiment scene through a unity3D engine;
the hand recognition module 30 is configured to recognize an initial hand position of a user, input initial position coordinates of the hand of the user to match a hand model in a VR experimental scene, and map hand movements to the hand model in the VR experimental scene in real time;
and the text and voice prompt module 40 is used for popping up corresponding text and voice prompts according to the operation of the user and guiding the user to operate.
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
As shown in fig. 3, this is a flow chart of preferred embodiment 1, and embodiment 1 is a combustion property verification experiment.
The implementation method comprises the following steps:
(1) The unity3D engine loads the laboratory environment, including the preset laboratory equipment and the set of items in the laboratory.
(2) Initializing the hardware devices, delivering the image stream to the VR device, and detecting whether the gesture-recognition hardware works normally.
(3) The gesture-recognition hardware detects hand movements. The processing module analyzes the hand depth information and identifies hand and finger movements from the depth image of the hand. Initially, the user can view the hand image through the display in the gesture-recognition hardware's driver; once the user confirms that the finger joints are correctly tracked, the user can freely grasp any object to be controlled.
(4) When the operator grips the combustion spoon, the fingertip portion of the hand model overlaps the collision body of the combustion spoon, so a collision occurs. After the collision, unity3D runs a gripping script to achieve gripping, i.e. the combustion spoon follows the movement of the hand model.
(5) The operator places the combustion spoon into the oxygen cylinder. The collision body at the bottom of the combustion spoon then collides with the collision body inside the oxygen cylinder, after which the flame burns more intensely and the text prompt is updated. Two points are explained here:
(5.1) The flame here is generated by the particle system in unity3D. The particle system emits frame-by-frame pictures in order from the vertex to the base of a cone, producing a dynamic jet effect; using a flame picture as the element yields the flame.
(5.2) At the same time, the point light source around the flame is intensified, producing the glare effect of violent combustion.
(6) The operator places the burning carbon block into the carbon dioxide bottle by manipulating the combustion spoon. Collision detection works as described above: a collision is detected when the bottom of the combustion spoon reaches the carbon dioxide bottle.
(7) When the collision is detected, the trigger script turns off the particle system and its point light source, i.e. the flame is extinguished, and the text prompt is updated at the same time.
The operator can also carry out the two related operations independently, and the experimental effect is not influenced.
Example 2
As shown in fig. 4, this is a flow chart of a preferred embodiment 2, and embodiment 2 is a flame reaction experiment.
The implementation method comprises the following steps:
(1) the unity3D engine loads the laboratory environment.
(2) Initializing the hardware devices, delivering the image stream to the VR device, and detecting whether the gesture-recognition hardware works normally. The gesture-recognition hardware detects hand motions; the principle is not repeated here.
(3) And popping up text and voice prompts to inform an operator of experiment background information.
(4) When the operator grasps the glass rod, the hand model overlaps the collision body preset on the glass rod. Upon detection of the collision, a grasping script is triggered so that the glass rod moves with the hand.
(5) The bottom end of the glass rod is placed into the hydrochloric acid; the bottom of the rod collides with the collision body in the hydrochloric acid, and a prompt pops up telling the operator to place the glass rod on the alcohol lamp for ignition. At the same time the state variable changes from A to B. An explanation:
(5.1) A state variable represents different states by different labels. For example, A above represents the initial state and B the state after the hydrochloric-acid washing step. Unless otherwise specified, "changing from A to B" means that only the transition from state A to state B is allowed; if any other state is entered, an error warning pops up.
(6) The operator puts the bottom of the glass rod on the alcohol-lamp flame; the collision body at the rod's bottom collides with the flame particles of the particle system, triggering a script that prompts the operator to dip a reagent. The state variable changes from B to C.
(7) The operator dips the rod into any of the four reagents; after the collision body at the bottom of the glass rod collides with reagent Xi, the state variable changes from C to Di. The operator is then prompted to hold the rod over the alcohol lamp.
(8) When the operator moves the glass rod above the alcohol lamp and the collision body at the rod's bottom collides with the lamp flame (i.e. the particle system), the flame changes to the color i corresponding to state Di, and a prompt pops up to inform the operator. To continue the experiment, jump to step (5); to end it, the demonstration of embodiment 2 ends.
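The state-variable mechanism of steps (5)–(8) — only the prescribed next state may be entered, and any other transition pops an error — can be sketched as a small transition table. The action names are hypothetical labels, and Di is collapsed to a single state D for brevity.

```python
class ExperimentStateError(Exception):
    """Raised when a step is attempted out of the required order."""

class ExperimentStateMachine:
    """Enforce the flame-reaction order:
    A -(wash in HCl)-> B -(burn on lamp)-> C
      -(dip reagent)-> D -(observe color)-> back to A (repeat from (5))."""

    TRANSITIONS = {
        ("A", "wash"): "B",
        ("B", "burn"): "C",
        ("C", "dip"): "D",      # Di collapsed to D in this sketch
        ("D", "observe"): "A",  # loop back to repeat the experiment
    }

    def __init__(self):
        self.state = "A"

    def do(self, action):
        nxt = self.TRANSITIONS.get((self.state, action))
        if nxt is None:
            # Any illegal transition pops an error warning.
            raise ExperimentStateError(
                f"illegal action {action!r} in state {self.state!r}")
        self.state = nxt
        return nxt

sm = ExperimentStateMachine()
for step in ["wash", "burn", "dip", "observe"]:
    print(step, "->", sm.do(step))
```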
Example 3
As shown in fig. 5, this is a flow chart of preferred embodiment 3; embodiment 3 is a hydrogen production experiment.
(1) The unity3D engine loads the laboratory environment.
(2) Initializing the hardware devices, delivering the image stream to the VR (virtual reality) device, and detecting whether the gesture-recognition hardware works normally. The gesture-recognition hardware detects hand movements; the principle is not repeated here.
(3) And popping up text and voice prompts to inform an operator of experiment background information.
(4) The required instruments are installed in sequence according to the prompts; this step involves position identification and regression-correction homing. The following points are explained:
(4.1) Position identification and regression correction: the object's position, represented by its geometric centre, determines whether the object counts as having been placed back near the specified position. Because manual placement is imprecise, to guarantee user experience the designated position is enlarged to a circle centred on the exact position. This embodiment sets the radius to 2.5 units: the user only needs to place the object within 2.5 units of the exact position for it to be corrected back to that position.
(4.2) Regression correction: once correction is triggered, the object moves slowly at a fixed speed until it reaches the specified exact position. After the position returns, the rotation also returns to the specified parameters, and the correction is complete.
(4.3) Sequence assurance: the order in which the instruments are installed is strictly controlled using the state-variable method described above.
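The regression correction of (4.1)–(4.2) — snap an instrument home when its geometric centre lands within 2.5 units of the exact position, then glide it there at a fixed speed — can be sketched as follows. The movement speed is an illustrative value, not one taken from the patent.

```python
import math

SNAP_RADIUS = 2.5   # units, as chosen in this embodiment
SPEED = 0.5         # units per update step (illustrative value)

def within_snap_radius(pos, target, radius=SNAP_RADIUS):
    """Is the object's geometric centre close enough to correct home?"""
    return math.dist(pos, target) <= radius

def step_towards(pos, target, speed=SPEED):
    """Move one update step towards the exact position without
    overshooting; called repeatedly until the object arrives."""
    d = math.dist(pos, target)
    if d <= speed:
        return target          # close enough: land exactly on target
    t = speed / d
    return tuple(p + t * (q - p) for p, q in zip(pos, target))

pos, target = (1.5, 0.0, 2.0), (0.0, 0.0, 0.0)
assert within_snap_radius(pos, target)   # distance is exactly 2.5
while pos != target:
    pos = step_towards(pos, target)
print(pos)  # (0.0, 0.0, 0.0): correction complete
```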
(5) Pressing the G key ignites the match.
(6) After the match is ignited, one end of the match is grasped, and the flame — the particle system at the match head — triggers the particle system of the first alcohol lamp, achieving the ignition effect. The next prompt then pops up.
(7) The solution is observed turning green. The script is explained here:
(7.1) The solution is initially blue, which corresponds to an RGB value. RGB is the industry colour standard: colours are obtained by varying the red (R), green (G) and blue (B) channels and superimposing them. The final green state likewise corresponds to an RGB value. The two colours are plotted as points in a three-dimensional coordinate space and connected by a line; the colour changes linearly along this line from start to end within a given duration, producing the slow colour change.
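The slow colour change of (7.1) is a linear interpolation between two points in RGB space over a fixed duration. A sketch, with assumed RGB endpoints for the blue and green solution colours and an illustrative duration:

```python
def lerp_rgb(start, end, t):
    """Linearly interpolate between two RGB triples; t runs from 0
    (start colour) to 1 (end colour), clamped at both ends."""
    t = max(0.0, min(1.0, t))
    return tuple(round(s + t * (e - s)) for s, e in zip(start, end))

BLUE = (0, 0, 255)    # assumed initial solution colour
GREEN = (0, 128, 0)   # assumed final solution colour
DURATION = 5.0        # seconds for the full change (illustrative)

# Sample the colour at three moments of the transition.
for elapsed in (0.0, 2.5, 5.0):
    print(elapsed, lerp_rgb(BLUE, GREEN, elapsed / DURATION))
```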
(8) After a waiting period, a prompt pops up allowing the second alcohol lamp to be ignited; igniting it before the prompt appears generates an operator-error prompt. The wait time was set to 12 seconds after reasonable evaluation.
(9) After the second alcohol lamp is ignited, the particle system at the mouth of the gas conduit is triggered, giving off a light-blue flame. Meanwhile, the opacity of the beaker's inner wall gradually increases, simulating the effect of the water mist.
(10) Pressing the P key packs up the instruments and ends the experiment.
In summary, the invention discloses a VR chemical laboratory implementation method and system based on unity3D and hand motion capture. The method comprises: building the object models and the chemical laboratory model in a VR experimental scene through modeling software; constructing the virtual environment required by an experiment in the VR experimental scene through the unity3D engine; recognizing the initial hand position of a user, inputting the initial position coordinates of the user's hand to match the hand model in the VR experimental scene, and mapping hand motion to the hand model in real time; and popping up corresponding text and voice prompts according to the user's operations to guide the experimenter. By constructing a virtual chemical laboratory with VR equipment and the unity3D engine, students are placed in scenes closer to the teaching content that cannot be realized in an ordinary classroom. Students can operate the experimental instruments with their own hands, which deepens their perception of specific situations and special content and promotes a deeper understanding of the knowledge. Through hardware interaction, operators can operate the experimental equipment in the virtual laboratory and train repeatedly, gradually improving operational proficiency, the training effect and their understanding of the experiments, while reducing cost and improving safety.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, and not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A VR chemistry lab implementation method based on unity3D and hand motion capture, comprising:
building a module and a chemical laboratory model in a VR experimental scene through modeling software;
constructing a virtual environment required by an experiment in the VR experiment scene through a unity3D engine;
recognizing an initial hand position of a user, inputting an initial position coordinate of the hand of the user to be matched with a hand model in a VR experimental scene, and mapping hand motion to the hand model in the VR experimental scene in real time;
and popping up corresponding text and voice prompts according to the operation of the user to guide the user to operate.
2. The VR chemical laboratory implementation method based on unity3D and hand motion capture of claim 1, wherein the modules in the VR experimental scene include three-dimensional position data information and material information of the objects.
3. The unity3D and hand motion capture based VR chemical laboratory implementation method of claim 1, wherein a chemical laboratory model in the VR experimental scenario includes a living area and a laboratory area, wherein the living area includes a user's workstation and a document cabinet, and the laboratory area includes a test stand, a heater, and laboratory instruments.
4. The unity3D and hand motion capture based VR chemical laboratory implementation method of claim 1, wherein the step of constructing a virtual environment required for an experiment in the VR experimental scenario via a unity3D engine comprises:
providing, by a unity3D engine, a physical computing system in which attributes of a laboratory instrument are defined, the attributes of the laboratory instrument including one or more of mass, volume, material, density, surface roughness, and elasticity;
the collision detection module is attached to the experimental instruments and reagents by utilizing the collision-body function of unity3D; when the position parameters of two or more reagents partially overlap, different trigger scripts are assigned to the chemical reagents according to the different experimental phenomena and the different chemical and physical properties of the reagents, so that independently designed animations and special effects are played.
5. The VR chemical laboratory implementation method of claim 1 based on unity3D and hand motion capture, wherein the step of identifying initial hand positions of the user, inputting initial position coordinates of the user's hands to match hand models in a VR experimental scenario, and mapping hand movements to hand models in the VR experimental scenario in real time comprises:
detecting that the hand of a user is held in the air above the device, and identifying the initial position of the user's hand after the device is initialized successfully;
inputting initial position coordinates of a user hand to be matched with a hand model in a VR experiment scene;
and establishing a mapping relation between a physical space and an information space, and mapping the hand action to the hand model in the VR experimental scene in real time.
6. The VR chemical laboratory implementation method of claim 1 based on unity3D and hand motion capture, wherein the step of popping up corresponding text and voice prompts according to the user's operation and guiding the user's operation comprises:
exporting the correct operation sequence as a topological operation sequence, popping up corresponding text and voice prompts according to the operation of a user, and guiding the user to operate;
if the experimental operation of the user meets the specification and meets the correct operation sequence, no error is reported;
and if the experimental operation sequence of the user is wrong or not in accordance with the specification, triggering an error reporting system and popping up reminding operation of voice and characters.
7. The VR chemistry lab implementation method of claim 1 based on unity3D and hand motion capture, further comprising:
detecting an account number logged in by a user and a selected mode, wherein the modes comprise a teacher mode and a student mode.
8. The VR chemistry lab implementation method of claim 7 based on unity3D and hand motion capture,
when detecting that the user selects a teacher mode, giving the user the authority to create a room and share the current picture;
and when the user is detected to select the student mode, giving the user a search permission, and enabling the user to find a target room through room number search to watch teaching.
9. The VR chemistry lab implementation method of claim 8 based on unity3D and hand motion capture,
the user selecting the teacher mode communicates with the user selecting the student mode through video, audio, and chat frames.
10. A VR chemistry lab system based on unity3D and hand motion capture, comprising:
the VR scene construction module is used for constructing a module and a chemical laboratory model in a VR experimental scene through modeling software;
the hardware interaction module is used for constructing a virtual environment required by an experiment in the VR experiment scene through a unity3D engine;
the hand recognition module is used for recognizing the initial hand position of the user, inputting the initial position coordinates of the hand of the user to be matched with a hand model in the VR experimental scene, and mapping the hand action to the hand model in the VR experimental scene in real time;
and the text and voice prompt module is used for popping up corresponding text and voice prompts according to the operation of the user and guiding the user to operate.
CN202210413350.5A 2022-04-20 2022-04-20 VR chemical laboratory implementation method and system based on unity3D and hand motion capture Active CN114706513B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210413350.5A CN114706513B (en) 2022-04-20 2022-04-20 VR chemical laboratory implementation method and system based on unity3D and hand motion capture


Publications (2)

Publication Number Publication Date
CN114706513A true CN114706513A (en) 2022-07-05
CN114706513B CN114706513B (en) 2022-11-11

Family

ID=82173999


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109634426A (en) * 2018-12-20 2019-04-16 南京钟山虚拟现实技术研究院有限公司 High-freedom degree experiment class three-dimensional emulation mode and system based on Unity3D
CN110928414A (en) * 2019-11-22 2020-03-27 上海交通大学 Three-dimensional virtual-real fusion experimental system
CN111539245A (en) * 2020-02-17 2020-08-14 吉林大学 CPR (CPR) technology training evaluation method based on virtual environment
WO2020198824A1 (en) * 2019-04-04 2020-10-08 De Araujo Ana Sara Domingos Laboratory teaching equipment
US20210082542A1 (en) * 2019-09-16 2021-03-18 Burzin Bhavnagri System and method for creating lead compounds, and compositions thereof
CN113724360A (en) * 2021-08-25 2021-11-30 济南大学 Virtual-real simulation method for taking experimental materials
CN113934293A (en) * 2021-09-15 2022-01-14 中国海洋大学 Chemical experiment learning interaction method, system and application based on augmented reality technology
CN114282837A (en) * 2021-12-30 2022-04-05 塔里木大学 Physics chemical experiment teaching system
CN114373050A (en) * 2022-01-07 2022-04-19 重庆邮电大学 Chemistry experiment teaching system and method based on HoloLens

Also Published As

Publication number Publication date
CN114706513B (en) 2022-11-11

Similar Documents

Publication Publication Date Title
CN110162164B (en) Augmented reality-based learning interaction method, device and storage medium
CN106128212B (en) Learning calligraphy system and method based on augmented reality
CN104346081A (en) Augmented reality learning system and method thereof
WO2017186001A1 (en) Education system using virtual robots
CN110688005A (en) Mixed reality teaching environment, teacher and teaching aid interaction system and interaction method
Bonial et al. Laying down the yellow brick road: Development of a wizard-of-oz interface for collecting human-robot dialogue
CN113129661A (en) VR-based multi-user remote teaching system and teaching method thereof
CN111862346B (en) Experimental teaching method for preparing oxygen from potassium permanganate based on virtual reality and Internet
CN110928414A (en) Three-dimensional virtual-real fusion experimental system
CN110544399A (en) Graphical remote teaching system and graphical remote teaching method
CN110727351A (en) Multi-user collaboration system for VR environment
CN116664355A (en) Virtual teaching system and method based on remote experiment
CN117292601A (en) Virtual reality sign language education system
CN114706513B (en) VR chemical laboratory implementation method and system based on unity3D and hand motion capture
CN112732075B (en) Virtual-real fusion machine teacher teaching method and system for teaching experiments
CN113934293A (en) Chemical experiment learning interaction method, system and application based on augmented reality technology
Soares et al. Sign language learning using the hangman videogame
CN208433026U (en) A kind of wisdom education system
KR100850041B1 (en) Science experiment method by objective choice
CN112017247A (en) Method for realizing unmanned vehicle vision by using KINECT
CN118038722B (en) Virtual reality-based classroom live-action reappearance interactive teaching system and method
KR102352318B1 (en) Apparatus and method for providing remote coding education
CN117649505B (en) Augmented reality experiment teaching system guided by entity model
He et al. Design of Blended Teaching Platform for Aerobics Courses Based on Wearable Virtual VR
CN110047343B (en) Method for operating VR (virtual reality) simulation microhardness meter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant