CN115938178A - Automobile part assembly teaching method and system based on mixed reality technology - Google Patents
- Publication number
- CN115938178A CN115938178A CN202211611976.3A CN202211611976A CN115938178A CN 115938178 A CN115938178 A CN 115938178A CN 202211611976 A CN202211611976 A CN 202211611976A CN 115938178 A CN115938178 A CN 115938178A
- Authority
- CN
- China
- Prior art keywords
- teaching
- mixed reality
- model
- space
- reality technology
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000005516 engineering process Methods 0.000 title claims abstract description 19
- 238000000034 method Methods 0.000 title claims abstract description 16
- 230000033001 locomotion Effects 0.000 claims abstract description 16
- 238000004364 calculation method Methods 0.000 claims abstract description 7
- 239000011521 glass Substances 0.000 claims description 15
- 230000002452 interceptive effect Effects 0.000 claims description 9
- 230000005540 biological transmission Effects 0.000 claims description 3
- 238000004891 communication Methods 0.000 claims description 3
- 238000004088 simulation Methods 0.000 claims 1
- 230000004927 fusion Effects 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Images
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/223—Execution procedure of a spoken command
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Computational Linguistics (AREA)
- Educational Technology (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Educational Administration (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses an automobile part assembly teaching method and system based on mixed reality technology. A teaching scene is first collected and constructed, and a teaching model is acquired by copying and scanning. Two cameras photograph the teaching model for overlap calculation, and the model is combined and fused with the teaching scene to form a teaching space, with the proportional relation between the model and the space adjusted. The movement distance and angle produced by motion of the mixed reality device within the teaching space are then calculated to obtain the relative movement parameters of the teaching model in the teaching space. Voice commands are recognized and hand motion trajectory information is captured; the trajectory information generates model control commands, and the control commands control the operation of the teaching model in the virtual teaching scene. A teacher wearing the mixed reality device conducts simulated teaching in the virtual teaching space, and a video of the teaching process is generated, saved, and uploaded to a server.
Description
Technical Field
The invention relates to the technical field of mixed reality, and in particular to an automobile part assembly teaching method and system based on mixed reality technology.
Background
In automobile teaching, students need to become familiar with every part of a vehicle, so during instruction the parts of an automobile teaching aid must be displayed, disassembled, and reassembled to help students recognize them.
Disclosure of Invention
Purpose of the invention: to provide an automobile part assembly teaching method and system, based on mixed reality technology, that enables high-precision, high-definition movement, disassembly, assembly, and detail display of automobile parts.
The technical scheme is as follows: the invention provides an automobile part assembly teaching method based on mixed reality technology, comprising the following steps:
S1: collecting and constructing a teaching scene;
S2: acquiring a teaching model by copying and/or scanning;
S3: photographing the teaching model with two cameras and performing overlap calculation;
S4: combining and fusing the teaching model with the teaching scene to form a teaching space, and adjusting the proportional relation between the model and the space;
S5: calculating the movement distance and angle produced by motion of the mixed reality device within the teaching space, to obtain the relative movement parameters of the teaching model in the teaching space;
S6: recognizing interactive information, the interactive information comprising voice commands and captured hand motion trajectory information, wherein the trajectory information generates model control commands and the control commands control the operation of the teaching model in the virtual teaching scene;
S7: a teacher wearing the mixed reality device conducts simulated teaching in the virtual teaching space, and a video of the teaching process produced in the simulated teaching space is generated, saved, and uploaded to a server.
The invention further provides a teaching system based on mixed reality technology, comprising mixed reality glasses, a hand motion capture device, a host computer, and fisheye cameras for photographing the teaching model. The lenses of the mixed reality glasses use free-form surface optics with a field of view greater than 45° FOV, and the glasses are equipped with a vision processing GPU, a six-degree-of-freedom system, and a nine-axis gyroscope.
Furthermore, the fisheye cameras are symmetrically mounted at the front of both sides of the mixed reality glasses to photograph and scan the teaching model in a way that simulates human binocular vision.
Further, the fisheye cameras used for the overlap calculation of the teaching model comprise two 100W 13-megapixel high-definition RGBC cameras, each with a 150° FOV.
Furthermore, the nine-axis gyroscope comprises a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer; their data are combined by a fusion algorithm and then fed into the application system.
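The fusion algorithm is not specified in the patent. A common lightweight choice for a high-rate IMU of this kind is a complementary filter, sketched below for a single pitch axis at the 1000 Hz rate mentioned later; the function name, blend factor `alpha`, and axis conventions are illustrative assumptions:

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z,
                         dt=0.001, alpha=0.98):
    """One step of a complementary filter fusing a gyroscope rate (deg/s)
    with an accelerometer gravity estimate, at dt = 1 ms (1000 Hz)."""
    # Integrating the gyro gives a smooth but drift-prone estimate.
    gyro_pitch = pitch_prev + gyro_rate * dt
    # Gravity gives a noisy but drift-free absolute reference.
    accel_pitch = math.degrees(math.atan2(accel_x, accel_z))
    # Blend: trust the gyro short-term, the accelerometer long-term.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# With the device level and stationary, the estimate stays at zero.
pitch = 0.0
for _ in range(1000):  # one second of samples at 1000 Hz
    pitch = complementary_filter(pitch, 0.0, 0.0, 1.0)
print(round(pitch, 6))  # 0.0
```

Production IMU stacks typically use a quaternion-based filter (e.g. Madgwick or a Kalman variant) that also folds in the magnetometer for yaw, but the complementary structure above conveys the short-term/long-term split.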
Further, the refresh rate of the nine-axis gyroscope is 1000 Hz.
Furthermore, the system is equipped with Wi-Fi and Bluetooth for interactive communication and file transfer among the mixed reality glasses, the hand motion capture device, and the host computer.
Beneficial effects: compared with the prior art, the invention has the following notable advantages. Based on mixed reality technology, the teaching model is captured by copying, photographing, scanning, and similar means, and automobile part models can be moved, disassembled, assembled, and shown in detail with high precision and high definition. The teaching is highly immersive, and students can operate and practice repeatedly, which stimulates their interest in learning and improves learning outcomes.
Drawings
FIG. 1 is a functional schematic diagram of the system of the present invention.
Detailed Description
The technical scheme of the invention is further explained below with reference to the accompanying drawing.
The invention provides an automobile part assembly teaching method based on mixed reality technology, comprising the following steps:
S1: collecting and constructing a teaching scene;
S2: acquiring a teaching model by copying and scanning;
S3: photographing the teaching model with two cameras and performing overlap calculation;
S4: combining and fusing the teaching model with the teaching scene to form a teaching space, and adjusting the proportional relation between the model and the space;
S5: calculating the movement distance and angle produced by motion of the mixed reality device within the teaching space, to obtain the relative movement parameters of the teaching model in the teaching space;
S6: recognizing interactive information, the interactive information comprising voice commands and captured hand motion trajectory information, wherein the trajectory information generates model control commands and the control commands control the operation of the teaching model in the virtual teaching scene;
S7: a teacher wearing the mixed reality device conducts simulated teaching in the virtual teaching space, and a video of the teaching process produced in the simulated teaching space is generated, saved, and uploaded to a server.
The invention further provides a teaching system based on mixed reality technology, comprising mixed reality glasses, a hand motion capture device, a host computer, and fisheye cameras for photographing the teaching model. The lenses of the mixed reality glasses use free-form surface optics with a field of view greater than 45° FOV, and the glasses are equipped with a vision processing GPU, a six-degree-of-freedom system, and a nine-axis gyroscope. The fisheye cameras are symmetrically mounted at the front of both sides of the mixed reality glasses to photograph and scan the teaching model in a way that simulates human binocular vision. The fisheye cameras used for the overlap calculation of the teaching model comprise two 100W 13-megapixel high-definition RGBC cameras, each with a 150° FOV. The nine-axis gyroscope comprises a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer; their data are combined by a fusion algorithm and then fed into the application system. The refresh rate of the nine-axis gyroscope is 1000 Hz. The system is equipped with Wi-Fi and Bluetooth for interactive communication and file transfer among the mixed reality glasses, the hand motion capture device, and the host computer.
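The "overlap calculation" performed with the two symmetrically mounted cameras is not detailed in the patent. One conventional way a stereo pair recovers model geometry, after the fisheye images are undistorted and rectified, is depth from disparity. A hedged sketch, where the focal length, baseline, and disparity values are purely illustrative:

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Classic two-camera depth-from-disparity relation:
    depth = f * B / d, with focal length f in pixels, baseline B in
    metres, and disparity d in pixels between the rectified images."""
    if disparity_px <= 0:
        raise ValueError("point must be visible in both cameras")
    return focal_px * baseline_m / disparity_px

# e.g. an 800 px focal length, 6.5 cm inter-camera baseline,
# and a 40 px disparity put the surface point roughly 1.3 m away.
print(stereo_depth(40.0, 800.0, 0.065))
```

Repeating this over matched points in the overlapping fields of view yields the point cloud from which the teaching model's geometry, and hence its scale relative to the teaching space in step S4, can be established.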
Claims (8)
1. An automobile part assembly teaching method based on mixed reality technology, characterized by comprising the following steps:
S1: collecting and constructing a teaching scene;
S2: acquiring a teaching model by copying and/or scanning;
S3: photographing the teaching model with two cameras and performing overlap calculation;
S4: combining and fusing the teaching model with the teaching scene to form a teaching space, and adjusting the proportional relation between the model and the space;
S5: calculating the movement distance and angle produced by motion of the mixed reality device within the teaching space, to obtain the relative movement parameters of the teaching model in the teaching space;
S6: recognizing interactive information, the interactive information comprising voice commands and captured hand motion trajectory information, wherein the trajectory information generates model control commands and the control commands control the operation of the teaching model in the virtual teaching scene;
S7: a teacher wearing the mixed reality device conducts simulated teaching in the virtual teaching space, and a video of the teaching process produced in the simulated teaching space is generated, saved, and uploaded to a server.
2. The automobile part assembly teaching method based on mixed reality technology according to claim 1, wherein the hand motion trajectory information comprises movement, disassembly, assembly, and detail display of the teaching model.
3. A teaching system based on mixed reality technology, characterized by comprising mixed reality glasses, a hand motion capture device, a host computer, and fisheye cameras for photographing the teaching model, wherein the lenses of the mixed reality glasses use free-form surface optics with a field of view greater than 45° FOV, and the mixed reality glasses are equipped with a vision processing GPU, a six-degree-of-freedom system, and a nine-axis gyroscope.
4. The teaching system based on mixed reality technology according to claim 3, wherein the fisheye cameras are symmetrically mounted at the front of both sides of the mixed reality glasses.
5. The teaching system based on mixed reality technology according to claim 3, wherein the fisheye cameras used for the overlap calculation of the teaching model comprise two 100W 13-megapixel high-definition RGB cameras, each with a 150° FOV.
6. The teaching system based on mixed reality technology according to claim 3, wherein the nine-axis gyroscope comprises a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer.
7. The teaching system based on mixed reality technology according to claim 3, wherein the refresh rate of the nine-axis gyroscope is 1000 Hz.
8. The teaching system based on mixed reality technology according to claim 3, wherein the system is equipped with Wi-Fi and Bluetooth systems for interactive communication and file transfer among the mixed reality glasses, the hand motion capture device, and the host computer.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211611976.3A CN115938178A (en) | 2022-12-14 | 2022-12-14 | Automobile part assembly teaching method and system based on mixed reality technology |
CN202311709865.0A CN118098033A (en) | 2022-12-14 | 2023-12-13 | Teaching system and method based on mixed reality technology |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211611976.3A CN115938178A (en) | 2022-12-14 | 2022-12-14 | Automobile part assembly teaching method and system based on mixed reality technology |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115938178A true CN115938178A (en) | 2023-04-07 |
Family
ID=86553715
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211611976.3A Withdrawn CN115938178A (en) | 2022-12-14 | 2022-12-14 | Automobile part assembly teaching method and system based on mixed reality technology |
CN202311709865.0A Pending CN118098033A (en) | 2022-12-14 | 2023-12-13 | Teaching system and method based on mixed reality technology |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311709865.0A Pending CN118098033A (en) | 2022-12-14 | 2023-12-13 | Teaching system and method based on mixed reality technology |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN115938178A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117391822A (en) * | 2023-12-11 | 2024-01-12 | 中汽传媒(天津)有限公司 | VR virtual reality digital display method and system for automobile marketing |
-
2022
- 2022-12-14 CN CN202211611976.3A patent/CN115938178A/en not_active Withdrawn
-
2023
- 2023-12-13 CN CN202311709865.0A patent/CN118098033A/en active Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117391822A (en) * | 2023-12-11 | 2024-01-12 | 中汽传媒(天津)有限公司 | VR virtual reality digital display method and system for automobile marketing |
CN117391822B (en) * | 2023-12-11 | 2024-03-15 | 中汽传媒(天津)有限公司 | VR virtual reality digital display method and system for automobile marketing |
Also Published As
Publication number | Publication date |
---|---|
CN118098033A (en) | 2024-05-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105373224B (en) | A kind of mixed reality games system based on general fit calculation and method | |
CN109817031B (en) | Limbs movement teaching method based on VR technology | |
CN101968890B (en) | 360-degree full-view simulation system based on spherical display | |
CN204695231U (en) | Portable helmet immersion systems | |
CN106710362A (en) | Flight training method implemented by using virtual reality equipment | |
CN105913364A (en) | Virtual reality technology-based prisoner post-release education simulation method | |
CN110969905A (en) | Remote teaching interaction and teaching aid interaction system for mixed reality and interaction method thereof | |
CN110568923A (en) | unity 3D-based virtual reality interaction method, device, equipment and storage medium | |
WO2017186001A1 (en) | Education system using virtual robots | |
CN102662472A (en) | Body movement based learning method and cloud service system thereof | |
CN103050028A (en) | Driving simulator with stereoscopic vision follow-up function | |
CN108830944B (en) | Optical perspective three-dimensional near-to-eye display system and display method | |
CN104427230A (en) | Reality enhancement method and reality enhancement system | |
CN115938178A (en) | Automobile part assembly teaching method and system based on mixed reality technology | |
CN105183161A (en) | Synchronized moving method for user in real environment and virtual environment | |
CN112102667A (en) | Video teaching system and method based on VR interaction | |
Song et al. | An immersive VR system for sports education | |
CN112669469A (en) | Power plant virtual roaming system and method based on unmanned aerial vehicle and panoramic camera | |
CN111724645B (en) | Rail underwater blasting simulation training system | |
Fadzli et al. | A robust real-time 3D reconstruction method for mixed reality telepresence | |
CN113253843B (en) | Indoor virtual roaming realization method and realization system based on panorama | |
CN103544713A (en) | Human-body projection interaction method on basis of rigid-body physical simulation system | |
CN113066192B (en) | Real-time masking method in full-virtual environment based on AR imaging | |
CN116129043A (en) | Universal three-dimensional model for fusing reality scene and construction method thereof | |
CN107544677B (en) | Method and system for simulating motion scene by using modular track and somatosensory device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 20230407 |
|
WW01 | Invention patent application withdrawn after publication |