CN104916182A - Immersion type virtual reality maintenance and training simulation system - Google Patents
- Publication number
- CN104916182A (application CN201510278780.0A)
- Authority
- CN
- China
- Prior art keywords
- subsystem
- point
- user
- dimensional model
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Processing Or Creating Images (AREA)
Abstract
Disclosed is an immersive virtual-reality maintenance and training simulation system comprising a position tracking subsystem, a three-dimensional model database, a node management subsystem, a node rendering subsystem, and a projection subsystem. The three-dimensional model database stores three-dimensional model data. The position tracking subsystem outputs the user's spatial position information to the node management subsystem. The node management subsystem sends the three-dimensional model and the user's spatial position information to the node rendering subsystem. The node rendering subsystem renders the three-dimensional model and sends the rendered image to the projection subsystem; from the received head and eye-point position and hand-operation information of the user, it computes the image that should be displayed in the user's field of view and sends the updated image to the projection subsystem. The projection subsystem receives and displays the image information. The invention presents the training scene intuitively and gives participants an immersive, on-the-spot experience.
Description
Technical field
The invention belongs to the field of digital design and system simulation of space products, and relates to the application of virtual reality technology in the field of virtual maintenance.
Background art
Virtual reality (VR) technology is the product of the combined development of computer graphics, artificial intelligence, computer networks, information processing, and other technologies. It uses computer technology to generate a simulated environment and, through various sensing devices, places the user inside that environment so that the user interacts with it directly and naturally. Virtual reality is, in essence, an advanced computer user interface: by simultaneously providing intuitive, natural, real-time interaction channels such as vision, hearing, and touch, it makes operation as convenient as possible for the user, thereby reducing the user's burden and improving the efficiency of the whole system.
Many systems already use virtual-reality environments for virtual maintenance simulation, with applications in fields such as automobiles and aircraft. Their designs vary widely, as do the problems they face and the solutions they adopt. Although the most common maintenance simulation systems today are based on three-dimensional models, most of them run on ordinary computer screens. They lack true three-dimensional immersion, fall short in realism, cannot faithfully reflect the actual scene, cannot build an immersive three-dimensional virtual environment, and have poor generality.
Summary of the invention
The technical problem solved by the invention is to overcome the deficiencies of the prior art and provide an immersive virtual-reality maintenance and training system that presents training scenes and content in an intuitive, three-dimensional way, giving participants an on-the-spot, immersive experience.
Technical solution of the present invention is:
An immersive virtual-reality maintenance and training simulation system comprises: a position tracking subsystem, a three-dimensional model database, a node management subsystem, a node rendering subsystem, and a projection subsystem.
The three-dimensional model database stores three-dimensional model data and makes the three-dimensional models available for retrieval by the node management subsystem.
The position tracking subsystem obtains the user's spatial position information and outputs it to the node management subsystem; the spatial position information comprises the user's head and eye-point position and hand-operation information.
The node management subsystem reads the three-dimensional model from the three-dimensional model database, passes it to the node rendering subsystem, and forwards the user's head and eye-point position and hand-operation information provided by the position tracking subsystem to the node rendering subsystem.
Within a preset display range, the node rendering subsystem renders the received three-dimensional model in four parts (left, front, right, and floor) and sends the rendered images to the projection subsystem for display, so that the user sees the whole virtual three-dimensional model. From the received head and eye-point position and hand-operation information of the user, the node rendering subsystem computes the image that should be displayed in the user's field of view and sends the updated image to the projection subsystem.
The projection subsystem receives and displays the image information sent by the node rendering subsystem, so that the user sees the updated three-dimensional model image.
The position tracking subsystem comprises a position processing host, optical cameras, and tracked rigid bodies.
The tracked rigid bodies are fixed at the user's major joints and near the head and eyes.
The optical cameras emit infrared light, collect the light reflected by the tracked rigid bodies, and pass the collected data to the position processing host.
After receiving the data, the position processing host determines the user's spatial position information by position tracking and passes it to the node management subsystem.
The projection subsystem comprises four stereo projectors and four display screens.
Adjacent display screens meet at 90 degrees; relative to the user's position they are the left, front, right, and floor screens. The four screens each project one part of the virtual scene simultaneously, together forming a complete virtual environment.
Compared with the prior art, the present invention has the following advantages:
(1) The invention realizes an immersive stereoscopic three-dimensional virtual environment, providing users with a highly intuitive platform for virtual training and virtual maintenance. It helps personnel understand the operation and training process, strengthens the sense of participation, and significantly enhances the training effect.
(2) Through the cooperation of the node management subsystem and the node rendering subsystem, the invention achieves coordinated display of three-dimensional stereoscopic scene images, laying the technical foundation for an immersive virtual scene; the design is ingenious, simple to implement, and highly general.
(3) Through the position tracking subsystem, the invention captures the user's head and hand positions in real time. Management software processes the user's position information so that the three-dimensional image is updated in real time as the user's position and operations change. This realizes human-computer interaction, upgrades the user's experience of the virtual-reality scene from "roaming and browsing" to "interactive operation", and improves the realism and acceptance of virtual training.
Accompanying drawing explanation
Fig. 1 is the system composition diagram of the present invention;
Fig. 2 is a schematic flowchart of the position tracking of the present invention;
Fig. 3 is a schematic diagram of the coordinates of the target point, anchor point, and camera positions of the present invention;
Fig. 4 is a structural diagram of the projection subsystem of the present invention;
Fig. 5 is a structural diagram of the projection subsystem of the present invention.
Embodiment
The specific embodiments of the invention are further described below with reference to the accompanying drawings.
As shown in Fig. 1, an immersive virtual-reality maintenance and training simulation system of the present invention comprises a position tracking subsystem, a three-dimensional model database, a node management subsystem, a node rendering subsystem, and a projection subsystem. The three-dimensional model database mainly stores three-dimensional model data and makes the three-dimensional models available for retrieval by the node management subsystem. To achieve visual realism and fast display, the three-dimensional models use a triangular-patch representation, with surface materials such as metal or composite assigned according to the actual object.
The position tracking subsystem obtains the user's spatial position information and outputs it to the node management subsystem; the spatial position information comprises the user's head and eye-point position and hand-operation information.
The position tracking subsystem comprises a position processing host, optical cameras, and tracked rigid bodies.
The tracked rigid bodies are fixed at the user's major joints and near the head and eyes. Each tracked rigid body used to indicate the user's hand position or head position consists of two parts: a target point and an anchor point; the anchor point mainly assists in determining the motion trajectory of the target point.
The optical cameras emit infrared light, collect the light reflected by the tracked rigid bodies, and pass the collected data to the position processing host.
After receiving the data, the position processing host determines the user's spatial position information by position tracking and passes it to the node management subsystem.
As shown in Figs. 2 and 3, position tracking is implemented as follows:
(1) Determine the position of each reference point. The reference points are the spatial positions of the optical cameras; in this embodiment one optical camera is installed at each of four positions above the user: front-left, rear-left, front-right, and rear-right.
(2) Obtain the current distance from each reference point to the target point and the anchor point at time K. At the initial time K, the optical cameras emit infrared light; the anchor point and target point reflect it; the cameras receive the reflected light and transmit the information to the position processing host, which computes the current distance from each reference point to the target point and the anchor point.
(3) Obtain the spatial coordinates of the target point and the anchor point at time K by the maximum-likelihood method, as follows:
Let the coordinates of the n (n ≥ 4) reference points in three-dimensional space be (x1, y1, z1), (x2, y2, z2), ..., (xn, yn, zn), and let the coordinate X of the target point or anchor point be (x, y, z). The distances from X to the 1st, 2nd, ..., nth reference points are d1, d2, ..., dn. By the formula for the distance between two points in space, the following system of equations holds:
(x − x1)² + (y − y1)² + (z − z1)² = d1²
(x − x2)² + (y − y2)² + (z − z2)² = d2²
...
(x − xn)² + (y − yn)² + (z − zn)² = dn²
Subtracting the last equation from each of the preceding equations and rearranging yields the linear system:
2(x1 − xn)x + 2(y1 − yn)y + 2(z1 − zn)z = x1² − xn² + y1² − yn² + z1² − zn² + dn² − d1²
...
2(xn−1 − xn)x + 2(yn−1 − yn)y + 2(zn−1 − zn)z = xn−1² − xn² + yn−1² − yn² + zn−1² − zn² + dn² − dn−1²
which can be written as AX = b, where
A = 2 · [ x1 − xn, y1 − yn, z1 − zn ; ... ; xn−1 − xn, yn−1 − yn, zn−1 − zn ]
b = [ x1² − xn² + y1² − yn² + z1² − zn² + dn² − d1² ; ... ; xn−1² − xn² + yn−1² − yn² + zn−1² − zn² + dn² − dn−1² ]
The coordinate of the target point or anchor point X is then obtained as the least-squares (maximum-likelihood) estimate:
X = (AᵀA)⁻¹Aᵀb
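As an illustrative sketch outside the patent text, the linearized least-squares solve X = (AᵀA)⁻¹Aᵀb above can be written in Python with NumPy; the camera positions, target point, and distances below are invented example values:

```python
import numpy as np

def locate_point(refs, dists):
    """Estimate (x, y, z) of a tracked point from distances to n >= 4
    reference points, by subtracting the last sphere equation from the
    others and solving the resulting linear system in the least-squares
    sense, as in step (3)."""
    refs = np.asarray(refs, dtype=float)   # shape (n, 3): camera positions
    d = np.asarray(dists, dtype=float)     # shape (n,): measured distances
    dn = d[-1]
    # 2*(xi - xn)x + 2*(yi - yn)y + 2*(zi - zn)z
    #     = xi^2 - xn^2 + yi^2 - yn^2 + zi^2 - zn^2 + dn^2 - di^2
    A = 2.0 * (refs[:-1] - refs[-1])
    b = (np.sum(refs[:-1] ** 2, axis=1) - np.sum(refs[-1] ** 2)
         + dn ** 2 - d[:-1] ** 2)
    X, *_ = np.linalg.lstsq(A, b, rcond=None)  # X = (A^T A)^-1 A^T b
    return X

# Four hypothetical cameras above the user (not coplanar, so A has full rank)
cams = [(0.0, 0.0, 3.0), (0.0, 4.0, 3.0), (4.0, 0.0, 3.0), (2.0, 2.0, 1.0)]
target = np.array([1.0, 2.0, 1.5])
dists = [np.linalg.norm(target - np.array(c)) for c in cams]
print(locate_point(cams, dists))  # recovers approximately (1.0, 2.0, 1.5)
```

Note that if all cameras lie in one horizontal plane, the linearized system loses the vertical coordinate, so in practice the reference points must not be coplanar.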
(4) Update the time: K = K + i, i = i + t, where t is the duration of one sampling interval and i = 0 at the initial time.
(5) At time K, obtain the spatial coordinates of the target point and the anchor point by the maximum-likelihood method.
(6) Judge whether the anchor-point position obtained in step (5) has changed relative to the anchor-point position obtained in step (3). If not, go to step (7); if it has changed, the target point is considered to have translated; go to step (10).
(7) Judge whether the target-point position obtained in step (5) has changed relative to the target-point position obtained in step (3). If not, the target point has neither rotated nor translated; go to step (12). If it has changed, the target point is considered to have rotated about the anchor-vector axis; go to step (8).
(8) From the position of the target point and the anchor-vector axis, calculate the angle through which the target point has rotated about the anchor-vector axis.
(9) Update the target-point position after rotation about the anchor axis, then go to step (12).
The target-point position after rotation about the anchor axis is updated as follows.
In three-dimensional space, the coordinates of a point rotated by a given angle about the X, Y, or Z axis are easy to compute; here the rotation axis is an arbitrary vector (x, y, z), namely the anchor-point vector. The rotation matrix is the one realized by the function glRotatef(angle, x, y, z) in OpenGL:
    | x²(1−c)+c     xy(1−c)−zs    xz(1−c)+ys |
    | yx(1−c)+zs    y²(1−c)+c     yz(1−c)−xs |
    | xz(1−c)−ys    yz(1−c)+xs    z²(1−c)+c  |
Here the coordinate system is assumed right-handed, c = cos(angle) and s = sin(angle), and angle is the counterclockwise rotation as seen looking from the positive direction of the vector (by the right-hand rule: with the thumb along the vector direction, the other four fingers curl in the positive direction). The vector (x, y, z) must be normalized and pass through the origin. Applying this matrix gives the rotated point coordinates:
x1 = (x²(1 − c) + c)·x0 + (xy(1 − c) − zs)·y0 + (xz(1 − c) + ys)·z0
y1 = (yx(1 − c) + zs)·x0 + (y²(1 − c) + c)·y0 + (yz(1 − c) − xs)·z0
z1 = (xz(1 − c) − ys)·x0 + (yz(1 − c) + xs)·y0 + (z²(1 − c) + c)·z0
where (x0, y0, z0) is the coordinate of the original target point and (x1, y1, z1) is the coordinate of the rotated target point.
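For illustration only (the patent realizes this matrix with OpenGL's glRotatef), the same axis-angle rotation can be sketched in Python; the axis, angle, and test point are example values:

```python
import numpy as np

def rotate_about_axis(p0, axis, angle):
    """Rotate point p0 about a unit axis (x, y, z) through the origin by
    `angle` radians (counterclockwise seen from the positive axis end,
    right-handed system), using the same matrix built by glRotatef."""
    x, y, z = np.asarray(axis, dtype=float) / np.linalg.norm(axis)  # normalize
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([
        [x*x*(1-c)+c,   x*y*(1-c)-z*s, x*z*(1-c)+y*s],
        [y*x*(1-c)+z*s, y*y*(1-c)+c,   y*z*(1-c)-x*s],
        [x*z*(1-c)-y*s, y*z*(1-c)+x*s, z*z*(1-c)+c  ],
    ])
    return R @ np.asarray(p0, dtype=float)

# Rotating (1, 0, 0) by 90 degrees about the z-axis gives (0, 1, 0)
p1 = rotate_about_axis([1.0, 0.0, 0.0], [0.0, 0.0, 1.0], np.pi / 2)
print(np.round(p1, 6))  # → [0. 1. 0.]
```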
(10) From the target-point position and the coordinate axes of the anchor vector, calculate the translation distance of the target point along the three coordinate axes.
(11) Update the target-point position.
(12) Judge whether the user continues to use the system; if so, go to step (4) for the next time instant; otherwise, exit the system.
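The branching in steps (4) through (12) amounts to classifying each new sample as a translation, a rotation, or no motion. This is an illustrative reconstruction, not the patent's implementation; the frame data, helper name, and tolerance are invented:

```python
import numpy as np

def track(frames, atol=1e-6):
    """Classify each frame's motion relative to the previous one.
    Each frame is an (anchor_xyz, target_xyz) pair, as if produced by
    the maximum-likelihood solve in step (5)."""
    events = []
    prev_anchor, prev_target = frames[0]
    for anchor, target in frames[1:]:
        if not np.allclose(anchor, prev_anchor, atol=atol):
            events.append("translation")   # steps (6), (10), (11)
        elif not np.allclose(target, prev_target, atol=atol):
            events.append("rotation")      # steps (7), (8), (9)
        else:
            events.append("static")        # step (7) straight to step (12)
        prev_anchor, prev_target = anchor, target
    return events

frames = [
    ((0, 0, 0), (1, 0, 0)),
    ((0, 0, 0), (0, 1, 0)),   # anchor fixed, target moved -> rotation
    ((1, 0, 0), (2, 1, 0)),   # anchor moved -> translation
    ((1, 0, 0), (2, 1, 0)),   # nothing moved -> static
]
print(track(frames))  # → ['rotation', 'translation', 'static']
```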
The node management subsystem reads the three-dimensional model from the three-dimensional model database, passes it to the node rendering subsystem, and forwards the user's head and eye-point position and hand-operation information provided by the position tracking subsystem to the node rendering subsystem.
Within a preset display range, the node rendering subsystem renders the received three-dimensional model in four parts (left, front, right, and floor) and sends the rendered images to the projection subsystem for display, so that the user sees the whole virtual three-dimensional model. From the received head and eye-point position and hand-operation information of the user, the node rendering subsystem computes the image that should be displayed in the user's field of view and sends the updated image to the projection subsystem.
The projection subsystem receives and displays the image information sent by the node rendering subsystem, so that the user sees the updated three-dimensional model image. The projection subsystem comprises four stereo projectors and four display screens. Adjacent display screens meet at 90 degrees; relative to the user's position they are the left, front (center), right, and floor (ground) screens. The four screens each project one part of the virtual scene simultaneously, together forming a complete virtual environment. Each projector corresponds to one screen; the four projectors jointly project the three-dimensional model data from different angles, together forming a complete immersive stereoscopic three-dimensional virtual environment. The user can wear stereo glasses to view the three-dimensional model in the virtual environment.
As shown in Figs. 4 and 5, in this embodiment each screen is 3520 mm × 2200 mm with a 16:10 aspect ratio, and the projection resolution of each projector is 1920 × 1200. Each rear-projection screen is made from a single piece of ultra-flat, low-gain composite glass substrate no less than 12 mm thick; the floor screen uses front projection with a wear-resistant white material. The four screens each project one part of the virtual scene simultaneously, together forming a complete virtual environment.
The user watches the virtual scene in the cubic region formed by the projection screens and can operate as needed; operation is performed with the 3D mouse of the position tracking subsystem. The 3D mouse is bound to a tracked rigid body of the position tracking subsystem to indicate the user's hand position. The stereo glasses worn by the user are bound to another tracked rigid body to indicate the user's head position. As the user operates in the display area, the optical cameras of the position tracking subsystem capture and collect the position data of the user's head and hands and send it to the position processing host. After processing the position data, the position processing host sends it to the multi-channel scene-management software on the management-node computer.
Parts of the invention not described in detail are common knowledge to those skilled in the art.
Claims (4)
1. An immersive virtual-reality maintenance and training simulation system, characterized by comprising: a position tracking subsystem, a three-dimensional model database, a node management subsystem, a node rendering subsystem, and a projection subsystem; wherein
the three-dimensional model database stores three-dimensional model data and makes the three-dimensional models available for retrieval by the node management subsystem;
the position tracking subsystem obtains the user's spatial position information and outputs it to the node management subsystem, the spatial position information comprising the user's head and eye-point position and hand-operation information;
the node management subsystem reads the three-dimensional model from the three-dimensional model database, passes it to the node rendering subsystem, and forwards the user's head and eye-point position and hand-operation information provided by the position tracking subsystem to the node rendering subsystem;
within a preset display range, the node rendering subsystem renders the received three-dimensional model in four parts (left, front, right, and floor) and sends the rendered images to the projection subsystem for display, so that the user sees the whole virtual three-dimensional model; from the received head and eye-point position and hand-operation information of the user, the node rendering subsystem computes the image that should be displayed in the user's field of view and sends the updated image to the projection subsystem; and
the projection subsystem receives and displays the image information sent by the node rendering subsystem, so that the user sees the updated three-dimensional model image.
2. The immersive virtual-reality maintenance and training simulation system according to claim 1, characterized in that the position tracking subsystem comprises a position processing host, optical cameras, and tracked rigid bodies; wherein
the tracked rigid bodies are fixed at the user's major joints and near the head and eyes;
the optical cameras emit infrared light, collect the light reflected by the tracked rigid bodies, and pass the collected data to the position processing host; and
after receiving the data, the position processing host determines the user's spatial position information by position tracking and passes it to the node management subsystem.
3. The immersive virtual-reality maintenance and training simulation system according to claim 1, characterized in that the projection subsystem comprises four stereo projectors and four display screens; adjacent display screens meet at 90 degrees and, relative to the user's position, are the left, front, right, and floor screens; the four screens each project one part of the virtual scene simultaneously, together forming a complete virtual environment.
4. The immersive virtual-reality maintenance and training simulation system according to claim 2, characterized in that the position tracking is implemented as follows:
(1) determine the position of each reference point, the reference points being the spatial positions of the optical cameras; in this embodiment one optical camera is installed at each of four positions above the user: front-left, rear-left, front-right, and rear-right;
(2) obtain the current distance from each reference point to the target point and the anchor point at time K: at the initial time K, the optical cameras emit infrared light, the anchor point and target point reflect it, the cameras receive the reflected light and transmit the information to the position processing host, which computes the current distance from each reference point to the target point and the anchor point;
(3) obtain the spatial coordinates of the target point and the anchor point at time K by the maximum-likelihood method, as follows:
let the coordinates of the n (n ≥ 4) reference points in three-dimensional space be (x1, y1, z1), (x2, y2, z2), ..., (xn, yn, zn), and let the coordinate X of the target point or anchor point be (x, y, z); the distances from X to the 1st, 2nd, ..., nth reference points are d1, d2, ..., dn; by the formula for the distance between two points in space, the following system of equations holds:
(x − x1)² + (y − y1)² + (z − z1)² = d1²
(x − x2)² + (y − y2)² + (z − z2)² = d2²
...
(x − xn)² + (y − yn)² + (z − zn)² = dn²
subtracting the last equation from each of the preceding equations and rearranging yields the linear system:
2(x1 − xn)x + 2(y1 − yn)y + 2(z1 − zn)z = x1² − xn² + y1² − yn² + z1² − zn² + dn² − d1²
...
2(xn−1 − xn)x + 2(yn−1 − yn)y + 2(zn−1 − zn)z = xn−1² − xn² + yn−1² − yn² + zn−1² − zn² + dn² − dn−1²
which can be written as AX = b, where
A = 2 · [ x1 − xn, y1 − yn, z1 − zn ; ... ; xn−1 − xn, yn−1 − yn, zn−1 − zn ]
b = [ x1² − xn² + y1² − yn² + z1² − zn² + dn² − d1² ; ... ; xn−1² − xn² + yn−1² − yn² + zn−1² − zn² + dn² − dn−1² ]
the coordinate of the target point or anchor point X is then obtained as the least-squares (maximum-likelihood) estimate:
X = (AᵀA)⁻¹Aᵀb
(4) update the time: K = K + i, i = i + t, where t is the duration of one sampling interval and i = 0 at the initial time;
(5) at time K, obtain the spatial coordinates of the target point and the anchor point by the maximum-likelihood method;
(6) judge whether the anchor-point position obtained in step (5) has changed relative to the anchor-point position obtained in step (3); if not, go to step (7); if it has changed, the target point is considered to have translated, and go to step (10);
(7) judge whether the target-point position obtained in step (5) has changed relative to the target-point position obtained in step (3); if not, the target point has neither rotated nor translated, and go to step (12); if it has changed, the target point is considered to have rotated about the anchor-vector axis, and go to step (8);
(8) from the position of the target point and the anchor-vector axis, calculate the angle through which the target point has rotated about the anchor-vector axis;
(9) update the target-point position after rotation about the anchor axis, and go to step (12);
(10) from the target-point position and the coordinate axes of the anchor vector, calculate the translation distance of the target point along the three coordinate axes;
(11) update the target-point position;
(12) judge whether the user continues to use the system; if so, go to step (4) for the next time instant; otherwise, exit the system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510278780.0A CN104916182B (en) | 2015-05-27 | 2015-05-27 | A kind of immersive VR maintenance and Training Simulation System |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104916182A true CN104916182A (en) | 2015-09-16 |
CN104916182B CN104916182B (en) | 2017-07-28 |
Family
ID=54085215
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510278780.0A Active CN104916182B (en) | 2015-05-27 | 2015-05-27 | A kind of immersive VR maintenance and Training Simulation System |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104916182B (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105489102A (en) * | 2015-12-30 | 2016-04-13 | 北京宇航系统工程研究所 | Three-dimensional interactive training exercise system |
CN106227352A (en) * | 2016-07-28 | 2016-12-14 | 北京国承万通信息科技有限公司 | Virtual reality scenario rendering method and system |
CN106249893A (en) * | 2016-08-03 | 2016-12-21 | 上海机电工程研究所 | Rigid-flexible system virtual assembly system and dummy assembly method thereof |
CN106652721A (en) * | 2016-10-21 | 2017-05-10 | 中国民航大学 | Virtualized training system and method for airplane maintenance |
CN106774923A (en) * | 2016-12-30 | 2017-05-31 | 天津天堰科技股份有限公司 | A kind of virtual sterilization training system for showing operation trace |
CN106950534A (en) * | 2017-02-27 | 2017-07-14 | 广东小天才科技有限公司 | Spatial position detection method and system and VR (virtual reality) wearable equipment |
CN107092357A (en) * | 2017-04-21 | 2017-08-25 | 厦门中智信系统集成有限公司 | The real border Architectural Equipment management system of holography based on virtual reality |
CN107256654A (en) * | 2017-07-31 | 2017-10-17 | 中国航空工业集团公司西安飞机设计研究所 | A kind of guiding emergency evacuation virtual training system |
CN107293182A (en) * | 2017-07-19 | 2017-10-24 | 深圳国泰安教育技术股份有限公司 | A kind of vehicle teaching method, system and terminal device based on VR |
CN107464465A (en) * | 2017-07-31 | 2017-12-12 | 中国航空工业集团公司西安飞机设计研究所 | A kind of active emergency evacuation virtual training system |
CN108039080A (en) * | 2017-12-21 | 2018-05-15 | 中国舰船研究设计中心 | A kind of immersion remote training system based on virtual reality |
CN108287483A (en) * | 2018-01-17 | 2018-07-17 | 北京航空航天大学 | A kind of immersion Virtual Maintenance Simulation method and system towards Product maintenance verification |
WO2018210257A1 (en) * | 2017-05-16 | 2018-11-22 | 江苏海事职业技术学院 | Method and system of practical trouble shooting training for ship navigation devices based on vr technology |
CN109785425A (en) * | 2018-12-12 | 2019-05-21 | 珠海超凡视界科技有限公司 | A kind of three-dimensional imaging method |
CN112085983A (en) * | 2020-09-29 | 2020-12-15 | 北京森合智源技术有限公司 | Virtual-real combination-based automobile virtual simulation teaching cloud service platform system |
CN112258656A (en) * | 2020-09-14 | 2021-01-22 | 北京京东振世信息技术有限公司 | Method, apparatus, server and medium for displaying product maintenance information |
CN113038116A (en) * | 2021-03-09 | 2021-06-25 | 中国人民解放军海军航空大学航空作战勤务学院 | Method for constructing aerial refueling simulation training visual system |
GB2594714A (en) * | 2020-05-04 | 2021-11-10 | Createc Robotics Ltd | Virtual view generation |
CN113784109A (en) * | 2021-09-07 | 2021-12-10 | 太仓中科信息技术研究院 | Projection system and method for script killing environment |
CN115808974A (en) * | 2022-07-29 | 2023-03-17 | 深圳职业技术学院 | Immersive command center construction method and system and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070048702A1 (en) * | 2005-08-25 | 2007-03-01 | Jang Gil S | Immersion-type live-line work training system and method |
CN101510074A (en) * | 2009-02-27 | 2009-08-19 | 河北大学 | High present sensation intelligent perception interactive motor system and implementing method |
US20090213114A1 (en) * | 2008-01-18 | 2009-08-27 | Lockheed Martin Corporation | Portable Immersive Environment Using Motion Capture and Head Mounted Display |
CN204246844U (en) * | 2014-12-09 | 2015-04-08 | 新疆触彩动漫科技有限公司 | The aobvious device of virtual reality type human-computer interaction holography feedback |
CN104573230A (en) * | 2015-01-06 | 2015-04-29 | 北京卫星环境工程研究所 | Virtual human work task simulation analyzing system and method for spacecraft repair |
- 2015-05-27: CN201510278780.0A granted as patent CN104916182B (status: Active)
Non-Patent Citations (2)
Title |
---|
刘钡钡 et al.: "Design and Implementation of an Aviation Virtual Maintenance System", Computer Integrated Manufacturing Systems (计算机集成制造系统) *
杨琼: "Prospects for Virtual Maintenance Training Systems in Aviation Maintenance", Value Engineering (价值工程) *
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105489102A (en) * | 2015-12-30 | 2016-04-13 | 北京宇航系统工程研究所 | Three-dimensional interactive training exercise system |
CN105489102B (en) * | 2015-12-30 | 2018-06-01 | 北京宇航系统工程研究所 | A three-dimensional interactive training system |
CN106227352A (en) * | 2016-07-28 | 2016-12-14 | 北京国承万通信息科技有限公司 | Virtual reality scenario rendering method and system |
CN106227352B (en) * | 2016-07-28 | 2019-11-08 | 北京国承万通信息科技有限公司 | Virtual reality scenario rendering method and system |
CN106249893A (en) * | 2016-08-03 | 2016-12-21 | 上海机电工程研究所 | Rigid-flexible system virtual assembly system and virtual assembly method thereof |
CN106249893B (en) * | 2016-08-03 | 2019-06-04 | 上海机电工程研究所 | Rigid-flexible system virtual assembly system and its virtual assembly method |
CN106652721A (en) * | 2016-10-21 | 2017-05-10 | 中国民航大学 | Virtualized training system and method for airplane maintenance |
CN106774923A (en) * | 2016-12-30 | 2017-05-31 | 天津天堰科技股份有限公司 | A virtual disinfection training system that displays the operation track |
CN106774923B (en) * | 2016-12-30 | 2022-07-01 | 天津天堰科技股份有限公司 | Virtual disinfection training system capable of displaying operation track |
CN106950534A (en) * | 2017-02-27 | 2017-07-14 | 广东小天才科技有限公司 | Spatial position detection method and system and VR (virtual reality) wearable equipment |
CN106950534B (en) * | 2017-02-27 | 2020-07-03 | 广东小天才科技有限公司 | Spatial position detection method and system and VR (virtual reality) wearable equipment |
CN107092357A (en) * | 2017-04-21 | 2017-08-25 | 厦门中智信系统集成有限公司 | Virtual-reality-based holographic building equipment management system |
WO2018210257A1 (en) * | 2017-05-16 | 2018-11-22 | 江苏海事职业技术学院 | Method and system for practical troubleshooting training for ship navigation devices based on VR technology |
CN107293182A (en) * | 2017-07-19 | 2017-10-24 | 深圳国泰安教育技术股份有限公司 | A VR-based vehicle teaching method, system and terminal device |
CN107464465A (en) * | 2017-07-31 | 2017-12-12 | 中国航空工业集团公司西安飞机设计研究所 | An active emergency evacuation virtual training system |
CN107256654A (en) * | 2017-07-31 | 2017-10-17 | 中国航空工业集团公司西安飞机设计研究所 | A guided emergency evacuation virtual training system |
CN108039080A (en) * | 2017-12-21 | 2018-05-15 | 中国舰船研究设计中心 | An immersive remote training system based on virtual reality |
CN108287483A (en) * | 2018-01-17 | 2018-07-17 | 北京航空航天大学 | An immersive virtual maintenance simulation method and system for product maintenance verification |
CN109785425A (en) * | 2018-12-12 | 2019-05-21 | 珠海超凡视界科技有限公司 | A three-dimensional imaging method |
CN109785425B (en) * | 2018-12-12 | 2023-02-28 | 珠海超凡视界科技有限公司 | Three-dimensional virtual imaging method |
GB2594714A (en) * | 2020-05-04 | 2021-11-10 | Createc Robotics Ltd | Virtual view generation |
GB2594714B (en) * | 2020-05-04 | 2022-12-07 | Createc Robotics Ltd | Virtual view generation |
CN112258656A (en) * | 2020-09-14 | 2021-01-22 | 北京京东振世信息技术有限公司 | Method, apparatus, server and medium for displaying product maintenance information |
CN112258656B (en) * | 2020-09-14 | 2023-08-08 | 北京京东振世信息技术有限公司 | Method, device, server and medium for displaying product maintenance information |
CN112085983A (en) * | 2020-09-29 | 2020-12-15 | 北京森合智源技术有限公司 | Automobile virtual simulation teaching cloud service platform system based on virtual-real combination |
CN112085983B (en) * | 2020-09-29 | 2021-04-06 | 北京森合智源技术有限公司 | Automobile virtual simulation teaching cloud service platform system based on virtual-real combination |
CN113038116A (en) * | 2021-03-09 | 2021-06-25 | 中国人民解放军海军航空大学航空作战勤务学院 | Method for constructing an aerial refueling simulation training visual system |
CN113038116B (en) * | 2021-03-09 | 2022-06-28 | 中国人民解放军海军航空大学航空作战勤务学院 | Aerial refueling simulation training visual system |
CN113784109A (en) * | 2021-09-07 | 2021-12-10 | 太仓中科信息技术研究院 | Projection system and method for a script-kill game environment |
CN115808974A (en) * | 2022-07-29 | 2023-03-17 | 深圳职业技术学院 | Immersive command center construction method and system and storage medium |
CN115808974B (en) * | 2022-07-29 | 2023-08-29 | 深圳职业技术学院 | Immersive command center construction method, system and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN104916182B (en) | 2017-07-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104916182A (en) | Immersion type virtual reality maintenance and training simulation system | |
Hilliges et al. | HoloDesk: direct 3d interactions with a situated see-through display | |
CN104050859A (en) | Interactive digital stereoscopic sand table system | |
CN103337095B (en) | Three-dimensional virtual display method for real-space three-dimensional geographical entities |
KR20130028878A (en) | Combined stereo camera and stereo display interaction | |
KR20170082907A (en) | Apparatus for providing augmented reality based on projection mapping and method thereof | |
CN105212418A (en) | Augmented reality smart helmet based on an infrared night-vision function |
CN103440677A (en) | Multi-view free stereoscopic interactive system based on Kinect somatosensory device | |
CN103793049A (en) | Virtual reality display system | |
Piekarski et al. | Augmented reality working planes: A foundation for action and construction at a distance | |
Lugrin et al. | CaveUDK: a VR game engine middleware | |
JP2000122176A (en) | Information presentation method and device therefor | |
CN108132490A (en) | Detection system and detection method based on alignment system and AR/MR | |
CN105093551A (en) | Real three-dimensional stereo display system and display method for real-time performance of landscape environment | |
CN113918021A (en) | Interactive immersive 3D active-stereo virtual reality all-in-one machine |
Brunnett et al. | V-Pong: an immersive table tennis simulation | |
CN204990186U (en) | Desktop stereoscopic display interaction system |
CN106125927B (en) | Image processing system and method | |
Novak-Marcincin et al. | Application of virtual and augmented reality technology in education of manufacturing engineers | |
KR101076263B1 (en) | Tangible Simulator Based Large-scale Interactive Game System And Method Thereof | |
CN113689756A (en) | Cabin reconstruction system based on augmented reality and implementation method | |
CN206515920U (en) | A novel polar-screen front-projection visual system |
CN110060349A (en) | Method for extending the field of view of an augmented reality head-mounted display device |
Zang et al. | Virtual reality and the application in virtual experiment for agricultural equipment | |
KR20120101878A (en) | Telescope device and telescope screen creating method using hmd for ship handling simulator |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant