CN113552950B - Virtual and real interaction method for virtual cockpit - Google Patents
- Publication number
- CN113552950B (application CN202110904548.9A)
- Authority
- CN
- China
- Prior art keywords
- virtual
- cockpit
- physical
- real
- world
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/04—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/04—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
- G09B9/048—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles a model being viewed and manoeuvred from a remote point
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention relates to the technical field of virtual cockpits and discloses a virtual-real interaction method for a virtual cockpit, comprising the following steps: calibrating the position and angle relationship between the physical object system of the virtual cockpit as a whole and the virtual world corresponding to the tracking system; tracking the operations of the physical objects of the virtual cockpit; adjusting the interaction speed between the driver's physical operation and the response of the virtual objects in the virtual scene, so that the response of the virtual objects is synchronized with the driver's physical operation state; and displaying the response motion of the virtual objects in the virtual scene on a display end in real time. The method tracks the driver's operations in the cockpit so that the user observes and experiences the physical world through natural operation, designs the operation timing reasonably, and matches the system's response to the state of the physical operation, thereby achieving the technical effect of a more realistic sense of immersion.
Description
Technical Field
The invention relates to the technical field of virtual cockpits, and in particular to a virtual-real interaction method for a virtual cockpit.
Background
Since the birth of the driving simulator, its convenience, safety, low cost and high efficiency have made it an important auxiliary tool for studying vehicle driving performance and safety performance; in traffic and civil engineering, the driving simulator is also of great significance to the design and evaluation of road alignment, road signage, environment design and other road-related facilities.
In actual use, however, a series of problems remain: model reliability is low, image similarity measurement and image quality evaluation are difficult, the construction workload and data volume of complex, high-fidelity virtual environments are enormous, human-computer interaction mechanisms are unnatural and lack immersion, and the real-time performance and consistency of the virtual reality degrade with multiple users. These problems have long restricted further expansion of the application scenarios.
Vision accounts for a large proportion of the sensory experience of driving. At present, many companies and research institutions have invested considerable manpower and material resources in developing driving simulation systems, combining stereoscopic virtual reality technology to improve the immersion and realism of the view in the driving system. With the development of stereoscopic vision technology, several technical routes have formed in the market, based mainly on head-mounted displays, surround stereoscopic projection, or modified real vehicle windows. All of these solutions can, to some extent, meet the visual needs of current driving simulators. As a result, automobile driving simulation systems are gradually turning from passive to active driving simulators, and their application range now spans driving training, development and entertainment.
Driving, however, is not a purely passive experience; in actual use it places very strong demands on interaction. Because of the complexity of the cockpit itself, tracking the driver's movements is very difficult. Most prevailing solutions abstract the driver's operations into sensor events of the vehicle's controls. This splits the interaction between the physical and virtual environments during synchronization and creates a bottleneck that degrades the experience.
Disclosure of Invention
(I) Technical problem to be solved
In current mainstream virtual systems, the main causes of broken immersion are: the interaction mode differs greatly from real driving; the experience logic is imperfect; and delays in data exchange weaken the sense of presence.
In order to solve these problems and achieve a more realistic sense of immersion, the driver's operations in the cockpit are tracked so that the user observes and experiences the physical world through natural operation, the operation timing is reasonably designed, and the system's response is matched to the state of the physical operation.
(II) Technical scheme
In order to achieve the purpose, the invention provides the following technical scheme:
a virtual and real interaction method for a virtual cockpit comprises the following steps:
calibrating the position angle relation between the whole physical object system of the virtual cabin and a virtual world corresponding to a tracking system;
step two, performing operation tracking on the whole physical object of the virtual cabin, specifically as follows:
step S1, the driver performs an operation of the physical object in the virtual cabin and designs the operation time of the driver;
step S2, the tracking system firstly tracks the operation of the driver in the cockpit, and then transmits the physical object operation to the virtual object operation in the virtual scene on the premise of ensuring that the requirements of different data types in real time or reliability can be met by using different communication protocols, so that the user can observe and experience the physical world in the natural world operation mode;
adjusting the interaction speed of the physical object operation of the driver and the response of the virtual object in the virtual scene to enable the response of the virtual object in the virtual scene to be synchronously matched with the physical operation state of the driver;
and fourthly, displaying the response motion of the virtual object in the virtual scene on a display end in a real-time visual mode, so that the physical operation of a driver in the virtual cabin is visually visible and tactually real.
Further, in step one: on the basis that the calibration of the tracking system has been completed, the physical objects of the virtual cockpit as a whole are placed in one-to-one correspondence with the virtual objects in the coordinate system of the virtual world corresponding to the tracking system.
Further, in step one: the rapid calibration of the virtual world and the physical world comprises one-point positioning, two-point positioning and multi-point positioning.
Further, the one-point positioning is used for operation objects that only need their position confirmed and either do not change direction or whose direction change does not affect the experience: the instrument panel, the steering wheel, the brake pedal, the accelerator pedal, the parking brake, the combination indicator lamps and the indication panel.
Further, the two-point positioning is used for operation objects whose position and rotation must be confirmed, whose direction changes, and whose direction change affects the experience: the virtual cockpit base platform, the turn-signal switch, the gear lever, the headlight switch, the clearance-lamp switch and the wiper switch.
Further, the multi-point positioning is used for calibration of the entire virtual cockpit base platform and for operation objects whose own configuration may change and affect the experience: the seat and the safety belt.
Further, the method is intended for cockpit systems that use a VR head-mounted display for driving simulation, and it is also compatible with use scenes based on surround stereoscopic projection or modified vehicle-window screens.
Further, the method uses a game engine for dynamic visual computation, and through the VRPN protocol it is also compatible with industrial visualization engines and other engines.
(III) Advantageous technical effects
Compared with the prior art, the invention has the following beneficial technical effects:
the method comprises the steps of establishing a one-to-one correspondence relationship between the whole physical object of the virtual cabin and the virtual object in a coordinate system of the virtual world corresponding to a tracking system by calibrating the position angle relationship between the whole system and the virtual world;
the physical operation of a driver in the cockpit is tracked, so that a user can observe and experience the physical world in an operation mode in nature;
because all operations are physical objects and the virtual objects are driven to move in the same way, different communication protocols can be used to ensure that the requirements of different data types on real-time performance or reliability can be met;
the interaction speed of the physical object operation of the driver and the response of the virtual object in the virtual scene is adjusted, so that the response of the virtual object in the virtual scene is synchronously matched with the physical operation state of the driver;
therefore, the consistency of all operation objects of the virtual cockpit and the physical cockpit in vision and body feeling can be improved, the physical operation is visible in vision and real in touch, and the beneficial technical effect of more real immersive feeling is realized.
Drawings
Fig. 1 is a flowchart of the steps of a virtual cockpit virtual-real interaction method;
FIG. 2 is a diagram of the physical location of a virtual cockpit base platform;
fig. 3 is a diagram of the physical location of the steering wheel.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention does not concern the specific method of positioning and tracking in the physical world or of calibrating the corresponding virtual-world coordinates; it directly uses the results of such a tracking method;
The method also does not concern the production process of the virtual vehicle; it assumes that a digital model and corresponding scene whose completeness meets the visualization requirements are available, and on this basis it provides the mapping relationship between physical operations and the virtual scene;
a virtual and real interaction method for a virtual cockpit, as shown in fig. 1, includes the following steps:
calibrating the position angle relation between the whole physical object system of the virtual cabin and a virtual world corresponding to a tracking system;
the calibration in this step is performed on the basis that the calibration of the tracking system is already completed, and the purpose is to establish a one-to-one correspondence relationship between the whole physical object of the virtual cockpit and the virtual object in the coordinate system of the virtual world corresponding to the tracking system;
the physical object composition of the virtual cockpit comprises: the system comprises a virtual cabin base platform, seats (comprising seat operation buttons), an indication panel, a steering wheel, a steering lamp switch, a wiper switch, a gear lever, a headlight switch, an outline marker lamp switch, a brake pedal, an accelerator pedal and a safety belt;
the virtual cabin base platform can adjust the position and the angle, and is integrally adjusted when the whole cabin is deployed;
As shown in Fig. 2, let AC be the heading direction of the virtual cockpit base platform. At the four symmetrically placed points A, B, C and D, tracked objects of identical structure are used, or the same tracked object is sampled at the four positions in turn, to obtain the information of the four points. The intersection of AD and BC is aligned with the coordinate origin of the tracking system; the line connecting the midpoint of AC and the midpoint of BD is aligned with the X axis of the tracking system, with the positive X direction pointing from the origin toward the midpoint of AC; the positive Z direction is opposite to the direction of gravity; the line connecting the midpoint of AB and the midpoint of CD is aligned with the Y axis, with the positive Y direction pointing from the origin toward the midpoint of AB;
To make the whole physical object system easier to adjust and compute, the points obtained in the tracking system from the physical position of the virtual cockpit base platform can be offset according to the virtual world coordinate system. From the acquired virtual-world coordinates of the four points in the tracking system, [Ax, Ay, Az], [Bx, By, Bz], [Cx, Cy, Cz], [Dx, Dy, Dz], the coordinates of the intersection of AD and BC in the virtual world can be calculated as [(Ax+Bx+Cx+Dx)/4, (Ay+By+Cy+Dy)/4, (Az+Bz+Cz+Dz)/4]; the origin of the virtual world coordinates is then offset to this point. If the tracking system cannot apply an offset itself, it is suggested that this offset value be used as offset data for all subsequent tracking points;
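A minimal sketch of this offset calculation follows (Python and numpy are used here only for illustration; the patent does not specify an implementation, and the example coordinates are made up):

```python
import numpy as np

def platform_origin_offset(A, B, C, D):
    """Virtual-world point where the AD/BC intersection should sit.

    A, B, C, D are the tracked corner coordinates [x, y, z] of the virtual
    cockpit base platform; for the symmetric layout described above this
    intersection coincides with the mean of the four points.
    """
    pts = np.array([A, B, C, D], dtype=float)
    return pts.mean(axis=0)  # [(Ax+Bx+Cx+Dx)/4, (Ay+By+Cy+Dy)/4, (Az+Bz+Cz+Dz)/4]

def apply_offset(tracked_point, offset):
    """Shift a later tracking sample into the calibrated frame.

    Used when the tracking system cannot offset its own origin: the same
    offset is subtracted from every subsequent tracking point.
    """
    return np.asarray(tracked_point, dtype=float) - np.asarray(offset, dtype=float)

# Illustrative corner coordinates (not from the patent):
# offset = platform_origin_offset([1.2, 0.9, 0.0], [1.2, -0.9, 0.0],
#                                 [-1.2, 0.9, 0.0], [-1.2, -0.9, 0.0])
# corrected = apply_offset([0.4, 0.1, 0.02], offset)
```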
For the seat (including the seat operation buttons), the adjustable objects are the overall position of the seat and the seat back. The tracked position of the cushion is used as the position information of the whole seat in the virtual world, and the backrest angle rotates about the hinge point between the backrest and the seat. Two trackers are required: tracker A tracks the cushion and tracker B tracks the backrest;
the positional information of the seat is [ A ] x 、A y 、A z ]The rotation of the tracker B around the hinge point is [ B ] rotX ,B rotY ,B rotZ ]In which B is rotX And B rotZ Should approach 0;
The indication panel comprises the combination indicator lamps and the instrument panel;
The combination indicator lamps (left turn indicator, right turn indicator, parking brake warning lamp, high-beam indicator and power indicator) need no physical structure; their overall position is adjusted through tracking. When a tracking point is used to acquire the position of the panel, its long edge should be parallel to the Y axis of the tracker; the position coordinates of the combination indicator lamps in the virtual world are then [Ax, Ay, Az] and the attitude is [ArotX, ArotY, ArotZ]. During calibration, the ArotY acquired by the tracking system should be adjusted to approach 0 as closely as possible;
The instrument panel (speedometer, tachometer, fuel gauge and water temperature gauge) needs no physical structure; its overall position is adjusted through tracking. When a tracking point is used to acquire the position of the instrument panel, the long edge of the instrument panel plane should be parallel to the Y axis of the tracking system; the position coordinates of the instrument panel in the virtual world are then [Ax, Ay, Az] and the attitude is [ArotX, ArotY, ArotZ], in which ArotX and ArotZ should approach 0;
The steering wheel is replaceable and adjustable in position and angle. As shown in Fig. 3, the y coordinate of point A and the x and z coordinates of point B are aligned with the centroid of the steering wheel rim, i.e. the position coordinates of the steering wheel in the virtual world are [Bx, Ay, Bz] and the attitude is [BrotX, ArotY, BrotZ];
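The steering-wheel pose mixes components from the two calibration points; a minimal sketch of that composition follows, with illustrative names and assuming both points are already expressed in the virtual-world frame:

```python
def steering_wheel_pose(point_a: dict, point_b: dict) -> dict:
    """Compose the steering-wheel pose from calibration points A and B.

    Each point is a dict with 'pos' = (x, y, z) and 'rot' = (rotX, rotY, rotZ).
    Position uses Bx, Ay, Bz and attitude uses BrotX, ArotY, BrotZ, as above.
    """
    ax, ay, az = point_a["pos"]
    bx, by, bz = point_b["pos"]
    a_rx, a_ry, a_rz = point_a["rot"]
    b_rx, b_ry, b_rz = point_b["rot"]
    return {"pos": (bx, ay, bz), "rot": (b_rx, a_ry, b_rz)}
```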
The turn-signal switch, the wiper switch and the gear lever are replaceable and adjustable in position and angle;
Because these devices are hinged to the steering-wheel control assembly, the position information of the steering wheel can be used directly to obtain accurate positioning; only the angles of their various states need to be calibrated. The angle can be taken either from the angle information contained in optical or other position-tracking data, or from an angle sensor;
The headlight switch, the clearance-lamp switch, the brake pedal and the accelerator pedal are replaceable and adjustable in position and angle; a tracking device is mounted at the pedal position with a fixture, the two limit positions are tracked, and the device is removed after calibration;
The safety belt is replaceable and adjustable in position. As the only flexible structure at the cockpit position, its calibration can be simplified to two-point calibration and two-point tracking: the two fixed ends of the belt calibrate its position, and tracking devices are placed at the left-shoulder and chest load points of the belt to acquire the state of the belt during the driver's movements;
The above devices can all be calibrated quickly through vector points with direction; the specific method depends on the working environment of the whole system. The implementation schemes are as follows:
The following methods all assume that, when the adopted positioning system calibrates the virtual and physical world spaces, the origin of the coordinate system (X=0, Y=0, Z=0) can be offset and rotated according to a given rule; if the original positioning system does not include such a capability, the coordinate system must be converted, and the coordinate-offset calculation described above can serve as a useful supplement to the method;
Optical tracking marker scheme: the core of this scheme is to use an optical motion-capture system to track specifically designed rigid bodies, completing the alignment of the whole system and the tracking of operations;
LightHouse scheme: the core of this scheme is to use the head-mounted virtual reality display devices that HTC custom-developed and specially optimized for the Vive system; the self-consistency of such a system is high;
Sensor angle calculation scheme: the core of this scheme is to use various sensor devices to track the operations of the whole system;
Multi-technology hybrid scheme: at the current technical level, multi-technology hybridization is the most cost-effective scheme with the greatest development potential;
In fact, depending on the characteristics of the object and of the operation, hybrid tracking can achieve the best experience;
For example, a Tracker of the HTC Vive system is used as a positioning tool to position the virtual cockpit platform and match the relationship between the physical and virtual coordinates of the whole system; the Tracker provides the position and direction of the palm; several Leap Motion devices are combined to cover the operation area for gesture tracking; a power switch simulates the headlight switch and the wiper switch; and an angle sensor tracks the pedals;
Although the implementation cases may involve positioning based on inertial sensors combined with optical or laser positioning devices, the innovation of this method does not concern their specific calculation methods; the products of such methods, or the computation results of customized devices, are used only as sources of positioning information;
According to the steps required for calibration positioning, the method can be divided into one-point positioning, two-point positioning and multi-point positioning for the rapid calibration of the virtual and physical worlds:
One-point positioning is used for operation objects that only need their position confirmed and either do not change direction or whose direction change does not affect the experience: the instrument panel, the steering wheel, the brake pedal, the accelerator pedal, the parking brake, the combination indicator lamps and the indication panel. One-point positioning takes only the position information of the tracking device; the position and direction in the virtual world can be accurately determined by taking the coordinate directions of the virtual cockpit base platform as reference;
Two-point positioning is used for operation objects whose position and rotation must be confirmed, whose direction changes, and whose direction change affects the experience: the virtual cockpit base platform, the turn-signal switch, the gear lever, the headlight switch, the clearance-lamp switch and the wiper switch. Two-point positioning includes the position and direction information of the tracking devices and can track the dynamic changes of the object;
Multi-point positioning is used for the calibration of the entire virtual cockpit base platform and for operation objects whose own configuration may change and affect the experience: the seat and the safety belt. Multi-point positioning includes position and direction information, uses a positioning assembly of at least three tracking devices, and can define the state change of a rigid body;
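The grouping above can be captured in a small configuration table; the following sketch maps each operation object to the number of calibration points it needs (the object names and the table itself are illustrative, not part of the patent):

```python
from enum import Enum

class Calibration(Enum):
    ONE_POINT = 1    # position only
    TWO_POINT = 2    # position + direction
    MULTI_POINT = 3  # at least three tracking devices; rigid-body state change

CALIBRATION_PLAN = {
    "instrument_panel":      Calibration.ONE_POINT,
    "steering_wheel":        Calibration.ONE_POINT,
    "brake_pedal":           Calibration.ONE_POINT,
    "accelerator_pedal":     Calibration.ONE_POINT,
    "parking_brake":         Calibration.ONE_POINT,
    "indication_panel":      Calibration.ONE_POINT,
    "turn_signal_switch":    Calibration.TWO_POINT,
    "gear_lever":            Calibration.TWO_POINT,
    "headlight_switch":      Calibration.TWO_POINT,
    "clearance_lamp_switch": Calibration.TWO_POINT,
    "wiper_switch":          Calibration.TWO_POINT,
    # the base platform is also listed under two-point positioning above, but its
    # own calibration uses the four-corner (multi-point) procedure described earlier
    "base_platform":         Calibration.MULTI_POINT,
    "seat":                  Calibration.MULTI_POINT,
    "seat_belt":             Calibration.MULTI_POINT,
}

def required_points(obj: str) -> int:
    """Minimum number of tracked points needed to calibrate the object."""
    return CALIBRATION_PLAN[obj].value
```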
A point in calibration positioning refers to a vector with position and direction in space, not a point with only position information in three-dimensional space; the specific acquisition method differs according to the positioning technology used;
The three calibration methods correspond to several possible devices (an HTC Vive handle, a tracker with a laser emitter, an optical rigid body with an electronic compass);
A level (bubble or electronic) may also be used in combination; through such a device a coordinate system relating the physical world and the virtual world is generated;
Provided that the relative error between the physical operation object and the virtual object remains acceptable, this method can greatly reduce the difficulty of calibrating the physical objects and the virtual objects in the system;
For the calibration operation to achieve the best result, a 1:1 model should be established;
Step two: tracking the operations of the physical objects of the virtual cockpit, specifically as follows:
Step S1: the driver operates a physical object in the virtual cockpit, and the driver's operation timing is designed;
Step S2: the tracking system first tracks the driver's operation in the cockpit, and then maps the physical object operation to the corresponding virtual object operation in the virtual scene, using different communication protocols so that the real-time or reliability requirements of the different data types are met; the user thus observes and experiences the physical world through natural operation;
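Step S2 leaves the choice of communication protocols open; one hedged reading is that continuous pose streams favor low latency while discrete events and calibration data need reliable delivery. The sketch below expresses such a dispatch rule; the categories and the UDP/TCP mapping are assumptions, not specified by the patent:

```python
from enum import Enum, auto

class DataKind(Enum):
    POSE_STREAM = auto()     # continuous tracker poses: latency matters most
    DISCRETE_EVENT = auto()  # switch toggles, gear changes: must not be lost
    CALIBRATION = auto()     # calibration data: must not be lost

def choose_transport(kind: DataKind) -> str:
    """Pick a transport so each data type meets its real-time or reliability need."""
    if kind is DataKind.POSE_STREAM:
        return "UDP"   # drop late packets rather than stall the rendered frame
    return "TCP"       # reliable, ordered delivery for events and calibration data
```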
For devices that require only a one-time adjustment (e.g., the cockpit base platform, the electronic instrument panel and the seat), only the position is tracked, and a warning is issued when the positional offset exceeds a threshold, asking the user to confirm whether the entire system needs to be recalibrated;
For devices with assisted motion (such as the headlight switch, the wiper and the high/low-beam control), only the triggering of the action is tracked, and an adjustable animation control mode is provided;
For devices requiring precise control (such as the accelerator pedal, the brake pedal and the steering wheel), their variation is precisely recorded, and they must be calibrated before use begins;
Alternatively, a positioning device is mounted on the palm to track its position and direction; it can be combined with a gesture-tracking system (such as the infrared optical gesture tracker Leap Motion, the inertial-sensor motion-capture glove Hi5, fiber-stretch deflection-sensor motion-capture gloves, resistive bending sensors or flexible optical-fiber sensors) to simulate hand motion and obtain the operations from the sensor system;
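The three tracking policies above (one-time adjustment, assisted trigger, precise recording) could be expressed as a small dispatch; the following sketch uses illustrative names and a made-up 2 cm drift threshold:

```python
DRIFT_THRESHOLD_M = 0.02   # illustrative: 2 cm of positional drift triggers a warning

def check_static_device(name, calibrated_pos, tracked_pos):
    """One-time-adjustment devices: warn when the tracked position drifts."""
    drift = sum((c - t) ** 2 for c, t in zip(calibrated_pos, tracked_pos)) ** 0.5
    if drift > DRIFT_THRESHOLD_M:
        print(f"warning: {name} moved {drift:.3f} m - recalibrate the whole system?")

def on_assisted_trigger(name, play_animation):
    """Assisted devices (headlight switch, wiper): track only the trigger,
    then hand the motion off to an adjustable animation."""
    play_animation(name)

def record_precise_device(name, value, log):
    """Precisely controlled devices (pedals, steering wheel): record every change."""
    log.append((name, value))
```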
Step three: adjusting the interaction speed between the driver's physical operation and the response of the virtual objects in the virtual scene, so that the response of the virtual objects in the virtual scene is synchronized with the driver's physical operation state;
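Step three does not prescribe an algorithm for matching the virtual response to the physical operation; one simple possibility (an assumption, not the patent's method) is a rate-limited follower that advances the virtual state toward the latest tracked physical state each frame:

```python
def advance_virtual_state(virtual, physical, max_rate, dt):
    """Move the virtual value toward the latest tracked physical value.

    max_rate limits how fast the virtual object may respond (units per second);
    this is the 'interaction speed' being tuned, and with a sufficient rate the
    virtual response stays synchronized with the physical operation state.
    """
    step = max_rate * dt
    delta = physical - virtual
    if abs(delta) <= step:
        return physical                      # caught up this frame
    return virtual + step if delta > 0 else virtual - step

# Example: a steering-wheel angle followed at up to 720 deg/s at a 90 Hz frame rate
# angle = advance_virtual_state(angle, tracked_angle, max_rate=720.0, dt=1.0 / 90.0)
```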
Step four: displaying the response motion of the virtual objects in the virtual scene on a display end (a stereoscopic display device or a spectator display) in real time, so that the driver's physical operations in the virtual cockpit are visually visible and tactilely real;
In consideration of cost and immersion, this scheme is mainly directed at cockpit systems that use a VR head-mounted display for driving simulation, and it is also compatible with use scenes based on surround stereoscopic projection or modified vehicle-window screens;
The method recommends using a game engine for dynamic visual computation; through the VRPN protocol it is also compatible with industrial visualization engines and other engines;
Combination of virtual vision and physical haptics: this scheme recommends a cockpit simulation device with replaceable parts and a dynamically adjustable layout, and is compatible with virtual driving simulation setups such as modified real vehicles and six-axis seat platforms;
although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (8)
1. A virtual-real interaction method for a virtual cockpit, characterized by comprising the following steps:
calibrating the position angle relation between the whole physical object system of the virtual cabin and a virtual world corresponding to a tracking system;
the physical object constitution of the virtual cockpit comprises a virtual cockpit base platform, the position and the angle of the virtual cockpit base platform are integrally adjusted when the whole cockpit is deployed, and the specific method comprises the following steps:
setting AC as the heading direction of the virtual cockpit base platform; at the four symmetrically placed points A, B, C and D, using tracked objects of identical structure, or sampling the same tracked object at the four positions in turn, to obtain the information of the four points; aligning the intersection of AD and BC with the coordinate origin of the tracking system; aligning the line connecting the midpoint of AC and the midpoint of BD with the X axis of the tracking system, with the positive X direction pointing from the origin toward the midpoint of AC; and aligning the line connecting the midpoint of AB and the midpoint of CD with the Y axis, with the positive Y direction pointing from the origin toward the midpoint of AB;
shifting the points obtained in the tracking system from the physical position of the virtual cockpit base platform according to the virtual world coordinate system; from the acquired virtual world coordinates [Ax, Ay, Az], [Bx, By, Bz], [Cx, Cy, Cz], [Dx, Dy, Dz] of the four points in the tracking system, the coordinates of the intersection of AD and BC in the virtual world can be calculated as [(Ax+Bx+Cx+Dx)/4, (Ay+By+Cy+Dy)/4, (Az+Bz+Cz+Dz)/4], and the origin of the virtual world coordinates is then shifted to this point;
step two, tracking the operations of the physical objects of the virtual cockpit, specifically as follows:
step S1, the driver operates a physical object in the virtual cockpit, and the driver's operation timing is designed;
step S2, the tracking system first tracks the driver's operation in the cockpit, and then maps the physical object operation to the corresponding virtual object operation in the virtual scene, using different communication protocols so that the real-time or reliability requirements of the different data types are met, so that the user observes and experiences the physical world through natural operation;
step three, adjusting the interaction speed between the driver's physical operation and the response of the virtual objects in the virtual scene, so that the response of the virtual objects in the virtual scene is synchronized with the driver's physical operation state;
and step four, displaying the response motion of the virtual objects in the virtual scene on a display end in real time, so that the driver's physical operations in the virtual cockpit are visually visible and tactilely real.
2. The virtual cockpit virtual-real interaction method of claim 1, wherein said first step: on the basis that the calibration of the tracking system is completed, the whole physical object of the virtual cabin is in one-to-one correspondence with the virtual object in the coordinate system of the virtual world corresponding to the tracking system.
3. The virtual cockpit virtual-real interaction method of claim 2, wherein said first step: the rapid calibration of the virtual world and the physical world comprises the following steps: one-point positioning, two-point positioning, and multi-point positioning.
4. The virtual cockpit virtual-real interaction method of claim 3, wherein the one-point positioning is used for operation objects that only need their position confirmed and either do not change direction or whose direction change does not affect the experience: the instrument panel, the steering wheel, the brake pedal, the accelerator pedal, the parking brake, the combination indicator lamps and the indication panel.
5. The virtual cockpit virtual-real interaction method of claim 3, wherein the two-point positioning is used for operation objects whose position and rotation must be confirmed, whose direction changes, and whose direction change affects the experience: the virtual cockpit base platform, the turn-signal switch, the gear lever, the headlight switch, the clearance-lamp switch and the wiper switch.
6. The virtual cockpit virtual-real interaction method of claim 3, wherein the multi-point positioning is used for calibration of the entire virtual cockpit base platform and for operation objects whose own configuration may change and affect the experience: the seat and the safety belt.
7. The virtual cockpit virtual-real interaction method of any one of claims 1-6, wherein the method is used for a cockpit system that uses a VR head-mounted display for driving simulation, and is compatible with use scenes based on surround stereoscopic projection or modified vehicle-window screens.
8. The virtual cockpit virtual-real interaction method of any one of claims 1-6, wherein the method uses a game engine for dynamic visual computation, and is also compatible with industrial visualization engines and other engines through the VRPN protocol.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110904548.9A (granted as CN113552950B) | 2021-08-06 | 2021-08-06 | Virtual and real interaction method for virtual cockpit |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113552950A CN113552950A (en) | 2021-10-26 |
CN113552950B (en) | 2022-09-20
Family
ID=78134194
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110904548.9A (granted as CN113552950B, active) | Virtual and real interaction method for virtual cockpit | 2021-08-06 | 2021-08-06
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113552950B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104133378A (en) * | 2014-08-05 | 2014-11-05 | 中国民用航空总局第二研究所 | Real-time simulation platform for airport activity area monitoring guidance system |
CN105159145A (en) * | 2015-09-24 | 2015-12-16 | 吉林大学 | Movement following correction method for visual system of open driving simulator |
CN110136535A (en) * | 2018-02-09 | 2019-08-16 | 深圳市掌网科技股份有限公司 | Examination of driver simulation system and method |
CN110610547A (en) * | 2019-09-18 | 2019-12-24 | 深圳市瑞立视多媒体科技有限公司 | Cabin training method and system based on virtual reality and storage medium |
CN111314484A (en) * | 2020-03-06 | 2020-06-19 | 王春花 | Virtual reality data synchronization method and device and virtual reality server |
CN112669671A (en) * | 2020-12-28 | 2021-04-16 | 北京航空航天大学江西研究院 | Mixed reality flight simulation system based on physical interaction |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110188689B (en) * | 2019-05-30 | 2022-11-29 | 重庆大学 | Virtual driving target collision detection method based on real scene modeling |
CN110764620A (en) * | 2019-10-30 | 2020-02-07 | 中仿智能科技(上海)股份有限公司 | Enhanced semi-virtual reality aircraft cabin system |
- 2021-08-06: application CN202110904548.9A filed (CN); granted as CN113552950B, status active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108803607B (en) | Multifunctional simulation system for automatic driving | |
JP7336184B2 (en) | Systems, methods, and tools for spatially aligning virtual content with a physical environment in an augmented reality platform | |
Thomas et al. | Augmented reality: An application of heads-up display technology to manual manufacturing processes | |
Brooks | What's real about virtual reality? | |
Zheng et al. | Virtual reality | |
CN106960612B (en) | One kind seeing vehicle and test ride simulation system and method based on VR | |
CN108681264A (en) | A kind of intelligent vehicle digitalized artificial test device | |
CN103050028B (en) | Driving simulator with stereoscopic vision follow-up function | |
CN112669671B (en) | Mixed reality flight simulation system based on physical interaction | |
CN103365416A (en) | System and method for virtual engineering | |
Pollini et al. | A synthetic environment for dynamic systems control and distributed simulation | |
Haeling et al. | In-car 6-dof mixed reality for rear-seat and co-driver entertainment | |
US6149435A (en) | Simulation method of a radio-controlled model airplane and its system | |
JP2018049258A (en) | Program and train operation simulator | |
JPH09138637A (en) | Pseudo visibility device | |
Zintl et al. | Development of a virtual reality simulator for eVTOL flight testing | |
CN113552950B (en) | Virtual and real interaction method for virtual cockpit | |
CN208655066U (en) | Automotive visibility evaluation system | |
CN112530022A (en) | Method for computer-implemented simulation of LIDAR sensors in a virtual environment | |
Zang et al. | Virtual reality and the application in virtual experiment for agricultural equipment | |
KR20120101878A (en) | Telescope device and telescope screen creating method using hmd for ship handling simulator | |
JP2023140280A (en) | System for design, review and/or presentation of prototype solution for vehicle, corresponding operation method and computer program product | |
WO2017014671A1 (en) | Virtual reality driving simulator with added real objects | |
CN109634427B (en) | AR (augmented reality) glasses control system and control method based on head tracking | |
Tonnis et al. | Visualization of spatial sensor data in the context of automotive environment perception systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |