CN106960612B - VR-based car viewing and test-drive simulation system and method - Google Patents
Info
- Publication number
- CN106960612B (grant); application CN201710375620.7A (CN201710375620A)
- Authority
- CN
- China
- Prior art keywords
- signal
- automobile
- vehicle
- seat
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/04—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
- G09B9/052—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles characterised by provision for recording or measuring trainee's performance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/04—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
- G09B9/05—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles the view from a vehicle being simulated
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Educational Technology (AREA)
- Educational Administration (AREA)
- Business, Economics & Management (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses a VR-based car viewing and test-drive simulation system and method, relating to the field of car viewing and test-drive simulation. The application aims to solve the problem of poor user experience caused by inaccurate interaction effects, which result from the fact that the interaction generated by VR during a user's test drive or test ride cannot be quantified. The technical solution adopted by the application is as follows: based on the human skeleton information acquired by the acquisition terminal module and the vehicle control signals, scene signals and human pose signals acquired under the same reference coordinate system, a corresponding driving vision scene model and cockpit seat model are established; these signals are then input into the models in real time to generate the corresponding visual scene effects, haptic effects and seat change effects. By adopting this scheme, the user's interaction during a VR test drive or test ride is quantified, and test driving and car viewing are realized under the same coordinate system, so that the user's experience is more realistic. The present invention is applicable to fields related to the practical application of VR.
Description
Technical field
The present invention relates to the field of car viewing and test-drive simulation, and in particular to a VR-based car viewing and test-drive simulation system and method.
Background art
Virtual reality (VR) is an advanced human-computer interaction technology that simulates human behaviors in the natural environment, such as seeing, hearing and moving, and can present users with a three-dimensional image world identical to reality. In recent years, VR technology has developed rapidly, and people's production and daily life urgently need the conveniences that VR technology brings. In the automotive industry, "VR car viewing" has become a major current need. "VR car viewing" can solve problems such as the difficulty of viewing cars on site and the high cost of test drives. On-site car viewing requires dealers to stock vehicles of various styles, which is a high-cost and time-consuming process for merchants. Moreover, in a real test drive, the content a user can experience is limited by time and space. With a VR test drive, however, a user can experience a variety of driving scenes in a short time, such as sunny days, rainy days, snow and ice, urban roads and country roads. Therefore, VR car viewing not only saves cost but also improves the user's test-drive experience, and has broad application value.
However, some existing VR car viewing and driving simulations simply transplant VR technology into the field of car viewing and test-drive simulation, and cannot truly quantify the interaction information, so the interaction accuracy is poor. As a result, the user's VR experience is unsatisfactory, and the vehicles for sale cannot be displayed realistically. How to improve the interaction accuracy during VR car viewing and test driving, and thereby improve the user experience, has therefore become a key difficulty in further extending the application of VR to the test-drive field.
The prior art also faces a technical barrier in combining test driving with car viewing: the user stands in one case and sits in the other, and when the user switches from car viewing to test driving, existing VR systems can hardly bridge this transition smoothly, leading to poor sensory effects and an unrealistic experience. How to use image recognition to distinguish whether the user is viewing a car or taking a test drive, and how to realize VR car viewing and test driving under a unified coordinate system, are therefore problems to be solved.
Summary of the invention
The object of the present invention is as follows: since existing VR-based car viewing and test-driving cannot quantify the interaction, the interaction accuracy is low and the user's car viewing and test-drive effects are poor. In view of this, the present invention provides a VR-based car viewing and test-drive simulation system and method that quantifies the interaction, improves the interaction accuracy, and thereby enhances the user experience.
This application provides a VR-based car viewing and test-drive simulation system, the specific technical solution of which includes:
an acquisition terminal, a cloud, a haptic feedback module, a VR helmet module, a driving simulation cabin control module and a cockpit seat execution module;
the acquisition terminal acquires RGB-D data of the human body, encodes it and sends it to the cloud;
the VR helmet module detects human head pose information and sends it to the cloud, sends the three-dimensional model information and scene information of the automobile selected by the user, and receives the visual signal sent by the cloud to display the automobile three-dimensional model and its visual scene;
the driving simulation cabin control module obtains vehicle control signals and sends them to the cloud;
the cloud receives the automobile three-dimensional model information and scene information sent by the VR helmet module, the human head pose information, and the RGB-D data (i.e. the human skeleton position information) sent by the acquisition terminal, and computes the seat pose signal, the test-drive visual signal, the car-viewing visual signal and the haptic signal;
the cockpit seat execution module receives the seat pose signal sent by the remote end and controls the seat to change accordingly during test-drive simulation (including changes in the seat's tilt angle and seat jolting);
the haptic feedback module receives the haptic signal sent by the cloud and generates a signal for stimulating a human response during car-viewing simulation.
Specifically, the cloud receives and decodes the RGB-D data to compute the human skeleton position information; receives the automobile three-dimensional model information and performs haptic computation to obtain the haptic signal, which is sent to the haptic feedback module; receives the human head pose information and, combined with the automobile three-dimensional model data and the human skeleton position information, performs visual computation to obtain the car-viewing visual signal, which is sent to the VR helmet module; receives the human head pose information and, combined with the automobile three-dimensional model data, the human skeleton position information and the scene information, performs visual computation to obtain the test-drive visual signal, which is sent to the VR helmet module; and receives the scene information and vehicle control signals to compute the seat pose signal, which is sent to the cockpit seat execution module.
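The cloud-side fan-out described above can be sketched as a single routing function. This is a minimal illustrative sketch, not code from the patent; every function and field name here is an assumption.

```python
# Sketch of the cloud's signal routing: five inputs in, four outputs out.
# All identifiers are illustrative assumptions, not from the patent.

def decode_skeleton(rgbd_frame):
    """Placeholder for the RGB-D -> human-skeleton-position computation."""
    return rgbd_frame.get("skeleton", [])

def route_cloud_signals(rgbd_frame, head_pose, car_model, scene, control):
    """Map the incoming signals onto the four outputs the cloud produces."""
    skeleton = decode_skeleton(rgbd_frame)
    return {
        # -> haptic feedback module
        "haptic_signal": {"skeleton": skeleton, "car_model": car_model},
        # -> VR helmet module (car viewing)
        "car_viewing_visual": {"skeleton": skeleton, "car_model": car_model,
                               "head_pose": head_pose},
        # -> VR helmet module (test drive, adds the scene)
        "test_drive_visual": {"skeleton": skeleton, "car_model": car_model,
                              "head_pose": head_pose, "scene": scene},
        # -> cockpit seat execution module
        "seat_pose_signal": {"scene": scene, "control": control},
    }
```

The point of the sketch is that the same skeleton and model data feed both visual paths, while the seat pose depends only on the scene and control signals, exactly as the paragraph above lays out.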
Specifically, the acquisition terminal uses four ZED Stereo Cameras aimed at the human body to acquire the RGB-D data of the human body; the driving simulation cabin control module includes a seat, a steering wheel, a throttle, a brake, a clutch, a gear shifter and a cockpit state execution device; the VR helmet module includes a VR helmet; the haptic feedback module uses haptic feedback gloves with piezoelectric actuators; the cloud uses a server (any device with computing capability, not limited to a server); the VR helmet includes a pose sensor for detecting the human head pose signal, a car model selection button, and a display for showing the automobile three-dimensional model.
Specifically, the pose sensor of the VR helmet uses a gyroscope and an accelerometer.
This application provides a VR-based car viewing simulation method, the specific technical solution of which is as follows:
Step 1: generate the automobile three-dimensional model
Generate the automobile three-dimensional model according to user demand, establish a reference coordinate system centered on the automobile three-dimensional model, and obtain the automobile three-dimensional model coordinates (X_j, Y_j, Z_j), where j denotes a different automobile part; X_j, Y_j and Z_j denote the horizontal, longitudinal and vertical coordinates of each part of the automobile when no haptic signal is generated.
Step 2: acquire the user's skeleton position information and head pose signal
Obtain the user's skeleton position information and the pose signal of the user's head based on the reference coordinate system;
Step 3: generate the haptic signal and the visual signal
Generate the haptic signal based on the user's skeleton position information (X_i, Y_i, Z_i), where i denotes a corresponding skeleton point, and the automobile three-dimensional model coordinates (X_j, Y_j, Z_j); generate the visual signal based on the skeleton position information, the automobile three-dimensional model and the head pose signal; X_i, Y_i and Z_i denote the horizontal, longitudinal and vertical coordinates of a certain skeleton point in the human skeleton when no haptic signal is generated;
Step 4: generate the haptic effect and the visual effect
Generate the user's haptic effect based on the haptic signal of Step 3, and generate the user's visual effect based on the visual signal of Step 3.
Specifically, the haptic effect in Step 4 is generated as follows: the automobile three-dimensional model contains the spatial coordinate information of the automobile's outer surface, and the skeleton points contain the spatial coordinate information of each part of the human body. When a skeleton point's spatial coordinate value (x_i, y_i, z_i) reaches a surface coordinate value (x_j, y_j, z_j) of the automobile virtual model, pressure feedback is generated and the user experiences the haptic effect.
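The "reach" condition above can be sketched as a proximity test between skeleton points and model surface points. A minimal sketch under assumed names; the contact tolerance is an assumption, since the patent only states that the coordinates coincide.

```python
import math

# Sketch of the haptic contact test: a skeleton point (x_i, y_i, z_i) is
# considered to reach the car model surface when it comes within a small
# tolerance of a surface point (x_j, y_j, z_j). Tolerance value is assumed.

def haptic_contact(skeleton_pts, surface_pts, tol=0.02):
    """Return the (i, j) index pairs of skeleton/surface points in contact."""
    hits = []
    for i, p in enumerate(skeleton_pts):
        for j, q in enumerate(surface_pts):
            if math.dist(p, q) <= tol:
                hits.append((i, j))
    return hits
```

Each returned pair (i, j) identifies which part of the hand touched which part of the car, which is what the glove would need to drive the corresponding piezoelectric actuator.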
The user selects the car category to be viewed through the "car model selection button" of the VR helmet, thereby determining the automobile three-dimensional model. The acquisition terminal acquires RGB-D data in real time and encodes and transmits it, together with the camera parameters, to the cloud (the camera parameters are obtained by camera calibration). The cloud decodes the received data and computes the human skeleton positions from the decoded RGB-D data and camera parameters. The cloud then performs haptic computation by combining the automobile three-dimensional model with the human skeleton positions, i.e. when the position of the human hand coincides with the virtual car model, the corresponding haptic response is computed. The cloud transmits the computed "haptic signal" to the haptic feedback device, and the human body experiences the haptic effect through that device (the haptic feedback module usually uses haptic feedback gloves, so it is the hands of the human body that experience the haptic effect).
The principle of visual acquisition during the user's car viewing is as follows: its first half is the same as the haptic acquisition principle, i.e. the system determines the automobile three-dimensional model and obtains the human skeleton positions; in addition, the cloud obtains the position and posture of the head through the "pose sensor", and the skeleton positions are used for human-computer interaction. The human-computer interaction includes but is not limited to: when the position of the hand coincides with the door-handle position of the automobile three-dimensional model, a door-opening operation is triggered; when the position of the hand coincides with the trunk-button position of the automobile three-dimensional model, a trunk-opening operation is triggered; when the position of a finger coincides with the position of a button of the automobile three-dimensional model, a button-press operation is triggered.
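The coincidence-triggered interactions listed above amount to a table of trigger points checked against the hand position. The sketch below is purely illustrative; the trigger names, positions and tolerance are hypothetical, not taken from the patent.

```python
import math

# Hypothetical trigger table for the three interactions named in the text:
# door handle, trunk button, dashboard button. Positions are made-up
# coordinates in the car-centered reference frame.
TRIGGERS = {
    "open_door":    (1.0, 0.8, 1.0),
    "open_trunk":   (2.2, 0.0, 0.9),
    "press_button": (0.5, 0.4, 1.1),
}

def fired_interactions(hand_pos, triggers=TRIGGERS, tol=0.05):
    """Return the interactions whose trigger point coincides with the hand."""
    return [name for name, pos in triggers.items()
            if math.dist(hand_pos, pos) <= tol]
```

Because all positions live in the single reference frame centered on the car model, "coincides" reduces to a distance check, which is what makes the quantified interaction possible.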
This application provides a VR-based test-drive simulation method, the specific technical solution of which is as follows:
Step 1: generate the automobile three-dimensional model
Generate the automobile three-dimensional model according to user demand and establish a reference coordinate system centered on the automobile three-dimensional model;
Step 2: obtain the driving scene signal and the human head pose signal
Obtain the driving scene signal s based on the reference coordinate system and user demand, and obtain the human head pose signal p;
Step 3: obtain the vehicle driving control signals
Obtain the control signals of the throttle, brake, clutch, steering wheel and gear shifter of the vehicle based on the reference coordinate system;
Step 4: generate the corresponding driving vision model and cockpit seat model
Establish the driving vision model F(a, b, c, d, g, p, s) by learning and training based on the human head pose signal p and the vehicle driving control signals, and generate the cockpit seat model G(a, b, c, d, g, s) by learning and training based on the driving scene signal s and the vehicle driving control signals;
Step 5: generate real-time visual scene effects and seat pose effects
Input the human head pose signal p and the vehicle driving control signals acquired in real time into the driving vision model F(a, b, c, d, g, p, s) of Step 4 to compute the visual signal, and generate the corresponding visual scene effect for the user on receipt of the visual signal; input the driving scene signal s and the vehicle driving control signals acquired in real time into the cockpit seat model G(a, b, c, d, g, s) of Step 4 to compute the seat pose signal, and generate the seat change effect on receipt of the seat pose signal.
Specifically, the visual signal includes the switching speed of each output frame and the viewing angle of each output frame; the seat pose signal includes the tilt angle value of the seat.
The detailed process by which the driving vision is generated during the user's simulated test drive is as follows: the user selects a driving scene through a button on the VR helmet, and the VR helmet transmits the "scene signal" (s) to the driving model in the cloud; at the same time, the VR helmet obtains the position and posture of the human head through the pose sensor and transmits the "pose signal" (p) to the visual scene model F(a, b, c, d, g, p, s) in the cloud. The cockpit obtains the control signals through the throttle controller, brake controller, clutch controller, steering-wheel controller and gear-shifter controller acquired in real time, and transmits the control signals (a, b, c, d, g) to the cloud. The cloud computes the visual signal according to the visual scene model F(a, b, c, d, g, p, s) and the received signals (a, b, c, d, g, p, s), and transmits the visual signal to the display of the VR helmet; the human body obtains the simulated visual effect through the display.
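One step of the visual path above can be written out with the learned model F replaced by a hand-written stub, since the patent obtains F by training rather than specifying it in closed form. The stub, its coefficients, and the output fields are all assumptions for illustration.

```python
# Stub for the visual scene model F(a, b, c, d, g, p, s): the patent learns
# F from data; here a toy rule stands in so the data flow can be shown.
# The visual signal carries the frame switching speed and the view angle,
# per the specification above.

def F(a, b, c, d, g, p, s):
    # Assumed toy rule: base 30 fps, faster with throttle, slower with brake.
    frame_rate = 30 + 10 * a - 20 * b
    return {"frame_rate": frame_rate, "view_angle": p}

def visual_step(controls, head_pose, scene):
    """One control-period evaluation of the visual path."""
    a, b, c, d, g = controls
    return F(a, b, c, d, g, head_pose, scene)
```

The structure to note is only that (a, b, c, d, g) come from the cockpit, p from the helmet's pose sensor, and s from the scene selection, and that F consumes all seven.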
The detailed process by which the seat pose is generated during the user's simulated test drive is as follows: the user selects a driving scene through a button on the VR helmet, and the VR helmet transmits the "scene signal" (s) to the driving model in the cloud. The cockpit obtains the control signals through the throttle controller, brake controller, clutch controller, steering-wheel controller and gear-shifter controller, and transmits the control signals (a, b, c, d, g) to the cloud. The cloud computes the seat pose signal according to the seat pose model G(a, b, c, d, g, s) and the received signals (a, b, c, d, g, s), and transmits the seat pose signal to the cockpit seat; the human body obtains the seat pose executed by the cockpit seat.
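The seat-pose path admits the same kind of sketch, with the learned model G replaced by an assumed stub returning the two seat tilt angles. All coefficients and the scene handling are illustrative assumptions, not the patent's trained model.

```python
# Stub for the seat pose model G(a, b, c, d, g, s): returns the two seat
# tilt angles (phi, theta). The patent learns G; this toy rule merely shows
# that only the control signals and the scene enter, not the head pose.

def G(a, b, c, d, g, s):
    phi = 5.0 * a - 5.0 * b      # pitch back under throttle, forward under brake
    theta = 10.0 * d             # roll with the steering-wheel angle
    if s == "bumpy":
        phi += 2.0               # crude stand-in for jolting
    return phi, theta

def seat_step(controls, scene):
    """One control-period evaluation of the seat path."""
    a, b, c, d, g = controls
    return G(a, b, c, d, g, scene)
```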
In conclusion by adopting the above-described technical solution, the beneficial effects of the present invention are:
1. the application establishes corresponding driving vision model F (a, b, c, d, g, p, s) and cockpit by model learning
Seat model G (a, b, c, d, g, s), and reference frame is established, realize the amount of interaction of the user when seeing vehicle and test run
Change, so that the mapping relations between true driving data and visual signal or flight deck seat position are more accurately found, so that
User carries out that interaction when vehicle is seen in test run is more true to nature accurate, and the impression of picture and body that user sees is more acurrate to force using VR
Very, and then the user experience is improved;
2. there is also technical barriers to be how that distinguishing user using image recognition is seeing vehicle still in test ride for the prior art
And how under unified coordinate system realizing that VR's sees vehicle and test ride, the application is by the introducing of acquisition terminal, and with automobile three
Dimension module be reference frame and establish driving vision model F (a, b, c, d, g, p, s) and flight deck seat model G (a, b,
C, d, g, s), the function of seeing vehicle and test ride of realizing VR under unified coordinate system is realized, realizes and sees between vehicle and test run
Transformation experience, overcomes existing technical barrier;
3. experiencer's hand strap pressure sensitivity gloves on the spot in person can experience the feeling for touching automobile, make interactive feel deeper.Example
Such as: experiencer can simulate opening and closing arrangements for automotive doors, and the button of automobile is pressed in touching, and either on or off car light and operation automobile are shown
Screen, allows and sees that vehicle experience content is truer, interesting and abundant;
4. experiencer can simulate test ride automobile, simulated automotive is different scenes (such as different time, place and weather)
Driving situation makes test ride experience content abundant, solves the problems, such as that traditional test ride scene type is single;
5. the application provides to experiencer and virtually sees vehicle, experiencer is not only facilitated to watch all kinds of vehicles, but also reduce and sell vehicle
See the cost of vehicle.
Detailed description of the invention
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort. Through the drawings, the above and other objects, features and advantages of the present invention will become clearer.
Fig. 1 is a flowchart of the VR-based car viewing simulation system of the present invention;
Fig. 2 is a schematic diagram of the car viewing model;
Fig. 3 is a schematic diagram of the human skeleton;
Fig. 4 is a flowchart of the VR-based test-drive simulation system of the present invention;
Fig. 5 is a schematic diagram of the cockpit;
Fig. 6 is a schematic diagram of the seat pose model;
Fig. 7 is a schematic diagram of the generation of the driving vision model and the cockpit seat model;
Reference signs in the figures: 1 - throttle controller; 2 - brake controller; 3 - clutch controller; 4 - steering-wheel controller; 5 - gear-shifter controller; 6 - cockpit state execution device; a - throttle control signal; b - brake control signal; c - clutch control signal; d - steering-wheel control signal; g - gear-shifter control signal; s - scene signal; p - pose signal.
Specific embodiment
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them; all other embodiments obtained by those of ordinary skill in the art without creative effort on the basis of the embodiments of the present invention fall within the protection scope of the present invention.
The present invention is elaborated below with reference to Fig. 1 to Fig. 7.
Embodiment one
This embodiment provides a VR-based car viewing simulation system and method. The acquisition terminal uses ZED Stereo Cameras to obtain RGB-D data. The pose sensor of the VR helmet uses a gyroscope and an accelerometer to provide the head angle information. The haptic feedback device uses haptic feedback gloves with piezoelectric actuators. The ground arrangement is shown in Fig. 2: four ZED Stereo Cameras are placed at the four corners of the ground of the VR car-viewing site at a vertical height of 1 m, i.e. the three-dimensional coordinates of cameras C1, C2, C3 and C4 are (0, 0, 1), (10, 0, 1), (0, 10, 1) and (10, 10, 1) respectively, and the geometric center of the bottom surface of the virtual car model is placed at the plane coordinate (5, 5, 0).
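The ground layout just described can be written out directly from the coordinates the embodiment gives: four cameras at the corners of a 10 m x 10 m area at 1 m height, with the car centered on the floor. The helper function is an illustrative addition, not part of the patent.

```python
# Camera and car positions exactly as stated in the embodiment.
CAMERAS = {
    "C1": (0, 0, 1), "C2": (10, 0, 1),
    "C3": (0, 10, 1), "C4": (10, 10, 1),
}
CAR_CENTER = (5, 5, 0)

def layout_center(cameras):
    """Centroid of the camera ground positions (height ignored)."""
    xs = [x for x, _, _ in cameras.values()]
    ys = [y for _, y, _ in cameras.values()]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

A quick check confirms the design intent: the centroid of the four cameras lands on the car's ground position, so the car (and the user around it) sits symmetrically in all four fields of view.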
The specific process is shown in Fig. 1. The user enters the site of the VR car viewing system and puts on the VR helmet and the haptic feedback gloves; the user selects the car category to be viewed through the button on the VR helmet; the system obtains the RGB-D data of each part of the human body through the ZED Stereo Cameras; the system computes the human skeleton position information from the RGB-D data and the camera parameters, with the skeleton points shown in Fig. 3; haptic computation is performed according to the automobile three-dimensional model and the skeleton positions; and the display content is computed according to the automobile three-dimensional model, the skeleton positions and the user's head pose.
(1) Display content computation
The system computes the corresponding display content according to the skeleton positions, the automobile three-dimensional model and the head pose signal. For example, when the head rotates up or down, the visual signal rotates correspondingly; when the user's hand reaches the door-handle position, the system executes a door-opening operation.
Embodiments of the interactive part are as follows:
a. when the user's hand reaches the door-handle position, the system executes a door-opening operation;
b. when the user's hand reaches the trunk-button position, the system executes a trunk-opening operation;
c. when the user's hand reaches a button position, the corresponding button operation is executed, such as "ignition start", "turn on the lights" or "operate the multimedia display screen";
wherein the position of the user's hand is given by the spatial coordinate values (x_i, y_i, z_i) of skeleton points 1, 2, 3, 4, 5 and 6, with i denoting the corresponding skeleton point; the door handle, interior buttons and trunk button of the automobile have corresponding spatial coordinate values (x_j, y_j, z_j), with j denoting a different position on the automobile.
(2) Haptic computation
The automobile three-dimensional model contains the spatial coordinate information of the automobile's outer surface, and the skeleton points contain the spatial coordinate information of each part of the human hand. When a spatial coordinate value (x_i, y_i, z_i) of a user's hand skeleton point (i being point 1, 2, 3, 4, 5 or 6) reaches a surface coordinate value (x_j, y_j, z_j) of the automobile virtual model (j denoting a different position on the automobile), the pressure-sensitive gloves generate pressure feedback, simulating the sensation of touch.
Embodiment two
This embodiment provides a VR-based test-drive simulation system and method:
As shown in Fig. 7, data of the automobile under real conditions is collected for different scenes (e.g. different times, places and weather) and different control signals (throttle, brake, clutch, steering wheel and gear shifter). The time information is divided into different months, days and hours; the place is divided into urban road, country road, highway and mountain road; the weather is divided into snowy, rainy, sunny and cloudy. Through a deep learning network model, the driving models are learned, namely the visual scene model F(a, b, c, d, g, p, s) and the seat pose model G(a, b, c, d, g, s).
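The patent learns F and G from recorded driving data with a deep learning network. As a much simpler stand-in that shows the same fit-a-mapping idea, the sketch below fits a linear seat-pitch model phi ≈ w·a + v·b from (throttle, brake, pitch) samples by ordinary least squares, solving the 2×2 normal equations by hand. Purely illustrative: the model form, variable names and data are assumptions, not the patent's network.

```python
# Toy stand-in for "learning G from data": least-squares fit of
# phi = w*a + v*b from samples (a, b, phi), via the 2x2 normal equations.

def fit_pitch_model(samples):
    """samples: list of (a, b, phi). Returns (w, v) minimising
    sum((w*a + v*b - phi)**2)."""
    saa = sum(a * a for a, _, _ in samples)
    sbb = sum(b * b for _, b, _ in samples)
    sab = sum(a * b for a, b, _ in samples)
    sap = sum(a * p for a, _, p in samples)
    sbp = sum(b * p for _, b, p in samples)
    det = saa * sbb - sab * sab
    return ((sap * sbb - sbp * sab) / det,
            (saa * sbp - sab * sap) / det)
```

Given noise-free samples generated by phi = 5a - 3b, the fit recovers w = 5 and v = -3 exactly; the deep network in the patent plays the same role for the far richer mapping (a, b, c, d, g, s) → seat pose.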
Test-drive mode processing:
The controllers of the cockpit are the throttle a, brake b, clutch c, steering wheel d and gear shifter g; in addition, the actuator of the cockpit is the "cockpit state execution device" under the seat.
First, the user sits on the cockpit seat and selects the driving scene through the button on the VR helmet. The system enters the simulated test-drive mode, and the view on the display screen of the VR helmet begins. For example, if the scene selected by the user is "9:00 a.m. + sunny + mountain road", the parameter s obtained by the system is "9:00 a.m. + sunny + mountain road".
The processing flow of the system consists of repeating control periods; here a control period is set to 5 ms. In one control period, the user first inputs the control signals (a, b, c, d, g) to the system by operating the throttle controller 1, brake controller 2, clutch controller 3, steering-wheel controller 4 and gear-shifter controller 5. In addition, the system obtains the user's head pose parameter p through the gyroscope and accelerometer of the VR helmet. The visual scene model F(a, b, c, d, g, p, s) and the seat pose model G(a, b, c, d, g, s) in the cloud then compute the corresponding visual signal and cockpit seat signal from the parameters (a, b, c, d, g, p, s) obtained above. The computation of one control period thus ends, and the next control period begins.
The parameters are directly input into the seat pose model G(a, b, c, d, g, s). As shown in the spherical coordinate system of Fig. 6, the output of the function G is the tilt angle of the seat, namely the two angles φ and θ. According to the values of φ and θ, the seat tilts in each direction, simulating the seat pose in real driving situations such as going uphill, going downhill and jolting.
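As a small worked sketch of the uphill/downhill case, the pitch angle φ can be derived from the road grade: a rising road pitches the seat back, a falling road pitches it forward. The grade-to-angle rule and the clamp limit are assumptions for illustration; the patent obtains the angles from the learned model G.

```python
import math

# Sketch: road grade (rise/run * 100) -> seat pitch angle phi in degrees.
# Positive phi = tilt back (uphill), negative = tilt forward (downhill).
# The +/-15 degree clamp is an assumed mechanical limit of the seat.

def seat_pitch_deg(grade_percent, max_tilt=15.0):
    phi = math.degrees(math.atan(grade_percent / 100.0))
    return max(-max_tilt, min(max_tilt, phi))
```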
For example, when the user's head rotates, the system perceives the head pose through the pose sensor and makes the display of the VR helmet change the viewing angle correspondingly. When the road condition is uphill, the seat tilts backward; when the road condition is downhill, the seat tilts forward; when the road is uneven, the seat executes the corresponding jolting.
Claims (7)
1. A VR-based car viewing and test-drive simulation system, characterized by comprising:
an acquisition terminal, a cloud, a haptic feedback module, a VR helmet module, a driving simulation cabin control module and a cockpit seat execution module;
the acquisition terminal acquires RGB-D data of the human body, encodes it and sends it to the cloud;
the VR helmet module detects human head pose information and sends it to the cloud, sends the three-dimensional model information and scene information of the automobile selected by the user, and receives the visual signal sent by the cloud to display the automobile three-dimensional model and its visual scene;
the driving simulation cabin control module obtains vehicle control signals and sends them to the cloud;
the cloud receives the automobile three-dimensional model information and scene information sent by the VR helmet module, the human head pose information, and the RGB-D data sent by the acquisition terminal, and computes the seat pose signal, the test-drive visual signal, the car-viewing visual signal and the haptic signal;
the cockpit seat execution module receives the seat pose signal sent by the remote end and controls the seat to change accordingly during test-drive simulation;
the haptic feedback module receives the haptic signal sent by the cloud and generates a signal for stimulating a human response during car-viewing simulation.
2. The VR-based car viewing and test-drive simulation system of claim 1, characterized in that the cloud receives and decodes the RGB-D data to compute the human skeleton position information; receives the automobile three-dimensional model information and performs haptic computation to obtain the haptic signal, which is sent to the haptic feedback module; receives the human head pose information and, combined with the automobile three-dimensional model data and the human skeleton position information, performs visual computation to obtain the car-viewing visual signal, which is sent to the VR helmet module; receives the human head pose information and, combined with the automobile three-dimensional model data, the human skeleton position information and the scene information, performs visual computation to obtain the test-drive visual signal, which is sent to the VR helmet module; and receives the scene information and vehicle control signals to compute the seat pose signal, which is sent to the cockpit seat execution module.
3. The VR-based car viewing and test-drive simulation system of claim 1, characterized in that the acquisition terminal uses four ZED Stereo Cameras aimed at the human body to acquire the RGB-D data of the human body; the driving simulation cabin control module includes a seat, a steering wheel, a throttle, a brake, a clutch, a gear shifter and a cockpit state execution device; the VR helmet module includes a VR helmet; the haptic feedback module uses haptic feedback gloves with piezoelectric actuators; the cloud uses a server; the VR helmet includes a pose sensor for detecting the human head pose signal, a car model selection button, and a display for showing the automobile three-dimensional model.
4. The VR-based vehicle-viewing and test-drive simulation system according to claim 3, characterized in that the pose sensor of the VR helmet uses a gyroscope and an accelerometer.
5. A VR-based vehicle-viewing and test-drive simulation method, characterized by comprising a vehicle-viewing method and a test-drive method, the vehicle-viewing method comprising the following steps:
Step 1: generate the automobile three-dimensional model
Generate the automobile three-dimensional model according to user demand, establish a reference coordinate system centered on the automobile three-dimensional model, and obtain the automobile three-dimensional model coordinates (Xj, Yj, Zj), where j denotes the different automobile parts, Xj denotes the horizontal coordinate of each automobile part when no haptic signal is generated, Yj denotes the ordinate of each automobile part when no haptic signal is generated, and Zj denotes the vertical coordinate of each automobile part when no haptic signal is generated;
Step 2: acquire the user's skeleton position information and the pose signal of the user's head
Obtain the user's skeleton position information and the pose signal of the user's head based on the reference coordinate system;
Step 3: generate the haptic signal and the visual signal
Generate the haptic signal based on the user's skeleton position information (Xi, Yi, Zi) and the automobile three-dimensional model coordinates (Xj, Yj, Zj), and generate the visual signal based on the skeleton position information, the automobile three-dimensional model, and the head pose signal, where i denotes the corresponding skeleton point of the human body, Xi denotes the abscissa of a given skeleton point when no haptic signal is generated, Yi denotes the ordinate of a given skeleton point when no haptic signal is generated, and Zi denotes the vertical coordinate of a given skeleton point when no haptic signal is generated;
Step 4: generate the haptic effect and the visual effect
Generate the user's haptic effect based on the haptic signal of Step 3, and generate the user's visual effect based on the visual signal of Step 3;
the test-drive method comprising the following steps:
Step 1: generate the automobile three-dimensional model
Generate the automobile three-dimensional model according to user demand and establish a reference coordinate system centered on the automobile three-dimensional model;
Step 2: obtain the driving scene signal and the human head pose signal
Obtain the driving scene signal s based on the reference coordinate system and user demand, and obtain the human head pose signal p;
Step 3: obtain the vehicle driving control signals
Obtain the throttle, brake, clutch, steering wheel, and gear-shifter control signals of vehicle driving based on the reference coordinate system;
Step 4: generate the corresponding driving vision model and cockpit seat model
Establish the driving vision model F(a, b, c, d, g, p, s) by learning-based training from the human head pose signal p and the vehicle driving control signals, and generate the cockpit seat model G(a, b, c, d, g, s) by learning-based training from the driving scene signal s and the vehicle driving control signals, where p denotes the human head pose signal, s denotes the driving scene signal, F denotes the driving vision model, G denotes the cockpit seat model, a denotes the throttle, b denotes the brake, c denotes the clutch, d denotes the steering wheel, and g denotes the gear shifter;
Step 5: generate the real-time visual scene effect and seat pose effect
Input the human head pose signal p acquired in real time and the vehicle driving control signals into the driving vision model F(a, b, c, d, g, p, s) of Step 4 to compute the visual signal, and receive the visual signal to generate the corresponding visual scene effect for the user; input the driving scene signal s acquired in real time and the vehicle driving control signals into the cockpit seat model G(a, b, c, d, g, s) of Step 4 to compute the seat pose signal, and receive the seat pose signal to generate the seat variation effect.
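Steps 4 and 5 of the test-drive method can be sketched as a real-time loop over two trained models. The patent specifies only the inputs of F(a, b, c, d, g, p, s) and G(a, b, c, d, g, s), not their functional form, so the linear stand-ins below are illustrative assumptions only.

```python
# Hedged sketch of test-drive Steps 4-5: route real-time control and pose
# signals through stand-in models F (driving vision) and G (cockpit seat).
# The coefficients and model forms are assumptions, not the patent's.

def F(a, b, c, d, g, p, s):
    """Driving vision model: controls + head pose p + scene s -> visual signal."""
    return {"frame_rate": 60 + 30 * a,      # faster frame switching at high throttle
            "view_angle": p + 0.1 * d,      # head pose shifted by steering input
            "scene": s}

def G(a, b, c, d, g, s):
    """Cockpit seat model: controls + scene s -> seat tilt angle (degrees)."""
    return 5.0 * a - 8.0 * b  # accelerating tilts back, braking tilts forward

def realtime_step(controls, head_pose, scene):
    """One Step-5 cycle: controls = (throttle, brake, clutch, wheel, gear)."""
    a, b, c, d, g = controls
    visual = F(a, b, c, d, g, head_pose, scene)  # -> VR helmet
    seat = G(a, b, c, d, g, scene)               # -> seat actuator
    return visual, seat
```

The split mirrors the claim: the head pose p enters only the vision model F, while the seat model G depends only on the driving controls and scene.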
6. The VR-based vehicle-viewing and test-drive simulation method according to claim 5, characterized in that the haptic effect in Step 4 is generated as follows: the automobile three-dimensional model contains the spatial coordinate information of the automobile contour, and the skeleton points contain the spatial coordinate information of the skeleton points of each part of the human body; when the spatial coordinate value (xi, yi, zi) of a skeleton point reaches a spatial coordinate value (xj, yj, zj) of the automobile virtual model contour, pressure feedback is generated and the user experiences the haptic effect, where xi denotes the horizontal spatial coordinate of a skeleton point, xj denotes the horizontal spatial coordinate of the automobile virtual model contour, yi denotes the ordinate of a skeleton point, yj denotes the ordinate of the automobile virtual model contour, zi denotes the vertical spatial coordinate of a skeleton point, and zj denotes the vertical spatial coordinate of the automobile virtual model contour; i denotes a human skeleton point and j denotes a spatial point on the automobile virtual model contour.
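The claim-6 rule is a point-against-contour proximity test: pressure feedback fires when a skeleton point (xi, yi, zi) reaches a contour point (xj, yj, zj). A minimal sketch, assuming a small distance tolerance in place of exact coordinate equality (exact float equality is impractical; the tolerance value is an assumption, not from the patent):

```python
# Minimal sketch of the claim-6 haptic rule: one pressure flag per
# skeleton point i, tested against every contour point j.
import math

def touches(skeleton_point, contour_points, tol=0.01):
    """True if the skeleton point lies within tol (assumed metres) of any contour point."""
    return any(math.dist(skeleton_point, q) <= tol for q in contour_points)

def haptic_signal(skeleton_points, contour_points):
    # Pressure feedback is generated for each skeleton point touching the contour.
    return [touches(p, contour_points) for p in skeleton_points]
```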
7. The VR-based vehicle-viewing and test-drive simulation method according to claim 5, characterized in that the visual signal includes the switching speed of each output frame and the viewing angle of each output frame, and the seat pose signal includes the tilt angle value of the seat.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710375620.7A CN106960612B (en) | 2017-05-24 | 2017-05-24 | VR-based vehicle-viewing and test-drive simulation system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106960612A CN106960612A (en) | 2017-07-18 |
CN106960612B true CN106960612B (en) | 2019-08-16 |
Family
ID=59482470
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710375620.7A Active CN106960612B (en) | 2017-05-24 | 2017-05-24 | VR-based vehicle-viewing and test-drive simulation system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106960612B (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107591113A (en) * | 2017-09-27 | 2018-01-16 | 张四清 | A multi-directional automobile sales display platform |
CN107945604A (en) * | 2017-12-07 | 2018-04-20 | 中国航空工业集团公司西安航空计算技术研究所 | A kind of traffic safety immersion experiencing system based on virtual reality technology |
CN108091203A (en) * | 2017-12-07 | 2018-05-29 | 中国航空工业集团公司西安航空计算技术研究所 | A virtual-reality-based driving training system for stressful traffic scenes |
CN108039084A (en) * | 2017-12-15 | 2018-05-15 | 郑州日产汽车有限公司 | Automotive visibility evaluation method and system based on virtual reality |
CN108492008B (en) * | 2018-03-02 | 2021-06-04 | 上汽通用汽车有限公司 | Passenger car evaluation method, electronic equipment and storage medium |
CN109448130A (en) * | 2018-10-24 | 2019-03-08 | 成都旸谷信息技术有限公司 | Track emergency event experiencing system based on VR and BIM |
CN109377565A (en) * | 2018-10-25 | 2019-02-22 | 广州星唯信息科技有限公司 | A method for simulating realistic driving vision based on a three-dimensional spatial scene map |
CN111984853B (en) * | 2019-05-22 | 2024-03-22 | 北京车和家信息技术有限公司 | Test driving report generation method and cloud server |
CN110488979A (en) * | 2019-08-23 | 2019-11-22 | 北京枭龙科技有限公司 | An automobile display system based on augmented reality |
US11257391B2 (en) | 2020-04-27 | 2022-02-22 | Nithin S Senthil | System and a method for virtual learning of driving a vehicle |
CN111861666A (en) * | 2020-07-21 | 2020-10-30 | 上海仙豆智能机器人有限公司 | Vehicle information interaction method and device |
CN113178112A (en) * | 2021-04-26 | 2021-07-27 | 重庆电子工程职业学院 | Artificial intelligence VR device |
CN113946259B (en) * | 2021-09-18 | 2023-04-07 | 北京城市网邻信息技术有限公司 | Vehicle information processing method and device, electronic equipment and readable medium |
CN114397113A (en) * | 2021-12-28 | 2022-04-26 | 重庆长安汽车股份有限公司 | Automobile CMF model evaluation system and method based on VR technology |
CN114860082A (en) * | 2022-05-30 | 2022-08-05 | 歌尔股份有限公司 | Handle control method, device and computer readable storage medium |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2000234240A1 (en) * | 2000-02-18 | 2001-08-27 | Prima Industries Establishment | Virtual reality testdrive system |
KR20030029360A (en) * | 2001-10-08 | 2003-04-14 | 현대자동차주식회사 | Driving simulator system |
CN103479138A (en) * | 2013-08-08 | 2014-01-01 | 罗轶 | Interactive virtual reality car show platform |
DE102014010309B4 (en) * | 2014-07-11 | 2017-11-23 | Audi Ag | View additional content in a virtual scenery |
DE102015200157A1 (en) * | 2015-01-08 | 2016-07-28 | Avl List Gmbh | Method for operating a driving simulator |
CN105654808A (en) * | 2016-02-03 | 2016-06-08 | 北京易驾佳信息科技有限公司 | Intelligent training system for vehicle driver based on actual vehicle |
CN106448337A (en) * | 2016-09-20 | 2017-02-22 | 山西省交通科学研究院 | VR technology based automobile driving simulation device |
CN106530891A (en) * | 2017-01-03 | 2017-03-22 | 刘晨 | Driving simulation system based on VR technology |
- 2017-05-24: application CN201710375620.7A filed in China; granted as patent CN106960612B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN106960612A (en) | 2017-07-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106960612B (en) | VR-based vehicle-viewing and test-drive simulation system and method | |
CN103246351B (en) | A kind of user interactive system and method | |
RU2019131365A (en) | SYSTEM AND METHOD FOR CONTROLLING ADDITIONAL ATTRACTIONS FOR RIDING | |
CN105807922A (en) | Implementation method, device and system for virtual reality entertainment driving | |
CN104464438A (en) | Virtual reality technology-based automobile driving simulation training system | |
Deligiannidis et al. | The vr scooter: Wind and tactile feedback improve user performance | |
EP3591503B1 (en) | Rendering of mediated reality content | |
CN106621324A (en) | Interactive operation method of VR game | |
CN106484982A (en) | Method for designing based on virtual reality and VR system during automobile product development | |
KR20170005971A (en) | Training simulator and method for special vehicles using argmented reality technology | |
KR101865282B1 (en) | Virtual Reality Skeleton Simulator Motion Control Device and Integrated Control Method thereof | |
CN108922307A (en) | Drive simulating training method, device and driving simulation system | |
CN108995590A (en) | A kind of people's vehicle interactive approach, system and device | |
CN107621880A (en) | A kind of robot wheel chair interaction control method based on improvement head orientation estimation method | |
CN110930811B (en) | System suitable for unmanned decision learning and training | |
CN109686165A (en) | A kind of the train lightweight simulation system and its construction method of multiple motion platforms | |
CN112669671B (en) | Mixed reality flight simulation system based on physical interaction | |
CN208655066U (en) | Automotive visibility evaluation system | |
CN207337401U (en) | A kind of automobile cabin human-computer interaction assessment system | |
TWI343270B (en) | ||
US11797093B2 (en) | Integrating tactile nonvirtual controls in a virtual reality (VR) training simulator | |
WO2017014671A1 (en) | Virtual reality driving simulator with added real objects | |
CN113112888A (en) | AR real scene interactive simulation driving method | |
CN106621325B (en) | Simulation method and system for racing operation | |
CN110947175A (en) | High-simulation in-person three-screen body sense racing car |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||