CN110097799A - Virtual driving system based on real scene modeling - Google Patents

Virtual driving system based on real scene modeling

Info

Publication number
CN110097799A
Authority
CN
China
Prior art keywords
virtual
scene
driving vehicle
sphere
virtual driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910434811.5A
Other languages
Chinese (zh)
Other versions
CN110097799B (en)
Inventor
沈志熙
宋永端
李聃
曾海林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University
Original Assignee
Chongqing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University
Priority to CN201910434811.5A
Publication of CN110097799A
Application granted
Publication of CN110097799B
Legal status: Active
Anticipated expiration

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • G09B9/02Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • G09B9/02Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
    • G09B9/05Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles the view from a vehicle being simulated

Abstract

The invention discloses a virtual driving system based on real scene modeling, comprising a computer, a driving operation input module, a virtual scene system and virtual reality hardware. The virtual scene system is loaded on the computer and includes a sphere scene module and a driving operation module. The driving operation input module includes a steering wheel and foot pedals, which are respectively connected to the computer through data interfaces. The virtual reality hardware includes a VR headset and a three-dimensional spatial tracking and positioning device. The sphere scene module includes a sphere model and real-scene video frames attached to the inner surface of the sphere model. The driving operation module includes a speed-change control module and a steering control module. The invention builds the virtual driving scene from recorded real-scene video, so the scene has high fidelity; combined with the virtual reality hardware, the user can experience an immersive driving sensation, which improves training efficiency and effect.

Description

Virtual driving system based on real scene modeling
Technical field
The present invention relates to the technical field of automobile virtual driving, and in particular to a virtual driving system based on real scene modeling.
Background technique
A driving simulation system is an advanced driving training tool. It imitates real environments containing various road conditions and weather, and is paired with simulated driving equipment such as a full-size steering wheel, foot pedals and a gear selector, so that the trainee can control a virtual car in the simulated scene in real time while pictures are played on a display and sound is emitted, thereby approaching the effect of real driving.
However, the user experience of existing driving simulators still differs greatly from real driving, mainly for two reasons: 1. the virtual driving scene is not realistic; 2. the operation is not intuitive. Current driving simulation products build their scenes by 3D modeling rather than from real material, so the user's perception in use differs considerably from a real scene. Moreover, most current driving simulation products use a flat display, which cannot provide an immersive impression; this also causes a large gap from the experience of actual driving and affects training quality.
Summary of the invention
In view of this, the object of the present invention is to provide a virtual driving system based on real scene modeling, so as to solve the technical problems that existing simulated driving products are time-consuming to build scenes by 3D modeling, the built scenes differ greatly from real scenes, and the operator cannot experience an immersive driving sensation.
The virtual driving system based on real scene modeling of the present invention comprises a computer, and further comprises a driving operation input module, a virtual scene system and virtual reality hardware. The virtual scene system is loaded on the computer and includes a sphere scene module and a driving operation module;
The driving operation input module includes a steering wheel and foot pedals, which are respectively connected to the computer through data interfaces;
The sphere scene module includes a sphere model built in the virtual space with Unity3D, real-scene video frames attached to the inner surface of the sphere model, a virtual driving vehicle placed at the center of the sphere model, and a virtual camera placed at the driver's position of the virtual driving vehicle;
The driving operation module includes a speed-change control module and a steering control module. The steering control module controls the steering of the virtual driving vehicle in the sphere scene module according to the steering wheel input, and the speed-change control module controls the playback speed of the real-scene video according to the foot pedal input;
The virtual reality hardware includes a VR headset and a three-dimensional spatial tracking and positioning device, which are respectively connected to the computer through data interfaces. The tracking and positioning device acquires the coordinate position and rotation angle of the VR headset in three-dimensional space and inputs the collected coordinates and angles into the virtual scene system as the reference input of the virtual camera, so as to control the angle and position of the virtual camera and make it rotate synchronously with the VR headset. The virtual camera captures the scene image in its current viewing direction, and this picture is output by the virtual scene system to the VR headset; the VR headset displays the picture output by the virtual scene system to the driver.
Further, the speed-change control module performs the following steps at runtime:
1) acquire the input signals of the foot pedals, including the throttle input signal value δ_1 and the brake input signal value δ_2; the value range of both input signals is [-32767, 32767];
2) according to the dynamic parameters of a certain actual vehicle, calculate the average acceleration values of that vehicle, including the throttle average acceleration a_c1 and the brake average acceleration a_c2:
In the above formulas, v_10 and v_11 respectively denote the initial and final speeds of the throttle acceleration process, in km/h, and the throttle acceleration time is Δt_1 seconds; v_20 and v_21 respectively denote the initial and final speeds of the brake deceleration process, in km/h, and the brake deceleration time is Δt_2 seconds;
3) according to the input signals of the foot pedals and the average acceleration values of the actual vehicle, calculate the throttle acceleration a_1, the brake acceleration a_2 and the current acceleration a of the virtual driving vehicle:
4) according to the current acceleration a, calculate the current speed v_D of the virtual driving vehicle:
v_D = v_D0 + (a·Δt)×3.6    (6)
In the above formula, v_D and v_D0 respectively denote the vehicle speed at the current moment and at the previous moment, in km/h, and the interval between two successive calculations is Δt seconds;
5) according to the current speed and the attitude deflection angle of the virtual driving vehicle, dynamically change the scene video playback rate coefficient k; the expression of the scene video playback rate coefficient k is as follows:
In the above formula, k_max is the maximum playback rate coefficient, whose value depends on the computer hardware configuration; v_0 is the moving speed of the camera when the real driving scene panoramic video was shot; v_D is the current driving speed of the virtual driving vehicle; v_0·k_max is the maximum driving speed allowed for the virtual driving vehicle; θ is the attitude deflection angle of the virtual driving vehicle, calculated by formula (8) below;
The steering control module performs the following steps at runtime:
1) acquire the input angle signal ω of the steering wheel;
2) dynamically change the attitude and lateral position of the virtual driving vehicle in the sphere scene according to the input angle signal of the steering wheel:
A) set the steering coefficient q = ω_s/θ_s, where ω_s is the one-sided maximum rotation angle of the steering wheel and θ_s is the maximum steering angle of the steered wheels of the virtual driving vehicle;
B) set the attitude deflection angle θ of the virtual driving vehicle in the sphere scene as:
where ω is the input angle signal of the steering wheel;
C) according to the current speed and attitude deflection angle of the virtual driving vehicle, obtain the lateral translation speed v_x of the vehicle:
v_x = v_D·sinθ    (9)
D) set the left-right translation range of the virtual driving vehicle in the sphere scene as R_c. When the translation of the virtual driving vehicle stays within R_c, i.e. x ∈ [-R_c, R_c], the virtual driving vehicle moves left and right within the scene video of the current lane; when the translation exceeds R_c, i.e. x < -R_c or x > R_c, the virtual driving vehicle switches to the scene video of the adjacent lane during the movement.
Beneficial effects of the present invention:
1. The virtual driving system based on real scene modeling of the present invention builds the virtual driving scene from recorded real-scene video, so the fidelity of the built scene is significantly higher than that of existing scenes built with 3D modeling software. The system can simulate basic functions such as acceleration, deceleration and steering, and, combined with the virtual reality hardware, the user can experience an immersive driving sensation, which improves training efficiency and effect.
2. With the virtual driving system based on real scene modeling of the present invention, the operator can not only observe the scene in front of the vehicle but, with the help of the VR headset, can also turn the head to observe the surrounding scene in the sphere scene, which further improves the immersive sense of realism while driving.
Detailed description of the invention
Fig. 1 is a structural diagram of the virtual driving system based on real scene modeling;
Fig. 2 is a work flow diagram of the virtual driving system based on real scene modeling;
Fig. 3 is a schematic diagram of the sphere scene;
Fig. 4 is a schematic diagram of the steering action in virtual driving.
Specific embodiment
The invention will be further described below with reference to the accompanying drawings and embodiments.
The virtual driving system based on real scene modeling of this embodiment comprises a computer, and further comprises a driving operation input module, a virtual scene system and virtual reality hardware. The virtual scene system is loaded on the computer and includes a sphere scene module and a driving operation module.
The driving operation input module includes a steering wheel and foot pedals, which are respectively connected to the computer through data interfaces; the foot pedals include an accelerator pedal and a brake pedal. In this embodiment, the steering wheel and pedals are the Logitech G29 steering wheel and pedal set, which connects to the computer through a USB interface. The steering wheel uses dual-motor force feedback technology and can realistically simulate force feedback effects; it rotates 900° from lock to lock, i.e. two and a half turns, the same as a real vehicle steering wheel, and a built-in Hall-effect steering sensor provides an accurate digital steering angle signal. The pedal unit keeps the driving posture closer to reality, and the non-linear brake pedal imitates a pressure-sensitive braking system, giving a sensitive and accurate braking feel. Of course, simulated driving steering wheels and pedals of other models may also be used in other embodiments.
The virtual reality hardware includes a VR headset and a three-dimensional spatial tracking and positioning device, which are respectively connected to the computer through data interfaces. The tracking and positioning device acquires the coordinate position and rotation angle of the VR headset in three-dimensional space, and the collected coordinates and angles are input into the virtual scene system as the reference input of the virtual camera, so as to control the angle and position of the virtual camera and make it rotate synchronously with the VR headset. In a specific implementation, the synchronization script from the VR development kit is added to the virtual camera; when the program runs, the coordinates and angles collected by the locator are used as the reference input of the synchronization script to control the angle and position of the virtual camera, so that the virtual camera follows the VR headset. The virtual camera captures the scene image in its current viewing direction, this picture is output by the virtual scene system to the VR headset, and the VR headset displays it to the driver.
In this embodiment, the virtual reality hardware is an HTC VIVE, a VR head-mounted product jointly developed by HTC and Valve. Of course, other virtual reality hardware may also be used in other embodiments.
The sphere scene module includes a sphere model built in the virtual space with Unity3D, real-scene video frames attached to the inner surface of the sphere model, a virtual driving vehicle placed at the center of the sphere model, and a virtual camera placed at the driver's position of the virtual driving vehicle. The specific modeling method of the sphere scene module is as follows: a real driving scene video is first recorded with a panoramic camera; after the panoramic video shooting is completed, the video is imported into a Unity3D project and the sphere scene is then set up. A sphere model object is created in the Unity3D virtual space and a VideoPlayer component is added to it. VideoPlayer is a component built into Unity3D; using this component together with the corresponding API, video can be played inside Unity3D, switched, paused, and its playback rate adjusted. The component is configured after it is added. The render mode (Render Mode) determines how the image is displayed and offers four options: Camera Far Plane, Camera Near Plane, Render Texture and Material Override. Considering how the picture needs to be shown on the sphere model, this embodiment uses the Material Override mode, and accordingly the sphere Sphere is selected as the Renderer. In the settings, Source indicates the video source, which can be selected in two ways: by directly selecting a video file in the project, or by specifying the storage location of the video on the computer through a URL.
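As a minimal sketch only, the same VideoPlayer configuration can also be made from a script attached to the sphere object instead of in the Inspector; the file path below is a placeholder and not part of the embodiment:

    using UnityEngine;
    using UnityEngine.Video;

    // Configures the VideoPlayer on the sphere model so the panoramic real-scene
    // video is rendered onto the sphere's material (Material Override mode).
    [RequireComponent(typeof(Renderer))]
    public class SphereSceneSetup : MonoBehaviour
    {
        void Start()
        {
            var player = gameObject.AddComponent<VideoPlayer>();

            // Render Mode: Material Override, targeting the sphere's own renderer.
            player.renderMode = VideoRenderMode.MaterialOverride;
            player.targetMaterialRenderer = GetComponent<Renderer>();
            player.targetMaterialProperty = "_MainTex";

            // Source: either a VideoClip asset from the project or a URL on disk.
            player.source = VideoSource.Url;
            player.url = "file:///path/to/panorama.mp4";   // placeholder path

            player.isLooping = true;
            player.playbackSpeed = 1f;   // later adjusted by the speed-change control module
            player.Play();
        }
    }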
After the VideoPlayer component is configured, the video is displayed on the surface of the sphere model object. In this embodiment the subjective viewpoint is placed at the center of the sphere so that the surrounding scene can be watched, so the rendering side of the video must be changed to the inner surface of the sphere, which is achieved by modifying the Shader file. A Shader is part of the computer graphics rendering pipeline; through a piece of program code it describes how objects in the scene are rendered. This process includes computing colors and the areas to be painted, after which the result is assigned to the object so that it is displayed. By modifying the Shader file, the rendering side of the sphere model is changed to the inside of the sphere, and the rendered brightness is kept constant (i.e. the influence of the Unity3D lighting system is cancelled).
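As an aside, an equivalent effect can be obtained without editing the Shader source by inverting the sphere mesh at runtime so that its faces point inward; this is not the method described in this embodiment, only an illustrative alternative sketch:

    using UnityEngine;

    // Inverts a sphere mesh so the video is visible from inside the sphere.
    [RequireComponent(typeof(MeshFilter))]
    public class InvertSphereMesh : MonoBehaviour
    {
        void Awake()
        {
            Mesh mesh = GetComponent<MeshFilter>().mesh;   // instance copy, not the shared asset

            // Flip the normals so the surface faces inward.
            Vector3[] normals = mesh.normals;
            for (int i = 0; i < normals.Length; i++)
                normals[i] = -normals[i];
            mesh.normals = normals;

            // Reverse the triangle winding so the inner faces are not back-face culled.
            for (int s = 0; s < mesh.subMeshCount; s++)
            {
                int[] tris = mesh.GetTriangles(s);
                for (int i = 0; i < tris.Length; i += 3)
                {
                    int tmp = tris[i];
                    tris[i] = tris[i + 2];
                    tris[i + 2] = tmp;
                }
                mesh.SetTriangles(tris, s);
            }
        }
    }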
After the video component and the Shader file have been processed, a virtual camera (Camera object) is placed at the driver's position of the car body model as the driver's subjective viewpoint, and the synchronization script from the VR development kit is added to the virtual camera Camera object. When the program runs, the locator detects the angle and position of the VR headset in the real environment; the detected angle and position are used as the reference input of the synchronization script to control the angle and position of the virtual camera, so that the virtual camera follows the VR headset, and the scene image captured by the virtual camera in its current viewing direction is input to the VR headset. When the system runs, the driver puts on the VR headset and can watch the real environment around in the sphere scene, which is very close to reality.
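For illustration, a minimal synchronization script of this kind could read the tracked pose of the headset through Unity's XR input API and apply it to the virtual camera; in practice the synchronization script shipped with the VR development kit (for example the camera prefab of the HTC VIVE plug-in) is used directly, and the class and field names below are assumptions:

    using UnityEngine;
    using UnityEngine.XR;

    // Drives the virtual camera from the tracked pose of the VR headset so the
    // in-scene view rotates and moves together with the driver's head.
    public class HeadTrackingSync : MonoBehaviour
    {
        public Transform driverSeat;   // anchor at the driver's position of the virtual vehicle

        void LateUpdate()
        {
            // Tracked headset pose reported by the locator, in tracking space.
            Vector3 headPosition = InputTracking.GetLocalPosition(XRNode.Head);
            Quaternion headRotation = InputTracking.GetLocalRotation(XRNode.Head);

            // Express the head pose relative to the driver seat inside the sphere scene.
            transform.position = driverSeat.TransformPoint(headPosition);
            transform.rotation = driverSeat.rotation * headRotation;
        }
    }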
The driving operation module includes a speed-change control module and a steering control module. The steering control module controls the steering of the virtual driving vehicle in the sphere scene module according to the steering wheel input, and the speed-change control module controls the playback speed of the real-scene video according to the foot pedal input.
In this embodiment, the speed-change control module performs the following steps at runtime:
1) acquire the input signals of the foot pedals, including the throttle input signal value δ_1 and the brake input signal value δ_2; the value range of both input signals is [-32767, 32767];
2) according to the dynamic parameters of a certain actual vehicle, calculate the average acceleration values of that vehicle, including the throttle average acceleration a_c1 and the brake average acceleration a_c2:
In the above formulas, v_10 and v_11 respectively denote the initial and final speeds of the throttle acceleration process, in km/h, and the throttle acceleration time is Δt_1 seconds; v_20 and v_21 respectively denote the initial and final speeds of the brake deceleration process, in km/h, and the brake deceleration time is Δt_2 seconds;
3) according to the input signals of the foot pedals and the average acceleration values of the actual vehicle, calculate the throttle acceleration a_1, the brake acceleration a_2 and the current acceleration a of the virtual driving vehicle:
4) according to the current acceleration a, calculate the current speed v_D of the virtual driving vehicle:
v_D = v_D0 + (a·Δt)×3.6    (6)
In the above formula, v_D and v_D0 respectively denote the vehicle speed at the current moment and at the previous moment, in km/h, and the interval between two successive calculations is Δt seconds;
5) according to the current speed and the attitude deflection angle of the virtual driving vehicle, dynamically change the scene video playback rate coefficient k; the expression of the scene video playback rate coefficient k is as follows:
In the above formula, k_max is the maximum playback rate coefficient, whose value depends on the computer hardware configuration and is usually set between 2 and 3; v_0 is the moving speed of the camera when the real driving scene panoramic video was shot; v_D is the current driving speed of the virtual driving vehicle; v_0·k_max is the maximum driving speed allowed for the virtual driving vehicle; θ is the attitude deflection angle of the virtual driving vehicle, calculated by formula (8) below.
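Formulas (1) to (5) and (7) appear only as images in the original document. Purely for illustration, the sketch below implements this step with assumed forms consistent with the variable definitions above (average acceleration as the speed change divided by 3.6 times the duration, pedal accelerations scaled by normalized pedal depression, and k as the ratio of the vehicle's forward speed component to v_0, limited to k_max); it should not be read as the literal formulas of the patent:

    using UnityEngine;

    // Illustrative sketch of the speed-change control step.
    public class SpeedChangeControl
    {
        // Dynamic parameters of a reference vehicle (speeds in km/h, times in seconds); values assumed.
        public float v10 = 0f,   v11 = 100f, dt1 = 10f;   // throttle acceleration phase
        public float v20 = 100f, v21 = 0f,   dt2 = 4f;    // brake deceleration phase

        public float kMax = 2.5f;   // maximum playback rate coefficient (2-3 per the text)
        public float v0 = 60f;      // camera speed when the panoramic video was shot, km/h (assumed)

        public float CurrentSpeed { get; private set; }   // v_D in km/h

        // throttle, brake: pedal depressions normalized to [0, 1] from the raw
        // [-32767, 32767] signals (the normalization convention is an assumption);
        // theta: attitude deflection angle in radians; dt: calculation interval in seconds.
        // Returns the scene video playback rate coefficient k.
        public float Step(float throttle, float brake, float theta, float dt)
        {
            // (1)(2) average accelerations of the reference vehicle, in m/s^2 (assumed form).
            float ac1 = (v11 - v10) / (3.6f * dt1);
            float ac2 = (v20 - v21) / (3.6f * dt2);

            // (3)(4)(5) accelerations of the virtual vehicle scaled by pedal depression (assumed form).
            float a1 = ac1 * throttle;
            float a2 = ac2 * brake;
            float a  = a1 - a2;

            // (6) current speed in km/h, as given in the text.
            CurrentSpeed = Mathf.Max(0f, CurrentSpeed + a * dt * 3.6f);

            // (7) playback rate coefficient (assumed form), limited to [0, kMax].
            return Mathf.Clamp(CurrentSpeed * Mathf.Cos(theta) / v0, 0f, kMax);
        }
    }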
The steering control module performs the following steps at runtime:
1) acquire the input angle signal ω of the steering wheel;
2) dynamically change the attitude and lateral position of the virtual driving vehicle in the sphere scene according to the input angle signal of the steering wheel:
A) set the steering coefficient q = ω_s/θ_s, where ω_s is the one-sided maximum rotation angle of the steering wheel and θ_s is the maximum steering angle of the steered wheels of the virtual driving vehicle;
B) set the attitude deflection angle θ of the virtual driving vehicle in the sphere scene as:
where ω is the input angle signal of the steering wheel;
C) according to the current speed and attitude deflection angle of the virtual driving vehicle, obtain the lateral translation speed v_x of the vehicle:
v_x = v_D·sinθ    (9)
D) set the left-right translation range of the virtual driving vehicle in the sphere scene as R_c. When the translation of the virtual driving vehicle stays within R_c, i.e. x ∈ [-R_c, R_c], the virtual driving vehicle moves left and right within the scene video of the current lane; when the translation exceeds R_c, i.e. x < -R_c or x > R_c, the virtual driving vehicle switches to the scene video of the adjacent lane during the movement.
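Formula (8) likewise appears only as an image; the sketch below assumes θ = ω/q, which matches the definition of the steering coefficient q above, and the numeric values are placeholders rather than values from the patent:

    using UnityEngine;

    // Illustrative sketch of the steering control step; formula (8) is assumed
    // to be theta = omega / q, consistent with the definition of q above.
    public class SteeringControl
    {
        public float omegaS = 450f;   // one-sided maximum steering wheel angle, degrees (G29: 900 deg lock to lock)
        public float thetaS = 35f;    // maximum steering angle of the steered wheels, degrees (assumed)
        public float Rc = 1.5f;       // left-right translation range within one lane, metres (assumed)

        public float x = 0f;          // lateral position of the virtual vehicle
        public int laneOffset = 0;    // how many lanes the vehicle has moved from the recorded lane

        // omega: steering wheel angle in degrees; vD: current speed in km/h; dt: step in seconds.
        // Returns the attitude deflection angle theta in radians.
        public float Step(float omega, float vD, float dt)
        {
            float q = omegaS / thetaS;                  // steering coefficient
            float theta = (omega / q) * Mathf.Deg2Rad;  // (8), assumed form

            float vx = (vD / 3.6f) * Mathf.Sin(theta);  // (9), lateral speed in m/s
            x += vx * dt;

            // When the translation leaves [-Rc, Rc], switch to the adjacent lane's scene video.
            if (x > Rc) { laneOffset += 1; x = -Rc; }
            else if (x < -Rc) { laneOffset -= 1; x = Rc; }

            return theta;
        }
    }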
In this embodiment there are two ways to use the G29 in the sphere scene built with Unity3D: one is to install the G29 driver and then call its API directly in Unity3D, the other is to treat the G29 as an ordinary peripheral and read the device inputs directly. This embodiment uses the second way. When the G29 is connected through the USB interface, Unity3D recognizes the main key mapping of the G29. Each input of the G29 device is set up in the Unity3D input manager (Input Manager), and each input is then read separately through the Input class functions of Unity3D.
Take reading the brake pedal input of the G29 as an example:
1. Create an input in the input manager and name it Brake; set the input sensitivity (Sensitivity) to 0.5 (adjustable according to the actual situation), the input type (Type) to Joystick Axis (a shaft-type input of an external device), and the axis (Axis) to the 4th axis (the 4th shaft-type input of the device).
2. In the control script, use the Input class function Input.GetAxisRaw("Brake") to obtain how far the user is pressing the brake. In this way, when the user performs various driving operations on the G29 device, the system learns the user's actions by reading the various inputs of the G29 and adjusts accordingly.
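As a minimal sketch under the same Input Manager setup, the pedals and the steering wheel can be read once per frame as follows; only the Brake axis is named in the text, so the Throttle and Steering axis names here are assumptions:

    using UnityEngine;

    // Reads the G29 axes configured in the Input Manager once per frame.
    public class G29Input : MonoBehaviour
    {
        public float Brake    { get; private set; }
        public float Throttle { get; private set; }
        public float Steering { get; private set; }

        void Update()
        {
            // GetAxisRaw returns the unsmoothed value of the configured joystick axis.
            Brake    = Input.GetAxisRaw("Brake");
            Throttle = Input.GetAxisRaw("Throttle");   // assumed axis name
            Steering = Input.GetAxisRaw("Steering");   // assumed axis name
        }
    }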
Then, a virtual driving vehicle is placed near the center of the sphere model, and the position of the virtual driving vehicle is adjusted so that the virtual camera sits exactly at the driver's seat.
The script of the driving control module is set up as follows:
1) If a steering wheel input is detected, the vehicle model attitude rotates left or right and the vehicle model position translates laterally according to the deflection direction and deflection angle of the steering wheel.
2) If an accelerator pedal input is detected, the playback rate of the panoramic video on the sphere model is increased according to how far the throttle is pressed; conversely, if a brake pedal input is detected, the playback rate of the panoramic video on the sphere model is decreased according to how far the pedal is pressed, down to 0.
With the above settings, steering, acceleration, deceleration and braking of the vehicle in the sphere scene can be controlled through the steering wheel and the foot pedals.
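Purely as an illustration of how the pieces fit together, a driving control script could combine the input, speed-change and steering sketches given above as follows; all class names, axis mappings and constants are assumptions, and switching to the adjacent lane's video clip is omitted:

    using UnityEngine;
    using UnityEngine.Video;

    // Glue script: maps steering to the vehicle model's rotation and lateral
    // translation, and maps the computed playback coefficient to the panoramic
    // video on the sphere. Assumes the illustrative helper classes sketched above.
    public class DrivingControlModule : MonoBehaviour
    {
        public Transform vehicleModel;    // virtual driving vehicle near the sphere centre
        public VideoPlayer sphereVideo;   // VideoPlayer component on the sphere model
        public G29Input input;            // wheel and pedal readings

        private SpeedChangeControl speed = new SpeedChangeControl();
        private SteeringControl steering = new SteeringControl();

        void Update()
        {
            float dt = Time.deltaTime;

            // Steering: rotate the vehicle model and translate it laterally.
            float omega = input.Steering * 450f;   // axis in [-1, 1] mapped to +/-450 degrees (assumption)
            float theta = steering.Step(omega, speed.CurrentSpeed, dt);
            vehicleModel.localRotation = Quaternion.Euler(0f, theta * Mathf.Rad2Deg, 0f);
            vehicleModel.localPosition = new Vector3(steering.x, 0f, 0f);

            // Pedals: the G29 pedal axes typically rest at +1 and read -1 when fully pressed (assumption).
            float throttle = Mathf.InverseLerp(1f, -1f, input.Throttle);
            float brake    = Mathf.InverseLerp(1f, -1f, input.Brake);

            // Speed change: adjust the panoramic video playback rate.
            float k = speed.Step(throttle, brake, theta, dt);
            sphereVideo.playbackSpeed = k;
        }
    }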
Finally, it should be noted that the above embodiments are only used to illustrate the technical solution of the present invention and not to limit it. Although the present invention has been described in detail with reference to preferred embodiments, those skilled in the art should understand that the technical solution of the present invention may be modified or equivalently replaced without departing from the purpose and scope of the technical solution of the present invention, all of which should be covered by the scope of the claims of the present invention.

Claims (2)

1. A virtual driving system based on real scene modeling, comprising a computer, characterized in that it further comprises a driving operation input module, a virtual scene system and virtual reality hardware, the virtual scene system being loaded on the computer and including a sphere scene module and a driving operation module;
the driving operation input module includes a steering wheel and foot pedals, which are respectively connected to the computer through data interfaces;
the sphere scene module includes a sphere model built in the virtual space with Unity3D, real-scene video frames attached to the inner surface of the sphere model, a virtual driving vehicle placed at the center of the sphere model, and a virtual camera placed at the driver's position of the virtual driving vehicle;
the driving operation module includes a speed-change control module and a steering control module, the steering control module being configured to control the steering of the virtual driving vehicle in the sphere scene module according to the steering wheel input, and the speed-change control module being configured to control the playback speed of the real-scene video according to the foot pedal input;
the virtual reality hardware includes a VR headset and a three-dimensional spatial tracking and positioning device, which are respectively connected to the computer through data interfaces; the tracking and positioning device is configured to acquire the coordinate position and rotation angle of the VR headset in three-dimensional space and to input the collected coordinates and angles into the virtual scene system as the reference input of the virtual camera, so as to control the angle and position of the virtual camera and make the virtual camera rotate synchronously with the VR headset; the virtual camera captures the scene image in its current viewing direction, and this picture is output by the virtual scene system to the VR headset; the VR headset is configured to display the picture output by the virtual scene system to the driver.
2. The virtual driving system based on real scene modeling according to claim 1, characterized in that the speed-change control module performs the following steps at runtime:
1) acquire the input signals of the foot pedals, including the throttle input signal value δ_1 and the brake input signal value δ_2; the value range of both input signals is [-32767, 32767];
2) according to the dynamic parameters of a certain actual vehicle, calculate the average acceleration values of that vehicle, including the throttle average acceleration a_c1 and the brake average acceleration a_c2:
in the above formulas, v_10 and v_11 respectively denote the initial and final speeds of the throttle acceleration process, in km/h, and the throttle acceleration time is Δt_1 seconds; v_20 and v_21 respectively denote the initial and final speeds of the brake deceleration process, in km/h, and the brake deceleration time is Δt_2 seconds;
3) according to the input signals of the foot pedals and the average acceleration values of the actual vehicle, calculate the throttle acceleration a_1, the brake acceleration a_2 and the current acceleration a of the virtual driving vehicle:
4) according to the current acceleration a, calculate the current speed v_D of the virtual driving vehicle:
v_D = v_D0 + (a·Δt)×3.6    (6)
in the above formula, v_D and v_D0 respectively denote the vehicle speed at the current moment and at the previous moment, in km/h, and the interval between two successive calculations is Δt seconds;
5) according to the current speed and the attitude deflection angle of the virtual driving vehicle, dynamically change the scene video playback rate coefficient k; the expression of the scene video playback rate coefficient k is as follows:
in the above formula, k_max is the maximum playback rate coefficient, whose value depends on the computer hardware configuration; v_0 is the moving speed of the camera when the real driving scene panoramic video was shot; v_D is the current driving speed of the virtual driving vehicle; v_0·k_max is the maximum driving speed allowed for the virtual driving vehicle; θ is the attitude deflection angle of the virtual driving vehicle, calculated by formula (8) below;
the steering control module performs the following steps at runtime:
1) acquire the input angle signal ω of the steering wheel;
2) dynamically change the attitude and lateral position of the virtual driving vehicle in the sphere scene according to the input angle signal of the steering wheel:
A) set the steering coefficient q = ω_s/θ_s, where ω_s is the one-sided maximum rotation angle of the steering wheel and θ_s is the maximum steering angle of the steered wheels of the virtual driving vehicle;
B) set the attitude deflection angle θ of the virtual driving vehicle in the sphere scene as:
where ω is the input angle signal of the steering wheel;
C) according to the current speed and attitude deflection angle of the virtual driving vehicle, obtain the lateral translation speed v_x of the vehicle:
v_x = v_D·sinθ    (9)
D) set the left-right translation range of the virtual driving vehicle in the sphere scene as R_c; when the translation of the virtual driving vehicle stays within R_c, i.e. x ∈ [-R_c, R_c], the virtual driving vehicle moves left and right within the scene video of the current lane; when the translation exceeds R_c, i.e. x < -R_c or x > R_c, the virtual driving vehicle switches to the scene video of the adjacent lane during the movement.
CN201910434811.5A 2019-05-23 2019-05-23 Virtual driving system based on real scene modeling Active CN110097799B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910434811.5A CN110097799B (en) 2019-05-23 2019-05-23 Virtual driving system based on real scene modeling

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910434811.5A CN110097799B (en) 2019-05-23 2019-05-23 Virtual driving system based on real scene modeling

Publications (2)

Publication Number Publication Date
CN110097799A true CN110097799A (en) 2019-08-06
CN110097799B CN110097799B (en) 2020-12-11

Family

ID=67448969

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910434811.5A Active CN110097799B (en) 2019-05-23 2019-05-23 Virtual driving system based on real scene modeling

Country Status (1)

Country Link
CN (1) CN110097799B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110570718A (en) * 2019-09-03 2019-12-13 福建百信信息技术有限公司 VR school bus system
CN110782728A (en) * 2019-11-01 2020-02-11 深圳风向标教育资源股份有限公司 New energy automobile training system based on intelligent teaching
CN110782738A (en) * 2019-11-06 2020-02-11 北京千种幻影科技有限公司 Driving simulation training device
CN111045425A (en) * 2019-12-05 2020-04-21 中国北方车辆研究所 Auxiliary teleoperation driving method for ground unmanned vehicle
CN111314484A (en) * 2020-03-06 2020-06-19 王春花 Virtual reality data synchronization method and device and virtual reality server
CN111915956A (en) * 2020-08-18 2020-11-10 湖南汽车工程职业学院 Virtual reality car driving teaching system based on 5G
CN112102680A (en) * 2020-08-27 2020-12-18 华东交通大学 Train driving teaching platform and method based on VR
CN112382165A (en) * 2020-11-19 2021-02-19 北京罗克维尔斯科技有限公司 Driving strategy generation method, device, medium, equipment and simulation system
CN112817453A (en) * 2021-01-29 2021-05-18 聚好看科技股份有限公司 Virtual reality equipment and sight following method of object in virtual reality scene
CN113157098A (en) * 2020-12-23 2021-07-23 武汉小绿人动力技术股份有限公司 Large-closed-space immersive driving system and control method
CN113534963A (en) * 2021-09-17 2021-10-22 北京启瞳智能科技有限公司 Urban route planning system and method based on VR
CN113946212A (en) * 2021-10-16 2022-01-18 天津大学 Steady driving test system based on virtual reality
CN113946259A (en) * 2021-09-18 2022-01-18 北京城市网邻信息技术有限公司 Vehicle information processing method and device, electronic equipment and readable medium
CN116312149A (en) * 2023-04-24 2023-06-23 武汉木仓科技股份有限公司 Driving test simulation method and device, electronic equipment and storage medium
CN116773216A (en) * 2023-06-12 2023-09-19 江苏泽景汽车电子股份有限公司 Test method, device, system, storage medium and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101251958A (en) * 2007-07-06 2008-08-27 浙江大学 Method for implementing automobile driving analog machine facing to disciplinarian
CN105898337A (en) * 2015-11-18 2016-08-24 乐视网信息技术(北京)股份有限公司 Panoramic video display method and device
CN106327946A (en) * 2016-10-21 2017-01-11 安徽协创物联网技术有限公司 Virtual reality integrated machine for driving training
CN106412424A (en) * 2016-09-20 2017-02-15 乐视控股(北京)有限公司 View adjusting method and device for panoramic video
CN107888894A (en) * 2017-10-12 2018-04-06 浙江零跑科技有限公司 A kind of solid is vehicle-mounted to look around method, system and vehicle-mounted control device
CN108668168A (en) * 2018-05-28 2018-10-16 烽火通信科技股份有限公司 Android VR video players and its design method based on Unity 3D

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101251958A (en) * 2007-07-06 2008-08-27 浙江大学 Method for implementing automobile driving analog machine facing to disciplinarian
CN105898337A (en) * 2015-11-18 2016-08-24 乐视网信息技术(北京)股份有限公司 Panoramic video display method and device
CN106412424A (en) * 2016-09-20 2017-02-15 乐视控股(北京)有限公司 View adjusting method and device for panoramic video
CN106327946A (en) * 2016-10-21 2017-01-11 安徽协创物联网技术有限公司 Virtual reality integrated machine for driving training
CN107888894A (en) * 2017-10-12 2018-04-06 浙江零跑科技有限公司 A kind of solid is vehicle-mounted to look around method, system and vehicle-mounted control device
CN108668168A (en) * 2018-05-28 2018-10-16 烽火通信科技股份有限公司 Android VR video players and its design method based on Unity 3D

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张茜: "Design and Implementation of an Automobile Function Simulation and Driving Scene Demonstration System Based on Unity 3D", China Master's Theses Full-text Database, Engineering Science and Technology II, Issue 3 *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110570718A (en) * 2019-09-03 2019-12-13 福建百信信息技术有限公司 VR school bus system
CN110782728A (en) * 2019-11-01 2020-02-11 深圳风向标教育资源股份有限公司 New energy automobile training system based on intelligent teaching
CN110782738A (en) * 2019-11-06 2020-02-11 北京千种幻影科技有限公司 Driving simulation training device
CN111045425A (en) * 2019-12-05 2020-04-21 中国北方车辆研究所 Auxiliary teleoperation driving method for ground unmanned vehicle
CN111045425B (en) * 2019-12-05 2023-04-28 中国北方车辆研究所 Auxiliary teleoperation driving method for ground unmanned vehicle
CN111314484A (en) * 2020-03-06 2020-06-19 王春花 Virtual reality data synchronization method and device and virtual reality server
CN111314484B (en) * 2020-03-06 2020-11-10 北京掌中飞天科技股份有限公司 Virtual reality data synchronization method and device and virtual reality server
CN111915956B (en) * 2020-08-18 2022-04-22 湖南汽车工程职业学院 Virtual reality car driving teaching system based on 5G
CN111915956A (en) * 2020-08-18 2020-11-10 湖南汽车工程职业学院 Virtual reality car driving teaching system based on 5G
CN112102680A (en) * 2020-08-27 2020-12-18 华东交通大学 Train driving teaching platform and method based on VR
CN112382165A (en) * 2020-11-19 2021-02-19 北京罗克维尔斯科技有限公司 Driving strategy generation method, device, medium, equipment and simulation system
CN113157098A (en) * 2020-12-23 2021-07-23 武汉小绿人动力技术股份有限公司 Large-closed-space immersive driving system and control method
CN113157098B (en) * 2020-12-23 2021-09-28 武汉小绿人动力技术股份有限公司 Large-closed-space immersive driving system and control method
CN112817453A (en) * 2021-01-29 2021-05-18 聚好看科技股份有限公司 Virtual reality equipment and sight following method of object in virtual reality scene
CN113534963A (en) * 2021-09-17 2021-10-22 北京启瞳智能科技有限公司 Urban route planning system and method based on VR
CN113946259A (en) * 2021-09-18 2022-01-18 北京城市网邻信息技术有限公司 Vehicle information processing method and device, electronic equipment and readable medium
CN113946259B (en) * 2021-09-18 2023-04-07 北京城市网邻信息技术有限公司 Vehicle information processing method and device, electronic equipment and readable medium
CN113946212A (en) * 2021-10-16 2022-01-18 天津大学 Steady driving test system based on virtual reality
CN116312149A (en) * 2023-04-24 2023-06-23 武汉木仓科技股份有限公司 Driving test simulation method and device, electronic equipment and storage medium
CN116773216A (en) * 2023-06-12 2023-09-19 江苏泽景汽车电子股份有限公司 Test method, device, system, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN110097799B (en) 2020-12-11

Similar Documents

Publication Publication Date Title
CN110097799A (en) Virtual driving system based on real scene modeling
CN110109552B (en) Virtual driving scene modeling method based on real environment
US10406433B2 (en) Method and system for applying gearing effects to visual tracking
JP4567805B2 (en) Method and apparatus for providing a gearing effect to an input based on one or more visual, acoustic, inertial and mixed data
CN100430691C (en) Vehicle steering sensing apparatus
US5919045A (en) Interactive race car simulator system
US7391409B2 (en) Method and system for applying gearing effects to multi-channel mixed input
US7352358B2 (en) Method and system for applying gearing effects to acoustical tracking
US7352359B2 (en) Method and system for applying gearing effects to inertial tracking
US6354838B1 (en) Interactive race car simulator system
US6384834B1 (en) Three-dimensional simulator apparatus and image synthesis method using texture computation and texture information storage
CN102426425A (en) Automobile ABS (Antilock Brake System) virtual reality simulation system
US20110181711A1 (en) Sequential image generation
JP2005003923A (en) Virtual driving system
CN202534194U (en) Yacht driving simulation system
CN112221117A (en) Driving simulation platform and method
CN101770707B (en) Camera based virtual vehicle driving system and virtual driving method
Yoshimoto et al. The history of research and development of driving simulators in Japan
CN109920035B (en) Dynamic special effect synthesis method based on mobile equipment augmented reality
KR100576839B1 (en) Seat-driving device for simulation system
US20210141972A1 (en) Method for generating an image data set for a computer-implemented simulation
Siegel et al. A gamified simulator and physical platform for self-driving algorithm training and validation
Han et al. A real-time virtual simulation environment for Advanced Driver Assistance System development
AU2005210514B2 (en) Vehicle steering sensing apparatus
JP2010284258A (en) Game device and game program

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant