CN108202669A - Adverse-weather vision enhancement driving assistance system and method based on vehicle-to-vehicle communication - Google Patents
Adverse-weather vision enhancement driving assistance system and method based on vehicle-to-vehicle communication
- Publication number
- CN108202669A CN201810011066.9A CN201810011066A
- Authority
- CN
- China
- Prior art keywords
- vehicle
- vision enhancement
- information
- vehicles
- truck
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8053—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for bad weather conditions or night vision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Traffic Control Systems (AREA)
Abstract
The present invention provides an adverse-weather vision enhancement driving assistance system based on vehicle-to-vehicle communication, comprising a central control unit, a vision enhancement unit, a perception unit, and a human-machine interaction unit. The central control unit exchanges information with the perception unit: based on the host-vehicle information and front-vehicle information transmitted by the perception unit, it performs decision-making and image matching, and sends host-vehicle information to the communication device in the perception unit. It sends the prompt messages that the driver needs displayed and the virtual image of the front vehicle to the vision enhancement unit, and exchanges data with the human-machine interaction unit, receiving driver commands and sending safety early-warning information. The vision enhancement unit receives the front-vehicle virtual image information sent by the central control unit and projects it outward, directing the projected image straight to the driver's eyes so that the size and position of the projected vision-enhancement image of the front vehicle coincide with the actual vehicle ahead. The present invention greatly improves driving safety in adverse weather.
Description
Technical field
The invention belongs to the field of intelligent connected driving assistance, and in particular relates to a vision-enhancement driving assistance system and method based on vehicle-to-vehicle communication for safe driving in adverse weather.
Background technology
Inter-vehicle collisions have always been one of the principal forms of traffic accidents, and collision avoidance between vehicles has become an urgent problem for countries around the world. As the operator of the vehicle, the driver has natural limitations in perception, decision-making, and execution, and his physiological and psychological state is complex and changeable. In complicated environments, inaccurate perception, incorrect decisions, or sluggish execution often cause inter-vehicle collisions; this is especially true in adverse weather such as rain, snow, hail, fog, haze, sand, and dust, where the driver's poor forward visibility makes such collisions even easier to induce.
To reduce the inter-vehicle collisions caused by drivers, Advanced Driver Assistance Systems (ADAS) for avoiding all kinds of inter-vehicle collisions have been widely studied and put into practical use. The overwhelming majority of collision-avoidance ADAS use on-board sensors such as millimeter-wave radar, lidar, and cameras to perceive the vehicles around the host vehicle, quantitatively evaluate the collision risk, and make driving-assistance decisions, thereby providing active assistance and information prompts for the driver. Although current collision-avoidance ADAS can indeed reduce some collisions, when the vehicle encounters adverse weather such as rain, snow, hail, fog, haze, sand, or dust, not only is the driver's visual perception severely impaired, but on-board sensors such as radar and cameras likewise fail, so collision-avoidance ADAS cannot provide effective driving assistance in adverse weather.
As vehicle-to-vehicle (V2V) communication technology matures, it has become a new tool for obtaining information about other vehicles. V2V communication not only greatly expands the sensing range of the host vehicle, but can also acquire accurate information about other vehicles directly, comprehensively, and in real time, without being limited by weather conditions. Its introduction creates a new breakthrough for further improving traffic safety, and novel intelligent connected driving assistance systems are being widely studied. However, most such systems currently install a display and loudspeaker on the in-vehicle console to provide driving-assistance information, which often distracts the driver from the road to watch the display; under low-visibility adverse-weather conditions this can instead easily cause a collision. What drivers really want while driving is a clear forward view of their own.
In summary, the main problems of prior-art driving assistance in adverse weather are:
(1) the vehicle environment sensors of traditional collision-avoidance ADAS easily fail in adverse weather;
(2) the way that novel intelligent connected driving assistance systems based on V2V communication provide information assistance does not meet the driver's needs when driving in adverse weather, and can even cause safety accidents.
Invention content
Aiming at safe driving in adverse weather such as rain, snow, hail, fog, haze, sand, and dust, the present invention introduces augmented reality on the basis of vehicle-to-vehicle communication and proposes an adverse-weather vision enhancement driving assistance system. It projects a vision-enhancement image of the perceived front vehicle onto the windshield so that it coincides with the actual vehicle ahead in the driver's eyes, breaking through the limitation of low visibility in adverse weather and letting the driver regain a clear forward view, while also projecting real-time front-vehicle status and safety prompt information onto the windshield. The technical solution adopted by the present invention is as follows:
An adverse-weather vision enhancement driving assistance system based on vehicle-to-vehicle communication, comprising a central control unit, a vision enhancement unit, a perception unit, and a human-machine interaction unit.
The central control unit exchanges information with the perception unit: based on the host-vehicle information and front-vehicle information transmitted by the perception unit, it performs decision-making and image matching, and sends host-vehicle information to the communication device in the perception unit. It sends the prompt messages that the driver needs displayed and the virtual image of the front vehicle to the vision enhancement unit, and exchanges data with the human-machine interaction unit, receiving driver commands and sending safety early-warning information.
The vision enhancement unit receives the front-vehicle virtual image information sent by the central control unit and projects it outward, directing the projected image straight to the driver's eyes so that the size and position of the projected vision-enhancement image of the front vehicle coincide with the actual vehicle ahead.
Specifically, the perception unit comprises a wireless communication device, a vehicle bus, an inertial measurement unit, a satellite positioning module, and a camera.
The wireless communication device sends out host-vehicle information and receives information from surrounding vehicles; the transmitted and received information includes: vehicle ID, timestamp, speed, longitudinal acceleration, heading angle, longitude and latitude, altitude, yaw angle, roll angle, and pitch angle.
Vehicle speed data are obtained from the vehicle bus.
The inertial measurement unit obtains the host vehicle's longitudinal acceleration, yaw angle, roll angle, and pitch angle.
The satellite positioning module obtains the host vehicle's longitude, latitude, and altitude in real time, and the host vehicle's heading angle is determined from its current position and its position at the previous moment.
The camera is mounted above and in front of the driver in the cab and is used to identify the position of the driver's eyes in the cab and the driver's line of sight.
Specifically, the vision enhancement unit comprises an optical see-through screen, a light-guiding component, and a projection device.
The optical see-through screen is attached to the windshield; images can be formed on it, and the actual scenery ahead can also be seen through it.
The projection device receives the front-vehicle virtual image information from the central control unit and projects it outward.
The light-guiding component conducts the projected image emitted by the projection device and includes a primary mirror and a rotatable mirror. The primary mirror reflects the projected image onto the rotatable mirror; the rotatable mirror adjusts its angle according to the identified eye position and line of sight, so that after being imaged on the optical see-through screen the projected image is directed straight to the driver's eyes, and the size and position of the projected vision-enhancement image of the front vehicle coincide with the actual vehicle ahead.
Specifically, the human-machine interaction unit comprises a touch screen and a loudspeaker. The touch screen exchanges information with the central control unit and is used by the driver to configure the system; the loudspeaker receives information from the master controller and gives voice prompts to the driver.
The adverse-weather vision enhancement driving assistance method based on vehicle-to-vehicle communication provided by the present invention comprises the following steps:
Step S1: in each operation cycle ΔT, first obtain the host-vehicle and surrounding-vehicle information; the information types are identical for the host vehicle and surrounding vehicles, including vehicle ID, timestamp, speed, longitudinal acceleration, heading angle, longitude and latitude, altitude, yaw angle, roll angle, and pitch angle.
Step S2: screen the vehicles around the host vehicle and determine the close front vehicle in the same lane as the host vehicle.
Step S3: determine whether this close front vehicle is within the host vehicle's target area; if so, continue; otherwise decide according to the driver's instruction whether to return to step S1.
Step S4: assess the following safety between the host vehicle and the close front vehicle.
Step S5: judge whether the close front vehicle lies within the driver's normal field of view; if so, continue to step S6; otherwise enter the information output stage.
Step S6: identify the driver's eye position and line of sight.
Step S7: generate the vision-enhancement image of the front vehicle.
Step S8: project the generated front-vehicle vision-enhancement image.
Step S9: according to the driver's instruction, either enter the next operation cycle or enter standby mode.
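The per-cycle flow of steps S1–S9 can be sketched in Python. All names here (`acquire_vehicle_info`, `select_close_front_vehicle`, and so on) are illustrative placeholders for the patent's functional blocks, not identifiers from the disclosure, and the cycle period ΔT is an assumed value:

```python
import time

CYCLE_DELTA_T = 0.1  # cycle period ΔT in seconds (illustrative value)

def run_cycle(system):
    """One operation cycle of the assistance method (steps S1-S9)."""
    host, surrounding = system.acquire_vehicle_info()              # S1
    front = system.select_close_front_vehicle(host, surrounding)   # S2
    if front is None or not system.in_target_area(host, front):    # S3
        return  # cycle ends; caller decides whether to loop again
    safe = system.assess_following_safety(host, front)             # S4
    if system.within_normal_view(host, front):                     # S5
        eye = system.identify_eye_position_and_gaze()              # S6
        image = system.generate_enhancement_image(front, eye)      # S7
        system.project(image, front, safe)                         # S8
    else:
        system.output_information(front, safe)                     # S8 (info only)

def main_loop(system):
    """Periodic real-time scheduling until the driver stops the system (S9)."""
    while not system.stop_requested():
        start = time.monotonic()
        run_cycle(system)
        time.sleep(max(0.0, CYCLE_DELTA_T - (time.monotonic() - start)))
```

The `system` object stands in for the master controller plus the perception, vision-enhancement, and human-machine-interaction units.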
The advantage of the invention is that, unlike the environment-sensing approach of traditional driving assistance systems, it obtains comprehensive real-time information about the front vehicle through vehicle-to-vehicle communication and uses augmented reality to provide the driver with effective safety information assistance when driving in low-visibility adverse weather such as rain, snow, fog, haze, hail, sand, and dust, while greatly limiting the impact on the driver's attention. This is of great significance for reducing vehicle collisions in adverse weather.
Description of the drawings
Fig. 1 is a schematic diagram of the structural composition of the present invention.
Fig. 2 is a schematic diagram of the structure of the vision enhancement unit of the driving assistance system of the present invention.
Fig. 3 is the operation flowchart of the driving assistance system of the present invention.
Fig. 4 is a schematic diagram of the face and eye positions in the camera coordinate system of the present invention.
Fig. 5 is a schematic diagram of the longitudinal eye-projection principle for the actual vehicle and the projected image of the present invention.
Specific embodiment
The invention will be further described below with reference to the drawings and specific embodiments.
The adverse-weather vision enhancement driving assistance system based on vehicle-to-vehicle communication provided by the present invention, as shown in Fig. 1, comprises a central control unit, a vision enhancement unit, a perception unit, and a human-machine interaction unit. A master controller is developed as the central control unit. The master controller exchanges information with the perception unit: based on the host-vehicle and front-vehicle information transmitted by the perception unit, it performs decision-making and image matching, sends host-vehicle information to the LTE-V communication device in the perception unit, sends the prompt messages that the driver needs displayed and the virtual image of the front vehicle to the vision enhancement unit, exchanges data with the human-machine interaction unit, receives driver commands, and sends acoustic safety warnings to the driver through the console loudspeaker.
The perception unit comprises an LTE-V communication device, the host-vehicle CAN bus, an Inertial Measurement Unit (IMU), a differential GPS, and a camera.
The LTE-V communication device implements vehicle-to-vehicle communication: it sends out the host-vehicle information received from the master controller and receives information from surrounding vehicles. The transmitted and received information includes: vehicle ID, timestamp, speed, longitudinal acceleration, heading angle, longitude and latitude, altitude, yaw angle, roll angle, and pitch angle. Real-time vehicle speed data are obtained from the host-vehicle CAN bus. The inertial measurement unit is placed at the vehicle's unloaded center-of-mass position and obtains the host vehicle's longitudinal acceleration, yaw angle, roll angle, and pitch angle. The antenna of the differential GPS is arranged on the roof directly above the vehicle's unloaded center of mass and obtains the host vehicle's longitude, latitude, and altitude in real time; the host vehicle's heading angle is determined from its current position and its position at the previous moment. The camera is placed at the rear-view-mirror position in the cab and is used to identify the position of the driver's eyes in the cab and the driver's line of sight.
As shown in Fig. 2, the vision enhancement unit comprises an optical see-through screen, a light-guiding component, and a projection device.
The optical see-through screen is attached to the windshield; images can be formed on it, and the actual scenery ahead can also be seen through it.
The projection device receives the front-vehicle virtual image information from the central control unit and projects it outward.
The light-guiding component conducts the projected image emitted by the projection device and includes a primary mirror and a rotatable mirror. The primary mirror reflects the projected image onto the rotatable mirror; the rotatable mirror adjusts its angle according to the identified eye position and line of sight, so that after being imaged on the optical see-through screen the projected image is directed straight to the driver's eyes, and the size and position of the projected vision-enhancement image of the front vehicle coincide with the actual vehicle ahead as seen by the driver, achieving the vision-enhancement effect.
The human-machine interaction unit comprises a console touch screen and a console loudspeaker. The console touch screen exchanges information with the master controller and is used by the driver to configure the system: the driver may turn the system on/off, turn vision enhancement on/off, turn prompt messages on/off, and turn voice prompts on/off. The console loudspeaker receives information from the master controller and gives voice prompts to the driver.
The adverse-weather vision enhancement driving assistance system and method based on vehicle-to-vehicle communication provided by the present invention can, in adverse weather such as rain, snow, fog, haze, hail, sand, and dust, introduce augmented reality, obtain front-vehicle information through vehicle-to-vehicle communication, make safety decisions, and provide the driver with a vision-enhancement image and related safety-assistance information, making up for the limits on visual perception caused by low visibility and improving driving safety. In application, the driving assistance system requires that the host vehicle be fully equipped with the system, and that the front vehicle be equipped at least with an LTE-V communication device, an inertial measurement unit, and a differential GPS, and have a controller that can aggregate the data from its own CAN bus, inertial measurement unit, and DGPS, and transmit in real time, via the LTE-V communication device, the front-vehicle information packets required by the driving assistance system of the present invention. Assuming all functions of the adverse-weather vision enhancement driving assistance system based on vehicle-to-vehicle communication are turned on while driving, the operation flow of each cycle is shown in Fig. 3, and the specific implementation steps of the method are as follows:
Step S1: after the driver turns on the adverse-weather vision enhancement driving assistance system based on vehicle-to-vehicle communication, the system starts running periodically in real time with period ΔT. In each operation cycle the system first obtains the host-vehicle and surrounding-vehicle information; the information types are identical for the host vehicle and surrounding vehicles, including vehicle ID, timestamp, speed, longitudinal acceleration, heading angle, longitude and latitude, altitude, yaw angle, roll angle, and pitch angle.
Step S2: screen the vehicles around the host vehicle and determine the close front vehicle in the same lane as the host vehicle. The detailed process is as follows:
According to the longitude, latitude, and altitude of the host vehicle and the surrounding vehicles, match them against a lane-level high-precision map and pick out the surrounding vehicles in the same lane as the host vehicle. Convert the longitude and latitude of the host vehicle and of the same-lane surrounding vehicles into the WGS-84 geodetic coordinate system, and then judge whether another vehicle in the same lane is the host vehicle's close front vehicle.
Let the coordinates of the host vehicle at the current time T be (X_host(T), Y_host(T)) and at the previous moment (X_host(T−ΔT), Y_host(T−ΔT)); let the coordinates of another vehicle in the same lane be (X_else(T), Y_else(T)) at the current time and (X_else(T−ΔT), Y_else(T−ΔT)) at the previous moment. The current distance D_he(T) between the host vehicle and this other vehicle is then:
D_he(T) = √[(X_else(T) − X_host(T))² + (Y_else(T) − Y_host(T))²]
Further compute the angle φ_he of the vector pointing from the host-vehicle position to the other vehicle's position. Writing ΔX = X_else(T) − X_host(T) and ΔY = Y_else(T) − Y_host(T):
when ΔX > 0 and ΔY > 0, φ_he = arctan(ΔY/ΔX);
when ΔX < 0 and ΔY > 0, φ_he = 180° + arctan(ΔY/ΔX);
when ΔX < 0 and ΔY < 0, φ_he = 180° + arctan(ΔY/ΔX);
when ΔX > 0 and ΔY < 0, φ_he = 360° + arctan(ΔY/ΔX);
when ΔX > 0 and ΔY = 0, φ_he = 0°;
when ΔX = 0 and ΔY > 0, φ_he = 90°;
when ΔX < 0 and ΔY = 0, φ_he = 180°;
when ΔX = 0 and ΔY < 0, φ_he = 270°.
From each vehicle's current and previous positions, each vehicle's heading angle can be estimated. For the host-vehicle heading angle, let ΔX_host = X_host(T) − X_host(T−ΔT) and ΔY_host = Y_host(T) − Y_host(T−ΔT):
when ΔX_host > 0 and ΔY_host > 0, ψ_host = arctan(ΔY_host/ΔX_host);
when ΔX_host < 0 and ΔY_host > 0, ψ_host = 180° + arctan(ΔY_host/ΔX_host);
when ΔX_host < 0 and ΔY_host < 0, ψ_host = 180° + arctan(ΔY_host/ΔX_host);
when ΔX_host > 0 and ΔY_host < 0, ψ_host = 360° + arctan(ΔY_host/ΔX_host);
when ΔX_host > 0 and ΔY_host = 0, ψ_host = 0°;
when ΔX_host = 0 and ΔY_host > 0, ψ_host = 90°;
when ΔX_host < 0 and ΔY_host = 0, ψ_host = 180°;
when ΔX_host = 0 and ΔY_host < 0, ψ_host = 270°.
The heading angle ψ_else of the other vehicle is obtained in the same way.
If |ψ_host − ψ_else| ≤ 30° and |φ_he − ψ_host| ≤ 30°, then this other vehicle is a same-lane front vehicle of the host vehicle; if, among all such front vehicles, it is the nearest to the host vehicle, it is the host vehicle's close front vehicle.
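The same-lane screening of step S2 — the distance D_he, the pointing angle φ_he, and the heading angles ψ — can be sketched compactly in Python: `math.atan2` reduced modulo 360° is equivalent to the quadrant-by-quadrant arctan definitions of step S2. Function names are illustrative, and, as in the formulas as literally written, no wraparound handling is applied to the 30° comparisons:

```python
import math

def angle_deg(dx: float, dy: float) -> float:
    """Angle of the vector (dx, dy) from the +X axis, in [0, 360) degrees --
    equivalent to the piecewise arctan definitions of step S2."""
    return math.degrees(math.atan2(dy, dx)) % 360.0

def is_close_front_vehicle(host_now, host_prev, other_now, other_prev,
                           thresh_deg: float = 30.0):
    """Return (distance D_he, same-lane-front flag) for one candidate vehicle.
    Positions are (X, Y) tuples in a planar WGS-84-derived frame."""
    d_he = math.hypot(other_now[0] - host_now[0], other_now[1] - host_now[1])
    phi_he = angle_deg(other_now[0] - host_now[0], other_now[1] - host_now[1])
    psi_host = angle_deg(host_now[0] - host_prev[0], host_now[1] - host_prev[1])
    psi_else = angle_deg(other_now[0] - other_prev[0], other_now[1] - other_prev[1])
    same_heading = abs(psi_host - psi_else) <= thresh_deg
    ahead = abs(phi_he - psi_host) <= thresh_deg
    return d_he, same_heading and ahead
```

Among all candidates for which the flag is true, the one with the smallest D_he would be taken as the close front vehicle.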
Step S3: determine whether the close front vehicle is within the host vehicle's target area. Since the range of vehicle-to-vehicle communication is much larger than that of traditional on-board sensors such as cameras and radar, a close front vehicle that is too far away poses no safety threat to the host vehicle and need not be processed further. A target area is therefore preset: a rectangular area covering a certain range of the lane ahead of the host vehicle and as wide as the lane. The target-area length L_target is related to the host-vehicle speed and the current weather visibility. Visibility is divided into 9 grades by standard, grade 1 being unlimited ceiling and visibility and grade 9 being visibility below 100 m. Given the current visibility grade n_v and the host-vehicle speed v_host, and taking the target time headway h_target = 15 s, the target-area length L_target is computed as a function of v_host, h_target, and n_v; on the 9-grade visibility scale, the parameter n_r may take the median value of the scale, 5.
If at the current time the distance D_he from the host vehicle to its nearest front vehicle satisfies D_he ≤ L_target, the close front vehicle is considered to be within the target area and the system continues to run in this cycle; otherwise, provided the driver has not issued an instruction to shut down the system, this cycle ends and the system returns to step S1.
Step S4: assess the following safety between the host vehicle and the close front vehicle.
To ensure that the host vehicle can follow the close front vehicle safely, it should keep both a safe time headway and a sufficient collision-avoidance time from its nearest front vehicle. Preset the minimum safe time headway h_safe = 2.5 s and the minimum collision-avoidance time TTC_safe = 5 s. Given the close front vehicle's speed v_preceding and length L_preceding, and the host-vehicle speed v_host, the minimum safe distance d_safe between the host vehicle and its close front vehicle is set as:
d_safe = max(L_preceding + h_safe·v_host, L_preceding + (v_host − v_preceding)·TTC_safe)
If the current distance D_he from the host vehicle to its nearest front vehicle satisfies D_he ≥ d_safe, following is safe; otherwise there is a risk of collision and a safety prompt should be given to the driver.
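The minimum-safe-distance rule of step S4 is a one-liner; here is a sketch with the patent's preset values h_safe = 2.5 s and TTC_safe = 5 s as defaults (function names are illustrative, units are m, m/s, s):

```python
def min_safe_distance(v_host: float, v_preceding: float, l_preceding: float,
                      h_safe: float = 2.5, ttc_safe: float = 5.0) -> float:
    """Minimum safe following distance d_safe from step S4:
    the larger of the time-headway term and the collision-avoidance term."""
    return max(l_preceding + h_safe * v_host,
               l_preceding + (v_host - v_preceding) * ttc_safe)

def following_is_safe(d_he: float, v_host: float, v_preceding: float,
                      l_preceding: float) -> bool:
    """True when the current gap D_he meets or exceeds d_safe."""
    return d_he >= min_safe_distance(v_host, v_preceding, l_preceding)
```

For example, at v_host = 20 m/s behind a 5 m vehicle travelling at 15 m/s, the headway term (5 + 2.5·20 = 55 m) dominates the collision-avoidance term (5 + 5·5 = 30 m).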
Step S5: judge whether the close front vehicle lies within the driver's normal field of view.
Define the driver's normal field of view by D_eye, the maximum following distance at which, under good weather conditions, a vehicle ahead is visible to the driver's naked eye and draws sufficient attention. If D_he ≤ D_eye, the system enters the vision-enhancement mechanism and continues with step S6; otherwise the system goes directly to the information output stage and performs step S8.
Step S6: identify the driver's eye position and line of sight. Using the camera installed at the rear-view-mirror position in the cab, capture images of the cab interior in real time, identify the driver's facial region in the image, and then identify the eye position and line of sight within the facial region. The parameters to be identified are marked in the camera coordinate system as shown in Fig. 4: the midpoint between the two eyes represents the eye position, with coordinates (x_eye, y_eye, z_eye); the driver's facial pitch angle is δ_face,pitch and facial yaw angle is δ_face,yaw.
Step S7: generate the vision-enhancement image of the front vehicle. According to the received front-vehicle ID, search the system database for the vehicle information matching that ID to obtain the vehicle model, basic dimension parameters, 3D vehicle model, and other data. Then solve for the shape, position, and size of the front-vehicle vision-enhancement image to be projected onto the windshield's optical see-through screen, so that its image in the driver's eye coincides with the image of the actual front vehicle.
First determine, in the geodetic coordinate system, the relative positions and attitudes of the front vehicle, the host vehicle, and the driver's eyes. The height of the camera's fixed position is h_c; the horizontal spacing between the camera and the differential GPS antenna along the host vehicle's longitudinal direction is d_cg; the horizontal distance from the camera to the windshield is d_cs. It is assumed that the image of the actual front vehicle and the projected vision-enhancement image finally coincide as they converge at the eye position; the windshield's inclination is ignored (it is treated as vertical), and the roll and pitch of both the front vehicle and the host vehicle are ignored to simplify the solution.
According to the actual current relative position and attitude of the front vehicle and of the host-vehicle driver's eyes, adjust the position and pose of the corresponding 3D vehicle model in the 3D model library. First determine the scaling factor n_reduce of the vision-enhancement image. The longitudinal eye-projection principle for the actual vehicle and the projected image is shown in Fig. 5. From the current information, the longitudinal distance d_ps from the rear of the actual front vehicle to the host vehicle's windshield optical see-through screen is:
d_ps = D_he − b_p − (d_cs + d_cg)
where b_p is the horizontal distance from the front vehicle's center of mass to its rear end.
The longitudinal distance d_se from the optical see-through screen to the driver's eyes is:
d_se = y_eye + d_cs
The scaling factor n_reduce of the vision-enhancement image then follows from the similar triangles in Fig. 5:
n_reduce = (d_ps + d_se) / d_se
Then reduce the 3D vehicle model's size by the factor n_reduce, and further adjust the vertical and lateral position of the 3D model's observation point. In the geodetic coordinate system, the vertical height h_e of the driver's eyes is:
h_e = h_c + z_eye
and the lateral offset e_lateral of the driver's eyes relative to the front vehicle's longitudinal axis is:
e_lateral = X_host − X_else − x_eye
Adjust the observation point of the 3D vehicle model according to the computed vertical height and lateral offset of the driver's eyes relative to the vehicle. Then rotate the 3D vehicle model about the vertical axis according to the front vehicle's yaw angle δ_else,yaw and the driver's facial yaw angle δ_face,yaw, with rotation angle α = δ_else,yaw − δ_face,yaw; and rotate the 3D model about the lateral axis by −δ_face,pitch according to the driver's facial pitch angle δ_face,pitch. The 3D vehicle model image obtained from the observation-point position at this time is the vision-enhancement image to be projected onto the windshield's optical see-through screen.
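The scalar geometry of step S7 can be collected into one helper. Note that the `n_reduce` line assumes the similar-triangles relation n_reduce = (d_ps + d_se)/d_se, since the scaling formula itself is not reproduced in this text; the function and parameter names are illustrative:

```python
def projection_geometry(d_he, b_p, d_cs, d_cg, eye_xyz, h_c, x_host, x_else):
    """Geometric quantities for scaling and placing the projected image (step S7).
    eye_xyz = (x_eye, y_eye, z_eye) in the cab camera's coordinate system."""
    x_eye, y_eye, z_eye = eye_xyz
    d_ps = d_he - b_p - (d_cs + d_cg)      # front-vehicle tail to screen
    d_se = y_eye + d_cs                    # screen to driver's eyes
    n_reduce = (d_ps + d_se) / d_se        # assumed similar-triangles scaling
    h_e = h_c + z_eye                      # eye height in the geodetic frame
    e_lateral = x_host - x_else - x_eye    # eye offset vs. front-vehicle axis
    return d_ps, d_se, n_reduce, h_e, e_lateral
```

The 3D vehicle model would then be reduced by n_reduce and its observation point shifted by h_e and e_lateral before the yaw and pitch rotations.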
Step S8: project the generated vision-enhancement image of the preceding vehicle onto the optical see-through screen, and optionally also project the preceding-vehicle information and safety-prompt information.
As shown in figure 5, the pitch angle and yaw angle of the rotatable mirror in the vision-enhancement unit are first adjusted so that the vision-enhancement image of the preceding vehicle is projected onto the correct position of the optical see-through screen, such that the actual vehicle and the projected vision-enhancement vehicle image coincide when they reach the driver's eyes. The speed of the current preceding vehicle, the following distance, the minimum safe distance between the host vehicle and the nearest preceding vehicle, and the safety-prompt information are also projected onto the optical see-through screen. If the following distance D_he between the host vehicle and the nearest preceding vehicle is less than the current minimum safe distance d_safe, the driver is warned of the collision risk on the optical see-through screen, and an audible warning is given through the speaker of the human-machine interaction unit, prompting the driver to slow down or detour.
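The step-S8 warning decision can be sketched as follows; the function name, dictionary fields, and units are illustrative assumptions, not the patent's data format.

```python
# Hedged sketch of the step-S8 projection payload and warning decision
# (field names and units are illustrative assumptions).

def hud_payload(v_preceding, D_he, d_safe):
    """Assemble what step S8 projects onto the see-through screen."""
    payload = {
        "preceding_speed": v_preceding,   # speed of the preceding vehicle
        "following_distance": D_he,       # current gap, metres
        "min_safe_distance": d_safe,      # metres
    }
    if D_he < d_safe:                     # collision risk: warn on screen
        payload["warning"] = "slow down or detour"
    return payload
```

When a warning is present, the same condition also triggers the audible prompt through the human-machine interaction unit.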
Step S9: judge whether the driver has issued a stop instruction. If the driver requests to stop the system, the system enters standby mode; if no stop instruction is received, the system enters the next operating cycle, i.e. returns to step S1.
By providing assisting visual information to the driver, the present invention can effectively reduce vehicle collision accidents and greatly improve driving safety, especially in poor weather.
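The S1–S9 operating cycle described above can be sketched as a control-flow skeleton. Every hook below is a placeholder introduced for illustration, not part of the patent, and the driver-instruction branch of step S3 is simplified to entering standby.

```python
# Control-flow sketch of one ΔT operating cycle (steps S1–S9).
# All hook functions are illustrative placeholders.

def drive_assist_cycle(get_vehicles, find_nearest_preceding, in_target_area,
                       assess_safety, in_normal_view, locate_eyes,
                       render_image, project_image, output_info,
                       stop_requested):
    """Run one cycle; return 'repeat' for the next cycle or 'standby'."""
    host, others = get_vehicles()                     # S1: V2V + on-board data
    preceding = find_nearest_preceding(host, others)  # S2: same-lane screening
    if preceding is None or not in_target_area(host, preceding):
        return "standby"                              # S3 (simplified branch)
    assess_safety(host, preceding)                    # S4: inter-vehicle safety
    if in_normal_view(preceding):                     # S5
        eyes = locate_eyes()                          # S6: eye position & gaze
        project_image(render_image(preceding, eyes))  # S7–S8
    else:
        output_info()                                 # S5: information output
    return "standby" if stop_requested() else "repeat"  # S9
```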
It should finally be noted that the above specific embodiments merely illustrate, and do not restrict, the technical solution of the present invention. Although the present invention has been described in detail with reference to examples, those of ordinary skill in the art will understand that the technical solution of the present invention may be modified or equivalently replaced without departing from its spirit and scope, and all such modifications and replacements fall within the scope of the present claims.
Claims (12)
1. An adverse-weather vision-enhancement driving assistance system based on vehicle-to-vehicle communication, characterized by comprising a central control unit, a vision-enhancement unit, a perception unit, and a human-machine interaction unit;
the central control unit exchanges information with the perception unit, performs decision-making and image matching based on the host-vehicle information and preceding-vehicle information transmitted by the perception unit, and sends the host-vehicle information to the communication device in the perception unit; it sends to the vision-enhancement unit the prompt information and the preceding-vehicle virtual image that the driver requires to be displayed; it exchanges data with the human-machine interaction unit, receives driver commands, and sends safety early-warning information;
the vision-enhancement unit receives the preceding-vehicle virtual-image information sent by the central control unit and projects it outward, so that the projected image is conducted exactly to the driver's eyes, and the size and position of the projected vision-enhancement image of the preceding vehicle coincide with the actual vehicle ahead.
2. The adverse-weather vision-enhancement driving assistance system based on vehicle-to-vehicle communication according to claim 1, characterized in that the perception unit comprises a wireless communication device, a vehicle bus, an inertial measurement unit, a satellite positioning module, and a camera;
the wireless communication device receives information from surrounding vehicles and sends out the host-vehicle information; the exchanged information comprises: vehicle ID, timestamp, speed, longitudinal acceleration, course angle, longitude and latitude, altitude, yaw angle, roll angle, and pitch angle;
vehicle speed data are obtained from the vehicle bus;
the inertial measurement unit obtains the host vehicle's longitudinal acceleration, yaw angle, roll angle, and pitch angle;
the satellite positioning module obtains the host vehicle's longitude, latitude, and altitude in real time, and the host vehicle's course angle is determined from its current position and its position at the previous moment;
the camera is arranged above and in front of the driver in the cab, and identifies the relative position of the driver's eyes within the cab and the driver's line of sight.
3. The adverse-weather vision-enhancement driving assistance system based on vehicle-to-vehicle communication according to claim 1, characterized in that the vision-enhancement unit comprises an optical see-through screen, a light-guiding component, and a projection device;
the optical see-through screen is attached to the windshield; images can be formed on it, while the actual scene ahead remains visible through it;
the projection device receives the preceding-vehicle virtual-image information from the central control unit and projects it outward;
the light-guiding component conducts the projected image emitted by the projection device, and comprises a primary mirror and a rotatable mirror; the primary mirror reflects the projected image to the rotatable mirror, and the rotatable mirror adjusts its angle according to the identified eye position and line of sight, so that after imaging on the optical see-through screen the projected image is conducted exactly to the driver's eyes, and the size and position of the projected vision-enhancement image of the preceding vehicle coincide with the actual vehicle ahead.
4. The adverse-weather vision-enhancement driving assistance system based on vehicle-to-vehicle communication according to claim 1, characterized in that the human-machine interaction unit comprises a touch screen and a speaker; the touch screen exchanges information with the central control unit and allows the driver to configure the system; the speaker receives information from the main controller and gives audible prompts to the driver.
5. An adverse-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication, characterized by comprising the following steps:
step S1: in each operating cycle ΔT, first obtain the host-vehicle and surrounding-vehicle information; the host-vehicle and surrounding-vehicle information are of the same type, comprising vehicle ID, timestamp, speed, longitudinal acceleration, course angle, longitude and latitude, altitude, yaw angle, roll angle, and pitch angle;
step S2: screen the vehicles around the host vehicle and determine the nearest preceding vehicle in the same lane as the host vehicle;
step S3: determine whether this nearest preceding vehicle is within the target area of the host vehicle; if so, continue running; otherwise decide whether to return to step S1 according to the driver's instruction;
step S4: assess the inter-vehicle safety between the host vehicle and the nearest preceding vehicle;
step S5: judge whether the nearest preceding vehicle lies within the driver's normal field of view; if so, continue to the next step S6, otherwise enter the information-output stage;
step S6: identify the position of the driver's eyes and the line of sight;
step S7: generate the vision-enhancement image of the preceding vehicle;
step S8: project the generated vision-enhancement image of the preceding vehicle;
step S9: according to the instruction, enter the next operating cycle or enter standby mode.
6. The adverse-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 5, characterized in that step S2 specifically comprises:
matching the longitude, latitude, and altitude data of the host vehicle and the surrounding vehicles against a lane-level high-precision map, and picking out the surrounding vehicles in the same lane as the host vehicle; converting the longitude and latitude of the host vehicle and of the same-lane surrounding vehicles into a geodetic coordinate system, and then judging whether a same-lane vehicle is the nearest preceding vehicle of the host vehicle;
the coordinates of the host vehicle at the current time T are known to be (X_host(T), Y_host(T)) and at the previous moment (X_host(T-ΔT), Y_host(T-ΔT)); the coordinates of a same-lane other vehicle at the current time are (X_else(T), Y_else(T)) and at the previous moment (X_else(T-ΔT), Y_else(T-ΔT));
from the current coordinates of the host vehicle and of the same-lane other vehicle, the angle φ_he of the vector pointing from the host-vehicle position to the other-vehicle position is obtained;
from each vehicle's current and previous position coordinates, the course angle of each vehicle is calculated, giving the host-vehicle course angle ψ_host and the other-vehicle course angle ψ_else;
if |ψ_host - ψ_else| ≤ θ1 and |φ_he - ψ_host| ≤ θ2, the other vehicle is a same-lane preceding vehicle of the host vehicle, and if among such preceding vehicles it is the closest to the host vehicle, it is the host vehicle's nearest preceding vehicle; θ1 and θ2 are judgment threshold angles.
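The claim-6 screening test can be written out directly. The sketch below assumes planar coordinates, course angles computed with atan2 in radians, and illustrative function names; the patent does not fix these conventions.

```python
import math

# Hedged sketch of the claim-6 same-lane screening test.
# Positions are (X, Y) pairs in a local geodetic frame; angles in radians.

def heading(prev, cur):
    """Course angle from two successive positions."""
    return math.atan2(cur[1] - prev[1], cur[0] - prev[0])

def is_same_lane_preceding(host_prev, host_cur, other_prev, other_cur,
                           theta1=math.radians(30), theta2=math.radians(30)):
    psi_host = heading(host_prev, host_cur)   # host course angle
    psi_else = heading(other_prev, other_cur) # other-vehicle course angle
    # angle of the vector from the host position to the other vehicle
    phi_he = math.atan2(other_cur[1] - host_cur[1],
                        other_cur[0] - host_cur[0])
    return (abs(psi_host - psi_else) <= theta1
            and abs(phi_he - psi_host) <= theta2)
```

Among the vehicles passing this test, the one at the smallest distance from the host is taken as the nearest preceding vehicle. A production version would also normalize angle differences into (-π, π].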
7. The adverse-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 6, characterized in that θ1 and θ2 are each taken as 30°.
8. The adverse-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 5, characterized in that in step S3 the target-area length L_target is related to the host-vehicle speed and the current weather visibility; L_target is determined from the current visibility grade n_v, the host-vehicle speed v_host, the target time headway h_target, and a parameter n_r that takes an empirical value; the target-area width equals the lane width;
if the current following distance D_he from the host vehicle to its nearest preceding vehicle satisfies D_he ≤ L_target, the nearest preceding vehicle is considered to be within the target area.
9. The adverse-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 5, characterized in that in step S4 a minimum safe time headway h_safe and a minimum collision-avoidance time TTC_safe are set; with the known speed v_preceding and length L_preceding of the nearest preceding vehicle and the host-vehicle speed v_host, the minimum safe distance d_safe from the host vehicle to its nearest preceding vehicle is set as:
d_safe = max(L_preceding + h_safe·v_host, L_preceding + (v_host - v_preceding)·TTC_safe)
then, when the current following distance D_he from the host vehicle to its nearest preceding vehicle satisfies D_he ≥ d_safe, driving is safe; otherwise there is a collision risk.
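The claim-9 minimum-safe-distance formula transcribes directly; units are assumed consistent (e.g. metres, seconds, m/s), which the claim leaves open.

```python
# Direct transcription of the claim-9 formula (units assumed consistent).

def min_safe_distance(L_preceding, h_safe, TTC_safe, v_host, v_preceding):
    """Minimum safe distance to the nearest preceding vehicle."""
    return max(L_preceding + h_safe * v_host,
               L_preceding + (v_host - v_preceding) * TTC_safe)
```

The first branch dominates at small closing speeds; the second dominates when the host approaches the preceding vehicle quickly.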
10. The adverse-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 5, characterized in that in step S6 the eye position is represented by the midpoint of the two eyes, with coordinates (x_eye, y_eye, z_eye); the pitch angle of the driver's face is δ_face,pitch, and the yaw angle of the face is δ_face,yaw.
11. The adverse-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 10, characterized in that step S7 specifically comprises:
according to the received preceding-vehicle ID, looking up in the database the vehicle information matching that ID, including the vehicle type, basic dimension parameters, and 3D vehicle-model data;
first determining, in the geodetic coordinate system, the relative position and attitude information of the preceding vehicle, the host vehicle, and the driver's eyes; the height of the fixed camera position is known to be h_c, the horizontal spacing between the camera and the satellite-positioning-module antenna along the host vehicle's longitudinal direction is d_cg, and the horizontal distance from the camera to the windshield is d_cs;
according to the actual relative position and attitude of the current preceding vehicle and of the host-vehicle driver's eyes, adjusting the position and attitude of the corresponding 3D vehicle model in the 3D model library; the scaling factor n_reduce of the vision-enhancement image is determined first: from the current information, the longitudinal distance d_ps from the tail of the actual preceding vehicle to the optical see-through screen on the host vehicle's windshield is obtained as:
d_ps = D_he - b_p - (d_cs + d_cg)
where b_p is the horizontal distance from the centroid of the preceding vehicle to its tail;
the longitudinal distance d_se from the optical see-through screen to the driver's eyes is:
d_se = y_eye + d_cs
the scaling factor n_reduce of the vision-enhancement image is then obtained, and the 3D vehicle model is reduced in size by the factor n_reduce; the observation point of the 3D vehicle model is further adjusted in the vertical and lateral directions; in the geodetic coordinate system, the vertical height h_e of the driver's eyes is:
h_e = h_c + z_eye
and the lateral offset e_lateral of the driver's eyes relative to the longitudinal axis of the preceding vehicle is obtained as:
e_lateral = X_host - X_else - x_eye
the observation point of the 3D vehicle model is then adjusted according to the vertical height and lateral offset of the driver's eyes relative to the vehicle thus obtained; the 3D vehicle model is then rotated about the vertical axis by the angle α = δ_else,yaw - δ_face,yaw, determined from the yaw angle δ_else,yaw of the preceding vehicle and the yaw angle δ_face,yaw of the driver's face, and rotated about the lateral axis by -δ_face,pitch according to the pitch angle δ_face,pitch of the driver's face; the image of the 3D vehicle model seen from the observation point at this moment is precisely the vision-enhancement image to be projected onto the optical see-through screen.
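The explicit formula for n_reduce appears as an equation image in the original and did not survive extraction. The sketch below is a plausible reconstruction from pinhole geometry, an assumption rather than the patent's equation: the image on the screen at distance d_se from the eye should subtend the same visual angle as the real vehicle at distance d_se + d_ps.

```python
# ASSUMPTION: reconstructed by similar triangles, not the patent's own
# (unextracted) formula for n_reduce.

def scale_factor(d_ps, d_se):
    """Factor by which the 3D model is reduced before projection:
    the model is n_reduce times larger than its on-screen image."""
    return (d_se + d_ps) / d_se
```

With d_ps = 0 (vehicle tail at the screen) this gives n_reduce = 1, i.e. no reduction, which is the expected limiting case.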
12. The adverse-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 5, characterized in that in step S8 the preceding-vehicle information and safety-prompt information are also projected;
when the following distance D_he between the host vehicle and the nearest preceding vehicle is less than the current minimum safe distance d_safe, the driver is warned of the collision risk on the optical see-through screen, and an audible warning is given through the human-machine interaction unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810011066.9A CN108202669B (en) | 2018-01-05 | 2018-01-05 | Bad weather vision enhancement driving auxiliary system and method based on vehicle-to-vehicle communication |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108202669A true CN108202669A (en) | 2018-06-26 |
CN108202669B CN108202669B (en) | 2021-05-07 |
Family
ID=62605205
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810011066.9A Active CN108202669B (en) | 2018-01-05 | 2018-01-05 | Bad weather vision enhancement driving auxiliary system and method based on vehicle-to-vehicle communication |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108202669B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109849790A (en) * | 2019-03-06 | 2019-06-07 | 武汉理工大学 | A kind of driving at night scene visual enhancing system and method for Multi-source Information Fusion |
CN110053626A (en) * | 2019-05-10 | 2019-07-26 | 深圳市元征科技股份有限公司 | A kind of control method for vehicle and relevant apparatus |
CN111325086A (en) * | 2018-12-14 | 2020-06-23 | 丰田自动车株式会社 | Information processing system, program, and information processing method |
CN112109550A (en) * | 2020-09-08 | 2020-12-22 | 中国第一汽车股份有限公司 | AR-HUD-based display method, device and equipment for early warning information and vehicle |
CN112230228A (en) * | 2020-09-30 | 2021-01-15 | 中汽院智能网联科技有限公司 | Intelligent automobile vision sensor testing method based on field testing technology |
CN112639808A (en) * | 2018-07-31 | 2021-04-09 | 法雷奥开关和传感器有限责任公司 | Driving assistance for longitudinal and/or lateral control of a motor vehicle |
CN113232586A (en) * | 2021-06-04 | 2021-08-10 | 河南科技大学 | Infrared pedestrian projection display method and system for driving at night |
CN113348384A (en) * | 2018-11-26 | 2021-09-03 | 大陆汽车系统公司 | Adverse weather condition detection system with LIDAR sensor |
CN113859123A (en) * | 2021-10-08 | 2021-12-31 | 上汽通用汽车有限公司 | Vehicle front-view system display control method, storage medium, and electronic device |
CN114407902A (en) * | 2022-01-19 | 2022-04-29 | 浙江大学 | System for driving decision based on road water layer depth estimation |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101881885A (en) * | 2009-04-02 | 2010-11-10 | 通用汽车环球科技运作公司 | Peripheral salient feature on the full-windscreen head-up display strengthens |
CN102555908A (en) * | 2010-12-28 | 2012-07-11 | 通用汽车环球科技运作有限责任公司 | Traffic visibility in poor viewing conditions on full windshield head-up display |
CN104149691A (en) * | 2014-05-16 | 2014-11-19 | 苟安 | Augmented-reality vehicle-mounted projection system |
US20150042541A1 (en) * | 2013-03-29 | 2015-02-12 | Funai Electric Co., Ltd. | Head-up display device and display method of head-up display device |
CN104570351A (en) * | 2014-12-29 | 2015-04-29 | 信利半导体有限公司 | Vehicle-mounted head-up display system |
CN106918909A (en) * | 2015-11-10 | 2017-07-04 | 奥特润株式会社 | head-up display control device and method |
CN206627701U (en) * | 2016-02-12 | 2017-11-10 | Lg电子株式会社 | Vehicle head-up display |
Also Published As
Publication number | Publication date |
---|---|
CN108202669B (en) | 2021-05-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108202669A (en) | Adverse weather vision enhancement driving auxiliary system and its method based on truck traffic | |
CN108572663B (en) | Target tracking | |
US9507345B2 (en) | Vehicle control system and method | |
US10269331B2 (en) | Display control device for vehicle | |
US10179588B2 (en) | Autonomous vehicle control system | |
US10564434B2 (en) | Display control device for vehicle | |
EP3216667B1 (en) | Control system for vehicle | |
CN206374737U (en) | A kind of vehicle avoids accessory system | |
US9126533B2 (en) | Driving support method and driving support device | |
CN109353279A (en) | A kind of vehicle-mounted head-up-display system of augmented reality | |
CN106696961A (en) | Control system and method for automatically driving onto and off ramp of freeway | |
US20080151054A1 (en) | Driving support method and driving support apparatus | |
US10232772B2 (en) | Driver assistance system | |
CN107544518A (en) | The ACC/AEB systems and vehicle driven based on personification | |
WO2015163205A1 (en) | Vehicle display system | |
CN109747658A (en) | The control device of vehicle | |
JP2021120248A (en) | Vehicle control device | |
US10522041B2 (en) | Display device control method and display device | |
JP6996253B2 (en) | Vehicle control device | |
JP5898539B2 (en) | Vehicle driving support system | |
CN105023429B (en) | Automobile-used wireless vehicle tracking and device | |
CN110491156A (en) | A kind of cognitive method, apparatus and system | |
US11934188B2 (en) | Monitoring and planning a movement of a transportation device | |
JP3857698B2 (en) | Driving environment recognition device | |
US20200250980A1 (en) | Reuse of Surroundings Models of Automated Vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||