CN108202669B - Bad weather vision enhancement driving auxiliary system and method based on vehicle-to-vehicle communication - Google Patents



Publication number
CN108202669B
Authority
CN
China
Prior art keywords
vehicle
information
driver
host
angle
Prior art date
Legal status
Active
Application number
CN201810011066.9A
Other languages
Chinese (zh)
Other versions
CN108202669A
Inventor
王迪
郑磊
刘涛
蒋鑫
Current Assignee
FAW Group Corp
Original Assignee
FAW Group Corp
Application filed by FAW Group Corp filed Critical FAW Group Corp
Priority to CN201810011066.9A priority Critical patent/CN108202669B/en
Publication of CN108202669A publication Critical patent/CN108202669A/en
Application granted granted Critical
Publication of CN108202669B publication Critical patent/CN108202669B/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/04: Traffic conditions
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08: Interaction between the driver and the control system
    • B60W50/14: Means for informing the driver, warning the driver or prompting a driver intervention
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the intended use of the viewing arrangement
    • B60R2300/8053: Details of viewing arrangements characterised by the intended use of the viewing arrangement, for bad weather conditions or night vision
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08: Interaction between the driver and the control system
    • B60W50/14: Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146: Display means

Abstract

The invention provides a bad-weather vision-enhancement driving assistance system based on vehicle-to-vehicle communication, comprising a central control unit, a vision enhancement unit, a perception unit and a human-computer interaction unit. The central control unit exchanges information with the perception unit, performs decision-making and image matching based on the own-vehicle and preceding-vehicle information transmitted by the perception unit, and sends the own-vehicle information to the communication device in the perception unit; it sends the prompt information to be displayed to the driver and the preceding-vehicle virtual image to the vision enhancement unit; and it exchanges data with the human-computer interaction unit, receiving driver instructions and sending safety early-warning information. The vision enhancement unit receives the preceding-vehicle virtual image information sent by the central control unit and projects it; the projected image is guided precisely to the driver's eyes, so that the projected vision-enhancement image of the preceding vehicle coincides in size and position with the actual vehicle ahead. The invention greatly improves driving safety under adverse weather conditions.

Description

Bad weather vision enhancement driving auxiliary system and method based on vehicle-to-vehicle communication
Technical Field
The invention belongs to the field of intelligent connected-vehicle driving assistance, and particularly relates to a vision-enhancement driving assistance system and method based on vehicle-to-vehicle communication for safe driving under adverse weather conditions.
Background
Inter-vehicle collisions have long been one of the main forms of traffic safety accidents, and avoiding them remains an urgent problem worldwide. As the operator of a vehicle, a driver has natural limitations in perception, decision-making and execution, and his physiological and psychological states are complex and changeable. In a complex driving environment, inaccurate perception, wrong decisions or delayed execution often lead to inter-vehicle collisions; in particular, under bad weather conditions such as rain, snow, hail, fog, haze and dust, the visibility of the driver's forward field of view is poor, so inter-vehicle collision accidents are even more easily induced.
In order to reduce vehicle collisions caused by the driver, Advanced Driver Assistance Systems (ADAS) for avoiding various vehicle collisions have been widely researched and put into practical use. Most collision-avoidance ADAS use on-board sensors such as millimeter-wave radar, lidar and cameras to perceive the vehicles around the own vehicle, and use this information for quantitative collision-risk evaluation and driving-aid decision-making, thereby providing active assistance and information prompts to the driver. Although current collision-avoidance ADAS can effectively reduce some collision accidents, when a vehicle runs in bad weather such as rain, snow, hail, fog, haze or dust, not only is the driver's visual perception seriously impaired, but on-board sensors such as radar and cameras may also fail, so collision-avoidance ADAS cannot provide effective driving assistance under such conditions.
As vehicle-to-vehicle communication technology matures, it has become a new means of acquiring information about other vehicles: it greatly enlarges the perception range of the own vehicle, allows accurate information about other vehicles to be acquired directly, in real time and comprehensively, and is not limited by weather conditions. The introduction of vehicle-to-vehicle communication opens a new avenue for further improving traffic safety, and novel intelligent connected driving assistance systems are therefore being widely researched. However, the systems that have appeared so far mainly present driving assistance information on the vehicle's center console; observing this display while driving often distracts the driver and can easily cause collision accidents, especially in bad weather with low visibility. Moreover, what the driver actually wants while driving is a clear forward view.
In summary, the main problems of the prior art in driving assistance under bad weather conditions are:
(1) the on-board environment perception sensors of conventional collision-avoidance ADAS easily fail under bad weather conditions;
(2) the information-assistance mode provided by the novel intelligent connected driving assistance systems that introduce vehicle-to-vehicle communication cannot meet the driver's needs while driving under bad weather conditions, and can even easily cause safety accidents.
Disclosure of Invention
Aiming at safe driving in bad weather such as rain, snow, hail, fog, haze and dust, the invention provides a bad-weather vision-enhancement driving assistance system that introduces augmented reality technology on the basis of vehicle-to-vehicle communication. The technical scheme adopted by the invention is as follows:
A bad-weather vision-enhancement driving assistance system based on vehicle-to-vehicle communication comprises a central control unit, a vision enhancement unit, a perception unit and a human-computer interaction unit;
the central control unit exchanges information with the perception unit, performs decision-making and image matching based on the own-vehicle and preceding-vehicle information transmitted by the perception unit, and sends the own-vehicle information to the communication device in the perception unit; it sends the prompt information to be displayed to the driver and the preceding-vehicle virtual image to the vision enhancement unit; and it exchanges data with the human-computer interaction unit, receiving driver instructions and sending safety early-warning information;
the vision enhancement unit receives the preceding-vehicle virtual image information sent by the central control unit and projects it; the projected image is guided precisely to the driver's eyes, and the projected vision-enhancement image of the preceding vehicle coincides in size and position with the actual vehicle ahead.
Specifically, the perception unit comprises a wireless communication device, the vehicle bus, an inertial measurement unit, a satellite positioning module and a camera;
the wireless communication device sends out the own-vehicle information and receives information from surrounding vehicles; the information sent and received comprises: vehicle ID, timestamp, vehicle speed, longitudinal acceleration, course angle, longitude and latitude, altitude, yaw angle, roll angle and pitch angle;
vehicle speed data is acquired from the vehicle bus;
the inertial measurement unit acquires the longitudinal acceleration, yaw angle, roll angle and pitch angle of the own vehicle;
the satellite positioning module acquires the longitude, latitude and altitude of the own vehicle in real time and determines the own vehicle's course angle from the current position and the position at the previous moment;
the camera is arranged in front of and above the driver in the cab and identifies the relative position of the driver's eyes in the cab and the driver's line of sight.
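The vehicle-to-vehicle message set listed above can be sketched as a simple container. A minimal illustration; the field names and units are assumptions for this sketch, since the patent does not define a wire format:

```python
from dataclasses import dataclass

@dataclass
class V2VMessage:
    # One broadcast frame of the vehicle-to-vehicle message set listed above.
    # Field names and units are illustrative assumptions, not from the patent.
    vehicle_id: str
    timestamp: float           # s, sender clock
    speed: float               # m/s
    longitudinal_accel: float  # m/s^2
    course_angle: float        # deg, 0-360
    longitude: float           # deg
    latitude: float            # deg
    altitude: float            # m
    yaw: float                 # deg
    roll: float                # deg
    pitch: float               # deg
```

Each vehicle broadcasts one such frame per period and receives the frames of its neighbours; all later steps of the method consume only these fields.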
Specifically, the vision enhancement unit comprises an optical see-through screen, a light-guiding assembly and a projection device;
the optical see-through screen is attached to the inner side of the windshield; images can be formed on it, and the actual scene ahead can be seen through it;
the projection device receives the preceding-vehicle virtual image information from the central control unit and projects it;
the light-guiding assembly guides the projected image emitted by the projection device and comprises a primary reflector and a rotatable reflector; the primary reflector reflects the projected image of the projection device onto the rotatable reflector, whose angle is adjusted according to the recognized eye position and line of sight so that, after being imaged by the optical see-through screen, the projected image is guided precisely to the driver's eyes and the projected vision-enhancement image of the preceding vehicle coincides in size and position with the actual vehicle ahead.
Specifically, the human-computer interaction unit comprises a touch screen and a loudspeaker; the touch screen exchanges information with the central control unit and is used by the driver to configure the system; the loudspeaker receives information from the main controller and provides audible reminders to the driver.
The invention further provides a bad-weather vision-enhancement driving assistance method based on vehicle-to-vehicle communication, comprising the following steps:
step S1: in each operation period ΔT, first obtain the information of the own vehicle and the surrounding vehicles; the information types are the same for all vehicles and comprise vehicle ID, timestamp, vehicle speed, longitudinal acceleration, course angle, longitude and latitude, altitude, yaw angle, roll angle and pitch angle;
step S2: screen the surrounding vehicles to determine the vehicle in the same lane as the own vehicle and immediately in front of it;
step S3: determine whether the immediately preceding vehicle is within the target area of the own vehicle; if so, continue; otherwise decide, according to the driver's instruction, whether to return to step S1;
step S4: evaluate the safety between the own vehicle and the immediately preceding vehicle;
step S5: judge whether the immediately preceding vehicle is within the driver's normal field of view; if so, continue to step S6, otherwise enter the information output stage;
step S6: recognize the position of the driver's eyes and line of sight;
step S7: generate the vision-enhancement image of the preceding vehicle;
step S8: project the generated preceding-vehicle vision-enhancement image;
step S9: enter the next operation cycle or enter a standby state according to the instruction.
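The per-period flow of steps S3 to S9 can be sketched as one control-flow function. A minimal sketch with the geometric predicates injected as callables; all names are illustrative, not from the patent:

```python
def run_cycle(ego, lead, *, in_target_area, is_safe, in_visual_field,
              driver_off=False):
    """One ΔT operation cycle covering steps S3-S9 (S1/S2 are assumed to have
    produced `ego` and the immediately preceding vehicle `lead`)."""
    # S3: no immediately preceding vehicle in the target area -> end the cycle
    if lead is None or not in_target_area(ego, lead):
        return ["standby"] if driver_off else ["next_cycle"]
    actions = []
    # S4: safety evaluation against the minimum safe distance
    if not is_safe(ego, lead):
        actions.append("safety_warning")
    # S5: within the driver's normal field of view -> vision enhancement S6-S8
    if in_visual_field(ego, lead):
        actions += ["track_eyes", "render_image", "project_image"]
    else:
        actions.append("info_output")  # beyond normal view: information only
    actions.append("next_cycle")       # S9
    return actions
```

The predicates correspond to the target-area, safety and field-of-view tests detailed below, so the same skeleton can be reused with any concrete implementation of those tests.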
The invention has the following advantages: unlike the environment-perception approach of traditional driving assistance systems, it obtains comprehensive real-time information about the preceding vehicle through vehicle-to-vehicle communication and, using augmented reality technology, provides the driver with effective safety information assistance under low-visibility bad weather such as rain, snow, fog, haze, hail and dust, while greatly reducing the impact on the driver's attention; this is of great significance for reducing vehicle collision accidents in bad weather.
Drawings
FIG. 1 is a schematic structural diagram of the present invention.
Fig. 2 is a schematic structural diagram of a vision enhancement unit of the driving assistance system of the present invention.
Fig. 3 is a flow chart of the operation of the driving assistance system of the present invention.
FIG. 4 is a schematic diagram of the position expression of the face and eyes in the camera coordinate system according to the present invention.
Fig. 5 is a schematic diagram of the principle of longitudinal human eye projection of an actual vehicle and projected images of the present invention.
Detailed Description
The invention is further illustrated by the following specific figures and examples.
As shown in FIG. 1, the invention provides a vehicle-to-vehicle-communication-based bad-weather vision-enhancement driving assistance system comprising a central control unit, a vision enhancement unit, a perception unit and a human-computer interaction unit. A main controller is developed as the central control unit; it exchanges information with the perception unit, performs decision-making and image matching based on the own-vehicle and preceding-vehicle information transmitted by the perception unit, sends the own-vehicle information to the LTE-V communication device in the perception unit, sends the prompt information to be displayed to the driver and the preceding-vehicle virtual image to the vision enhancement unit, exchanges data with the human-computer interaction unit, receives driver instructions, and sends audible safety early-warning information to the driver through the center-console loudspeaker.
The perception unit comprises an LTE-V communication device, the vehicle CAN bus, an inertial measurement unit (IMU), a differential GPS and a camera;
the LTE-V communication device implements vehicle-to-vehicle communication, transmitting the own-vehicle information from the main controller and receiving information from the surrounding vehicles; the information sent and received comprises: vehicle ID, timestamp, vehicle speed, longitudinal acceleration, course angle, longitude and latitude, altitude, yaw angle, roll angle and pitch angle. Real-time vehicle speed data is acquired from the vehicle CAN bus. The inertial measurement unit is mounted at the unloaded center of mass of the vehicle and acquires the longitudinal acceleration, yaw angle, roll angle and pitch angle of the own vehicle. The antenna of the differential GPS is mounted on the roof directly above the unloaded center of mass; it acquires the longitude, latitude and altitude of the own vehicle in real time, and the course angle is determined from the current position and the position at the previous moment. The camera is mounted at the interior rearview-mirror position in the cab and identifies the relative position of the driver's eyes in the cab and the driver's line of sight;
as shown in fig. 2, the vision enhancement unit includes an optical see-through screen, a light directing assembly, a projection device;
the optical perspective screen is attached to the inner side of the windshield, so that images can be formed on the optical perspective screen, and forward actual scenes can be seen through the optical perspective screen;
the projection equipment receives the virtual image information of the front vehicle of the central control unit and projects the virtual image information outwards;
the light guide assembly is used for guiding a projection image emitted by the projection equipment and comprises a primary reflector and a rotatable reflector; the primary reflector is used for reflecting the projection image of the projection equipment to the rotatable reflector, the angle of the rotatable reflector can be adjusted according to the recognized eye position and the recognized sight line, so that the projection image is just transmitted to human eyes after being imaged by the optical perspective screen, the size position of the projected vision enhancement image of the front vehicle is overlapped with the front actual vehicle seen by a driver, and the vision enhancement effect is achieved;
the human-computer interaction unit comprises a center console touch screen and a center console loudspeaker; the touch screen of the center console performs information interaction with the main controller, and is used for setting the system by a driver and selecting to turn on/off the system, turn on/off visual enhancement, turn on/off prompt information and turn on/off sound reminding; and the central console loudspeaker receives the information of the main controller and provides sound reminding for the driver.
The invention provides a bad-weather vision-enhancement driving assistance system and method based on vehicle-to-vehicle communication. In bad weather such as rain, snow, fog, haze, hail and dust, it introduces augmented reality technology, uses vehicle-to-vehicle communication to acquire the preceding-vehicle information, makes safety decisions, and provides the driver with a vision-enhancement image and related safety assistance information, compensating for the limitation of visual perception under low visibility and improving driving safety. The own vehicle must be equipped with the complete driving assistance system; the preceding vehicle must be equipped with at least an LTE-V communication device, an inertial measurement unit and a differential GPS, together with a controller that can collect its CAN-bus, IMU and differential-GPS data and transmit the preceding-vehicle information packet required by the driving assistance system to the LTE-V communication device in real time. Assuming all functions of the bad-weather vision-enhancement driving assistance system are enabled during driving, the operation flow of the system in each period is shown in fig. 3, and the specific implementation steps of the method are as follows:
Step S1: when the driver starts the bad-weather vision-enhancement driving assistance system, it begins to run periodically in real time with period ΔT. In each operation period the system first obtains the information of the own vehicle and the surrounding vehicles; the information types are the same for all vehicles and comprise vehicle ID, timestamp, vehicle speed, longitudinal acceleration, course angle, longitude and latitude, altitude, yaw angle, roll angle and pitch angle;
Step S2: screen the surrounding vehicles to determine the vehicle in the same lane as the own vehicle and immediately in front of it. The specific process is as follows:
according to the longitude, latitude and altitude information of the own vehicle and the surrounding vehicles, match against a lane-level high-precision map and select the surrounding vehicles in the same lane as the own vehicle; convert the longitude and latitude of the own vehicle and of the same-lane surrounding vehicles through coordinate conversion based on the WGS-84 geodetic coordinate system, and then judge whether another same-lane vehicle is the immediately preceding vehicle of the own vehicle;
the coordinate of the current time T of the vehicle is known as (X)host(T),Yhost(T)), and the last-time coordinate is (X)host(T-ΔT),Yhost(T- Δ T)); the current time coordinate of one other vehicle on the same lane is (X)else(T),Yelse(T)), and the last-time coordinate is (X)else(T-ΔT),Yelse(T- Δ T)); the distance D between the current vehicle and the other vehicle can be obtainedhe(T) is represented by the following formula:
Figure BDA0001540318260000051
Next determine the angle φ_he of the vector from the own-vehicle position to the other-vehicle position. Let ΔX = X_else(T) - X_host(T) and ΔY = Y_else(T) - Y_host(T). Then:

when ΔX > 0 and ΔY > 0, φ_he = arctan(ΔY/ΔX);
when ΔX < 0 and ΔY > 0, φ_he = 180° + arctan(ΔY/ΔX);
when ΔX < 0 and ΔY < 0, φ_he = 180° + arctan(ΔY/ΔX);
when ΔX > 0 and ΔY < 0, φ_he = 360° + arctan(ΔY/ΔX);
when ΔX > 0 and ΔY = 0, φ_he = 0°;
when ΔX = 0 and ΔY > 0, φ_he = 90°;
when ΔX < 0 and ΔY = 0, φ_he = 180°;
when ΔX = 0 and ΔY < 0, φ_he = 270°.
The course angle of each vehicle can be estimated from its position coordinates at the current and previous moments. Taking the own vehicle as an example, let ΔX_host = X_host(T) - X_host(T-ΔT) and ΔY_host = Y_host(T) - Y_host(T-ΔT). Then:

when ΔX_host > 0 and ΔY_host > 0, ψ_host = arctan(ΔY_host/ΔX_host);
when ΔX_host < 0 and ΔY_host > 0, ψ_host = 180° + arctan(ΔY_host/ΔX_host);
when ΔX_host < 0 and ΔY_host < 0, ψ_host = 180° + arctan(ΔY_host/ΔX_host);
when ΔX_host > 0 and ΔY_host < 0, ψ_host = 360° + arctan(ΔY_host/ΔX_host);
when ΔX_host > 0 and ΔY_host = 0, ψ_host = 0°;
when ΔX_host = 0 and ΔY_host > 0, ψ_host = 90°;
when ΔX_host < 0 and ΔY_host = 0, ψ_host = 180°;
when ΔX_host = 0 and ΔY_host < 0, ψ_host = 270°.

The course angle ψ_else of the other vehicle is obtained in the same way.
If |ψ_host - ψ_else| ≤ 30° and |φ_he - ψ_host| ≤ 30°, the other vehicle is a preceding vehicle in the same lane as the own vehicle; among all such preceding vehicles, the one closest to the own vehicle is the immediately preceding vehicle;
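The piecewise arctan case analysis above collapses into a single atan2 call. A sketch of the same-lane immediate-leader test, assuming planar coordinates in meters; the function names are illustrative:

```python
import math

def bearing_deg(dx, dy):
    # Angle of the vector (dx, dy) in [0, 360) degrees; equivalent to the
    # piecewise arctan case analysis in the text (atan2 handles the quadrants).
    return math.degrees(math.atan2(dy, dx)) % 360.0

def ang_diff(a, b):
    # Smallest absolute angular difference between two angles in degrees.
    return abs((a - b + 180.0) % 360.0 - 180.0)

def leader_check(host, host_prev, other, other_prev, tol=30.0):
    """Return (D_he, passed): the distance to the other vehicle and whether it
    passes the |psi_host - psi_else| <= 30 and |phi_he - psi_host| <= 30 test."""
    d_he = math.hypot(other[0] - host[0], other[1] - host[1])
    psi_host = bearing_deg(host[0] - host_prev[0], host[1] - host_prev[1])
    psi_else = bearing_deg(other[0] - other_prev[0], other[1] - other_prev[1])
    phi_he = bearing_deg(other[0] - host[0], other[1] - host[1])
    passed = (ang_diff(psi_host, psi_else) <= tol
              and ang_diff(phi_he, psi_host) <= tol)
    return d_he, passed
```

Among all same-lane vehicles that pass the test, the one with the smallest D_he is taken as the immediately preceding vehicle.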
Step S3: determine whether the immediately preceding vehicle is within the target area of the own vehicle. Because the range of vehicle-to-vehicle communication is far larger than that of traditional on-board sensors such as cameras and radar, a vehicle that is too far ahead poses no safety threat and needs no further processing. The system therefore sets a rectangular target area covering a certain range in front of the lane in which the own vehicle travels and as wide as the lane. For the target area length L_target, visibility is divided into 9 grades according to the standard, with grade 1 excellent and grade 9 below 100 m. Given the current visibility grade n_v, the own-vehicle speed v_host, and a target headway h_target set to 15 s, the target area length L_target is given by the following formula:
L_target = (n_v / n_r) · h_target · v_host
where n_r is the reference visibility grade, taken as the intermediate value 5 of the nine visibility grades;
If the current distance D_he between the own vehicle and the immediately preceding vehicle satisfies D_he ≤ L_target, the immediately preceding vehicle is within the target area and the system continues to run in this period; otherwise, provided no instruction to switch the system off has been received from the driver, the period ends and the system returns to step S1;
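A sketch of the target-area check. Note that the dependence of L_target on the visibility grade is reconstructed here as an assumption (linear in n_v relative to the reference grade), since the original formula image is not reproduced in the text:

```python
def target_area_length(v_host, n_v, h_target=15.0, n_r=5.0):
    # Target area length from own speed (m/s) and visibility grade 1..9.
    # ASSUMPTION: L_target scales linearly with n_v relative to the reference
    # grade n_r = 5; the exact source formula is not recoverable.
    return (n_v / n_r) * h_target * v_host

def in_target_area(d_he, v_host, n_v):
    # Step S3: the immediately preceding vehicle is relevant if D_he <= L_target.
    return d_he <= target_area_length(v_host, n_v)
```

At the reference grade n_v = 5 and 20 m/s this gives the plain 15 s headway distance of 300 m; worse visibility enlarges the area under the stated assumption.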
Step S4: evaluate the safety between the own vehicle and the immediately preceding vehicle;
To ensure that the own vehicle can safely follow the immediately preceding vehicle, it must maintain a safe headway and sufficient time to collision. The system sets the minimum safe headway h_safe to 2.5 s and the minimum time to collision TTC_safe to 5 s. Given the speed v_preceding and length L_preceding of the immediately preceding vehicle and the own-vehicle speed v_host, the minimum safe distance d_safe between the own vehicle and the immediately preceding vehicle is:

d_safe = max( L_preceding + h_safe · v_host, L_preceding + (v_host - v_preceding) · TTC_safe )
If the current distance D_he between the own vehicle and the immediately preceding vehicle satisfies D_he < d_safe, a collision risk exists and a safety reminder is given to the driver;
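The safe-following criterion can be written directly from the formula above. A minimal sketch, with distances in meters and speeds in m/s:

```python
def min_safe_distance(v_host, v_preceding, length_preceding,
                      h_safe=2.5, ttc_safe=5.0):
    # d_safe = max(L_p + h_safe * v_host, L_p + (v_host - v_p) * TTC_safe)
    return max(length_preceding + h_safe * v_host,
               length_preceding + (v_host - v_preceding) * ttc_safe)

def collision_risk(d_he, v_host, v_preceding, length_preceding):
    # Step S4: a safety reminder is issued when the gap is below d_safe.
    return d_he < min_safe_distance(v_host, v_preceding, length_preceding)
```

The headway term dominates when the two vehicles travel at similar speeds; the TTC term dominates when the own vehicle is closing fast on a slower leader.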
Step S5: judge whether the immediately preceding vehicle is within the driver's normal field of view.
The maximum inter-vehicle distance D_eye at which, under good weather conditions, a vehicle appearing ahead is visible to the driver's naked eye and draws sufficient attention is defined as the driver's normal field of view. If D_he ≤ D_eye, the system enters the vision-enhancement mechanism and continues with step S6; otherwise it directly enters the information output stage and executes step S8;
Step S6: recognize the position of the driver's eyes and line of sight. The camera mounted at the cab rearview-mirror position acquires the in-cab image in real time; the driver's face region is identified within the target area of the image, and the eye positions and line of sight are then identified within the face region. The parameters to be identified are marked in the camera coordinate system as shown in fig. 4: the midpoint between the two eyes represents the eye position, with coordinates (x_eye, y_eye, z_eye); the pitch angle of the driver's face is δ_face,pitch and the face yaw angle is δ_face,yaw.
Step S7: generate the vision-enhancement image of the preceding vehicle. According to the received preceding-vehicle ID, search the system database for the vehicle information matching that ID to obtain the vehicle type, basic size parameters, 3D vehicle model and other data; then solve for the shape, position and size of the preceding-vehicle vision-enhancement image projected on the windshield's optical see-through screen, so that the image it forms in the driver's eyes coincides with the image of the actual preceding vehicle.
First determine the relevant position and angle information of the preceding vehicle, the own vehicle and the driver's eyes in the geodetic coordinate system. The height of the fixed camera position is h_c, the horizontal distance between the camera and the differential-GPS antenna along the longitudinal axis of the own vehicle is d_cg, and the horizontal distance from the camera to the windshield is d_cs. It is assumed that the image of the actual preceding vehicle and the projected vision-enhancement image finally coincide and converge at the position of the driver's eyes; to simplify the solution, the inclination of the windshield is neglected (it is treated as vertical), and the roll and pitch of both the preceding vehicle and the own vehicle are neglected;
The position and attitude of the corresponding 3D vehicle model in the 3D model library are adjusted according to the actual relative position and attitude of the preceding vehicle and the host driver's eyes at the current moment; first, the zoom factor n_reduce of the vision enhancement image is determined; the longitudinal projection geometry of the actual vehicle and the projected image toward the driver's eyes is shown in fig. 5; from the current information, the longitudinal distance d_ps from the rear of the actual preceding vehicle to the windshield optical perspective screen is obtained as follows:
d_ps = D_he - b_p - (d_cs + d_cg)
where b_p is the horizontal distance between the centroid of the preceding vehicle and its rear end;
The longitudinal distance d_se between the optical perspective screen and the driver's eyes is as follows:
d_se = y_eye + d_cs
The zoom factor n_reduce of the vision enhancement image is then obtained as follows:
n_reduce = d_se / (d_se + d_ps)
The 3D vehicle model is scaled down by the factor n_reduce; the vertical and lateral positions of the observation point of the 3D vehicle model are then adjusted; the vertical height h_e of the driver's eyes in the geodetic coordinate system is as follows:
h_e = h_c + z_eye
The lateral deviation e_lateral of the driver's eyes relative to the longitudinal axis of the preceding vehicle is then obtained as follows:
e_lateral = X_host - X_else - x_eye
The observation point of the 3D vehicle model is adjusted according to the obtained vertical height and lateral deviation of the driver's eyes relative to the preceding vehicle; then, according to the yaw angle δ_else,yaw of the preceding vehicle and the driver's face yaw angle δ_face,yaw, the 3D vehicle model is rotated about the vertical direction by an angle α = -δ_else,yaw - δ_face,yaw; according to the driver's face pitch angle δ_face,pitch, the 3D vehicle model is rotated about the lateral direction by -δ_face,pitch; the 3D vehicle model image obtained from the observation point position at this moment is the vision enhancement image to be projected onto the windshield optical perspective screen;
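The step-S7 computations above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the zoom-factor formula n_reduce = d_se/(d_se + d_ps) is an assumed similar-triangles reconstruction of the equation image, the rotation sign conventions are illustrative, and all function and argument names are hypothetical.

```python
def enhancement_params(D_he, b_p, d_cs, d_cg, eye_xyz, h_c,
                       X_host, X_else, yaw_else, yaw_face, pitch_face):
    """Scale and pose adjustments for the preceding-vehicle 3D model."""
    x_eye, y_eye, z_eye = eye_xyz
    # Longitudinal distance from the preceding vehicle's rear to the screen
    d_ps = D_he - b_p - (d_cs + d_cg)
    # Longitudinal distance from the screen to the driver's eyes
    d_se = y_eye + d_cs
    # Assumed similar-triangles zoom factor of the projected image
    n_reduce = d_se / (d_se + d_ps)
    # Eye height in the geodetic frame and lateral offset to the
    # preceding vehicle's longitudinal axis
    h_e = h_c + z_eye
    e_lateral = X_host - X_else - x_eye
    # Model rotations about the vertical and lateral axes (signs illustrative)
    alpha = -yaw_else - yaw_face
    beta = -pitch_face
    return n_reduce, h_e, e_lateral, alpha, beta
```

With a 30 m gap and an eye-to-screen distance of about 1 m, the similar-triangles factor scales the model to a few percent of the actual vehicle size, matching the intuition that a distant vehicle subtends a small image on the windshield.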
step S8, projecting the generated preceding-vehicle vision enhancement image onto the optical perspective screen, and optionally projecting preceding-vehicle information and safety prompt information;
as shown in fig. 5, the pitch angle and yaw angle of the rotatable mirror in the vision enhancement unit are first adjusted so that the preceding-vehicle vision enhancement image is projected at the correct position on the optical perspective screen, and the actual vehicle and the projected vision enhancement image coincide when transmitted to the driver's eyes. The speed and distance of the current preceding vehicle, the minimum safe distance between the host vehicle and the immediately preceding vehicle, and safety prompt information are also projected onto the optical perspective screen. If the current distance D_he between the host vehicle and the immediately preceding vehicle falls below the current minimum safe distance d_safe, the driver is warned of the collision risk on the optical perspective screen, and the center-console loudspeaker in the human-computer interaction unit gives an audible warning reminding the driver to decelerate or detour;
step S9, judging whether the driver has issued an instruction to stop the system; if the driver requests to stop the system, the system enters the standby state; if no stop instruction is received, the next operation cycle is entered, i.e., the process returns to step S1.
The invention can effectively reduce vehicle collision accidents by providing visual information assistance to the driver, greatly improving driving safety, especially under adverse weather conditions.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention has been described in detail with reference to examples, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention, which should be covered by the claims of the present invention.

Claims (6)

1. A vehicle-to-vehicle-communication-based adverse weather vision enhancement driving assistance method, applicable to a vehicle-to-vehicle-communication-based adverse weather vision enhancement driving assistance system, the system comprising: a central control unit, a vision enhancement unit, a sensing unit and a human-computer interaction unit;
the central control unit performs information interaction with the sensing unit, makes decisions and performs image matching based on the host vehicle information and preceding-vehicle information transmitted by the sensing unit, and sends the host vehicle information to the communication device in the sensing unit; it sends prompt information to be displayed to the driver and the preceding-vehicle virtual image to the vision enhancement unit; it performs data interaction with the human-computer interaction unit, receives driver instructions, and sends safety early-warning information;
the vision enhancement unit receives the preceding-vehicle virtual image information sent by the central control unit and projects it outwards; the projected image is transmitted precisely to the driver's eyes, and the size and position of the projected preceding-vehicle vision enhancement image coincide with the actual vehicle ahead;
the sensing unit comprises wireless communication equipment, a vehicle bus, an inertia measuring unit, a satellite positioning module and a camera;
the wireless communication device is used for sending out the information of the own vehicle and receiving the information from the surrounding vehicles, and the information sending and receiving comprises: vehicle ID, timestamp, vehicle speed, longitudinal acceleration, course angle, longitude and latitude, altitude, yaw angle, roll angle and pitch angle;
acquiring vehicle speed data in a vehicle bus;
the inertia measurement unit is used for acquiring the longitudinal acceleration, the yaw angle, the roll angle and the pitch angle of the self-vehicle;
the satellite positioning module is used for acquiring longitude and latitude and altitude of the self-vehicle in real time and determining a course angle of the self-vehicle according to the current position and the position of the self-vehicle at the previous moment;
the camera is arranged in the front upper part of a driver in the cab and used for identifying the relative position of the eyes of the driver in the cab and the sight line of the driver;
the vision enhancement unit comprises an optical perspective screen, a light guide assembly and a projection device;
the optical perspective screen is attached to the inner side of the windshield and can form images on the windshield, and front actual scenes can be seen through the optical perspective screen;
the projection equipment receives the virtual image information of the front vehicle of the central control unit and projects the virtual image information outwards;
the light guide assembly is used for guiding a projection image emitted by the projection equipment and comprises a primary reflector and a rotatable reflector; the primary reflector is used for reflecting the projection image of the projection equipment to the rotatable reflector, the angle of the rotatable reflector is adjusted according to the recognized eye position and the recognized sight line, the projection image is just transmitted to human eyes after being imaged by the optical perspective screen, and the size position of the projected vision enhancement image of the front vehicle is overlapped with the front actual vehicle;
the method is characterized by comprising the following steps:
step S1, in each operation period delta T, firstly obtaining information of a self vehicle and surrounding vehicles, wherein the information types of the self vehicle and the surrounding vehicles are the same, and comprise vehicle ID, time stamp, vehicle speed, longitudinal acceleration, course angle, longitude and latitude, altitude, yaw angle, roll angle and pitch angle;
step S2, screening the vehicles around the vehicle to determine the vehicles in the same lane with the vehicle and in the immediate front;
step S3, determining whether the vehicle immediately before is in the target area of the vehicle, if so, continuing to run, otherwise, determining whether to return to step S1 according to the instruction of the driver;
step S4, evaluating the safety of the vehicle and the vehicle immediately before;
step S5, judging whether the vehicle immediately before is in the normal visual field range of the driver, if so, continuing to go to the next step S6, otherwise, entering an information output link;
step S6, recognizing the position and sight of eyes of the driver;
step S7, generating a visual enhancement image of the preceding vehicle;
step S8, projecting the generated front vehicle vision enhancement image;
step S9, entering the next operation cycle or entering the standby state according to the instruction;
step S2 specifically includes:
matching with a lane-level high-precision map according to longitude and latitude and altitude information of the own vehicle and surrounding vehicles, and selecting the surrounding vehicles in the same lane with the own vehicle; converting the longitude and latitude of the vehicle and the longitude and latitude of the surrounding vehicle in the same lane into a geodetic coordinate system through coordinate conversion, and further judging whether other vehicles in the same lane are adjacent front vehicles of the vehicle;
the coordinate of the current time T of the vehicle is known as (X)host(T),Yhost(T)), and the last-time coordinate is (X)host(T-ΔT),Yhost(T- Δ T)); the current time coordinate of one other vehicle on the same lane is (X)else(T),Yelse(T)), and the last-time coordinate is (X)else(T-ΔT),Yelse(T-ΔT));
According to the current-time coordinates of the host vehicle and of the other vehicle in the same lane, the vector angle φ_he from the host vehicle position to the other vehicle position is obtained;
The heading angle of each vehicle is calculated from its current and previous position coordinates, yielding the host vehicle heading angle ψ_host and the other-vehicle heading angle ψ_else;
If |ψ_host - ψ_else| ≤ θ_1 and |φ_he - ψ_host| ≤ θ_2, the other vehicle is a preceding vehicle of the host vehicle in the same lane; if, among all such preceding vehicles, it is the closest to the host vehicle, it is the immediately preceding vehicle; θ_1 and θ_2 are judgment threshold angles;
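The same-lane screening test above can be sketched as follows; the heading computation from successive positions and the wrap-around handling of angle differences are illustrative additions, and the function names are hypothetical:

```python
import math

def heading(prev_xy, cur_xy):
    """Heading angle (radians) from two successive positions."""
    return math.atan2(cur_xy[1] - prev_xy[1], cur_xy[0] - prev_xy[0])

def is_preceding(host_prev, host_cur, other_prev, other_cur,
                 theta1=math.radians(30), theta2=math.radians(30)):
    """Check whether `other` qualifies as a preceding vehicle of `host`."""
    psi_host = heading(host_prev, host_cur)
    psi_else = heading(other_prev, other_cur)
    # Bearing from the host position to the other vehicle's position
    phi_he = math.atan2(other_cur[1] - host_cur[1],
                        other_cur[0] - host_cur[0])

    def wrap(a):
        # Fold an angle difference into [-pi, pi]
        return (a + math.pi) % (2 * math.pi) - math.pi

    return (abs(wrap(psi_host - psi_else)) <= theta1
            and abs(wrap(phi_he - psi_host)) <= theta2)
```

Working in radians avoids degree/radian mix-ups; the default thresholds of 30 degrees follow claim 2.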
in step S3, the target region length L_target is related to the vehicle speed and the current weather visibility; given the current visibility level n_v, the host vehicle speed v_host, and a set target headway h_target, the target region length L_target is given by the following formula:
[formula image FDA0002752235110000021 not reproduced: L_target as a function of h_target, v_host, n_v and n_r]
where the parameter n_r takes an empirical value; the width of the target region is the same as the lane width;
if the current distance D_he between the host vehicle and the immediately preceding vehicle satisfies D_he ≤ L_target, the immediately preceding vehicle is considered to be within the target region;
in step S4, a minimum safe headway h_safe and a minimum time to collision TTC_safe are set; given the speed v_preceding and length L_preceding of the immediately preceding vehicle and the host vehicle speed v_host, the minimum safe distance d_safe between the host vehicle and the immediately preceding vehicle is given by the following formula:
d_safe = max(L_preceding + h_safe·v_host, L_preceding + (v_host - v_preceding)·TTC_safe)
if the current distance D_he between the host vehicle and the immediately preceding vehicle satisfies D_he ≥ d_safe, driving is safe; otherwise, a collision risk exists.
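A minimal sketch of the step-S4 safety evaluation above; the default h_safe and TTC_safe values are illustrative assumptions, not values taken from the patent:

```python
def minimum_safe_distance(L_preceding, v_host, v_preceding,
                          h_safe=2.0, TTC_safe=3.0):
    """Minimum safe distance to the immediately preceding vehicle,
    per the claim-1 formula (defaults are illustrative)."""
    return max(L_preceding + h_safe * v_host,
               L_preceding + (v_host - v_preceding) * TTC_safe)

def collision_risk(D_he, d_safe):
    """True when the current gap is below the minimum safe distance."""
    return D_he < d_safe
```

The max() keeps whichever criterion is more conservative: the headway term dominates when the two vehicles travel at similar speeds, while the time-to-collision term dominates during rapid closing.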
2. The adverse weather visual enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 1,
θ_1 and θ_2 are each taken as 30 degrees.
3. The adverse weather visual enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 1,
in step S6, the eye position is represented by the midpoint between the two eyes, with coordinates (x_eye, y_eye, z_eye); the driver's face pitch angle is δ_face,pitch and the face yaw angle is δ_face,yaw.
4. The adverse weather visual enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 3,
step S7 specifically includes:
searching vehicle information matched with the ID in a database according to the received front vehicle ID, wherein the vehicle information comprises a vehicle type, basic size parameters and 3D vehicle model data;
firstly, the relevant position and angle information of the preceding vehicle, the host vehicle and the driver's eyes in the geodetic coordinate system is determined; the mounting height of the camera is known as h_c, the horizontal distance between the camera and the satellite positioning module antenna along the longitudinal axis of the host vehicle is d_cg, and the horizontal distance from the camera to the windshield is d_cs;
The position and attitude of the corresponding 3D vehicle model in the 3D model library are adjusted according to the actual relative position and attitude of the preceding vehicle and the host driver's eyes at the current moment; first, the zoom factor n_reduce of the vision enhancement image is determined; from the current information, the longitudinal distance d_ps from the rear of the actual preceding vehicle to the windshield optical perspective screen is obtained as follows:
d_ps = D_he - b_p - (d_cs + d_cg)
where b_p is the horizontal distance between the centroid of the preceding vehicle and its rear end;
The longitudinal distance d_se between the optical perspective screen and the driver's eyes is as follows:
d_se = y_eye + d_cs
The zoom factor n_reduce of the vision enhancement image is then obtained as follows:
n_reduce = d_se / (d_se + d_ps)
The 3D vehicle model is scaled down by the factor n_reduce; the vertical and lateral positions of the observation point of the 3D vehicle model are then adjusted; the vertical height h_e of the driver's eyes in the geodetic coordinate system is as follows:
h_e = h_c + z_eye
The lateral deviation e_lateral of the driver's eyes relative to the longitudinal axis of the preceding vehicle is then obtained as follows:
e_lateral = X_host - X_else - x_eye
The observation point of the 3D vehicle model is adjusted according to the obtained vertical height and lateral deviation of the driver's eyes relative to the preceding vehicle; then, according to the yaw angle δ_else,yaw of the preceding vehicle and the driver's face yaw angle δ_face,yaw, the 3D vehicle model is rotated about the vertical direction by an angle α = -δ_else,yaw - δ_face,yaw; according to the driver's face pitch angle δ_face,pitch, the 3D vehicle model is rotated about the lateral direction by -δ_face,pitch; the 3D vehicle model image obtained at the observation point position at this moment is the vision enhancement image to be projected onto the optical perspective screen.
5. The adverse weather visual enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 4,
in step S8, preceding-vehicle information and safety prompt information are also projected;
when the distance D_he between the host vehicle and the immediately preceding vehicle falls below the current minimum safe distance d_safe, the driver is warned of the collision risk on the optical perspective screen, and the human-computer interaction unit gives an audible warning.
6. The adverse weather visual enhancement driving assistance method based on vehicle-to-vehicle communication according to claim 1,
the human-computer interaction unit comprises a touch screen and a loudspeaker; the touch screen is in information interaction with the central control unit and is used for a driver to set the system; the loudspeaker receives the main controller information and provides sound reminding for the driver.
CN201810011066.9A 2018-01-05 2018-01-05 Bad weather vision enhancement driving auxiliary system and method based on vehicle-to-vehicle communication Active CN108202669B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810011066.9A CN108202669B (en) 2018-01-05 2018-01-05 Bad weather vision enhancement driving auxiliary system and method based on vehicle-to-vehicle communication

Publications (2)

Publication Number Publication Date
CN108202669A CN108202669A (en) 2018-06-26
CN108202669B true CN108202669B (en) 2021-05-07

Family

ID=62605205

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810011066.9A Active CN108202669B (en) 2018-01-05 2018-01-05 Bad weather vision enhancement driving auxiliary system and method based on vehicle-to-vehicle communication

Country Status (1)

Country Link
CN (1) CN108202669B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3084631B1 (en) * 2018-07-31 2021-01-08 Valeo Schalter & Sensoren Gmbh DRIVING ASSISTANCE FOR THE LONGITUDINAL AND / OR SIDE CHECKS OF A MOTOR VEHICLE
JP7063256B2 (en) * 2018-12-14 2022-05-09 トヨタ自動車株式会社 Information processing systems, programs, and information processing methods
CN109849790A (en) * 2019-03-06 2019-06-07 武汉理工大学 A kind of driving at night scene visual enhancing system and method for Multi-source Information Fusion
CN110053626B (en) * 2019-05-10 2021-07-06 深圳市元征科技股份有限公司 Vehicle control method and related device
CN112109550A (en) * 2020-09-08 2020-12-22 中国第一汽车股份有限公司 AR-HUD-based display method, device and equipment for early warning information and vehicle
CN113232586A (en) * 2021-06-04 2021-08-10 河南科技大学 Infrared pedestrian projection display method and system for driving at night
CN113859123A (en) * 2021-10-08 2021-12-31 上汽通用汽车有限公司 Vehicle front-view system display control method, storage medium, and electronic device
CN114407902B (en) * 2022-01-19 2023-11-28 浙江大学 Driving decision system based on road water layer depth estimation

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101881885A (en) * 2009-04-02 2010-11-10 通用汽车环球科技运作公司 Peripheral salient feature on the full-windscreen head-up display strengthens
CN102555908A (en) * 2010-12-28 2012-07-11 通用汽车环球科技运作有限责任公司 Traffic visibility in poor viewing conditions on full windshield head-up display
CN104149691A (en) * 2014-05-16 2014-11-19 苟安 Augmented-reality vehicle-mounted projection system
CN104570351A (en) * 2014-12-29 2015-04-29 信利半导体有限公司 Vehicle-mounted head-up display system
CN106918909A (en) * 2015-11-10 2017-07-04 奥特润株式会社 head-up display control device and method
CN206627701U (en) * 2016-02-12 2017-11-10 Lg电子株式会社 Vehicle head-up display

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014194511A (en) * 2013-03-29 2014-10-09 Funai Electric Co Ltd Head-up display device and display method for the same


Similar Documents

Publication Publication Date Title
CN108202669B (en) Bad weather vision enhancement driving auxiliary system and method based on vehicle-to-vehicle communication
US10269331B2 (en) Display control device for vehicle
US10293690B2 (en) Vehicle information projecting system and vehicle information projecting method
US20220130296A1 (en) Display control device and display control program product
WO2020125178A1 (en) Vehicle driving prompting method and apparatus
CN111508276B (en) High-precision map-based V2X reverse overtaking early warning method, system and medium
US11287879B2 (en) Display control device, display control method, and program for display based on travel conditions
JP2023010800A (en) Display device
CN111279689B (en) Display system, display method, and storage medium
JP5898539B2 (en) Vehicle driving support system
US20180037162A1 (en) Driver assistance system
WO2021249020A1 (en) Method and apparatus for predicting driving state, and terminal device
JP2019014300A (en) Vehicle control system, vehicle control method and program
US11701967B2 (en) Display control device, display control method, and storage medium
JP2021169235A (en) Vehicle travel assistance device
CN110954126A (en) Display system, display method, and storage medium
WO2020189238A1 (en) Vehicular display control device, vehicular display control method, and vehicular display control program
JP2004265432A (en) Travel environment recognition device
WO2020105685A1 (en) Display control device, method, and computer program
JP7315101B2 (en) Obstacle information management device, obstacle information management method, vehicle device
CN110271417A (en) Full liquid crystal instrument system based on ADAS and AR technology
JP2008046761A (en) System, device, and method for processing image of movable object
CN114822083A (en) Intelligent vehicle formation auxiliary control system
JP7424951B2 (en) Roadside monitoring system and vehicle running control method
JP7088152B2 (en) Display control device and display control program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant