CN110550105B - Driving assistance system and method - Google Patents

Driving assistance system and method

Info

Publication number
CN110550105B
CN110550105B (application CN201810539552.8A)
Authority
CN
China
Prior art keywords
vehicle
blind area
driving assistance
road
historical average
Prior art date
Legal status
Active
Application number
CN201810539552.8A
Other languages
Chinese (zh)
Other versions
CN110550105A (en)
Inventor
唐帅
吕尤
孙铎
张海强
Current Assignee
Audi AG
Original Assignee
Audi AG
Priority date
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Priority to CN201810539552.8A priority Critical patent/CN110550105B/en
Publication of CN110550105A publication Critical patent/CN110550105A/en
Application granted granted Critical
Publication of CN110550105B publication Critical patent/CN110550105B/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D15/00 Steering not otherwise provided for
    • B62D15/02 Steering position indicators; Steering position determination; Steering aids
    • B62D15/029 Steering assistants using warnings or proposing actions to the driver without influencing the steering system

Abstract

The present invention provides a driving assistance system and method. The system is mounted on or applied to a host vehicle and includes: a blind area determination unit configured to determine whether a blind area exists in the road ahead of the host vehicle, the blind area being a road area that cannot be detected by the host vehicle's sensors because of an obstruction; a virtual moving body setting unit configured to assume a virtual moving body in the blind area if the blind area exists and to set a traveling state of the virtual moving body; and a risk prediction unit configured to determine whether the host vehicle will collide with the virtual moving body, thereby predicting whether the host vehicle will collide with a vehicle emerging from the blind area.

Description

Driving assistance system and method
Technical Field
The invention relates to the field of motor vehicles. More particularly, the invention relates to a driving assistance system and method that provide a hazard warning and driving assistance when a blind area exists in the road ahead of a vehicle.
Background
In recent years, various driving assistance systems have been developed and applied to vehicles such as automobiles. These systems detect the driving environment of the vehicle with sensor devices such as cameras, laser sensors, and/or radar, assist the driver in driving, and warn the driver of danger.
In some cases, however, certain road areas cannot be detected by the vehicle's sensors because of obstructions. As shown in fig. 1, the driver of vehicle A cannot see the traffic behind the wall because the wall blocks the view. When vehicle A turns right, it is therefore likely to collide with the rider B. Likewise, because of the wall, the sensors of vehicle A cannot detect the road area behind it and thus cannot provide a hazard warning.
There is therefore a need for a driving assistance system and method that can provide a hazard warning even when a road area ahead of the vehicle cannot be detected by its sensors because of an obstruction.
Disclosure of Invention
An object of the present invention is to provide a driving assistance system and method that can provide a hazard warning and assist the driver in driving the vehicle when a blind area exists in the road ahead, thereby avoiding accidents.
One aspect of the present invention provides a driving assistance system that is mounted on or applied to a host vehicle and includes: a blind area determination unit configured to determine whether a blind area exists in the road ahead of the host vehicle, the blind area being a road area that cannot be detected by a sensor of the host vehicle due to the presence of an obstruction; a virtual moving body setting unit configured to assume a virtual moving body in the blind area if the blind area exists, and to set a traveling state of the virtual moving body; and a risk prediction unit configured to determine whether the host vehicle will collide with the virtual moving body, so as to predict whether the host vehicle will collide with a vehicle emerging from the blind area.
According to an embodiment of the present invention, the blind area determination unit is configured to determine whether the blind area exists and to determine a start boundary of the blind area on the host vehicle side, based on pre-stored map information and the sensor data of the host vehicle's sensor.
According to an embodiment of the invention, the virtual moving body is set to travel at the historical average traffic speed of the road segment in which the blind area is located.
According to an embodiment of the invention, the historical average traffic speed is the historical average traffic speed for the current time period of the day.
According to an embodiment of the present invention, the virtual moving body setting unit is configured to assume the virtual moving body only when the historical average traffic flow per unit time of the road segment where the blind area is located is greater than a predetermined threshold.
According to an embodiment of the invention, the historical average traffic flow per unit time is the historical average traffic flow per unit time for the current time period of the day.
According to an embodiment of the present invention, the virtual moving body setting unit is configured to acquire the historical average traffic speed of the road segment where the blind area is located from an external device.
According to an embodiment of the present invention, the virtual moving body setting unit is configured to acquire the historical average traffic flow per unit time of the road segment where the blind area is located from an external device.
According to an embodiment of the present invention, the external device is configured to acquire, from a number of vehicles within the road network over a predetermined past period, their position, speed and direction-of-movement data as well as sensor data containing the position, speed and direction of movement of traffic participants around each of those vehicles, and to calculate from the acquired data the historical average traffic speed and the historical average traffic flow per unit time of the road segment where the blind area is located.
According to an embodiment of the present invention, the risk prediction unit is configured to calculate the time to collision between the host vehicle and the virtual moving body based on the traveling state of the host vehicle and the traveling state of the virtual moving body, and to determine a collision risk level based on that time.
According to an embodiment of the present invention, the driving assistance system further includes a driving assistance unit configured to output a driving assistance instruction based on the determined collision risk level, the driving assistance instruction including at least one of: outputting a warning to the driver of the host vehicle; outputting a warning to the external environment of the host vehicle; decelerating the host vehicle, driving it at a predetermined speed, or stopping it; and steering the host vehicle.
According to another aspect of the present invention, there is also provided a vehicle on which the driving assistance system according to any one of the embodiments is mounted or applied.
According to another aspect of the present invention, there is also provided a driving assistance method including the steps of: determining whether a blind area exists in the road ahead of the host vehicle, the blind area being a road area that cannot be detected by a sensor of the host vehicle due to the presence of an obstruction; assuming a virtual moving body in the blind area if the blind area exists, and setting a traveling state of the virtual moving body; and determining whether the host vehicle will collide with the virtual moving body, so as to predict whether the host vehicle will collide with a vehicle emerging from the blind area.
According to an embodiment of the present invention, whether the blind area exists and the start boundary of the blind area are determined based on pre-stored map information and the sensor data of the host vehicle's sensor.
According to an embodiment of the present invention, the traveling state of the virtual moving body is set such that it travels at the historical average traffic speed of the road segment where the blind area is located.
According to one embodiment of the invention, the historical average traffic speed is the historical average traffic speed for the current time period of the day.
According to an embodiment of the present invention, the virtual moving body is assumed only when the historical average traffic flow per unit time of the road segment where the blind area is located is greater than a predetermined threshold.
According to one embodiment of the invention, the historical average traffic flow per unit time is the historical average traffic flow per unit time for the current time period of the day.
According to an embodiment of the present invention, the driving assistance method further includes: acquiring the historical average traffic speed of the road segment where the blind area is located from an external device.
According to an embodiment of the present invention, the driving assistance method further includes: acquiring the historical average traffic flow per unit time of the road segment where the blind area is located from an external device.
According to an embodiment of the invention, the external device is configured to acquire, from a number of vehicles within the road network over a predetermined past period, their position, speed and direction data as well as sensor data containing the position, speed and direction of traffic participants around each of those vehicles, and to calculate from the acquired data the historical average traffic speed and the historical average traffic flow per unit time of the road segment where the blind area is located.
According to an embodiment of the present invention, the driving assistance method further includes: calculating the time to collision between the host vehicle and the virtual moving body based on the traveling state of the host vehicle and the traveling state of the virtual moving body, and determining a collision risk level based on that time.
According to an embodiment of the present invention, the driving assistance method further includes: outputting a driving assistance instruction based on the determined collision risk level, the driving assistance instruction including at least one of: outputting a warning to the driver of the host vehicle; outputting a warning to the external environment of the host vehicle; decelerating the host vehicle, driving it at a predetermined speed, or stopping it; and steering the host vehicle.
Thus, the driving assistance system and method according to embodiments of the invention can provide a hazard warning and assist the driver in driving the vehicle when a blind area exists in the road ahead, thereby avoiding accidents.
Drawings
Features, advantages and technical effects of exemplary embodiments of the present invention will be described below with reference to the accompanying drawings, in which like reference numerals represent like elements, and wherein:
fig. 1 shows an example of a scene in which a blind area exists in a road ahead of a vehicle.
Fig. 2 shows a block diagram of the driving assistance system according to the embodiment of the invention.
Fig. 3 illustrates a blind area determined in the scene example as illustrated in fig. 1 and an assumed virtual moving body according to an embodiment of the present invention.
Fig. 4 shows a flowchart of a driving assistance method according to an embodiment of the invention.
Detailed Description
Hereinafter, embodiments of the present invention are described with reference to the drawings. The following detailed description and drawings are illustrative of the principles of the invention, which is not limited to the preferred embodiments described, but is defined by the claims.
The driving assistance system according to the embodiment of the invention may be mounted on or applied to a vehicle. The vehicle may be an internal combustion engine vehicle using an internal combustion engine as a drive source, an electric vehicle or a fuel cell vehicle using an electric motor as a drive source, a hybrid vehicle using both of the above as drive sources, or a vehicle having another drive source. The driving assistance system may be connected to and communicate with a control system of the vehicle. For the sake of brevity, well-known components of the vehicle such as the power and steering devices and the drive train are not described in detail.
Fig. 2 is a schematic diagram of a driving assistance system according to an embodiment of the invention. The driving assistance system shown in fig. 2 is mounted on or applied to the vehicle 1, provides a hazard warning when a blind area exists in the road ahead of the vehicle 1, and can also assist in driving the vehicle 1 to avoid an accident. In this context, a blind area is a road area in front of a vehicle that cannot be detected by a sensor device on the vehicle due to the presence of an obstruction. Naturally, the blind area is also invisible to the driver of the vehicle 1. The obstruction may include objects on both sides of the road (e.g., buildings such as walls and houses, bushes, fences, mountains, etc.) as well as objects on the road surface, such as temporary constructions, or occlusion caused by the shape of the road itself.
As shown in fig. 2, the driving assistance system 100 includes a sensor device 10, a storage device 20, a communication device 30, a calculation device 40, and an output device 50. These devices of the driving assistance system 100 may communicate with each other via a CAN bus or the like to transfer data and instructions to each other.
The sensor device 10 may comprise an internal sensor 11 and an external sensor 12. The internal sensor 11 may include sensors typically mounted on the vehicle 1, including but not limited to GNSS sensors, speed sensors, acceleration sensors, and steering angle sensors. The internal sensor 11 is used to detect the position, speed, acceleration, steering angle, traveling direction, and so on of the vehicle 1. Together, the position, speed, acceleration, traveling direction, and the like of the vehicle 1 indicate its traveling state.
The external sensor 12 may be an external sensor normally mounted on the vehicle 1, and may include an image sensor such as a camera, an ultrasonic sensor, a radar sensor, a laser sensor, and/or the like. The sensor data of the external sensor 12 can be used to detect the surroundings of the vehicle 1, for example the driving states (position, speed, acceleration, direction of travel, etc.) of other traffic participants around the vehicle 1 (such as pedestrians, bicycles, motorcycles, and other motor vehicles), and the road conditions around and in particular ahead of the vehicle 1, such as the positions and contours of lane lines, fences, curbs, and objects on both sides of the road (such as buildings, bushes, fences, and mountains).
The storage device 20 may be a conventional storage device such as a hard disk. According to an embodiment of the present invention, the storage device 20 stores in advance map information such as road shapes, building positions and outlines, rivers, railways, and the like.
The communication device 30 may be a Bluetooth module, an antenna, or a similar device, and may be configured to communicate wirelessly (e.g., via Bluetooth, Wi-Fi, or a mobile network) with an external device such as a remote server, in order to acquire information from the external device or transfer information to it.
The computing device 40 may be a Central Processing Unit (CPU) or an Electronic Control Unit (ECU). The computing device 40 may acquire data from the sensor device 10, the storage device 20, the communication device 30, and the like, and perform various processes to predict a potential collision risk of the vehicle 1. According to some embodiments, the computing device 40 also distinguishes between levels of potential collision risk and may output driving assistance instructions to warn the driver or assist the driver's driving according to the potential collision risk level.
The output device 50 may include a display and a speaker mounted on the vehicle 1, as well as the power system, transmission system, brake system, and the like of the vehicle 1, which perform corresponding operations in response to the driving assistance instructions.
According to a specific embodiment, the calculation device 40 includes a blind area determination unit 41, a virtual moving body setting unit 42, a risk prediction unit 43, and a driving assistance unit 44.
The blind area determination unit 41 is configured to determine whether a blind area exists in the road ahead of the vehicle 1. In the exemplary embodiment, the blind area determination unit 41 makes the above determination based on the sensor data of the external sensor 12 of the vehicle 1 and the map information stored in advance in the storage device 20. The blind area determination unit 41 acquires map information stored in advance from the storage device 20, and determines the complete shape of the road ahead of the vehicle 1 based on the map information. Further, the blind area determination unit 41 acquires sensor data thereof from the external sensor 12, and determines road conditions around the vehicle 1, particularly in front thereof, such as positions and contours of lane lines, fences, curbs, and objects on both sides of the road, and the like, on the basis of the sensor data. The blind area determination unit 41 can determine whether there is an area in the road ahead of the vehicle 1 that cannot be detected by the external sensor 12, that is, a blind area, by comparing the road condition determined based on the sensor data with the entire shape of the road ahead.
In the case where the blind area determination unit 41 determines that a blind area exists in the road ahead of the vehicle 1, it may further identify the extent of the blind area, for example by specifying the start boundary of the blind area on the vehicle 1 side. The blind area determination unit 41 determines the start boundary of the blind area based on the sensor data of the external sensor 12. For example, it may determine the farthest object detectable ahead of the vehicle 1 along each of various bearings (for example, within an angular range of 120 to 180 degrees in front of the vehicle 1) based on the sensor data of the external sensor 12, and take the line connecting these farthest objects as the start boundary of the blind area.
The blind area determination and the determination of the start boundary by the blind area determination unit 41 are described with reference to fig. 3. Fig. 3 shows the scene of fig. 1, in which a road L1 intersects a road L2 and forms an intersection C, toward which the vehicle 1 is traveling on road L1. Walls W are present on both sides of road L1 and road L2. Immediately before the vehicle 1 enters the intersection C, the blind area determination unit 41 acquires the pre-stored map information from the storage device 20 and determines the complete shapes of road L1 and road L2 near the intersection C, and acquires sensor data from the external sensor 12 and determines the road condition ahead of the vehicle 1. By comparing the road condition ahead determined from the sensor data with the complete shapes of road L1 and road L2, the blind area determination unit 41 can determine that a blind area B exists on road L2. At the same time, the blind area determination unit 41 may determine the start boundary B1 of the blind area B based on the sensor data of the external sensor 12. In an exemplary embodiment, the blind area determination unit 41 determines the farthest object detectable at each bearing in front of the vehicle 1 based on the positions of the wall, the lane lines on the road surface, the curbs, the signboards on both sides of the road, and the like captured by the camera, together with the distances of objects detected by the ultrasonic sensor, the laser sensor, and the like. The start boundary B1 of the blind area B is the line connecting these farthest objects along the respective bearings.
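The boundary construction described above can be illustrated with a minimal sketch. It assumes the external sensor data has already been reduced to one (bearing, range) pair per scan angle for the farthest detectable object; the function name, coordinate convention, and data format are illustrative assumptions, not taken from the patent.

```python
import math
from typing import List, Tuple

def blind_area_start_boundary(
    farthest_hits: List[Tuple[float, float]]  # (bearing_deg, range_m), one entry per scan angle
) -> List[Tuple[float, float]]:
    """Approximate the start boundary of the blind area as the polyline that
    connects the farthest detectable object along each bearing (e.g. within a
    120-180 degree sector ahead of the vehicle), in vehicle coordinates."""
    boundary = []
    for bearing_deg, range_m in sorted(farthest_hits):
        x = range_m * math.cos(math.radians(bearing_deg))  # longitudinal (forward) offset
        y = range_m * math.sin(math.radians(bearing_deg))  # lateral offset
        boundary.append((x, y))
    return boundary

# Example: three bearings ahead of the vehicle; the wall truncates the view to the sides.
print(blind_area_start_boundary([(-60.0, 12.0), (0.0, 35.0), (60.0, 10.0)]))
```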
The virtual moving body setting unit 42 is configured to assume a virtual moving body in the blind area if a blind area exists, and to set the traveling state of the virtual moving body. The traveling state of the virtual moving body is represented by its position, speed, and traveling direction. In the exemplary embodiment, the virtual moving body is assumed to be in the blind area and to travel toward the vehicle 1 at the historical average traffic speed of the road segment in which the blind area is located. In particular, the virtual moving body can be assumed to be located at the start boundary of the blind area at the present time.
In the scene shown in fig. 3, the virtual moving body setting unit 42 may assume the rider 2, and set that the rider 2 is currently located at the start boundary B1 of the blind area B and is traveling toward the intersection C at the historical average traffic speed V of the segment L2-1 of road L2 where the blind area is located.
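One way to represent such an assumed virtual moving body is sketched below; the class, field names, and units are illustrative assumptions rather than definitions from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class VirtualMovingBody:
    x: float            # position on the blind-area start boundary, m (vehicle coordinates)
    y: float
    speed: float        # historical average traffic speed of the segment, m/s
    heading_deg: float  # oriented toward the host vehicle / the intersection

def assume_virtual_body(boundary_point: Tuple[float, float],
                        hist_avg_speed: float,
                        heading_deg: float) -> VirtualMovingBody:
    """Place the virtual rider/vehicle at the start boundary of the blind area,
    travelling toward the host vehicle at the segment's historical average speed."""
    x, y = boundary_point
    return VirtualMovingBody(x, y, hist_avg_speed, heading_deg)

# Example: rider assumed at boundary point (5 m ahead, 12 m to the side), 4.2 m/s toward the intersection.
rider = assume_virtual_body((5.0, 12.0), hist_avg_speed=4.2, heading_deg=270.0)
```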
According to the embodiment of the present invention, the virtual moving body setting unit 42 acquires the historical average traffic speed of the road segment where the blind area is located from an external device. Specifically, the virtual moving body setting unit 42 may communicate wirelessly with the external device through the communication device 30 to acquire the historical average traffic speed. Here, the external device may be a remote server. The remote server may be arranged to communicate with a number of vehicles within the road network and to obtain from these vehicles their position, speed, direction, and other data, as well as their sensor data. The sensor data include data on the position, speed, acceleration, direction of travel, and so on of other traffic participants (such as pedestrians, bicycles, motorcycles, and other motor vehicles) around the corresponding vehicle. The remote server can thus obtain, by statistical methods, the average traffic speed of each road, and even of each segment of each road, in the road network. Here, the road network may cover all or part of the city or region in which the vehicle 1 is located, and a road segment is any one of several sections obtained by dividing a road into pieces of a predetermined length (for example, 5 m or 10 m).
The remote server may have been communicating with a number of vehicles within the road network and collecting the relevant data over a predetermined past period, which makes it possible to obtain the historical average traffic speed over that period. The predetermined past period may be, for example, the past several months, weeks, days, or hours counted back from the present time. In addition, the remote server may calculate historical average traffic speeds for different time periods of the day (e.g., 8:00 to 8:30 a.m.).
According to some embodiments, the remote server may also calculate the historical average traffic flow per unit time for individual roads, and even individual segments of individual roads, within the road network based on the data obtained from the vehicles. Likewise, the remote server may calculate the historical average traffic flow per unit time for different time periods of the day (e.g., 8:00 to 8:30 a.m.).
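A minimal sketch of how such a server could aggregate the reported samples into per-segment, per-time-of-day averages is given below; the 30-minute bucket width, keying scheme, and method names are illustrative assumptions.

```python
from collections import defaultdict
from datetime import datetime

class TrafficHistory:
    """Aggregates vehicle-reported speed samples into historical averages keyed by
    (road segment, time-of-day bucket). Traffic flow per bucket is approximated as
    the number of reporting vehicles divided by the number of days observed."""

    def __init__(self, bucket_minutes: int = 30):
        self.bucket_minutes = bucket_minutes
        self._speed_sum = defaultdict(float)
        self._speed_count = defaultdict(int)
        self._vehicle_count = defaultdict(int)
        self._days_seen = defaultdict(set)

    def _key(self, segment_id: str, t: datetime):
        bucket = (t.hour * 60 + t.minute) // self.bucket_minutes
        return segment_id, bucket

    def add_sample(self, segment_id: str, t: datetime, speed_mps: float) -> None:
        k = self._key(segment_id, t)
        self._speed_sum[k] += speed_mps
        self._speed_count[k] += 1
        self._vehicle_count[k] += 1
        self._days_seen[k].add(t.date())

    def avg_speed(self, segment_id: str, t: datetime) -> float:
        k = self._key(segment_id, t)
        return self._speed_sum[k] / max(self._speed_count[k], 1)

    def avg_flow_per_bucket(self, segment_id: str, t: datetime) -> float:
        k = self._key(segment_id, t)
        return self._vehicle_count[k] / max(len(self._days_seen[k]), 1)

history = TrafficHistory()
history.add_sample("L2-1", datetime(2018, 5, 28, 8, 10), 4.0)
history.add_sample("L2-1", datetime(2018, 5, 29, 8, 20), 4.4)
print(history.avg_speed("L2-1", datetime(2018, 5, 30, 8, 10)))  # 4.2 m/s for the 8:00-8:30 bucket
```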
The virtual moving body setting unit 42 can therefore also acquire the historical average traffic flow per unit time of the road segment where the blind area is located from the remote server. According to some embodiments, the virtual moving body setting unit 42 is configured to assume a virtual moving body only when this historical average traffic flow per unit time is greater than a predetermined threshold. In these embodiments, the historical average traffic flow per unit time may be that of the current time period of the day (e.g., if the current time is 8:10 a.m., the current time period may be 8:00 to 8:30 a.m.).
The risk prediction unit 43 is configured to predict whether the vehicle 1 will collide with a vehicle emerging from the blind area. It does so by determining whether the vehicle 1 will collide with the virtual moving body. Specifically, the risk prediction unit 43 may calculate the time to collision TTC between the vehicle 1 and the virtual moving body based on the traveling states of both, and determine whether the vehicle 1 will collide with the virtual moving body by comparing the time to collision TTC with a predetermined time threshold. According to the embodiment of the present invention, the traveling state of the vehicle 1 is given by the position, speed, acceleration, direction of movement, and so on detected by the internal sensor 11, while the traveling state of the virtual moving body is the one set by the virtual moving body setting unit 42. When the time to collision TTC is less than the predetermined time threshold, the risk prediction unit 43 determines that the vehicle 1 may collide with the virtual moving body, and thereby predicts that the vehicle 1 is at risk of colliding with a vehicle emerging from the blind area.
According to other embodiments, the risk prediction unit 43 may set a plurality of different time thresholds and determine a plurality of collision risk levels according to how the time to collision TTC compares with each of them. In an exemplary embodiment, four different time thresholds T1, T2, T3, and T4 may be set, where T1 > T2 > T3 > T4. When T2 < TTC < T1, the risk prediction unit 43 determines that the vehicle 1 is at the first level, where the collision risk is low. Similarly, when T3 < TTC < T2, the risk prediction unit 43 determines that the vehicle 1 is at a second level, with a higher collision risk than the first level; when T4 < TTC < T3, it determines that the vehicle 1 is at a third level, with a still higher collision risk; and when TTC < T4, it determines that the vehicle 1 is at the fourth level, where the collision risk is extremely high.
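The TTC computation and the mapping to the four levels can be sketched as follows. The one-dimensional closing-speed model and the concrete threshold values are illustrative assumptions; the description above only requires T1 > T2 > T3 > T4.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Simplified TTC: time until the host vehicle and the virtual moving body
    meet at the conflict point, assuming both keep their current speed and course."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return gap_m / closing_speed_mps

def collision_risk_level(ttc_s: float,
                         t1: float = 8.0, t2: float = 5.0,
                         t3: float = 3.0, t4: float = 1.5) -> int:
    """Map the time to collision to the risk levels described above (T1 > T2 > T3 > T4);
    level 0 means no collision is predicted."""
    if ttc_s >= t1:
        return 0
    if ttc_s >= t2:
        return 1   # low collision risk
    if ttc_s >= t3:
        return 2
    if ttc_s >= t4:
        return 3
    return 4       # extremely high collision risk

print(collision_risk_level(time_to_collision(gap_m=20.0, closing_speed_mps=8.0)))  # TTC = 2.5 s -> level 3
```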
The driving assistance unit 44 is configured to output a driving assistance instruction based on the collision risk level determined by the risk prediction unit 43. The driving assistance instruction may be output to the output device 50, which warns the driver or assists the driver in driving the vehicle 1 by executing the received instruction. The driving assistance instruction may be one or more of: outputting a warning to the driver of the vehicle 1; outputting a warning to the external environment of the vehicle 1; decelerating the vehicle 1, driving it at a predetermined speed, or stopping it; and steering the vehicle 1.
The driving assistance unit 44 may output different driving assistance instructions, or combinations of them, for different collision risk levels. In the exemplary embodiment, when the collision risk level determined by the risk prediction unit 43 is the first level, the driving assistance unit 44 outputs a driving assistance instruction to warn the driver of the vehicle 1; a speaker, a display, or the like of the vehicle 1 receives the instruction and issues an audible or visual warning to the driver.
When the collision risk level determined by the risk prediction unit 43 is the second level, the driving assistance unit 44 outputs a driving assistance instruction to warn the external environment of the vehicle 1; the horn, the headlights, or the like of the vehicle 1 receive the instruction and sound the horn or flash the headlights, thereby warning a vehicle that may emerge from the blind area, for example a vehicle on road L2 in fig. 3 that is about to travel from the blind area B into the intersection C.
When the collision risk level determined by the risk prediction unit 43 is the third level, the driving assistance unit 44 outputs a driving assistance instruction to decelerate the vehicle 1 and steer it away from the virtual moving body (lateral avoidance); the braking and steering systems of the vehicle 1 receive the instruction and control the vehicle 1 to decelerate and steer. According to some embodiments, the deceleration value of the vehicle 1, the target speed after deceleration, the lateral avoidance distance, and the like are determined according to the time to collision TTC.
When the collision risk level determined by the risk prediction unit 43 is the fourth level, the time to collision TTC is very small (for example, less than 1 s), and the vehicle 1 is highly likely to collide with a vehicle emerging from the blind area. In this case, the driving assistance unit 44 outputs a driving assistance instruction to stop the vehicle 1 or to move it at a very low speed (such as 3 m/s); the power system, transmission system, brake system, and the like of the vehicle 1 receive the instruction and stop the vehicle 1 or control it to move at the target speed. Further, according to some embodiments, the risk prediction unit 43 continuously recalculates the time to collision TTC between the vehicle 1 and the virtual moving body at predetermined time intervals and compares it with T4. When the time to collision TTC becomes greater than T4, a new collision risk level is determined, and the driving assistance unit 44 outputs a new driving assistance instruction according to that level, so that the vehicle 1 travels according to the new instruction. Alternatively, while the vehicle 1 moves at a very low speed, the blind area determination unit 41 continuously monitors the change of the start boundary of the blind area; once a large portion of the blind area becomes detectable, the vehicle 1 can resume normal driving.
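The level-by-level behaviour above can be summarised in a small dispatch function; the action names, the creep speed of 3 m/s, and the way deceleration is scaled with TTC are illustrative assumptions.

```python
def driving_assistance_instruction(level: int, ttc_s: float) -> dict:
    """Map the determined collision risk level to a driving assistance instruction."""
    if level <= 0:
        return {"action": "none"}
    if level == 1:
        return {"action": "warn_driver"}          # audible/visual warning inside the cabin
    if level == 2:
        return {"action": "warn_environment"}     # horn or headlight flash toward the blind area
    if level == 3:
        # decelerate and steer away from the virtual moving body;
        # scale the commanded deceleration with the remaining time to collision
        return {"action": "decelerate_and_steer",
                "decel_mps2": min(4.0, 8.0 / max(ttc_s, 0.1))}
    return {"action": "stop_or_creep", "target_speed_mps": 3.0}  # level 4

print(driving_assistance_instruction(3, ttc_s=2.5))
```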
Thus, the driving assistance system according to the embodiment of the invention can provide a hazard warning and assist in driving the vehicle 1 when a blind area exists in the road ahead of the vehicle 1, avoiding the occurrence of an accident.
Next, a driving assistance method according to an embodiment of the invention will be described in detail. The driving assistance method according to the embodiment of the invention may be implemented using the driving assistance system of any of the embodiments described above. Fig. 4 shows a flowchart of a driving assistance method according to an embodiment of the invention.
The driving assistance method according to the embodiment of the invention may be activated manually by the driver. For example, when the driver sees that there is an obstruction in the road ahead, the method can be activated manually via a button mounted on the vehicle 1, a key on a touch panel, or the like. Alternatively, the driving assistance method may be started automatically when the vehicle 1 starts running.
As shown in fig. 4, after the driving assistance method is started, in step S10, it is determined whether a blind area exists in the road ahead of the vehicle 1. Specifically, map information stored in advance is acquired from the storage device 20, and the full road shape of the road ahead of the vehicle 1 is determined based on the map information. At the same time, sensor data is acquired from the external sensors 12 and the road conditions around the vehicle 1, in particular in front, are determined on the basis of the sensor data. By comparing the two, it is possible to judge whether or not a blind area exists in the road ahead of the vehicle 1. Further, the starting boundary of the blind area on the vehicle 1 side is determined based on the determined road condition around, particularly in front of, the vehicle 1.
When it is determined that there is a blind area in the road ahead of the vehicle 1, the method proceeds to step S20, in which a virtual moving body (e.g., the rider 2 in fig. 3) is assumed in the blind area and its traveling state is set. Specifically, the traveling state is set such that the virtual moving body is located at the start boundary of the blind area at the present time and moves toward the vehicle 1 at the historical average traffic speed of the road segment where the blind area is located. Taking the rider 2 in fig. 3 as an example, it may be assumed that the rider 2 is located at the start boundary B1 of the blind area B at the present time and travels toward the intersection C at the historical average traffic speed V of the segment L2-1 of road L2 on which the blind area B is located. In this step, the historical average traffic speed of the road segment where the blind area is located is acquired from an external device, such as a remote server. The historical average traffic speed may be that for the current time period of the day (e.g., if the current time is 8:10 a.m., the current time period may be 8:00 to 8:30 a.m.).
After step S20, the method proceeds to step S30. In step S30, it is predicted whether the vehicle 1 will collide with a vehicle that has exited from the blind area. Specifically, this step is implemented by determining whether the vehicle 1 will collide with the virtual moving body. More specifically, based on the traveling state of the vehicle 1 and the traveling state of the virtual moving body, the time to collision TTC of both can be calculated. Then, it is possible to determine whether the vehicle 1 will collide with the virtual moving body by comparing the time to collision TTC with a predetermined time threshold.
When the time to collision TTC is less than the predetermined time threshold, it is predicted that the vehicle 1 will collide with a vehicle emerging from the blind area. In this case, the method proceeds to step S40, in which a collision risk level is determined. The collision risk level may be determined by setting a plurality of time thresholds and comparing the time to collision TTC with each of them.
After step S40, the method proceeds to step S50. In step S50, a driving assistance instruction is output according to the determined collision risk level. Depending on the collision risk level, the driving assistance instruction may be one or more of the following: outputting a warning to the driver of the vehicle 1; outputting a warning to the external environment of the vehicle 1; decelerating the vehicle 1, driving it at a predetermined speed, or stopping it; and steering the vehicle 1.
In step S60, the output device 50 receives the driving assistance instruction and performs the corresponding operation. For example, a speaker, a display, or the like of the vehicle 1 receives an instruction to warn the driver of the vehicle 1 and issues an audible or visual warning. A horn, a headlamp, or the like of the vehicle 1 may receive an instruction to warn the external environment of the vehicle 1, and sound or flash to warn a vehicle that may emerge from the blind area. The transmission and brake systems of the vehicle 1 may receive an instruction to decelerate the vehicle 1, drive it at a predetermined speed, or stop it, and carry out that operation. The steering system of the vehicle 1 may receive an instruction to steer the vehicle 1 and steer it, for example, away from the virtual moving body to perform lateral avoidance.
When it is predicted in step S30 that the vehicle 1 will not collide with a vehicle emerging from the blind area (i.e., the time to collision TTC is greater than the predetermined time threshold), the method proceeds directly to step S50, where a driving assistance instruction is output that includes outputting a warning to the driver of the vehicle 1 and/or outputting a warning to the environment outside the vehicle 1.
Thus, according to the driving assistance method of the embodiment of the invention, it is possible to provide a hazard warning and assist the driving of the vehicle in the case where there is a blind area in the road ahead of the vehicle 1, avoiding the occurrence of an accident.
In the above embodiment, when it is determined that there is a blind area in the road ahead of the vehicle 1, the method assumes a virtual moving body in the blind area and sets its traveling state. According to other embodiments of the present invention, however, upon determining that there is a blind area in the road ahead of the vehicle 1 (step S10), the method proceeds to step S70, in which it is determined whether the historical average traffic flow per unit time of the road segment where the blind area is located is greater than a predetermined threshold. When the historical average traffic flow per unit time is greater than the predetermined threshold, the method proceeds to step S20. When it is less than the predetermined threshold, the risk of the vehicle 1 colliding with a vehicle emerging from the blind area is considered very low, and the method may proceed directly to step S50, where a driving assistance instruction is output that includes outputting a warning to the driver of the vehicle 1 and/or outputting a warning to the environment outside the vehicle 1.
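One pass of the overall flow of fig. 4, including the optional traffic-flow check of step S70, can be sketched as below. The `system` object and its method names are hypothetical stand-ins for the units described above, not an interface defined by the patent.

```python
def driving_assistance_step(system) -> None:
    """One iteration of the method of fig. 4 (steps S10-S70)."""
    blind_area = system.detect_blind_area()                             # S10
    if blind_area is None:
        return
    # Optional branch: skip the virtual moving body if the segment is historically almost empty.
    if system.historical_flow(blind_area) <= system.flow_threshold:     # S70
        system.output_instruction({"action": "warn_driver"})            # S50/S60
        return
    body = system.assume_virtual_body(blind_area)                       # S20
    ttc = system.time_to_collision(body)                                # S30
    if ttc < system.max_time_threshold:
        level = system.collision_risk_level(ttc)                        # S40
        system.output_instruction(system.instruction_for(level, ttc))   # S50/S60
    else:
        system.output_instruction({"action": "warn_driver"})            # S50/S60
```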
While the invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the constructions and methods of the embodiments described above. On the contrary, the invention is intended to cover various modifications and equivalent arrangements. In addition, while the various elements and method steps of the disclosed invention are shown in various example combinations and configurations, other combinations, including more, fewer, or all of the elements or method steps, are also within the scope of the invention.

Claims (21)

1. A driving assistance system that is mounted on or applied to a host vehicle and includes:
a blind area determination unit configured to determine whether a blind area exists in a road ahead of the host vehicle, the blind area being a road area that cannot be detected by a sensor of the host vehicle due to the presence of an obstruction;
a virtual moving body setting unit configured to assume a virtual moving body in the blind area if the blind area exists, and set a traveling state of the virtual moving body; and
a risk prediction unit configured to determine whether the host vehicle will collide with the virtual moving body to predict whether the host vehicle will collide with a vehicle emerging from the blind area,
wherein the blind area determination unit is configured to determine whether the blind area exists and determine a start boundary of the blind area on the host vehicle side based on pre-stored map information and sensor data of the sensor of the host vehicle, and
the blind area determination unit is configured to determine whether a blind area exists in the road ahead of the host vehicle by comparing a complete road shape of the road ahead of the host vehicle determined based on the map information with a road condition ahead of the host vehicle determined based on the sensor data.
2. The driving assistance system according to claim 1,
the virtual moving body is set to travel at the historical average traffic speed of the road segment where the blind area is located.
3. The driving assistance system according to claim 2,
the historical average traffic speed is a historical average traffic speed for a current time period of the day.
4. The driving assistance system according to claim 1,
the virtual moving body setting unit is configured to assume the virtual moving body only when the historical average traffic flow per unit time of the road segment where the blind area is located is greater than a predetermined threshold.
5. The driving assistance system according to claim 4, wherein,
the historical average traffic flow per unit time is the historical average traffic flow per unit time for the current time period of the day.
6. The driving assistance system according to claim 2,
the virtual moving body setting unit is configured to acquire the historical average traffic speed of the road segment where the blind area is located from an external device.
7. The driving assistance system according to claim 4, wherein,
the virtual moving body setting unit is configured to acquire the historical average traffic flow per unit time of the road segment where the blind area is located from an external device.
8. The driving assistance system according to claim 6 or 7, wherein,
the external device is configured to acquire, from a number of vehicles within a road network over a predetermined past period, their position, speed and direction-of-movement data as well as sensor data comprising data on the position, speed and direction of movement of traffic participants around the respective vehicle, and
the external device is configured to calculate the historical average traffic speed and the historical average traffic flow per unit time of the road segment where the blind area is located based on the acquired data.
9. The driving assistance system according to claim 1,
the risk prediction unit is configured to calculate a collision time of the host vehicle and the virtual moving body based on a traveling state of the host vehicle and a traveling state of the virtual moving body, and determine a collision risk level based on the collision time.
10. The driving assistance system according to claim 9, further comprising:
a driving assistance unit configured to output a driving assistance instruction based on the determined collision risk level, the driving assistance instruction including at least one of:
outputting a warning to a driver of the host vehicle;
outputting a warning to an external environment of the host vehicle;
decelerating the host vehicle, driving it at a predetermined speed, or stopping it; and
steering the host vehicle.
11. A vehicle on which the driving assistance system according to any one of claims 1 to 10 is mounted or applied.
12. A driving assistance method comprising the steps of:
determining whether a blind area exists in a road ahead of the host vehicle, wherein the blind area is a road area that cannot be detected by a sensor of the host vehicle due to the presence of an obstruction;
assuming a virtual moving body in the blind area in the case where the blind area exists, and setting a traveling state of the virtual moving body; and
determining whether the host vehicle will collide with the virtual moving body to predict whether the host vehicle will collide with a vehicle emerging from the blind area,
wherein whether the blind area exists and a start boundary of the blind area are determined based on pre-stored map information and sensor data of the sensor of the host vehicle, and
whether a blind area exists in the road ahead of the host vehicle is determined by comparing a complete road shape of the road ahead of the host vehicle determined based on the map information with a road condition ahead of the host vehicle determined based on the sensor data.
13. The driving assistance method according to claim 12, wherein,
the traveling state of the virtual moving body is set such that the virtual moving body travels at the historical average traffic speed of the road segment where the blind area is located.
14. The driving assistance method according to claim 13, wherein,
the historical average traffic speed is a historical average traffic speed for a current time period of the day.
15. The driving assistance method according to claim 12, wherein,
the virtual moving body is assumed only when the historical average traffic flow per unit time of the road segment where the blind area is located is greater than a predetermined value.
16. The driving assistance method according to claim 15, wherein,
the historical average traffic flow per unit time is the historical average traffic flow per unit time for the current time period of the day.
17. The driving assistance method according to claim 13, further comprising:
acquiring the historical average traffic speed of the road segment where the blind area is located from an external device.
18. The driving assistance method according to claim 15, further comprising:
acquiring the historical average traffic flow per unit time of the road segment where the blind area is located from an external device.
19. The driving assistance method according to claim 17 or 18, wherein,
the external device is configured to acquire, from a number of vehicles within a road network over a predetermined past period, their position, speed and direction data as well as sensor data comprising data on the position, speed and direction of traffic participants around the respective vehicle, and
the external device is configured to calculate the historical average traffic speed and the historical average traffic flow per unit time of the road segment where the blind area is located based on the acquired data.
20. The driving assistance method according to claim 12, further comprising:
the collision time of the host vehicle and the virtual mobile body is calculated based on the traveling state of the host vehicle and the traveling state of the virtual mobile body, and a collision risk level is determined based on the collision time.
21. The driving assistance method according to claim 20, further comprising:
outputting a driving assistance instruction based on the determined collision risk level, the driving assistance instruction including at least one of:
outputting a warning to a driver of the host vehicle;
outputting a warning to an external environment of the host vehicle;
decelerating the host vehicle, driving it at a predetermined speed, or stopping it; and
steering the host vehicle.
CN201810539552.8A 2018-05-30 2018-05-30 Driving assistance system and method Active CN110550105B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810539552.8A CN110550105B (en) 2018-05-30 2018-05-30 Driving assistance system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810539552.8A CN110550105B (en) 2018-05-30 2018-05-30 Driving assistance system and method

Publications (2)

Publication Number Publication Date
CN110550105A CN110550105A (en) 2019-12-10
CN110550105B (en) 2022-04-15

Family

ID=68733717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810539552.8A Active CN110550105B (en) 2018-05-30 2018-05-30 Driving assistance system and method

Country Status (1)

Country Link
CN (1) CN110550105B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113147751A (en) * 2020-01-06 2021-07-23 奥迪股份公司 Driving assistance system, method and readable storage medium for vehicle
CN111572538B (en) * 2020-04-27 2023-11-07 腾讯科技(深圳)有限公司 Method and device for determining vehicle collision early warning threshold
JP7268669B2 (en) * 2020-11-30 2023-05-08 トヨタ自動車株式会社 alert system
CN112526994A (en) * 2020-12-01 2021-03-19 广州小鹏自动驾驶科技有限公司 Data processing method and device
CN116184992A (en) * 2021-11-29 2023-05-30 上海商汤临港智能科技有限公司 Vehicle control method, device, electronic equipment and storage medium
CN115134491B (en) * 2022-05-27 2023-11-24 深圳市有方科技股份有限公司 Image processing method and device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006260217A (en) * 2005-03-17 2006-09-28 Advics:Kk Traveling support device for vehicle
JP2008293099A (en) * 2007-05-22 2008-12-04 Mazda Motor Corp Driving support device for vehicle
CN103703496A (en) * 2011-08-10 2014-04-02 丰田自动车株式会社 Driving assistance device
CN103748622A (en) * 2011-08-10 2014-04-23 丰田自动车株式会社 Driving assistance device
CN104169136A (en) * 2012-03-15 2014-11-26 丰田自动车株式会社 Drive assist device
WO2017077598A1 (en) * 2015-11-04 2017-05-11 日産自動車株式会社 Autonomous vehicle operation apparatus and autonomous vehicle operation method
CN109969191A (en) * 2017-12-28 2019-07-05 奥迪股份公司 Driving assistance system and method

Also Published As

Publication number Publication date
CN110550105A (en) 2019-12-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant