JP6479130B1 - Vehicle travel support device - Google Patents

Vehicle travel support device

Info

Publication number
JP6479130B1
Authority
JP
Japan
Prior art keywords
obstacle
position information
vehicle
position
reliability
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2017197372A
Other languages
Japanese (ja)
Other versions
JP2019070986A (en)
Inventor
考平 森
宏樹 藤好
哲司 羽下
拓人 矢野
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to JP2017197372A
Application granted
Publication of JP6479130B1
Publication of JP2019070986A
Application status: Active

Abstract

An object of the present invention is to provide a vehicular travel support apparatus capable of reliably detecting an obstacle.
The apparatus includes a first obstacle detection unit that outputs obstacle detection position information, a second obstacle detection unit that outputs obstacle size information, a storage unit that stores obstacle position information, an estimated obstacle position calculation unit that calculates estimated obstacle position information, a host vehicle route calculation unit, an obstacle determination unit that determines whether the obstacle will contact the vehicle, a collision time calculation unit that calculates the time until the obstacle collides with the vehicle, and a target deceleration calculation unit that calculates a target deceleration based on the collision time. The second obstacle detection unit inputs the obstacle size information to the first obstacle detection unit. When a plurality of pieces of obstacle detection position information are detected, the first obstacle detection unit uses the obstacle size information to determine whether they belong to a single obstacle; if they do, it outputs the center coordinates of the plurality of pieces of obstacle detection position information as the obstacle detection position information.
[Selected drawing] FIG. 5

Description

  The present invention relates to a vehicle travel support apparatus that assists the driver, and more particularly to a vehicle travel support apparatus that detects an obstacle on the travel route of the host vehicle and warns the driver.

  As a preventive safety system for avoiding automobile accidents, Patent Document 1 discloses a technique for preventing a collision with a preceding vehicle by maintaining an appropriate distance between the host vehicle and the preceding vehicle; in Patent Document 1 the preceding vehicle is detected from a stereo image. Patent Document 2 discloses a technique in which a sonar sensor transmits an ultrasonic wave in the traveling direction of the vehicle and the position of an obstacle is detected from the time difference between transmission of the ultrasonic wave and reception of its reflection from the obstacle.

Patent Document 1: Japanese Patent No. 3805832
Patent Document 2: Japanese Patent No. 3788109

  When obstacles are detected using a plurality of sonar sensors, recognition accuracy deteriorates for obstacles with complicated shapes, and in some cases such obstacles cannot be detected at all.

  The present invention has been made to solve the above-described problems, and an object of the present invention is to provide a vehicular travel support apparatus capable of reliably detecting an obstacle.

The vehicle travel support apparatus according to the present invention comprises: a first obstacle detection unit that detects an obstacle around the vehicle with a plurality of sonar sensors mounted on the vehicle and outputs the detected position as obstacle detection position information; a second obstacle detection unit that captures an image of the obstacle with an imaging device, detects the size and approximate position of the obstacle from the image, and outputs them as obstacle size information; a storage unit that periodically determines the relative position of the obstacle with respect to the vehicle from the obstacle detection position information and stores the relative position determined one cycle before as obstacle position information; an estimated obstacle position calculation unit that calculates, from the obstacle position information stored in the storage unit, estimated obstacle position information consisting of the estimated obstacle position, i.e. the position where the obstacle is estimated to be after the movement of the vehicle, and an obstacle reliability that qualifies the estimated obstacle position; a host vehicle route calculation unit that calculates the route along which the vehicle travels; an obstacle determination unit that determines that the obstacle will contact the vehicle when the obstacle reliability exceeds a predetermined threshold and the obstacle position information indicates that the obstacle lies on the route of the vehicle, and outputs the obstacle contact determination result together with the obstacle reliability and the obstacle position information; a collision time calculation unit that calculates, from the obstacle reliability, the obstacle position information, and the obstacle contact determination result, a collision time, i.e. the expected time until the obstacle collides with the vehicle; and a target deceleration calculation unit that calculates, from the collision time, a target deceleration for decelerating the vehicle. The apparatus controls a braking device that brakes the vehicle on the basis of the target deceleration. The second obstacle detection unit inputs the obstacle size information to the first obstacle detection unit, and when a plurality of pieces of obstacle detection position information are detected, the first obstacle detection unit uses the obstacle size information detected by the second obstacle detection unit to determine whether the plurality of pieces of obstacle detection position information belong to a single obstacle; if they do, the first obstacle detection unit outputs the center coordinates of the plurality of pieces of obstacle detection position information as the obstacle detection position information.

  According to the vehicle travel support device, an obstacle can be reliably detected even for an obstacle having a complicated shape.

<Brief description of the drawings>
FIG. 1 is a diagram explaining a method of identifying the position of an obstacle using sonar sensors.
FIG. 2 is a diagram explaining a method of identifying the position of an obstacle using sonar sensors.
FIG. 3 is a diagram explaining a method of identifying the position of an obstacle using sonar sensors.
FIG. 4 is a schematic diagram showing the configuration of a vehicle equipped with the vehicle travel support apparatus according to the present invention.
FIG. 5 is a functional block diagram showing the configuration of the vehicle travel support apparatus of Embodiment 1 according to the present invention.
FIG. 6 is a diagram explaining the case where an obstacle with a complicated shape is detected by a plurality of sonar sensors.
FIG. 7 is a diagram explaining the case where an obstacle with a complicated shape is detected using a camera.
FIG. 8 is a flowchart showing the operation of the vehicle travel support apparatus of Embodiment 1 according to the present invention.
FIG. 9 is a flowchart showing the operation of the vehicle travel support apparatus of Embodiment 1 according to the present invention.
FIG. 10 is a diagram showing the positional relationship between the vehicle and the obstacle position information one cycle before.
FIG. 11 is a diagram showing the positional relationship between the vehicle and the obstacle position information at the present time, with the vehicle position one cycle before as the reference.
FIG. 12 is a diagram showing the positional relationship between the vehicle and the obstacle position information at the present time, with the current vehicle position as the reference.
FIG. 13 is a diagram showing the path of the vehicle when traveling straight.
FIG. 14 is a diagram showing the path of the vehicle when turning.
FIG. 15 is a diagram showing the positional relationship of collision obstacles and non-collision obstacles with respect to the vehicle when traveling straight and when turning.
FIG. 16 is a diagram showing the relationship between the shortest collision time and the target deceleration.
FIG. 17 is a functional block diagram showing the configuration of the vehicle travel support apparatus of Embodiment 2 according to the present invention.
FIG. 18 is a functional block diagram showing the configuration of the vehicle travel support apparatus of Embodiment 3 according to the present invention.
FIG. 19 is a flowchart showing the operation of the vehicle travel support apparatus of Embodiment 3 according to the present invention.
FIG. 20 is a flowchart showing the operation of the vehicle travel support apparatus of Embodiment 3 according to the present invention.
FIG. 21 is a flowchart showing the operation of the vehicle travel support apparatus of Embodiment 3 according to the present invention.
FIG. 22 is a flowchart showing a modification of the operation of the vehicle travel support apparatus of Embodiment 3 according to the present invention.
FIG. 23 is a flowchart showing a modification of the operation of the vehicle travel support apparatus of Embodiment 3 according to the present invention.
FIG. 24 is a flowchart showing a modification of the operation of the vehicle travel support apparatus of Embodiment 3 according to the present invention.
FIG. 25 is a functional block diagram showing the configuration of the vehicle travel support apparatus of Embodiment 4 according to the present invention.
FIG. 26 is a flowchart showing the operation of the vehicle travel support apparatus of Embodiment 4 according to the present invention.
FIG. 27 is a flowchart showing the operation of the vehicle travel support apparatus of Embodiment 4 according to the present invention.
FIG. 28 is a diagram showing the relationship between obstacle recognition information and an obstacle control determination threshold value.

<Introduction>
Before describing the embodiments, a method for identifying the position of an obstacle using sonar sensors will be explained with reference to FIGS. 1 to 3. When an ultrasonic wave SW is transmitted toward an obstacle OB using the plurality of sonar sensors SS1 to SS4 mounted on the vehicle VC as shown in FIG. 1, the reflected wave may be detected by the transmitting sensor itself or by a sensor other than the transmitting sensor. The former case is called direct reception and the latter indirect reception. In direct reception, for example, the ultrasonic wave SW transmitted from the sonar sensor SS3 is reflected by the obstacle OB and received by the sonar sensor SS3 itself. In this case the distance between the sonar sensor SS3 and the obstacle OB is obtained by multiplying the time from transmission to reception of the ultrasonic wave by the speed of sound and halving the result, since the measured time covers both the forward and the return path.

  In indirect reception, on the other hand, the ultrasonic wave SW transmitted from the sonar sensor SS3 is reflected by the obstacle OB and received, for example, by the sonar sensor SS2. Because the transmitting sensor and the receiving sensor differ, the distance to the obstacle OB cannot be obtained simply by multiplying the elapsed time by the speed of sound and halving it: the elapsed time corresponds to the path from the transmitting sensor to the obstacle plus the path from the obstacle to the receiving sensor. The distance between the receiving sensor and the obstacle OB is therefore obtained by subtracting the distance obtained by direct reception from this total path length.
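
A minimal sketch of the two distance computations (illustrative only; the speed-of-sound constant and the function names are assumptions, not taken from the patent):

```python
SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degC; an assumed constant

def direct_distance(time_of_flight_s: float) -> float:
    """Distance from the transmitting sonar sensor to the obstacle.

    The measured time covers the forward and return path, so the
    round-trip distance is halved.
    """
    return time_of_flight_s * SPEED_OF_SOUND / 2.0

def indirect_distance(time_of_flight_s: float, direct_dist_m: float) -> float:
    """Distance from a different (receiving) sensor to the obstacle.

    For indirect reception the measured time corresponds to
    (transmitter -> obstacle) + (obstacle -> receiver), so the
    directly measured transmitter-obstacle distance is subtracted.
    """
    return time_of_flight_s * SPEED_OF_SOUND - direct_dist_m
```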

  Indirect reception can also occur at the sonar sensors SS1 and SS4 in addition to SS2, so a plurality of distances to the obstacle OB are measured for a single transmission. Once the distance from each sonar sensor to the obstacle OB is known, an arc can be drawn around the position of each sensor. Arcs centered on different sensor positions intersect, and the positions of these so-called two-circle intersections give the obstacle positions.
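
The two-circle intersection itself can be computed with the standard construction below (an illustrative Python sketch, not code from the patent):

```python
import math

def two_circle_intersections(p0, r0, p1, r1):
    """Intersection points of two circles centered on sonar positions p0, p1
    with measured ranges r0, r1. Returns 0, 1, or 2 (x, y) points."""
    (x0, y0), (x1, y1) = p0, p1
    d = math.hypot(x1 - x0, y1 - y0)
    if d == 0 or d > r0 + r1 or d < abs(r0 - r1):
        return []                         # circles do not intersect
    a = (r0**2 - r1**2 + d**2) / (2 * d)  # distance from p0 to the chord midpoint
    h = math.sqrt(max(r0**2 - a**2, 0.0))
    xm = x0 + a * (x1 - x0) / d           # chord midpoint
    ym = y0 + a * (y1 - y0) / d
    if h == 0.0:
        return [(xm, ym)]                 # circles touch at a single point
    ox = h * (y1 - y0) / d                # offset perpendicular to the center line
    oy = h * (x1 - x0) / d
    return [(xm + ox, ym - oy), (xm - ox, ym + oy)]
```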

  When the obstacle OB is small, the obstacle positions obtained from the two-circle intersections cluster in a relatively compact area and are recognized as a single obstacle, as indicated by the crosses in FIG. 2. In the case of an obstacle OBL with a complicated shape as shown in FIG. 3, however, the positions obtained from the two-circle intersections are scattered over a wide range, as indicated by the crosses in FIG. 3, and depending on the situation several groups of intersections may appear. This tends to occur when the shape of the obstacle is complicated and its reflecting surfaces form a polyhedron, or when parts of the obstacle are made of a material that absorbs sound waves so that the reflectance of the sound wave is not uniform over the obstacle. In such cases the sonar sensors may recognize each intersection group as a separate obstacle, that is, recognize a plurality of different obstacles.

  Furthermore, with a complicated obstacle OBL as shown in FIG. 3, when the vehicle VC moves, the surface from which the ultrasonic waves are reflected changes as the relative position of the sonar sensors and the obstacle OBL changes. Two-circle intersections that were detected on one surface may disappear while new two-circle intersections appear on a different surface, forming a new intersection group. If tracking processing is performed based on the movement of the vehicle VC and a reliability is calculated from the correlation between the tracking result and the detected positions, the reliability may then drop suddenly, and it may become difficult to secure the reliability required for control. In such a situation several additional detections are needed to recover the required reliability, and as a result the obstacle recognition timing may be delayed.

<Vehicle configuration>
FIG. 4 is a schematic diagram showing the configuration of a vehicle VC equipped with the vehicle travel support apparatus according to the present invention. The vehicle VC includes sonar sensors 2, cameras 3 (imaging devices), brakes 4, and the vehicle travel support device 1. A plurality of sonar sensors 2 are installed at the front and rear of the vehicle and are connected to a sonar controller 9 via sonar sensor wiring 8. In FIG. 4, four sonar sensors 2 are arranged at each of the front and rear of the vehicle, but sonar sensors may also be arranged on the left and right sides, and two or three per side may suffice as long as the measurement area required of the sonar sensor group is covered, depending on the size of the vehicle VC.

  A plurality of cameras 3 are installed at the front, rear, left, and right of the vehicle and are connected to the periphery monitoring camera controller 10 via camera wiring 7. In FIG. 4, one camera 3 is arranged on each of the front, rear, left, and right sides of the vehicle, but the present invention is not limited to this; additional cameras may be attached. In FIG. 4, the left and right cameras 3 are installed at the lower part of the door mirrors, and the front and rear cameras are installed at the center of the respective bumpers, but the present invention is not limited to this arrangement either.

  In addition to the sonar controller 9 and the periphery monitoring camera controller 10, the vehicle travel support device 1 includes other sensors 11, an arithmetic device 12, and a brake control device 13, which are connected to one another via a communication network 5 such as a CAN (Controller Area Network).

  The brake control device 13 is connected via hydraulic piping 6 to the brakes 4 installed on each wheel, and the brakes 4 apply braking force to the vehicle VC according to commands from the brake control device 13. FIG. 4 shows a hydraulic brake system composed of the brakes 4, the brake control device 13, and the hydraulic piping 6, but the present invention is not limited to this configuration. For example, in an electric vehicle (EV) driven by a motor, or in an HEV (Hybrid Electric Vehicle) or PHEV (Plug-in Hybrid Electric Vehicle) driven by an engine and a motor, regenerative deceleration of the motor may be used for braking.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the figures, identical or equivalent members and parts are given the same reference numerals. In the following, vehicle travel support apparatuses 100 to 400 are presented as Embodiments 1 to 4; they correspond to the vehicle travel support apparatus 1 shown in FIG. 4.

<Embodiment 1>
<Device configuration>
FIG. 5 is a functional block diagram showing the configuration of the vehicle travel support apparatus 100 according to the first embodiment of the present invention. As illustrated in FIG. 5, the vehicle travel support apparatus 100 includes a first obstacle detection unit 101, a second obstacle detection unit 102, a vehicle state detection unit 103, a vehicle motion calculation unit 104, and an estimated obstacle position calculation unit. 105, obstacle reliability calculation unit 106, obstacle position information correction unit 107, obstacle storage determination unit 108, obstacle storage unit 109, own vehicle route calculation unit 110, obstacle determination unit 111, collision time calculation unit 112, A target deceleration calculation unit 113 and a braking device 114 are provided.

  As shown in FIG. 5, the output of the second obstacle detection unit 102 is input to the first obstacle detection unit 101, and the output of the first obstacle detection unit 101 is input to the obstacle reliability calculation unit 106, the obstacle position information correction unit 107, and the obstacle storage determination unit 108. The output of the obstacle reliability calculation unit 106 is input to the obstacle position information correction unit 107. The output of the obstacle position information correction unit 107 is input to the obstacle storage determination unit 108 and the obstacle determination unit 111. The output of the obstacle storage determination unit 108 is input to the obstacle storage unit 109, and the output of the obstacle storage unit 109 is input to the estimated obstacle position calculation unit 105.

  The output of the vehicle state detection unit 103 is input to the vehicle motion calculation unit 104, the obstacle storage determination unit 108, the own vehicle route calculation unit 110, and the collision time calculation unit 112. The output of the vehicle motion calculation unit 104 is input to the estimated obstacle position calculation unit 105, and the output of the estimated obstacle position calculation unit 105 is input to the obstacle reliability calculation unit 106 and the obstacle position information correction unit 107.

  Further, the output of the vehicle route calculation unit 110 is input to the obstacle determination unit 111, and the output of the obstacle determination unit 111 is input to the collision time calculation unit 112. The output of the collision time calculation unit 112 is input to the target deceleration calculation unit 113, and the output of the target deceleration calculation unit 113 is input to the braking device 114.

  The first obstacle detection unit 101 consists of the plurality of sonar sensors 2 and the sonar controller 9 shown in FIG. 4, together with the sonar sensor wiring 8 that connects them. As described with reference to FIG. 1, the position of an obstacle is detected with the plurality of sonar sensors 2 by the method of computing two-circle intersections from the distances measured by direct and indirect reception.

  The second obstacle detection unit 102 includes the plurality of cameras 3 and the peripheral monitoring camera controller 10 shown in FIG. 4 and the camera wiring 7 that connects them. The camera 3 can acquire the size and approximate position of the obstacle.

  Here, the difference between the obstacle position detected by the first obstacle detection unit 101 and that detected by the second obstacle detection unit 102 will be described. The position detection by the sonar sensors 2 used in the first obstacle detection unit 101 is based on two-circle intersections obtained from the plurality of sonar sensors 2, and the distance between a sensor forming a two-circle intersection and the obstacle can be measured with an accuracy of a few centimeters or better. For an obstacle with high reflectivity, and assuming constant air temperature and humidity, the only factor affecting the measured distance between the sonar sensor 2 and the obstacle is the time from transmission to reception of the ultrasonic wave; the resolution of the measured distance is proportional to the resolution of the measured time and is determined by the clock of the sonar controller 9. The clock of a typical present-day microcomputer runs at several MHz or more, so for a small obstacle OB as shown in FIG. 2 there is no problem at all in measuring distances of a few centimeters with the sonar sensor 2. However, in position detection with the sonar sensors 2, for an obstacle OBL with a complicated shape as shown in FIG. 3, the two-circle intersections obtained by the plurality of sensors may be scattered. This is because each intersection indicates the position of a surface of the obstacle almost exactly, while the reflection intensity varies from one reflecting surface of the obstacle to another.

  FIG. 6 illustrates a case in which a bicycle is detected as an obstacle OBL by the plurality of sonar sensors 2. For such an obstacle OBL, the two-circle intersections form several groups, labeled A to C, and if the point clouds are processed naively, each point cloud is recognized as a separate obstacle.

  On the other hand, in the position detection by the camera 3 used in the second obstacle detection unit 102, the periphery monitoring camera controller 10 recognizes the obstacle from the image captured by the camera 3 and obtains its position. Various obstacle recognition methods exist. For example, several images may be captured at a fixed interval and the difference between them taken to identify where an obstacle exists and to treat the size of the differing region as the size of the obstacle. There is also a method of recognizing an obstacle from the difference between surrounding colors, for example between the road surface color and the obstacle color. In recent years, methods have been developed that perform machine learning with a multilayer neural network in advance and recognize obstacles using the learned result, i.e. obstacle recognition based on so-called deep learning. Since these well-known obstacle recognition methods can be used in the present invention, their detailed description is omitted.

  A characteristic of obstacle recognition using a camera is that the size and extent of an obstacle can be recognized, but it is difficult to obtain accurate position information, in particular the distance from the camera in the front-rear direction. The reason is that cameras used for obstacle recognition mainly use wide-angle lenses in order to monitor the surroundings, so lens distortion is large except near the center of the field of view and few image-sensor elements cover the periphery. Depending on the position of the obstacle, the spacing between image-sensor elements then corresponds to several tens of centimeters up to about 100 cm in real space, and the position accuracy degrades when the obstacle lies in such a region.

  Therefore, in the present invention, the position information measured by the second obstacle detection unit 102 is referred to as approximate position information, in the sense that its accuracy is lower than that of the position information measured by the first obstacle detection unit 101. Moreover, unlike the sonar sensor 2, measurement with the camera 3 captures images formed by reflected natural light or artificial illumination and therefore depends strongly on the environment; in particular, the camera is difficult to use as an obstacle recognition sensor at night.

  Since the sensors that produce the obstacle position information of the first obstacle detection unit 101 and the approximate position information of the second obstacle detection unit 102 are fixed to the vehicle, their measurements move with the vehicle. The position information output by each sensor is therefore the position of the obstacle relative to the vehicle. In Embodiment 1, the obstacle position is handled as a relative position with respect to the vehicle, and the vehicle is treated as a fixed point; even if an obstacle is fixed on the ground, the movement of the vehicle makes the obstacle position information approach or move away from the vehicle.

  The obstacle size and approximate position detected by the second obstacle detection unit 102 are input to the first obstacle detection unit 101 as obstacle size information. The first obstacle detection unit 101 compares the plurality of pieces of obstacle detection position information it has detected with the obstacle size information from the second obstacle detection unit 102, and if it determines that they belong to a single obstacle, it outputs the center coordinates of the plurality of pieces of obstacle detection position information as new obstacle detection position information.

  Specifically, based on the intersection groups A to C shown in FIG. 6 detected by itself and the approximate position and obstacle size information detected by the second obstacle detection unit 102, the first obstacle detection unit 101 extracts from the intersection groups A to C the points that fall within the range indicated by the obstacle size information, and outputs the center coordinates of these intersections as the obstacle detection position information. FIG. 7 is a conceptual diagram of this process, showing the obstacle size information OBX detected by the camera 3 of the second obstacle detection unit 102 and the center coordinates CC of the intersection group.
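
A minimal sketch of this merging step, assuming the camera's obstacle size information can be treated as an axis-aligned bounding box (the box representation and the function name are assumptions, not taken from the patent):

```python
def merge_detections(sonar_points, obstacle_box):
    """Merge sonar two-circle intersection points that fall inside the
    camera-reported obstacle extent (assumed here to be a box
    (xmin, ymin, xmax, ymax)) and return their center coordinates as a
    single obstacle detection position."""
    xmin, ymin, xmax, ymax = obstacle_box
    inside = [(x, y) for (x, y) in sonar_points
              if xmin <= x <= xmax and ymin <= y <= ymax]
    if not inside:
        return None  # no sonar detection corresponds to the camera obstacle
    cx = sum(x for x, _ in inside) / len(inside)
    cy = sum(y for _, y in inside) / len(inside)
    return (cx, cy)
```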

  Note that even if the second obstacle detection unit 102 outputs obstacle size information, the first obstacle detection unit 101 does not output obstacle detection position information if it has no corresponding detection of its own. Conversely, when no obstacle size information corresponding to an obstacle detected by the first obstacle detection unit 101 is input from the second obstacle detection unit 102, the first obstacle detection unit 101 outputs its own detections as the obstacle detection position information. In the case shown in FIG. 6, this means that even though there is actually a single obstacle, three obstacles are reported, corresponding to the three intersection groups A to C measured by the plurality of sonar sensors 2.

  The vehicle state detection unit 103 is a set of sensors that detect the vehicle state of the vehicle VC and corresponds to the other sensors 11 shown in FIG. 4. Examples of the vehicle state quantities detected by the vehicle state detection unit 103 are vehicle speed, steering wheel angle, shift information, brake information, and yaw rate. In addition, information such as longitudinal acceleration, lateral acceleration, accelerator pedal opening, and engine speed is needed to determine slopes and the like when emergency braking is actually performed with high accuracy, but these are not elements that affect the operation and effects of the present invention, so their description is omitted.

  Among the functional blocks shown in FIG. 5, the vehicle motion calculation unit 104, the estimated obstacle position calculation unit 105, the obstacle reliability calculation unit 106, the obstacle position information correction unit 107, the obstacle storage determination unit 108, the obstacle storage unit 109, the own vehicle route calculation unit 110, the obstacle determination unit 111, the collision time calculation unit 112, and the target deceleration calculation unit 113 are realized by the arithmetic device 12 shown in FIG. 4. The arithmetic device 12 has a memory (not shown) that stores the signals input to and output from it, intermediate values of computations, and the recorded values of the obstacle storage unit, and each functional block performs its processing based on the information in this memory.

<Operation>
The operation of the functional blocks realized by the arithmetic device 12 described above will be described using the flowcharts shown in FIGS. Note that the symbols (A) to (C) in FIG. 8 and the symbols (A) to (C) in FIG. 9 are connected to each other. Further, the processing of the flowcharts shown in FIGS. 8 and 9 is repeated in the arithmetic unit 12 at a cycle of 10 to 20 msec. Hereinafter, this cycle is referred to as a calculation cycle.

  As shown in FIGS. 8 and 9, in the vehicle travel support apparatus 100 the obstacle storage determination unit 108 first determines, based on the vehicle state information output from the vehicle state detection unit 103, whether the condition for erasing all obstacle detection position information of one cycle before stored in the obstacle storage unit 109 is satisfied (step S101). Examples of this condition are: the shift has been switched by the driver of the vehicle VC in order to change between forward and reverse travel; the vehicle speed has exceeded a certain speed; or a certain time has elapsed since the vehicle VC stopped.

  The reason for clearing the obstacle storage unit 109 on a shift change is to cope with the case where, because the moving direction of the vehicle VC changes, the target obstacle requiring emergency brake control changes from one in front of the vehicle VC to one behind it or vice versa. The reason for clearing the obstacle storage unit 109 when the vehicle speed exceeds a certain speed is to cope with the case where obstacle measurement by the first obstacle detection unit 101 (the sonar sensors) becomes difficult. The reason for clearing the obstacle storage unit 109 when a certain time has elapsed since the vehicle VC stopped is to cope with the case where the obstacle is not fixed on the ground but is, for example, a person or another vehicle, and moves away while the vehicle is stopped. The conditions for clearing the obstacle storage unit 109 are not limited to these; other conditions may be added and existing ones removed.

  If it is determined in step S101 that all obstacle detection position information of the previous cycle stored in the obstacle storage unit 109 may be erased (Yes), all of it is erased in step S102. If it is determined in step S101 that the stored obstacle detection position information of the previous cycle should not be erased (No), the process proceeds to step S103.

  Next, in step S103, the matching flag of every piece of obstacle detection position information detected by the first obstacle detection unit 101 is set to the unmatched state. A matching flag in the matched state indicates that the obstacle detection position information has already been associated with estimated obstacle position information output by the estimated obstacle position calculation unit 105 at that point in the processing; the unmatched state indicates that it has not yet been associated with any estimated obstacle position information. Step S103 therefore initializes the matching flags of all obstacle detection position information to the unmatched state before the matching processing that follows. The processing of step S103 is executed by the obstacle reliability calculation unit 106 and the obstacle position information correction unit 107.

  After step S103, the vehicle motion calculation unit 104 calculates the movement amount of the vehicle VC per calculation cycle of the vehicle travel support device 100 using the vehicle speed and steering wheel angle information detected by the vehicle state detection unit 103 (step S104). The movement amount of the vehicle VC is defined by the travel distance Lsamp in the traveling direction of the vehicle per calculation cycle and the turning angle Yawsamp in the turning direction of the vehicle VC per calculation cycle, which are obtained by the following formulas (1) and (2).
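
The formulas themselves are not reproduced in this text; based on the variable definitions given in the next paragraph, formulas (1) and (2) presumably take the following form (a reconstruction, not the original notation):

$$L_{samp} = Vel \cdot T_{samp} \qquad (1)$$
$$Yaw_{samp} = Yawrate \cdot T_{samp} \qquad (2)$$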

  In the above formulas (1) and (2), Vel is the vehicle speed of the vehicle VC, Yawrate is the rotational speed of the vehicle about its vertical axis (yaw rate), and Tsamp is the calculation cycle of the vehicle travel support apparatus 100.

  The vehicle motion calculation unit 104 outputs, as the vehicle movement amount, the travel distance Lsamp in the traveling direction per calculation cycle and the turning angle Yawsamp in the turning direction per calculation cycle obtained by formulas (1) and (2).

  After step S104, the estimated obstacle position matching processing loop S1L1 is started. This loop sequentially performs the processing of steps S105 to S117 on every piece of obstacle detection position information of one cycle before stored in the obstacle storage unit 109. Note that when all obstacle detection position information of the previous cycle was erased in step S102, there is no stored information, so the loop S1L1 is skipped and the process proceeds to the subsequent obstacle detection position information storage processing loop S1L2; this path is omitted from the figure.

  In the estimated obstacle position matching processing loop S1L1, the estimated obstacle position calculation unit 105 first calculates the estimated obstacle position (step S105). In this calculation, the estimated obstacle position, i.e. the position to which the obstacle detected one cycle before is estimated to have moved at the present time owing to the movement of the vehicle VC, is calculated from the obstacle position information of one cycle before stored in the obstacle storage unit 109 and the vehicle movement amount calculated in step S104.

  Here, the calculation of the estimated obstacle position in the estimated obstacle position calculation unit 105 is described. FIG. 10 shows the position O of the vehicle VC one cycle before and the obstacle positions given by the obstacle position information Pa, Pb, Pc of one cycle before stored in the obstacle storage unit 109. The position O denotes the center of the rear wheel axle, taken as the point O. Since the estimated obstacle position matching processing loop S1L1 of the present embodiment loops over each piece of obstacle position information stored in the obstacle storage unit 109, for the obstacle position information Pa, Pb, Pc shown in FIG. 10 the information Pa is processed in the first loop iteration and Pb in the second. In Embodiment 1 the point O indicating the position of the vehicle is taken at the center of the rear wheel axle, but the invention is not limited to this. Likewise, the forward direction of the vehicle VC is taken as the positive x-axis and the left direction as the positive y-axis, but the invention is not limited to this either. The coordinate system will be described later.

  Next, FIG. 11 shows the situation after the vehicle VC has advanced from the state of FIG. 10 by one calculation cycle. The vehicle VC at the point O in FIG. 10 advances to the point O′ during one calculation cycle, as shown in FIG. 11. Specifically, the position of the vehicle VC moves from the point O by Lsamp × cos(Yawsamp) in the x-axis direction and Lsamp × sin(Yawsamp) in the y-axis direction, and turns by the angle Yawsamp. This is expressed by the following formula (3).
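
Consistent with the displacement just described (expressed in the coordinate frame at the point O), formula (3) presumably has the form (a reconstruction, not the original notation):

$$O'_x = O_x + L_{samp}\cos(Yaw_{samp}), \quad O'_y = O_y + L_{samp}\sin(Yaw_{samp}), \quad O'_\theta = O_\theta + Yaw_{samp} \qquad (3)$$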

  In the above formula (3), Lsamp is the travel distance of the vehicle VC in the traveling direction per calculation cycle, Yawsamp is the turning angle of the vehicle in the turning direction per calculation cycle, (Ox, Oy, Oθ) are the coordinates of the current vehicle position O, and (O′x, O′y, O′θ) are the coordinates of the vehicle position O′ one cycle later.

  Formula (3) gives the vehicle position O′ one cycle later, estimated from the motion of the vehicle VC relative to the current vehicle position O; both O and O′ are defined with respect to a fixed reference point on the ground. Such a coordinate system is called a coordinate system fixed to the ground. However, as described above, the position information and approximate position information of obstacles detected by the first obstacle detection unit 101 and the second obstacle detection unit 102 are measured as positions relative to the vehicle. Therefore, when the vehicle positions O and O′ are converted to the so-called coordinate system fixed to the vehicle, whose origin is the vehicle, an obstacle fixed on the ground moves relative to the vehicle.

  The relative movement of such an obstacle is shown in FIG. 12. As shown in FIG. 12, the obstacle positions given by the obstacle position information Pa, Pb, Pc of one cycle before approach the vehicle VC, becoming the points Pa′, Pb′, Pc′. This is expressed by the following formula (4).

  In the above formula (4), (Pax, Pay) are the coordinates of the obstacle position Pa one cycle before, and (Pa′x, Pa′y) are the coordinates of the obstacle position Pa′ at the present time, expressed as relative positions with the center of the rear wheel axle of the vehicle VC at each point in time as the origin. Formula (4) also contains Paθ and Pa′θ, which represent the orientation of the obstacle; since the orientation of the obstacle is ignored in Embodiment 1, Paθ = 0 is assumed, and Pa′θ is merely an intermediate value that is not needed to obtain Pa′x and Pa′y. Formula (4) can therefore be rearranged into the following formula (5), and in the calculation of the estimated obstacle position, formula (5) is applied successively.
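
With Paθ = 0, formula (5) presumably reduces to the standard rigid-body transform into the vehicle frame at O′, consistent with formula (3) above (a reconstruction, not the original notation):

$$\begin{aligned} P'_{ax} &= (P_{ax} - L_{samp}\cos Yaw_{samp})\cos Yaw_{samp} + (P_{ay} - L_{samp}\sin Yaw_{samp})\sin Yaw_{samp} \\ P'_{ay} &= -(P_{ax} - L_{samp}\cos Yaw_{samp})\sin Yaw_{samp} + (P_{ay} - L_{samp}\sin Yaw_{samp})\cos Yaw_{samp} \end{aligned} \qquad (5)$$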

  Returning to the flowchart, the estimated obstacle position calculation unit 105 determines whether the estimated obstacle position information is outside the range required for obstacle tracking (step S106). If it is determined to be outside that range (Yes), the obstacle reliability of the estimated obstacle position information is set to the minimum value in step S107. Note that the obstacle reliability of the estimated obstacle position information has a defined range, and even when values are added or subtracted it is limited to that range, i.e. kept between the minimum and maximum values.

  The reason for this processing is as follows. In emergency brake control, when the vehicle is moving forward, specifically when the vehicle has an automatic transmission and the shift state is the D range, the possibility of colliding with an obstacle behind the rear end of the vehicle is low, so there is no need to track it. Conversely, when the vehicle is moving backward, specifically when the shift state is the R range, there is no need to track an obstacle ahead of the front end of the vehicle.

  In the lateral direction as well, there is little need to track an obstacle that is more than a certain distance, for example about 10 m, from the left or right end of the vehicle. One reason is that it is difficult for the sonar sensors of the first obstacle detection unit 101 to detect such a distant object. Another is that even at the maximum steering angle the vehicle cannot directly reach a region 10 m or more to the side without switching between forward and reverse, so the possibility of colliding with such a tracked obstacle is low. The range of obstacle tracking is not limited to this example and may be determined according to the memory capacity of the obstacle storage unit 109, the processing speed of the arithmetic device 12, and the characteristics of the vehicle.

  In Embodiment 1 the minimum value of the obstacle reliability is 0, so when the obstacle reliability is set to 0 in step S107, the process proceeds to step S117 and the estimated obstacle position information is deleted.

  On the other hand, in step S106, when the estimated obstacle position information is within the range required for obstacle tracking (in the case of No), the process proceeds to step S108.

  In step S108, it is determined whether the estimated obstacle position information is within the detection range of the first obstacle detection unit 101. If it is determined in step S108 that the estimated obstacle position is outside the detection range of the sonar sensors (No), the subsequent matching processing is not performed and the process proceeds to step S117.

  This is because, when the estimated obstacle position is outside the detection range of the sonar sensors 2, no obstacle detection position information corresponding to the estimated obstacle is detected, and the reliability would be reduced unnecessarily in the subsequent reliability calculation. In such a case, the obstacle is instead tracked using the estimated obstacle position calculated in step S105.

  If it is determined in step S108 that the estimated obstacle position is within the detection range of the sonar sensors (Yes), the process proceeds to step S109.

  In step S109, the estimated obstacle position information is matched against the obstacle detection position information. In this matching process, the estimated obstacle position information currently being processed in the loop S1L1 is compared with every piece of obstacle detection position information in the unmatched state, and the piece whose straight-line distance to the estimated obstacle position information is within the matching determination distance (a predetermined determination distance) and is the shortest is selected as the obstacle detection position information to be matched with that estimated obstacle position information. If there is no unmatched obstacle detection position information at all, or none within the matching determination distance from the estimated obstacle position information being processed, it is determined that there is no obstacle detection position information to be matched with it.

  The matching determination distance used in the present embodiment is a predetermined constant and may be set from the target maximum vehicle speed covered by the emergency brake control and the update cycle of the first obstacle detection unit 101. For example, if the target maximum vehicle speed is 10 km/h and the update cycle is 100 msec, the obstacle moves relative to the vehicle by at most about 27 cm during one update cycle of the obstacle detection position information. Under these conditions it is therefore reasonable to set the matching determination distance to about 30 cm, allowing a small margin.
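
A rough sketch of this nearest-neighbour matching step (illustrative only; the 30 cm constant follows the example above, and the detection record structure is an assumption):

```python
import math

MATCH_DISTANCE_M = 0.30  # ~30 cm: 10 km/h over a 100 ms update cycle, with margin

def match_detection(estimated_pos, detections):
    """Pick the unmatched obstacle detection closest to the estimated
    obstacle position, provided it lies within the matching distance.
    `detections` is a list of dicts with 'pos' and 'matched' keys
    (a hypothetical structure used only for illustration)."""
    best, best_dist = None, MATCH_DISTANCE_M
    for det in detections:
        if det['matched']:
            continue
        d = math.dist(estimated_pos, det['pos'])
        if d <= best_dist:
            best, best_dist = det, d
    return best  # None if no detection lies within the matching distance
```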

  After the matching process of step S109, it is determined whether obstacle detection position information to be matched with the estimated obstacle position information was found (step S110). If it is determined in step S110 that such obstacle detection position information exists (Yes), the processing of steps S111 to S114 is performed; if it is determined that it does not exist (No), the processing of steps S115 and S116 is performed.

  In step S111, a predetermined specified value is added to the obstacle reliability of the estimated obstacle position information. If this addition would make the obstacle reliability exceed the maximum value, the obstacle reliability is limited to the maximum value in step S112. This prevents the obstacle reliability from growing too large, so that when a detected obstacle turns out to be a false detection it can be removed from the control targets at an early stage, and likewise when a detected obstacle moves out of the detection range it can be removed from the control targets at an early stage.

  In step S113, the matching flag of the obstacle detection position information matched with the estimated obstacle position information in step S109 is set to the matched state. This prevents obstacle detection position information that has already been matched from being matched again with another piece of estimated obstacle position information in a later iteration of step S109.

  Next, the obstacle position information correction unit 107 corrects the estimated obstacle position information using the matched obstacle detection position information and the obstacle reliability calculated in steps S111 and S112, and outputs the corrected result as new obstacle position information (step S114); the process then proceeds to step S117. The following formula (6) is used for the position correction of the estimated obstacle position information.
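
One plausible form of formula (6), consistent with the weighting described in the next paragraph (a reconstruction, not the original formula; c_max denotes the maximum obstacle reliability, an assumed symbol):

$$x_{d2} = \frac{c\,x_{d1} + (c_{max} - c)\,x_{s1}}{c_{max}}, \qquad y_{d2} = \frac{c\,y_{d1} + (c_{max} - c)\,y_{s1}}{c_{max}} \qquad (6)$$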

  In the above formula (6), (xd2, yd2) is the obstacle position information output from the obstacle position information correction unit 107, (xs1, ys1) is the obstacle detection position information input from the first obstacle detection unit 101 to the obstacle position information correction unit 107, (xd1, yd1) is the estimated obstacle position information input from the estimated obstacle position calculation unit 105 to the obstacle position information correction unit 107, and c is the obstacle reliability calculated by the obstacle reliability calculation unit 106. In Embodiment 1, as shown in formula (6), the obstacle position information is synthesized by increasing the weight of the estimated obstacle position information relative to the obstacle detection position information as the obstacle reliability increases. However, the method is not necessarily limited to that of formula (6) in order to obtain the effect of the present invention.

  If it is determined in step S110 that there is no obstacle detection position information to be matched with the estimated obstacle position information, a predetermined specified value is subtracted from the obstacle reliability of the estimated obstacle position information in step S115. If this subtraction would make the obstacle reliability fall below the minimum value, i.e. below 0 in Embodiment 1, the obstacle reliability is limited to the minimum value in step S116, and the process proceeds to step S117.
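
The add/subtract-and-clamp logic of steps S111, S112, S115, and S116 can be summarized as follows (a sketch; only the minimum value of 0 comes from the text, while the maximum value and the specified step are assumed constants):

```python
RELIABILITY_MIN = 0      # minimum obstacle reliability (Embodiment 1)
RELIABILITY_MAX = 100    # assumed maximum value
RELIABILITY_STEP = 20    # assumed specified value added or subtracted per cycle

def update_reliability(reliability: int, matched: bool) -> int:
    """Add the specified value when a matching detection exists (S111/S112),
    subtract it otherwise (S115/S116), clamping the result to [min, max]."""
    step = RELIABILITY_STEP if matched else -RELIABILITY_STEP
    return max(RELIABILITY_MIN, min(RELIABILITY_MAX, reliability + step))
```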

  Note that the processes in steps S108 to S116 described above are executed in the obstacle reliability calculation unit 106 and the obstacle position information correction unit 107 of the vehicle travel support apparatus 100.

  In step S117, the estimated obstacle position information is deleted from the obstacle storage unit 109 when its obstacle reliability is lower than a predetermined specified value. Specifically, estimated obstacle position information whose obstacle reliability has become 0 through the processing of step S107 or steps S115 and S116 described above is deleted from the obstacle storage unit 109. The processing of step S117 is executed by the obstacle storage determination unit 108 of the vehicle travel support apparatus 100.

  The estimated obstacle position matching processing loop S1L1 ends when the processing of steps S105 to S117 has been performed for all obstacle position information of one cycle before stored in the obstacle storage unit 109, and the process then enters the obstacle detection position information storage processing loop S1L2.

  In the processing of steps S111 and S115 described above, the obstacle reliability is calculated by adding or subtracting a predetermined specified value to or from the obstacle reliability of the estimated obstacle position information. The specified value need not be a constant, however, and may be a variable set according to the vehicle state and the outputs of the first obstacle detection unit 101 and the second obstacle detection unit 102. For example, in Embodiment 1 the first obstacle detection unit 101 detects obstacle positions using the plurality of sonar sensors 2, and for distant obstacles the intensity of the reflected sound wave is low, so the possibility of erroneous detection due to noise increases. In such a case it is also effective to vary the specified value for addition and subtraction according to the obstacle detection position information, increasing it for nearby obstacles and decreasing it for distant obstacles.

  The obstacle detection position information storage processing loop S1L2 is a loop that sequentially performs the processes of steps S118 to S120 on all the obstacle detection position information output by the first obstacle detection unit 101.

  First, in step S118, the obstacle storage determination unit 108 determines whether the obstacle storage unit 109 has a free area for adding new obstacle position information. If it is determined in step S118 that there is a free area (Yes), the processing of steps S119 and S120 is performed. If obstacle position information has been written to every area of the obstacle storage unit 109 and there is no free area for adding new obstacle position information (No), the process proceeds to the next loop iteration without performing steps S119 and S120. In practice, once it is determined in step S118 that there is no free area, the same determination would be made in every subsequent iteration, so the obstacle detection position information storage processing loop S1L2 can simply be terminated at that point without further processing.

In step S119, it is determined whether the matching flag of the obstacle detection position information is in the unmatched state. If it is determined in step S119 that the matching flag is unmatched (Yes), the processing of step S120 is performed; if it is determined that it is matched, the process proceeds to the next loop iteration without performing step S120.

  In step S120, the obstacle detection position information whose matching flag is unmatched is written to a free area of the obstacle storage unit 109 as newly detected obstacle position information. In this case, the addition constant, i.e. the specified value used for addition in step S111, is set as the reliability information of the newly written obstacle position information.

  The obstacle detection position information storage processing loop S1L2 ends when the processing of steps S118 to S120 has been performed for all obstacle detection position information output by the first obstacle detection unit 101, and the process proceeds to step S121. Steps S118 to S120 are executed by the obstacle storage determination unit 108 of the vehicle travel support apparatus 100.

  In step S121, the host vehicle route calculation unit 110 calculates the host vehicle route. In the first embodiment, when the vehicle VC travels while maintaining the current steering wheel angle and vehicle speed, the boundary line between the region through which the vehicle VC passes and the region through which the vehicle VC does not pass is defined as the host vehicle route.

  In the straight traveling state, specifically when the steering wheel angle is within ±10 degrees, the vehicle VC travels substantially straight in the traveling direction. Whether it moves forward or backward depends on the shift state: in the D range the vehicle VC moves forward, and in the R range it moves backward. Since the vehicle VC moves straight, the left and right boundary lines between the region through which the vehicle VC passes and the region through which it does not pass are, as shown in FIG. 13, the boundary lines Yl and Yr. With the center of the rear wheel axle, which is the coordinate origin in Embodiment 1, as the reference, the boundary lines Yl and Yr are expressed by the following formula (7).
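
Formula (7) presumably places the two boundary lines at half the vehicle width on either side of the rear-axle center (a reconstruction, not the original notation):

$$Y_l = \alpha, \qquad Y_r = -\alpha \qquad (7)$$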

  In the above equation (7), Yr is the right boundary line of the vehicle VC, Yl is the left boundary line of the vehicle VC, and α is half the width of the vehicle VC.

  In a state other than the straight traveling state, that is, in a turning state, the boundary lines between the region through which the vehicle VC passes and the region through which it does not pass have the relationship shown in FIG. 14. FIG. 14 shows the case of a left turn. In a left turn, the part of the vehicle VC that travels innermost is the point Pi shown in FIG. 14, and the path traced by this point Pi forms the left boundary line of the vehicle VC. The part of the vehicle VC that travels outermost is the point Po shown in FIG. 14, and the path traced by this point Po forms the right boundary line of the vehicle VC. In FIG. 14 the vehicle VC turns about the point C. The turning radius ρ in this case is expressed by the following formula (8).
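
Under the low-speed, no-side-slip assumption used here, formula (8) presumably corresponds to the kinematic relation between wheelbase and tire angle (a reconstruction; for small δ this is often approximated as ρ ≈ l/δ):

$$\rho = \frac{l}{\tan\delta} \qquad (8)$$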

  In the above formula (8), ρ represents a turning radius, l represents a wheel base of the vehicle VC, and δ represents a tire angle of a front wheel.
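
  The image of formula (8) is likewise not reproduced; the standard low-speed, no-side-slip steady-turning relation consistent with the symbols defined above is:

\rho = \frac{l}{\tan\delta} \;\approx\; \frac{l}{\delta} \tag{8}

The small-angle approximation on the right is an assumption; the patent image may use either form.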

  The tire angle δ and the steering wheel angle θ are related by the following formula (9); the steering wheel angle is geared down by the rack-and-pinion gear ratio Grp of the steering system.
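
  A reconstruction of formula (9) consistent with this gearing-down statement is:

\delta = \frac{\theta}{G_{rp}} \tag{9}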

  The derivation of formula (8) is described in Masato Abe, "Automobile Movement and Control" (Sankaido, ISBN 4-381-08822-0), Chapter 3 "Basics of Vehicle Movement", Section 3.3 "Steady Circular Turning of Vehicles". In the first embodiment the operating range of the emergency brake control is limited to low vehicle speeds, so formula (8) uses the relation between the steering wheel angle and the turning radius ρ for a steady circular turn in which no centrifugal force acts on the vehicle VC and no side slip occurs.

  For the turning radius ρ, the inner turning radius ρi, which is the radius of the left boundary line of the vehicle VC, and the outer turning radius ρo, which is the radius of the right boundary line, are expressed by the following formulas (10) and (11) using α and β shown in FIG. 14. Here α is half the lateral width of the vehicle VC, and β is the wheel base l plus the front overhang of the vehicle VC.
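
  Formulas (10) and (11) are also given only as images; a geometric reconstruction consistent with the definitions above (the inner boundary traced by the point Pi on the inner side of the body, the outer boundary traced by the outer front corner Po) would be:

\rho_i = \rho - \alpha \tag{10}
\rho_o = \sqrt{(\rho + \alpha)^2 + \beta^2} \tag{11}

This is a sketch under the stated assumptions and may differ in detail from the patent drawings.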

  From the turning radius ρ, the inner turning radius ρi, and the outer turning radius ρo, the expressions for the left boundary line and the right boundary line of the vehicle VC are obtained as the following formulas (12) and (13).

  Formulas (12) and (13) give the left and right boundary lines of the vehicle VC for a left turn. When the vehicle VC makes a right turn, the left boundary line is given by the following formula (14) and the right boundary line by the following formula (15).
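
  The images of formulas (12) to (15) are not reproduced; assuming the boundary lines are arcs centered on the turning center C, which lies on the lateral axis at distance ρ from the rear-axle origin (to the left for a left turn, to the right for a right turn), one consistent reconstruction is:

X^2 + (Y - \rho)^2 = \rho_i^2 \tag{12}
X^2 + (Y - \rho)^2 = \rho_o^2 \tag{13}
X^2 + (Y + \rho)^2 = \rho_o^2 \tag{14}
X^2 + (Y + \rho)^2 = \rho_i^2 \tag{15}

Here X is the longitudinal and Y the lateral coordinate of the vehicle coordinate system; the sign convention (Y positive to the left) is an assumption.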

  In step S121, the host vehicle route is calculated based on the equations (12) to (15).

  Next, the obstacle determination unit 111 performs obstacle contact determination using the host vehicle route obtained by the host vehicle route calculation unit 110 from formulas (7) to (15) and the obstacle position information (step S122). Taking the case where the vehicle VC is moving backward as a specific example, as shown in FIG. 15, an obstacle that lies on the host vehicle route and whose obstacle reliability is equal to or greater than a predetermined obstacle determination threshold is classified as a collision obstacle, while an obstacle that does not lie on the host vehicle route, or that lies on the route but whose obstacle reliability is below the threshold, is classified as a non-collision obstacle. The obstacle determination unit 111 adds the collision/non-collision classification to the input obstacle position information as the obstacle contact determination result and outputs it.

  The specific method of determining in step S122 whether an obstacle lies on the host vehicle route is as follows. First, it is determined from the vehicle state quantities whether the vehicle VC is traveling straight, turning left, or turning right. In the straight traveling state it is checked whether the obstacle position information falls within the range of formula (7); in a left turn, whether it falls between formulas (12) and (13); and in a right turn, whether it falls between formulas (14) and (15).

  Although not described further in the first embodiment, the contact determination on the host vehicle route may use not only the obstacle position information but also the obstacle size information. In that case, since the obstacle position information represents the center coordinates of the obstacle, the positions of the two points shifted left and right from the center coordinates by the obstacle size are obtained, and it is determined whether either of these two points lies within the route. If the two points lie on opposite sides of the route, the obstacle is also determined to be a collision obstacle.
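
  A minimal Python sketch of this contact determination follows; the reliability threshold and all field names are illustrative assumptions, not values from the patent.

```python
import math

def boundary_coords(p, route):
    """Return a point's scalar coordinate relative to the route together with the
    inner and outer route bounds: the lateral offset for straight travel
    (formula (7)) or the distance from the turn center while turning
    (formulas (12) to (15))."""
    x, y = p
    if route["state"] == "straight":
        return y, -route["alpha"], route["alpha"]
    cx, cy = route["turn_center"]
    return math.hypot(x - cx, y - cy), route["rho_i"], route["rho_o"]

def is_collision_obstacle(obs, route, reliability_threshold=60):
    """Sketch of the contact determination of step S122."""
    if obs["reliability"] < reliability_threshold:
        return False                              # non-collision obstacle
    # two points shifted left/right of the center by the obstacle half-width
    points = [(obs["x"], obs["y"] + obs["half_width"]),
              (obs["x"], obs["y"] - obs["half_width"])]
    values = []
    for p in points:
        v, lo, hi = boundary_coords(p, route)
        if lo <= v <= hi:                         # a point lies inside the route
            return True
        values.append(v)
    # the two points straddle the route: the obstacle spans the swept path
    return min(values) < lo and max(values) > hi
```

For straight travel the route dictionary would carry only the half-width α; while turning it would carry the turn center and the radii ρi and ρo.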

  Returning to the flowchart, the collision time calculation unit 112 calculates, for each piece of obstacle position information whose contact determination result in step S122 is a collision obstacle, the collision time, that is, the expected time until the vehicle VC touches that obstacle if it continues to travel at the current vehicle speed (step S123).

  The collision time can be calculated in a simple way, by dividing the straight-line distance between the obstacle and the vehicle VC by the vehicle speed, or in a more elaborate way, by calculating the position at which the obstacle would contact the vehicle VC and dividing the actual distance traveled to that position, that is, the straight-line distance when traveling straight or the arc length corresponding to the turn when turning, by the vehicle speed. Whether the simple or the elaborate method is used does not affect the effect of the present invention. Finally, in step S123, among the collision times calculated individually for the obstacle position information determined to be collision obstacles, the shortest value, that is, the collision time of the obstacle most likely to be contacted earliest, is output as the shortest collision time. Note that when the vehicle VC is stopped, the vehicle speed used for the calculation is 0, so dividing by it directly would cause an error in the computing device 12. Since no obstacle can collide with a stopped vehicle, in that case only, the collision time of every piece of obstacle position information is set to a predetermined maximum collision time, and the shortest collision time is also set to that maximum value. The maximum value may be chosen so that the target deceleration becomes 0 in step S124.
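
  A minimal Python sketch of the simple method, including the stopped-vehicle special case; the maximum value is an assumed placeholder.

```python
MAX_COLLISION_TIME = 10.0   # assumed upper bound in seconds (not from the patent)

def shortest_collision_time(collision_obstacles, vehicle_speed):
    """Sketch of step S123: straight-line distance to each collision obstacle
    divided by the current vehicle speed, returning the shortest result."""
    if vehicle_speed <= 0.0:
        # vehicle stopped: avoid the division by zero noted in the text
        return MAX_COLLISION_TIME
    times = [obs["distance"] / vehicle_speed for obs in collision_obstacles]
    return min(times, default=MAX_COLLISION_TIME)
```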

  Next, in step S124, the target deceleration calculation unit 113 obtains the target deceleration from the shortest collision time. Various calculation methods are conceivable; as one example, one of three target decelerations (in G) may be selected according to the value of the shortest collision time (in seconds), as shown in the table of FIG. 16. That is, if the shortest collision time t is in the range 0 ≤ t ≤ 0.4 the target deceleration is 0.8 G, if it is in the range 0.4 < t ≤ 0.8 the target deceleration is 0.4 G, and if 0.8 < t the target deceleration is set to 0 G and no braking is performed.
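
  The three-level table of FIG. 16 can be pictured as the following lookup; a sketch, not the patented implementation.

```python
def target_deceleration(shortest_collision_time):
    """Target deceleration (G) selected from the shortest collision time (s)."""
    if shortest_collision_time <= 0.4:
        return 0.8      # 0 <= t <= 0.4 s: strong braking
    if shortest_collision_time <= 0.8:
        return 0.4      # 0.4 < t <= 0.8 s: moderate braking
    return 0.0          # t > 0.8 s: no braking
```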

  Although not shown in the functional block diagram of the vehicle travel support apparatus 100 in FIG. 5, a determination distance that varies with the vehicle state quantities may be set for the obstacle position information, separately from the target deceleration based on the shortest collision time, and the target deceleration may be output only when the distance to the obstacle falls below this determination distance. Furthermore, the maximum value of the target deceleration may be limited according to the obstacle reliability. In this way, for example, when the reliability is low the vehicle VC can be decelerated with a small target deceleration, which reduces the burden on the driver in the case of an erroneous detection.

  The target deceleration output by the target deceleration calculation unit 113 is given to the braking device 114, which controls the brake hydraulic pressure so that the actual deceleration of the vehicle VC follows the target deceleration calculated by the target deceleration calculation unit 113, thereby decelerating the vehicle VC.

  As described above, the vehicle travel support apparatus according to the first embodiment compares the plurality of obstacle detection position information detected by the sonar sensors 2 of the first obstacle detection unit 101 with the obstacle size information detected by the camera 3 of the second obstacle detection unit 102, and, when they are determined to represent a single obstacle, outputs the center coordinates of the plurality of obstacle detection position information detected by the sonar sensors 2 as new obstacle detection position information. This makes it possible to reliably detect even an obstacle having a complicated shape and to reduce the computational load of the obstacle detection and tracking processing performed by the computing device 12. It also reduces the amount of obstacle position information stored in the obstacle storage unit 109 of the computing device 12 and thus the cost of the computing device 12, and allows more obstacles to be stored and tracked with a small memory capacity, so that emergency brake control that can cope with many obstacles and ensure safety can be executed.

  In the first embodiment, the vehicle VC is braked to avoid obstacles, but a warning may also be given to the driver immediately before braking, for example by a separately provided speaker or the like (not shown). Such a configuration does not impair the effects of the present invention.

<Embodiment 2>
<Device configuration>
FIG. 17 is a functional block diagram showing the configuration of the vehicle travel support apparatus 200 according to the second embodiment of the present invention. As shown in FIG. 17, the configuration of the vehicle travel support apparatus 200 is basically the same as that of the vehicle travel support apparatus 100 of the first embodiment shown in FIG. 5, so overlapping descriptions are omitted.

  In the vehicle travel support apparatus 200 shown in FIG. 17, the obstacle size information detected by the second obstacle detection unit 102 is also input to the obstacle reliability calculation unit 106 and the obstacle position information correction unit 107, and these two units change the matching determination distance using the obstacle size information.

<Operation>
The operation of the vehicle travel support apparatus 200 shown in FIG. 17 will be described using the flowcharts shown in FIGS. 8 and 9. Hereinafter, only the processing in step S109 different from the operation of the vehicle travel support apparatus 100 of the first embodiment will be described.

  In the vehicle travel support apparatus 100 of the first embodiment, the matching process between the estimated obstacle position information and the obstacle detection position information in step S109 compares the estimated obstacle position information being processed in the estimated obstacle position matching processing loop S1L1 with all the obstacle detection position information still in the unmatched state, and the obstacle detection position information whose straight-line distance to the estimated obstacle position information is within the matching determination distance and is the shortest is determined to be the one to be matched with that estimated obstacle position information. The matching determination distance is a constant in the first embodiment, whereas in the second embodiment the obstacle size information detected by the second obstacle detection unit 102 is added to the matching determination distance of the first embodiment.

  The method of calculating the matching determination distance in step S109 in the second embodiment is as follows. In the vehicle travel support apparatus 100 of the first embodiment, the matching determination distance was set to the maximum distance an obstacle is assumed to move in one update cycle, based on the maximum target vehicle speed supported by the emergency brake control and the update cycle of the first obstacle detection unit 101. However, as described with reference to FIG. 6, the obstacle detection by the first obstacle detection unit 101 appears as groups of intersection points for an obstacle having a complicated shape, and a group such as intersection group A in FIG. 6 may not be detected. If the center coordinates of the remaining intersection groups B and C are tracked under the matching determination distance of the first embodiment, the center coordinates may shift by a large amount once intersection group A is measured again, so that the shift exceeds the matching determination distance and matching fails. Therefore, in the second embodiment, the matching determination distance is the value obtained in the first embodiment plus the obstacle size information. Specifically, the matching determination distance is obtained by adding the lateral size of the obstacle, based on the obstacle size information detected by the second obstacle detection unit 102, to the maximum distance the obstacle is assumed to move in one update cycle of the first obstacle detection unit 101 described in the first embodiment. As a result, even a large obstacle with a complicated shape is less likely to be rejected by the matching determination distance, and highly accurate obstacle matching can be performed.
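
  A minimal sketch of this calculation, with assumed units and illustrative example values:

```python
def matching_determination_distance(max_supported_speed_mps, update_period_s,
                                    obstacle_lateral_size_m):
    """Embodiment 2 matching distance: maximum movement per update cycle of the
    first obstacle detection unit plus the obstacle's lateral size."""
    return max_supported_speed_mps * update_period_s + obstacle_lateral_size_m

# e.g. 10 km/h supported speed, 50 ms sonar update period, 1.2 m wide obstacle
distance = matching_determination_distance(10.0 / 3.6, 0.05, 1.2)
```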

  As described with reference to FIG. 7, the obstacle position information of an obstacle having a complicated shape holds the center coordinates of the obstacle, and the lateral size of the obstacle can be obtained by finding the positions of the two points shifted left and right from the center coordinates by the obstacle size information.

  As described above, the vehicle travel support apparatus 200 of the second embodiment uses, as the matching determination distance, the value obtained in the first embodiment plus the obstacle size information. Even a large obstacle with a complicated shape is therefore less likely to be rejected by the matching determination distance, highly accurate obstacle matching can be performed, and emergency brake control can be executed with greater safety.

<Embodiment 3>
<Device configuration>
FIG. 18 is a functional block diagram showing the configuration of the vehicle travel support apparatus 300 according to the third embodiment of the present invention. As shown in FIG. 18, the configuration of the vehicle travel support apparatus 300 is basically the same as that of the vehicle travel support apparatus 200 of the second embodiment shown in FIG. 17, so overlapping descriptions are omitted.

  In the vehicle travel support apparatus 300 shown in FIG. 18, the obstacle size information detected by the second obstacle detection unit 102 is input not only to the obstacle reliability calculation unit 106 and the obstacle position information correction unit 107 but also to the obstacle storage determination unit 108.

<Operation>
The operation of the vehicle travel support apparatus 300 shown in FIG. 18 will be described using the flowcharts shown in FIGS. 19 to 21. Note that the symbols (A) to (E) in FIG. 19 are connected to the symbols (A) to (E) in FIG. 20, and the symbol (F) in FIG. 20 is connected to the symbol (F) in FIG. 21.

  First, the differences from the flowcharts shown in FIGS. 8 and 9 are listed. Step S103 of FIGS. 8 and 9 becomes step S303, the estimated obstacle position matching processing loop S1L1 becomes the estimated obstacle position matching processing loop S3L1, steps S108, S109, S111, S112, and S113 become steps S308, S309, S311, S312, and S313, and steps S325a, S326, S327, S328, and S329 are added after step S313. When the determination result of step S110 is No, steps S115 and S116 follow via the determination of step S325b. An obstacle size information storage processing loop S3L3 is added after the obstacle detection position information storage processing loop S1L2, and the obstacle size information storage processing loop S3L3 is followed by steps S121 to S124.

  The following description focuses on the processing that differs from the flowcharts shown in FIGS. 8 and 9. In step S103, the matching flags of all the obstacle detection position information input from the first obstacle detection unit 101 are set to the unmatched state. In step S303 of the third embodiment, the matching flags of the obstacle size information input from the second obstacle detection unit 102 to the obstacle reliability calculation unit 106 and the obstacle position information correction unit 107 are also set to the unmatched state.

  The estimated obstacle position matching processing loop S3L1 has the same processing conditions as the estimated obstacle position matching processing loop S1L1, but the processing contents are different.

  That is, in step S108 of the estimated obstacle position matching processing loop S1L1, it is determined whether the estimated obstacle position information is within the detection range of the first obstacle detection unit 101, whereas in step S308 of the third embodiment it is determined whether the estimated obstacle position information is within the detection ranges of the first obstacle detection unit 101 and the second obstacle detection unit 102. If it is determined in step S308 that the estimated obstacle position information is outside the detection ranges of the first obstacle detection unit 101 and the second obstacle detection unit 102 (No), the process proceeds to step S117 without performing the subsequent matching processing.

  Note that the vehicle travel support apparatus 300 still operates without problems even if step S108 is not changed to step S308. In that case, however, when the detection range of the second obstacle detection unit 102 is larger than that of the first obstacle detection unit 101, a situation arises in which an estimated obstacle position is not included in the detection range of the first obstacle detection unit 101 but only in that of the second obstacle detection unit 102. In such a case, obstacle matching and reliability calculation are not executed until the estimated obstacle enters the detection range of the first obstacle detection unit 101, and the effect is limited to the same effect as that of the second embodiment.

  In step S309, in addition to the matching of the estimated obstacle position information with the obstacle detection position information performed in step S109, matching of the estimated obstacle position information with the approximate positions of the obstacle size information is also performed. The processing order is to first match the estimated obstacle position information with the obstacle detection position information and then match the estimated obstacle position information with the approximate positions of the obstacle size information.

  In step S110, it is determined whether obstacle detection position information to be matched with the estimated obstacle position information exists in the matching result of step S309. If it is determined in step S110 that such obstacle detection position information exists (Yes), the processes of steps S311 to S114 are performed; if it is determined that it does not exist (No), the process proceeds to step S325b.

  In step S311, whereas step S111 adds a predetermined specified value to the obstacle reliability of the estimated obstacle position information, a first added value is added as the specified value to the first obstacle reliability. This is because the third embodiment sets different addition values and different maximum values for the obstacle information detected by the first obstacle detection unit 101 and by the second obstacle detection unit 102. Accordingly, in step S312, whereas step S112 limits the obstacle reliability so as not to exceed the maximum value, the first obstacle reliability is limited so as not to exceed the first maximum value.

  In the first embodiment, the estimated obstacle position information has only a single obstacle reliability, whereas in the third embodiment the obstacle reliability of the estimated obstacle position information is given as the sum of a first obstacle reliability and a second obstacle reliability. As described above, the first added value is added as the specified value to the first obstacle reliability based on the obstacle detection position information output by the first obstacle detection unit 101. As described later, the second added value is added as the specified value to the second obstacle reliability based on the obstacle size information (approximate position) output by the second obstacle detection unit 102. The first obstacle reliability and the second obstacle reliability have different ranges: the maximum value of the first obstacle reliability is the first maximum value and the maximum value of the second obstacle reliability is the second maximum value. In this embodiment the same minimum value is used for both ranges, and in step S107 of the third embodiment both the first obstacle reliability and the second obstacle reliability are set to that minimum value.

  In step S313, when the estimated obstacle position information and the obstacle detection position information are matched, the matching flag of the matched obstacle detection position information is set to matched.

  Steps S325a and S325b are newly added in the third embodiment. They determine whether obstacle size information (approximate position) to be matched with the estimated obstacle position information exists in the matching result of step S309.

  Step S325a is executed after step S313. If obstacle size information (approximate position) to be matched with the estimated obstacle position information exists (Yes), the processes of steps S326 to S114 are performed; if it is determined that no such obstacle size information (approximate position) exists (No), the process proceeds to step S329.

  In step S325b, if obstacle size information (approximate position) to be matched with the estimated obstacle position information exists (Yes), the processes of steps S326 to S114 are performed; if it is determined that no such obstacle size information (approximate position) exists (No), the processes of steps S115 and S116 are performed.

  In step S326, the second added value is added as a specified value to the second obstacle reliability, and in step S327, the second obstacle reliability is limited so as not to exceed the second maximum value.

  In step S328, when the estimated obstacle position information and the obstacle size information (approximate position) are matched, the matching flag of the matched obstacle size information (approximate position) is set as matched.

  Then, in step S329, the obstacle reliability is calculated by adding the first obstacle reliability and the second obstacle reliability.
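
  The reliability bookkeeping of steps S311, S312, S326, S327, and S329 can be pictured with the following Python sketch; all numeric values and field names are illustrative assumptions.

```python
# Illustrative constants; the patent does not give numeric values.
FIRST_ADDED_VALUE, FIRST_MAX = 20, 100    # sonar-based channel (steps S311/S312)
SECOND_ADDED_VALUE, SECOND_MAX = 10, 40   # camera-based channel (steps S326/S327)

def update_reliability(est, matched_detection, matched_size_info):
    """Add to each reliability channel only when its sensor produced a match,
    clamp each channel to its own maximum, then sum them (step S329)."""
    if matched_detection:
        est["rel1"] = min(est["rel1"] + FIRST_ADDED_VALUE, FIRST_MAX)
    if matched_size_info:
        est["rel2"] = min(est["rel2"] + SECOND_ADDED_VALUE, SECOND_MAX)
    est["reliability"] = est["rel1"] + est["rel2"]
    return est
```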

  As described above, in the vehicle travel support apparatus 300 according to the third embodiment, a reliability is provided for each of the obstacle information detected by the first obstacle detection unit 101 and by the second obstacle detection unit 102, with different addition values and different maximum values matched to the characteristics of each sensor, so that the obstacle reliability can be obtained with higher accuracy.

  As described with reference to FIGS. 8 and 9, in the obstacle detection position information storage processing loop S1L2, obstacle detection position information that is still in the unmatched state is stored in an empty area of the obstacle storage unit 109.

  In the third embodiment, the obstacle size information storage processing loop S3L3 is provided after the obstacle detection position information storage processing loop S1L2, and obstacle size information (approximate position) that is still in the unmatched state is stored in an empty area of the obstacle storage unit 109.

  In step S330 of the obstacle size information storage processing loop S3L3, the obstacle storage determination unit 108 determines whether the obstacle storage unit 109 has a free area for adding new obstacle size information (approximate position). If it is determined in step S330 that such a free area exists (Yes), the processes of steps S331 and S332 are performed. On the other hand, when obstacle position information and obstacle size information (approximate position) have been written to all areas of the obstacle storage unit 109 and there is no free area for adding new obstacle size information (approximate position) (No), the process proceeds to the next loop iteration without performing steps S331 and S332. In practice, step S330 would again find no free area in the next iteration, so once step S330 determines that there is no free area, the obstacle size information storage processing loop S3L3 is terminated without performing any further processing.

  In step S331, it is determined whether the matching flag of the obstacle size information (approximate position) is in the unmatched state. If it is determined in step S331 that the matching flag is unmatched (Yes), the process of step S332 is performed; if it is determined that matching has been completed, the process proceeds to the next loop iteration without performing step S332.

  In step S332, obstacle size information (approximate position) whose matching flag is still unmatched is written to a free area of the obstacle storage unit 109 as newly detected obstacle position information. In this case, the reliability of the added obstacle position information is set to the prescribed second added value used in step S326. Likewise, in step S120 of the third embodiment, the reliability of the added obstacle position information is set to the prescribed first added value used in step S311.

  The obstacle size information storage processing loop S3L3 is completed once steps S330 to S332 have been performed for all the obstacle size information (approximate position) output by the second obstacle detection unit 102, and the process then proceeds to step S121. Steps S330 to S332 are executed by the obstacle storage determination unit 108 of the vehicle travel support apparatus 300.

  As described above, in the vehicle travel support apparatus 300 according to the third embodiment, the obstacle reliability calculation unit 106 and the obstacle position information correction unit 107 are given a first added value corresponding to the obstacle detection position information output by the first obstacle detection unit 101, a second added value corresponding to the obstacle size information output by the second obstacle detection unit 102, a first maximum value corresponding to the obstacle detection position information output by the first obstacle detection unit 101, and a second maximum value corresponding to the obstacle size information output by the second obstacle detection unit 102.

  When obstacle detection position information output by the first obstacle detection unit 101 exists within the matching determination distance from the estimated obstacle position, the first added value is added to the obstacle reliability within the range of the first maximum value; when obstacle size information (approximate position) output by the second obstacle detection unit 102 exists within that distance, the second added value is added within the range of the second maximum value. In the obstacle storage determination unit 108, when the obstacle storage unit 109 has an empty area, the obstacle detection position information output by the first obstacle detection unit 101 and the obstacle size information (approximate position) output by the second obstacle detection unit 102 are recorded.

  Thus, when the camera 3 constituting the second obstacle detection unit 102 can detect an obstacle outside the detection range of the plurality of sonar sensors 2 constituting the first obstacle detection unit 101, the second added value is added to the reliability of that obstacle information. As a result, braking control can be performed only when an object is reliably recognized as an obstacle. Moreover, since a distant obstacle outside the detection range of the sonar sensors 2 can be detected at an early stage, emergency brake control that can brake with sufficient margin even when the speed of the vehicle VC is high can be executed.

  Furthermore, by managing the first maximum value and the second maximum value separately, it is possible, for example, to prevent braking control from being performed on an object recognized as an obstacle by the camera 3 alone. Specifically, if the second maximum value is set below the obstacle determination threshold used in step S122, an obstacle recognized only by the camera 3 is never finally determined to be a collision obstacle; unless it is also recognized by the sonar sensors 2, it cannot become a collision obstacle. In situations where the reliability of obstacles detected by the camera 3 is low, braking control can therefore be performed only when the obstacle can be determined with certainty.

  In the above description, the first added value, the second added value, the first maximum value, and the second maximum value are predetermined specified values, but they may be changed according to the vehicle state, or according to the environment around the vehicle VC using a separately provided environmental sensor or the like. By making the first added value, the second added value, the first maximum value, and the second maximum value changeable in this way, for example by using an optical sensor of the kind used for automatic headlight control as the environmental sensor, the second added value and the second maximum value can be lowered from their normal values at night, in particular in the dark, when the detection capability of the camera 3 constituting the second obstacle detection unit 102 is significantly reduced. This prevents malfunction of the emergency brake control caused by erroneous detection by the camera 3.

  Further, if the camera 3 itself can detect that the surrounding environment is dark without using the optical sensor, the second added value and the second maximum value may be lowered from their normal values in accordance with that detection information.

  Likewise, in situations unfavorable to the sonar sensors 2 constituting the first obstacle detection unit 101, for example when there is a noise source nearby that significantly lowers the detection accuracy of the sonar sensors 2, such as an inverter emitting at the same frequency as the ultrasonic waves of the sonar sensors 2, or another vehicle equipped with sonar sensors of the same frequency, malfunction of the emergency brake control can be prevented by lowering the first added value and the first maximum value from their normal values. Whether such a noise source exists near the vehicle VC can be detected by the sonar sensors 2 themselves.

<Modification>
Although the operation of the vehicle travel support apparatus 300 according to the third embodiment has been described using the flowcharts shown in FIGS. 19 to 21, the operation of the vehicle travel support apparatus 300 is not limited to this; for example, the operation shown in the flowcharts of FIGS. 22 to 24 is also possible. Note that the symbols (A) to (D) in FIG. 22 are connected to the symbols (A) to (D) in FIG. 23, and the symbol (E) in FIG. 23 is connected to the symbol (E) in FIG. 24.

  In the third embodiment described with the flowcharts of FIGS. 19 to 21, the first obstacle reliability corresponding to the obstacle detection position information matched with the estimated obstacle position information is obtained in steps S310 to S313, and then the second obstacle reliability corresponding to the obstacle size information (approximate position) matched with the estimated obstacle position information is obtained in steps S325a and S326 to S328.

  In the flowcharts of FIGS. 22 to 24, on the other hand, after the first obstacle reliability corresponding to the obstacle detection position information matched with the estimated obstacle position information is obtained in steps S310 to S313, the processes of steps S326 to S328 are performed only if obstacle size information (approximate position) to be matched with the estimated obstacle position information exists in step S325b (Yes), and step S329 follows step S328. If it is determined that no such obstacle size information (approximate position) exists (No), the processes of steps S115 and S116 are performed.

  With such a configuration, when the same obstacle is detected by both the first obstacle detection unit 101 and the second obstacle detection unit 102, the obstacle reliability can be prevented from increasing excessively.

<Embodiment 4>
<Device configuration>
FIG. 25 is a functional block diagram showing the configuration of the vehicle travel support apparatus 400 according to the fourth embodiment of the present invention. As shown in FIG. 25, the configuration of the vehicle travel support apparatus 400 is basically the same as that of the vehicle travel support apparatus 100 of the first embodiment shown in FIG. 5, so overlapping descriptions are omitted.

  In the vehicle travel support apparatus 400 shown in FIG. 25, the second obstacle detection unit 102 has the function of outputting object identification information in addition to the obstacle size information and the approximate position. The obstacle size information and the approximate position are input to the first obstacle detection unit 101, and the object identification information is input to the obstacle control target determination threshold value calculation unit 115.

  The obstacle control target determination threshold value calculation unit 115 is a newly added component that calculates an obstacle control target threshold value based on the input object identification information. The obstacle determination unit 111 then uses the obstacle control target threshold value calculated by the obstacle control target determination threshold value calculation unit 115, instead of the obstacle determination threshold value, to determine whether an obstacle contacts the vehicle VC.

  The second obstacle detection unit 102 in the fourth embodiment is constituted by the camera 3, as in the first embodiment. In recent years, camera-based obstacle detection has become capable not only of determining the size and approximate position of an obstacle but also of recognizing what the obstacle is and into which object class it falls, for example by using machine learning. In the fourth embodiment, this object recognition result is used to calculate the threshold value applied to the obstacle reliability.

  As an obstacle identification method performed by the second obstacle detection unit 102, for example, obstacle classes may be prepared in advance for objects that commonly become obstacles to a traveling vehicle, such as persons, vehicles, and two-wheeled vehicles; the obstacle is photographed by the camera 3, object recognition is performed by the method described above, and the obstacle is assigned to one of these classes.

  Instead of classifying only objects that are likely to be obstacles to a traveling vehicle as described above, obstacle classes may also be created according to the detection sensitivity of the sonar sensors 2 constituting the first obstacle detection unit 101, and the identification may be performed using machine learning or the like.

<Operation>
The operation of the vehicle travel support apparatus 400 shown in FIG. 25 will be described using the flowcharts shown in FIGS. 26 and 27. Note that the symbols (A) to (C) in FIG. 26 and the symbols (A) to (C) in FIG. 27 are connected to each other.

  The flowcharts shown in FIGS. 26 and 27 are basically the same as those shown in FIGS. 8 and 9. The differences are that, after the host vehicle route is obtained in step S121, the obstacle control target determination threshold value calculation unit 115 performs the process of step S421, which obtains the obstacle control determination threshold value from the obstacle identification information, and that the obstacle determination unit 111 performs the process of step S422, which determines collision obstacles using the obstacle control determination threshold value obtained in step S421 instead of the predetermined obstacle determination threshold value used in the first embodiment. After the collision obstacles are determined in step S422, the collision time is calculated in step S123.

  The process of step S421 in the obstacle control target determination threshold value calculation unit 115 is as follows. To obtain the obstacle control determination threshold value from the obstacle identification information, an obstacle control determination threshold value corresponding to each piece of obstacle identification information is set and stored in advance. As an example, if the obstacle identification information classifies obstacles as a person, a vehicle, a two-wheeled vehicle, and so on, the obstacle control determination threshold values may be set as shown in the table of FIG. 28.

  That is, the determination threshold value is 30 for a person, 70 for a vehicle, 50 for a two-wheeled vehicle, and 40 for other objects. The idea is that, relative to other obstacles, objects that are easy for the sonar sensors 2 constituting the first obstacle detection unit 101 to detect are given a larger obstacle control determination threshold value, while objects that are difficult to detect are given a smaller threshold value so that they are more readily judged to be obstacles. The example shown in FIG. 28 is only an example, and the present invention is not limited to it.
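
  The table lookup of step S421 can be pictured as follows; the class keys and the fallback handling are illustrative assumptions.

```python
# Per-class obstacle control determination thresholds from the table of FIG. 28.
OBSTACLE_CONTROL_THRESHOLDS = {
    "person": 30,        # hard for the sonar sensors to detect -> low threshold
    "vehicle": 70,       # easy to detect -> high threshold
    "two_wheeler": 50,
}

def obstacle_control_threshold(object_identification):
    """Look up the threshold for the identified class, falling back to 40
    for objects classified as 'other'."""
    return OBSTACLE_CONTROL_THRESHOLDS.get(object_identification, 40)
```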

  As described above, in the vehicle travel support apparatus 400 according to the fourth embodiment, the second obstacle detection unit 102 has the function of recognizing an obstacle and outputting object identification information, and the obstacle control target determination threshold value calculation unit 115 calculates the obstacle control target threshold value based on that object identification information. The obstacle determination unit 111 determines whether an obstacle contacts the vehicle VC using the obstacle control target threshold value instead of the obstacle determination threshold value. By using an obstacle control target threshold value set according to the type of obstacle instead of a single predetermined obstacle determination threshold value, a delay in determining an obstacle can be prevented even when the type of obstacle changes.

  The embodiments of the present invention can be freely combined with one another within the scope of the invention, and each embodiment can be appropriately modified or omitted.

  2 sonar sensors, 3 cameras, 12 computing devices, 101 first obstacle detection unit, 102 second obstacle detection unit, 105 estimated obstacle position calculation unit, 106 obstacle reliability calculation unit, 107 obstacle position information Correction unit, 108 Obstacle memory determination unit, 109 Obstacle storage unit, 110 Vehicle path calculation unit, 111 Obstacle determination unit, 112 Collision time calculation unit, 113 Target deceleration calculation unit, 115 Obstacle control target determination threshold Value calculation unit, VC vehicle.

Claims (7)

  1. A first obstacle detection unit that detects obstacles around the vehicle by a plurality of sonar sensors mounted on the vehicle and outputs the obstacle detection position as obstacle detection position information;
    A second obstacle detection unit that captures an image of the obstacle by an imaging device, detects a size and an approximate position of the obstacle based on the image, and outputs the obstacle as size information;
    A storage unit that periodically checks the relative position of the obstacle with respect to the vehicle based on the obstacle detection position information, and stores the relative position one cycle before the confirmed position as obstacle position information;
    Based on the obstacle position information stored in the storage unit, an estimated obstacle position that is a position where the obstacle is estimated to be present due to movement of the vehicle, and reliability of the estimated obstacle position An estimated obstacle position calculation unit that calculates estimated obstacle position information including obstacle reliability that defines
    A host vehicle route calculation unit for calculating a route along which the vehicle travels;
    When it is determined that the obstacle reliability is equal to or higher than a predetermined threshold and the obstacle exists on the route of the vehicle based on the obstacle position information, the obstacle is given to the vehicle. An obstacle determination unit that determines that the object is in contact and outputs the obstacle contact determination result together with the obstacle reliability and the obstacle position information;
    A collision time calculation unit that calculates a collision time, which is an expected time until the obstacle collides with the vehicle, based on the obstacle reliability, the obstacle position information, and the obstacle contact determination result;
    A target deceleration calculating unit that calculates a target deceleration that is a deceleration for decelerating the vehicle based on the collision time;
    A vehicle travel support device that controls a braking device that brakes the vehicle based on the target deceleration,
    The second obstacle detection unit is
    The obstacle size information is input to the first obstacle detection unit,
    The first obstacle detection unit, when a plurality of pieces of obstacle detection position information are detected, uses the obstacle size information detected by the second obstacle detection unit to determine whether the plurality of pieces of obstacle detection position information represent a single obstacle, and, if they represent a single obstacle, outputs center coordinates of the plurality of pieces of obstacle detection position information as the obstacle detection position information. A vehicle travel support device.
  2. Based on the estimated obstacle position information and the obstacle detection position information, when it is determined that the obstacle detection position exists within a predetermined determination distance from the estimated obstacle position, the estimated obstacle position information In the case where it is determined that the obstacle detection position does not exist within the predetermined determination distance from the estimated obstacle position by adding a predetermined addition value to the obstacle reliability included, the obstacle An obstacle reliability calculation unit for subtracting a predetermined subtraction value from the reliability and outputting the result,
    When it is determined that the obstacle detection position exists within the predetermined determination distance from the estimated obstacle position based on the estimated obstacle position information and the obstacle detection position information, the obstacle reliability is set as follows. Using the obstacle position information correction unit that corrects the estimated obstacle position information and outputs the current obstacle position information;
    It is determined whether or not an empty area exists in the storage unit, and if an empty area exists, the current obstacle position information and the obstacle reliability added to or subtracted from by the obstacle reliability calculation unit are stored. The vehicle travel support apparatus according to claim 1, further comprising: an obstacle storage determination unit that stores them in the storage unit.
  3. The obstacle reliability calculation unit is:
    The vehicle travel support device according to claim 2, wherein a value obtained by adding the added value to the obstacle reliability is limited so as not to exceed a predetermined maximum value.
  4. The obstacle size information is also input to the obstacle reliability calculation unit and the obstacle position information correction unit,
    The obstacle reliability calculation unit and the obstacle position information correction unit are:
    The vehicle travel support apparatus according to claim 2, wherein the predetermined determination distance is changed using the obstacle size information.
  5. The obstacle reliability calculation unit is:
    The addition value includes a first addition value corresponding to the obstacle detection position information and a second addition value corresponding to the obstacle size information,
    When the obstacle position information exists within the predetermined determination distance from the estimated obstacle position, the first addition value is added to the obstacle reliability included in the estimated obstacle position information,
    When the approximate position included in the size information exists within the predetermined determination distance from the estimated obstacle position, the second added value is added to the obstacle reliability included in the estimated obstacle position information. And add
    The obstacle memory determination unit
    The vehicular driving support apparatus according to claim 2, wherein when there is an empty area in the storage unit, information on the approximate position included in the obstacle size information is stored.
  6. The obstacle reliability includes a first obstacle reliability and a second obstacle reliability,
    The obstacle reliability calculation unit is:
    The value obtained by adding the first added value to the first obstacle reliability does not exceed a predetermined first maximum value, and the second added value is added to the second obstacle reliability. The vehicle travel support apparatus according to claim 5, wherein the value is limited so as not to exceed a predetermined second maximum value.
  7. The second obstacle detection unit is
    It further has a function of identifying what the obstacle is and outputting it as object identification information,
    The vehicle travel support device includes:
    An obstacle control target determination threshold value calculation unit for calculating an obstacle control target threshold value based on the object identification information;
    The obstacle determination unit
    The vehicle travel support apparatus according to claim 1, wherein, instead of the predetermined threshold value, the obstacle control target threshold value is used to determine whether or not the obstacle contacts the vehicle.