JP5196279B2 - Driving assistance device

Info

Publication number
JP5196279B2
Authority
JP
Japan
Prior art keywords
lane
vehicle
position
virtual
unit
Legal status
Active
Application number
JP2011120643A
Other languages
Japanese (ja)
Other versions
JP2011175676A (en)
Inventor
寛暁 片岡
Original Assignee
トヨタ自動車株式会社
Application filed by トヨタ自動車株式会社
Priority to JP2011120643A
Publication of JP2011175676A
Application granted
Publication of JP5196279B2

Description

  The present invention relates to a driving support device that is mounted on a vehicle and outputs, for example, driving support information to a driver.

Conventionally, a technique is known in which vehicle lane markings (so-called white lines) laid on the right and left sides of the lane in which the vehicle travels are detected through a CCD (Charge Coupled Device) camera or the like, and driving support information is notified to the driver based on the detected lane markings in order to prevent deviation from the lane. However, when a vehicle lane marking cannot be detected, the driving support information may not be notified accurately. Various apparatuses, methods, and the like have been proposed to solve this problem (see, for example, Patent Document 1).

  The partition line recognition apparatus described in Patent Document 1 includes an image processing unit that extracts a white line (corresponding to a vehicle lane marking) drawn on the road surface from an image captured by a camera, and a white line estimation unit that, when only one of the left and right white lines is extracted by the image processing unit, estimates the position of the unextracted white line based on the coordinates of a plurality of sample points set on the extracted white line, road width calculation parameters corresponding to those sample points, and road width data for the lane in which the host vehicle travels. According to this partition line recognition device, the other partition line can be estimated from the one partition line (white line or the like) that could be detected.

JP 2003-44836 A

  However, because the partition line recognition apparatus described in Patent Document 1 estimates the position of the undetected vehicle lane marking based on the road width data of the lane in which the host vehicle travels, it may not be able to estimate that position accurately. Specifically, the apparatus estimates the position of the undetected lane marking from the coordinates of a plurality of sample points set on the extracted lane marking, the road width calculation parameters corresponding to those sample points, and the road width data of the lane in which the host vehicle travels; if the road width calculation parameters and the road width data are not accurate, the position of the undetected lane marking cannot be estimated accurately.

  Further, even when a vehicle lane marking is accurately detected, it may be preferable to set the position of the lane marking used for the purpose of preventing lane departure on the inner side of the detected position. Specifically, when a side wall made of concrete or the like, which would cause serious damage if the vehicle contacted it, stands outside the detected lane marking, it is preferable to set the position of the lane marking used for lane departure prevention inside the detected position so that a collision with the side wall is prevented.

  The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a driving support device capable of appropriately estimating the position of a lane marking and outputting appropriate driving support information.

  A first aspect of the present invention is a driving support device that is mounted on a vehicle and outputs driving support information to a driver, comprising: white line detecting means for detecting vehicle lane markings laid on the right and left sides of the lane in which the vehicle travels; roadside object detecting means for detecting roadside objects laid on the right and left sides of that lane; virtual line estimating means for estimating, based on the vehicle lane markings detected by the white line detecting means and the roadside objects detected by the roadside object detecting means, the position of a virtual lane marking, which is a virtual vehicle lane marking used for outputting the driving support information; and information output means for outputting the driving support information based on the position of the virtual lane marking estimated by the virtual line estimating means.

  According to a second aspect of the present invention, in the first aspect, when the white line detecting means detects a vehicle lane marking on at least one of the right and left sides of the lane in which the vehicle travels, and the roadside object detecting means detects a roadside object on the side where the lane marking is detected, the virtual line estimating means estimates, as the position of the virtual lane marking, a position separated from the position of the detected lane marking toward the inside of the lane by a preset second distance.

  According to a third aspect of the present invention, in the second aspect, the device includes distance setting means for setting the second distance, the roadside object detecting means determines the type of the roadside object, the distance setting means sets the second distance based on the type of roadside object determined by the roadside object detecting means, and the virtual line estimating means estimates, as the position of the virtual lane marking, a position separated from the position of the lane marking toward the inside of the lane by the second distance set by the distance setting means.

  According to a fourth aspect of the present invention, in the third aspect, the distance setting means estimates, based on the type of roadside object determined by the roadside object detecting means, the degree of damage that would result from a collision with the roadside object, and sets a larger value as the second distance as the estimated degree of damage becomes larger.

  According to a fifth aspect of the present invention, in the first aspect, the information output means outputs driving support information for preventing deviation from a traveling lane.

  According to the first aspect, the lane markings laid on the right and left sides of the lane in which the vehicle travels are detected. Then, roadside objects laid on the right and left sides of the lane in which the vehicle travels are detected. Further, based on the detected vehicle lane line and the detected roadside object, the position of a virtual lane line that is a virtual vehicle lane line used for output of driving support information is estimated. Further, driving support information is output based on the estimated position of the virtual lane marking. Therefore, it is possible to appropriately estimate the position of the virtual lane marking that is a virtual vehicle lane marking used for outputting the driving assistance information, and to output appropriate driving assistance information.

  That is, since the position of the virtual lane marking VL2, which is a virtual vehicle lane marking used for outputting the driving assistance information, is estimated based on the detected vehicle lane marking WL and the detected roadside object WA, the position of the virtual lane marking VL2 can be estimated appropriately (see FIG. 6). Moreover, since driving support information is output based on the appropriately estimated position of the virtual lane marking VL2, appropriate driving support information can be output.

  According to the second aspect, the lane marking is detected on at least one of the right side and the left side of the lane in which the vehicle travels, and the roadside object is detected on the side where the lane marking is detected. In this case, a position that is separated from the position of the vehicle lane line by a preset second distance inside the lane in which the vehicle travels is estimated as the position of the virtual lane line. Therefore, the position of the virtual lane marking can be estimated more appropriately.

  That is, when the roadside object WA is detected on the side where the lane marking WL is detected, the position separated from the position of the lane marking WL toward the inside of the lane in which the vehicle travels by the second distance ΔL2 is estimated as the position of the virtual lane marking VL2; therefore, by setting the second distance ΔL2 to an appropriate value, the position of the virtual lane marking VL2 can be estimated more appropriately (see FIG. 6).

  According to the third aspect, the type of the roadside object is determined. Then, the second distance is set based on the determined type of roadside object. Further, a position that is separated from the position of the lane marking by the set second distance inside the lane in which the vehicle travels is estimated as the position of the virtual lane marking. Therefore, the position of the virtual lane marking can be estimated more appropriately.

  That is, since the type of the roadside object WA is determined and the second distance ΔL2 is set based on the determined type of the roadside object WA, the second distance ΔL2 can be set to an appropriate value, and the position of the virtual lane marking VL2 can therefore be estimated more appropriately (see FIG. 6).

  According to the fourth aspect, the degree of damage that would result from a collision with the roadside object is estimated based on the determined type of the roadside object, and a larger value is set as the second distance as the estimated degree of damage becomes larger. Therefore, the position of the virtual lane marking can be estimated more appropriately.

  That is, for a roadside object WA that would cause a large degree of damage in a collision, the position of the virtual lane marking VL2 is estimated with a margin (at a position farther away from the roadside object WA), so the risk of colliding with the roadside object WA can be reduced and the position of the virtual lane marking VL2 can be estimated more appropriately (see FIG. 6).

  According to the fifth aspect, the driving support information for preventing deviation from the traveling lane is output. Accordingly, it is possible to output appropriate driving support information for preventing a deviation from the traveling lane.

FIG. 1 is a block diagram showing an example of the configuration of the driving assistance device according to the present invention.
FIG. 2 is a block diagram showing an example of the functional configuration of the driving assistance ECU according to the first embodiment.
FIG. 3 is a plan view showing an example of a situation in which the position of a lane marking is estimated by the driving assistance ECU according to the first embodiment.
FIG. 4 is a flowchart showing an example of the operation of the driving assistance ECU according to the first embodiment.
FIG. 5 is a block diagram showing an example of the functional configuration of the driving assistance ECU according to the second embodiment.
FIG. 6 is a plan view showing an example of a situation in which the position of a virtual lane marking is estimated by the driving assistance ECU according to the second embodiment.
FIG. 7 is a flowchart showing an example of the operation of the driving assistance ECU according to the second embodiment.
FIG. 8 is a block diagram showing an example of the functional configuration of the driving assistance ECU according to the third embodiment.
FIG. 9 is a plan view showing an example of a situation in which the position of a virtual center dividing line is estimated by the driving assistance ECU according to the third embodiment.
FIG. 10 is a flowchart showing an example of the operation of the driving assistance ECU according to the third embodiment.
FIG. 11 is a block diagram showing an example of the functional configuration of the driving assistance ECU according to the fourth embodiment.
FIG. 12 is a plan view showing an example of a situation in which the position of a lane marking is corrected by the driving assistance ECU according to the fourth embodiment.
FIG. 13 is a flowchart showing an example of the operation of the driving assistance ECU according to the fourth embodiment.
FIG. 14 is a block diagram showing an example of the functional configuration of the driving assistance ECU according to the fifth embodiment.
FIG. 15 is a plan view showing an example of a situation in which the position of a virtual lane marking is estimated by the driving assistance ECU according to the fifth embodiment.
FIG. 16 is a plan view showing an example of a situation in which the position of a virtual lane marking is corrected by the driving assistance ECU according to the fifth embodiment.
FIG. 17 is a flowchart showing an example of the operation of the driving assistance ECU according to the fifth embodiment.
FIG. 18 is a detailed flowchart showing an example of the traveling direction estimation process executed in step S513 of the flowchart of FIG. 17.
FIG. 19 is a detailed flowchart showing an example of the position correction process executed in step S517 of the flowchart of FIG. 17.
FIG. 20 is a graph for explaining an example of the virtual lane marking position correction method performed by the position correction unit.

  Hereinafter, embodiments of a driving support apparatus according to the present invention will be described with reference to the drawings. The driving support apparatus according to the present invention is an apparatus that is mounted on a vehicle and outputs driving support information to a driver. First, an example of the configuration of a driving support device mounted on a vehicle will be described with reference to FIG.

FIG. 1 is a block diagram illustrating an example of a configuration of a driving support apparatus according to the present invention. As shown in FIG. 1, the driving assistance ECU (Electronic Control Unit) 1 (corresponding to the driving assistance device) according to the present invention is communicably connected to an input device 2 and an output device 3 as peripheral devices. Here, a case will be described in which the driving support device outputs driving support information for preventing deviation from the lane in which the vehicle is traveling. In the following description, when it is not necessary to distinguish between the driving support ECUs 11 to 15, they are collectively referred to as the driving support ECU 1.

  First, the input device 2 of the driving assistance ECU 1 will be described with reference to FIG. The input device 2 includes a CCD camera 21, a navigation system 22, a steering torque detection sensor 23, and a turn signal detection switch 24.

The CCD camera 21 includes a CCD (Charge Coupled Device), and is a camera that generates image information of the area ahead of and to the sides of the vehicle. The CCD camera 21 outputs the generated image information to the driving support ECU 1, and the driving support ECU 1 detects lane markings, roadside objects, and the like based on the image information.

In the following first to fifth embodiments, the driving assistance ECU 1 (driving assistance ECUs 11 to 15) detects lane markings, roadside objects, and the like based on image information from the CCD camera 21. However, the driving assistance ECU 1 may detect lane markings, roadside objects, and the like by other methods. For example, the driving support ECU 1 may detect roadside objects and the like via a radar device. In addition, for example, the driving support ECU 1 may detect lane markings, roadside objects, and the like based on image information from another type of camera (for example, a CMOS (Complementary Metal Oxide Semiconductor) camera).

  Here, a “vehicle lane marking” is a marking that divides lanes. In addition to a white line drawn continuously on the road, it includes white or yellow broken lines formed by markings arranged intermittently at predetermined intervals, Botts' dots (raised markers roughly 10 cm in diameter), cat's eyes (reflectors), and the like. “Roadside objects” are objects laid outside the lane, and include guardrails, side walls, median strips, curbs, street trees, and the like.
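For illustration only, detected lane markings and roadside objects might be represented by data structures along the following lines; the class names, fields, and the choice of Python are assumptions made for this sketch and are not part of the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto


class RoadsideObjectType(Enum):
    """Roadside object categories mentioned in the description."""
    GUARDRAIL = auto()
    SIDE_WALL = auto()
    MEDIAN_STRIP = auto()
    CURB = auto()
    STREET_TREE = auto()


@dataclass
class LaneMarking:
    """A detected vehicle lane marking (white line, broken line, Botts' dots, etc.)."""
    side: str                 # "left" or "right" of the host vehicle
    lateral_offset_m: float   # signed lateral distance from the vehicle center line


@dataclass
class RoadsideObject:
    """A detected roadside object laid outside the lane."""
    side: str
    lateral_offset_m: float
    object_type: RoadsideObjectType
```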

The navigation system 22 is a system that includes map information, detects a vehicle position that is a position on the map of the vehicle via a GPS (Global Positioning System), and displays the vehicle position on the map on a display. Further, the navigation system 22 outputs the vehicle position information and map information corresponding to the vehicle position to the driving support ECU 1.

  The steering torque detection sensor 23 is a sensor that detects a steering torque generated as a result of the steering operation performed by the driver via the steering wheel. The steering torque detection sensor 23 outputs a detected steering torque signal to the driving assistance ECU 1.

  The turn signal detection switch 24 is a switch that detects the result of the blinking instruction operation of the turn signal lamp operated by the driver via the turn signal lever. That is, it is a switch that detects whether or not an operation for instructing blinking of the turn signal lamp indicating a change in the route to the right or a change in the route to the left is performed. The turn signal detection switch 24 outputs a signal indicating the detected operation of the turn signal lever (hereinafter referred to as “turn signal operation signal”) to the driving assistance ECU 1.

  Note that the navigation system 22, the steering torque detection sensor 23, and the turn signal detection switch 24 are input devices 2 used only in some of the five embodiments described later, and are therefore indicated by broken lines. For example, the navigation system 22 is used in the third to fifth embodiments described later (see FIGS. 8, 11, and 14).

  Next, the output device 3 of the driving assistance ECU 1 will be described with reference to FIG. 1. The output device 3 includes a display 31, a speaker 32, and a steering control ECU 33. The display 31 is an LCD (Liquid Crystal Display) or the like disposed in front of the driver's seat, and displays images, characters, and the like so as to be visible to the driver in accordance with instructions from the driving support ECU 1. For example, when the driving assistance ECU 1 determines that there is a high possibility of deviating to the right side of the driving lane, a warning screen indicating that the vehicle is about to deviate from the driving lane is displayed on the display 31 based on an instruction from the driving assistance ECU 1.

  The speaker 32 is disposed at the side of the driver's seat or the like, and outputs voice such as guidance to the driver in accordance with instructions from the driving support ECU 1. For example, when the driving support ECU 1 determines that there is a high possibility of deviating to the right side of the driving lane, warning audio indicating that the vehicle is about to deviate to the right side of the driving lane is output from the speaker 32 based on an instruction from the driving support ECU 1.

The steering control ECU (Electronic Control Unit) 33 is an ECU that controls the steering, and applies a predetermined, preset torque to the steering wheel in accordance with instructions from the driving support ECU 1. For example, when the driving assistance ECU 1 determines that there is a high possibility of deviating to the right side of the lane in which the vehicle is traveling, a torque urging the driver to steer to the left is applied to the steering wheel based on an instruction from the driving assistance ECU 1.
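As a rough, hypothetical illustration of how the three output devices might be driven when a departure to the right is judged likely, the sketch below assumes simple display, speaker, and steering interfaces (show_warning, play_warning, apply_torque) that the patent does not define.

```python
PRESET_TORQUE_NM = 1.0  # placeholder magnitude; the patent only says a "predetermined torque set in advance"


def output_departure_warning(display, speaker, steering_ecu, departure_side: str) -> None:
    """Dispatch driving support information to the display, speaker, and steering ECU.

    `display`, `speaker`, and `steering_ecu` are assumed to expose simple
    show_warning / play_warning / apply_torque methods; they stand in for the
    devices 31, 32, and 33 and are not defined by the patent.
    """
    message = f"Vehicle is about to deviate to the {departure_side} side of the lane"
    display.show_warning(message)   # warning screen on the display 31
    speaker.play_warning(message)   # warning audio from the speaker 32
    # Torque urging the driver to steer back toward the lane center: a departure
    # to the right produces a leftward corrective torque, and vice versa.
    direction = -1.0 if departure_side == "right" else 1.0
    steering_ecu.apply_torque(direction * PRESET_TORQUE_NM)
```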

<First Embodiment>
FIG. 2 is a block diagram illustrating an example of a functional configuration of the driving assistance ECU 11 according to the first embodiment. As shown in FIG. 2, the driving assistance ECU 11 functionally includes a white line detection unit 111, a roadside object detection unit 112, a white line estimation unit 113, and an information output unit 114.

  The driving support ECU 11 causes a microcomputer disposed at an appropriate position in the driving support ECU 11 to execute a control program stored in advance in a ROM (Read Only Memory) or the like disposed at an appropriate position in the driving support ECU 11, thereby causing the microcomputer to function as the white line detection unit 111, the roadside object detection unit 112, the white line estimation unit 113, the information output unit 114, and other functional units. Below, each functional unit of the driving assistance ECU 11 is described with reference to FIG. 3.

  FIG. 3 is a plan view illustrating an example of a situation in which the position of a lane marking is estimated by the driving assistance ECU 11 according to the first embodiment. The upper diagram in FIG. 3 is a plan view showing an example of a situation in which the position of the lane marking VL1 is estimated by the driving assistance ECU 11, and the lower diagram in FIG. 3 is a plan view showing the position of the lane marking VL1 estimated by the driving assistance ECU 11. As shown in the upper diagram of FIG. 3, the vehicle VC travels upward in the lane partitioned by the vehicle lane markings WL and WR. Further, a guardrail (= roadside object) GL is laid along the vehicle lane marking WL outside the lane marking WL. In the range NL in front of the vehicle VC, the lane marking WL has disappeared (or is blurred).

  As shown in the upper diagram of FIG. 3, when part of the lane markings WL, WR has disappeared (or is blurred), it is difficult to appropriately determine the possibility of deviation from the lane being traveled based on the lane markings WL, WR. In view of this problem, the driving assistance ECU 11 according to the first embodiment estimates the position of the vehicle lane marking VL1 in the range NL shown in the upper diagram of FIG. 3, as shown in the lower diagram of FIG. 3, and determines the possibility of deviation from the lane based on the estimated position of the lane marking VL1.

  The white line detection unit 111 (corresponding to the white line detection means) is a functional unit that detects the vehicle lane markings WR and WL laid on the right and left sides of the lane in which the vehicle VC travels, based on image information from the CCD camera 21.

  The roadside object detection unit 112 (corresponding to the roadside object detection means) is a functional unit that, when the white line detection unit 111 detects the vehicle lane marking WR on one side but does not detect the vehicle lane marking WL on the other side, detects the roadside object GL on that other side based on image information from the CCD camera 21. Specifically, the roadside object detection unit 112 detects, among the roadside objects on the other side, the roadside object GL laid along the one-side lane marking WR. In other words, the roadside object detection unit 112 detects, among the roadside objects on the side where the vehicle lane marking WL is not detected, the roadside object GL laid along the detected vehicle lane marking WR.

  As shown in the upper diagram of FIG. 3, the roadside object (= guardrail) GL is laid along the detected vehicle lane marking WR; therefore, the roadside object detection unit 112 detects the guardrail GL as the roadside object used by the white line estimation unit 113 to estimate the position of the lane marking VL1 corresponding to the range NL shown in the upper diagram of FIG. 3. When the detected vehicle lane marking WR is a straight line, the roadside object detection unit 112 detects a roadside object that is substantially parallel to the vehicle lane marking WR.

  Thus, since the roadside object GL laid along the detected vehicle lane marking WR is detected among the roadside objects on the side where the vehicle lane marking WL is not detected, roadside objects GL such as guardrails, median strips, side walls, and curbs, which are useful for estimating the position of the lane marking VL1 on the side where the lane marking WL is not detected, can be detected.

In the first embodiment, the case where the roadside object detection unit 112 detects the roadside object GL laid along the detected vehicle lane marking WR is described; however, any form in which a roadside object on the side where the lane marking WL is not detected is detected may be used. For example, the roadside object detection unit 112 may detect the roadside object closest to the lane on the side where the lane marking WL is not detected. In this case, when roadside trees are planted along the lane (or when poles are laid), these roadside trees (or poles) can be detected as the roadside object.

  The white line estimation unit 113 (corresponding to the white line estimation means) is a functional unit that estimates the position of the other-side lane marking VL1 based on the roadside object GL detected by the roadside object detection unit 112. Specifically, the white line estimation unit 113 estimates, as the position of the other-side lane marking VL1, a position separated from the position of the roadside object GL detected by the roadside object detection unit 112 toward the inside of the lane in which the vehicle VC travels by a preset first distance ΔL1.

  Thus, the position separated from the detected position of the roadside object GL toward the inside of the lane in which the vehicle VC travels by the preset first distance ΔL1 is estimated as the position of the other-side lane marking VL1. Therefore, by setting the first distance ΔL1 to an appropriate value, the position of the lane marking VL1 can be estimated more appropriately.
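A minimal sketch of this estimation, assuming lateral positions are expressed as signed offsets from the vehicle center (positive to the left); the function name and the numeric values in the example are illustrative, not taken from the patent.

```python
def estimate_missing_left_marking(roadside_offset_m: float, delta_l1_m: float) -> float:
    """Estimate the position of the undetected left lane marking VL1.

    The estimate is the detected roadside object position (e.g. the guardrail GL)
    shifted toward the inside of the lane by the preset first distance ΔL1.
    Lateral positions are signed offsets from the vehicle center, positive to the left.
    """
    return roadside_offset_m - delta_l1_m


# Illustrative values only: a guardrail detected 2.3 m to the left and ΔL1 = 0.4 m
# give an estimated left lane marking about 1.9 m to the left of the vehicle.
vl1_offset_m = estimate_missing_left_marking(2.3, 0.4)
```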

  For example, the type of the roadside object GL may be determined, the degree of damage that would result from a collision with the roadside object GL may be estimated based on the determined type, and a larger value may be set as the first distance ΔL1 as the estimated degree of damage becomes larger. In this case, for a roadside object GL that would cause a large degree of damage in a collision, the position of the lane marking VL1 is estimated with a margin (= at a position farther away from the roadside object GL), so the risk of colliding with the roadside object GL can be reduced and the position of the lane marking VL1 can be estimated more appropriately.

  In the first embodiment, the case where the white line estimation unit 113 estimates a position separated from the position of the roadside object GL toward the inside of the lane by the first distance ΔL1 as the position of the lane marking VL1 is described; however, the white line estimation unit 113 only needs to estimate the position of the other-side lane marking VL1 based on the roadside object GL detected by the roadside object detection unit 112. For example, the white line estimation unit 113 may estimate the position of the roadside object GL itself as the position of the lane marking VL1. In this case, the processing is simplified. In addition, when a curb or the like is laid in the vicinity of the lane marking, an appropriate position can be estimated as the position of the lane marking VL1.

  The information output unit 114 (corresponding to the information output means) is a functional unit that outputs driving support information based on the position of the other-side lane marking VL1 estimated by the white line estimation unit 113. Specifically, the information output unit 114 determines the possibility of departure from the lane being traveled based on the lane marking WR detected by the white line detection unit 111 and the lane marking VL1 estimated by the white line estimation unit 113, and outputs driving support information (here, an alarm or the like) via the output device 3 when it determines that the possibility of departure is high.
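The patent does not specify how the possibility of departure is computed; the following sketch assumes a simple criterion in which departure is judged likely when the vehicle edge comes within a margin of a detected or estimated lane boundary. The vehicle half width and margin are placeholder values.

```python
VEHICLE_HALF_WIDTH_M = 0.9   # assumed half width of the host vehicle
DEPARTURE_MARGIN_M = 0.2     # assumed margin; the patent does not specify a criterion


def departure_likely(left_marking_offset_m: float, right_marking_offset_m: float) -> bool:
    """Return True when the vehicle edge is close to either lane boundary.

    Offsets are signed lateral distances from the vehicle center, positive to the
    left; in the usual case left_marking_offset_m > 0 and right_marking_offset_m < 0.
    """
    left_clearance = left_marking_offset_m - VEHICLE_HALF_WIDTH_M
    right_clearance = -right_marking_offset_m - VEHICLE_HALF_WIDTH_M
    return min(left_clearance, right_clearance) < DEPARTURE_MARGIN_M
```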

  FIG. 4 is a flowchart illustrating an example of the operation of the driving assistance ECU 11 according to the first embodiment. Here, for convenience, the case where the lane marking WR laid on the right side of the lane is detected by the white line detection unit 111 will be described. Further, in the flowchart shown in FIG. 4, “vehicle lane marking” is described as “white line” for convenience. First, the white line detection unit 111 detects the lane markings WR and WL laid on the right side and the left side of the lane in which the vehicle VC travels (S101). Then, the roadside object detection unit 112 determines whether or not the left lane marking WL is detected (S103).

  If it is determined that the left lane marking WL has been detected (YES in S103), the information output unit 114 determines the possibility of departure from the lane based on the right and left lane markings WR and WL detected in step S101 (S105), and the process proceeds to step S115. If it is determined that the left lane marking WL has not been detected (NO in S103), it is determined whether or not a roadside object GL has been detected on the side where the vehicle lane marking is not detected (here, the left side) (S107). If it is determined that no roadside object GL has been detected (NO in S107), the determination of departure from the lane by the information output unit 114 is stopped (S109), the process returns to step S101, and the processes from step S101 onward are repeatedly executed.

  If it is determined that the roadside object GL has been detected (YES in S107), the white line estimation unit 113 estimates, based on the roadside object GL, the position of the lane marking VL1 on the side that was not detected in step S101 (here, the left side) (S111). Then, the information output unit 114 determines the possibility of departure from the lane based on the lane marking WR detected in step S101 and the lane marking VL1 estimated in step S111 (S113). When the process of step S105 or step S113 is completed, the information output unit 114 determines whether or not the possibility of departure from the lane is high (S115). If it is determined that the possibility of departure from the lane is high (YES in S115), an alarm or the like is output by the information output unit 114 (S117), the process returns to step S101, and the processes from step S101 onward are repeatedly executed. If it is determined that the possibility of departure from the lane is not high (NO in S115), the process returns to step S101, and the processes from step S101 onward are repeatedly executed.
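For reference, the decision flow of steps S101 to S117 can be summarised roughly as follows, reusing the hypothetical estimate_missing_left_marking and departure_likely helpers sketched above; the callables passed in are placeholders for the functional units 111 to 114.

```python
def first_embodiment_cycle(detect_markings, detect_left_roadside, delta_l1_m, warn):
    """One processing cycle corresponding to steps S101-S117 of FIG. 4.

    detect_markings() returns (right_offset, left_offset) with None for an
    undetected side; detect_left_roadside() returns the roadside object offset
    or None; warn() outputs the alarm of step S117. All callables are
    placeholders for the functional units 111-114.
    """
    right, left = detect_markings()                                 # S101
    if right is None:
        return  # the example assumes the right marking WR is detected
    if left is None:                                                # S103
        roadside = detect_left_roadside()                           # S107
        if roadside is None:
            return                                                  # S109: stop the departure judgement
        left = estimate_missing_left_marking(roadside, delta_l1_m)  # S111
    if departure_likely(left, right):                               # S105 / S113 / S115
        warn()                                                      # S117
```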

  In this manner, when the one-side lane marking WR is detected and the other-side lane marking WL is not detected, the roadside object GL on the other side is detected and the position of the other-side lane marking VL1 is estimated based on the detected roadside object GL, so the position of the lane marking VL1 can be estimated appropriately. Moreover, since an alarm or the like is output based on the appropriately estimated position of the vehicle lane marking VL1, an appropriate alarm or the like can be output.

<Second Embodiment>
FIG. 5 is a block diagram illustrating an example of a functional configuration of the driving assistance ECU 12 according to the second embodiment. As shown in FIG. 5, the driving support ECU 12 functionally includes a white line detection unit 121, a roadside object detection unit 122, a distance setting unit 123, a virtual line estimation unit 124, and an information output unit 125.

  The driving support ECU 12 causes a microcomputer disposed at an appropriate position in the driving support ECU 12 to execute a control program stored in advance in a ROM or the like disposed at an appropriate position in the driving support ECU 12, thereby causing the microcomputer to function as the white line detection unit 121, the roadside object detection unit 122, the distance setting unit 123, the virtual line estimation unit 124, the information output unit 125, and other functional units. Below, each functional unit of the driving assistance ECU 12 is described with reference to FIG. 6.

  FIG. 6 is a plan view illustrating an example of a situation in which the position of a virtual lane marking is estimated by the driving assistance ECU 12 according to the second embodiment. The upper diagram in FIG. 6 is a plan view showing an example of the situation in which the position of the virtual lane marking VL2 is estimated by the driving assistance ECU 12, and the lower diagram in FIG. 6 is a plan view showing the position of the virtual lane marking VL2 estimated by the driving assistance ECU 12. As shown in the upper diagram of FIG. 6, the vehicle VC travels upward in the lane partitioned by the vehicle lane markings WL and WR. Further, a side wall (= roadside object) WA is laid along the vehicle lane marking WL outside the lane marking WL.

  As shown in the upper diagram of FIG. 6, when a roadside object WA is laid in the vicinity of the lane markings WL, WR, it may not be appropriate to determine the possibility of deviation from the traveling lane based on the lane markings WL, WR alone. In view of this problem, the driving assistance ECU 12 according to the second embodiment estimates, based on the roadside object WA, the position of the virtual lane marking VL2 facing the side wall (= roadside object) WA as shown in the lower diagram of FIG. 6, and determines the possibility of deviation from the lane based on the estimated position of the virtual lane marking VL2.

  The white line detection unit 121 (corresponding to the white line detection means) is a functional unit that detects the lane markings WR and WL laid on the right and left sides of the lane in which the vehicle VC travels, based on image information from the CCD camera 21.

  The roadside object detection unit 122 (corresponding to the roadside object detection means) is a functional unit that detects roadside objects WA laid on the right and left sides of the lane in which the vehicle VC travels, based on image information from the CCD camera 21. Further, the roadside object detection unit 122 determines the type of the detected roadside object WA. Specifically, the roadside object detection unit 122 makes the determination by selecting the corresponding type from among guardrails, side walls, median strips, curbs, street trees, and the like that are assumed in advance as types of the roadside object WA.

The distance setting unit 123 (corresponding to the distance setting means) is a functional unit that sets the second distance ΔL2 between the vehicle lane marking WL detected by the white line detection unit 121 and the virtual lane marking VL2 estimated by the virtual line estimation unit 124. Specifically, the distance setting unit 123 sets the second distance ΔL2 based on the type of the roadside object WA determined by the roadside object detection unit 122. Here, the distance setting unit 123 estimates the degree of damage that would result from a collision with the roadside object WA based on the type of the roadside object WA determined by the roadside object detection unit 122, and sets a larger value as the second distance ΔL2 as the estimated degree of damage becomes larger.

  For example, when the roadside object detection unit 122 determines the type of the roadside object WA by selecting the corresponding type from among side wall, guardrail, median strip, roadside tree, and curb, the second distance ΔL2 is set as follows. The distance setting unit 123 sets larger values as the second distance ΔL2 in the order of side wall, guardrail, median strip, roadside tree, and curb, which is the descending order of damage when the vehicle collides with the roadside object WA. For example, when the roadside object detection unit 122 determines that the type of the roadside object WA is a side wall, a guardrail, a median strip, a roadside tree, or a curb, the distance setting unit 123 sets the second distance ΔL2 to 1.0 m, 0.7 m, 0.6 m, 0.5 m, or 0.1 m, respectively.
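Using the example values given above, the mapping from roadside object type to the second distance ΔL2, and the resulting virtual lane marking position, might be sketched as follows (signed lateral offsets from the vehicle center, positive to the left). Only the numeric values come from the description; the code structure is an assumption, and RoadsideObjectType refers to the enum sketched earlier.

```python
# Second distance ΔL2 per roadside object type, using the example values above.
DELTA_L2_BY_TYPE_M = {
    RoadsideObjectType.SIDE_WALL: 1.0,
    RoadsideObjectType.GUARDRAIL: 0.7,
    RoadsideObjectType.MEDIAN_STRIP: 0.6,
    RoadsideObjectType.STREET_TREE: 0.5,
    RoadsideObjectType.CURB: 0.1,
}


def estimate_virtual_marking(left_marking_offset_m: float,
                             roadside_type: RoadsideObjectType) -> float:
    """Estimate the virtual lane marking VL2 from the detected left marking WL.

    VL2 is the detected marking shifted toward the inside of the lane by ΔL2,
    where ΔL2 grows with the damage expected from a collision with the roadside
    object (side wall > guardrail > median strip > roadside tree > curb).
    Offsets are signed lateral distances from the vehicle center, positive to the left.
    """
    return left_marking_offset_m - DELTA_L2_BY_TYPE_M[roadside_type]
```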

  The virtual line estimation unit 124 (corresponding to the virtual line estimation means) is a functional unit that estimates, based on the lane markings WR and WL detected by the white line detection unit 121 and the roadside object WA detected by the roadside object detection unit 122, the position of the virtual lane marking VL2, which is a virtual lane marking used for outputting driving support information (here, an output such as a warning indicating departure from the lane). Specifically, when the white line detection unit 121 detects a lane marking on at least one of the right and left sides of the lane in which the vehicle travels, and the roadside object detection unit 122 detects the roadside object WA on the side where the lane marking is detected (here, the left side of the lane), the virtual line estimation unit 124 estimates, as the position of the virtual lane marking VL2, the position separated from the position of the vehicle lane marking WL toward the inside of the lane in which the vehicle VC travels by the second distance ΔL2 set by the distance setting unit 123.

  Thus, when the roadside object WA is detected on the side where the lane marking WL is detected, the position separated from the position of the lane marking WL toward the inside of the lane in which the vehicle travels by the second distance ΔL2 is estimated as the position of the virtual lane marking VL2; therefore, by setting the second distance ΔL2 to an appropriate value, the position of the virtual lane marking VL2 can be estimated more appropriately.

  In the second embodiment, the case where the virtual line estimation unit 124 estimates, when the roadside object WA is detected, the position separated from the position of the vehicle lane marking WL toward the inside of the lane by the second distance ΔL2 as the position of the virtual lane marking VL2 is described; however, the virtual line estimation unit 124 only needs to estimate the position of the virtual lane marking VL2 based on the vehicle lane marking WL and the roadside object WA. For example, the virtual line estimation unit 124 may estimate the position of the vehicle lane marking WL as the position of the virtual lane marking VL2 when the roadside object WA is laid outside the vehicle lane marking WL, and may estimate the position of the roadside object WA as the position of the virtual lane marking VL2 when the roadside object WA is laid inside the vehicle lane marking WL. In this case, the processing is simplified.

  In addition, since the type of the roadside object WA is determined and the second distance ΔL2 is set based on the determined type of the roadside object WA, the second distance ΔL2 can be set to an appropriate value, and the position of the virtual lane marking VL2 can therefore be estimated more appropriately.

  Further, for a roadside object WA that would cause a large degree of damage in a collision, the position of the virtual lane marking VL2 is estimated with a margin (at a position farther away from the roadside object WA), so the risk of colliding with the roadside object WA can be reduced and the position of the virtual lane marking VL2 can be estimated more appropriately.

In the second embodiment, the case where the distance setting unit 123 sets a larger value as the second distance ΔL2 as the degree of damage from a collision with the roadside object WA becomes larger is described; however, the second distance ΔL2 may be set by another method. For example, the distance setting unit 123 may estimate the risk posed when a lane departure occurs based on the type of the roadside object WA, and set a larger value as the second distance ΔL2 as the estimated risk increases. For example, when the roadside object WA is a rigidly laid guardrail for preventing the vehicle from falling off a cliff, the risk when a lane departure occurs is high, so a large value is set as the second distance ΔL2.

  The information output unit 125 (corresponding to the information output means) is a functional unit that outputs driving support information based on the position of the virtual lane marking VL2 estimated by the virtual line estimation unit 124. Specifically, the information output unit 125 determines the possibility of departure from the traveling lane based on the vehicle lane markings WR and WL detected by the white line detection unit 121 and the virtual lane marking VL2 estimated by the virtual line estimation unit 124, and outputs driving support information (here, an alarm or the like) via the output device 3 when it determines that the possibility of departure is high.

  FIG. 7 is a flowchart showing an example of the operation of the driving assistance ECU 12 according to the second embodiment. Here, for convenience, the case where the lane marking WR laid on the right side of the lane is detected by the white line detection unit 121 will be described. Further, in the flowchart shown in FIG. 7, “vehicle lane marking” is written as “white line” for convenience. First, the white line detection unit 121 detects the lane markings WR and WL laid on the right and left sides of the lane in which the vehicle VC travels (S201). Then, the roadside object detection unit 122 determines whether or not the left lane marking WL has been detected (S203).

  If it is determined that the left lane marking WL has not been detected (NO in S203), the determination of departure from the lane by the information output unit 125 is stopped (S205), the process returns to step S201, and the processes from step S201 onward are repeatedly executed. If it is determined that the left lane marking WL has been detected (YES in S203), the roadside object detection unit 122 determines whether or not the left roadside object WA has been detected (S207). If it is determined that the roadside object WA has not been detected (NO in S207), the possibility of departure from the lane is determined based on the right and left lane markings WR and WL detected in step S201 (S209), and the process proceeds to step S217.

  If it is determined that the roadside object WA has been detected (YES in S207), the type of the roadside object WA is determined by the roadside object detection unit 122 (S211). Then, the distance setting unit 123 sets the second distance ΔL2 based on the type of the roadside object WA determined in step S211 (S213). Next, the virtual line estimation unit 124 estimates the position of the virtual lane marking VL2 based on the vehicle lane marking WL detected in step S201 and the second distance ΔL2 set in step S213, and the possibility of departure from the lane is determined based on the estimated virtual lane marking VL2 and the lane marking WR detected in step S201 (S215).

When the process of step S209 or step S215 is completed, the information output unit 125 determines whether or not there is a high possibility of departure from the lane (S217). If it is determined that there is a high possibility of departure from the lane (YES in S217), an alarm or the like is output by the information output unit 125 (S219), the process returns to step S201, and the processes after step S201 are performed. Is repeatedly executed. If it is determined that the possibility of departure from the lane is not high (NO in S217), the process returns to step S201, and the processes after step S201 are repeatedly executed.
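For comparison with the first embodiment, the flow of steps S201 to S219 can be summarised roughly as follows, again reusing the hypothetical helpers sketched above; all callables are placeholders for the functional units 121 to 125.

```python
def second_embodiment_cycle(detect_markings, detect_left_roadside_type, warn):
    """One processing cycle corresponding to steps S201-S219 of FIG. 7.

    detect_left_roadside_type() returns a RoadsideObjectType or None; all
    callables are placeholders for the functional units 121-125.
    """
    right, left = detect_markings()                        # S201
    if left is None or right is None:
        return                                             # S205: stop the departure judgement
    roadside_type = detect_left_roadside_type()            # S207
    if roadside_type is not None:
        # S211-S215: determine the type, set ΔL2, and shift WL inward to VL2.
        left = estimate_virtual_marking(left, roadside_type)
    if departure_likely(left, right):                      # S209 / S215 / S217
        warn()                                             # S219
```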

  In this way, based on the detected vehicle lane line WL and the detected roadside object WA, the position of the virtual lane line VL2 that is a virtual vehicle lane line used for output of driving assistance information is estimated. Therefore, the position of the virtual lane marking VL2 can be estimated appropriately. In addition, since an alarm or the like is output based on the position of the virtual lane marking VL2 that is appropriately estimated, an appropriate alarm or the like can be output.

<Third Embodiment>
FIG. 8 is a block diagram illustrating an example of a functional configuration of the driving support ECU 13 according to the third embodiment. As shown in FIG. 8, the driving support ECU 13 functionally includes a white line detection unit 131, a lane determination unit 132, a center line estimation unit 133, and an information output unit 134.

The driving support ECU 13 causes a microcomputer disposed at an appropriate position in the driving support ECU 13 to execute a control program stored in advance in a ROM or the like disposed at an appropriate position in the driving support ECU 13, thereby causing the microcomputer to function as the white line detection unit 131, the lane determination unit 132, the center line estimation unit 133, the information output unit 134, and other functional units. Below, each functional unit of the driving assistance ECU 13 is described with reference to FIG. 9.

  FIG. 9 is a plan view illustrating an example of a situation in which the position of a virtual center dividing line is estimated by the driving assistance ECU 13 according to the third embodiment. The upper diagram in FIG. 9 is a plan view showing an example of the situation in which the position of the virtual center dividing line VL3 is estimated by the driving support ECU 13, and the lower diagram in FIG. 9 is a plan view showing the position of the virtual center dividing line VL3 estimated by the driving support ECU 13. As shown in the upper diagram of FIG. 9, the vehicle VC travels upward in a lane that is defined by the lane markings WL and WR and has no central dividing line.

On a road without a central dividing line as shown in the upper diagram of FIG. 9, a warning or the like may not be output appropriately when the possibility of deviation from the lane in which the vehicle is traveling is determined based on the lane markings WL and WR. In other words, in order to avoid the danger of a collision with an oncoming vehicle, it may be preferable to travel along the left lane marking WL (or the right lane marking WR). In view of this problem, the driving assistance ECU 13 according to the third embodiment estimates the position of the virtual center dividing line VL3 based on the lane markings WL and WR as shown in the lower diagram of FIG. 9, and determines the possibility of deviation from the lane based on the estimated position of the virtual center dividing line VL3.

  The white line detection unit 131 (corresponding to the white line detection means) is a functional unit that detects the vehicle lane markings WR and WL laid on the right and left sides of the lane in which the vehicle VC travels, based on image information from the CCD camera 21.

  The lane determination unit 132 (corresponding to the lane determination means) is a functional unit that determines whether or not the road on which the vehicle VC travels is a road without a central dividing line, based on the position information of the vehicle VC from the navigation system 22 and the map information. Specifically, the lane determination unit 132 acquires from the navigation system 22 travel lane information, which is information about the lane in which the vehicle VC is traveling, based on the position information of the vehicle VC and the map information, and determines based on the travel lane information whether or not the road on which the vehicle VC travels is a road without a central dividing line.

  The center line estimation unit 133 (corresponding to the center line estimation means) is a functional unit that, when the lane determination unit 132 determines that the road has no central dividing line, estimates the position of the virtual center dividing line VL3 based on the lane markings WL and WR detected by the white line detection unit 131. Specifically, the center line estimation unit 133 estimates the approximate center position between the lane markings WL and WR on both sides detected by the white line detection unit 131 as the position of the virtual center dividing line VL3.

  That is, as shown in FIG. 9, based on the distance RW between the vehicle lane marking WL and the vehicle lane marking WR, the center line estimation unit 133 estimates the position of the virtual center dividing line VL3 so that the distance RW1 between the vehicle lane marking WL and the virtual center dividing line VL3 substantially coincides with the distance RW2 between the vehicle lane marking WR and the virtual center dividing line VL3. In other words, the position of the virtual center dividing line VL3 is estimated so that the distance RW1 and the distance RW2 are each approximately 1/2 of the distance RW.
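A minimal sketch of this midpoint estimation, using the same signed lateral offsets (positive to the left of the vehicle center) as in the earlier sketches:

```python
def estimate_center_line(left_offset_m: float, right_offset_m: float) -> float:
    """Estimate the virtual center dividing line VL3 as the midpoint of WL and WR,
    so that RW1 = RW2 = RW / 2 (signed lateral offsets, positive to the left)."""
    return (left_offset_m + right_offset_m) / 2.0
```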

  Thus, since the approximate center position between the detected lane markings WR and WL on both sides is estimated as the position of the virtual center dividing line VL3, the position of the virtual center dividing line VL3 used for outputting the driving assistance information can be estimated appropriately.

  In the third embodiment, the case where the center line estimation unit 133 estimates the approximate center position between the lane markings WL and WR on both sides as the position of the virtual center dividing line VL3 is described; however, any form in which the position of the virtual center dividing line VL3 is estimated based on the lane markings WL and WR detected by the white line detection unit 131 may be used. For example, the center line estimation unit 133 may estimate the position of the virtual center dividing line VL3 based on the distance RW between the lane markings WL and WR on both sides.

  Specifically, for example, when the distance RW is greater than or equal to a preset threshold value (for example, 5 m), the center line estimation unit 133 estimates a position separated from the left lane marking WL by a predetermined distance (for example, 2 m) as the position of the virtual center dividing line VL3. Further, when the distance RW is less than the preset threshold value (here, 5 m), the center line estimation unit 133 estimates the approximate center position between the lane markings WR and WL on both sides as the position of the virtual center dividing line VL3. In this case, the virtual center dividing line VL3 can be estimated at a more appropriate position.
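The threshold-based variant might be sketched as follows; the 5 m threshold and the 2 m offset are the example values from the text, while the function itself (and its reuse of estimate_center_line above) is an assumption.

```python
def estimate_center_line_with_threshold(left_offset_m: float,
                                        right_offset_m: float,
                                        width_threshold_m: float = 5.0,
                                        offset_from_left_m: float = 2.0) -> float:
    """Wide road: keep VL3 a fixed distance inside the left marking WL;
    narrow road: fall back to the midpoint of WL and WR."""
    road_width_m = left_offset_m - right_offset_m   # RW, with left positive and right negative
    if road_width_m >= width_threshold_m:
        return left_offset_m - offset_from_left_m
    return estimate_center_line(left_offset_m, right_offset_m)
```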

The information output unit 134 (corresponding to the information output means) is a functional unit that outputs driving support information based on the position of the virtual center dividing line VL3 estimated by the center line estimation unit 133. Specifically, the information output unit 134 determines the possibility of departure from the traveling lane based on the lane marking WL detected by the white line detection unit 131 and the virtual center dividing line VL3 estimated by the center line estimation unit 133, and outputs driving support information (here, an alarm or the like) via the output device 3 when it determines that the possibility of departure is high.

  FIG. 10 is a flowchart illustrating an example of the operation of the driving support ECU 13 according to the third embodiment. In the flowchart shown in FIG. 10, for the sake of convenience, “vehicle lane marking” is written as “white line”. First, the white line detection unit 131 detects the lane markings WR and WL laid on the right and left sides of the lane in which the vehicle VC travels (S301). Then, the lane determination unit 132 determines whether or not the lane markings WR and WL on both sides have been detected in step S301 (S303). If it is determined that at least one of the lane markings WR and WL has not been detected (NO in S303), the determination of departure from the lane by the information output unit 134 is stopped (S305), the process returns to step S301, and the processes from step S301 onward are repeatedly executed.

  If it is determined that the lane markings WR and WL on both sides have been detected (YES in S303), the lane determination unit 132 acquires travel lane information from the navigation system 22 (S307). Then, the lane determination unit 132 determines whether or not the road on which the vehicle VC travels is a road without a central dividing line, based on the travel lane information acquired in step S307 (S309). If it is determined that the road has a central dividing line (NO in S309), the information output unit 134 determines the possibility of departure from the lane based on the lane markings WL and WR detected in step S301 (S311), and the process proceeds to step S317.

  If it is determined that the road has no central dividing line (YES in S309), the center line estimation unit 133 estimates the position of the virtual center dividing line VL3 based on the lane markings WL and WR detected in step S301 (S313). Then, the possibility of departure from the lane is determined based on the lane marking WL detected in step S301 and the virtual center dividing line VL3 estimated in step S313 (S315).

  When the process of step S311 or step S315 is completed, the information output unit 134 determines whether or not there is a high possibility of departure from the lane (S317). If it is determined that there is a high possibility of departure from the lane (YES in S317), an alarm or the like is output by the information output unit 134 (S319), the process returns to step S301, and the processes after step S301 are performed. Is repeatedly executed. If it is determined that the possibility of departure from the lane is not high (NO in S317), the process returns to step S301, and the processes after step S301 are repeatedly executed.

  In this way, since whether or not the road on which the vehicle VC travels is a road without a central dividing line is determined based on the position information of the vehicle and the map information, this can be determined appropriately. In addition, when it is determined that the road has no central dividing line, the position of the virtual center dividing line VL3 is estimated based on the detected lane markings WR and WL, so an appropriate position can be estimated for the virtual center dividing line VL3. Furthermore, since an alarm or the like is output based on the position of the virtual center dividing line VL3 estimated at an appropriate position, an appropriate alarm or the like can be output.

<Fourth Embodiment>
FIG. 11 is a block diagram illustrating an example of a functional configuration of the driving assistance ECU 14 according to the fourth embodiment. As shown in FIG. 11, the driving support ECU 14 functionally includes a white line detection unit 141, a stop zone determination unit 142, a lane marking correction unit 143, and an information output unit 144.

  The driving support ECU 14 causes a microcomputer disposed at an appropriate position in the driving support ECU 14 to execute a control program stored in advance in a ROM or the like disposed at an appropriate position in the driving support ECU 14, thereby causing the microcomputer to function as the white line detection unit 141, the stop zone determination unit 142, the lane marking correction unit 143, the information output unit 144, and other functional units. Below, each functional unit of the driving assistance ECU 14 is described with reference to FIG. 12.

FIG. 12 is a plan view illustrating an example of a situation in which the position of a lane marking is corrected by the driving assistance ECU 14 according to the fourth embodiment. The upper diagram in FIG. 12 is a plan view showing an example of a situation in which the position of the lane marking WLB is corrected by the driving support ECU 14, and the lower diagram in FIG. 12 is a plan view showing the position of the corrected lane marking VWL obtained by the driving support ECU 14. As shown in the upper diagram of FIG. 12, the vehicle VC travels upward in a lane that is defined by the lane markings WL and WR and has no central dividing line.

  As shown in FIG. 12, in a vehicle stop zone ARB such as a bus stop, the left lane marking WL is widened outward by the stop lane width, so if the possibility of departure from the traveling lane is determined based on the lane markings WL and WR, an alarm or the like may not be output appropriately. In other words, where a vehicle stop zone ARB such as a bus stop is laid, it may not be preferable to travel along the left lane marking WL (or the right lane marking WR). In view of this problem, the driving assistance ECU 14 according to the fourth embodiment corrects the position of the lane marking WLB to the position of the lane marking VWL as shown in the lower diagram of FIG. 12, and determines the possibility of departure from the lane based on the corrected lane marking VWL.

  The white line detection unit 141 (corresponding to the white line detection means) is a functional unit that detects the lane markings WR and WL laid on the right side and the left side of the lane in which the vehicle VC travels, based on image information from the CCD camera 21.

  The stop zone determination unit 142 (corresponding to the stop zone determination means) is a functional unit that determines, based on the position information of the vehicle VC from the navigation system 22 and the map information, whether or not there is a vehicle stop zone ARB widened by a preset stop lane width ahead of the vehicle VC. Specifically, the stop zone determination unit 142 acquires stop zone information, which is information related to a vehicle stop zone laid in the lane in which the vehicle VC is traveling, from the navigation system 22, and determines, based on the stop zone information, whether or not there is a vehicle stop zone ARB ahead of the vehicle VC. Here, the vehicle stop zone ARB includes the bus stop shown in FIG. 12, a refuge area laid on an expressway for vehicles such as those involved in an accident, and the like. In the fourth embodiment, a case where the vehicle stop zone ARB is a bus stop will be described.

  The lane marking correction unit 143 (corresponding to the lane marking correction means) is a functional unit that, when the stop zone determination unit 142 determines that the vehicle stop zone ARB is present, corrects the position of the lane marking WL on the side where the vehicle stop zone ARB is laid, based on the lane markings WR and WL detected by the white line detection unit 141. In other words, the lane marking correction unit 143 is a functional unit that obtains the position of the corrected lane marking VWL.

  Specifically, the lane marking correction unit 143 corrects the position of the lane marking WL on the side where the vehicle stop zone ARB is laid, based on the position of the lane marking WL laid before the vehicle stop zone ARB and the position of the lane marking WR on the side where the vehicle stop zone ARB is not laid. For example, as shown in the lower diagram of FIG. 12, the lane marking correction unit 143 obtains the corrected position of the lane marking VWL by extending a line parallel to the lane marking WR, starting from the position of the lane marking WL laid just before the vehicle stop zone ARB.
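
  A minimal geometric sketch of this parallel-extension correction is shown below, assuming for illustration that the anchor point of WL just before the stop zone ARB and the direction of WR are available as 2-D coordinates; the function name and parameters are hypothetical and not taken from the patent.

import math

def corrected_marking_vwl(wl_anchor, wr_direction, length=50.0, step=1.0):
    """Sketch of the correction shown in the lower diagram of FIG. 12:
    extend a line from the position of WL just before the stop zone ARB,
    parallel to the opposite marking WR, to obtain the corrected line VWL."""
    dx, dy = wr_direction
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm            # unit direction of WR
    x0, y0 = wl_anchor                       # WL position just before the ARB
    n = int(length / step)
    return [(x0 + dx * step * i, y0 + dy * step * i) for i in range(n + 1)]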

  In this way, since the correction is based on the position of the lane marking WL laid before the vehicle stop zone ARB and the position of the lane marking WR on the side where the vehicle stop zone ARB is not laid, the position of the lane marking WL on the side where the vehicle stop zone ARB is laid can be corrected appropriately (= the correct position of the corrected lane marking VWL can be obtained).

  In the fourth embodiment, the case is described in which the lane marking correction unit 143 obtains the corrected position of the lane marking VWL based on the position of the lane marking WL laid before the vehicle stop zone ARB and the position of the lane marking WR; however, the lane marking correction unit 143 may take any form as long as it obtains the corrected position of the lane marking VWL based on the position of the lane marking WL laid before the vehicle stop zone ARB. For example, the lane marking correction unit 143 may obtain the corrected position of the lane marking VWL based only on the position of the lane marking WL laid before the vehicle stop zone ARB. Specifically, for example, the lane marking correction unit 143 may obtain the corrected position of the lane marking VWL by extending the lane marking WL, starting from the position of the lane marking WL laid just before the vehicle stop zone ARB.

The information output unit 144 (corresponding to the information output means) is a functional unit that outputs driving support information based on the position of the lane marking VWL corrected by the lane marking correction unit 143. Specifically, the information output unit 144 determines the possibility of departure from the traveling lane based on the lane markings WR and WL detected by the white line detection unit 141 and the lane marking VWL corrected by the lane marking correction unit 143, and outputs driving support information (here, an alarm or the like) via the output device 3 when it is determined that the possibility of departure is high.

  FIG. 13 is a flowchart illustrating an example of the operation of the driving assistance ECU 14 according to the fourth embodiment. In the flowchart shown in FIG. 13, "vehicle lane marking" is described as "white line" for convenience. First, the white line detection unit 141 detects the lane markings WR and WL laid on the right side and the left side of the lane in which the vehicle VC travels (S401). Then, the stop zone determination unit 142 determines whether or not the lane markings WR and WL on both sides were detected in step S401 (S403). When it is determined that at least one of the lane markings WR and WL was not detected (NO in S403), the information output unit 144 stops the determination of departure from the lane (S405), the process returns to step S401, and the processes after step S401 are repeatedly executed.

  When it is determined that the lane markings WR and WL on both sides were detected (YES in S403), the stop zone determination unit 142 acquires the stop zone information from the navigation system 22 (S407). Then, the stop zone determination unit 142 determines whether or not there is a vehicle stop zone ARB ahead of the vehicle VC based on the stop zone information acquired in step S407 (S409). If it is determined that there is no vehicle stop zone ARB (NO in S409), the information output unit 144 determines the possibility of departure from the lane based on the lane marking WL and the lane marking WR detected in step S401 (S411), and the process proceeds to step S417.

  If it is determined that the vehicle stop zone ARB is present (YES in S409), the lane marking correction unit 143 corrects the position of the lane marking WL on the side where the vehicle stop zone ARB is laid, based on the lane markings WR and WL detected in step S401, and obtains the corrected position of the lane marking VWL (S413). Then, the possibility of departure from the lane is determined based on the lane marking WR detected in step S401 and the lane marking VWL corrected in step S413 (S415).

  When the process of step S411 or step S415 is completed, the information output unit 144 determines whether or not there is a high possibility of departure from the lane (S417). If it is determined that the possibility of departure from the lane is high (YES in S417), an alarm or the like is output by the information output unit 144 (S419), the process returns to step S401, and the processes after step S401 are repeatedly executed. If it is determined that the possibility of departure from the lane is not high (NO in S417), the process returns to step S401, and the processes after step S401 are repeatedly executed.
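
  Purely as an illustration of the control flow of FIG. 13 (steps S401 to S419), the loop might be sketched as follows; detect_markings, stop_zone_ahead, correct_left_marking, departure_likely, and warn are hypothetical stand-ins for the functional units 141 to 144 and are not defined in the patent.

def driving_support_loop_fig13(detect_markings, stop_zone_ahead,
                               correct_left_marking, departure_likely, warn):
    """Illustrative sketch of the flowchart of FIG. 13 (fourth embodiment)."""
    while True:
        wl, wr = detect_markings()                # S401
        if wl is None or wr is None:              # NO in S403
            continue                              # S405: skip the departure check
        if not stop_zone_ahead():                 # S407-S409
            risky = departure_likely(wl, wr)      # S411
        else:
            vwl = correct_left_marking(wl, wr)    # S413
            risky = departure_likely(vwl, wr)     # S415
        if risky:                                 # S417
            warn()                                # S419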

  In this way, since whether or not there is a vehicle stop zone ARB widened by a preset stop lane width ahead of the vehicle VC is determined based on the position information of the vehicle VC and the map information, the presence or absence of the vehicle stop zone ARB can be determined appropriately. Further, when it is determined that the vehicle stop zone ARB is present, the position of the lane marking WL on the side where the vehicle stop zone ARB is laid is corrected based on the detected lane markings WR and WL, so the position of the lane marking WL can be corrected appropriately. Furthermore, since an alarm or the like is output based on the corrected position of the lane marking VWL, an appropriate alarm or the like can be output.

<Fifth Embodiment>
FIG. 14 is a block diagram illustrating an example of a functional configuration of the driving support ECU 15 according to the fifth embodiment. As shown in FIG. 14, the driving support ECU 15 functionally includes a white line detection unit 151, a branch determination unit 152, an operation acquisition unit 153, a direction estimation unit 154, a virtual line estimation unit 155, a reliability estimation unit 156, a position correction unit 157, and an information output unit 158.

Note that the driving support ECU 15 functions as the white line detection unit 151, the branch determination unit 152, the operation acquisition unit 153, the direction estimation unit 154, the virtual line estimation unit 155, the reliability estimation unit 156, the position correction unit 157, the information output unit 158, and the like by causing a microcomputer disposed at an appropriate position in the driving support ECU 15 to execute a control program stored in advance in a ROM or the like disposed at an appropriate position in the driving support ECU 15. Hereinafter, each functional unit of the driving support ECU 15 will be described with reference to FIGS. 15 and 16.

FIGS. 15 and 16 are plan views illustrating an example of a situation in which the position of a virtual lane marking is estimated by the driving assistance ECU 15 according to the fifth embodiment. The upper diagram in FIG. 15 is a plan view showing an example of a situation in which the positions of the virtual lane markings VWR and VWL are estimated by the driving support ECU 15, and the lower diagrams in FIG. 15 are plan views showing the positions of the virtual lane markings VWR and VWL estimated by the driving support ECU 15. As shown in the upper diagram of FIG. 15, the vehicle VC travels toward the upper side of the lane divided by the lane markings WL and WR, and there is a branch point ahead.

The upper diagram in FIG. 16 is a plan view showing an example of a situation in which the positions of the virtual lane markings VWL1 and VWL2 are estimated by the driving support ECU 15, and the lower diagrams in FIG. 16 are plan views showing the positions of the virtual lane markings VWL1 and VWL2 estimated by the driving support ECU 15. As shown in the upper diagram of FIG. 16, the vehicle VC travels toward the upper side of the lane divided by the lane markings WL and WR1 (or the lane markings WL and WR2), and there is a branch point ahead. Note that the type of the right lane marking differs between the left diagram and the right diagram of FIG. 16: in the left diagram of FIG. 16, the right lane marking WR1 is a solid white line, and in the right diagram of FIG. 16, the right lane marking WR2 is botsdots.

  As shown in the upper diagrams of FIG. 15 and FIG. 16, there is no left or right lane marking at the branch point, so it is not possible (or it is difficult) to determine the possibility of departure from the lane in which the vehicle is traveling. In view of this problem, the driving assistance ECU 15 according to the fifth embodiment generates the virtual lane markings VWR, VWL, VWL1, and VWL2, which are virtual vehicle lane markings, at the branch point, and determines the possibility of departure from the lane using the generated virtual lane markings VWR, VWL, VWL1, and VWL2.

  The white line detection unit 151 (corresponding to the white line detection means) is a functional unit that detects the lane markings WR and WL laid on the right side and the left side of the lane in which the vehicle VC travels, based on image information from the CCD camera 21.

  The branch determination unit 152 (corresponding to the branch determination means) is a functional unit that determines, based on the position information of the vehicle VC from the navigation system 22 and the map information, whether or not there is a branch point of the road on which the vehicle VC is traveling ahead of the vehicle VC. Here, a "branch point" refers to a point where the road splits into two directions, such as an expressway junction, an exit from a highway to a general road, or an entrance from a highway to a service area. Specifically, the branch determination unit 152 acquires branch information, which is information related to branch points, from the navigation system 22, and determines, based on the acquired branch information, whether or not there is a branch point of the road ahead of the vehicle VC.

  The operation acquisition unit 153 (corresponding to the operation acquisition means) is a functional unit that acquires operation information indicating the content of operations by the driver. Specifically, the operation acquisition unit 153 acquires the steering torque information and the turn signal operation information generated by the driver's operations via the steering torque detection sensor 23 and the turn signal detection switch 24.

  The direction estimation unit 154 (corresponding to the direction estimation means) is a functional unit that estimates the direction in which the vehicle VC proceeds at a branch point of the road, based on the operation information acquired by the operation acquisition unit 153 (here, the steering torque information and the turn signal operation information). Specifically, the direction estimation unit 154 determines whether or not a turn signal operation has been performed based on the turn signal operation information, and if a turn signal operation has been performed, estimates that the vehicle proceeds in the direction corresponding to the turn signal operation. When no turn signal operation has been performed, the direction estimation unit 154 determines whether or not the absolute value of the steering torque TR acquired by the operation acquisition unit 153 is greater than or equal to a determination threshold TSH indicating an intentional steering operation (for example, 1.5 Nm), and if it is greater than or equal to the determination threshold TSH, estimates the traveling direction of the vehicle VC based on the direction of the steering torque (= the sign of the steering torque TR).

  In this way, the traveling direction of the vehicle VC at the branch point of the road can be accurately estimated based on the steering torque information and the turn signal operation information.
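
  A hedged sketch of this estimation rule is given below; the function name, the sign convention (positive steering torque taken as rightward steering), and the return values are assumptions for illustration, while the 1.5 Nm example threshold comes from the description above.

def estimate_direction(turn_signal, steering_torque_nm, threshold_nm=1.5):
    """Sketch of the direction estimation unit 154.
    turn_signal: 'left', 'right' or None; steering_torque_nm: signed torque.
    Returns 'left', 'right', or None when no intentional operation is found."""
    if turn_signal in ('left', 'right'):          # turn signal operation has priority
        return turn_signal
    if abs(steering_torque_nm) >= threshold_nm:   # intentional steering, e.g. 1.5 Nm
        # Assumption: positive torque corresponds to rightward steering.
        return 'right' if steering_torque_nm > 0 else 'left'
    return None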

  In the fifth embodiment, the case is described in which the direction estimation unit 154 estimates the traveling direction of the vehicle VC at a branch point of the road based on the operation information acquired by the operation acquisition unit 153; however, the direction estimation unit 154 may estimate the traveling direction of the vehicle VC at the branch point of the road by other methods. For example, the direction estimation unit 154 may estimate the traveling direction of the vehicle VC at the branch point of the road based on only one of the steering torque information and the turn signal operation information. In this case, the processing is simplified.

  Further, for example, the direction estimation unit 154 may estimate the traveling direction of the vehicle VC at the branch point of the road based on route guidance information from the navigation system 22. When the driver has set a destination or the like in the navigation system 22 and route guidance information to the destination is being output, the driver often operates the vehicle in accordance with the route guidance information from the navigation system 22 (= proceeds in the direction indicated by the route guidance). Therefore, in this case, the direction estimation unit 154 can accurately estimate the traveling direction of the vehicle VC with a simple configuration.

  The virtual line estimation unit 155 (corresponding to the virtual line estimation means) is a functional unit that, when the branch determination unit 152 determines that there is a branch point of the road, generates the virtual lane markings VWL, VWR, and VWL1, which are virtual vehicle lane markings used to determine the possibility of departure from the lane at the branch point, so as to cross the branch road that is not in the traveling direction estimated by the direction estimation unit 154.

  For example, in the situation shown in FIG. 15, when the vehicle proceeds to the right branch road, the virtual line estimation unit 155 generates the virtual lane marking VWL so as to cross the left branch road, which is not in the traveling direction, as shown in the lower right diagram of FIG. 15. Conversely, when the vehicle proceeds to the left branch road, the virtual line estimation unit 155 generates the virtual lane marking VWR so as to cross the right branch road, which is not in the traveling direction, as shown in the lower left diagram of FIG. 15. Further, in the situation shown in FIG. 16, when the vehicle goes straight (= proceeds to the right branch road), the virtual lane marking VWL1 is generated so as to cross the left branch road, which is not in the traveling direction, as shown in the lower diagrams of FIG. 16.

  Further, when the direction estimation unit 154 estimates that the vehicle will proceed to the right branch road, the virtual line estimation unit 155 generates the virtual lane marking VWL along the right lane marking WR detected by the white line detection unit 151, and when the direction estimation unit 154 estimates that the vehicle will proceed to the left branch road, the virtual line estimation unit 155 generates the virtual lane marking VWR along the left lane marking WL detected by the white line detection unit 151.

  For example, in the situation shown in FIG. 15, when the direction estimation unit 154 estimates that the vehicle will proceed to the right branch road, the virtual line estimation unit 155 generates the virtual lane marking VWL along the right lane marking WR detected by the white line detection unit 151, as shown in the lower right diagram of FIG. 15. Further, for example, in the situation shown in FIG. 15, when the direction estimation unit 154 estimates that the vehicle will proceed to the left branch road, the virtual line estimation unit 155 generates the virtual lane marking VWR along the left lane marking WL detected by the white line detection unit 151, as shown in the lower left diagram of FIG. 15. Further, in the situation shown in FIG. 16, when the direction estimation unit 154 estimates that the vehicle will go straight (= proceed to the right branch road), the virtual line estimation unit 155 generates the virtual lane marking VWL1 along the right lane marking WR detected by the white line detection unit 151, as shown in the lower diagrams of FIG. 16.

  Thus, when it is estimated that the vehicle will proceed to the right branch road, the virtual lane marking VWL is generated along the detected right lane marking WR, and when it is estimated that the vehicle will proceed to the left branch road, the virtual lane marking VWR is generated along the detected left lane marking WL, so the positions of the virtual lane markings VWR and VWL can be estimated appropriately.
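
  The following sketch illustrates this rule under simplifying assumptions that are not in the patent: each marking is a list of 2-D points running roughly along the y-axis, and the virtual marking is obtained by offsetting the followed marking by a fixed lane width so that it crosses the branch road that is not taken.

def generate_virtual_marking(direction, wl, wr, lane_width=3.5):
    """Sketch of the virtual line estimation unit 155 at a branch point:
    follow the detected marking on the side of the estimated traveling
    direction and offset it by one lane width (an illustrative value) so
    that the virtual marking crosses the branch road that is not taken."""
    if direction == 'right':
        # Follow WR; the virtual marking VWL replaces the left lane boundary.
        return [(x - lane_width, y) for (x, y) in wr]
    # Follow WL; the virtual marking VWR replaces the right lane boundary.
    return [(x + lane_width, y) for (x, y) in wl]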

  The reliability estimation unit 156 (corresponding to the reliability estimation means) is a functional unit that estimates the reliability of the positions of the lane markings WR and WL detected by the white line detection unit 151. Specifically, the reliability estimation unit 156 estimates the reliability of the positions of the lane markings WR and WL based on the types of the lane markings WR and WL detected by the white line detection unit 151. Here, the types of the lane markings include a white solid line, a white broken line, a yellow broken line, botsdots, and cat's eyes, and the reliability is estimated to decrease in that order.
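
  The ordering of graph G1 can be illustrated with a simple lookup table; the numeric reliability values below are invented for illustration, since the patent specifies only the ordering.

# Illustrative values only: the patent gives the ordering of graph G1, not numbers.
MARKING_TYPE_RELIABILITY = {
    'white_solid': 1.0,
    'white_broken': 0.8,
    'yellow_broken': 0.6,
    'botsdots': 0.4,
    'cats_eye': 0.2,
}

def estimate_reliability(marking_type):
    """Sketch of the reliability estimation unit 156 (graph G1 in FIG. 20)."""
    return MARKING_TYPE_RELIABILITY.get(marking_type, 0.0)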

  The position correction unit 157 (corresponding to the position correction means) is a functional unit that corrects the position of the virtual lane marking VWL1 generated by the virtual line estimation unit 155, based on the reliability of the positions of the lane markings WR and WL estimated by the reliability estimation unit 156. Specifically, the lower the reliability of the positions of the lane markings WR and WL estimated by the reliability estimation unit 156, the further outward the position correction unit 157 corrects the position of the virtual lane marking VWL1 generated by the virtual line estimation unit 155.

  For example, in the situation shown in FIG. 16, the right lane marking WR1 in the left diagram of FIG. 16 is a solid white line, and the right lane marking WR2 in the right diagram of FIG. 16 is botsdots. Accordingly, in the left diagram of FIG. 16, since the right lane marking WR1 is a solid white line, it is determined that the reliability of the right lane marking WR1 is high, and the position of the virtual lane marking VWL1 is not corrected. On the other hand, in the right diagram of FIG. 16, since the right lane marking WR2 is botsdots, it is determined that the reliability of the right lane marking WR2 is low, and the position of the virtual lane marking VWL1 is corrected to the position of the virtual lane marking VWL2.

  FIG. 20 is a graph for explaining an example of the position correction method for the virtual lane marking VWL1 executed by the position correction unit 157. The upper diagram in FIG. 20 is a graph G1 showing the relationship between the types (horizontal axis) of the lane markings WR and WL detected by the white line detection unit 151 and the reliability (vertical axis) of the positions of the lane markings WR and WL estimated by the reliability estimation unit 156. As shown in the graph G1, the reliability decreases in the order of a white solid line, a white broken line, a yellow broken line, botsdots, and a cat's eye.

  The lower diagram in FIG. 20 is a graph G2 showing the relationship between the reliability (horizontal axis) of the positions of the lane markings WR and WL estimated by the reliability estimation unit 156 and the correction distance (vertical axis) applied by the position correction unit 157. As shown in the graph G2, the lower the reliability, the larger the correction distance (= the further outward the position correction unit 157 corrects the position of the virtual lane marking VWL1). For example, when the reliability is 50%, the position of the virtual lane marking VWL1 is corrected outward by 0.3 m.
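
  As a sketch only, the correction of graph G2 could be approximated by a linear relation passing through the single stated point (reliability 50% → 0.3 m outward) and through zero correction at full reliability; this linearity is an assumption, not something specified in the text.

def correction_distance(reliability):
    """Sketch of graph G2 in FIG. 20: lower reliability, larger outward correction.
    Assumes a linear relation through (1.0, 0.0 m) and (0.5, 0.3 m)."""
    return max(0.0, 0.6 * (1.0 - reliability))    # e.g. reliability 0.5 -> 0.3 m

def correct_virtual_marking(vwl1, reliability, outward_unit_vector):
    """Shift every point of the virtual marking VWL1 outward by that distance."""
    d = correction_distance(reliability)
    ox, oy = outward_unit_vector
    return [(x + ox * d, y + oy * d) for (x, y) in vwl1]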

  The lower the reliability of the detected positions of the lane markings WR and WL, the lower the reliability of the position of the generated virtual lane marking VWL1 is estimated to be, and in that case it is preferable to correct the position of the generated virtual lane marking VWL1 toward the outside of the lane so as to prevent unnecessary operation. Since the position of the generated virtual lane marking VWL1 is corrected appropriately based on the detected reliability of the positions of the lane markings WR and WL, unnecessary operation of the driving support ECU 15 can be prevented, and an alarm or the like can be output more appropriately.

  In addition, since the reliability estimation unit 156 estimates that the reliability decreases in the order of a white solid line, a white broken line, a yellow broken line, botsdots, and a cat's eye, the reliability of the detected positions of the lane markings WR and WL can be estimated more appropriately.

  In the fifth embodiment, the case is described in which the position correction unit 157 corrects the position of the virtual lane marking VWL1 further outward as the reliability estimated by the reliability estimation unit 156 is lower. Conversely, the position correction unit 157 may be configured to correct the position of the virtual lane marking VWL1 inward as the reliability estimated by the reliability estimation unit 156 is lower. In this case, departure from the lane is prevented more reliably.

  In the fifth embodiment, the case is described in which the reliability estimation unit 156 estimates that the reliability decreases in the order of a white solid line, a white broken line, a yellow broken line, botsdots, and a cat's eye; however, the reliability estimation unit 156 may take any form as long as it estimates the reliability of the positions of the lane markings WR and WL based on the types or the like of the lane markings WR and WL detected by the white line detection unit 151. For example, the reliability estimation unit 156 may estimate the reliability of the positions of the lane markings WR and WL based on the types and the sharpness of the lane markings WR and WL detected by the white line detection unit 151. Here, "sharpness" indicates how clearly the lane markings WR and WL are distinguished in an image, based on their color, reflectance, and the like relative to the road surface. In this case, the reliability of the positions of the lane markings WR and WL can be estimated more accurately.

Returning to FIG. 14 again, the functional configuration of the driving support ECU 15 will be described. The information output unit 158 (corresponding to the information output means) is a functional unit that outputs driving support information based on the positions of the virtual lane markings VWL, VWR, and VWL1 estimated by the virtual line estimation unit 155 or of the virtual lane marking VWL2 corrected by the position correction unit 157. Specifically, the information output unit 158 determines the possibility of departure from the traveling lane based on the lane markings WR and WL detected by the white line detection unit 151 and the virtual lane markings VWL, VWR, and VWL1 estimated by the virtual line estimation unit 155 (or the virtual lane marking VWL2 corrected by the position correction unit 157), and outputs driving support information (in this case, an alarm or the like) via the output device 3 when it is determined that the possibility of departure is high.

  FIG. 17 is a flowchart showing an example of the operation of the driving support ECU 15 according to the fifth embodiment. In the flowchart shown in FIG. 17, "vehicle lane marking" is described as "white line" for convenience. First, the white line detection unit 151 detects the lane markings WR and WL laid on the right side and the left side of the lane in which the vehicle VC travels (S501). Then, the branch determination unit 152 determines whether or not the lane markings WR and WL on both sides were detected in step S501 (S503). When it is determined that at least one of the lane markings WR and WL was not detected (NO in S503), the determination of departure from the lane by the information output unit 158 is stopped (S505), the process returns to step S501, and the processes after step S501 are repeatedly executed.

  If it is determined that the lane markings WR and WL on both sides have been detected (YES in S503), the branch determination unit 152 acquires the branch information from the navigation system 22 (S507). Then, the branch determination unit 152 determines whether or not there is a branch point ahead of the vehicle VC based on the branch information acquired in step S507 (S509). If it is determined that there is no branch point (NO in S509), the information output unit 158 determines the possibility of departure from the lane based on the lane marking WL and the lane marking WR detected in step S501 (S511), and the process proceeds to step S521.

  If it is determined that there is a branch point (YES in S509), the direction estimation unit 154 and the like execute a traveling direction estimation process, which is a process for estimating the traveling direction of the vehicle VC at the branch point (S513). The virtual line estimation unit 155 then generates the virtual lane markings VWL, VWR, and VWL1, which are virtual vehicle lane markings used to determine the possibility of departure from the lane, based on the traveling direction estimated in step S513 (S515). Next, the reliability estimation unit 156 and the position correction unit 157 execute a position correction process, which is a process of correcting the positions of the virtual lane markings VWL, VWR, and VWL1 generated in step S515 (S517). Next, the information output unit 158 determines the possibility of departure from the lane based on the lane markings WR and WL detected in step S501 and the virtual lane markings VWL, VWR, and VWL1 generated in step S515 (or the virtual lane marking VWL2 corrected in step S517) (S519).

  When the process of step S511 or step S519 is completed, the information output unit 158 determines whether or not there is a high possibility of departure from the lane (S521). If it is determined that the possibility of departure from the lane is high (YES in S521), the information output unit 158 outputs an alarm or the like (S523), the process returns to step S501, and the processes after step S501 are repeatedly executed. If it is determined that the possibility of departure from the lane is not high (NO in S521), the process returns to step S501, and the processes after step S501 are repeatedly executed.
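
  As with FIG. 13, the top-level loop of FIG. 17 (steps S501 to S523) can be sketched with hypothetical helpers standing in for the functional units 151 to 158; none of the names below appear in the patent.

def driving_support_loop_fig17(detect_markings, branch_ahead, estimate_direction,
                               generate_virtual_marking, correct_position,
                               departure_likely, warn):
    """Illustrative sketch of the flowchart of FIG. 17 (fifth embodiment)."""
    while True:
        wl, wr = detect_markings()                               # S501
        if wl is None or wr is None:                             # NO in S503
            continue                                             # S505
        if not branch_ahead():                                   # S507-S509
            risky = departure_likely(wl, wr)                     # S511
        else:
            direction = estimate_direction()                     # S513
            virtual = generate_virtual_marking(direction, wl, wr)    # S515
            virtual = correct_position(virtual, direction, wl, wr)   # S517
            # S519: the virtual marking replaces the boundary on the side
            # of the branch road that is not taken (an assumption here).
            boundaries = (virtual, wr) if direction == 'right' else (wl, virtual)
            risky = departure_likely(*boundaries)
        if risky:                                                # S521
            warn()                                               # S523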

FIG. 18 is a detailed flowchart showing an example of the traveling direction estimation process executed in step S513 of the flowchart of FIG. 17. First, the operation acquisition unit 153 acquires the turn signal operation information, and it is determined whether or not a direction instruction operation has been accepted (S601). If it is determined that a direction instruction operation has not been accepted (NO in S601), the process proceeds to step S609. If it is determined that a direction instruction operation has been accepted (YES in S601), the direction estimation unit 154 determines whether or not the direction instruction operation accepted in step S601 indicates the rightward direction (S603). When it is determined that the rightward direction is instructed (YES in S603), the direction estimation unit 154 estimates that the vehicle VC will proceed rightward at the branch point (S607), and the process returns to step S515 shown in FIG. 17. If it is determined that the leftward direction is instructed (NO in S603), the direction estimation unit 154 estimates that the vehicle VC will proceed leftward at the branch point (S605), and the process returns to step S515 shown in FIG. 17.

If NO in step S601, the operation acquisition unit 153 acquires the steering torque TR information via the steering torque detection sensor 23 (S609). Then, the direction estimation unit 154 determines whether or not the absolute value of the steering torque TR acquired in step S609 is greater than or equal to the determination threshold TSH (S611). If it is determined that the absolute value of the steering torque TR is less than the determination threshold TSH (NO in S611), the process returns to step S601, and the processes after step S601 are repeatedly executed. When it is determined that the absolute value of the steering torque TR is greater than or equal to the determination threshold TSH (YES in S611), the direction estimation unit 154 determines, based on the sign of the steering torque TR acquired in step S609, whether or not the steering torque TR corresponds to rightward steering (S613). If it is determined that the steering torque TR corresponds to rightward steering (YES in S613), the direction estimation unit 154 estimates that the vehicle VC will proceed rightward at the branch point (S617), and the process returns to step S515 shown in FIG. 17. If it is determined that the steering torque TR corresponds to leftward steering (NO in S613), the direction estimation unit 154 estimates that the vehicle VC will proceed leftward at the branch point (S615), and the process returns to step S515 shown in FIG. 17.

  FIG. 19 is a detailed flowchart showing an example of the position correction process executed in step S517 of the flowchart of FIG. 17. First, the virtual line estimation unit 155 determines whether or not the vehicle VC was estimated to proceed in the rightward direction in step S513 of the flowchart of FIG. 17 (S701). If it is estimated that the vehicle will proceed to the left (NO in S701), the reliability estimation unit 156 determines the type of the left lane marking WL (S703). When it is estimated that the vehicle will proceed to the right (YES in S701), the reliability estimation unit 156 determines the type of the right lane marking WR (S705).

  When the process of step S703 or step S705 is completed, the reliability estimation unit 156 estimates the reliability of the position of the lane marking WR (or the lane marking WL) based on the type of the lane marking WR (or the lane marking WL) determined in step S705 (or step S703) (S707). Then, the position correction unit 157 corrects the position of the virtual lane marking VWL1 generated in step S515 of FIG. 17 based on the reliability estimated in step S707 (S709), and the process returns to step S519 shown in FIG. 17.
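
  Steps S701 to S709 mainly select which detected marking supplies the reliability; a minimal sketch, with estimate_reliability and correct_virtual_marking passed in as hypothetical callables (for example, the illustrative helpers sketched earlier), might look like this.

def position_correction_process(direction, wl_type, wr_type, vwl1,
                                estimate_reliability, correct_virtual_marking):
    """Sketch of FIG. 19 (S701-S709): take the reliability from the marking on
    the side of the estimated traveling direction, then correct VWL1."""
    marking_type = wr_type if direction == 'right' else wl_type   # S701-S705
    reliability = estimate_reliability(marking_type)              # S707
    return correct_virtual_marking(vwl1, reliability)             # S709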

  Thus, since whether or not there is a branch point of the road on which the vehicle VC is traveling ahead of the vehicle VC is determined based on the position information of the vehicle VC and the map information, the presence or absence of a branch point of the road can be determined appropriately. In addition, when it is determined that there is a branch point of the road, the virtual lane markings VWR, VWL, and VWL1, which are virtual vehicle lane markings used to output an alarm or the like indicating departure from the lane at the branch point, are generated so as to cross the branch road that is not in the estimated traveling direction, so the positions of the virtual lane markings VWR, VWL, and VWL1 can be estimated appropriately (see FIGS. 15 and 16). Furthermore, since an alarm or the like is output based on the virtual lane markings VWR, VWL, and VWL1 estimated at appropriate positions, an appropriate alarm or the like can be output.

In addition, the driving assistance device according to the present invention is not limited to the above embodiment, and may be the following form.
(A) In the first embodiment, the case where the driving assistance ECU 11 functionally includes the white line detection unit 111, the roadside object detection unit 112, the white line estimation unit 113, the information output unit 114, and the like has been described; however, at least one of the white line detection unit 111, the roadside object detection unit 112, the white line estimation unit 113, and the information output unit 114 may be realized by hardware such as an electric circuit.

  Similarly, in the second embodiment, the case where the driving support ECU 12 functionally includes the white line detection unit 121, the roadside object detection unit 122, the distance setting unit 123, the virtual line estimation unit 124, the information output unit 125, and the like has been described; however, at least one of the white line detection unit 121, the roadside object detection unit 122, the distance setting unit 123, the virtual line estimation unit 124, and the information output unit 125 may be realized by hardware such as an electric circuit.

Similarly, in the third embodiment, the case where the driving support ECU 13 functionally includes the white line detection unit 131, the lane determination unit 132, the center line estimation unit 133, the information output unit 134, and the like has been described; however, at least one of the white line detection unit 131, the lane determination unit 132, the center line estimation unit 133, and the information output unit 134 may be realized by hardware such as an electric circuit.

  Similarly, in the fourth embodiment, the case where the driving support ECU 14 functionally includes the white line detection unit 141, the stop zone determination unit 142, the lane marking correction unit 143, the information output unit 144, and the like has been described; however, at least one of the white line detection unit 141, the stop zone determination unit 142, the lane marking correction unit 143, and the information output unit 144 may be realized by hardware such as an electric circuit.

  Similarly, in the fifth embodiment, the case where the driving support ECU 15 functionally includes the white line detection unit 151, the branch determination unit 152, the operation acquisition unit 153, the direction estimation unit 154, the virtual line estimation unit 155, the reliability estimation unit 156, the position correction unit 157, the information output unit 158, and the like has been described; however, at least one of the white line detection unit 151, the branch determination unit 152, the operation acquisition unit 153, the direction estimation unit 154, the virtual line estimation unit 155, the reliability estimation unit 156, the position correction unit 157, and the information output unit 158 may be realized by hardware such as an electric circuit.

  (B) In the first to fifth embodiments, the case where the driving assistance ECU 1 (= the driving assistance ECUs 11 to 15) outputs driving support information for preventing departure from the traveling lane has been described; however, the driving support ECU 1 may output other driving support information. For example, the driving support ECU 1 may output driving support information that supports a lane change (or overtaking).

  (C) In the first to fifth embodiments, the case where the driving assistance ECU 1 (= the driving assistance ECUs 11 to 15) detects lane markings, roadside objects, and the like based on image information from the CCD camera 21 has been described; however, the driving support ECU 1 may detect the lane markings, roadside objects, and the like by other methods. For example, the driving support ECU 1 may detect a roadside object or the like via a radar device. Further, for example, the driving support ECU 1 may detect lane markings, roadside objects, and the like based on image information from another type of camera (for example, a CMOS camera).

  The present invention can be applied to, for example, a driving support device that is mounted on a vehicle and outputs driving support information to a driver.

1 (11, 12, 13, 14, 15) Driving assistance ECU (driving assistance device)
11 Driving assistance ECU (Driving assistance device according to the first embodiment)
111 White line detection unit (white line detection means)
112 Roadside object detection unit (roadside object detection means)
113 White line estimation unit (white line estimation means)
114 Information output unit (information output means)
12 Driving assistance ECU (Driving assistance device according to the second embodiment)
121 White line detection unit (white line detection means)
122 Roadside object detection unit (roadside object detection means)
123 Distance setting unit (distance setting means)
124 Virtual line estimation unit (virtual line estimation means)
125 Information output unit (information output means)
13 Driving Support ECU (Driving Support Device According to Third Embodiment)
131 White line detection unit (white line detection means)
132 Lane determination unit (lane determination means)
133 Center line estimation unit (center line estimation means)
134 Information output unit (information output means)
14 Driving assistance ECU (Driving assistance device according to the fourth embodiment)
141 White line detection unit (white line detection means)
142 Stop zone determination unit (stop zone determination means)
143 Lane marking correction unit (lane marking correction means)
144 Information output unit (information output means)
15 Driving Support ECU (Driving Support Device According to Fifth Embodiment)
151 White line detection unit (white line detection means)
152 Branch determination unit (branch determination means)
153 Operation acquisition unit (operation acquisition means)
154 Direction estimation unit (direction estimation means)
155 Virtual line estimation unit (virtual line estimation means)
156 Reliability estimation unit (reliability estimation means)
157 Position correction unit (position correction means)
158 Information output unit (information output means)
2 Input device 21 CCD camera 22 Navigation system 23 Steering torque detection sensor 24 Turn signal detection switch 3 Output device 31 Display 32 Speaker 33 Steering control ECU

Claims (3)

  1. A driving support device that outputs, to a driver, driving support information for preventing departure from a lane in which a vehicle is traveling, comprising:
    White line detecting means for detecting vehicle lane markings laid on the right and left sides of the lane in which the vehicle travels;
    Roadside object detection means for detecting, when the vehicle lane marking is detected by the white line detection means, a roadside object laid along the vehicle lane marking and discriminating the type of the detected roadside object;
    Distance setting means for setting the second distance based on the type of roadside object determined by the roadside object detection means;
    Virtual line estimation means for estimating, when the vehicle lane markings are detected on the right side and the left side of the lane in which the vehicle travels by the white line detection means and the roadside object is detected by the roadside object detection means on either the right side or the left side on which the vehicle lane marking is detected, the position of a virtual lane marking, which is a virtual vehicle lane marking used to output the driving support information, at a position separated from the position of the vehicle lane marking by the second distance toward the inside of the lane in which the vehicle travels; and
    Information output means for outputting the driving support information based on the position of the virtual lane marking estimated by the virtual line estimation means.
  2. The driving support device according to claim 1, wherein the distance setting means estimates a degree of damage in a collision with the roadside object based on the type of roadside object determined by the roadside object detection means, and sets a larger value as the second distance as the estimated degree of damage is greater.
  3. The driving support device, wherein the virtual line estimation means does not estimate the virtual lane marking when the white line detection means does not detect the vehicle lane marking on at least one of the right side and the left side of the lane in which the vehicle travels.
JP2011120643A 2011-05-30 2011-05-30 Driving assistance device Active JP5196279B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011120643A JP5196279B2 (en) 2011-05-30 2011-05-30 Driving assistance device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011120643A JP5196279B2 (en) 2011-05-30 2011-05-30 Driving assistance device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
JP2009124205 Division 2009-05-22

Publications (2)

Publication Number Publication Date
JP2011175676A JP2011175676A (en) 2011-09-08
JP5196279B2 true JP5196279B2 (en) 2013-05-15

Family

ID=44688406

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011120643A Active JP5196279B2 (en) 2011-05-30 2011-05-30 Driving assistance device

Country Status (1)

Country Link
JP (1) JP5196279B2 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3465538B2 (en) * 1997-07-09 2003-11-10 トヨタ自動車株式会社 Vehicle running state determination device
JP3704987B2 (en) * 1999-01-27 2005-10-12 三菱自動車工業株式会社 Vehicle travel control device
JP4193815B2 (en) * 2005-05-30 2008-12-10 トヨタ自動車株式会社 Lane departure warning device
JP4654796B2 (en) * 2005-06-29 2011-03-23 トヨタ自動車株式会社 Vehicle driving support device
JP4759547B2 (en) * 2007-09-27 2011-08-31 日立オートモティブシステムズ株式会社 Driving support device

Also Published As

Publication number Publication date
JP2011175676A (en) 2011-09-08

Similar Documents

Publication Publication Date Title
JP6180968B2 (en) Vehicle control device
CN104417561B (en) Context aware threat-response judges
US9688273B2 (en) Methods of improving performance of automotive intersection turn assist features
DE102013105046B4 (en) Target track selection method by means of navigation input in road change scenarios
US9751506B2 (en) Algorithms for avoiding automotive crashes at left and right turn intersections
JP6296162B2 (en) Vehicle travel control apparatus and method
CN104044587B (en) For the system and method for the sensor visual for improving the vehicle being under autonomous driving pattern
US9965957B2 (en) Driving support apparatus and driving support method
JP5991340B2 (en) Driving assistance device
JP6250180B2 (en) Vehicle irradiation control system and image irradiation control method
JP5979259B2 (en) Collision avoidance control device
EP2746137B1 (en) Method and system for assisting a driver
US9514648B2 (en) Alerting apparatus
US8838337B2 (en) Vehicle automatic steering control apparatus
EP2330009B1 (en) Vehicle control apparatus
JP5821917B2 (en) Driving assistance device
JP4614005B2 (en) Moving locus generator
US7548634B2 (en) Pedestrian detection system and vehicle driving assist system with a pedestrian detection system
DE102016123878A1 (en) Vehicle signal detection blink
JP4604683B2 (en) Hazardous situation warning device
JP3592043B2 (en) Intersection warning device
JP4933962B2 (en) Branch entry judgment device
DE102012009297A1 (en) Method for assisting rider when feeding e.g. vehicle, involves proving information, warning and automatic engagement, which results during risk of collision and/or secondary collision with highest priority in priority list
US9669829B2 (en) Travel lane marking recognition system
DE102012010865A1 (en) Driving control device and driving control method

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20120820

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120823

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20121019

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130110

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130123

R151 Written notification of patent or utility model registration

Ref document number: 5196279

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20160215

Year of fee payment: 3
