JP5556077B2 - Driving support device

Driving support device

Info

Publication number
JP5556077B2
Authority
JP
Japan
Prior art keywords
trajectory
host vehicle
position
rudder angle
locus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2009175103A
Other languages
Japanese (ja)
Other versions
JP2011028609A (en)
Inventor
佳紀 草柳
Original Assignee
日産自動車株式会社 (Nissan Motor Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日産自動車株式会社 (Nissan Motor Co., Ltd.)
Priority to JP2009175103A
Publication of JP2011028609A
Application granted granted Critical
Publication of JP5556077B2
Application status: Active


Description

The present invention relates to a driving support device for supporting the travel of a vehicle, and more particularly to a driving support device for supporting a vehicle so that it can properly pass through a location where the passable road width is narrowed by the presence of an obstacle on the road (hereinafter referred to as a narrow portion).

  As a driving support device that supports driving of the host vehicle through a narrow portion, the device described in Patent Document 1, for example, is known. The driving support device described in Patent Document 1 displays vehicle width information, such as the tire traveling path of the road ahead of the host vehicle and the tire locus according to the steering angle of the host vehicle, on a head-up display in front of the driver. In this way, the driver is given an index for judging the possibility of the wheels dropping off the road or of contact with an obstacle.

JP 2005-78414 A

  However, with the technique described in Patent Document 1, although the driver can recognize whether the wheels will drop off the road or contact an obstacle if the vehicle travels while maintaining the current steering angle, the driver cannot recognize what route is desirable for passing through the narrow portion, and the technique is therefore insufficient as driving support.

The present invention was conceived in view of the above problems of the prior art, and an object of the present invention is to provide a driving support device that allows the driver to recognize an optimum route for passing through a narrow portion.

  The present invention solves the above problem by calculating a first locus along which the host vehicle travels from the current position while maintaining the current steering angle, a steering angle change point at which the host vehicle is stopped before the narrow portion position and the steering angle is changed, and a second locus along which the host vehicle travels while maintaining the steering angle after the change from the steering angle change point, and by displaying, on a display means, route information indicating a route through the narrow portion consisting of the first locus, the steering angle change point, and the second locus.

  According to the present invention, route information indicating the series of routes including the change of the steering angle is displayed on the display means, so the driver can accurately recognize the optimum route for passing through the narrow portion.

FIG. 1 is a configuration diagram of the driving support device according to the first embodiment.
FIG. 2 is a flowchart showing an outline of the processing by the control unit of the driving support device according to the first embodiment.
FIG. 3 is a diagram explaining an example of route calculation, showing how the narrow portion position is detected.
FIG. 4 is a diagram explaining the example of route calculation, showing how the intermediate point is calculated.
FIG. 5 is a diagram explaining the example of route calculation, showing how the steering interlocking locus is calculated.
FIG. 6 is a diagram explaining the example of route calculation, showing how the predicted arrival position of the host vehicle with the steering angle change point (intermediate point) as a reference point is calculated.
FIG. 7 is a diagram explaining the example of route calculation, showing how the reversal locus is calculated.
FIG. 8 is a diagram explaining the example of route calculation, showing how the predicted arrival position of the host vehicle with the end point of the reversal locus as a reference point is calculated.
FIG. 9 is a diagram explaining the example of route calculation, showing how the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel are calculated.
FIG. 10 is a diagram explaining the example of route calculation, showing how the travel region of the host vehicle is calculated.
FIG. 11 is a diagram showing the change in the predicted arrival positions when the steering angle of the host vehicle at the current position is changed.
FIG. 12 is a diagram explaining another specific example of route calculation, showing how the narrow portion position is detected.
FIG. 13 is a diagram explaining the other specific example of route calculation, showing how the intermediate point is calculated.
FIG. 14 is a diagram explaining the other specific example of route calculation, showing how the steering interlocking locus is calculated.
FIG. 15 is a diagram explaining the other specific example of route calculation, showing how the predicted arrival position of the host vehicle with the steering angle change point (intermediate point) as a reference point is calculated.
FIG. 16 is a diagram explaining the other specific example of route calculation, showing how the reversal locus is calculated.
FIG. 17 is a diagram explaining the other specific example of route calculation, showing how the predicted arrival position of the host vehicle with the end point of the reversal locus as a reference point is calculated.
FIG. 18 is a diagram explaining the other specific example of route calculation, showing how the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel are calculated.
FIG. 19 is a diagram explaining the other specific example of route calculation, showing how the travel region of the host vehicle is calculated.
FIG. 20 is a diagram showing the change in the predicted arrival positions when the steering angle of the host vehicle at the current position is changed.
FIG. 21 is a diagram showing another display form of the support image displayed on the monitor.
FIG. 22 is a diagram showing a specific installation position of the in-vehicle camera for capturing the image on which the support image of FIG. 21 is based.
FIG. 23 is a block diagram of the driving support device according to the second embodiment.
FIG. 24 is a flowchart showing an outline of the processing by the control unit of the driving support device according to the second embodiment.
FIG. 25 is a diagram explaining an example of route calculation, showing how the narrow portion position is detected.
FIG. 26 is a diagram explaining the example of route calculation, showing how an avoidance point is set and a straight line passing through the avoidance point and tangent to the turning circle of a predetermined steering angle is calculated.
FIG. 27 is a diagram explaining the example of route calculation, showing how the steering angle change point is calculated.
FIG. 28 is a diagram explaining the example of route calculation, showing how a pair of straight locus lines is calculated.
FIG. 29 is a diagram explaining the example of route calculation, showing how a pair of turning locus lines according to the current steering angle of the host vehicle is calculated.
FIG. 30 is a diagram explaining the example of route calculation, showing how the pair of turning locus lines according to the current steering angle is superimposed on the bird's-eye view image.
FIG. 31 is a diagram explaining a specific example of route calculation, showing how the straight locus line and the turning locus line are connected when the driver changes the steering angle of the host vehicle by operating the steering.
FIG. 32 is a diagram showing another display form of the support image displayed on the monitor, showing an image example of the support image displayed on the monitor when the host vehicle is traveling before the steering angle change point.
FIG. 33 is a diagram showing another display form of the support image displayed on the monitor, showing an image example of the support image displayed on the monitor when the host vehicle is traveling past the steering angle change point.

  Hereinafter, specific embodiments of the present invention will be described in detail with reference to the drawings.

[First Embodiment]
<Configuration>
FIG. 1 is a configuration diagram of the driving support device according to the first embodiment of the present invention. The driving support device according to the present embodiment is configured around a control unit 10 that integrally controls the operation of the entire device. Connected to the control unit 10 are four in-vehicle cameras 1a to 1d, a start switch 2, a steering angle sensor 3, a wheel speed sensor 4, a route determination switch 5, a monitor 6, and a speaker 7.

  The in-vehicle cameras 1a to 1d are, for example, wide-angle CCD cameras or CMOS cameras having an angle of view of about 180 degrees, and the four in-vehicle cameras 1a to 1d are installed on the vehicle so that together they can capture the entire area surrounding the host vehicle. The in-vehicle cameras 1a to 1d each capture an image including, for example, a part of the body of the host vehicle, and output data of the captured image to the control unit 10 as needed.

  The start switch 2 is a switch operated by the driver of the host vehicle to start the driving support device of the present embodiment, and is installed near the steering wheel or on the instrument panel of the host vehicle. When operated by the driver, the start switch 2 outputs a start command for the driving support processing to the control unit 10.

  The steering angle sensor 3 is installed on the steering shaft of the host vehicle, detects the steering angle of the host vehicle according to the steering operation, and outputs a steering angle signal to the control unit 10.

  The wheel speed sensor 4 is installed near the wheel of the host vehicle, detects the rotational speed of the wheel, and outputs a vehicle speed pulse signal corresponding to the rotational speed of the wheel to the control unit 10.

  The route determination switch 5 is a switch operated by the driver to confirm the route calculated by the control unit 10 as the route of the host vehicle, and is installed near the steering wheel or on the instrument panel of the host vehicle. When operated by the driver, the route determination switch 5 outputs a route determination command to the control unit 10.

  The monitor 6 is composed of, for example, a liquid crystal display or the like installed on the center console of the host vehicle, and displays an image for driving support (hereinafter referred to as a support image) generated by the control unit 10. The speaker 7 outputs various guide sounds for driving support, and an audio speaker or the like generally mounted on a vehicle is used.

  The control unit 10 is, for example, an electronic control unit (ECU) including a microcomputer that operates according to a predetermined processing program, and realizes various functions for driving support when the processing program is executed by the CPU of the microcomputer. As shown in FIG. 1, the control functions realized by the control unit 10 include a viewpoint conversion unit 11, an overhead image generation unit 12, an image processing unit 13, a narrow portion position detection unit 14, an intermediate point calculation unit 15, a steering interlocking locus calculation unit 16, a turning outer peripheral side front end locus calculation unit 17, a turning inner peripheral side rear wheel locus calculation unit 18, a locus reversal calculation unit 19, a travel area calculation unit 20, a turning locus storage unit 21, a host vehicle icon storage unit 22, a host vehicle information storage unit 23, a movement amount calculation unit 24, an image update unit 25, an image synthesis unit 26, and a voice guide generation unit 27. Among these, the turning locus storage unit 21, the host vehicle icon storage unit 22, and the host vehicle information storage unit 23 are realized by a memory inside the control unit 10.

  The viewpoint conversion unit 11 converts the output images from the in-vehicle cameras 1a to 1d into images with the viewpoint directly above the host vehicle (rearrangement of each pixel).

  The bird's-eye view image generation unit 12 synthesizes the individual viewpoint-converted images output from the viewpoint conversion unit 11 and generates a bird's-eye view image in which the host vehicle is located substantially at the center and its surroundings are viewed from directly above.

  The image processing unit 13 performs image processing such as edge detection on the bird's-eye view image output from the bird's-eye view image generation unit 12 and detects obstacles ahead of the host vehicle, specifically three-dimensional obstacles such as walls, plantings, and utility poles existing on the left and right sides of the road. The image processing unit 13 may instead detect such three-dimensional obstacles by performing image processing on the captured images of the in-vehicle cameras 1a to 1d before they are converted into the overhead view image.

  Based on the output of the image processing unit 13, the narrow portion position detection unit 14 specifies the narrow portion position where the passable road width on the road ahead of the host vehicle is narrowed, and calculates the distance to the narrow portion position. Specifically, the narrow portion position detection unit 14 calculates the distance to the three-dimensional obstacle detected by the image processing unit 13 as a distance ahead of the vehicle (front-rear direction = Y-axis direction), using the focal length of the in-vehicle cameras 1a to 1d and the distance per pixel on the overhead view image. If another object detection device such as a laser range finder is mounted on the host vehicle, the narrow portion position may be specified and the distance calculated using the detection signal of that object detection device.
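  As an illustration of this distance calculation, the sketch below converts the pixel offset between the vehicle reference row and the detected obstacle edge in the metrically scaled overhead image into a forward distance. The meters-per-pixel value and the function name are assumptions for the example, not values from the patent.

```python
# Hypothetical meters-per-pixel scale of the overhead (bird's-eye view) image.
METERS_PER_PIXEL_Y = 0.02

def distance_to_narrow_portion(vehicle_ref_row: int, obstacle_row: int) -> float:
    """Forward (Y-axis) distance [m] from the vehicle reference point
    (e.g. the rear-axle center row in the overhead image) to the nearest
    edge of the detected three-dimensional obstacle."""
    pixel_offset = vehicle_ref_row - obstacle_row  # rows decrease toward the front of the vehicle
    return pixel_offset * METERS_PER_PIXEL_Y

# Example: obstacle edge 400 pixel rows ahead of the rear-axle row -> 8.0 m
print(distance_to_narrow_portion(vehicle_ref_row=700, obstacle_row=300))
```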

  The intermediate point calculation unit 15 calculates an intermediate point of the distance from the current position of the host vehicle to the narrow portion position (that is, the distance to the three-dimensional obstacle such as a utility pole). The intermediate point here is an arbitrary point on a straight line parallel to the straight line in the vehicle width direction (= X-axis direction) of the host vehicle passing through the reference point of the host vehicle (for example, the center point of the rear wheel axle), and the coordinate position finally calculated as the intermediate point is the intersection of this straight line with the steering interlocking trajectory calculated by the steering interlocking trajectory calculation unit 16 described later. In other words, the final intermediate point becomes the end point of the steering interlocking locus calculated by the steering interlocking locus calculation unit 16, and, as will be described in detail later, this intermediate point is the steering angle change point, that is, the position at which the host vehicle is stopped before the narrow portion position and the steering angle is changed.

  Based on the output of the steering angle sensor 3, the steering interlocking trajectory calculation unit 16 calculates the trajectory up to the intermediate point that the reference point of the host vehicle (for example, the rear wheel axle center point) follows when the host vehicle travels from the current position while maintaining the current steering angle (hereinafter referred to as the steering interlocking trajectory).
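  For intuition, the sketch below traces such a constant-steering-angle trajectory with a simple Ackermann (bicycle) model. The wheelbase value, the step size, and the sign conventions are assumptions for illustration; the patent itself reads pre-stored turning circle data from the turning trajectory storage unit 21 rather than integrating a model.

```python
import math

WHEELBASE = 2.7  # assumed wheelbase [m] for the illustration

def steering_interlocked_trajectory(delta_rad: float, step: float = 0.1, n: int = 100):
    """Trace the rear-axle center from (0, 0), initially heading along +Y,
    while the front-wheel steering angle delta_rad is held constant
    (positive delta_rad steers to the left)."""
    x, y, yaw = 0.0, 0.0, 0.0  # yaw measured counterclockwise from the +Y axis
    points = [(x, y)]
    for _ in range(n):
        x += step * -math.sin(yaw)
        y += step * math.cos(yaw)
        yaw += step * math.tan(delta_rad) / WHEELBASE
        points.append((x, y))
    return points
```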

  Based on the output of the steering angle sensor 3, the turning outer peripheral side front end locus calculation unit 17 calculates the trajectory followed by the front end on the turning outer peripheral side (the left front end during a right turn and the right front end during a left turn) when the host vehicle advances from the current position while maintaining the current steering angle (that is, along the steering interlocking locus).

  Similarly, based on the output of the steering angle sensor 3, the turning inner peripheral side rear wheel locus calculation unit 18 calculates the trajectory followed by the rear wheel on the turning inner peripheral side (the right rear wheel during a right turn and the left rear wheel during a left turn) when the host vehicle advances from the current position while maintaining the current steering angle (that is, along the steering interlocking locus).

  The trajectory reversal calculation unit 19 calculates the trajectory from the intermediate point serving as the steering angle change point to the narrow portion position (hereinafter referred to as the reversal trajectory) by rotating the steering interlocking trajectory up to the intermediate point, calculated by the steering interlocking trajectory calculation unit 16, by 180 degrees around the intermediate point. In addition, the trajectory reversal calculation unit 19 calculates the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel from the intermediate point serving as the steering angle change point to the narrow portion position by mirroring the trajectory of the turning outer peripheral side front end up to the intermediate point, calculated by the turning outer peripheral side front end trajectory calculation unit 17, and the trajectory of the turning inner peripheral side rear wheel up to the intermediate point, calculated by the turning inner peripheral side rear wheel trajectory calculation unit 18, about the longitudinal axis (Y axis) at the current position of the host vehicle, rotating them in the turning direction by the amount of change in the host vehicle's angle at the intermediate point, and then translating them.
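  The 180-degree rotation of the first locus about the intermediate point amounts to a point reflection of each trajectory point through the steering angle change point. A minimal sketch, with illustrative function names:

```python
def rotate_180_about(point, center):
    """Rotate (x, y) by 180 degrees around (cx, cy), i.e. a point reflection."""
    x, y = point
    cx, cy = center
    return (2.0 * cx - x, 2.0 * cy - y)

def reversal_trajectory(steering_interlocked_points, change_point):
    """Map every point of the steering interlocking trajectory through the
    steering angle change point to obtain the reversal trajectory."""
    return [rotate_180_about(p, change_point) for p in steering_interlocked_points]
```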

  The travel area calculation unit 20 calculates, as the travel region of the host vehicle, the trajectory that the entire vehicle body follows when the host vehicle travels along the steering interlocking trajectory and the reversal trajectory, based on the output of the steering interlocking trajectory calculation unit 16, the output of the turning outer peripheral side front end trajectory calculation unit 17, the output of the turning inner peripheral side rear wheel trajectory calculation unit 18, the output of the trajectory reversal calculation unit 19, and the size of the host vehicle.

  The turning trajectory storage unit 21 stores, for each steering angle of the host vehicle, information on the turning trajectory of each part of the vehicle (the left and right front ends and the rear wheel positions), such as the radius of the turning circle and the relative position of the center of the turning circle with respect to the reference position of the host vehicle.

  The own vehicle icon storage unit 22 stores information such as the design and size (full length and full width) of the own vehicle icon displayed on the overhead view image generated by the overhead view image generation unit 12.

  The host vehicle information storage unit 23 stores information such as the total length and width of the host vehicle, the positions of the left and right front end portions that are the outermost peripheral ends when the host vehicle is turning, and the ground contact position of the rear wheels.

  The movement amount calculation unit 24 calculates the movement amount of the host vehicle (front-rear direction, left-right direction, and yaw angle change amount) based on the output of the steering angle sensor 3 and the output of the wheel speed sensor 4.
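  A minimal dead-reckoning step of the kind this unit could perform is sketched below; the bicycle-model update, the wheelbase value, and the function interface are assumptions for illustration only.

```python
import math

WHEELBASE = 2.7  # assumed wheelbase [m] for the illustration

def update_pose(x, y, yaw, distance, delta_rad):
    """Advance the vehicle pose by `distance` metres travelled (derived from the
    wheel speed pulses) at front-wheel steering angle delta_rad, returning the new
    (x, y, yaw); yaw is measured counterclockwise from the +Y (forward) axis."""
    x += distance * -math.sin(yaw)
    y += distance * math.cos(yaw)
    yaw += distance * math.tan(delta_rad) / WHEELBASE
    return x, y, yaw
```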

  The image update unit 25 updates the display position, on the display screen of the monitor 6, of the route information on the route confirmed by the driver by operating the route determination switch 5 (for example, an image showing the travel region when the host vehicle travels along the steering interlocking trajectory and the reversal trajectory), in accordance with the movement amount of the host vehicle calculated by the movement amount calculation unit 24, so that the route information remains fixed to the environment (that is, to the background around the host vehicle displayed as the bird's-eye view image).

  The image synthesis unit 26 generates an image in which the route information on the route confirmed by the driver by operating the route determination switch 5 (for example, an image showing the travel region when the host vehicle travels along the steering interlocking trajectory and the reversal trajectory) and the host vehicle icon stored in the host vehicle icon storage unit 22 are superimposed on the overhead image generated by the overhead image generation unit 12, and outputs it to the monitor 6. In the present embodiment, an image indicating the travel region when the host vehicle travels along the steering interlocking trajectory and the reversal trajectory is superimposed as route information on the overhead image around the host vehicle and displayed on the monitor 6. However, the route information superimposed on the bird's-eye view image and displayed on the monitor 6 is not limited to the image of the travel region of the host vehicle. For example, a combination of the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel with the steering interlocking trajectory and the reversal trajectory may be superimposed on the overhead image as route information and displayed on the monitor 6, or only an outline indicating the outer edge of the travel region may be superimposed on the overhead image as route information and displayed on the monitor 6.

  The voice guide generation unit 27 specifies the position of the host vehicle on the route confirmed by the operation of the route determination switch 5 based on the movement amount of the host vehicle calculated by the movement amount calculation unit 24, generates a guide voice teaching the driver the necessary driving operation, and outputs it to the speaker 7.

<Processing by control unit>
Next, an outline of processing by the control unit 10 of the driving support apparatus of the present embodiment configured as described above will be described with reference to the flowchart of FIG. A series of processes shown in the flowchart of FIG. 2 is started when the driver of the host vehicle operates the start switch 2 of the driving support device.

  When a start command for the driving support processing is input by operating the start switch 2, the control unit 10 first generates, in step S101, a bird's-eye view image based on the images of the surroundings of the host vehicle captured by the in-vehicle cameras 1a to 1d. In step S102, image processing such as edge detection is performed on the overhead view image generated in step S101 to detect the narrow portion position ahead of the host vehicle. In this embodiment, the narrow portion position is detected by image processing on the overhead image, but it may instead be detected by image processing on the captured images of the in-vehicle cameras 1a to 1d, or by using another object detection device such as a laser range finder.

  Next, in step S103, the control unit 10 calculates an intermediate point from the current position of the host vehicle to the narrow position detected in step S102. As described above, the intermediate point here is obtained as an arbitrary point on a straight line along the X-axis direction between the current position and the narrow position of the host vehicle.

  Next, the control unit 10 reads the output of the steering angle sensor 3 in step S104. In step S105, the trajectory line of the turning radius corresponding to the output of the steering angle sensor 3 read in step S104 is read from the turning trajectory storage unit 21, and the steering interlocking locus followed when the host vehicle proceeds from the current position while maintaining the current steering angle is calculated. The starting point of the steering interlocking locus is the current position of the host vehicle, and the end point is the intermediate point, which is the intersection with the straight line calculated in step S103.

  Next, in step S106, the control unit 10 rotates the steering interlocking locus calculated in step S105 by 180 degrees around the intermediate point that is its end point, and thereby calculates the reversal locus, which is the trajectory from the intermediate point to the narrow portion position.

  Next, in step S107, the control unit 10 obtains the trajectory followed by the front end on the turning outer peripheral side of the host vehicle and the trajectory followed by the rear wheel on the turning inner peripheral side when the host vehicle travels along the steering interlocking locus calculated in step S105, and sets these as the turning outer peripheral side front end trajectory and the turning inner peripheral side rear wheel trajectory up to the intermediate point.

  Next, in step S108, the control unit 10 calculates the yaw angle change amount of the host vehicle at an intermediate point with respect to the current position of the host vehicle (an intermediate point that is the end point of the steering interlocking locus calculated in step S105).

  Next, in step S109, the control unit 10 mirrors the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel up to the intermediate point calculated in step S107 about the longitudinal axis (Y axis) at the current position of the host vehicle, rotates them in the turning direction by the yaw angle change amount calculated in step S108, and translates them, and sets the resulting trajectories as the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel when the host vehicle follows the reversal trajectory calculated in step S106.

  Next, in step S110, the control unit 10 calculates, as the travel region of the host vehicle, the trajectory followed by the entire vehicle body when the host vehicle travels along the steering interlocking trajectory and the reversal trajectory, based on the turning outer peripheral side front end trajectories and the turning inner peripheral side rear wheel trajectories calculated in steps S107 and S109 and the size information of the host vehicle stored in the host vehicle information storage unit 23.

  Next, in step S111, the control unit 10 causes the monitor 6 to display an image representing the travel region calculated in step S110 superimposed on the overhead image around the host vehicle generated in step S101. At this time, it is desirable to superimpose the host vehicle icon at the current position of the host vehicle, which is the starting point of the steering interlocking trajectory, at the intermediate point, which is the end point of the steering interlocking trajectory and the starting point of the reversal trajectory, and at the end point of the reversal trajectory.

  Next, in step S112, the control unit 10 monitors whether the route determination switch 5 has been operated by the driver of the host vehicle. If the route determination switch 5 has been operated, the control unit 10 proceeds to step S114. If the route determination switch 5 has not been operated, the control unit 10 monitors in step S113 whether the steering angle has been changed (whether the output of the steering angle sensor 3 has changed), and if the steering angle has been changed, it returns to step S104 and repeats the processing from step S104 onward. That is, the driver of the host vehicle grasps the route through the narrow portion position according to the current steering angle while looking at the support image displayed on the monitor 6, and, if the route according to the current steering angle is appropriate, operates the route determination switch 5 to confirm the route. If, however, the driver determines that the route according to the current steering angle may bring the vehicle into contact with a three-dimensional obstacle such as a utility pole, the driver changes the steering angle by operating the steering and searches for the optimum route.

  Next, in step S114, the control unit 10 determines the route that follows the steering interlocking locus calculated according to the current steering angle of the host vehicle, changes the steering angle at the intermediate point (steering angle change point) to reverse the turning direction, and then follows the reversal locus, as the route of the host vehicle for passing through the narrow portion position, and fixes (locks) the image of the travel region displayed on the monitor 6 to the environment side displayed as the overhead image.

  Next, in step S115, the control unit 10 causes the speaker 7 to output a guide sound for instructing the driver to advance the host vehicle while maintaining the current steering angle. Note that the message text having the above contents may be displayed on the monitor 6 instead of outputting the guide voice or together with the output of the guide voice.

  Next, in step S116, the control unit 10 calculates the movement amount of the host vehicle based on the output of the wheel speed sensor 4, and in step S117, while monitoring the movement amount calculated in step S116, determines whether the host vehicle has reached the intermediate point that is the end point of the steering interlocking locus calculated in step S105. When the host vehicle reaches the intermediate point, in step S118 a guide voice instructing the driver to stop the host vehicle and to turn the steering in the opposite direction so as to reverse the steering angle by the same angle is output from the speaker 7. In this case as well, a message text with the above contents may be displayed on the monitor 6 instead of, or together with, the output of the guide voice.

  Next, in step S119, the control unit 10 determines, while monitoring the output of the steering angle sensor 3, whether the steering angle of the host vehicle has reached the steering angle taught to the driver in step S118. When the steering angle of the host vehicle reaches the taught steering angle, a guide voice instructing the driver to advance the host vehicle while maintaining the current steering angle is output from the speaker 7 again in step S120. In this case as well, a message text with the above contents may be displayed on the monitor 6 instead of, or together with, the output of the guide voice.

  Next, in step S121, the control unit 10 calculates the movement amount of the host vehicle based on the output of the wheel speed sensor 4, and in step S122, while monitoring the movement amount calculated in step S121, determines whether the host vehicle has reached the narrow portion position that is the end point of the reversal locus calculated in step S106. When the host vehicle reaches the narrow portion position, the series of processes shown in the flowchart of FIG. 2 ends.

<Specific example 1 of route calculation for passing through narrow part>
Next, a method for calculating a path for passing through a narrow portion by the control unit 10 will be described in more detail while assuming a specific scene.

  Here, as shown in FIG. 3, assume a scene in which the host vehicle is about to travel on a narrow road partitioned by side walls 101 and 102 on both the left and right sides, and a utility pole 103 serving as an obstacle stands on the road on the left front side of the host vehicle. In such a traveling scene, the passable road width is narrowest when passing beside the utility pole 103, so the road position where the utility pole 103 exists is defined as the narrow portion position. The narrow portion position here is defined, as shown in FIG. 3, by the straight line 104 that is in contact with the end face of the utility pole 103 facing the host vehicle and is perpendicular (X-axis direction) to the front-rear direction (Y-axis direction) at the current position V of the host vehicle. The position of the host vehicle is defined by the straight line 106 in the X-axis direction with the rear wheel axle center 105 of the host vehicle as a reference point. The center of the rear wheel axle is used here as the reference point for the position of the host vehicle because it is convenient for calculating the trajectory in the Ackermann turning model, but another part, such as the left or right front end of the host vehicle, can also be used as the reference point for the position of the host vehicle.

  In the scene shown in FIG. 3, the control unit 10 calculates a route for passing through the narrow portion by the following method. First, the utility pole 103 ahead of the host vehicle is detected on the overhead image generated from the captured images of the in-vehicle cameras 1a to 1d, the road position where the utility pole 103 exists is recognized as the narrow portion position, and the straight line 104 indicating the narrow portion position is obtained. Further, the straight line 106 indicating the current position V of the host vehicle is obtained with the rear wheel axle center 105 of the host vehicle as a reference point. Then the distance L1 from the straight line 106 indicating the current position V of the host vehicle to the straight line 104 indicating the narrow portion position is calculated.

  Next, an intermediate point from the current position V of the host vehicle to the narrow portion position is calculated. As shown in FIG. 4, the intermediate point here is defined by the straight line 107, which is parallel to the straight line 106 indicating the current position V of the host vehicle and the straight line 104 indicating the narrow portion position, and for which the distance L2 from the straight line 106 and the distance L3 from the straight line 104 are equal. The intermediate point is defined by the straight line 107 at this stage because the steering interlocking trajectory described later varies depending on the steering angle of the host vehicle, so the intermediate point cannot yet be specified as a single point.

  Next, based on the output of the steering angle sensor 3, the steering interlocking trajectory followed when the host vehicle travels from the current position V while maintaining the current steering angle is calculated. As shown in FIG. 5, the steering interlocking trajectory is obtained as the locus 108 followed by the rear wheel axle center 105 serving as the reference point of the position of the host vehicle, and the intersection of the steering interlocking locus 108 with the straight line 107 indicating the intermediate point described above is set as the intermediate point (steering angle change point) 109, which is the end point of the steering interlocking locus 108.
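  Geometrically, locating the steering angle change point 109 amounts to intersecting the turning circle of the rear-axle center with the midline 107. A minimal sketch under assumed coordinates (current position at the origin, forward along +Y, midline at y = y_mid):

```python
import math

def change_point_on_midline(turn_center, radius, y_mid, turning_right=True):
    """Intersection of the rear-axle turning circle with the line y = y_mid,
    taking the intersection on the forward part of the arc."""
    cx, cy = turn_center
    dy = y_mid - cy
    if abs(dy) > radius:
        return None  # the arc at this steering angle never reaches the midline
    dx = math.sqrt(radius * radius - dy * dy)
    # For a right turn from the origin, the turning center lies toward +X,
    # so the forward part of the arc meets the midline on the -X side of the center.
    return (cx - dx, y_mid) if turning_right else (cx + dx, y_mid)

# Example: right turn, turning center (5, 0), radius 5 m, midline at y = 3 m -> (1.0, 3.0)
print(change_point_on_midline((5.0, 0.0), 5.0, 3.0))
```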

  Next, as shown in FIG. 6, the vehicle position and posture when the host vehicle travels along the steering interlocking locus 108 and reaches the steering angle change point 109 are obtained, and the position with the steering angle change point 109 as a reference point is taken as the predicted arrival position V1 of the host vehicle. Since the predicted arrival position V1 with the steering angle change point 109 as its reference point is the position where the host vehicle is stopped and the steering angle is changed, it is desirable, as described above, to superimpose the host vehicle icon at the corresponding position on the overhead view image displayed on the monitor 6, thereby clearly showing the driver where to stop and prompting the driver to stop and change the steering angle.

  Next, as shown in FIG. 7, the trajectory obtained by rotating the steering interlocking trajectory 108 by 180 degrees around the steering angle change point 109 is obtained and set as the reversal trajectory 110. The reversal trajectory 110 is the trajectory followed by the rear wheel axle center 105 when the steering angle is reversed by the same angle in the opposite direction at the predicted arrival position V1 with the steering angle change point 109 as its reference point. Its starting point is the steering angle change point 109, and its end point is the point 111 on the straight line 104 indicating the narrow portion position.

  Next, as shown in FIG. 8, the vehicle position and posture when the host vehicle turns along the reversal trajectory 110 and reaches the end point 111 are obtained, and the position with the end point 111 of the reversal trajectory 110 as a reference point is taken as the predicted arrival position V2 of the host vehicle. Since the predicted arrival position V2 with the end point 111 of the reversal trajectory 110 as its reference point is the position at which the support operation for properly passing through the narrow portion ends, it is desirable, as described above, to superimpose the host vehicle icon at the corresponding position on the overhead view image displayed on the monitor 6 and to clearly indicate the end position of the support operation to the driver.

  Next, the trajectory followed by the entire vehicle body of the host vehicle when the host vehicle travels along the steering interlocking trajectory 108 and the reversal trajectory 110 is calculated as the travel region. If the steering interlocking trajectory 108 and the reversal trajectory 110 are superimposed on the overhead view image, the host vehicle icon is further superimposed at the predicted arrival positions V1 and V2, and the result is displayed on the monitor 6, the driver can grasp the outline of the route calculated by the control unit 10 to a certain extent. However, in order to let the driver reliably grasp whether the route calculated by the control unit 10 is a route on which the host vehicle can actually travel without contacting the side walls 101 and 102 or the utility pole 103, it is desirable to obtain the travel region that the entire body of the host vehicle follows and to display an image representing that travel region superimposed on the overhead image. That is, if the image representing the travel region of the host vehicle is displayed on the overhead image so as to overlap the side walls 101 and 102 or the utility pole 103, it can easily be recognized that the route calculated by the control unit 10 would bring the vehicle into contact with these obstacles. From this point of view, the driving support device according to the present embodiment calculates the travel region that the entire vehicle body follows when the host vehicle travels along the steering interlocking trajectory 108 and the reversal trajectory 110, and superimposes an image representing that travel region on the overhead image as route information for display on the monitor 6.

  Considering the characteristics of the vehicle (Ackermann turning), the trajectory of the rear wheel on the turning outer peripheral side while the vehicle is turning lies inside the region swept by the vehicle body, and similarly the trajectory of the front end on the turning inner peripheral side lies inside the region swept by the vehicle body. For this reason, in calculating the travel region of the host vehicle, the trajectory of the turning outer peripheral side rear wheel and the trajectory of the turning inner peripheral side front end are unnecessary; what is required is the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel.

  Therefore, first, the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel until the host vehicle reaches the steering angle change point 109 (predicted arrival position V1) along the steering interlocking trajectory 108 are calculated. Specifically, the turning center of the steering interlocking trajectory 108 (a part of the circumference of a circle with a predetermined radius) is obtained, and then the circumference of the circle, centered on that turning center, passing through the turning outer peripheral side front end of the vehicle at the current position V and at the predicted arrival position V1, and the circumference of the circle passing through the position of the turning inner peripheral side rear wheel at the current position V and at the predicted arrival position V1 are obtained. Then, as shown in FIG. 9, by extracting from the former circumference the arc between the turning outer peripheral side front end of the host vehicle at the current position V and that at the predicted arrival position V1, the trajectory 112 of the turning outer peripheral side front end corresponding to the steering interlocking trajectory 108 is obtained. Similarly, by extracting from the latter circumference the arc between the position of the turning inner peripheral side rear wheel of the host vehicle at the current position V and that at the predicted arrival position V1, the trajectory 113 of the turning inner peripheral side rear wheel corresponding to the steering interlocking trajectory 108 is obtained.
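  A sketch of that arc extraction for one body point (for example the turning outer peripheral side front corner) is shown below; the coordinate conventions and function names are assumptions, and the angular interpolation assumes the swept angle does not wrap past ±π.

```python
import math

def body_point_arc(turn_center, point_at_V, point_at_V1, n: int = 30):
    """Sample the arc followed by one body point: the point stays on a circle about
    the common turning center, so the arc runs between its angular positions at the
    current position V and at the predicted arrival position V1."""
    cx, cy = turn_center
    radius = math.hypot(point_at_V[0] - cx, point_at_V[1] - cy)
    a0 = math.atan2(point_at_V[1] - cy, point_at_V[0] - cx)
    a1 = math.atan2(point_at_V1[1] - cy, point_at_V1[0] - cx)
    return [(cx + radius * math.cos(a0 + (a1 - a0) * i / n),
             cy + radius * math.sin(a0 + (a1 - a0) * i / n)) for i in range(n + 1)]
```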

  Next, the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel until the host vehicle reaches the predicted arrival position V2 along the reversal trajectory 110 are calculated. Specifically, the trajectory 112 of the turning outer peripheral side front end corresponding to the steering interlocking trajectory 108, calculated as described above, is mirrored left and right with respect to the front-rear direction (Y-axis direction) of the host vehicle at the current position V, and is then rotated in the turning direction by an angle corresponding to the yaw angle change amount at the steering angle change point 109 (predicted arrival position V1). Then, as shown in FIG. 9, by translating this trajectory in the X-axis and Y-axis directions so that it connects the turning outer peripheral side front end at the predicted arrival position V1 with the turning outer peripheral side front end at the predicted arrival position V2, the trajectory 114 of the turning outer peripheral side front end corresponding to the reversal trajectory 110 is obtained. Similarly, the trajectory 113 of the turning inner peripheral side rear wheel corresponding to the steering interlocking trajectory 108, calculated as described above, is mirrored left and right with respect to the front-rear direction (Y-axis direction) of the host vehicle at the current position V and rotated in the turning direction by an angle corresponding to the yaw angle change amount at the steering angle change point 109 (predicted arrival position V1). Then, by translating this trajectory in the X-axis and Y-axis directions so that it connects the position of the turning inner peripheral side rear wheel at the predicted arrival position V1 with that at the predicted arrival position V2, the trajectory 115 of the turning inner peripheral side rear wheel corresponding to the reversal trajectory 110 is obtained.
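  The mirror / rotate / translate operation just described can be sketched as follows, with the current position V taken as the origin and +Y as the vehicle's forward axis; all names and conventions here are illustrative assumptions.

```python
import math

def mirror_rotate(p, yaw_change):
    """Mirror a point about the Y axis through the current position V, then rotate
    it (counterclockwise for positive yaw_change) by the yaw angle change at the
    steering angle change point."""
    c, s = math.cos(yaw_change), math.sin(yaw_change)
    mx, my = -p[0], p[1]
    return (mx * c - my * s, mx * s + my * c)

def reversal_segment_arc(arc, yaw_change, anchor_from, anchor_to):
    """Transform a first-segment arc (e.g. trajectory 112 or 113) into the matching
    reversal-segment arc (114 or 115): mirror and rotate every point, then translate
    so that the body point `anchor_from` lands on `anchor_to`, its position at the
    predicted arrival position V1."""
    ax, ay = mirror_rotate(anchor_from, yaw_change)
    dx, dy = anchor_to[0] - ax, anchor_to[1] - ay
    return [(x + dx, y + dy) for x, y in (mirror_rotate(p, yaw_change) for p in arc)]
```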

  Next, based on the trajectory 112 of the turning outer peripheral side front end and the trajectory 113 of the turning inner peripheral side rear wheel corresponding to the steering interlocking trajectory 108, the trajectory 114 of the turning outer peripheral side front end and the trajectory 115 of the turning inner peripheral side rear wheel corresponding to the reversal trajectory 110, all obtained as described above, and the size of the host vehicle, the travel region that the entire vehicle body of the host vehicle follows when the host vehicle travels along the steering interlocking trajectory 108 and the reversal trajectory 110 is calculated. Specifically, as shown in FIG. 10, the region surrounded by the outline of the host vehicle at the current position V, the outline of the host vehicle at the predicted arrival position V1, the outline of the host vehicle at the predicted arrival position V2, the trajectory 112 of the turning outer peripheral side front end and the trajectory 113 of the turning inner peripheral side rear wheel corresponding to the steering interlocking trajectory 108, and the trajectory 114 of the turning outer peripheral side front end and the trajectory 115 of the turning inner peripheral side rear wheel corresponding to the reversal trajectory 110 is calculated as the travel region 116 that the entire vehicle body follows when the host vehicle travels along the steering interlocking trajectory 108 and the reversal trajectory 110.
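  One simplified way to stitch those boundary pieces into a single closed polygon is sketched below; this is an illustrative reading of FIG. 10, in which the outer arcs 112 and 114 and the inner arcs 115 and 113 are joined by the relevant portions of the vehicle outlines at V2 and V, and the function and argument names are assumptions.

```python
def travel_region_polygon(outer_arc_112, outer_arc_114, outline_V2_front,
                          inner_arc_115, inner_arc_113, outline_V_rear):
    """Concatenate the boundary pieces into one closed list of (x, y) vertices.
    The inner arcs are reversed so the boundary is traversed in a single loop:
    along the outer arcs toward V2, across the vehicle outline at V2, back along
    the inner arcs toward V, and across the vehicle outline at V."""
    boundary = []
    boundary += outer_arc_112                  # outer front-corner arc, first segment
    boundary += outer_arc_114                  # outer front-corner arc, reversal segment
    boundary += outline_V2_front               # outline of the vehicle at V2
    boundary += list(reversed(inner_arc_115))  # inner rear-wheel arc, reversal segment
    boundary += list(reversed(inner_arc_113))  # inner rear-wheel arc, first segment
    boundary += outline_V_rear                 # outline of the vehicle at V
    return boundary
```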

  When the control unit 10 has calculated the travel region 116 of the host vehicle as described above, it superimposes an image representing the travel region 116 on the overhead image as route information and displays it on the monitor 6. The driver of the host vehicle can thereby accurately judge, by referring to the support image displayed on the monitor 6, whether the route calculated according to the current steering angle is the optimum route for passing through the narrow portion. If the driver determines that it is not the optimum route, the driver can search for the optimum route by changing the steering angle through steering operation while viewing the support image displayed on the monitor 6.

  FIG. 11 shows the changes in the predicted arrival positions V1 and V2 when the steering angle of the host vehicle at the current position V is changed. In this example, the steering direction is to the right, and the steering angle is switched between three values θ1, θ2, and θ3 (θ1 > θ2 > θ3); the predicted arrival positions V1 and V2 at the steering angle θ1 are shown by broken lines, those at the steering angle θ2 by alternate long and short dash lines, and those at the steering angle θ3 by solid lines. In the example shown in FIG. 11, when the steering angle of the host vehicle at the current position V is θ1 or θ2, the utility pole 103 can be avoided, but the right front end of the host vehicle contacts the right side wall 102; by setting the steering angle at the current position V to θ3, however, it is possible to pass through the narrow portion position without contacting the side wall 102 or the utility pole 103. In this way, the driver of the host vehicle can search for the optimum route by checking, for each steering angle of the host vehicle at the current position V, whether contact with the side wall 102 or the utility pole 103 would occur.

  In the above example, the determination of whether the route is optimum is left to the driver of the host vehicle. However, a means for detecting, for example by image processing on the overhead view image, a passable area in which the vehicle can move without contacting obstacles such as the side walls 101 and 102 and the utility pole 103 may be provided; if a part of the travel region of the host vehicle protrudes from the passable area, the route may automatically be determined not to be the optimum route, and the display may be interrupted or a guide voice prompting the driver of the host vehicle to change the steering angle may be output.

  In the above example, the case where only one narrow portion position is detected on the road ahead of the host vehicle has been described. When a plurality of narrow portion positions are detected, the vehicle first passes through the narrow portion position closest to the host vehicle, and the same processing may then be repeated for the next narrow portion position.

<Specific example 2 of route calculation for passing through narrow part>
Next, a route calculation method will be described for the case shown in FIG. 12, in which the utility pole 103 is installed on the left front side of the host vehicle on a narrow road partitioned by the side walls 101 and 102 on the left and right sides, and the position and posture of the host vehicle at its current position differ from those in the preceding example.

  As in the example described above, the control unit 10 detects the utility pole 103 ahead of the host vehicle on the overhead view image generated from the captured images of the in-vehicle cameras 1a to 1d, and recognizes the road position where the utility pole 103 exists as the narrow portion position. The straight line 204, which is in contact with the end face of the utility pole 103 facing the host vehicle and runs in the direction (X-axis direction) perpendicular to the front-rear direction (Y-axis direction) at the current position V of the host vehicle, is calculated as the straight line indicating the narrow portion position. Further, the straight line 206 indicating the current position V of the host vehicle is obtained with the rear wheel axle center 205 of the host vehicle as a reference point. Then the distance L21 from the straight line 206 indicating the current position V of the host vehicle to the straight line 204 indicating the narrow portion position is calculated.

  Next, as shown in FIG. 13, the straight line 207 is obtained for which the distance L22 from the straight line 206 indicating the current position V of the host vehicle and the distance L23 from the straight line 204 indicating the narrow portion position are equal, and this straight line 207 is defined as the straight line indicating the intermediate point from the current position V of the host vehicle to the narrow portion position.

  Next, as shown in FIG. 14, the trajectory 208 followed by the rear wheel axle center 205, the reference point for the position of the host vehicle, when the host vehicle travels from the current position V while maintaining the current steering angle is obtained as the steering interlocking trajectory. Then, the intersection of the steering interlocking locus 208 with the straight line 207 indicating the intermediate point is set as the intermediate point (steering angle change point) 209, which is the end point of the steering interlocking locus 208.

  Next, as shown in FIG. 15, the vehicle position and posture when the host vehicle travels along the steering interlocking locus 208 and reaches the steering angle change point 209 are obtained, and the position with the steering angle change point 209 as a reference point is taken as the predicted arrival position V1 of the host vehicle.

  Next, as shown in FIG. 16, the trajectory obtained by rotating the steering interlocking trajectory 208 by 180 degrees around the steering angle change point 209 is obtained and set as the reversal trajectory 210. The reversal trajectory 210 is the trajectory followed by the rear wheel axle center 205 when the steering angle is reversed by the same angle in the opposite direction at the predicted arrival position V1 with the steering angle change point 209 as its reference point. Its starting point is the steering angle change point 209, and its end point is the point 211 on the straight line 204 indicating the narrow portion position.

  Next, as shown in FIG. 17, the vehicle position and posture when the host vehicle turns along the reversal trajectory 210 and reaches the end point 211 are obtained, and the position with the end point 211 of the reversal trajectory 210 as a reference point is taken as the predicted arrival position V2 of the host vehicle.

  Next, as shown in FIG. 18, the trajectory 212 of the turning outer peripheral side front end and the trajectory 213 of the turning inner peripheral side rear wheel when the host vehicle travels along the steering interlocking trajectory 208, and the trajectory 214 of the turning outer peripheral side front end and the trajectory 215 of the turning inner peripheral side rear wheel when the host vehicle travels along the reversal trajectory 210, are obtained. Specifically, the trajectory 212 of the turning outer peripheral side front end corresponding to the steering interlocking locus 208 is obtained from the turning center of the steering interlocking locus 208 and the positions of the turning outer peripheral side front end of the host vehicle at the current position V and at the predicted arrival position V1. Similarly, the trajectory 213 of the turning inner peripheral side rear wheel corresponding to the steering interlocking locus 208 is obtained from the turning center of the steering interlocking locus 208 and the positions of the turning inner peripheral side rear wheel of the host vehicle at the current position V and at the predicted arrival position V1. Further, the trajectory 214 of the turning outer peripheral side front end corresponding to the reversal trajectory 210 is obtained by mirroring the trajectory 212 of the turning outer peripheral side front end corresponding to the steering interlocking locus 208 left and right with respect to the front-rear direction of the host vehicle at the current position V, rotating it in the turning direction by an angle corresponding to the yaw angle change amount at the steering angle change point 209 (predicted arrival position V1), and translating it in the X-axis and Y-axis directions. Similarly, the trajectory 215 of the turning inner peripheral side rear wheel corresponding to the reversal trajectory 210 is obtained by mirroring the trajectory 213 of the turning inner peripheral side rear wheel corresponding to the steering interlocking locus 208 left and right with respect to the front-rear direction of the host vehicle at the current position V, rotating it in the turning direction by an angle corresponding to the yaw angle change amount at the steering angle change point 209 (predicted arrival position V1), and translating it in the X-axis and Y-axis directions.

  Next, as shown in FIG. 19, the region surrounded by the outline of the host vehicle at the current position V, the outline of the host vehicle at the predicted arrival position V1, the outline of the host vehicle at the predicted arrival position V2, the trajectory 212 of the turning outer peripheral side front end and the trajectory 213 of the turning inner peripheral side rear wheel corresponding to the steering interlocking locus 208, and the trajectory 214 of the turning outer peripheral side front end and the trajectory 215 of the turning inner peripheral side rear wheel corresponding to the reversal trajectory 210 is calculated as the travel region 216 that the entire vehicle body follows when the host vehicle travels along the steering interlocking trajectory 208 and the reversal trajectory 210.

  When the control unit 10 has calculated the travel region 216 of the host vehicle as described above, it superimposes an image representing the travel region 216 on the bird's-eye view image as route information and displays it on the monitor 6. By referring to the support image displayed on the monitor 6, the driver of the host vehicle can accurately judge whether the route calculated according to the current steering angle is the optimum route for passing through the narrow portion. If the driver determines that it is not the optimum route, the optimum route can be searched for by operating the steering wheel to change the steering angle while viewing the support image displayed on the monitor 6.

  FIG. 20 shows how the predicted arrival positions V1 and V2 change when the steering angle of the host vehicle at the current position V is changed. In this example, the steering direction is to the right and the steering angle is switched among three values θ1, θ2, and θ3 (θ1 > θ2 > θ3); the predicted arrival positions V1 and V2 at the steering angle θ1 are shown by broken lines, those at the steering angle θ2 by alternate long and short dash lines, and those at the steering angle θ3 by solid lines. In the example shown in FIG. 20, when the steering angle of the host vehicle at the current position V is θ1, the utility pole 103 can be avoided but the right side of the host vehicle contacts the right side wall 102, and when the steering angle is θ3, the left front end of the host vehicle contacts the left side wall 101 after avoiding the utility pole 103. By setting the steering angle of the host vehicle at the current position V to θ2, however, the narrow portion position can be passed without contacting the side walls 101 and 102 or the utility pole 103. In this way, the driver of the host vehicle can search for the optimum route by checking, for each steering angle at the current position V, whether the host vehicle would contact the side walls 101 and 102 or the utility pole 103.
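The check the driver performs here can be pictured as a small search over candidate steering angles. The sketch below illustrates that idea with hypothetical helpers: compute_travel_region stands in for the travel region calculation described earlier, and the obstacles are assumed to be shapely geometries. The patent leaves this judgment to the driver, so this is only an illustrative automation of the same comparison.

```python
def feasible_steering_angles(candidate_angles, compute_travel_region, obstacles):
    """Keep only the steering angles whose travel region touches no obstacle.

    candidate_angles      : steering angles to try (e.g. [theta1, theta2, theta3])
    compute_travel_region : hypothetical helper returning the swept region (a polygon)
    obstacles             : geometries for the side walls 101, 102 and the pole 103
    """
    feasible = []
    for theta in candidate_angles:
        region = compute_travel_region(theta)      # area swept by the whole body
        if not any(region.intersects(obs) for obs in obstacles):
            feasible.append(theta)
    return feasible
```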

<Other display forms of support images>
By the way, in the above description, the control unit 10 uses the images captured by the in-vehicle cameras 1a to 1d to generate a bird's-eye view image of the surroundings of the host vehicle seen from directly above, and displays on the monitor 6 a support image in which the travel region of the host vehicle is superimposed on this bird's-eye view image. However, because a bird's-eye view of the surroundings seen from directly above differs greatly from the scenery the driver actually sees, some drivers may find it difficult to associate the image with the actual scenery. From this viewpoint, as shown in FIG. 21, for example, an image close to the actual scenery in front of the host vehicle as seen from the driver's viewpoint position may be used as the base image, and a support image in which route information such as an image representing the travel region of the host vehicle is superimposed on this base image may be displayed on the monitor 6.

  In order to display the support image shown in FIG. 21 on the monitor 6, an in-vehicle camera may capture an image of the area ahead of the host vehicle, including the hood tip F, from an upper vehicle body position close to the viewpoint position of the driver, and the image representing the travel region of the host vehicle described above may be converted into an image seen from the viewpoint position of that in-vehicle camera and superimposed on the captured image.
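One simple way to perform such a conversion is to project each ground-plane point of the travel region through a pinhole camera model placed at the in-vehicle camera position. The sketch below assumes a vehicle frame with x to the right, y forward, and z up, a camera pitched downward by a fixed angle, and hypothetical parameter names; it is not the patent's own conversion method.

```python
import numpy as np

def project_ground_point(p_ground, cam_height, cam_pitch, f, cx, cy):
    """Project a ground-plane point (x, y) in the vehicle frame into the pixel
    coordinates of a forward-looking camera mounted cam_height above the
    vehicle origin and pitched down by cam_pitch [rad]."""
    x, y = p_ground
    d = np.array([x, y, -cam_height])                       # point relative to the camera
    forward = np.array([0.0, np.cos(cam_pitch), -np.sin(cam_pitch)])
    right   = np.array([1.0, 0.0, 0.0])
    down    = np.array([0.0, -np.sin(cam_pitch), -np.cos(cam_pitch)])
    z = d @ forward                                          # depth along the optical axis
    if z <= 0.0:
        return None                                          # behind the camera
    return cx + f * (d @ right) / z, cy + f * (d @ down) / z
```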

  FIG. 22 shows a specific installation position of the in-vehicle camera 1 for capturing the image (an image close to the actual scenery ahead of the host vehicle as seen from the driver's viewpoint position) that serves as the base of the support image shown in FIG. 21. To capture such an image with an in-vehicle camera 1 installed on the host vehicle, it is best to install the in-vehicle camera 1 near the roof-side base of the host vehicle's pillar. This is because this position is close to the driver's viewpoint position in the XY plane (the front-rear direction and the vehicle width direction of the host vehicle) and is higher than the driver's viewpoint position in the Z-axis direction (the height direction of the host vehicle), so the hood of the host vehicle can be looked down on.

  The angle of the camera optical axis L of the in-vehicle camera 1 is preferably directed to the left and right front end positions of the hood of the host vehicle in the XY plane. Therefore, in the example shown in FIG. 21, two in-vehicle cameras 1 are used, the optical axis L of one in-vehicle camera 1 is directed to the left front end position of the hood, and the optical axis L of the other in-vehicle camera 1 is directed to the right front end of the hood. The images taken by these two in-vehicle cameras 1 are combined in the left-right direction.

  Further, the angle of the camera optical axis L of the in-vehicle camera 1 in the Z-axis direction is preferably set so that the hood of the host vehicle appears in the captured image to an extent that allows the forward extension lines of the left and right side surfaces of the host vehicle to be imagined; specifically, it is desirable that the hood occupies roughly the lower quarter of the image in the vertical direction. As for the area ahead of the left and right ends of the host vehicle, it is desirable that the road surface up to about 20 m ahead appears in the image, although this depends on the vehicle speed and the vehicle width.

  As long as an image satisfying the above conditions can be obtained, the installation position of the in-vehicle camera 1 and the angle of the camera optical axis L are not limited to the example of FIG. 22. For example, the camera may be installed at a different position with its focal length and angle of view adjusted, and only the necessary part of the captured image may be cut out and used. Furthermore, the wide-angle in-vehicle cameras 1a to 1d installed at the front, rear, left, and right of the host vehicle for generating the bird's-eye view image described above may be used, with pixels at the necessary positions extracted from their captured images and converted into an image seen from near the driver's viewpoint position.

<Effects of First Embodiment>
As described above in detail with specific examples, according to the travel support device of the present embodiment, a series of routes consisting of the steering interlocking trajectory, the steering angle change point, and the reversing trajectory is calculated, and route information such as an image of the travel region of the host vehicle indicating this route is displayed on the monitor 6. The driver can therefore accurately recognize the optimum route for passing through the narrow portion, and even a beginner unfamiliar with driving can be supported so as to pass through the narrow portion with confidence.

  Further, according to the travel support device of the present embodiment, the reversing trajectory is obtained by rotating the steering angle interlocking trajectory from the current position to the steering angle change point by 180° around the intermediate point serving as the steering angle change point, and the route connecting the steering angle interlocking trajectory and the reversing trajectory is calculated as the narrow portion passage route, so the calculation load for calculating the route can be reduced.

  Further, according to the driving support device of the present embodiment, the region that the entire vehicle body follows when the host vehicle travels along the steering interlocking trajectory and the reversing trajectory is calculated as the travel region of the host vehicle, and an image representing this travel region is displayed on the monitor 6 as route information, so the driver can grasp at a glance whether the route calculated according to the current steering angle of the host vehicle is the optimum route. Besides the image representing the travel region of the host vehicle, a similar effect can also be obtained by displaying on the monitor 6, as route information, the trajectory of the turning outer peripheral side front end of the host vehicle and the trajectory of the turning inner peripheral side rear wheel of the host vehicle when the host vehicle travels along the steering interlocking trajectory and the reversing trajectory.

  Further, according to the travel support device of the present embodiment, the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel corresponding to the reversing trajectory are calculated by reversing the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel corresponding to the steering interlocking trajectory, rotating them by the angle change of the host vehicle at the steering angle change point, and translating them, so the calculation load for calculating the trajectory of the turning outer peripheral side front end and the trajectory of the turning inner peripheral side rear wheel can be reduced.

  Further, according to the driving support device of the present embodiment, when the route is determined by the driver operating the route determination switch 5, the route information indicating this route is fixed to the environment side, and the display position of the route information on the display screen of the monitor 6 is updated according to the movement amount of the host vehicle, so the route information can be displayed correctly on the monitor 6 even after the host vehicle starts moving.

  Moreover, according to the driving support device of this embodiment, a bird's-eye view image looking down on the surroundings of the host vehicle from directly above is generated using the images captured by the in-vehicle cameras 1a to 1d, and an image in which the route information is superimposed on this bird's-eye view image is displayed on the monitor 6 as a support image, so the driver can recognize the route for passing through the narrow portion in an overall manner. Alternatively, an image of the area ahead of the host vehicle including the left and right front end portions of the hood may be captured by the in-vehicle camera 1 from an upper vehicle body position close to the driver's viewpoint, and the route information superimposed on this image may be displayed on the monitor 6 as a support image, in which case the driver can recognize the route for passing through the narrow portion in association with the actual scenery.

[Second Embodiment]
<Configuration>
FIG. 23 is a configuration diagram of a travel support apparatus according to the second embodiment of the present invention. The driving support device of the present embodiment is different from the first embodiment described above in the method of calculating a route for passing through a narrow portion in the control unit 10 and the content of the support image displayed on the monitor 6. The basic configuration of the apparatus is the same as that of the first embodiment described above, but the functional configuration realized by the control unit 10 is different from that of the first embodiment. Hereinafter, the configuration of the control unit 10 characteristic of the present embodiment will be described. Note that the same reference numerals are used for the components common to the first embodiment.

  As in the first embodiment, the control unit 10 in the travel support apparatus according to the present embodiment is an electronic control unit (ECU) built around a microcomputer, and realizes various control functions for driving support by executing a predetermined processing program. As shown in FIG. 23, the various control functions realized by the control unit 10 include a viewpoint conversion unit 11, an overhead image generation unit 12, an image processing unit 13, a narrow position detection unit 14, a steering angle change point calculation unit 31, a straight trajectory line calculation unit 32, a turning trajectory line calculation unit 33, a turning trajectory storage unit 21, a host vehicle information storage unit 23, a movement amount calculation unit 24, a host vehicle position determination unit 34, a display control unit 35, an image update unit 25, an image synthesis unit 26, and a voice guide generation unit 27. Of these, the turning trajectory storage unit 21 and the host vehicle information storage unit 23 are realized by memory inside the control unit 10.

  The viewpoint conversion unit 11 converts the output images from the in-vehicle cameras 1a to 1d into images with the viewpoint directly above the host vehicle (rearrangement of each pixel).

  The bird's-eye view image generation unit 12 synthesizes the individual viewpoint-converted images output from the viewpoint conversion unit 11 and generates a bird's-eye view image, roughly centered on the host vehicle, in which the surroundings are looked down on from directly above.
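As an illustration of this kind of viewpoint conversion and composition, the ground plane seen by each camera can be mapped into the overhead view with a planar homography. The snippet below is a minimal sketch using OpenCV; the calibration point coordinates are placeholder values, and composing the four warped views around the host vehicle is only indicated in a comment.

```python
import cv2
import numpy as np

# Hypothetical calibration: four ground points in the front-camera image (src)
# and their desired locations in the overhead view (dst), both in pixels.
src = np.float32([[420, 540], [860, 540], [1180, 700], [100, 700]])
dst = np.float32([[300, 100], [500, 100], [500, 300], [300, 300]])
H_front = cv2.getPerspectiveTransform(src, dst)

def to_overhead(frame, homography, size=(800, 800)):
    """Warp one camera frame onto the ground plane as seen from directly above."""
    return cv2.warpPerspective(frame, homography, size)

# The surround view is then composed by pasting the four warped images
# (front, rear, left, right) into their respective areas around the host vehicle.
```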

  The image processing unit 13 performs image processing such as edge detection on the bird's-eye view image output from the bird's-eye view image generation unit 12, and detects obstacles ahead of the host vehicle, specifically three-dimensional obstacles such as walls and plantings on the left and right sides of the road and utility poles. The image processing unit 13 may instead detect such three-dimensional obstacles by performing image processing on the images captured by the in-vehicle cameras 1a to 1d before they are converted into the bird's-eye view image.

  Based on the output of the image processing unit 13, the narrow portion position detection unit 14 specifies a narrow portion position where the passable road width ahead of the host vehicle is narrow, and calculates the distance to a three-dimensional obstacle such as a utility pole existing at the narrow portion position. The narrow portion position detection unit 14 also calculates the angle of the three-dimensional obstacle with respect to the front-rear direction (Y-axis direction) of the host vehicle from the installation positions and angles of the in-vehicle cameras 1a to 1d relative to the host vehicle and the position of the three-dimensional obstacle such as a utility pole. If another object detection device such as a laser range finder is mounted on the host vehicle, the detection signal of that object detection device may be used to specify the narrow portion position and to calculate the distance and angle.

  The rudder angle change point calculation unit 31 calculates the rudder angle change point, which is the position where the host vehicle is stopped before the narrow portion position detected by the narrow portion position detection unit 14 and the rudder angle is changed. Specifically, the rudder angle change point calculation unit 31 sets an avoidance point, which serves as a reference for avoiding the three-dimensional obstacle, at a position that allows a predetermined margin (for example, a distance corresponding to one utility pole) with respect to the three-dimensional obstacle such as a utility pole existing at the narrow portion position. It also obtains the turning circle (for example, the minimum turning circle) that the turning outer peripheral rear wheel follows when the host vehicle turns from the current position at a predetermined steering angle (for example, the steering angle when the steering wheel is fully turned). It then obtains a straight line that is tangent to this turning circle and passes through the set avoidance point, and calculates the position at which the turning outer peripheral side surface of the host vehicle coincides with this straight line as the rudder angle change point.
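The key geometric step here is finding a straight line through the avoidance point that is tangent to the turning circle. The sketch below shows one standard way to compute the two candidate tangent points; the function and argument names are hypothetical, and choosing the tangent appropriate for the turning direction is left to the caller.

```python
import numpy as np

def tangent_points(center, radius, point):
    """Return the two points on the turning circle where a line through `point`
    (the avoidance point) is tangent to the circle."""
    c = np.asarray(center, dtype=float)
    p = np.asarray(point, dtype=float)
    v = p - c
    d = np.linalg.norm(v)
    if d <= radius:
        raise ValueError("avoidance point lies on or inside the turning circle")
    alpha = np.arccos(radius / d)    # angle between v and the radius to a tangent point
    base = np.arctan2(v[1], v[0])
    t1 = c + radius * np.array([np.cos(base + alpha), np.sin(base + alpha)])
    t2 = c + radius * np.array([np.cos(base - alpha), np.sin(base - alpha)])
    return t1, t2
```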

  The straight locus line calculation unit 32 obtains, based on the straight line calculated by the steering angle change point calculation unit 31, a line segment that starts from the position of the turning outer peripheral front end when the host vehicle reaches the steering angle change point and extends toward the narrow portion position, and another line segment that is parallel to this line segment, is separated from it by a distance corresponding to the vehicle width, and starts from the position of the front end on the opposite (turning inner peripheral) side when the host vehicle reaches the steering angle change point and extends toward the narrow portion position. This pair of line segments is calculated as the straight trajectory lines of the host vehicle from the steering angle change point to the narrow portion position.

  Based on the output of the steering angle sensor 3, the turning trajectory line calculation unit 33 calculates the pair of turning circles that the left and right front ends of the host vehicle follow when the host vehicle travels from the current position while maintaining the current steering angle, and uses them as the turning trajectory lines corresponding to the steering angle of the host vehicle.

  The turning trajectory storage unit 21 stores, for each steering angle of the host vehicle, turning trajectory information on each portion of the vehicle (the left and right front ends and the rear wheel positions), such as the radius of the turning circle and the position of the turning circle center relative to the reference position of the host vehicle.
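One way to picture what such a table holds is the usual low-speed Ackermann relationship between steering angle and turning radius. The sketch below derives a few stored quantities from that relationship; the vehicle dimensions and the particular set of radii are assumptions made for illustration, not values taken from the patent.

```python
import numpy as np

WHEELBASE = 2.7        # [m] assumed
TRACK = 1.5            # [m] assumed
FRONT_OVERHANG = 0.9   # [m] assumed

def turning_geometry(steering_angle):
    """Turning-circle data that could be tabulated per steering angle: the center
    (relative to the rear-axle midpoint, on the side of the turn) and the radii
    followed by the rear wheels and the outer front corner. Assumes a nonzero
    steering angle and the low-speed Ackermann approximation."""
    r_rear_center = WHEELBASE / np.tan(abs(steering_angle))
    center = np.array([r_rear_center, 0.0])          # lies on the rear-axle line
    return {
        "center": center,
        "rear_inner": r_rear_center - TRACK / 2,
        "rear_outer": r_rear_center + TRACK / 2,
        "front_outer_corner": np.hypot(r_rear_center + TRACK / 2,
                                       WHEELBASE + FRONT_OVERHANG),
    }
```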

  The host vehicle information storage unit 23 stores information such as the total length and width of the host vehicle, the positions of the left and right front end portions that are the outermost peripheral ends when the host vehicle is turning, and the ground contact position of the rear wheels.

  The movement amount calculation unit 24 calculates the movement amount of the host vehicle (front-rear direction, left-right direction, and yaw angle change amount) based on the output of the steering angle sensor 3 and the output of the wheel speed sensor 4.
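For illustration, such a movement amount can be accumulated by dead reckoning from the wheel speed and steering angle outputs. The sketch below uses a kinematic bicycle model; the patent does not prescribe a particular model, so this is only one plausible choice, and the wheelbase value would have to come from the host vehicle information.

```python
import numpy as np

def update_pose(x, y, yaw, v, steering_angle, wheelbase, dt):
    """One dead-reckoning step: integrate vehicle speed v (from the wheel speed
    sensor) and the steering angle over a time step dt."""
    x += v * np.cos(yaw) * dt
    y += v * np.sin(yaw) * dt
    yaw += v / wheelbase * np.tan(steering_angle) * dt
    return x, y, yaw
```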

  Based on the movement amount of the host vehicle calculated by the movement amount calculation unit 24, the host vehicle position determination unit 34 determines whether the host vehicle is traveling at a position before the steering angle change point or at a position beyond the steering angle change point, and outputs the determination result to the display control unit 35 described later.

  When the host vehicle position determination unit 34 determines that the host vehicle is traveling before the steering angle change point, the display control unit 35 outputs to the image update unit 25 information on both the turning trajectory lines corresponding to the current steering angle calculated by the turning trajectory line calculation unit 33 and the straight trajectory lines corresponding to the vehicle width calculated by the straight trajectory line calculation unit 32. When the host vehicle position determination unit 34 determines that the host vehicle is traveling at a position beyond the steering angle change point, the display control unit 35 outputs only the information on the straight trajectory lines corresponding to the vehicle width calculated by the straight trajectory line calculation unit 32 to the image update unit 25.

  The image update unit 25 updates the display position of the straight trajectory lines on the display screen of the monitor 6 according to the movement amount of the host vehicle calculated by the movement amount calculation unit 24, so that the straight trajectory lines corresponding to the vehicle width calculated by the straight trajectory line calculation unit 32 are displayed on the monitor 6 as if fixed to the environment (the background around the host vehicle shown in the bird's-eye view image).
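As an illustration of keeping an overlay fixed to the environment while the vehicle moves, the sketch below re-projects route points held in an environment-fixed frame into the current display frame using the accumulated movement amount. The axis conventions, the pixel scale, and the function name are assumptions made for the example.

```python
import numpy as np

def world_to_display(route_points_world, vehicle_pose, px_per_m, screen_center):
    """Re-project environment-fixed route points into the current display frame.

    route_points_world : (N, 2) points stored when the route was determined
    vehicle_pose       : (x, y, yaw) accumulated movement amount of the host vehicle
    """
    x, y, yaw = vehicle_pose
    c, s = np.cos(-yaw), np.sin(-yaw)
    R = np.array([[c, -s], [s, c]])                  # rotation from environment to vehicle frame
    pts = (np.asarray(route_points_world, dtype=float) - np.array([x, y])) @ R.T
    u = screen_center[0] + pts[:, 0] * px_per_m      # vehicle x drawn to the right
    v = screen_center[1] - pts[:, 1] * px_per_m      # vehicle y (forward) drawn upward
    return np.stack([u, v], axis=1)
```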

  The image synthesis unit 26 generates a support image in which the turning trajectory lines corresponding to the current steering angle of the host vehicle calculated by the turning trajectory line calculation unit 33, the straight trajectory lines corresponding to the vehicle width calculated by the straight trajectory line calculation unit 32, and pole-shaped icons (hereinafter referred to as pole icons) indicating the left and right front end positions of the host vehicle when it reaches the steering angle change point are superimposed on the bird's-eye view image generated by the bird's-eye view image generation unit 12, and outputs the support image to the monitor 6.

  The voice guide generation unit 27 specifies the position of the host vehicle on the route for passing through the narrow portion based on the movement amount of the host vehicle calculated by the movement amount calculation unit 24, generates a guide voice for teaching the driver the necessary driving operation, and outputs it to the speaker 7.

<Processing by control unit>
Next, an outline of processing by the control unit 10 of the driving support apparatus of the present embodiment configured as described above will be described with reference to the flowchart of FIG. A series of processes shown in the flowchart of FIG. 24 is started when the driver of the host vehicle operates the start switch 2 of the driving support device.

  When a start command for the driving support processing is input by operating the start switch 2, the control unit 10 first generates a bird's-eye view image in step S201 based on the images around the host vehicle captured by the in-vehicle cameras 1a to 1d. In step S202, image processing such as edge detection is performed on the bird's-eye view image generated in step S201 to detect a narrow portion position ahead of the host vehicle. In this embodiment, the narrow portion position is detected by image processing on the bird's-eye view image, but it may instead be detected by image processing on the images captured by the in-vehicle cameras 1a to 1d, or by using another object detection device such as a laser range finder.

  Next, in step S203, the control unit 10 sets an avoidance point at a position where a predetermined margin is taken with respect to a three-dimensional obstacle such as a utility pole existing at the narrow position detected in step S202.

  Next, in step S204, the control unit 10 obtains the turning circle (for example, the minimum turning circle) that the rear wheel on the turning outer periphery follows when the host vehicle turns from the current position at a predetermined steering angle (for example, the steering angle when the steering wheel is fully turned), obtains a straight line that is tangent to this turning circle and passes through the avoidance point set in step S203, and calculates the position at which the turning outer peripheral side surface of the host vehicle coincides with this straight line as the steering angle change point.

  Next, in step S205, the control unit 10 calculates the straight trajectory lines of the host vehicle from the rudder angle change point until it passes through the narrow portion position, based on the straight line obtained in step S204.

  Next, the control unit 10 detects the output of the rudder angle sensor 3 in step S206, and in step S207 reads out from the turning trajectory storage unit 21 the pair of trajectory lines of the turning radius corresponding to the output detected in step S206 (the trajectory lines followed by the left and right front ends of the host vehicle), and uses them as the turning trajectory lines corresponding to the current steering angle of the host vehicle.

  Next, in step S208, the control unit 10 displays on the monitor 6 the straight trajectory lines calculated in step S205, the turning trajectory lines obtained in step S207, and the pole icons indicating the left and right front end positions when the host vehicle reaches the rudder angle change point, superimposed on the bird's-eye view image around the host vehicle generated in step S201.

  Next, in step S209, the control unit 10 outputs from the speaker 7 a guide voice instructing the driver to operate the steering until the turning trajectory lines on the image displayed on the monitor 6 intersect the straight trajectory lines at the positions of the pole icons. A message with the same content may be displayed on the monitor 6 instead of, or together with, the guide voice.

  Next, in step S210, the control unit 10 monitors whether the steering angle is being changed (whether the output of the steering angle sensor 3 is changing). While the steering angle is being changed, the process returns to step S206 and the processing from step S206 onward is repeated; when the change of the steering angle is completed (when there is no further change in the steering angle), the process proceeds to step S211.

  Next, in step S211, based on the output of the steering angle sensor 3, the control unit 10 checks whether the current steering angle of the host vehicle is the predetermined steering angle (for example, the steering angle when the steering wheel is fully turned) used as the reference for calculating the steering angle change point. If the current steering angle of the host vehicle is the predetermined steering angle, the control unit 10 outputs from the speaker 7, in step S212, a guide voice instructing the driver to advance the host vehicle while maintaining the current steering angle. In this case as well, a message with the same content may be displayed on the monitor 6 instead of, or together with, the guide voice.

  Next, in step S213, the control unit 10 calculates the movement amount of the host vehicle based on the output of the wheel speed sensor 4, and in step S214 determines, while monitoring the movement amount calculated in step S213, whether the host vehicle has reached the steering angle change point calculated in step S204. When the host vehicle reaches the steering angle change point, the turning trajectory lines corresponding to the current steering angle of the host vehicle are switched to non-display in step S215, and an image in which only the straight trajectory lines are superimposed on the bird's-eye view image is displayed on the monitor 6. Further, in step S216, the control unit 10 outputs from the speaker 7 a guide voice instructing the driver to stop the host vehicle and return the steering to the neutral position. In this case as well, a message with the same content may be displayed on the monitor 6 instead of, or together with, the guide voice.

  Next, in step S217, the control unit 10 determines, while monitoring the output of the steering angle sensor 3, whether the steering angle of the host vehicle has reached the steering angle of the neutral steering position. When it has, the control unit 10 outputs from the speaker 7, in step S218, a guide voice instructing the driver to advance the host vehicle while keeping the steering in the neutral position. In this case as well, a message with the same content may be displayed on the monitor 6 instead of, or together with, the guide voice.

  Next, in step S219, the control unit 10 calculates the movement amount of the host vehicle based on the output of the wheel speed sensor 4, and in step S220 determines, while monitoring the movement amount calculated in step S219, whether the host vehicle has reached the narrow portion position. When the host vehicle reaches the narrow portion position, the series of processes shown in the flowchart of FIG. 24 ends.

<Specific example of route calculation for passing through narrow part>
Next, a specific example of the route calculation method by the control unit 10 of the driving support apparatus of the present embodiment will be described, taking as an example the case shown in FIG. 25, in which a utility pole 303 stands ahead and to the left of the host vehicle on a narrow road bounded on the left and right by side walls 301 and 302.

  In the scene shown in FIG. 25, the control unit 10 calculates the route for passing through the narrow portion by the following method. First, the utility pole 303 located ahead and to the left of the current position V of the host vehicle is detected on a bird's-eye view image generated using the images captured by the in-vehicle cameras 1a to 1d, and the road position where the utility pole 303 exists is recognized as the narrow portion position.

  Next, as shown in FIG. 26, an avoidance point 304 is set at a position that allows a predetermined margin with respect to the utility pole 303 existing at the narrow portion position. The turning circle (for example, the minimum turning circle) 305 that the rear wheel on the turning outer periphery follows when the host vehicle turns from the current position V at a predetermined steering angle (for example, the steering angle when the steering wheel is fully turned to the right) is then obtained, and a straight line 306 that is tangent to the turning circle 305 and passes through the avoidance point 304 is obtained.

  Next, as shown in FIG. 27, in the process of turning the vehicle along a turning circle, a position where the side surface portion on the turning outer peripheral side coincides with the straight line 306 is obtained, and this position is set as a steering angle change point V11. Also, the position 307 of the left front end of the host vehicle and the position 308 of the right front end of the host vehicle when the host vehicle is positioned at the steering angle change point V11 are obtained.

  Next, as shown in FIG. 28, a line segment 309 of predetermined length is obtained that forms part of the straight line 306, starting from the position 307 of the left front end of the host vehicle when the host vehicle is located at the steering angle change point V11 and extending toward the narrow portion position. Further, a line segment 310 of predetermined length is obtained that is parallel to the line segment 309 and separated from it by a distance corresponding to the vehicle width, starting from the position 308 of the right front end of the host vehicle when the host vehicle is located at the steering angle change point V11 and extending toward the narrow portion position. This pair of line segments 309 and 310 is used as the straight trajectory lines of the host vehicle from the steering angle change point V11 to the narrow portion position.
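For illustration, the second segment can be obtained from the first by offsetting the start point along the unit normal of the tangent direction by the vehicle width. The sketch below shows that construction; which side the offset goes to depends on the turning direction, and all names are hypothetical.

```python
import numpy as np

def straight_trajectory_lines(p_left_front, tangent_direction, length, vehicle_width):
    """Build the pair of straight trajectory line segments 309 and 310: one along
    the tangent direction from the left front end position, the other parallel to
    it and offset by the vehicle width."""
    d = np.asarray(tangent_direction, dtype=float)
    d = d / np.linalg.norm(d)
    n = np.array([d[1], -d[0]])                      # one of the two unit normals
    a0 = np.asarray(p_left_front, dtype=float)
    seg_309 = (a0, a0 + length * d)
    b0 = a0 + vehicle_width * n                      # start of the opposite-side segment
    seg_310 = (b0, b0 + length * d)
    return seg_309, seg_310
```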

  Next, as shown in FIG. 29, the pair of trajectories 311 and 312 that the left and right front ends of the host vehicle follow when the host vehicle turns while maintaining the current steering angle is obtained, and these are used as the turning trajectory lines corresponding to the current steering angle of the host vehicle.

  When the control unit 10 has calculated the straight trajectory lines 309 and 310 and the turning trajectory lines 311 and 312 as described above, it displays them on the monitor 6 superimposed on the bird's-eye view image, together with the pole icons indicating the positions 307 and 308 of the left and right front ends of the host vehicle when the host vehicle is located at the steering angle change point V11, as shown in FIG. 30. With the support image of FIG. 30 displayed on the monitor 6, the driver is then instructed, by a guide voice output from the speaker 7, to operate the steering until the turning trajectory lines 311 and 312 on the displayed image intersect the straight trajectory lines 309 and 310 at the positions of the pole icons.

  Here, when the driver of the host vehicle operates the steering, the turning trajectory lines 311 and 312 superimposed on the bird's-eye view image change in conjunction with the steering, and when the steering is operated up to the predetermined steering angle (for example, a full turn), the turning trajectory lines 311 and 312 intersect the straight trajectory lines 309 and 310 at the positions of the pole icons, as shown in FIG. 31. The route connecting the turning trajectory lines 311 and 312 and the straight trajectory lines 309 and 310 at this time is the optimum route for passing through the narrow portion.

<Other display forms of support images>
By the way, in the above description, the control unit 10 generates a bird's-eye view image of the surroundings of the host vehicle using the images captured by the in-vehicle cameras 1a to 1d, and displays on the monitor 6 as a support image an image in which the straight trajectory lines 309 and 310, the turning trajectory lines 311 and 312, and the pole icons are superimposed on this bird's-eye view image. From the viewpoint of making the association with the actual scenery easier, however, as in the first embodiment, an image close to the actual scenery ahead of the host vehicle as seen from the driver's viewpoint position may be used as the base image, and a support image in which the straight trajectory lines 309 and 310, the turning trajectory lines 311 and 312, and the pole icon P are superimposed on this image may be displayed on the monitor 6, as shown in FIGS. 32 and 33. The image example of FIG. 32 is the support image displayed on the monitor 6 while the host vehicle is traveling before the steering angle change point, and the image example of FIG. 33 is the support image displayed on the monitor 6 while the host vehicle is traveling at a position beyond the steering angle change point.

  By checking the support image shown in FIG. 32 on the monitor 6, the driver of the host vehicle can grasp, in association with the actual scenery, the position (the steering angle change point) at which the steering operation for switching from traveling along the turning trajectory lines 311 and 312 while maintaining the predetermined steering angle to traveling straight ahead should be performed. This makes it easy to grasp the sense of distance to the steering angle change point and prevents the steering operation from being delayed. After the host vehicle has reached the steering angle change point, the turning trajectory lines 311 and 312 lie behind the host vehicle, so they are no longer displayed and the monitor 6 shows only the straight trajectory lines 309 and 310, as shown in FIG. 33. By hiding unnecessary information in this way, the driver can concentrate on the steering needed to align the front end of the hood of the host vehicle with the left and right straight trajectory lines 309 and 310, and it becomes easy to trace the displayed trajectory.

  In order to display the support images shown in FIGS. 32 and 33 on the monitor 6, the in-vehicle camera 1 may be installed, for example, at the installation position and camera optical axis angle illustrated in FIG. 22 so as to capture an image of the area ahead of the host vehicle, including the hood tip F, from an upper vehicle body position close to the viewpoint position of the driver, and the straight trajectory lines 309 and 310, the turning trajectory lines 311 and 312, and the pole icon P may be converted into an image seen from the viewpoint position of the in-vehicle camera 1 and superimposed on the image ahead of the host vehicle captured by the in-vehicle camera 1.

<Effects of Second Embodiment>
As described above in detail with specific examples, according to the travel support device of the present embodiment, a series of routes consisting of the turning trajectory, the steering angle change point, and the straight trajectory is calculated, and route information indicating this route, such as the turning trajectory lines and the straight trajectory lines, is displayed on the monitor 6, so the driver can accurately recognize the optimum route for passing through the narrow portion, and even a beginner unfamiliar with driving can be supported so as to pass through the narrow portion with confidence.

  Further, according to the driving support device of the present embodiment, the pole icons indicating the left and right front end positions of the host vehicle when it reaches the steering angle change point are displayed on the monitor 6 as part of the route information, which makes it easier for the driver to perform the steering operation that brings the turning trajectory corresponding to the steering angle of the host vehicle into alignment with the straight trajectory.

  In addition, according to the travel support device of the present embodiment, after the host vehicle starts moving, its traveling position is monitored: while the traveling position of the host vehicle is before the steering angle change point, both the turning trajectory lines and the straight trajectory lines are displayed on the monitor 6, and once the traveling position of the host vehicle passes the steering angle change point, only the straight trajectory lines are displayed on the monitor 6. Unnecessary display is thus suppressed so that the driver can concentrate on the driving operation.

  Further, according to the driving support device of the present embodiment, the route information indicating the narrow portion passage route, such as the turning trajectory lines, the straight trajectory lines, and the pole icons, is fixed to the environment side, and the display position of the route information on the display screen of the monitor 6 is updated according to the movement amount of the host vehicle, so the route information can be displayed correctly on the monitor 6 even after the host vehicle starts moving.

  Moreover, according to the driving support device of this embodiment, a bird's-eye view image looking down on the surroundings of the host vehicle from directly above is generated using the images captured by the in-vehicle cameras 1a to 1d, and an image in which route information such as the turning trajectory lines, the straight trajectory lines, and the pole icons is superimposed on this bird's-eye view image is displayed on the monitor 6 as a support image, so the driver can recognize the route for passing through the narrow portion in an overall manner. Alternatively, an image of the area ahead of the host vehicle including the left and right front end portions of the hood may be captured by the in-vehicle camera 1 from an upper vehicle body position close to the driver's viewpoint, and the above route information superimposed on this image may be displayed on the monitor 6 as a support image, in which case the driver can recognize the route for passing through the narrow portion in association with the actual scenery.

  As described above, the first embodiment and the second embodiment have been described as specific embodiments of the present invention. However, each of these embodiments is merely an example of application of the present invention, and the technical scope of the invention is not intended to be limited to the content disclosed as these embodiments. That is, the technical scope of the present invention is not limited to the specific technical matters disclosed in the above embodiments, but includes various modifications, changes, and alternative techniques that can easily be derived from this disclosure.

1 (1a-1d) In-vehicle camera (imaging means)
3 Rudder angle sensor (steering angle detection means)
6 Monitor (display means)
10 Control unit
11 Viewpoint conversion part
12 Overhead image generation part (overhead image generation means)
13 Image processing unit
14 Narrow part position detection unit (narrow part position detection means)
15 Intermediate point calculation part (steering angle change point calculation means)
16 Steering interlocking locus calculation unit (first locus calculating means)
17 Turning outer periphery side front end locus calculation unit (outer periphery end locus calculation means)
18 Turning inner circumference side rear wheel locus calculation unit (inner circumference rear wheel locus calculation means)
19 Trajectory reversal calculation unit (second trajectory calculation means)
20 Traveling area calculation unit (traveling area calculation means)
24 Movement amount calculation unit (movement amount detection means)
25 Image update unit
26 Image composition unit
31 Rudder angle change point calculation unit (steering angle change point calculation means)
32 Straight locus line calculation unit (second locus calculation means)
33 Turning locus line calculation unit (first locus calculating means)
34 Self-vehicle position determination unit (determination means)
35 Display controller

Claims (11)

  1. Narrow part position detecting means for detecting a narrow part position where the road width in front of the host vehicle is narrow;
    Rudder angle detection means for detecting the rudder angle of the host vehicle;
    First trajectory calculating means for calculating a first trajectory when the host vehicle travels from the current position while maintaining the current steering angle;
    Rudder angle change point calculating means for calculating a rudder angle change point for changing the rudder angle by stopping the host vehicle before the narrow part position;
    Second trajectory calculation means for calculating a second trajectory when the host vehicle travels while maintaining the rudder angle after changing from the rudder angle change point to the rudder angle change point;
    Display means for displaying route information indicating a route of the narrow portion passing through the first locus, the steering angle change point, and the second locus ;
    The rudder angle change point calculating means calculates an intermediate point from the current position of the host vehicle to the narrow portion position as the rudder angle change point,
    The second trajectory calculating means calculates a trajectory obtained by rotating the first trajectory 180 degrees around the intermediate point as the second trajectory,
    Furthermore,
    outer peripheral end locus calculating means is provided for calculating a locus of the turning outer peripheral side front end when the host vehicle travels along the first locus and the second locus, the outer peripheral end locus calculating means calculating, as the trajectory of the turning outer peripheral front end corresponding to the first trajectory, the trajectory that the turning outer peripheral front end follows when the host vehicle travels from the current position while maintaining the current steering angle, and calculating, as the trajectory of the turning outer peripheral front end corresponding to the second trajectory, a trajectory obtained by reversing that trajectory with respect to the vehicle longitudinal axis at the current position, rotating it in the turning direction by the amount of change in the vehicle angle at the intermediate point, and translating it, and
    the display means displays the trajectory of the turning outer peripheral side front end as one piece of the route information.
  2. Narrow part position detecting means for detecting a narrow part position where the road width in front of the host vehicle is narrow;
    Rudder angle detection means for detecting the rudder angle of the host vehicle;
    First trajectory calculating means for calculating a first trajectory when the host vehicle travels from the current position while maintaining the current steering angle;
    Rudder angle change point calculating means for calculating a rudder angle change point for changing the rudder angle by stopping the host vehicle before the narrow part position;
    Second trajectory calculation means for calculating a second trajectory when the host vehicle travels while maintaining the rudder angle after changing from the rudder angle change point to the rudder angle change point;
    Display means for displaying route information indicating a route of the narrow portion passing through the first locus, the steering angle change point, and the second locus;
    The rudder angle change point calculating means calculates an intermediate point from the current position of the host vehicle to the narrow portion position as the rudder angle change point,
    The second trajectory calculating means calculates a trajectory obtained by rotating the first trajectory 180 degrees around the intermediate point as the second trajectory,
    Furthermore,
    An outer peripheral end locus calculating means for calculating a trajectory of the front end portion of the turning outer periphery when the host vehicle travels along the first locus and the second locus;
    An inner peripheral rear wheel trajectory calculating means for calculating a trajectory of the turning inner peripheral rear wheel when the host vehicle travels along the first trajectory and the second trajectory;
    Based on the trajectory of the turning outer peripheral side front end, the trajectory of the turning inner peripheral rear wheel, and the size of the host vehicle, the host vehicle travels along the first trajectory and the second trajectory. A travel area calculation means for calculating a travel area that the entire vehicle body follows,
    The display means displays the travel area as one of the route information,
    The outer peripheral end locus calculating means calculates, as the trajectory of the turning outer peripheral side front end corresponding to the first trajectory, the trajectory that the turning outer peripheral front end follows when the host vehicle advances from the current position while maintaining the current steering angle, and calculates, as the trajectory of the turning outer peripheral side front end corresponding to the second trajectory, a trajectory obtained by reversing that trajectory with respect to the vehicle longitudinal axis at the current position, rotating it in the turning direction by the amount of change in the vehicle angle at the intermediate point, and translating it in parallel.
  3. Narrow part position detecting means for detecting a narrow part position where the road width in front of the host vehicle is narrow;
    Rudder angle detection means for detecting the rudder angle of the host vehicle;
    First trajectory calculating means for calculating a first trajectory when the host vehicle travels from the current position while maintaining the current steering angle;
    Rudder angle change point calculating means for calculating a rudder angle change point for changing the rudder angle by stopping the host vehicle before the narrow part position;
    Second trajectory calculation means for calculating a second trajectory when the host vehicle travels while maintaining the rudder angle after changing from the rudder angle change point to the rudder angle change point;
    Display means for displaying route information indicating a route of the narrow portion passing through the first locus, the steering angle change point, and the second locus;
    The rudder angle change point calculating means calculates an intermediate point from the current position of the host vehicle to the narrow portion position as the rudder angle change point,
    The second trajectory calculating means calculates a trajectory obtained by rotating the first trajectory 180 degrees around the intermediate point as the second trajectory,
    Furthermore,
    inner peripheral rear wheel trajectory calculating means is provided for calculating a trajectory of the turning inner peripheral rear wheel when the host vehicle travels along the first trajectory and the second trajectory, the inner peripheral rear wheel trajectory calculating means calculating, as the trajectory of the turning inner peripheral rear wheel corresponding to the first trajectory, the trajectory that the turning inner peripheral rear wheel follows when the host vehicle travels from the current position while maintaining the current rudder angle, and calculating, as the trajectory of the turning inner peripheral rear wheel corresponding to the second trajectory, a trajectory obtained by reversing that trajectory with respect to the vehicle longitudinal axis at the current position, rotating it in the turning direction by the amount of change in the vehicle angle at the intermediate point, and translating it in parallel, and
    the display means displays the trajectory of the turning inner peripheral side rear wheel as one piece of the route information.
  4. An outer peripheral end locus calculating means for calculating a trajectory of the front end portion of the turning outer periphery when the host vehicle travels along the first locus and the second locus;
    The travel support apparatus according to claim 3 , wherein the display unit displays a trajectory of the front end of the turning outer periphery as one of the route information.
  5. Narrow part position detecting means for detecting a narrow part position where the road width in front of the host vehicle is narrow;
    Rudder angle detection means for detecting the rudder angle of the host vehicle;
    First trajectory calculating means for calculating a first trajectory when the host vehicle travels from the current position while maintaining the current steering angle;
    Rudder angle change point calculating means for calculating a rudder angle change point for changing the rudder angle by stopping the host vehicle before the narrow part position;
    Second trajectory calculation means for calculating a second trajectory when the host vehicle travels while maintaining the rudder angle after changing from the rudder angle change point to the rudder angle change point;
    Display means for displaying route information indicating a route of the narrow portion passing through the first locus, the steering angle change point, and the second locus;
    The rudder angle change point calculating means touches a turning circle followed by the rear wheel on the outer periphery of the turning when the host vehicle turns at a predetermined rudder angle from the current position, and an avoidance point that is set based on the narrow portion position. The position where the turning outer peripheral side surface of the host vehicle matches the straight line passing through is calculated as the rudder angle change point,
    The second trajectory calculating means calculates a pair of straight trajectories consisting of the straight line and a straight line parallel to the straight line and separated by the vehicle width as the second trajectory,
    The display means displays a trajectory line indicating the first trajectory and a trajectory line indicating the second trajectory as the route information .
  6. The driving support device according to claim 5, wherein the display means displays an icon indicating the positions of the left and right front end portions of the host vehicle at the steering angle change point together with the trajectory line indicating the first trajectory and the trajectory line indicating the second trajectory.
  7. A determination means for determining whether the traveling position of the host vehicle is a position in front of the rudder angle change point or a position beyond the rudder angle change point;
    The travel support apparatus according to claim 5, wherein the display means displays both the trajectory line indicating the first trajectory and the trajectory line indicating the second trajectory when the host vehicle is traveling before the rudder angle change point, and displays only the trajectory line indicating the second trajectory when the host vehicle is traveling at a position beyond the rudder angle change point.
  8. It further comprises movement amount detection means for detecting the movement amount of the host vehicle,
    The travel support apparatus according to any one of claims 1 to 7, wherein the display unit updates a display position of the route information on a display screen according to a movement amount of the host vehicle .
  9. It further comprises imaging means for imaging an image around the host vehicle,
    The display means superimposes and displays the route information on an image picked up by the image pickup means or an image obtained by image processing of an image picked up by the image pickup means. The travel support device according to any one of claims 1 to 8 .
  10. Further comprising an overhead image generation means for converting an image captured by the imaging means into an overhead image looking down from a virtual viewpoint above the host vehicle;
    The travel support apparatus according to claim 9 , wherein the display unit displays the route information superimposed on the overhead image .
  11. The imaging means captures an image of the front of the host vehicle including the left and right front end portions of the host vehicle hood from the upper body position close to the viewpoint position of the driver of the host vehicle;
    The travel support apparatus according to claim 9 , wherein the display unit displays the route information superimposed on an image captured by the imaging unit.




Legal Events

Date / Code / Description
2012-05-25  A621  Written request for application examination
2013-09-30  A977  Report on retrieval
2013-10-08  A131  Notification of reasons for refusal
2013-11-28  A521  Written amendment
(no date listed)  TRDD  Decision of grant or rejection written
2014-05-07  A01  Written decision to grant a patent or to grant a registration (utility model)
2014-05-20  A61  First payment of annual fees (during grant procedure)