WO2016104198A1 - Vehicle control device - Google Patents

Vehicle control device

Info

Publication number
WO2016104198A1
WO2016104198A1 PCT/JP2015/084845
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
host vehicle
priority
moving body
blind spot
Prior art date
Application number
PCT/JP2015/084845
Other languages
French (fr)
Japanese (ja)
Inventor
今井 正人
雅男 坂田
Original Assignee
クラリオン株式会社 (Clarion Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by クラリオン株式会社 (Clarion Co., Ltd.)
Publication of WO2016104198A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • the present invention relates to a vehicle control device.
  • A blind spot is detected from an image of the area ahead of the host vehicle, and, assuming that an object exists in the blind spot, the movement of the object when it pops out is predicted.
  • An object of the present invention is to provide a vehicle control device that can appropriately control a vehicle according to the situation when the host vehicle travels where blind spots exist.
  • The vehicle control device detects a blind spot area that is a blind spot for the host vehicle, determines the relative priority between the course of a moving body that may appear from the blind spot area and the course of the host vehicle, and outputs a control signal for the host vehicle based on the determined priority.
  • When the host vehicle travels where blind spots exist, the vehicle can thus be appropriately controlled according to the situation.
  • FIG. 1 is a diagram showing an embodiment of a vehicle control device according to the present invention.
  • FIG. 2 is a diagram illustrating an example of a priority determination table for left-hand traffic.
  • FIG. 3 is a diagram illustrating an example of a priority determination table for right-hand traffic.
  • FIG. 4 is a diagram illustrating a target route at a crossroad intersection.
  • FIG. 5 is a diagram illustrating blind spot detection at a crossroad intersection.
  • FIG. 6 is a diagram illustrating the position of the host vehicle at which the blind spot is eliminated at the crossroad intersection.
  • FIG. 7 is a diagram showing feature points on the target route when calculating the target speed at the crossroad intersection.
  • FIG. 8 is a diagram illustrating a calculation example of the target speed at the crossroad intersection.
  • FIG. 9 is a diagram illustrating an example in which the host vehicle 700 performs driving support in consideration of a blind spot on a side road while driving on a priority road.
  • FIG. 10 is a diagram illustrating an example in which traveling support is performed in consideration of the dead angle of a stopped vehicle on the oncoming lane when the host vehicle 800 approaches a pedestrian crossing.
  • FIG. 11 is a flowchart for explaining the operation of the control device 100a.
  • FIG. 12 is a flowchart illustrating a driving support content determination process based on priority.
  • FIG. 1 is a block diagram showing a control device 100a as a vehicle control device and its peripheral devices.
  • The control device 100a illustrated in FIG. 1 is a computer that controls the host vehicle; by executing a program stored in a storage medium (not shown), it functions as a surrounding environment recognition unit 1, a blind spot detection unit 2, a road information acquisition unit 3, a target route generation unit 4, a priority determination unit 5, a moving body course prediction unit 6, and a vehicle control unit 7.
  • The control device 100a is connected to the steering device 102, the driving device 103, and the braking device 104 of the host vehicle, as well as to the external environment recognition device 101, the sound generator 105, the display device 106, and the automatic driving button 107 provided in the host vehicle.
  • the control device 100a is connected to a CAN (not shown) of the host vehicle, and vehicle information such as the vehicle speed, the steering angle, and the yaw rate of the host vehicle is input via the CAN.
  • CAN: Controller Area Network
  • The external environment recognition device 101 acquires information related to the surrounding environment of the host vehicle; it is, for example, four in-vehicle cameras that respectively capture the surroundings ahead of, behind, to the right of, and to the left of the host vehicle.
  • Image data obtained by the external environment recognition device 101 is input to the control device 100a via a dedicated line or the like, either as analog data or after A/D conversion.
  • As the external environment recognition device 101, a radar that measures the distance to an object using millimeter waves or a laser, or a sonar that measures the distance to an object using ultrasonic waves, can also be used.
  • The external environment recognition device 101 outputs information such as the obtained distance to an object and the direction of the object to the control device 100a via a dedicated line or the like.
  • the steering device 102 is configured by an electric power steering, a hydraulic power steering, or the like that can control the steering angle by an electric or hydraulic actuator or the like according to a drive command of the control device 100a.
  • the drive device 103 is an engine system capable of controlling the engine torque with an electric throttle or the like according to a drive command from the control device 100a, or an electric powertrain system capable of controlling the drive force according to a drive command from the control device 100a.
  • the braking device 104 is configured by an electric brake, a hydraulic brake, or the like that can control a braking force by an electric or hydraulic actuator or the like according to a drive command of the control device 100a.
  • the sound generator 105 includes a speaker or the like, and is used to output a warning or voice guidance to the driver.
  • the display device 106 includes a display such as a navigation device, a meter panel, a warning light, and the like. In addition to the operation screen of the control device 100a, the display device 106 displays a warning screen or the like that visually informs the driver that the vehicle is in danger of colliding with an obstacle.
  • the automatic operation button 107 is an operation member provided at a position where the driver can operate, and outputs a start signal for starting the operation of the control device 100a to the control device 100a based on the operation of the driver.
  • a mode change switch may be provided in the automatic operation button 107 so that a plurality of travel modes such as an automatic parking mode, a general road automatic operation mode, and an expressway automatic operation mode can be switched. Further, the operation of the control device 100a may be terminated by operating the automatic operation button 107 while the operation of the control device 100a is being performed.
  • the automatic operation button 107 is installed as a switch in a place where the driver can easily operate such as around the steering.
  • a button corresponding to the automatic operation button 107 may be displayed on the display device 106 so that the driver can operate it.
  • the image data obtained by the external environment recognition device 101 is input to the control device 100a.
  • The surrounding environment recognition unit 1 uses the image data input from the external environment recognition device 101 to detect the shape and position of objects around the host vehicle, such as stationary solid objects, moving bodies, road surface paint such as lane markings, and signs. It also has a function of determining whether the road surface is one on which the vehicle can travel, by detecting unevenness of the road surface.
  • the stationary solid object is, for example, a parked vehicle, a wall, a guardrail, a pole, a pylon, a curb, a roadside tree, a car stop, and the like.
  • the moving body is, for example, a pedestrian, a bicycle, a motorcycle, or a vehicle.
  • the stationary solid object and the moving object are collectively referred to as an obstacle.
  • the shape and position of the object are detected using a pattern matching technique or other known techniques.
  • the position of the object is expressed, for example, using a coordinate system having an origin at the position of the in-vehicle camera that captures the front of the host vehicle.
  • Based on the information on the shape and position of detected objects and the determination result of whether the road surface is drivable, the surrounding environment recognition unit 1 detects, for example when driving on a general road, the lane position where the vehicle can travel and a space where the vehicle can turn at an intersection. In the case of a parking lot, it detects a space where the host vehicle can park, and the like.
  • the road information acquisition unit 3 acquires map data around the current vehicle position.
  • The acquired map data includes shape data close to the actual road shape expressed by polygons, polylines, and the like; traffic regulation information (speed limit, types of vehicles permitted to pass, etc.); lane classification (main line, overtaking lane, uphill lane, straight lane, left-turn lane, right-turn lane, etc.); and the presence or absence of traffic lights, signs, and so on (with position information where present).
  • The target route generation unit 4 generates a route for moving the host vehicle from its current position to the target position, and calculates the target speed for traveling on the generated route using information such as the speed limit in the map data, the curvature of the route, traffic lights, temporary stop positions, and the speed of the preceding vehicle. For example, when driving on a general road, the destination is set using a navigation device having map data, and a route is generated from information such as the positional relationship between the host vehicle and obstacles while driving toward the destination and the lane position. In the case of a parking lot, a target parking position for the host vehicle is set in a parking space based on the positional relationship between the host vehicle and obstacles, and a route to that position is generated.
  • the blind spot detection unit 2 detects an area that is a blind spot from the host vehicle using the position information of the obstacle recognized by the surrounding environment recognition unit 1 and the map data acquired by the road information acquisition unit 3. Due to the characteristics of the external environment recognition device 101, it is necessary to consider an obstacle that may be present in the blind spot when the back side of the obstacle recognized by the surrounding environment recognition unit 1 becomes a blind spot from the host vehicle. Here, in particular, a blind spot where an obstacle that obstructs the course of the host vehicle may exist is extracted.
  • the moving body course prediction unit 6 estimates a moving body (obstacle or blind spot object) that may jump out from the blind spot detected by the blind spot detection unit 2, and predicts the course. For example, if the detected blind spot is a road, the vehicle is estimated. If a part of the pedestrian crossing is a blind spot, it is estimated that a pedestrian or a bicycle jumps out, and the course is predicted.
  • the course of the mobile body can be rephrased as the planned travel path of the mobile body.
  • A priority can be associated with the course of a moving body, that is, with its planned travel path. The priority is determined from the priority of the moving body itself (the moving-body-specific priority) and the priority of the path on which it moves (the moving-path-specific priority). In some cases the moving-body-specific priority is dominant and alone determines the priority of the course; in other cases the moving-path-specific priority must be considered in addition to the moving-body-specific priority. For example, in the relationship between a pedestrian and the host vehicle, the moving-body-specific priority is dominant, so the route priority is always higher for the pedestrian than for the host vehicle. In contrast, since the host vehicle and another vehicle are both vehicles, the priority of their courses cannot be determined from the moving-body-specific priority alone, and the priorities of the roads on which the host vehicle and the other vehicle are located must be considered.
  • The estimation of moving bodies that may jump out of a blind spot is performed based on the position of the host vehicle obtained using GPS or the like, the map data, and information from the surrounding environment recognition unit 1. For example, when a pedestrian crossing is detected, the control device 100a estimates that a pedestrian or a bicycle may exist; when an intersection is detected, it estimates the presence of a vehicle in addition to pedestrians and bicycles.
  • The priority determination unit 5 determines which route has priority when the route of the host vehicle generated by the target route generation unit 4 and the course of the moving body predicted by the moving body course prediction unit 6 intersect.
  • FIG. 2 shows an example of a priority determination table used by the priority determination unit 5.
  • the priority determination table of FIG. 2 is a case of left-hand traffic, and shows the priority of the route of the opponent vehicle (horizontal axis) with respect to the route of the own vehicle (vertical axis).
  • FIG. 2A is a priority determination table in a case where the host vehicle and the opponent vehicle pass the same road in the opposite direction and enter the intersection.
  • FIG. 2B is a priority determination table when an opponent vehicle enters the intersection from the intersection road on the right side of the own vehicle at an intersection where there is no traffic signal.
  • FIG. 2C is a priority determination table in a case where an opponent vehicle enters an intersection on the right intersection road with respect to the own vehicle at an intersection with a traffic light.
  • FIG. 2D is a priority determination table when the opponent vehicle enters the intersection from the intersection road on the left side with respect to the own vehicle at the intersection where there is no traffic signal.
  • FIG. 2E is a priority determination table in a case where an opponent vehicle enters an intersection on the left intersection road with respect to the own vehicle at an intersection with a traffic light.
  • FIG. 2A shows a case where the host vehicle and the opponent vehicle enter the intersection while passing the same road in the opposite direction.
  • In some combinations of courses, the priority of the host vehicle becomes higher than that of the opponent vehicle.
  • Where the courses do not cross, no priority relationship is established, and the corresponding cell of the priority determination table is marked as having no relation.
  • When the host vehicle and the other vehicle pass the same road in opposite directions, the priority is determined only by the priority of the road (that is, the moving path), in other words by the moving-path-specific priority.
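As an illustration of how such a table-driven judgment might be implemented, the following sketch encodes a few course combinations for the FIG. 2(a) scene (host and opponent vehicle entering the intersection from the same road in opposite directions, left-hand traffic). The table entries, function name, and course labels are hypothetical assumptions, not values taken from the patent's tables.

```python
# Hypothetical priority determination table for the FIG. 2(a) scene.
# All entries are illustrative assumptions, not the patent's actual values.
PRIORITY_TABLE_OPPOSING = {
    # (host course, opponent course) -> route with priority
    ("straight", "straight"): None,        # courses do not cross: no relation
    ("straight", "right"):    "host",      # opponent's right turn must yield
    ("right",    "straight"): "opponent",  # host's right turn must yield
    ("left",     "straight"): None,        # assumed non-crossing in this sketch
}

def determine_priority(host_course, opponent_course):
    """Return 'host', 'opponent', or None when no priority relation holds."""
    return PRIORITY_TABLE_OPPOSING.get((host_course, opponent_course))
```

Separate tables would then be looked up per scene (same road, right-side crossing road, left-side crossing road, with or without traffic lights) and per traffic rule (left-hand or right-hand).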
  • FIG. 3 shows a priority determination table in the case of right-hand traffic.
  • The scene settings in FIGS. 3A to 3E are the same as those in FIGS. 2A to 2E. In the scene of FIG. 3A, that is, when the host vehicle and the opponent vehicle pass the same road in opposite directions and enter the intersection, the priority determination table is the same as for left-hand traffic.
  • Details of FIGS. 3B to 3E are omitted, but they differ in part from the corresponding left-hand-traffic tables of FIGS. 2B to 2E. Specifically, in FIGS. 3(b) to 3(e), the determination contents differ for the cells indicated by shading or a gray background.
  • map data and the road traffic law can be stored in a navigation device mounted on the vehicle.
  • In a communication navigation system in which the in-vehicle navigation device communicates with a data center to perform navigation, a method of using the latest map data and road traffic law held by the data center is conceivable.
  • the vehicle control unit 7 controls the host vehicle along the target route generated by the target route generation unit 4.
  • The vehicle control unit 7 calculates a target steering angle and a target speed based on the target route. The target steering angle and target speed are calculated so that the host vehicle does not collide with obstacles; where necessary, the target speed is changed or a command value for the target brake pressure is calculated.
  • the vehicle control unit 7 then outputs a target steering torque for realizing the target steering angle to the steering device 102. Further, the vehicle control unit 7 outputs a target engine torque and a target brake pressure for realizing the target speed to the driving device 103 and the braking device 104.
  • The driving support content and the like are output to the sound generator 105 and the display device 106.
  • the surrounding environment recognition unit 1 recognizes the oncoming vehicle 201, the pedestrian 204, and the pedestrian 205 by a known method (pattern matching method or the like) using the image data input from the external environment recognition device 101. Then, the target route generation unit 4 calculates a route 210 for turning right at the intersection so as not to collide with these obstacles recognized by the surrounding environment recognition unit 1.
  • The route 210 calculated by the target route generation unit 4 can be considered as divided into straight sections and a turning section: for example, a straight section until entering the intersection, a turning section within the intersection, and a straight section after passing through the intersection.
  • the target route generation unit 4 represents a straight section route by a straight line, and approximates a turn section route by combining a clothoid curve and an arc.
  • The clothoid curve represents the trajectory drawn by the host vehicle when the speed of the host vehicle 200 is constant and its steering angle is changed at a constant angular velocity.
  • The circular arc represents the trajectory drawn by the host vehicle when it is driven with the speed of the host vehicle 200 fixed and its steering angle held at a predetermined value (excluding the steering angle at which the vehicle goes straight).
  • Route calculation methods usable by the target route generation unit 4 also include approximation using spline curves, polynomials, and the like; the method is not limited to the above.
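The clothoid construction described above can be sketched numerically: at constant speed, changing the steering angle at a constant rate produces a linearly growing curvature, which is integrated to obtain the trajectory. The function name and all parameter values below are illustrative assumptions, not figures from the patent.

```python
import math

# Minimal sketch of generating a clothoid-like trajectory: at constant
# speed v, curvature grows at a constant rate (constant-rate steering
# change), and heading/position are integrated numerically.
def clothoid_points(v=5.0, curvature_rate=0.05, duration=4.0, dt=0.1):
    x = y = heading = curvature = 0.0
    points = [(x, y)]
    steps = int(duration / dt)
    for _ in range(steps):
        curvature += curvature_rate * dt   # steering changed at constant rate
        heading += v * curvature * dt      # d(theta) = v * kappa * dt
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        points.append((x, y))
    return points
```

Holding `curvature` fixed instead of incrementing it would yield the circular-arc segment; `curvature_rate = 0` with `curvature = 0` yields the straight sections.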
  • the target route generation unit 4 calculates the target route of the host vehicle and calculates the target speed when traveling on the target route.
  • The target speed of the vehicle is set to a slow speed within the intersection (for example, the section between pedestrian crossings where there are no lane markings) and to the speed limit in the other sections, and the changeover between the slow speed and the speed limit is connected so that acceleration and deceleration are smooth.
  • Blind spot detection by the blind spot detection unit 2 extracts blind spots in which an obstacle obstructing the course of the host vehicle may exist, and the moving body course prediction unit 6 predicts the course of an obstacle that may jump out of such a blind spot. The priority between the course of the host vehicle and that of the obstacle is then determined, and the driving support content is decided based on the priority.
  • The blind spot detection unit 2 acquires, via the road information acquisition unit 3, the map data around the intersection of FIG. 5, which is the map data around the host vehicle 200, and places the oncoming vehicle 201, vehicle 202, and vehicle 203 recognized by the surrounding environment recognition unit 1 on the acquired map data.
  • Due to the characteristics of the external environment recognition device 101, the far side of an obstacle recognized by the surrounding environment recognition unit 1 becomes a blind spot from the host vehicle. With the camera recognizing the area ahead of the host vehicle 200 installed at the center of its front end, a line 301 extends from the front center of the host vehicle 200 toward the right ends of the oncoming vehicle 201, vehicle 202, and vehicle 203, and a line 302 extends toward their left ends; the region 303 on the opposite lane beyond lines 301 and 302 is determined to be a blind spot and extracted.
  • As described above, by arranging the host vehicle and obstacles on the map data, it is possible to extract blind spots in which an obstacle that could obstruct the course of the host vehicle may exist.
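The line-of-sight construction of lines 301 and 302 can be sketched in a simple 2-D model: rays from the camera at the host vehicle's front center through an obstacle's left and right extremities bound the occluded region behind it. The function name, the planar geometry, and the fixed ray reach are assumptions for illustration.

```python
import math

# Sketch of the blind-spot boundary rays of FIG. 5: rays from the camera
# position through the two visible extremities of an obstacle delimit the
# occluded (blind-spot) region behind the obstacle.
def blind_spot_rays(camera, left_edge, right_edge, reach=50.0):
    """Return the far endpoints of the two rays bounding the blind spot."""
    ends = []
    for edge in (left_edge, right_edge):
        ang = math.atan2(edge[1] - camera[1], edge[0] - camera[0])
        ends.append((camera[0] + reach * math.cos(ang),
                     camera[1] + reach * math.sin(ang)))
    return ends
```

Intersecting the wedge between the two rays with the opposite-lane polygon from the map data would then give a region like 303.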
  • The moving body course prediction unit 6 selects the assumed moving body with the highest traveling speed.
  • Since the blind spot 303 detected by the blind spot detection unit 2 is on the opposite lane, the assumed moving bodies include four-wheeled vehicles (passenger cars, trucks, etc.), two-wheeled vehicles, bicycles, and so on.
  • The fastest of these, a four-wheeled vehicle, is selected as the moving body 304.
  • Among the routes that the moving body 304 can take, the fastest is selected as the moving body course; here, the straight path 305 is selected.
  • The route here also includes the meaning of actions (going straight, turning right, or turning left at an intersection; going straight on a priority road; going straight on a non-priority road; going straight on a single road; turning right or left off the road into a parking lot, etc.).
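The worst-case selection described above, choosing the fastest assumed moving body for a blind spot, might be sketched as follows; the speed values and names are illustrative assumptions, not figures from the patent.

```python
# Hypothetical assumed travel speeds (km/h) for moving-body types that may
# emerge from a blind spot; the fastest candidate is taken as the worst case.
ASSUMED_SPEEDS = {
    "four_wheeled": 60,  # passenger cars, trucks, etc.
    "two_wheeled": 50,
    "bicycle": 15,
    "pedestrian": 5,
}

def worst_case_mover(candidates):
    """Select the assumed moving body with the highest traveling speed."""
    return max(candidates, key=lambda m: ASSUMED_SPEEDS[m])
```

For the opposite-lane blind spot 303, the candidates would be vehicle types, so a four-wheeled vehicle is chosen; for a pedestrian-crossing blind spot, the candidates would be pedestrians and bicycles.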
  • The priority between the host vehicle's route (target route) and the moving body course can be determined by the priority determination unit 5 based on the map data acquired by the road information acquisition unit 3 and the road traffic law. For example, in the case of FIG. 5, the route of the host vehicle 200 (target route 210) turns right at the intersection while the route 305 of the moving body 304 goes straight through it, so the priority of the moving body 304 going straight through the intersection is determined to be higher.
  • The vehicle control unit 7 performs speed control that disregards the moving body when the priority of the host vehicle is higher than that of the moving body, and performs driving support control for avoiding a collision with the moving body when the priority of the host vehicle is lower.
  • Since the priority determination unit 5 determines that the priority of the moving body 304 is higher than that of the host vehicle 200, the vehicle control unit 7 performs driving support so that the host vehicle 200 avoids a collision with the moving body 304.
  • Since the speed of the host vehicle 200 only needs to be controlled so that it does not interfere with the path 305 of the moving body 304, the host vehicle 200 moves forward at a slow speed to the position shown in FIG. 6; when the blind spot 303 detected in the state of FIG. 5 disappears and it can be confirmed that there is no oncoming vehicle, the vehicle returns to its original speed and passes through the intersection.
  • FIG. 7 shows the same situation as that described with reference to FIGS. 4 to 6.
  • If no blind spot existed, the vehicle would pass the intersection at the target speed shown in FIG. 8(a); because the blind spot 303 exists, the target speed of FIG. 8(b) is calculated and speed control is performed.
  • In FIGS. 7 and 8, point B is the point where the blind spot is predicted to disappear when the host vehicle 200 is at point A, and point C is the point where the blind spot actually disappears.
  • The host vehicle decelerates to the intersection passing speed V1, and further decelerates toward point B to the speed V2 for avoiding a collision with the moving body 304.
  • When the blind spot disappears at point C, the vehicle accelerates again to the intersection passing speed V1, and accelerates once more after passing point D.
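The FIG. 8(b)-style speed plan might be sketched as a piecewise target-speed function over distance along the route: cruise speed before the intersection approach, V1 for passing the intersection, V2 while the blind spot persists up to point C, then V1 again until point D. All distances and speeds below are illustrative assumptions, and real control would blend the steps into smooth acceleration and deceleration.

```python
# Piecewise-constant sketch of a blind-spot-aware target speed profile.
# a..d are distances (m) of points A..D along the route; speeds in km/h.
def target_speed(s, v_cruise=50.0, v1=30.0, v2=10.0,
                 a=20.0, b=40.0, c=60.0, d=80.0):
    """Target speed at distance s along the route."""
    if s < a:
        return v_cruise    # before the intersection approach
    if s < b:
        return v1          # slowing to the intersection passing speed
    if s < c:
        return v2          # blind spot still present: creep speed
    if s < d:
        return v1          # blind spot cleared at point C
    return v_cruise        # past point D: resume the original speed
```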
  • The priority between the course of the host vehicle and the course of the moving body is determined based on the map data information, and the driving support content is decided based on the determined priority, so that driving support appropriate to the surrounding situation becomes possible and safety is improved.
  • In the same way, an area that becomes a blind spot due to an oncoming vehicle waiting to turn right can be extracted from the map data information, together with the course of the host vehicle (turning right at the intersection).
  • FIG. 9 is a scene in which the host vehicle 700 goes straight on the priority road, and a side road is connected to the left side in the traveling direction of the host vehicle 700. Note that a boundary 701 between the road on which the host vehicle 700 is traveling and a side road is blocked by a wall or a building, and the side road becomes a blind spot from the position of the host vehicle 700 in FIG.
  • the surrounding environment recognition unit 1 recognizes the boundary 701 by a known method (pattern matching method or the like) using the image data input from the external environment recognition device 101. Then, the target route generation unit 4 calculates the route 702 as a target route that travels in the center position of the lane because there is no obstacle on the travel lane of the host vehicle 700. Further, the target route generation unit 4 sets a road speed limit V1 as the target speed of the host vehicle.
  • The blind spot detection unit 2 acquires map data around the host vehicle 700 via the road information acquisition unit 3; using the information on the boundary 701 recognized by the surrounding environment recognition unit 1, and with the camera recognizing the area ahead of the host vehicle 700 installed at the center of its front end, it extracts the blind spot area 704 on the side road, bounded by the line 703 extending from the front-end center of the host vehicle 700 toward the point where the boundary 701 is interrupted.
  • The moving body course prediction unit 6 selects the four-wheeled vehicle 705 as a moving body that may jump out of the blind spot 704, and calculates the straight path 706 as the moving body course.
  • the priority determination unit 5 determines the priority of the own vehicle path (target path) and the moving body path.
  • The route of the host vehicle 700 (target route 702) goes straight on the priority road, and the route 706 of the moving body 705 goes straight on the non-priority road, so the priority of the host vehicle 700 going straight on the priority road is determined to be higher.
  • the driving support content based on the priority determined by the priority determination unit 5 is determined.
  • Since the priority determination unit 5 determines that the priority of the host vehicle 700 is higher than that of the moving body 705, the vehicle control unit 7 maintains the target speed; however, when the host vehicle reaches point A (the point where the blind spot 704 is extracted, or a predetermined distance before the side road in FIG. 9), the target brake hydraulic pressure is raised to B1. Doing so helps shorten the time until braking starts if the moving body 705 jumps out. When the vehicle reaches point C, where the blind spot disappears, the target brake hydraulic pressure is restored.
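The brake pre-fill behaviour can be sketched as a simple function of position along the route: between point A (where the blind spot is extracted) and point C (where it disappears), the target brake pressure is held at B1; elsewhere it stays at the base value. The positions, the B1 value, and the function name are assumptions for illustration.

```python
# Sketch of pre-pressurizing the brake while passing a blind spot: holding
# the target pressure at B1 between points A and C shortens the time to
# effective braking if a moving body emerges from the blind spot.
def target_brake_pressure(s, a=100.0, c=140.0, base=0.0, b1=0.5):
    """Target brake pressure (MPa, assumed) at distance s along the route."""
    return b1 if a <= s < c else base
```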
  • FIG. 10A is a scene in which the host vehicle 800 travels straight on a single road. A stop vehicle 801 exists in the oncoming lane, and a pedestrian crossing exists behind the stop vehicle 801.
  • the surrounding environment recognition unit 1 recognizes the stopped vehicle 801 by a known method (pattern matching method or the like) using the image data input from the external environment recognition device 101. Then, the target route generation unit 4 calculates a route 802 as a target route that travels in the lane center position because there is no obstacle on the travel lane of the host vehicle 800. Further, the target route generation unit 4 sets a road speed limit V1 as the target speed of the host vehicle.
  • The blind spot detection unit 2 acquires map data around the host vehicle 800 via the road information acquisition unit 3 and uses the information on the stopped vehicle 801 recognized by the surrounding environment recognition unit 1.
  • With the camera recognizing the area ahead of the host vehicle 800 installed at the center of its front end, the blind spot region 805 bounded by the lines 803 and 804 extending from the front-end center of the host vehicle 800 toward the left and right ends of the stopped vehicle 801 is extracted.
  • The moving body course prediction unit 6 selects the pedestrian 806 as a moving body that may jump out of the blind spot 805, and calculates the straight path 807 as the moving body course.
  • the priority determination unit 5 determines the priority of the own vehicle path (target path) and the moving body path.
  • Since the route of the host vehicle 800 (target route 802) goes straight on the priority road and the route 807 of the moving body 806 crosses the pedestrian crossing, it is determined that the priority of the moving body 806 is higher.
  • the driving support content based on the priority determined by the priority determination unit 5 is determined.
  • Since the priority determination unit 5 determines that the priority of the moving body 806 is higher than that of the host vehicle 800, the vehicle control unit 7 performs driving support to avoid a collision with the moving body 806. Specifically, deceleration control is first performed down to a speed V2 at which the vehicle can stop before the pedestrian crossing; then, as shown in FIG. 10(b), after the host vehicle 800 has advanced and it has been confirmed that there are no obstacles such as pedestrians around the pedestrian crossing, the original speed V1 is restored.
  • FIG. 11 is a flowchart showing an example of the processing procedure of the control device 100a.
  • When automatic driving is started by operating the automatic driving button 107, the control program shown in the flowchart of FIG. 11 is executed, and the process of FIG. 11 is repeated at predetermined time intervals (for example, every 0.01 seconds). When an operation for stopping automatic driving is performed (for example, pressing the automatic driving button 107 again), the repeated execution is stopped and automatic driving ends.
  • In step S1001, the control device 100a determines whether automatic driving is currently active, that is, whether it has been activated by operating the automatic driving button 107. If automatic driving is active, the control device 100a proceeds to step S1002; otherwise, it skips the subsequent series of processes and continues.
  • In step S1002, the control device 100a starts capturing image data from the external environment recognition device 101. Thereafter, image data is captured from the external environment recognition device 101 for each frame.
  • In step S1003, the control device 100a inputs the image data captured in step S1002 to the surrounding environment recognition unit 1, and detects the shape and position of objects around the host vehicle, such as stationary solid objects, moving bodies, road surface paint such as parking frame lines, and signs. In addition, based on the detected shape and position information and on the determination of whether the road surface is one on which the host vehicle can travel, it detects, for example, drivable lane positions and turnable spaces at intersections when traveling on a general road, and spaces where the host vehicle can be parked, available parking spaces, and the like in the case of a parking lot.
  • In step S1004, the control device 100a acquires map data around the current vehicle position.
  • The acquired map data includes shape data close to the actual road shape, expressed by polygons, polylines, and the like; traffic regulation information (speed limits, permitted vehicle types, etc.); lane classification (main line, overtaking lane, uphill lane, straight lane, left-turn lane, right-turn lane, etc.); and the presence or absence of traffic lights, signs, and the like (with position information where present).
  • In step S1005, the control device 100a generates a route for moving the host vehicle from the current host vehicle position to the target position, based on the information on the surrounding environment detected in step S1003 and the map data acquired in step S1004. It also calculates the target speed for traveling on the generated route, using information such as the speed limit from the map data, the curvature of the route, traffic lights, temporary stop positions, and the speed of the preceding vehicle. For example, when driving on a general road, the destination is set using a navigation device having map data, and a route is generated from information such as the positional relationship between the host vehicle and obstacles when traveling toward the destination and the lane position. In the case of a parking lot, a target parking position for the host vehicle is set within a parking space based on the positional relationship between the host vehicle and obstacles, and a route to that position is generated.
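One common way to realize the curvature-dependent part of the target-speed calculation in step S1005 is to cap the map speed limit by the speed at which lateral acceleration stays within a comfort bound. This is an illustrative sketch under that assumption; the heuristic and the parameter values are not specified in the patent.

```python
import math

def target_speed(speed_limit_mps, curvature_1pm, max_lat_accel=2.0):
    """Cap the map speed limit by the curve speed at which the lateral
    acceleration a_lat = v^2 * curvature stays below max_lat_accel."""
    if curvature_1pm <= 1e-9:
        return speed_limit_mps  # effectively straight: the limit applies as-is
    curve_speed = math.sqrt(max_lat_accel / curvature_1pm)
    return min(speed_limit_mps, curve_speed)
```

On a straight road the speed limit is returned unchanged; on a curve of radius 50 m (curvature 0.02 1/m) the sketch caps the speed near 10 m/s.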
  • In step S1006, the control device 100a detects areas that are blind spots from the host vehicle, based on the surrounding environment detected in step S1003 and the map data acquired in step S1004. Due to the characteristics of the external environment recognition device 101, the far side of an obstacle recognized by the surrounding environment recognition unit 1 is a blind spot from the host vehicle, so obstacles that may exist in that blind spot must be considered. Here, in particular, blind spots in which an obstacle that could obstruct the course of the host vehicle may exist are extracted.
  • In step S1007, the control device 100a estimates the type of moving body that may emerge from the blind spot detected in step S1006 and predicts its course. For example, if the detected blind spot is on a road, a vehicle is assumed; if part of a pedestrian crossing is in the blind spot, a pedestrian or bicycle is assumed to emerge, and its course is predicted.
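The estimation in step S1007 is essentially a mapping from the context of the blind spot to the types of moving body assumed to emerge from it. A minimal sketch of that mapping follows; the context labels are illustrative, not identifiers from the patent.

```python
def assumed_movers(blind_spot_context):
    """Map the blind spot context to the moving-body types assumed to
    emerge: a road implies a vehicle; a pedestrian crossing implies a
    pedestrian or bicycle; an intersection implies all of these."""
    table = {
        "road": ["vehicle"],
        "pedestrian_crossing": ["pedestrian", "bicycle"],
        "intersection": ["vehicle", "pedestrian", "bicycle"],
    }
    return table.get(blind_spot_context, [])  # unknown context: assume nothing
```

Each assumed mover would then be handed to the course prediction step described above.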
  • In step S1008, the control device 100a determines which course has priority when the route (course) of the host vehicle generated in step S1005 and the course of the moving body predicted in step S1007 intersect.
  • In step S1009, the control device 100a executes driving support based on the priority determined in step S1008. The specific processing here is described with reference to FIG. 12.
  • FIG. 12 is a flowchart showing an example of the processing procedure of step S1009 of FIG.
  • In step S1201, it is determined whether a blind spot was detected in step S1006 of FIG. 11. If a blind spot was detected, the process proceeds to step S1202; otherwise, it proceeds to step S1205.
  • In step S1202, it is determined whether the route (course) of the host vehicle generated in step S1005 of FIG. 11 and the course of the moving body predicted in step S1007 intersect. If they intersect, the process proceeds to step S1203; if they do not, it proceeds to step S1205.
  • In step S1203, if it was determined in step S1008 of FIG. 11 that the host vehicle has priority, the process proceeds to step S1204; if the host vehicle does not have priority, it proceeds to step S1206.
  • In step S1204, as in the situation described with reference to FIG. 9, the target brake fluid pressure is increased in preparation for a moving body jumping out.
  • In step S1205, support that takes the blind spot into account is not performed, normal traveling is maintained, and the series of processes of step S1009 in FIG. 11 ends.
  • In step S1206, priority is given to a moving body that may emerge from the blind spot, as in the situations described with reference to FIG. 7 or FIG. 10, so deceleration control toward a predetermined position is performed so that the vehicle can stop at any time.
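The branch structure of steps S1201 through S1206 can be summarized as a small decision function. This sketch returns a label for the selected support action; the labels themselves are illustrative names, not terms from the patent.

```python
def decide_support(blind_spot_detected, paths_intersect, host_has_priority):
    """Decision logic of FIG. 12:
    - no blind spot, or courses do not intersect -> normal travel (S1205)
    - host vehicle has priority -> pre-increase brake fluid pressure (S1204)
    - otherwise -> decelerate, ready to stop at any time (S1206)"""
    if not blind_spot_detected or not paths_intersect:
        return "normal_travel"            # S1205
    if host_has_priority:
        return "precharge_brakes"         # S1204
    return "decelerate_ready_to_stop"     # S1206
```

Called every control cycle, this reproduces the three outcomes of the FIG. 12 flowchart from the results of steps S1006 through S1008.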
  • In step S1010, the control device 100a calculates a target steering angle and a target speed for traveling along the target route, from the target route and target speed generated in step S1005 and the target speed determined by the driving support decided in step S1009.
  • In step S1011, the control device 100a calculates a target steering torque, a target engine torque, and a target brake pressure for realizing the target steering angle and target speed calculated in step S1010, and outputs them to the steering device 102, the driving device 103, and the braking device 104, respectively. The series of processing then ends, and the process returns to step S1001.
  • The control parameter output to the steering device 102 is a target steering torque for realizing the target steering angle, but depending on the configuration of the steering device 102, the target steering angle itself can be output directly. Likewise, the control parameters output to the driving device 103 and the braking device 104 are a target engine torque and a target brake pressure for realizing the target speed, but depending on their configuration, the target speed itself can also be output directly.
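Steps S1010 and S1011 turn the target steering angle and target speed into actuator commands. A minimal sketch using simple proportional feedback follows; the gains and signal names are illustrative assumptions, and a production controller would be considerably more elaborate.

```python
def actuator_commands(target_angle, angle, target_speed, speed,
                      k_steer=8.0, k_speed=400.0):
    """Proportional tracking of the target steering angle and target speed.
    A positive speed error maps to engine torque, a negative one to brake
    pressure, mirroring the split between driving and braking devices."""
    steer_torque = k_steer * (target_angle - angle)
    accel_cmd = k_speed * (target_speed - speed)
    engine_torque = max(accel_cmd, 0.0)
    brake_pressure = max(-accel_cmd, 0.0)
    return steer_torque, engine_torque, brake_pressure
```

For example, when the vehicle is 2 m/s above its target speed, the sketch commands zero engine torque and a positive brake pressure, consistent with the deceleration support of step S1206.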
  • Reference signs: 100a control device; 200, 700, 800 host vehicle; 201 oncoming vehicle; 202, 203, 801 vehicle; 204, 205 pedestrian; 210, 702, 802 target route; 303, 704, 805 blind spot region; 304, 705, 806 moving body; 305, 706, 807 moving body path

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Navigation (AREA)

Abstract

The objective of the present invention is to provide a vehicle control device capable of appropriately controlling a host vehicle in accordance with the conditions when the host vehicle is driving under conditions in which a blind spot exists. This vehicle control device 100a is characterized by: detecting a blind spot region 303 that is a blind spot for the host vehicle 200; determining the relative priorities of the course 210 of the host vehicle 200 and the course 305 of a moving body 304 that may emerge from the blind spot region 303; and outputting a control signal for the host vehicle 200 on the basis of the determined priorities.

Description

Vehicle control device
The present invention relates to a vehicle control device.
Various techniques have been proposed for recognizing objects around the host vehicle (vehicles, pedestrians, structures, etc.) and road markings and signs (road surface paint such as lane markings, signs such as stop signs, etc.) using external environment recognition sensors such as in-vehicle cameras and radar. Various techniques have also been proposed for controlling the host vehicle using these techniques to improve the occupants' sense of security and comfort. However, with an external environment recognition sensor mounted on a vehicle, the area behind a detected object becomes a blind spot, so control must take into account objects that may exist in that blind spot.
As a technique for controlling the host vehicle in consideration of an object that may exist in a blind spot, a technique is known in which a blind spot is detected from an image ahead of the host vehicle, the movable range of an object is obtained on the assumption that an object exists in the blind spot and jumps out, whether the host vehicle and the object may collide is determined from the predicted host vehicle position and the movable range of the object, and, if a collision is possible, driving support control is executed so that the host vehicle does not collide with the object (see Patent Document 1). Also disclosed is a technique for determining, based on map information of a car navigation system, whether the road on which the host vehicle is traveling is a priority road or a non-priority road, and performing deceleration control when the host vehicle travels on a non-priority road (see Patent Document 2); using this technique, a collision with an object jumping out from a blind spot can be avoided when the host vehicle travels on a non-priority road at an intersection with poor visibility.
Patent Document 1: JP 2006-260217 A
Patent Document 2: JP 2010-132260 A
However, the vehicle travel support device disclosed in Patent Document 1 determines whether to perform travel support based only on the presence or absence of a blind spot, so travel support is always performed even when the host vehicle is traveling on a priority road and support is clearly unnecessary, which may make occupants uneasy. Combining this with the driving support method disclosed in Patent Document 2 makes it possible to perform deceleration control only when the host vehicle is traveling on a non-priority road, and not when it is traveling on a priority road, which solves the problem when traveling straight through an intersection. However, this method can handle only the case of traveling straight through an intersection, and is difficult to apply when the host vehicle turns right or left at the intersection or when the roads do not intersect (for example, when a pedestrian is crossing a pedestrian crossing).
Therefore, an object of the present invention is to provide a vehicle control device that can appropriately control the vehicle according to the situation when the host vehicle travels in a situation where a blind spot exists.
The vehicle control device according to the present invention is characterized in that it detects a blind spot region that is a blind spot for the host vehicle, determines the relative priority between the course of a moving body that may appear from the blind spot region and the course of the host vehicle, and outputs a control signal for the host vehicle based on the determined priority.
According to the present invention, when the host vehicle travels in a situation where a blind spot exists, the vehicle can be appropriately controlled according to the situation.
FIG. 1 is a diagram showing an embodiment of a vehicle control device according to the present invention.
FIG. 2 is a diagram illustrating an example of a priority determination table for left-hand traffic.
FIG. 3 is a diagram illustrating an example of a priority determination table for right-hand traffic.
FIG. 4 is a diagram illustrating a target route at a crossroad intersection.
FIG. 5 is a diagram illustrating blind spot detection at a crossroad intersection.
FIG. 6 is a diagram illustrating the position of the host vehicle at which the blind spot is eliminated at a crossroad intersection.
FIG. 7 is a diagram showing feature points on the target route used when calculating the target speed at a crossroad intersection.
FIG. 8 is a diagram illustrating a calculation example of the target speed at a crossroad intersection.
FIG. 9 is a diagram illustrating an example in which the host vehicle 700, while traveling on a priority road, performs driving support in consideration of the blind spot of a side road.
FIG. 10 is a diagram illustrating an example in which driving support is performed in consideration of the blind spot of a stopped vehicle in the oncoming lane when the host vehicle 800 approaches a pedestrian crossing.
FIG. 11 is a flowchart for explaining the operation of the control device 100a.
FIG. 12 is a flowchart illustrating the driving support content determination process based on priority.
Hereinafter, embodiments for carrying out the present invention will be described with reference to the drawings.
[Example 1]
FIG. 1 is a block diagram showing a control device 100a as a vehicle control device according to an embodiment of the present invention, together with its peripheral devices. The control device 100a illustrated in FIG. 1 is a computer that controls the host vehicle; by executing a program stored in a storage medium (not shown), it functions as a surrounding environment recognition unit 1, a blind spot detection unit 2, a road information acquisition unit 3, a target route generation unit 4, a priority determination unit 5, a moving body course prediction unit 6, and a vehicle control unit 7.
The control device 100a is connected to the steering device 102, the driving device 103, and the braking device 104 of the host vehicle, and to the external environment recognition device 101, the sound generation device 105, the display device 106, and the automatic driving button 107 provided in the host vehicle. The control device 100a is also connected to a CAN (not shown) of the host vehicle, via which vehicle information such as the vehicle speed, steering angle, and yaw rate of the host vehicle is input. Note that CAN (Controller Area Network) is a network standard for connecting in-vehicle electronic circuits and devices.
The external environment recognition device 101 acquires information on the surrounding environment of the host vehicle and consists of, for example, four in-vehicle cameras that capture the surroundings in front of, behind, to the right of, and to the left of the host vehicle. The image data obtained by the external environment recognition device 101 (in-vehicle cameras) is input to the control device 100a as analog data or after A/D conversion, via a dedicated line or the like. Besides an in-vehicle camera, a radar that measures the distance to an object using millimeter waves or a laser, a sonar that measures the distance to an object using ultrasonic waves, or the like can be used as the external environment recognition device 101. In that case, the external environment recognition device 101 outputs information such as the measured distance to the object and its direction to the control device 100a via a dedicated line or the like.
The steering device 102 consists of an electric power steering, a hydraulic power steering, or the like whose steering angle can be controlled by an electric or hydraulic actuator according to a drive command from the control device 100a. The driving device 103 consists of an engine system whose engine torque can be controlled by an electric throttle or the like according to a drive command from the control device 100a, an electric powertrain system whose driving force can be controlled according to a drive command from the control device 100a, or the like. The braking device 104 consists of an electric brake, a hydraulic brake, or the like whose braking force can be controlled by an electric or hydraulic actuator according to a drive command from the control device 100a.
The sound generation device 105 consists of a speaker or the like and is used to output warnings, voice guidance, and the like to the driver. The display device 106 consists of a display such as that of a navigation device, a meter panel, warning lights, and the like. In addition to the operation screen of the control device 100a, the display device 106 displays warning screens and the like that visually inform the driver, for example, that the host vehicle is in danger of colliding with an obstacle.
The automatic driving button 107 is an operation member provided at a position where the driver can operate it, and, in response to the driver's operation, outputs to the control device 100a a start signal that starts the operation of the control device 100a. A mode selector switch may be provided on the automatic driving button 107 so that multiple driving modes can be selected, such as an automatic parking mode, a general-road automatic driving mode, and an expressway automatic driving mode. Furthermore, the device may be configured so that operating the automatic driving button 107 while the control device 100a is operating terminates the operation of the control device 100a.
The automatic driving button 107 is installed as a switch in a location where the driver can easily operate it, such as around the steering wheel. When the display device 106 is a touch-panel display, a button corresponding to the automatic driving button 107 may be displayed on the display device 106 so that the driver can operate it there.
Next, each component of the control device 100a will be described. As described above, the image data obtained by the external environment recognition device 101 (in-vehicle cameras) is input to the control device 100a. Using this image data, the surrounding environment recognition unit 1 detects the shape and position of objects around the host vehicle, such as stationary solid objects, moving bodies, road surface paint such as lane markings, and signs, and additionally detects road surface unevenness and the like to determine whether the road surface is one on which the host vehicle can travel.
Stationary solid objects include, for example, parked vehicles, walls, guardrails, poles, pylons, curbs, roadside trees, and wheel stops. Moving bodies include, for example, pedestrians, bicycles, motorcycles, and vehicles. Hereinafter, stationary solid objects and moving bodies are collectively referred to as obstacles. The shape and position of an object are detected using a pattern matching method or other known techniques. The position of an object is expressed, for example, in a coordinate system whose origin is at the position of the in-vehicle camera that captures the area ahead of the host vehicle.
Based on the detected information on the shape and position of objects and on the determination of whether the road surface is drivable, the surrounding environment recognition unit 1 also detects, for example, drivable lane positions and turnable spaces at intersections when traveling on a general road. In the case of a parking lot, it detects spaces where the host vehicle can be parked, available parking spaces, and the like.
The road information acquisition unit 3 acquires map data around the current vehicle position. The acquired map data includes shape data close to the actual road shape, expressed by polygons, polylines, and the like; traffic regulation information (speed limits, permitted vehicle types, etc.); lane classification (main line, overtaking lane, uphill lane, straight lane, left-turn lane, right-turn lane, etc.); and the presence or absence of traffic lights, signs, and the like (with position information where present). When information equivalent to the above map data can be obtained by the surrounding environment recognition unit 1, the information detected by the surrounding environment recognition unit 1 can be used instead.
The target route generation unit 4 generates a route for moving the host vehicle from the current host vehicle position to the target position. It also calculates the target speed for traveling on the generated route, using information such as the speed limit from the map data, the curvature of the route, traffic lights, temporary stop positions, and the speed of the preceding vehicle. For example, when driving on a general road, the destination is set using a navigation device having map data, and a route is generated from information such as the positional relationship between the host vehicle and obstacles when traveling toward the destination and the lane position. In the case of a parking lot, a target parking position for the host vehicle is set within an available parking space based on the positional relationship between the host vehicle and obstacles, and a route to that position is generated.
The blind spot detection unit 2 detects areas that are blind spots from the host vehicle, using the obstacle position information recognized by the surrounding environment recognition unit 1 and the map data acquired by the road information acquisition unit 3. Due to the characteristics of the external environment recognition device 101, the far side of an obstacle recognized by the surrounding environment recognition unit 1 is a blind spot from the host vehicle, and obstacles that may exist in that blind spot must be considered. Here, in particular, blind spots in which an obstacle that could obstruct the course of the host vehicle may exist are extracted.
The moving body course prediction unit 6 estimates moving bodies (obstacles or blind spot objects) that may emerge from the blind spots detected by the blind spot detection unit 2, and predicts their courses. For example, if a detected blind spot is on a road, a vehicle is assumed; if part of a pedestrian crossing is in the blind spot, a pedestrian or bicycle is assumed to emerge, and its course is predicted.
Here, the course of a moving body can be rephrased as the planned traveling path of the moving body. Accordingly, the priority of the course of a moving body can be associated with its planned traveling path. The priority is determined based on the priority of the moving body itself (moving-body-specific priority) and the priority of the path on which the moving body moves (path-specific priority); in some cases the moving-body-specific priority is dominant in determining the priority of the course, and in other cases the path-specific priority is also taken into account in addition to the moving-body-specific priority. For example, in the relationship between a pedestrian and the host vehicle, the priority of the moving body itself is dominant, so the pedestrian's course always has higher priority than that of the host vehicle. In the relationship between the host vehicle and another vehicle, on the other hand, both are vehicles, so the priority of the course cannot be determined from the moving-body-specific priority alone, and the priorities of the roads on which the host vehicle and the other vehicle are located must be considered.
The estimation of moving bodies that may emerge from a blind spot is performed based on the position of the host vehicle obtained using GPS or the like, the map data, and information from the surrounding environment recognition unit 1. For example, when the control device 100a detects a pedestrian crossing, it estimates that a pedestrian or bicycle may be present; when it detects an intersection, it estimates the presence of vehicles in addition to pedestrians and bicycles.
The priority determination unit 5 determines which course has priority when the route (course) of the host vehicle generated by the target route generation unit 4 and the course of the moving body predicted by the moving body course prediction unit 6 intersect.
FIG. 2 shows an example of the priority determination tables used by the priority determination unit 5. The priority determination tables of FIG. 2 are for left-hand traffic and show the priority of the course of the other vehicle (horizontal axis) relative to the course of the host vehicle (vertical axis). FIG. 2(a) is the priority determination table for the case where the host vehicle and the other vehicle travel on the same road in opposite directions and enter an intersection. FIG. 2(b) is the table for the case where, at an intersection without traffic lights, the other vehicle enters the intersection from the crossing road on the right side of the host vehicle. FIG. 2(c) is the table for the case where, at an intersection with traffic lights, the other vehicle enters from the crossing road on the right side of the host vehicle. FIG. 2(d) is the table for the case where, at an intersection without traffic lights, the other vehicle enters from the crossing road on the left side of the host vehicle. FIG. 2(e) is the table for the case where, at an intersection with traffic lights, the other vehicle enters from the crossing road on the left side of the host vehicle.
 A concrete example of the priority determination will be described with reference to FIG. 2(a). As described above, FIG. 2(a) covers the case where the host vehicle and the other vehicle travel the same road in opposite directions and enter the intersection. For example, when the host vehicle goes straight and the other vehicle turns right, the host vehicle has the higher priority. When both the host vehicle and the other vehicle go straight through the same intersection, no priority relation holds between them, which is denoted by "-" in the priority determination table. Here, if priority were judged only from the priority of the road itself (that is, the travel-path-specific priority), no judgment could be made when the host vehicle and the other vehicle travel the same road in opposite directions, since neither travel path outranks the other. By additionally considering whether the moving body is going straight or turning right (that is, by combining the course of the moving body with the travel-path-specific priority), the priority can be determined even in such a case.
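The determination described above can be sketched as a table lookup. The encoding below is a minimal, hypothetical rendering of FIG. 2(a) (opposite directions, left-hand traffic); only the straight-vs-right-turn and straight-vs-straight entries are stated in the text, and the remaining entries and all names are illustrative assumptions.

```python
# Hypothetical encoding of the priority table of FIG. 2(a): host vehicle and
# other vehicle approach the intersection from opposite directions on the
# same road (left-hand traffic). Values: "host" = host vehicle has priority,
# "other" = other vehicle has priority, None = no priority relation ("-").
PRIORITY_TABLE_OPPOSITE = {
    ("straight", "straight"): None,   # both straight: no priority relation (text)
    ("straight", "right"): "host",    # host straight vs. oncoming right turn (text)
    ("right", "straight"): "other",   # host right turn yields to oncoming straight
    ("left", "straight"): None,       # assumed: left turn does not cross oncoming straight
}

def judge_priority(host_course, other_course):
    """Return which course has priority, or None when no relation holds
    (including course pairs not listed in the table)."""
    return PRIORITY_TABLE_OPPOSITE.get((host_course, other_course))
```

In a full implementation, one such table would exist per scene of FIGS. 2(a) to 2(e), selected from the map data (traffic lights, cross-road geometry) before lookup.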
 FIG. 3 shows the priority determination table for right-hand traffic. The scene settings of FIGS. 3(a) to 3(e) are the same as those of FIGS. 2(a) to 2(e). In the scene of FIG. 3(a), that is, when the host vehicle and the other vehicle travel the same road in opposite directions and enter the intersection, the table is identical to the left-hand-traffic table. For FIGS. 3(b) to 3(e), details are omitted, but the determination results partially differ from those of FIGS. 2(b) to 2(e) for left-hand traffic; specifically, the cells shown with a shaded or gray background in FIGS. 3(b) to 3(e) differ.
 Note that the map data and the road traffic regulations stored in a navigation device mounted on the vehicle can be used. When the in-vehicle navigation device is part of a so-called communication navigation system that performs navigation by communicating with a data center, the latest map data and traffic regulations held by the data center may be used instead.
 The vehicle control unit 7 controls the host vehicle along the target route generated by the target route generation unit 4. The vehicle control unit 7 calculates a target steering angle and a target speed based on the target route. When a collision between the host vehicle and an obstacle is predicted, the target steering angle and target speed are calculated so that the host vehicle does not collide with the obstacle. Furthermore, to execute driving support based on the priority determined by the priority determination unit 5, the vehicle control unit 7 changes the target speed or calculates a command value for the target brake pressure. The vehicle control unit 7 then outputs the target steering torque for realizing the target steering angle to the steering device 102, and outputs the target engine torque and target brake pressure for realizing the target speed to the driving device 103 and the braking device 104. When the target steering angle and target speed are calculated to avoid a collision with an obstacle, or when driving support based on the determined priority is executed, the content of the driving support is also output to the sound generator 105 and the display device 106.
 Hereinafter, the operation of the control device 100a during automatic driving of the host vehicle will be described using three example scenes of automatic driving on ordinary roads. In each scene, it is assumed that the driver has started automatic driving in advance by operating the automatic driving button 107.
(First automatic driving scene)
 FIGS. 4 to 7 show a scene in which the host vehicle 200 makes a right turn at a crossroads intersection. The oncoming vehicle 201 is assumed to be stopped while waiting to turn right, with the vehicles 202 and 203 following behind it. In addition, the pedestrians 204 and 205 are assumed to be moving in the directions of their respective arrows on the pedestrian crossing beyond the host vehicle 200's right turn.
 First, the method of generating the target route of the host vehicle will be described with reference to FIG. 4. The surrounding environment recognition unit 1 recognizes the oncoming vehicle 201 and the pedestrians 204 and 205 from the image data input from the external environment recognition device 101 by a known method such as pattern matching. The target route generation unit 4 then calculates a route 210 for turning right at the intersection without colliding with these obstacles recognized by the surrounding environment recognition unit 1.
 The route 210 calculated by the target route generation unit 4 can be divided into straight sections and a turning section: for example, a straight section up to the intersection, a turning section inside the intersection, and a straight section after passing through it. The target route generation unit 4 represents a straight section as a straight line and approximates the turning section by combining a clothoid curve with a circular arc. The clothoid curve is the trajectory traced by the host vehicle 200 when its speed is held constant and its steering angle is changed at a constant angular velocity. The circular arc is the trajectory traced when the speed is held constant and the steering angle is fixed at a given value (other than the straight-ahead angle). Note that the route may alternatively be approximated with spline curves, polynomials, or the like; the calculation method of the target route generation unit 4 is not limited to the above.
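The clothoid-plus-arc construction above can be sketched by integrating a simple kinematic bicycle model: holding the speed constant, ramping the steering angle at a constant rate traces the clothoid, and holding it fixed traces the arc. All parameter values, the model itself, and the function name are illustrative assumptions, not from the patent.

```python
import math

def simulate_turn(v, steer_rate, arc_steer, t_clothoid, t_arc, dt=0.01, wheelbase=2.7):
    """Trace the turning section: constant speed v, steering angle increased
    at steer_rate [rad/s] up to arc_steer (clothoid phase), then held fixed
    (circular-arc phase). Returns the (x, y) trajectory. Illustrative sketch
    using a kinematic bicycle model, not the patent's implementation."""
    x = y = yaw = 0.0
    pts = [(x, y)]
    t = 0.0
    while t < t_clothoid + t_arc:
        # Steering ramps linearly during the clothoid phase, then saturates.
        steer = min(steer_rate * t, arc_steer) if t < t_clothoid else arc_steer
        yaw += v * math.tan(steer) / wheelbase * dt
        x += v * math.cos(yaw) * dt
        y += v * math.sin(yaw) * dt
        pts.append((x, y))
        t += dt
    return pts
```

With a positive steering rate the trajectory curves smoothly away from the initial heading, which is the property the clothoid segment provides: curvature grows linearly, so the lateral acceleration builds up gradually rather than stepping.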
 In addition to the target route, the target route generation unit 4 calculates the target speed for traveling along it. In the scene of FIG. 4, assuming that the oncoming vehicle 201 remains stopped, that the vehicles 202 and 203 do not obstruct the course of the host vehicle 200, and that the pedestrians 204 and 205 keep moving as indicated by the arrows, the target speed is set to a creep speed inside the intersection (for example, the section between the pedestrian crossings where no lane markings exist) and to the speed limit in the other sections, with the transitions between the creep speed and the speed limit connected so that acceleration and deceleration remain smooth.
 Next, the method of detecting blind spots and the way of handling obstacles that may emerge from them will be described with reference to FIG. 5. Here, the blind spot detection unit 2 performs blind spot detection and extracts blind spots in which an obstacle that could obstruct the course of the host vehicle may exist, and the moving body course prediction unit 6 predicts the course of an obstacle that may emerge from such a blind spot. The priority between the courses of the host vehicle and the obstacle is then determined, and the driving support content is decided based on that priority.
 First, the blind spot detection method of the blind spot detection unit 2 will be described. The blind spot detection unit 2 obtains, through the road information acquisition unit 3, the map data around the intersection of FIG. 5 (that is, the map data around the host vehicle 200), and places the oncoming vehicle 201 and the vehicles 202 and 203 recognized by the surrounding environment recognition unit 1 on that map. Owing to the characteristics of the external environment recognition device 101, the far side of an obstacle recognized by the surrounding environment recognition unit 1 is a blind spot as seen from the host vehicle. When the camera observing the area ahead of the host vehicle 200 is installed at the center of its front end, the region 303 on the oncoming lane behind the line 301, drawn from the center of the front end of the host vehicle 200 toward the right edges of the oncoming vehicle 201 and the vehicles 202 and 203, and the line 302, drawn toward their left edges, is determined to be a blind spot and extracted. Placing the host vehicle and the obstacles on the map data in this way makes it possible to extract blind spots in which an obstacle that could obstruct the course of the host vehicle may exist.
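Geometrically, the sight lines 301 and 302 bound an angular sector as seen from the camera, and points inside that sector beyond the occluding vehicles are in the blind spot. The sketch below tests a point against that sector; the function name, the rectangle-corner representation of the obstacle, and the simplified "farther than the farthest obstacle corner" depth test are all illustrative assumptions.

```python
import math

def occluded(camera, obstacle_pts, target):
    """Return True if `target` lies in the blind spot cast by the obstacle:
    inside the angular sector between the sight lines to its left- and
    right-most points (lines 301/302 of FIG. 5) and farther from the camera
    than the obstacle. Depth is checked against the farthest obstacle corner
    for simplicity; a full implementation would test along each ray."""
    def angle(p):
        return math.atan2(p[1] - camera[1], p[0] - camera[0])
    def dist(p):
        return math.hypot(p[0] - camera[0], p[1] - camera[1])
    angles = [angle(p) for p in obstacle_pts]
    inside_sector = min(angles) <= angle(target) <= max(angles)
    behind = dist(target) > max(dist(p) for p in obstacle_pts)
    return inside_sector and behind
```

Mapping the resulting region onto the oncoming lane of the map data then yields the blind spot area 303 of FIG. 5.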
 Next, the method of predicting the course of a moving body that may emerge from the blind spot detected by the blind spot detection unit 2 will be described. First, the moving body course prediction unit 6 selects, from among the conceivable moving bodies, the one with the highest traveling speed. In the case of FIG. 5, the blind spot 303 detected by the blind spot detection unit 2 lies on the oncoming lane, so the conceivable moving bodies include four-wheeled vehicles (passenger cars, trucks, etc.), motorcycles, and bicycles; the fastest of these, a four-wheeled vehicle, is selected as the moving body 304. The fastest of the courses that the moving body 304 can take is then selected as the moving body course. Here, the moving body 304 is expected either to go straight through the intersection or to turn left, so the straight course 305 is selected as the fastest moving body course.
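The selection above is a worst-case assumption over two axes: the mover class and the course. A minimal sketch, with all class names, course names, and speed values being illustrative assumptions rather than values from the patent:

```python
# Conceivable moving bodies for a blind spot on a traffic lane, with
# representative speeds in km/h (illustrative values only).
LANE_MOVERS = {"four_wheeler": 60.0, "motorcycle": 50.0, "bicycle": 15.0}

def select_worst_case(movers, courses):
    """Pick the fastest conceivable mover class, then the fastest course
    that class can take, as the pair to guard against."""
    mover = max(movers, key=movers.get)
    course = max(courses, key=courses.get)
    return mover, course
```

For the blind spot 303 of FIG. 5 this yields a four-wheeled vehicle going straight, matching the choice of moving body 304 and course 305 in the text.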
 Next, the method by which the priority determination unit 5 judges the priority between the course of the host vehicle (the target route) and the moving body course will be described. A course here also carries the meaning of an action (going straight through, turning right at, or turning left at an intersection; going straight on a priority road; going straight on a non-priority road; going straight on a single road; turning off the road into a parking lot or the like), so the priority determination unit 5 can judge the priority between the course of the host vehicle (the target route) and the moving body course based on the map data acquired by the road information acquisition unit 3 and the road traffic regulations. In the case of FIG. 5, the course of the host vehicle 200 (target route 210) turns right at the intersection while the course 305 of the moving body 304 goes straight through it, so the moving body 304 going straight through the intersection is judged to have the higher priority.
 Next, the method of deciding the driving support content based on the priority judged by the priority determination unit 5 will be described. The basic policy of the vehicle control unit 7 is to carry out speed control that disregards the moving body when the host vehicle's priority is higher than the moving body's, and to carry out driving support control for avoiding a collision with the moving body when the host vehicle's priority is lower. In the case of FIG. 5, the priority determination unit 5 has judged that the moving body 304 has priority over the host vehicle 200, so the vehicle control unit 7 carries out driving support control for the host vehicle 200 to avoid a collision with the moving body 304. Specifically, it suffices to control the speed of the host vehicle 200 so that it does not obstruct the course 305 of the moving body 304: the host vehicle 200 is advanced at a very low speed (one from which it can stop immediately) to the position shown in FIG. 6, and once the blind spot 303 detected in the state of FIG. 5 has disappeared and the absence of an oncoming vehicle has been confirmed, the host vehicle returns to its original speed and passes through the intersection.
 Next, the driving support control method of the vehicle control unit 7 will be described with reference to FIGS. 7 and 8. FIG. 7 shows the same situation as FIGS. 4 to 6. If no obstacle were present in FIG. 7, the host vehicle would pass through the intersection at the target speed of FIG. 8(a). Under the situation of FIG. 7, however, driving support for avoiding a collision with the moving body is carried out, so the target speed of FIG. 8(b) is calculated and used for speed control. In FIGS. 7 and 8, point B is the point at which the blind spot was predicted to disappear while the host vehicle 200 was at point A, and point C is the point at which the blind spot actually disappeared. The host vehicle 200 therefore decelerates to the intersection passing speed V1 by point A, decelerates further to the speed V2 for avoiding a collision with the moving body 304 by point B, accelerates back to the intersection passing speed V1 once the blind spot disappears at point C, and accelerates again after passing point D.
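The target speed profile of FIG. 8(b) can be sketched segment by segment. The step function below gives the target value held in each segment; the smooth acceleration and deceleration ramps connecting the segments are omitted for brevity, and the function name and units are illustrative assumptions.

```python
def target_speed(pos, a, b, c, d, v_cruise, v1, v2):
    """Segment-wise target speed of FIG. 8(b), with v2 < v1 < v_cruise.
    pos: distance along the route; a..d: positions of points A..D of FIG. 7.
    Ramps between segments are left out; the text connects them smoothly."""
    if pos < a:
        return v_cruise   # before A: normal travel
    if pos < b:
        return v1         # A to B: intersection passing speed V1
    if pos < c:
        return v2         # B until the blind spot clears at C: avoidance speed V2
    if pos < d:
        return v1         # C to D: back to intersection passing speed V1
    return v_cruise       # after D: accelerate again
```

If point C (where the blind spot actually clears) arrives earlier or later than the predicted point B, only the boundary positions change; the segment ordering stays the same.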
 As described above, the priority between the course of the host vehicle and the course of the moving body is determined based on the map data, and the driving support content is decided from that priority, which enables driving support appropriate to the surrounding situation and improves safety. In the scene described with FIGS. 4 to 8 in particular, when the host vehicle turns right at a crossroads intersection, the region hidden by the oncoming vehicle waiting to turn right is extracted using the map data, and the priority between the course of the host vehicle (right turn at the intersection) and the course of the moving body (straight through the intersection) is judged; this enables driving support that does not obstruct the course of the higher-priority moving body and improves safety.
(Second automatic driving scene)
 FIG. 9 shows a scene in which the host vehicle 700 goes straight on a priority road, with a side road joining from the left of its direction of travel. The boundary 701 between the road on which the host vehicle 700 is traveling and the side road is blocked by walls and buildings, so the side road is a blind spot from the position of the host vehicle 700 in FIG. 9.
 First, the surrounding environment recognition unit 1 recognizes the boundary 701 from the image data input from the external environment recognition device 101 by a known method such as pattern matching. Since there is no obstacle in the travel lane of the host vehicle 700, the target route generation unit 4 calculates the route 702 along the lane center as the target route. Furthermore, the target route generation unit 4 sets the road speed limit V1 as the target speed of the host vehicle.
 Next, the blind spot detection unit 2 obtains the map data around the host vehicle 700 through the road information acquisition unit 3 and, using the information on the boundary 701 recognized by the surrounding environment recognition unit 1, extracts the blind spot region 704 of the side road. When the camera observing the area ahead of the host vehicle 700 is installed at the center of its front end, the region is bounded by the line 703 drawn from the center of the front end of the host vehicle 700 toward the point where the boundary 701 breaks off.
 Next, the moving body course prediction unit 6 selects a four-wheeled vehicle 705 as the moving body that may emerge from the blind spot 704, and calculates the straight course 706 as the moving body course.
 Next, the priority determination unit 5 judges the priority between the course of the host vehicle (the target route) and the moving body course. In the situation of FIG. 9, the course of the host vehicle 700 (target route 702) goes straight on the priority road while the course 706 of the moving body 705 goes straight on a non-priority road, so the host vehicle 700 going straight on the priority road is judged to have the higher priority.
 Next, the driving support content is decided based on the priority judged by the priority determination unit 5. In the situation of FIG. 9, the priority determination unit 5 has judged that the host vehicle 700 has priority over the moving body 705, so the vehicle control unit 7 continues traveling at the target speed calculated by the target route generation unit 4 regardless of whether the moving body 705 is present. However, since the moving body 705 could still dart out in front of the host vehicle 700, if the host vehicle 700 is equipped with hydraulic brakes, for example, the target brake fluid pressure is raised to B1 at point A in FIG. 9 (the point where the blind spot 704 was extracted, or a predetermined distance before the side road). Doing so shortens the time to the start of braking if the moving body 705 does dart out. When the host vehicle reaches point C, where the blind spot disappears, the target brake fluid pressure is returned to its original value.
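The brake pre-fill described above reduces to a simple position-gated target pressure. A minimal sketch, assuming hypothetical names and units (the patent only specifies the pressure value B1 and the points A and C):

```python
def target_brake_pressure(pos, a, c, prefill_pressure, base_pressure=0.0):
    """Raise the target brake fluid pressure to B1 (prefill_pressure) between
    point A, where the blind spot is extracted or a predetermined distance
    before the side road, and point C, where the blind spot disappears, then
    restore the base pressure. Illustrative sketch only."""
    return prefill_pressure if a <= pos < c else base_pressure
```

Pre-filling closes the pads against the discs without generating noticeable deceleration, which is why the target speed can stay unchanged while the response time to a darting-out moving body shortens.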
 As described above, in the scene of FIG. 9, when the host vehicle goes straight on the priority road, it has the higher priority over a moving body emerging from the side road hidden behind walls or the like, so vehicle control is executed without changing the target speed. When the host vehicle is equipped with hydraulic brakes, however, it is possible to provide support by temporarily raising the target brake fluid pressure in preparation for a moving body darting out.
(Third automatic driving scene)
 FIG. 10(a) shows a scene in which the host vehicle 800 goes straight on a single road; a stopped vehicle 801 is present in the oncoming lane, and a pedestrian crossing lies behind the stopped vehicle 801.
 First, the surrounding environment recognition unit 1 recognizes the stopped vehicle 801 from the image data input from the external environment recognition device 101 by a known method such as pattern matching. Since there is no obstacle in the travel lane of the host vehicle 800, the target route generation unit 4 calculates the route 802 along the lane center as the target route. Furthermore, the target route generation unit 4 sets the road speed limit V1 as the target speed of the host vehicle.
 Next, the blind spot detection unit 2 obtains the map data around the host vehicle 800 through the road information acquisition unit 3 and, using the information on the stopped vehicle 801 recognized by the surrounding environment recognition unit 1, extracts the blind spot region 805. When the camera observing the area ahead of the host vehicle 800 is installed at the center of its front end, the region is bounded by the lines 803 and 804 drawn from the center of the front end of the host vehicle 800 toward the left and right edges of the stopped vehicle 801.
 Next, the moving body course prediction unit 6 selects a pedestrian 806 as the moving body that may emerge from the blind spot 805, and calculates the straight course 807 as the moving body course.
 Next, the priority determination unit 5 judges the priority between the course of the host vehicle (the target route) and the moving body course. In the situation of FIG. 10(a), the course of the host vehicle 800 (target route 802) goes straight on the priority road while the course 807 of the moving body 806 crosses the pedestrian crossing, so the moving body 806 crossing on the pedestrian crossing is judged to have the higher priority.
 Next, the driving support content is decided based on the priority judged by the priority determination unit 5. In the situation of FIG. 10(a), the priority determination unit 5 has judged that the moving body 806 has priority over the host vehicle 800, so the vehicle control unit 7 executes support for avoiding a collision with the moving body 806. Specifically, the host vehicle first decelerates to a speed V2 from which it can stop before the pedestrian crossing and, as shown in FIG. 10(b), returns to the original speed V1 after advancing and confirming that no obstacle such as a pedestrian is present around the crossing.
 As described above, in the scene of FIG. 10, when a pedestrian crossing lies behind a stopped vehicle (in its blind spot) while the host vehicle travels on a single road, a pedestrian crossing within that blind spot is judged to have a higher priority than the host vehicle. This enables driving support that does not obstruct the passage of pedestrians, and safety is improved.
 FIG. 11 is a flowchart showing an example of the processing procedure of the control device 100a. When the automatic driving button 107 is operated, the control program shown in the flowchart of FIG. 11 is executed, and the processing of FIG. 11 is repeated at a predetermined interval (for example, every 0.01 seconds). When an operation to cancel automatic driving is performed (for example, the automatic driving button 107 is operated again), the repeated execution stops and automatic driving is canceled.
 In step S1001, the control device 100a determines from the operation state of the automatic driving button 107 whether automatic driving is currently in progress. If it is, the control device 100a proceeds to step S1002; otherwise, it skips the rest of the sequence and continues.
 In step S1002, the control device 100a starts capturing image data from the external environment recognition device 101; thereafter, image data is captured from the external environment recognition device 101 every frame.
 In step S1003, the control device 100a inputs the image data captured in step S1002 to the surrounding environment recognition unit 1 and detects the shapes and positions of objects around the host vehicle, such as stationary solid objects, moving bodies, road surface paint such as parking frame lines, and signs. Based on the detected shapes and positions of objects and on the judgment of whether a surface is one the host vehicle can travel on, it further detects, for ordinary roads, for example, the positions of drivable lanes and the space available for turning at intersections, and, for parking lots, spaces in which the host vehicle can be parked, such as available parking spaces.
 In step S1004, the control device 100a acquires the map data around the current position of the host vehicle. The acquired map data includes shape data close to the actual road shape, expressed with polygons, polylines, and the like, together with traffic regulation information (speed limits, permitted vehicle types, etc.), lane classifications (main line, overtaking lane, climbing lane, straight-ahead lane, left-turn lane, right-turn lane, etc.), and the presence or absence of traffic lights and signs (with their position information when present).
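The attributes listed for step S1004 suggest a per-link map record along the following lines. All field names, types, and the class name are illustrative assumptions; the patent specifies only which kinds of information the map data carries.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class RoadLink:
    """Hypothetical per-link record for the map data of step S1004."""
    shape: List[Tuple[float, float]]       # polyline approximating the actual road shape
    speed_limit: float                     # traffic regulation: speed limit
    permitted_vehicles: List[str]          # traffic regulation: permitted vehicle types
    lane_class: str                        # "main", "overtaking", "climbing",
                                           # "straight", "left_turn", "right_turn"
    traffic_light: Optional[Tuple[float, float]] = None   # position if present
    signs: List[Tuple[str, Tuple[float, float]]] = field(default_factory=list)
```

Such records would be consumed by the target route generation of step S1005 (speed limit, lane class) and by the blind spot detection of step S1006 (road shape).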
 In step S1005, the control device 100a generates a route for moving the host vehicle from its current position to the target position based on the surrounding environment of the host vehicle detected in step S1003 and the map data acquired in step S1004. It further calculates the target speed for traveling along the generated route using information such as the speed limit from the map data, the curvature of the route, traffic lights, stop positions, and the speed of the preceding vehicle. When traveling on an ordinary road, for example, a destination is set using a navigation device or the like holding map data, and the route is generated from information such as the positional relationship between the host vehicle and obstacles and the lane positions while traveling toward the destination. In the case of a parking lot, the target parking position at which the host vehicle is to be parked is set within an available parking space based on the positional relationship between the host vehicle and obstacles, and a route to that position is generated.
 In step S1006, the control device 100a detects areas that are blind spots from the host vehicle, based on the surrounding environment detected in step S1003 and the map data acquired in step S1004. Owing to the characteristics of the external environment recognition device 101, the far side of an obstacle recognized by the surrounding environment recognition unit 1 is a blind spot from the host vehicle, and obstacles that may exist in that blind spot must be taken into account. In particular, blind spots that may conceal an obstacle obstructing the course of the host vehicle are extracted here.
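The occlusion geometry behind a recognized obstacle can be illustrated with a simple shadow test. This is purely a geometric sketch under the assumption of a disc-shaped obstacle; the device in the text works on recognized obstacle contours and map shape data:

```python
import math

def occluded(sensor, obstacle, radius, point):
    """True if `point` lies in the shadow cast by a disc-shaped obstacle.

    A point is hidden when it is farther away than the obstacle and the
    ray from the sensor towards it passes within `radius` of the
    obstacle centre.
    """
    ox, oy = obstacle[0] - sensor[0], obstacle[1] - sensor[1]
    px, py = point[0] - sensor[0], point[1] - sensor[1]
    d_obs, d_pt = math.hypot(ox, oy), math.hypot(px, py)
    if d_pt <= d_obs:
        return False  # nearer than the obstacle: directly visible
    # Perpendicular distance from the obstacle centre to the sensor->point ray.
    cross = abs(ox * py - oy * px) / d_pt
    return cross < radius

# A parked vehicle 10 m ahead hides the area behind it.
hidden = occluded(sensor=(0, 0), obstacle=(10, 0), radius=1.0, point=(20, 0.5))
visible = occluded(sensor=(0, 0), obstacle=(10, 0), radius=1.0, point=(5, 0))
```

Sweeping such a test over cells of the map yields the blind-spot region; intersecting that region with the host vehicle's planned course gives the blind spots that matter for step S1007.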
 In step S1007, the control device 100a estimates moving bodies that may emerge from the blind spots detected in step S1006 and predicts their courses. For example, if the detected blind spot lies on a road, a vehicle is assumed; if part of a pedestrian crossing lies in the blind spot, a pedestrian or bicycle is assumed to emerge, and its course is predicted.
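The examples in the text amount to a small lookup from blind-spot context to an assumed mover. A sketch of that mapping, with assumed emergence speeds (the numeric values are illustrative, not from the specification):

```python
def assume_emerging_body(blind_spot_context):
    """Map the map-derived context of a blind spot to an assumed mover.

    The rule follows the example in the text: a road behind the blind
    spot implies a vehicle; a partially hidden pedestrian crossing
    implies a pedestrian or bicycle. Speeds are assumptions in m/s.
    """
    if blind_spot_context == "road":
        return {"type": "vehicle", "assumed_speed": 8.3}
    if blind_spot_context == "crosswalk":
        return {"type": "pedestrian_or_bicycle", "assumed_speed": 1.5}
    return None  # no mover assumed for other contexts

mover = assume_emerging_body("crosswalk")
```

The predicted course would then be propagated from the edge of the blind spot at the assumed speed and checked for intersection with the host vehicle's route in step S1008.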
 In step S1008, the control device 100a determines which course has priority when the route (course) of the host vehicle generated in step S1005 intersects the course of the moving body predicted in step S1007.
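Two concrete priority rules appear later in the claims: a pedestrian's course ranks above the host vehicle's (claim 8), and a right-turning host vehicle ranks above a straight-traveling vehicle (claim 9). A minimal sketch encoding just those two rules, with everything else defaulting to yielding (a real judgment would also use signs, signals, and road classes from the map data):

```python
def host_has_priority(mover_type, host_turning_right=False):
    """Decide whether the host vehicle's course takes priority.

    Encodes only the two rules stated in the claims; all other cases
    conservatively yield to the predicted mover.
    """
    if mover_type == "pedestrian":
        return False  # claim 8: pedestrian course always ranks higher
    if mover_type == "vehicle" and host_turning_right:
        return True   # claim 9: right-turning host over straight traffic
    return False

p1 = host_has_priority("pedestrian", host_turning_right=True)
p2 = host_has_priority("vehicle", host_turning_right=True)
```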
 In step S1009, the control device 100a executes driving support based on the priority determined in step S1008. The specific processing here is described with reference to FIG. 12.
 FIG. 12 is a flowchart showing an example of the processing procedure of step S1009 in FIG. 11.
 In step S1201, it is determined whether a blind spot was detected in step S1006 of FIG. 11. If a blind spot was detected, the process proceeds to step S1202; otherwise, it proceeds to step S1205.
 In step S1202, it is determined whether the route (course) of the host vehicle generated in step S1005 of FIG. 11 intersects the course of the moving body predicted in step S1007. If they intersect, the process proceeds to step S1203; otherwise, it proceeds to step S1205.
 In step S1203, if the host vehicle was determined to have priority in step S1008 of FIG. 11, the process proceeds to step S1204; if the host vehicle was determined not to have priority, it proceeds to step S1206.
 In step S1204, as in the situation described with reference to FIG. 9, support is provided that raises the target brake fluid pressure in preparation for a moving body emerging from the blind spot.
 In step S1205, no blind-spot-related support is performed, normal traveling is maintained, and the series of processes of step S1009 in FIG. 11 ends.
 In step S1206, to give priority to a moving body that may emerge from the blind spot, as in the situations described with reference to FIG. 7 or FIG. 10, deceleration control is performed up to a predetermined position so that the vehicle can stop at any time, after which the series of processes of step S1009 in FIG. 11 ends.
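The branch structure of steps S1201 through S1206 condenses to three checks. A sketch returning the name of the selected support action (the pressure value and the deceleration policy themselves are left abstract):

```python
def blind_spot_support(blind_spot, paths_cross, host_priority):
    """Condensed form of the FIG. 12 branch structure (S1201-S1206)."""
    if not blind_spot:            # S1201: no blind spot detected
        return "normal_driving"   # S1205
    if not paths_cross:           # S1202: courses do not intersect
        return "normal_driving"   # S1205
    if host_priority:             # S1203: host vehicle has priority
        return "raise_target_brake_pressure"  # S1204: prepare for a dash-out
    return "decelerate_to_stoppable_state"    # S1206: yield to the mover

a1 = blind_spot_support(blind_spot=True, paths_cross=True, host_priority=False)
a2 = blind_spot_support(blind_spot=False, paths_cross=False, host_priority=False)
```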
 In step S1010, the control device 100a calculates a target steering angle and a target speed for traveling along the target route, from the target route and target speed generated in step S1005 and the target speed determined by the driving support of step S1009.
 In step S1011, the control device 100a calculates the target steering torque, target engine torque, and target brake pressure for realizing the target steering angle and target speed calculated in step S1010, and outputs them to the steering device 102, the driving device 103, and the braking device 104, respectively. The series of processes then ends, and the flow returns to step S1001.
 For example, the control parameter output to the steering device 102 may be a target steering torque for realizing the target steering angle, although, depending on the configuration of the steering device 102, the target steering angle may also be output directly. Likewise, the control parameters output to the driving device 103 and the braking device 104 may be a target engine torque and a target brake pressure for realizing the target speed; depending on the configuration of the driving device 103 and the braking device 104, the target speed may also be output directly.
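The longitudinal side of step S1011 can be sketched as a controller that splits the speed error into an engine-torque request or a brake-pressure request. A proportional controller is the simplest stand-in; the gain and the unit scaling are arbitrary assumptions, not values from the specification:

```python
def longitudinal_command(target_speed, current_speed, kp=0.8):
    """Split a speed error into an engine-torque or brake-pressure request.

    Positive error drives the engine, negative error the brake,
    never both at once.
    """
    error = target_speed - current_speed
    if error >= 0:
        return {"engine_torque": kp * error, "brake_pressure": 0.0}
    return {"engine_torque": 0.0, "brake_pressure": -kp * error}

# Host vehicle is 2 m/s too fast: only the brake channel is active.
cmd = longitudinal_command(target_speed=10.0, current_speed=12.0)
```

A configuration that accepts a target speed directly, as noted above, would simply bypass this split and forward `target_speed` to the device.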
 The above description is merely an example, and the present invention is neither limited nor constrained by the details of the above embodiment. For example, although the above embodiment assumes that the host vehicle is a passenger car, the present invention is also applicable to the automatic traveling of construction machines, robots, and the like.
100a Control device
200, 700, 800 Host vehicle
201 Oncoming vehicle
202, 203, 801 Vehicle
204, 205 Pedestrian
210, 702, 802 Target route
303, 704, 805 Blind spot area
304, 705, 806 Moving body
305, 706, 807 Moving body course

Claims (9)

  1.  A vehicle control device characterized by detecting a blind spot area that is a blind spot for a host vehicle, determining the relative priority between the course of a moving body that may appear from the blind spot area and the course of the host vehicle, and outputting a control signal for the host vehicle based on the determined priority.
  2.  A vehicle control device comprising: a surrounding environment recognition unit that recognizes the environment around a host vehicle; a road information acquisition unit that acquires road information; a blind spot detection unit that detects blind spots based on the host vehicle's surrounding environment and the road information; a target route generation unit that generates a target route along which the host vehicle travels, based on the surrounding environment and the road information; a moving body course prediction unit that, based on the road information and the blind spots, predicts moving bodies that may appear from the blind spots and predicts their courses; and a vehicle control unit that determines the priority between the target route of the host vehicle and the course of the moving body and outputs a control signal for the host vehicle based on the determined priority.
  3.  The vehicle control device according to claim 1 or 2, wherein
     the host vehicle is an autonomous driving vehicle that performs steering control and speed control so as to move along a preset target route, and
     during automatic driving, a blind spot area caused by the moving body is detected, a moving body that may appear from the blind spot area is predicted, the relative priority between the course of the moving body and the course of the host vehicle is determined, and a control signal for the host vehicle is output based on the determined priority.
  4.  The vehicle control device according to any one of claims 1 to 3, further comprising
     an external environment recognition device that recognizes the environment around the host vehicle, wherein
     the priority is determined based on road information, and the road information is acquired at least from map information or from information recognized by the external environment recognition device.
  5.  The vehicle control device according to any one of claims 1 to 4, wherein,
     when the moving body has a higher priority than the host vehicle, a control signal for the host vehicle is output so that the host vehicle does not collide with the moving body.
  6.  The vehicle control device according to claim 5, wherein
     the host vehicle is controlled so as not to collide with the moving body, and an occupant of the host vehicle is notified.
  7.  The vehicle control device according to any one of claims 1 to 6, further comprising
     a braking device that generates a braking force by controlling the pressure of hydraulic fluid, wherein,
     when the host vehicle has a higher priority than the moving body, the pressure of the hydraulic fluid of the braking device is raised to a predetermined value.
  8.  The vehicle control device according to any one of claims 1 to 7, wherein,
     when the moving body is a pedestrian, the priority of the pedestrian's course is set higher than the priority of the course of the host vehicle.
  9.  The vehicle control device according to any one of claims 1 to 8, wherein,
     when the host vehicle makes a right turn, the priority of the course of the host vehicle is set higher than the priority of the course of a straight-traveling vehicle.
PCT/JP2015/084845 2014-12-25 2015-12-11 Vehicle control device WO2016104198A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014261537A JP2016122308A (en) 2014-12-25 2014-12-25 Vehicle controller
JP2014-261537 2014-12-25

Publications (1)

Publication Number Publication Date
WO2016104198A1 true WO2016104198A1 (en) 2016-06-30

Family

ID=56150227

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/084845 WO2016104198A1 (en) 2014-12-25 2015-12-11 Vehicle control device

Country Status (2)

Country Link
JP (1) JP2016122308A (en)
WO (1) WO2016104198A1 (en)


Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6680170B2 (en) * 2016-09-30 2020-04-15 株式会社デンソー Driving support device and driving support method
WO2018066710A1 (en) * 2016-10-07 2018-04-12 アイシン・エィ・ダブリュ株式会社 Travel assistance device and computer program
JP6900775B2 (en) * 2017-05-12 2021-07-07 株式会社デンソー Vehicle automatic driving control system
CN107963077B (en) 2017-10-26 2020-02-21 东软集团股份有限公司 Control method, device and system for vehicle to pass through intersection
US20200398868A1 (en) * 2018-02-21 2020-12-24 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and program
JP7244215B2 (en) * 2018-05-07 2023-03-22 株式会社デンソーテン Object detection device
JP2019197467A (en) * 2018-05-11 2019-11-14 トヨタ自動車株式会社 Vehicle control device
JP7182376B2 (en) * 2018-05-14 2022-12-02 日産自動車株式会社 Driving support method and driving support device
JP7121361B2 (en) * 2018-06-29 2022-08-18 国立大学法人金沢大学 Autonomous mobile
JP7163729B2 (en) 2018-11-08 2022-11-01 トヨタ自動車株式会社 vehicle controller
US11505181B2 (en) * 2019-01-04 2022-11-22 Toyota Motor Engineering & Manufacturing North America, Inc. System, method, and computer-readable storage medium for vehicle collision avoidance on the highway
US10974732B2 (en) * 2019-01-04 2021-04-13 Toyota Motor Engineering & Manufacturing North America, Inc. System, method, and computer-readable storage medium for traffic intersection navigation
JP7180436B2 (en) * 2019-02-15 2022-11-30 株式会社デンソー Behavior control method and behavior control device
CN109765902B (en) 2019-02-22 2022-10-11 阿波罗智能技术(北京)有限公司 Unmanned vehicle driving reference line processing method and device and vehicle
JP7239353B2 (en) * 2019-03-12 2023-03-14 株式会社デンソー Braking support control device, braking support control system, and braking support control method for vehicle
DE102019108142A1 (en) * 2019-03-29 2020-10-01 Bayerische Motoren Werke Aktiengesellschaft Selecting an option for an automated motor vehicle
JP7145815B2 (en) * 2019-05-27 2022-10-03 日立Astemo株式会社 electronic controller
JP2021012467A (en) * 2019-07-04 2021-02-04 本田技研工業株式会社 Vehicle controller, method for controlling vehicle, and program
JP2021105909A (en) * 2019-12-27 2021-07-26 マツダ株式会社 Vehicle control device
JP2021117039A (en) * 2020-01-23 2021-08-10 アイシン・エィ・ダブリュ株式会社 Driving support device and computer program
JP7310667B2 (en) * 2020-03-17 2023-07-19 いすゞ自動車株式会社 warning device
JP7377359B2 (en) * 2020-06-29 2023-11-09 日立Astemo株式会社 Vehicle control device and vehicle control system
KR102385431B1 (en) * 2020-09-14 2022-04-13 주식회사 옐로나이프 The Method and Apparatus for Preventing Accident in Children's Safety Zone
CN115777121A (en) * 2020-09-16 2023-03-10 日立安斯泰莫株式会社 Driving support device
US11731661B2 (en) 2020-10-01 2023-08-22 Argo AI, LLC Systems and methods for imminent collision avoidance
US11358598B2 (en) 2020-10-01 2022-06-14 Argo AI, LLC Methods and systems for performing outlet inference by an autonomous vehicle to determine feasible paths through an intersection
US11618444B2 (en) 2020-10-01 2023-04-04 Argo AI, LLC Methods and systems for autonomous vehicle inference of routes for actors exhibiting unrecognized behavior
US20220105959A1 (en) * 2020-10-01 2022-04-07 Argo AI, LLC Methods and systems for predicting actions of an object by an autonomous vehicle to determine feasible paths through a conflicted area
JP2022147924A (en) 2021-03-24 2022-10-06 株式会社Subaru Drive support device
CN117616391A (en) * 2021-07-02 2024-02-27 株式会社电装 In-vehicle apparatus, control program, and startup method
CN115257728B (en) * 2022-10-08 2022-12-23 杭州速玛科技有限公司 Blind area risk area detection method for automatic driving

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006154967A (en) * 2004-11-25 2006-06-15 Nissan Motor Co Ltd Risk minimum locus generating device, and dangerous situation warning device using it
JP2006260217A (en) * 2005-03-17 2006-09-28 Advics:Kk Traveling support device for vehicle
JP2008307999A (en) * 2007-06-13 2008-12-25 Denso Corp Vehicular collision relaxing device
JP2011096009A (en) * 2009-10-29 2011-05-12 Fuji Heavy Ind Ltd Intersection driving support apparatus
JP2011194979A (en) * 2010-03-18 2011-10-06 Toyota Motor Corp Driving support device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003099898A (en) * 2001-09-25 2003-04-04 Nissan Motor Co Ltd Driver's future condition forecasting device
JP4578795B2 (en) * 2003-03-26 2010-11-10 富士通テン株式会社 Vehicle control device, vehicle control method, and vehicle control program
JP4327062B2 (en) * 2004-10-25 2009-09-09 三菱電機株式会社 Navigation device
JP4297045B2 (en) * 2004-12-14 2009-07-15 株式会社デンソー Display control apparatus and program for head-up display
JP2006242643A (en) * 2005-03-01 2006-09-14 Fujitsu Ten Ltd Navigation device
JP2008046766A (en) * 2006-08-11 2008-02-28 Denso Corp Vehicle external information display device
JP2009086788A (en) * 2007-09-28 2009-04-23 Hitachi Ltd Vehicle surrounding monitoring device
JP4814928B2 (en) * 2008-10-27 2011-11-16 三菱電機株式会社 Vehicle collision avoidance device


Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017138658A1 (en) * 2016-02-10 2017-08-17 株式会社デンソー Drive assistance device
CN110062722A (en) * 2016-12-14 2019-07-26 株式会社电装 Brake auxiliary device and braking householder method in vehicle
US11396289B2 (en) 2016-12-14 2022-07-26 Denso Corporation Braking assistance device and braking assistance method for vehicle
WO2018158911A1 (en) 2017-03-02 2018-09-07 日産自動車株式会社 Drive assistance method and drive assistance device
US10766492B2 (en) 2017-03-02 2020-09-08 Nissan Motor Co., Ltd. Driving assistance method and driving assistance device
KR20190113918A (en) 2017-03-02 2019-10-08 닛산 지도우샤 가부시키가이샤 Driving assistance method and driving assistance device
US10994730B2 (en) 2017-04-19 2021-05-04 Nissan Motor Co., Ltd. Traveling assistance method and traveling assistance device
JPWO2018193535A1 (en) * 2017-04-19 2020-05-14 日産自動車株式会社 Driving support method and driving support device
KR20190135042A (en) 2017-04-19 2019-12-05 닛산 지도우샤 가부시키가이샤 Driving support method and driving support device
WO2018193535A1 (en) 2017-04-19 2018-10-25 日産自動車株式会社 Travel assistance method and travel assistance device
RU2720226C1 (en) * 2017-04-19 2020-04-28 Ниссан Мотор Ко., Лтд. Method of assisting in movement and device for assisting in movement
WO2018216125A1 (en) 2017-05-24 2018-11-29 日産自動車株式会社 Travel assistance method for travel assistance device, and travel assistance device
US11069242B2 (en) 2017-05-24 2021-07-20 Nissan Motor Co., Ltd. Traveling assistance method of traveling assistance device and traveling assistance device
CN110709911B (en) * 2017-05-24 2022-01-11 日产自动车株式会社 Travel assist method for travel assist device and travel assist device
CN110709911A (en) * 2017-05-24 2020-01-17 日产自动车株式会社 Travel assist method for travel assist device and travel assist device
WO2019003603A1 (en) * 2017-06-29 2019-01-03 株式会社デンソー Collision prediction device and collision prediction method
JP2019012314A (en) * 2017-06-29 2019-01-24 株式会社デンソー Collision estimation device and collision estimation method
CN112714929A (en) * 2018-09-14 2021-04-27 松下电器产业株式会社 Pedestrian device, in-vehicle device, moving body guidance system, and moving body guidance method
US20210388578A1 (en) * 2019-03-14 2021-12-16 Hitachi Construction Machinery Co., Ltd. Construction machine
WO2020230523A1 (en) * 2019-05-16 2020-11-19 株式会社小糸製作所 Transportation system and transportation infrastructure
CN113826152A (en) * 2019-05-16 2021-12-21 株式会社小糸制作所 Traffic system and traffic infrastructure
CN114126940A (en) * 2019-09-18 2022-03-01 日立安斯泰莫株式会社 Electronic control device
US20220314968A1 (en) * 2019-09-18 2022-10-06 Hitachi Astemo, Ltd. Electronic control device

Also Published As

Publication number Publication date
JP2016122308A (en) 2016-07-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15872767

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15872767

Country of ref document: EP

Kind code of ref document: A1