WO2016104198A1 - Vehicle control device - Google Patents

Vehicle control device

Info

Publication number
WO2016104198A1
WO2016104198A1 (PCT/JP2015/084845)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
host vehicle
priority
moving body
blind spot
Prior art date
Application number
PCT/JP2015/084845
Other languages
English (en)
Japanese (ja)
Inventor
今井 正人
雅男 坂田
Original Assignee
クラリオン株式会社 (Clarion Co., Ltd.)
Application filed by クラリオン株式会社 (Clarion Co., Ltd.)
Publication of WO2016104198A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • The present invention relates to a vehicle control device.
  • In the prior art, a blind spot is detected from an image of the area ahead of the host vehicle, and the movement of an object popping out is predicted on the assumption that an object exists in the blind spot.
  • An object of the present invention is to provide a vehicle control device that can appropriately control the vehicle according to the situation when the host vehicle travels where blind spots exist.
  • The vehicle control device detects a blind spot area that is a blind spot for the host vehicle, determines the relative priority between the course of a moving body that may appear from the blind spot area and the course of the host vehicle, and outputs a control signal for the host vehicle based on the determined priority.
  • As a result, when the host vehicle travels where blind spots exist, the vehicle can be appropriately controlled according to the situation.
  • FIG. 1 is a diagram showing an embodiment of a vehicle control device according to the present invention.
  • FIG. 2 is a diagram illustrating an example of a priority determination table for left-hand traffic.
  • FIG. 3 is a diagram illustrating an example of a priority determination table for right-hand traffic.
  • FIG. 4 is a diagram illustrating a target route at a crossroad intersection.
  • FIG. 5 is a diagram illustrating blind spot detection at a crossroad intersection.
  • FIG. 6 is a diagram illustrating the position of the host vehicle at which the blind spot is eliminated at the crossroad intersection.
  • FIG. 7 is a diagram showing feature points on the target route when calculating the target speed at the crossroad intersection.
  • FIG. 8 is a diagram illustrating a calculation example of the target speed at the crossroad intersection.
  • FIG. 9 is a diagram illustrating an example in which the host vehicle 700 performs driving support in consideration of a blind spot on a side road while driving on a priority road.
  • FIG. 10 is a diagram illustrating an example in which driving support is performed in consideration of the blind spot of a stopped vehicle in the oncoming lane when the host vehicle 800 approaches a pedestrian crossing.
  • FIG. 11 is a flowchart for explaining the operation of the control device 100a.
  • FIG. 12 is a flowchart illustrating a driving support content determination process based on priority.
  • FIG. 1 is a block diagram showing a control device 100a as a vehicle control device and its peripheral devices.
  • A control device 100a illustrated in FIG. 1 is a computer that controls the host vehicle; by executing a program stored in a storage medium (not shown), it functions as a surrounding environment recognition unit 1, a blind spot detection unit 2, a road information acquisition unit 3, a target route generation unit 4, a priority determination unit 5, a moving body course prediction unit 6, and a vehicle control unit 7.
  • The control device 100a is connected to the steering device 102, the driving device 103, and the braking device 104 of the host vehicle, and to the external environment recognition device 101, the sound generator 105, the display device 106, and the automatic driving button 107 provided in the host vehicle.
  • The control device 100a is also connected to a CAN (not shown) of the host vehicle, and vehicle information such as the vehicle speed, steering angle, and yaw rate of the host vehicle is input via the CAN.
  • CAN: Controller Area Network
  • The external environment recognition device 101 acquires information on the surrounding environment of the host vehicle; for example, it consists of four in-vehicle cameras that respectively capture the surroundings ahead of, behind, to the right of, and to the left of the host vehicle.
  • Image data obtained by the external environment recognition device 101 is input to the control device 100a via a dedicated line or the like, either as analog data or after A/D conversion.
  • As the external environment recognition device 101, a radar that measures the distance to an object using millimeter waves or a laser, a sonar that measures the distance to an object using ultrasonic waves, or the like can also be used.
  • The external environment recognition device 101 outputs information such as the obtained distance to an object and the direction of the object to the control device 100a via a dedicated line or the like.
  • The steering device 102 consists of electric power steering, hydraulic power steering, or the like whose steering angle can be controlled by an electric or hydraulic actuator according to a drive command from the control device 100a.
  • The driving device 103 is an engine system whose engine torque can be controlled with an electronic throttle or the like according to a drive command from the control device 100a, or an electric powertrain system whose driving force can be controlled according to such a command.
  • The braking device 104 consists of an electric brake, a hydraulic brake, or the like whose braking force can be controlled by an electric or hydraulic actuator according to a drive command from the control device 100a.
  • the sound generator 105 includes a speaker or the like, and is used to output a warning or voice guidance to the driver.
  • the display device 106 includes a display such as a navigation device, a meter panel, a warning light, and the like. In addition to the operation screen of the control device 100a, the display device 106 displays a warning screen or the like that visually informs the driver that the vehicle is in danger of colliding with an obstacle.
  • The automatic driving button 107 is an operation member provided at a position where the driver can operate it; based on the driver's operation, it outputs a start signal for starting the operation of the control device 100a to the control device 100a.
  • A mode change switch may be provided on the automatic driving button 107 so that a plurality of travel modes, such as an automatic parking mode, a general-road automatic driving mode, and an expressway automatic driving mode, can be switched. The operation of the control device 100a may also be terminated by operating the automatic driving button 107 while the control device 100a is operating.
  • The automatic driving button 107 is installed as a switch in a place where the driver can easily operate it, such as around the steering wheel.
  • Alternatively, a button corresponding to the automatic driving button 107 may be displayed on the display device 106 so that the driver can operate it.
  • the image data obtained by the external environment recognition device 101 is input to the control device 100a.
  • The surrounding environment recognition unit 1 uses the image data input from the external environment recognition device 101 to detect the shapes and positions of objects around the host vehicle, such as stationary solid objects, moving bodies, road surface paint such as lane markings, and signs. It also has a function of determining whether the road surface is one on which the host vehicle can travel, by detecting unevenness of the road surface.
  • the stationary solid object is, for example, a parked vehicle, a wall, a guardrail, a pole, a pylon, a curb, a roadside tree, a car stop, and the like.
  • the moving body is, for example, a pedestrian, a bicycle, a motorcycle, or a vehicle.
  • the stationary solid object and the moving object are collectively referred to as an obstacle.
  • the shape and position of the object are detected using a pattern matching technique or other known techniques.
  • the position of the object is expressed, for example, using a coordinate system having an origin at the position of the in-vehicle camera that captures the front of the host vehicle.
  • Based on the detected shapes and positions of objects and the determination of whether the road surface is drivable, the surrounding environment recognition unit 1 detects, for example when traveling on a general road, the lane position in which the host vehicle can travel and a space in which it can turn at an intersection. In the case of a parking lot, it detects a space where the host vehicle can park, and the like.
  • the road information acquisition unit 3 acquires map data around the current vehicle position.
  • The acquired map data includes shape data close to the actual road shape expressed by polygons, polylines, etc.; traffic regulation information (speed limit, types of vehicles permitted to pass, etc.); lane classification (main line, overtaking lane, uphill lane, straight lane, left-turn lane, right-turn lane, etc.); and the presence or absence of traffic lights and signs (with position information where present).
  • The target route generation unit 4 generates a route for moving the host vehicle from its current position to the target position. It also calculates the target speed for traveling along the generated route, using information such as the speed limit from the map data, the curvature of the route, traffic lights, temporary stop positions, and the speed of the preceding vehicle. For example, when driving on a general road, the destination is set using a navigation device that holds map data, and a route is generated from information such as the positional relationship between the host vehicle and obstacles and the lane position while driving toward the destination. In the case of a parking lot, a target parking position for the host vehicle is set in the parking space from the positional relationship between the host vehicle and obstacles, and a route to that position is generated.
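The target-speed calculation above combines the map's speed limit with the curvature of the route. A minimal sketch of one such rule, assuming a lateral-acceleration comfort bound (the function name, parameters, and the bound `a_lat_max` are illustrative assumptions, not taken from the patent):

```python
import math

def target_speed(speed_limit_mps, curvature_per_m, a_lat_max=2.0):
    """Cap the route's speed limit by what the path curvature allows.

    From v^2 * kappa <= a_lat_max it follows that v <= sqrt(a_lat_max / kappa).
    The comfort bound a_lat_max (m/s^2) is an assumed value.
    """
    if curvature_per_m <= 1e-9:  # effectively straight segment
        return speed_limit_mps
    return min(speed_limit_mps, math.sqrt(a_lat_max / curvature_per_m))
```

On a tight curve (for example, curvature 0.05 per meter, i.e. a 20 m radius), the curvature term governs rather than the legal limit.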
  • The blind spot detection unit 2 detects areas that are blind spots from the host vehicle, using the obstacle position information recognized by the surrounding environment recognition unit 1 and the map data acquired by the road information acquisition unit 3. Due to the characteristics of the external environment recognition device 101, the far side of an obstacle recognized by the surrounding environment recognition unit 1 becomes a blind spot from the host vehicle, so obstacles that may be present in such blind spots must be considered. Here, in particular, blind spots in which an obstacle that could obstruct the course of the host vehicle may exist are extracted.
  • The moving body course prediction unit 6 estimates the moving body (an obstacle hidden in the blind spot) that may jump out from the blind spot detected by the blind spot detection unit 2, and predicts its course. For example, if the detected blind spot is on a road, a vehicle is assumed; if part of a pedestrian crossing is in the blind spot, it is assumed that a pedestrian or a bicycle may jump out, and the course is predicted accordingly.
  • the course of the mobile body can be rephrased as the planned travel path of the mobile body.
  • The priority of the course of a moving body can be associated with its planned travel path. The priority is determined from the priority of the moving body itself (moving-body-specific priority) and the priority of the path on which the moving body moves (path-specific priority). In some cases the moving-body-specific priority is dominant and by itself determines the priority of the course; in other cases the path-specific priority must be considered in addition to the moving-body-specific priority. For example, in the relationship between a pedestrian and the host vehicle, the moving-body-specific priority is dominant, so the course priority is always higher for the pedestrian than for the host vehicle. Between the host vehicle and another vehicle, however, both are vehicles, so the course priority cannot be determined from the moving-body-specific priority alone, and the priorities of the roads on which the host vehicle and the other vehicle are located must also be considered.
  • The estimation of the moving body that may jump out from a blind spot is performed based on the host vehicle position obtained using GPS or the like, the map data, and information from the surrounding environment recognition unit 1. For example, when detecting a pedestrian crossing, the control device 100a estimates that a pedestrian or a bicycle may be present; when detecting an intersection, it estimates the presence of a vehicle in addition to pedestrians and bicycles.
  • The priority determination unit 5 determines which course has priority when the course (target route) of the host vehicle generated by the target route generation unit 4 and the course of the moving body predicted by the moving body course prediction unit 6 intersect.
  • FIG. 2 shows an example of a priority determination table used by the priority determination unit 5.
  • the priority determination table of FIG. 2 is a case of left-hand traffic, and shows the priority of the route of the opponent vehicle (horizontal axis) with respect to the route of the own vehicle (vertical axis).
  • FIG. 2A is a priority determination table in a case where the host vehicle and the opponent vehicle pass the same road in the opposite direction and enter the intersection.
  • FIG. 2B is a priority determination table when an opponent vehicle enters the intersection from the intersection road on the right side of the own vehicle at an intersection where there is no traffic signal.
  • FIG. 2C is a priority determination table in a case where an opponent vehicle enters an intersection on the right intersection road with respect to the own vehicle at an intersection with a traffic light.
  • FIG. 2D is a priority determination table when the opponent vehicle enters the intersection from the intersection road on the left side with respect to the own vehicle at the intersection where there is no traffic signal.
  • FIG. 2E is a priority determination table in a case where an opponent vehicle enters an intersection on the left intersection road with respect to the own vehicle at an intersection with a traffic light.
  • FIG. 2A shows a case where the host vehicle and the opponent vehicle enter the intersection while passing the same road in the opposite direction.
  • For some course combinations, the priority of the host vehicle becomes higher than that of the other vehicle.
  • Where the two courses do not cross, no priority relationship is established, which is represented by "-" in the priority determination table.
  • When the host vehicle and the other vehicle travel on the same road in opposite directions, the priority is determined only by the priority of the moving path itself (the path-specific priority).
  • FIG. 3 shows a priority determination table in the case of right-hand traffic.
  • The scene settings in FIGS. 3A to 3E are the same as those in FIGS. 2A to 2E. In the scene of FIG. 3A, that is, when the host vehicle and the other vehicle travel on the same road in opposite directions and enter the intersection, the priority determination table is the same as for left-hand traffic.
  • Details of FIGS. 3B to 3E are omitted, but their determination contents partially differ from those of FIGS. 2B to 2E for left-hand traffic. Specifically, in FIGS. 3(b) to 3(e), the determination contents differ for the cells indicated by shaded or gray backgrounds.
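A priority-determination table such as those in FIGS. 2 and 3 can be encoded as a lookup keyed by the two courses. The cell values below are illustrative placeholders only (the patent's actual figure contents are not reproduced here); under left-hand traffic on the same road in opposite directions, a straight-going vehicle would take priority over an oncoming right-turner:

```python
# Hypothetical encoding of a table like FIG. 2(a): keys are
# (host-vehicle course, other-vehicle course); values say whose course
# has priority ("host", "other"), with "-" where no relation holds.
# Cell contents are illustrative, not copied from the patent figures.
PRIORITY_TABLE_LH_OPPOSING = {
    ("straight", "straight"): "-",       # courses do not cross
    ("straight", "right_turn"): "host",  # oncoming right-turner yields
    ("right_turn", "straight"): "other",
    ("left_turn", "straight"): "-",
    ("right_turn", "right_turn"): "-",
}

def judge_priority(host_course, other_course, table):
    """Return which course has priority, or '-' if no relation applies."""
    return table.get((host_course, other_course), "-")
```

Right-hand traffic would simply swap in a different table for the scenes where the determinations differ, as L136 describes.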
  • Map data and road traffic law information can be stored in a navigation device mounted on the vehicle.
  • In a communication navigation system in which the in-vehicle navigation device communicates with a data center, a method of using the latest map data and road traffic law information held by the data center is also conceivable.
  • the vehicle control unit 7 controls the host vehicle along the target route generated by the target route generation unit 4.
  • The vehicle control unit 7 calculates a target steering angle and a target speed based on the target route.
  • The target steering angle and target speed are calculated so that the host vehicle does not collide with obstacles.
  • As needed, the target speed is changed or a command value for the target brake pressure is calculated.
  • the vehicle control unit 7 then outputs a target steering torque for realizing the target steering angle to the steering device 102. Further, the vehicle control unit 7 outputs a target engine torque and a target brake pressure for realizing the target speed to the driving device 103 and the braking device 104.
  • In addition, the driving support content and the like are output to the sound generator 105 and the display device 106.
  • the surrounding environment recognition unit 1 recognizes the oncoming vehicle 201, the pedestrian 204, and the pedestrian 205 by a known method (pattern matching method or the like) using the image data input from the external environment recognition device 101. Then, the target route generation unit 4 calculates a route 210 for turning right at the intersection so as not to collide with these obstacles recognized by the surrounding environment recognition unit 1.
  • The route 210 calculated by the target route generation unit 4 can be divided into straight sections and a turning section: for example, a straight section until entering the intersection, a turning section within the intersection, and a straight section after passing through the intersection.
  • the target route generation unit 4 represents a straight section route by a straight line, and approximates a turn section route by combining a clothoid curve and an arc.
  • The clothoid curve represents the trajectory drawn by the host vehicle when the speed of the host vehicle 200 is held constant and its steering angle is changed at a constant angular velocity.
  • The circular arc represents the trajectory drawn when the host vehicle is driven with its speed held constant and its steering angle fixed at a predetermined value (excluding the steering angle at which the vehicle goes straight).
  • Route calculation methods usable by the target route generation unit 4 also include approximation using spline curves, polynomials, and the like, and are not limited to the above.
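The clothoid portion of the turning section, whose curvature grows linearly with distance traveled (constant steering rate at constant speed), can be sampled numerically. A sketch under assumed names, using simple Euler integration rather than any method the patent specifies:

```python
import math

def sample_clothoid(s_end, kappa_rate, step=0.5):
    """Sample (x, y) points of a clothoid starting at the origin heading +x.

    Curvature grows linearly with arc length: kappa(s) = kappa_rate * s,
    which is the path traced at constant speed while the steering angle
    changes at a constant rate. Integrated with a crude Euler scheme;
    adequate for a sketch, not for control.
    """
    x = y = theta = s = 0.0
    pts = [(x, y)]
    while s < s_end:
        ds = min(step, s_end - s)
        theta += kappa_rate * s * ds   # heading change = curvature * ds
        x += math.cos(theta) * ds
        y += math.sin(theta) * ds
        s += ds
        pts.append((x, y))
    return pts
```

With `kappa_rate = 0` this degenerates to the straight-line section; a following arc section would continue with the curvature frozen at its final value.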
  • the target route generation unit 4 calculates the target route of the host vehicle and calculates the target speed when traveling on the target route.
  • The target speed of the host vehicle is set to a slow speed within the intersection (for example, the section between the pedestrian crossings where there are no lane markings) and to the speed limit in the other sections, and the changeover between the slow speed and the speed limit is connected so that acceleration and deceleration are smooth.
  • Blind spot detection by the blind spot detection unit 2 extracts blind spots in which an obstacle obstructing the course of the host vehicle may exist, and the moving body course prediction unit 6 predicts the course of an obstacle that may jump out of the blind spot. The priority between the courses of the host vehicle and the obstacle is then determined, and the driving support content is determined based on that priority.
  • The blind spot detection unit 2 uses the road information acquisition unit 3 to acquire map data around the host vehicle 200 (the area around the intersection of FIG. 5), and arranges on it the oncoming vehicle 201, the vehicle 202, and the vehicle 203 recognized by the surrounding environment recognition unit 1.
  • Due to the characteristics of the external environment recognition device 101, the far side of a recognized obstacle becomes a blind spot from the host vehicle. Assuming the camera that recognizes the area ahead of the host vehicle 200 is installed at the center of its front end, a line 301 is drawn from the front center of the host vehicle 200 toward the right ends of the oncoming vehicle 201, the vehicle 202, and the vehicle 203, and a line 302 toward their left ends.
  • The region 303 on the opposite lane on the far side of lines 301 and 302 is determined to be a blind spot and extracted. By arranging the host vehicle and the obstacles on the map data in this way, blind spots in which an obstacle obstructing the course of the host vehicle may exist can be extracted.
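The extraction of region 303 from lines 301 and 302 amounts to a 2D shadow computation: rays from the camera through an obstacle's outermost corners bound the occluded sector, and anything beyond the obstacle inside that sector is unobservable. A simplified sketch (the function names, the single-obstacle sector model, and the range test are assumptions for illustration, not the patent's algorithm):

```python
import math

def occluded_sector(camera_xy, obstacle_corners):
    """Bearing interval (min, max) hidden behind an obstacle.

    Rays from the front-camera position through the obstacle's corners
    (lines 301/302 in FIG. 5) bound the blind-spot sector. Angles in
    radians; this simple min/max form ignores wrap-around at +/-pi.
    """
    cx, cy = camera_xy
    bearings = [math.atan2(y - cy, x - cx) for x, y in obstacle_corners]
    return min(bearings), max(bearings)

def is_hidden(camera_xy, obstacle_corners, obstacle_range, point_xy):
    """True if the point lies inside the sector and beyond the obstacle."""
    lo, hi = occluded_sector(camera_xy, obstacle_corners)
    cx, cy = camera_xy
    px, py = point_xy
    bearing = math.atan2(py - cy, px - cx)
    dist = math.hypot(px - cx, py - cy)
    return lo <= bearing <= hi and dist > obstacle_range
```

Running this per recognized obstacle and intersecting the hidden regions with the opposite lane on the map would yield a region like 303.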
  • the moving body course prediction unit 6 selects an assumed moving body having a high traveling speed.
  • Since the blind spot 303 detected by the blind spot detection unit 2 is on the opposite lane, the assumed moving bodies include four-wheeled vehicles (passenger cars, trucks, etc.), two-wheeled vehicles, bicycles, and the like; here the fastest of these, a four-wheeled vehicle, is selected as the moving body 304.
  • Among the courses that the moving body 304 can take, the fastest course is selected as the moving body course; here the straight path 305 is selected.
  • The 'course' here also includes the meaning of actions (going straight or turning right/left at an intersection, going straight on a priority road, going straight on a non-priority road, going straight on a straight road, turning right/left off the road into a parking area, etc.).
  • The priority between the host vehicle's course (target route) and the moving body course can be determined by the priority determination unit 5 based on the map data acquired by the road information acquisition unit 3 and the road traffic law. For example, in the case of FIG. 5, the course of the host vehicle 200 (target route 210) turns right at the intersection while the course 305 of the moving body 304 goes straight through it, so it is determined that the priority of the moving body 304 going straight through the intersection is higher.
  • The vehicle control unit 7 performs speed control ignoring the moving body when the priority of the host vehicle is higher than that of the moving body, and performs driving support control for avoiding a collision with the moving body when the priority of the host vehicle is lower.
  • Here, since the priority determination unit 5 determines that the priority of the moving body 304 is higher than that of the host vehicle 200, the vehicle control unit 7 performs driving support so that the host vehicle 200 avoids a collision with the moving body 304.
  • Specifically, since the speed of the host vehicle 200 need only be controlled so that it does not interfere with the path 305 of the moving body 304, the host vehicle 200 moves forward at a slow speed to the position shown in FIG. 6; when the blind spot 303 detected in the state of FIG. 5 disappears and it can be confirmed that there is no oncoming vehicle, the vehicle returns to its original speed and passes through the intersection.
  • The situation in FIG. 7 is the same as that described with reference to FIGS. 4 to 6.
  • If the blind spot were not considered, the vehicle would pass through the intersection at the target speed shown in FIG. 8(a); considering the blind spot, the target speed in FIG. 8(b) is calculated and speed control is performed accordingly.
  • In FIGS. 7 and 8, point B is the point where the blind spot is predicted to disappear when the host vehicle 200 is at point A, and point C is the point where the blind spot actually disappears.
  • The vehicle decelerates to the intersection passing speed V1, and further decelerates by point B to the speed V2 for avoiding a collision with the moving body 304; when the blind spot disappears at point C, it accelerates again to the intersection passing speed V1, and after passing point D it accelerates again.
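The FIG. 8-style profile over points A to D can be sketched as a piecewise target speed over distance along the route. The function and parameter names are illustrative, and the deceleration/acceleration ramps between levels are omitted for brevity:

```python
def target_speed_profile(s, sA, sB, sC, sD, v0, v1, v2):
    """Piecewise target speed at distance s along the route (FIG. 8 style).

    Before point A: cruise speed v0; A..B: intersection passing speed v1;
    B..C: cautious speed v2 while the blind spot persists; C..D: back to
    v1 through the rest of the intersection; after D: back to v0.
    """
    if s < sA:
        return v0
    if s < sB:
        return v1
    if s < sC:
        return v2
    if s < sD:
        return v1
    return v0
```

In the FIG. 8(a) case without blind-spot consideration, the middle v2 level would simply be absent.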
  • As described above, the priority between the course of the host vehicle and the course of the moving body is determined based on the map data, and the driving support content is determined based on the determined priority, so that appropriate driving support according to the surrounding situation is possible and safety is improved.
  • In this example, the area that becomes a blind spot due to the oncoming right-turn-waiting vehicle is extracted using the map data information, together with the course of the host vehicle (turning right at the intersection).
  • FIG. 9 shows a scene in which the host vehicle 700 goes straight on a priority road, with a side road connected on the left side in the traveling direction of the host vehicle 700. The boundary 701 between the road on which the host vehicle 700 is traveling and the side road is blocked by a wall or a building, so the side road is a blind spot from the position of the host vehicle 700 in FIG. 9.
  • the surrounding environment recognition unit 1 recognizes the boundary 701 by a known method (pattern matching method or the like) using the image data input from the external environment recognition device 101. Then, the target route generation unit 4 calculates the route 702 as a target route that travels in the center position of the lane because there is no obstacle on the travel lane of the host vehicle 700. Further, the target route generation unit 4 sets a road speed limit V1 as the target speed of the host vehicle.
  • The blind spot detection unit 2 acquires map data around the host vehicle 700 via the road information acquisition unit 3. Using the information on the boundary 701 recognized by the surrounding environment recognition unit 1, and assuming the camera that recognizes the area ahead of the host vehicle 700 is installed at the center of its front end, a blind spot area 704 on the side road is extracted from a line 703 extending from the center of the front end of the host vehicle 700 toward the location where the boundary 701 is interrupted.
  • The moving body course prediction unit 6 assumes that the moving body that may jump out from the blind spot 704 is a four-wheeled vehicle 705, and calculates the straight path 706 as the moving body course.
  • the priority determination unit 5 determines the priority of the own vehicle path (target path) and the moving body path.
  • The course of the host vehicle 700 (target route 702) goes straight on the priority road, and the course 706 of the moving body 705 goes straight on the non-priority road, so it is determined that the priority of the host vehicle 700 going straight on the priority road is higher.
  • Next, the driving support content is determined based on the priority determined by the priority determination unit 5.
  • Since the priority of the host vehicle 700 is higher than that of the moving body 705, the vehicle control unit 7 maintains the target speed, but in preparation for the moving body 705 jumping out, it raises the target brake hydraulic pressure to B1 from point A (the point where the blind spot 704 is extracted, or a predetermined distance before the side road in FIG. 9). By doing so, the time until braking starts can be shortened if the moving body 705 does jump out. When the vehicle reaches point C, where the blind spot disappears, the target brake hydraulic pressure is restored.
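The brake pre-fill behavior between point A and point C can be sketched as a simple distance-gated pressure target. All names and numeric values here are illustrative assumptions, not values from the patent:

```python
def target_brake_pressure(dist_to_blind_spot_m, base_pa, prefill_pa,
                          prefill_window_m=20.0):
    """Hold an elevated target brake pressure while a blind spot is near.

    Within prefill_window_m of the blind spot (between points A and C in
    FIG. 9) the target hydraulic pressure is raised to prefill_pa (the
    "B1" level in the text) so the actuator responds sooner if a moving
    body appears; outside the window the base pressure is restored.
    """
    if 0.0 <= dist_to_blind_spot_m <= prefill_window_m:
        return max(base_pa, prefill_pa)
    return base_pa
```

The point of pre-filling is latency, not deceleration: the vehicle keeps its target speed, but the hydraulic system is already pressurized when an emergency stop is commanded.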
  • FIG. 10A is a scene in which the host vehicle 800 travels straight on a single road. A stop vehicle 801 exists in the oncoming lane, and a pedestrian crossing exists behind the stop vehicle 801.
  • the surrounding environment recognition unit 1 recognizes the stopped vehicle 801 by a known method (pattern matching method or the like) using the image data input from the external environment recognition device 101. Then, the target route generation unit 4 calculates a route 802 as a target route that travels in the lane center position because there is no obstacle on the travel lane of the host vehicle 800. Further, the target route generation unit 4 sets a road speed limit V1 as the target speed of the host vehicle.
  • The blind spot detection unit 2 acquires map data around the host vehicle 800 via the road information acquisition unit 3. Using the information on the stopped vehicle 801 recognized by the surrounding environment recognition unit 1, and assuming the camera that recognizes the area ahead of the host vehicle 800 is installed at the center of its front end, a blind spot region 805 is extracted from lines 803 and 804 extending from the center of the front end of the host vehicle 800 toward the left and right ends of the stopped vehicle 801.
  • The moving body course prediction unit 6 assumes that the moving body that may jump out from the blind spot 805 is a pedestrian 806, and calculates the straight path 807 as the moving body course.
  • the priority determination unit 5 determines the priority of the own vehicle path (target path) and the moving body path.
  • The course of the host vehicle 800 (target route 802) goes straight on the priority road, and the course 807 of the moving body 806 crosses the pedestrian crossing, so it is determined that the priority of the moving body 806 is higher.
  • Next, the driving support content is determined based on the priority determined by the priority determination unit 5.
  • Since the priority determination unit 5 determines that the priority of the moving body 806 is higher than that of the host vehicle 800, the vehicle control unit 7 performs driving support for avoiding a collision with the moving body 806. Specifically, deceleration control is first performed down to a speed V2 at which the vehicle can stop before the pedestrian crossing; then, as shown in FIG. 10(b), after the host vehicle 800 travels forward and it is confirmed that there are no obstacles such as pedestrians around the pedestrian crossing, the original speed V1 is restored.
  • FIG. 11 is a flowchart showing an example of the processing procedure of the control device 100a.
  • When automatic driving is started by operating the automatic driving button 107, the control program shown in the flowchart of FIG. 11 is executed.
  • the process of FIG. 11 is repeatedly executed at predetermined time intervals (for example, 0.01 second intervals).
  • When an operation for stopping the automatic driving is performed (for example, operating the automatic driving button 107 again), the repeated execution is stopped and the automatic driving ends.
  • In step S1001, the control device 100a determines whether the current state is automatic driving, based on the operation of the automatic driving button 107. If so, the control device 100a proceeds to step S1002; otherwise, it skips the series of processes that follows.
  • In step S1002, the control device 100a starts capturing image data from the external environment recognition device 101. Thereafter, image data is captured from the external environment recognition device 101 every frame.
  • In step S1003, the control device 100a inputs the image data captured in step S1002 to the surrounding environment recognition unit 1 and detects the shape and position of objects around the host vehicle, such as stationary solid objects, moving bodies, road surface paint such as parking frame lines, and signs. In addition, based on the detected shape and position information and on the determination of whether a surface is a road surface on which the host vehicle can travel, it detects the space in which the host vehicle can travel when, for example, driving on a general road; in the case of a parking lot, it detects a space where the host vehicle can park, a space through which it can travel, and the like.
  • step S1004 the control device 100a acquires map data around the current vehicle position.
  • The acquired map data includes shape data approximating the actual road shape, expressed by polygons, polylines, and the like; traffic regulation information (speed limit, types of vehicles allowed to pass, etc.); lane classification (main line, overtaking lane, uphill lane, straight lane, left-turn lane, right-turn lane, etc.); and the presence or absence of traffic lights, signs, and the like (with position information where present).
  • In step S1005, the control device 100a generates a target route along which the host vehicle moves from the current host vehicle position to the target position, based on the information on the surrounding environment detected in step S1003 and the map data acquired in step S1004. It also calculates the target speed for traveling along the generated route, using information such as the speed limit in the map data, the curvature of the route, traffic lights, temporary stop positions, and the speed of the preceding vehicle. For example, when driving on a general road, the destination is set using a navigation device having map data, and the route is generated from information such as the positional relationship between the host vehicle and obstacles on the way to the destination and the lane position. In the case of a parking lot, a target parking position for the host vehicle is set in the parking space from the positional relationship between the host vehicle and obstacles, and a route to that position is generated.
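The combination of map speed limit, route curvature, and preceding-vehicle speed described for step S1005 might be sketched as taking the minimum of the individual caps. The lateral-acceleration limit and all names here are illustrative assumptions; the patent does not specify the formula.

```python
import math

def curve_speed_cap(curvature, max_lateral_accel=2.0):
    """Speed cap implied by route curvature: a_lat = v^2 * kappa, hence
    v = sqrt(a_lat / kappa). max_lateral_accel is an assumed comfort limit."""
    if curvature <= 1e-9:
        return float("inf")  # straight road imposes no curvature cap
    return math.sqrt(max_lateral_accel / curvature)

def target_speed(speed_limit, curvature, preceding_speed=None):
    """Combine the map speed limit, the curvature cap, and (if present)
    the preceding vehicle's speed into a single target speed."""
    v = min(speed_limit, curve_speed_cap(curvature))
    if preceding_speed is not None:
        v = min(v, preceding_speed)
    return v
```

Temporary stop positions and traffic lights would add further, position-dependent caps on top of this in the same min-of-constraints style.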
  • In step S1006, the control device 100a detects areas that are blind spots from the host vehicle, based on the surrounding environment detected in step S1003 and the map data acquired in step S1004. Because of the characteristics of the external environment recognition device 101, the far side of an obstacle recognized by the surrounding environment recognition unit 1 becomes a blind spot from the host vehicle, so it is necessary to consider moving bodies that may be present in that blind spot. Here, in particular, blind spots in which an obstacle that could obstruct the course of the host vehicle may exist are extracted.
  • In step S1007, the control device 100a assumes a moving body that may jump out from the blind spot detected in step S1006 and predicts its course. For example, if the detected blind spot is on a road, a vehicle is assumed; if part of a pedestrian crossing is in the blind spot, a pedestrian or a bicycle is assumed to jump out, and its course is predicted.
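The two examples given for step S1007 amount to a mapping from the map attribute of the occluded area to the moving body assumed to emerge from it. The area-type labels and function name below are hypothetical.

```python
def assumed_moving_body(blind_spot_area_type):
    """Hypothetical mapping from the map attribute of an occluded area
    to the moving body assumed to jump out of it (step S1007)."""
    if blind_spot_area_type == "road":
        return "vehicle"
    if blind_spot_area_type == "pedestrian_crossing":
        return "pedestrian_or_bicycle"
    return None  # no jump-out assumed for other area types
```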
  • In step S1008, the control device 100a determines which has priority when the route of the host vehicle generated in step S1005 and the course of the moving body predicted in step S1007 intersect.
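Whether the host route and the predicted course intersect can be checked, for straight-line approximations of both, with a standard 2-D segment intersection test. This is an illustrative stand-in; the patent does not specify the geometric method.

```python
def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 properly crosses segment q1-q2, used here as
    a stand-in for the intersection check between the host route and the
    predicted moving-body course. Collinear touching is not counted."""
    def orient(a, b, c):
        # signed area of triangle (a, b, c): >0 left turn, <0 right turn
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    d1, d2 = orient(q1, q2, p1), orient(q1, q2, p2)
    d3, d4 = orient(p1, p2, q1), orient(p1, p2, q2)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))
```

A curved route would be approximated by a polyline and the test applied piecewise to each pair of segments.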
  • In step S1009, the control device 100a executes driving support based on the priority determined in step S1008. The specific processing here is described with reference to FIG. 12.
  • FIG. 12 is a flowchart showing an example of the processing procedure of step S1009 of FIG.
  • In step S1201, it is determined whether a blind spot was detected in step S1006 of FIG. 11. If a blind spot was detected, the process proceeds to step S1202; otherwise, the process proceeds to step S1205.
  • In step S1202, it is determined whether the route of the host vehicle generated in step S1005 of FIG. 11 and the course of the moving body predicted in step S1007 intersect. If they intersect, the process proceeds to step S1203; if not, the process proceeds to step S1205.
  • In step S1203, if it was determined in step S1008 of FIG. 11 that the host vehicle has priority, the process proceeds to step S1204; if not, the process proceeds to step S1206.
  • In step S1204, as in the situation described with reference to FIG. 9, the target brake fluid pressure is increased in preparation for a moving body jumping out.
  • In step S1205, support considering the blind spot is not performed, normal traveling is maintained, and the series of processes of step S1009 in FIG. 11 ends.
  • In step S1206, as in the situations described with reference to FIG. 7 or FIG. 10, priority is given to the moving body that may jump out of the blind spot, so deceleration control toward a predetermined position is performed while keeping the vehicle ready to stop at any time.
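The branching of steps S1201 through S1206 can be summarized in a small decision function. The three return labels are hypothetical names for the support modes, not terms from the patent.

```python
def decide_driving_support(blind_spot_detected, paths_intersect, host_has_priority):
    """Decision logic of FIG. 12 (steps S1201-S1206), sketched with
    hypothetical mode labels."""
    if not blind_spot_detected:
        return "normal_travel"            # S1201 -> S1205
    if not paths_intersect:
        return "normal_travel"            # S1202 -> S1205
    if host_has_priority:
        return "precharge_brake"          # S1204: raise target brake fluid pressure
    return "decelerate_and_prepare_stop"  # S1206: slow down, be ready to stop
```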
  • In step S1010, the control device 100a calculates a target steering angle and a target speed for traveling along the target route, from the target route and target speed generated in step S1005 and the driving support content determined in step S1009.
  • In step S1011, the control device 100a calculates a target steering torque, a target engine torque, and a target brake pressure for realizing the target steering angle and target speed calculated in step S1010, and outputs them to the steering device 102, the driving device 103, and the braking device 104. The series of processes then ends, and the procedure returns to step S1001.
  • The control parameter output to the steering device 102 is a target steering torque for realizing the target steering angle, but depending on the configuration of the steering device 102, the target steering angle can also be output directly.
  • Similarly, the control parameters output to the driving device 103 and the braking device 104 are a target engine torque and a target brake pressure for realizing the target speed, but depending on their configuration, the target speed can also be output directly.
  • Reference numerals: 100a Control device; 200, 700, 800 Host vehicle; 201 Oncoming vehicle; 202, 203, 801 Stopped vehicle; 204, 205 Pedestrian; 210, 702, 802 Target route; 303, 704, 805 Blind spot region; 304, 705, 806 Moving body; 305, 706, 807 Moving-body course

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The object of the present invention is to provide a vehicle control device capable of controlling a host vehicle appropriately according to the conditions when the host vehicle travels under conditions in which a blind spot exists. The present vehicle control device 100a is characterized by: detecting a blind spot region 303 that constitutes a blind spot for the host vehicle 200; determining the relative priorities of the route 210 of the host vehicle 200 and the course 305 of a moving body 304 that may potentially emerge from the blind spot region 303; and outputting a control signal for the host vehicle 200 based on the determined priorities.
PCT/JP2015/084845 2014-12-25 2015-12-11 Dispositif de commande de véhicule WO2016104198A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014261537A JP2016122308A (ja) 2014-12-25 2014-12-25 車両制御装置
JP2014-261537 2014-12-25

Publications (1)

Publication Number Publication Date
WO2016104198A1 true WO2016104198A1 (fr) 2016-06-30

Family

ID=56150227

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/084845 WO2016104198A1 (fr) 2014-12-25 2015-12-11 Dispositif de commande de véhicule

Country Status (2)

Country Link
JP (1) JP2016122308A (fr)
WO (1) WO2016104198A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017138658A1 (fr) * 2016-02-10 2017-08-17 株式会社デンソー Dispositif d'aide à la conduite
WO2018158911A1 (fr) 2017-03-02 2018-09-07 日産自動車株式会社 Dispositif et procédé d'aide à la conduite
WO2018193535A1 (fr) 2017-04-19 2018-10-25 日産自動車株式会社 Procédé d'aide au déplacement et dispositif de commande de déplacement
WO2018216125A1 (fr) 2017-05-24 2018-11-29 日産自動車株式会社 Procédé d'aide au déplacement pour dispositif d'aide au déplacement, et dispositif d'aide au déplacement
WO2019003603A1 (fr) * 2017-06-29 2019-01-03 株式会社デンソー Dispositif et procédé de prédiction de collision
CN110062722A (zh) * 2016-12-14 2019-07-26 株式会社电装 车辆中的制动辅助装置以及制动辅助方法
WO2020230523A1 (fr) * 2019-05-16 2020-11-19 株式会社小糸製作所 Système de transport et infrastructure de transport
CN112714929A (zh) * 2018-09-14 2021-04-27 松下电器产业株式会社 步行者装置、车载装置、移动体引导系统以及移动体引导方法
US20210388578A1 (en) * 2019-03-14 2021-12-16 Hitachi Construction Machinery Co., Ltd. Construction machine
CN114126940A (zh) * 2019-09-18 2022-03-01 日立安斯泰莫株式会社 电子控制装置

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6680170B2 (ja) * 2016-09-30 2020-04-15 株式会社デンソー 運転支援装置及び運転支援方法
WO2018066711A1 (fr) * 2016-10-07 2018-04-12 アイシン・エィ・ダブリュ株式会社 Dispositif d'aide aux déplacements et programme informatique
JP6900775B2 (ja) * 2017-05-12 2021-07-07 株式会社デンソー 車両の自動運転制御システム
CN107963077B (zh) * 2017-10-26 2020-02-21 东软集团股份有限公司 一种车辆通过路口的控制方法、装置及系统
US20200398868A1 (en) * 2018-02-21 2020-12-24 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and program
JP7244215B2 (ja) * 2018-05-07 2023-03-22 株式会社デンソーテン 物体検知装置
JP2019197467A (ja) * 2018-05-11 2019-11-14 トヨタ自動車株式会社 車両制御装置
JP7182376B2 (ja) * 2018-05-14 2022-12-02 日産自動車株式会社 運転支援方法及び運転支援装置
JP7121361B2 (ja) 2018-06-29 2022-08-18 国立大学法人金沢大学 自律移動体
JP7163729B2 (ja) 2018-11-08 2022-11-01 トヨタ自動車株式会社 車両制御装置
US11505181B2 (en) * 2019-01-04 2022-11-22 Toyota Motor Engineering & Manufacturing North America, Inc. System, method, and computer-readable storage medium for vehicle collision avoidance on the highway
US10974732B2 (en) * 2019-01-04 2021-04-13 Toyota Motor Engineering & Manufacturing North America, Inc. System, method, and computer-readable storage medium for traffic intersection navigation
JP7180436B2 (ja) * 2019-02-15 2022-11-30 株式会社デンソー 行動制御方法、及び行動制御装置
CN109765902B (zh) 2019-02-22 2022-10-11 阿波罗智能技术(北京)有限公司 无人车驾驶参考线处理方法、装置及车辆
JP7239353B2 (ja) * 2019-03-12 2023-03-14 株式会社デンソー 車両における制動支援制御装置、制動支援制御システムおよび制動支援制御方法
DE102019108142A1 (de) * 2019-03-29 2020-10-01 Bayerische Motoren Werke Aktiengesellschaft Auswählen einer Handlungsoption für ein automatisiertes Kraftfahrzeug
JP7145815B2 (ja) 2019-05-27 2022-10-03 日立Astemo株式会社 電子制御装置
JP2021012467A (ja) * 2019-07-04 2021-02-04 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
JP2021105909A (ja) * 2019-12-27 2021-07-26 マツダ株式会社 車両制御装置
JP2021117039A (ja) * 2020-01-23 2021-08-10 アイシン・エィ・ダブリュ株式会社 運転支援装置及びコンピュータプログラム
JP7310667B2 (ja) * 2020-03-17 2023-07-19 いすゞ自動車株式会社 警告装置
WO2022004042A1 (fr) * 2020-06-29 2022-01-06 日立Astemo株式会社 Dispositif de commande de véhicule et système de commande de véhicule
KR102385431B1 (ko) * 2020-09-14 2022-04-13 주식회사 옐로나이프 어린이 보호구역 내 사고를 방지하기 위한 방법 및 장치
JPWO2022059352A1 (fr) * 2020-09-16 2022-03-24
US11358598B2 (en) 2020-10-01 2022-06-14 Argo AI, LLC Methods and systems for performing outlet inference by an autonomous vehicle to determine feasible paths through an intersection
US11618444B2 (en) 2020-10-01 2023-04-04 Argo AI, LLC Methods and systems for autonomous vehicle inference of routes for actors exhibiting unrecognized behavior
US11731661B2 (en) 2020-10-01 2023-08-22 Argo AI, LLC Systems and methods for imminent collision avoidance
US20220105959A1 (en) * 2020-10-01 2022-04-07 Argo AI, LLC Methods and systems for predicting actions of an object by an autonomous vehicle to determine feasible paths through a conflicted area
JP2022147924A (ja) 2021-03-24 2022-10-06 株式会社Subaru 運転支援装置
JPWO2023277160A1 (fr) * 2021-07-02 2023-01-05
CN115257728B (zh) * 2022-10-08 2022-12-23 杭州速玛科技有限公司 一种用于自动驾驶的盲区风险区检测方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006154967A (ja) * 2004-11-25 2006-06-15 Nissan Motor Co Ltd リスク最小軌跡生成装置およびこれを用いた危険状況警報装置
JP2006260217A (ja) * 2005-03-17 2006-09-28 Advics:Kk 車両用走行支援装置
JP2008307999A (ja) * 2007-06-13 2008-12-25 Denso Corp 車両用衝突緩和装置
JP2011096009A (ja) * 2009-10-29 2011-05-12 Fuji Heavy Ind Ltd 交差点運転支援装置
JP2011194979A (ja) * 2010-03-18 2011-10-06 Toyota Motor Corp 運転支援装置

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003099898A (ja) * 2001-09-25 2003-04-04 Nissan Motor Co Ltd 運転者将来状況予測装置
JP4578795B2 (ja) * 2003-03-26 2010-11-10 富士通テン株式会社 車両制御装置、車両制御方法および車両制御プログラム
JP4327062B2 (ja) * 2004-10-25 2009-09-09 三菱電機株式会社 ナビゲーション装置
JP4297045B2 (ja) * 2004-12-14 2009-07-15 株式会社デンソー ヘッドアップディスプレイの表示制御装置およびプログラム
JP2006242643A (ja) * 2005-03-01 2006-09-14 Fujitsu Ten Ltd ナビゲーション装置
JP2008046766A (ja) * 2006-08-11 2008-02-28 Denso Corp 車両外部情報表示装置
JP2009086788A (ja) * 2007-09-28 2009-04-23 Hitachi Ltd 車両周辺監視装置
JP4814928B2 (ja) * 2008-10-27 2011-11-16 三菱電機株式会社 車両用衝突回避装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006154967A (ja) * 2004-11-25 2006-06-15 Nissan Motor Co Ltd リスク最小軌跡生成装置およびこれを用いた危険状況警報装置
JP2006260217A (ja) * 2005-03-17 2006-09-28 Advics:Kk 車両用走行支援装置
JP2008307999A (ja) * 2007-06-13 2008-12-25 Denso Corp 車両用衝突緩和装置
JP2011096009A (ja) * 2009-10-29 2011-05-12 Fuji Heavy Ind Ltd 交差点運転支援装置
JP2011194979A (ja) * 2010-03-18 2011-10-06 Toyota Motor Corp 運転支援装置

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017138658A1 (fr) * 2016-02-10 2017-08-17 株式会社デンソー Dispositif d'aide à la conduite
CN110062722A (zh) * 2016-12-14 2019-07-26 株式会社电装 车辆中的制动辅助装置以及制动辅助方法
US11396289B2 (en) 2016-12-14 2022-07-26 Denso Corporation Braking assistance device and braking assistance method for vehicle
US10766492B2 (en) 2017-03-02 2020-09-08 Nissan Motor Co., Ltd. Driving assistance method and driving assistance device
WO2018158911A1 (fr) 2017-03-02 2018-09-07 日産自動車株式会社 Dispositif et procédé d'aide à la conduite
KR20190113918A (ko) 2017-03-02 2019-10-08 닛산 지도우샤 가부시키가이샤 운전 지원 방법 및 운전 지원 장치
JPWO2018193535A1 (ja) * 2017-04-19 2020-05-14 日産自動車株式会社 走行支援方法及び走行支援装置
US10994730B2 (en) 2017-04-19 2021-05-04 Nissan Motor Co., Ltd. Traveling assistance method and traveling assistance device
KR20190135042A (ko) 2017-04-19 2019-12-05 닛산 지도우샤 가부시키가이샤 주행 지원 방법 및 주행 지원 장치
WO2018193535A1 (fr) 2017-04-19 2018-10-25 日産自動車株式会社 Procédé d'aide au déplacement et dispositif de commande de déplacement
RU2720226C1 (ru) * 2017-04-19 2020-04-28 Ниссан Мотор Ко., Лтд. Способ помощи при движении и устройство помощи при движении
CN110709911A (zh) * 2017-05-24 2020-01-17 日产自动车株式会社 行驶辅助装置的行驶辅助方法以及行驶辅助装置
WO2018216125A1 (fr) 2017-05-24 2018-11-29 日産自動車株式会社 Procédé d'aide au déplacement pour dispositif d'aide au déplacement, et dispositif d'aide au déplacement
CN110709911B (zh) * 2017-05-24 2022-01-11 日产自动车株式会社 行驶辅助装置的行驶辅助方法以及行驶辅助装置
US11069242B2 (en) 2017-05-24 2021-07-20 Nissan Motor Co., Ltd. Traveling assistance method of traveling assistance device and traveling assistance device
WO2019003603A1 (fr) * 2017-06-29 2019-01-03 株式会社デンソー Dispositif et procédé de prédiction de collision
JP2019012314A (ja) * 2017-06-29 2019-01-24 株式会社デンソー 衝突推定装置および衝突推定方法
CN112714929A (zh) * 2018-09-14 2021-04-27 松下电器产业株式会社 步行者装置、车载装置、移动体引导系统以及移动体引导方法
US20210388578A1 (en) * 2019-03-14 2021-12-16 Hitachi Construction Machinery Co., Ltd. Construction machine
JPWO2020230523A1 (fr) * 2019-05-16 2020-11-19
CN113826152A (zh) * 2019-05-16 2021-12-21 株式会社小糸制作所 交通用系统及交通用基础设施
WO2020230523A1 (fr) * 2019-05-16 2020-11-19 株式会社小糸製作所 Système de transport et infrastructure de transport
CN114126940A (zh) * 2019-09-18 2022-03-01 日立安斯泰莫株式会社 电子控制装置
US20220314968A1 (en) * 2019-09-18 2022-10-06 Hitachi Astemo, Ltd. Electronic control device

Also Published As

Publication number Publication date
JP2016122308A (ja) 2016-07-07

Similar Documents

Publication Publication Date Title
WO2016104198A1 (fr) Dispositif de commande de véhicule
US11163310B2 (en) Vehicle control device
CN108778882B (zh) 车辆控制装置、车辆控制方法及存储介质
JP6839770B2 (ja) 移動体制御システム、および、管制装置
CN109641591B (zh) 自动驾驶装置
JP6344695B2 (ja) 車両制御装置、車両制御方法、および車両制御プログラム
JP6397934B2 (ja) 走行制御装置
JP6120371B2 (ja) 自動駐車制御装置および駐車支援装置
EP3366540B1 (fr) Appareil de traitement d'informations et support d'enregistrement lisible par ordinateur non transitoire
JP6368574B2 (ja) 車両制御装置
CN110799403B (zh) 车辆控制装置
JP6951271B2 (ja) 車両制御装置
JP2019160032A (ja) 車両制御装置、車両制御方法、およびプログラム
JP6729326B2 (ja) 自動運転装置
JP7156252B2 (ja) 運転支援装置
CN109703563B (zh) 车辆、行驶控制装置和行驶控制方法
JP2019156270A (ja) 車両制御装置、車両制御方法、及びプログラム
CN112985435B (zh) 用于操作自主驾驶车辆的方法及系统
JP7220192B2 (ja) 車両制御装置、車両制御方法、およびプログラム
JP2021049873A (ja) 車両制御装置、車両制御方法、およびプログラム
JP2022123581A (ja) 管制装置
JP2021193007A (ja) 走行支援方法、及び、走行支援装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15872767

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15872767

Country of ref document: EP

Kind code of ref document: A1