WO2022130720A1 - Lane marking recognition device - Google Patents
Lane marking recognition device
- Publication number
- WO2022130720A1 (PCT/JP2021/034493)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3658—Lane guidance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/181—Segmentation; Edge detection involving edge growing; involving edge linking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/53—Road markings, e.g. lane marker or crosswalk
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to a lane marking recognition device.
- the lane marking sensor mounted on the vehicle detects the lane marking (lane boundary line) and recognizes the road boundary.
- the sensor may fail to detect the lane markings and the road boundaries may not be recognized correctly.
- Patent Document 1 discloses a technique for dealing with a failure to detect a lane marking.
- In Patent Document 1, the detection result of a camera mounted on the own vehicle is acquired, and at least one of the shape of roadside objects in the traveling direction and the movement history of other vehicles is recognized as peripheral information. A reference line consisting of a point sequence representing the road shape in the traveling direction of the own vehicle is estimated, and lane boundaries, which are the boundaries of the lane in which the own vehicle travels, are set at positions separated from the reference line by a predetermined distance on both sides in the vehicle width direction.
- As a reference line estimation method, an example is disclosed in which the recognized shape of a roadside object or the movement history of another vehicle is shifted toward the center of the own vehicle in the vehicle width direction to estimate the coordinate positions of the point sequence.
- The present invention has been made in view of the above problems, and its object is to provide a lane marking recognition device that can generate lane marking information, including lane markings of portions that cannot be detected by the sensor, in a form in which reliability is assigned to each portion of the lane marking.
- The lane marking recognition device of the present invention that solves the above problems comprises: a lane marking information acquisition unit that acquires lane marking information around the own vehicle detected by a lane marking detection sensor mounted on the own vehicle; a target information acquisition unit that acquires target information around the own vehicle detected by a target detection sensor mounted on the own vehicle; an other-vehicle state estimation unit that estimates the state of another vehicle based on the target information; a first division line generation unit that recognizes a division line from the division line information acquired by the lane marking information acquisition unit and generates the recognized division line as a first division line; a second division line generation unit that estimates a division line by extending the first division line and generates the estimated division line as a second division line; and a third division line generation unit that estimates a division line based on the travel locus of the other vehicle and generates the estimated division line as a third division line. It is characterized by further comprising an output division line construction unit that constructs an output division line, output as a division line, using at least one of the first division line, the second division line, and the third division line.
- With this configuration, the lane marking recognition device can output lane marking information, including the lane marking of a portion that cannot be detected by the sensor, in a form in which reliability is assigned to each portion of the lane marking. Therefore, even when the sensor cannot detect the lane marking, the lane boundary can be set in consideration of the reliability, and warnings to the passenger and vehicle control can be realized. That is, the functional range of driving support or automated driving can be expanded.
- FIG. 1 is a hardware block diagram showing one embodiment of the lane marking recognition device according to the present invention.
- FIG. 2A is a functional block diagram of the lane marking recognition device shown in FIG. 1; a derived pattern of this functional block diagram is also provided.
- FIG. 3 is the overall processing flow diagram of the lane marking recognition device shown in FIG. 1, and FIG. 4 is a plan view of a vehicle equipped with the device shown in FIG. 1 when traveling in a lane.
- A diagram showing an example in which the vehicle equipped with the device shown in FIG. 1 generates an extension lane marking.
- A diagram showing an example in which the vehicle equipped with the device shown in FIG. 1 generates a locus lane marking.
- A diagram showing an example in which the vehicle equipped with the device shown in FIG. 1 sets a lane marking and its reliability.
- Diagrams showing the processing flow and the main processing of the output division line construction unit of the device shown in FIG. 1.
- Image diagrams of utilization scenes of Embodiment 1.
- An image diagram of Embodiment 2, in which another vehicle is traveling in the front-rear direction of the own vehicle.
- An image diagram of Embodiment 3, in which a plurality of other vehicles are traveling.
- FIG. 1 is a hardware configuration diagram showing an embodiment of a lane marking recognition device according to the present invention.
- The lane marking information integration device 100 of the present embodiment, to which the lane marking recognition device according to the present invention is applied, is mounted on the vehicle 10 and constitutes a part of an advanced driver assistance system (ADAS) or an automated driving (AD) system.
- The lane marking information integration device 100 includes, for example, a central processing unit, a storage device such as a memory or a hard disk, a computer program stored in the storage device, and an input/output device; specifically, it is a computer system such as firmware or a microcontroller. Further, the lane marking information integration device 100 may be a part of an ADAS or AD electronic control unit (ECU) mounted on the vehicle 10.
- The lane marking information integration device 100 is connected to the lane marking detection sensor 200, the target detection sensor 300, and the positioning sensor 400 mounted on the vehicle 10 so that information communication is possible via a CAN (Controller Area Network), in-vehicle Ethernet, or the like.
- Detection results I1, I2, and I3 are input from the lane marking sensor 200, the target detection sensor 300, and the positioning sensor 400, respectively, and the output result R obtained from this sensor information is output to the lane marking information utilization device 500. Details of the functions included in the lane marking information integration device 100 will be described later.
- the lane marking information integration device 100 is configured to repeatedly operate at a predetermined cycle.
- The operation cycle of the lane marking information integration device 100 is not particularly limited; it may be set to a short cycle such as 50 ms to improve immediate responsiveness, or to a longer cycle such as 200 ms to reduce power consumption when the output is limited to alarms that are not related to control.
- The cycle may also be switched dynamically to change the balance between responsiveness and power consumption as the situation requires. Further, instead of periodic processing, processing may be started by another trigger, such as input from a sensor, to suppress unnecessary power consumption.
- the lane marking sensor 200 is a sensor mounted on the vehicle 10 and detecting the lane marking around the vehicle 10.
- The lane marking sensor 200 is, for example, a sensor capable of detecting lane markings, such as a stereo camera, an all-around bird's-eye view camera system, a LIDAR (Light Detection and Ranging) sensor, or a monocular camera.
- The lane marking is a road marking that divides lanes on a road, and includes lane boundary lines displayed as solid or broken white or yellow lines.
- Road marking paint, road studs, poles, stones, and the like are generally used to form lane markings.
- the recognition of the lane marking by the lane marking sensor 200 will be described by taking a stereo camera as an example.
- The stereo camera, which is the lane marking sensor 200, detects the lane marking from image information. Further, the stereo camera generates a parallax image from the images of its two cameras, and measures the relative position, relative speed, line type of the lane marking, and the like for each pixel of the lane marking image. The output does not necessarily include all of this information; here, information on the relative position from the vehicle 10 to the lane marking is output to the lane marking information integration device 100 as the detection result I1.
- the target detection sensor 300 is a sensor mounted on the vehicle 10 and detecting a target around the vehicle 10.
- the target detection sensor 300 is a sensor capable of detecting a target, such as a radar, LIDAR, or sonar sensor.
- The target refers to another vehicle traveling around the own vehicle, a guardrail installed along the road, a curb, and the like.
- the target detection sensor 300 outputs the relative position and relative speed of the other vehicle 11 traveling around the own vehicle as the detection result I2 to the lane marking information integration device 100.
- The lane marking sensor 200 and the target detection sensor 300 may be replaced with a single LIDAR sensor capable of detecting both the lane marking and the target; it is not always necessary to use a plurality of sensors.
- The positioning sensor 400 includes, for example, a speed sensor, an acceleration sensor, an angular velocity sensor, a steering angle sensor, a gyro sensor, and a satellite positioning system such as a GNSS (Global Navigation Satellite System) receiver mounted on the vehicle 10. Further, instead of the target detection sensor 300 mounted on the own vehicle, a vehicle-to-vehicle communication function that transmits and receives positions and speeds to and from other vehicles 11 may be installed to acquire the surrounding situation more widely.
- The positioning sensor 400 outputs the detection result (positioning information) I3, including, for example, the speed, acceleration, angular velocity, steering angle, and posture in the global coordinate system outside the own vehicle of the vehicle 10, to the lane marking information integration device 100.
- the detection result I3 output by the positioning sensor 400 does not necessarily include all the above-mentioned information, but includes, for example, at least the speed, acceleration, and angular velocity of the vehicle 10.
- The position and orientation of the vehicle 10 may be complemented by odometry using a speed sensor, an angular velocity sensor, a gyro sensor, or the like. Further, the position and orientation of the vehicle 10 may be obtained accurately in a short cycle, and the difference in position and orientation between the previous cycle and the current cycle may be calculated.
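The odometric complement mentioned above can be illustrated with a minimal sketch (not part of the patent disclosure; the function name and the midpoint-heading integration are assumptions made for illustration):

```python
import math

def propagate_pose(x, y, yaw, speed, yaw_rate, dt):
    """Dead-reckoning update: advance the pose (x, y, yaw) by one cycle
    using the speed [m/s] and yaw rate [rad/s] from the positioning sensor."""
    yaw_new = yaw + yaw_rate * dt
    # Use the midpoint heading as a simple arc approximation over the cycle.
    yaw_mid = yaw + 0.5 * yaw_rate * dt
    x_new = x + speed * dt * math.cos(yaw_mid)
    y_new = y + speed * dt * math.sin(yaw_mid)
    return x_new, y_new, yaw_new

# Example: driving straight at 10 m/s for one 50 ms cycle.
pose = propagate_pose(0.0, 0.0, 0.0, 10.0, 0.0, 0.05)
```

Accumulating such per-cycle pose differences is one way to keep the own-vehicle position/posture available between accurate positioning updates.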
- The lane marking information utilization device 500 includes an alarm device 510 or a vehicle control device 520, and carries out lane departure warning, lane keeping control, and lane change support based on the output result R of the lane marking information integration device 100.
- FIG. 2A is a functional block diagram of the lane marking information integration device 100 shown in FIG.
- The outputs I1, I2, and I3 of the lane marking sensor 200, the target detection sensor 300, and the positioning sensor 400 are passed to the lane marking information acquisition unit 101, the target information acquisition unit 103, and the positioning information acquisition unit 105 in the lane marking information integration device 100, respectively.
- the lane marking information acquisition unit 101 acquires lane marking information around the own vehicle detected by the lane marking sensor 200.
- The lane marking information acquisition unit 101 performs time synchronization of the lane marking information detected by the lane marking sensor 200, converts its output format into a format that is easy for the lane marking information integration device 100 to handle, and outputs the result to the lane marking recognition unit 102 in the subsequent stage.
- The detection result I1 of the lane marking sensor 200 may be parameters of an approximate curve based on the shape of the lane marking, such as the coefficients of a quadratic curve. In this case, the information volume of the detection result I1 can be reduced compared with the case where the detection result I1 is a point sequence of recognition points.
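The reduction from a recognition point sequence to approximate-curve parameters can be sketched as follows (an illustrative example, not the patent's implementation; the sample points are invented and deliberately lie on a quadratic):

```python
import numpy as np

# Hypothetical recognition point sequence for one lane marking
# (x: longitudinal distance ahead of the vehicle, y: lateral offset, metres).
xs = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
ys = 0.01 * xs**2 + 0.1 * xs + 1.5

# Least-squares fit: three coefficients replace the whole point sequence,
# so I1 can carry (c2, c1, c0) instead of every recognition point.
c2, c1, c0 = np.polyfit(xs, ys, deg=2)
```

For noisy real detections the fit would smooth the points rather than reproduce them exactly; the payload stays three numbers regardless of how many points were detected.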
- the target information acquisition unit 103 acquires the target information around the own vehicle detected by the target detection sensor 300.
- The target information acquisition unit 103 performs time synchronization of the target information detected by the target detection sensor 300, converts its output format into a format that is easy for the lane marking information integration device 100 to handle, and outputs the result to the subsequent stage.
- The positioning information acquisition unit 105 performs time synchronization of the input information, converts it into the format to be handled, and outputs the result to the self-position/attitude estimation unit 106 and the other-vehicle state estimation unit 104 in the subsequent stages.
- Next, the lane marking information in which the lane markings are integrated is generated.
- The first lane marking is a recognition lane marking actually recognized by the vehicle 10; the second lane marking is an extension lane marking obtained by extending the recognition lane marking ahead of the vehicle; and the third lane marking is a locus lane marking generated from the travel loci of vehicles around the own vehicle.
- "Recognition lane marking", "extension lane marking", and "locus lane marking" are identification labels assigned to a given lane marking; the output division line construction unit 111 divides each lane marking into small sections, and different identification information is assigned to each section. In particular, for portions where the information is duplicated, the more reliable lane marking is adopted.
- the division line recognition unit 102 sequentially processes the division line detection results output from the division line information acquisition unit 101, and recognizes the division line.
- A plurality of lane marking sensors may be mounted on the vehicle 10; in that case, a plurality of lane marking detection results may be passed to the lane marking recognition unit 102 for one lane marking.
- The division line information acquisition unit 101 in the previous stage and the division line recognition unit 102 may be treated as one unit to simplify the functional blocks.
- the other vehicle state estimation unit 104 sequentially processes the target detection results around the own vehicle output from the target information acquisition unit 103.
- A target refers to an object around the vehicle, such as a car or motorcycle, a pedestrian, or a roadside object such as a guardrail or a curb; here, other vehicles existing around the own vehicle are handled.
- the other vehicle state estimation unit 104 estimates at least the position and speed of the other vehicle as the state of the other vehicle, and outputs the position and speed to the subsequent processing unit. Further, when the target detection sensor 300 cannot detect another vehicle, the position and speed information of the other vehicle output by the positioning sensor 400 may be used.
- The self-position/attitude estimation unit 106 receives the speed, acceleration, and angular velocity of the own vehicle output from the positioning information acquisition unit 105, and calculates the attitude of the own vehicle, such as its position and orientation in the global coordinate system outside the own vehicle (hereinafter, the position/posture), using a Kalman filter or the like. The result is output to the history storage unit 112 and the locus accumulation unit 107 in the subsequent stages.
- The locus accumulation unit 107 receives the relative position of the other vehicle with respect to the own vehicle, which is the output of the other-vehicle state estimation unit 104, and the position/posture of the own vehicle, which is the output of the self-position/attitude estimation unit 106, and generates the travel locus of the other vehicle in the global coordinate system outside the own vehicle.
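The projection from the own-vehicle coordinate system into the global coordinate system, needed to accumulate another vehicle's travel locus across cycles, can be sketched as follows (illustrative only; the function name, frame convention, and poses are assumptions):

```python
import math

def to_global(ego_x, ego_y, ego_yaw, rel_x, rel_y):
    """Project a target position given in the own-vehicle frame
    (rel_x forward, rel_y left) into the global frame using the
    own vehicle's estimated position/posture."""
    gx = ego_x + rel_x * math.cos(ego_yaw) - rel_y * math.sin(ego_yaw)
    gy = ego_y + rel_x * math.sin(ego_yaw) + rel_y * math.cos(ego_yaw)
    return gx, gy

# Accumulate a travel locus: one global point per processing cycle.
trajectory = []
for ego_pose, rel in [((0.0, 0.0, 0.0), (20.0, 0.0)),
                      ((1.0, 0.0, 0.0), (20.0, 0.0))]:
    trajectory.append(to_global(*ego_pose, *rel))
```

Because the other vehicle's position is reported relative to the own vehicle, the accumulated locus is only as consistent as the self-position/posture estimate used for the projection.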
- The extension lane marking generation unit 108 (second lane marking generation unit), the in-lane position estimation unit 109, the locus lane marking generation unit 110 (third lane marking generation unit), and the output division line construction unit 111, which are the main parts of the present invention, will be described with reference to FIGS. 3 to 8B.
- Fig. 3 shows the overall processing flow.
- In the process P1, sensor information from the lane marking sensor 200, the target detection sensor 300, and the like is acquired. In the process P2, time synchronization and format conversion of the detected sensor information are performed by the recognition processing in the lane marking information acquisition unit 101, the target information acquisition unit 103, and the like, and the information is converted into a form that is easy for the lane marking information integration device 100 to handle.
- A fusion process that integrates identical information detected by multiple sensors into one may also be performed.
- In the process P3, the self-position/posture information (position/posture of the own vehicle) required for accumulating the recognition lane markings of the vehicle 10 and the travel locus of the other vehicle 11 is estimated, based on the sensor information recognized in the process P2.
- In the process P4, the lane marking is recognized from the lane marking information obtained in the process P2, and the recognized lane marking is generated as the recognition lane marking. The recognition lane marking is generated by converting the lane marking recognized in the process P2 into a plausible shape using the least squares method or the like.
- The processing in this process P4 corresponds to the recognition lane marking generation unit (first lane marking generation unit) that generates the recognition lane marking. Further, in the process P5, the recognition lane marking recognized in the previous stage is extended into the non-detection range of the sensor to generate the extension lane marking. Here, the recognition lane marking is extended to estimate a lane marking, and the estimated lane marking is generated as the extension lane marking.
- The extension lane marking is formed by stretching the recognition lane marking based on the shape of the recognition lane marking.
- The processing in this process P5 corresponds to the extension lane marking generation unit (second lane marking generation unit) that generates the extension lane marking.
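The extension lane marking idea, fitting an approximate curve to the recognized section and evaluating it beyond the sensor's detection range, can be sketched as follows (illustrative data and a simple quadratic fit; the patent's own geometric calculation may differ):

```python
import numpy as np

# Recognition lane marking: detected only up to 20 m ahead (invented values).
xs = np.linspace(0.0, 20.0, 11)           # longitudinal stations, metres
ys = 0.002 * xs**2 + 1.75                 # gently curving marking, 1.75 m left

# Fit an approximate curve to the recognised section ...
coeffs = np.polyfit(xs, ys, deg=2)

# ... then evaluate it past the detection range: the extension lane marking.
xs_ext = np.linspace(20.0, 40.0, 11)
ys_ext = np.polyval(coeffs, xs_ext)
```

As the text notes for roads changing from straight to curved, such an extrapolation matches the real marking near the vehicle but cannot be trusted to fit the actual road far ahead.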
- In the process P6, it is confirmed whether or not a vehicle exists in the own lane, an adjacent lane, or the like; if one exists, the processes P7 to P11 are repeated for each vehicle. If none exists, the output lane marking construction process of the process P12 is performed on the recognition lane marking recognized in the process P4 and the extension lane marking generated in the process P5, and the processing ends.
- In the process P8, the travel locus of the other vehicle is generated from the previously acquired position of the other vehicle in the global coordinate system outside the own vehicle.
- Next, the lane marking is estimated based on the positional relationship between the travel locus of the other vehicle generated in the process P8 and the recognition lane marking, and the estimated lane marking is generated as the locus lane marking.
- In the process P9, the distance between the travel locus of the other vehicle and the recognition lane marking, required by the locus lane marking generation unit 110, is calculated.
- the distance between the travel locus of another vehicle and the recognition lane marking line is the distance in the road width direction between the travel locus of another vehicle and the recognition lane marking line.
- In the process P10, a locus lane marking is generated from the distance information obtained in the process P9.
- The processing in the processes P9 and P10 corresponds to the locus lane marking generation unit (third lane marking generation unit) that generates the locus lane marking. When locus lane markings have been generated for all vehicles around the own vehicle at the process P11, the flow proceeds to the final process P12.
- In the process P12, an output division line, output as a lane marking, is constructed using at least one of the recognition lane marking, the extension lane marking, and the locus lane marking.
- The recognition lane markings, extension lane markings, and locus lane markings generated so far are combined and evaluated in order of reliability; for each small section of a lane marking, one of the above three types of lane markings is adopted and used as the output of the lane marking information integration device 100.
- That is, the output is not limited to one type of lane marking, such as only the recognition lane marking or only the locus lane marking; a different lane marking may be adopted for each small section, even one of low reliability, and the result is output from the lane marking information integration device 100.
- The purpose of this is to output as much lane marking information as possible for one lane marking that should be recognized, regardless of reliability; how this information is used is left to the control device described later (as much information as possible is passed to the subsequent control device in order to expand the options for vehicle control).
- Alternatively, of the recognition lane marking, the extension lane marking, and the locus lane marking, only the one type having the highest reliability may be adopted and output as the output division line.
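The per-section selection in the process P12 can be sketched as follows (a hypothetical priority order and data layout; the patent does not prescribe this structure, only that the more reliable marking is adopted per small section):

```python
# Per small section, each candidate marking may or may not exist; choose the
# most reliable available source, here assumed to rank recognition first,
# then extension, then locus, but keep every section that has any candidate.
PRIORITY = ("recognition", "extension", "locus")

def build_output_line(sections):
    """sections: list of dicts mapping source name -> lateral position [m]."""
    output = []
    for candidates in sections:
        for source in PRIORITY:
            if source in candidates:
                output.append((source, candidates[source]))
                break
        else:
            output.append(None)  # no information at all for this section
    return output

line = build_output_line([
    {"recognition": 1.8, "locus": 1.7},   # sensor sees the marking here
    {"extension": 1.8, "locus": 1.75},    # beyond the detection range
    {"locus": 1.7},                       # only another vehicle's trajectory
])
```

Keeping the adopted source label alongside each section is one way to hand the per-section reliability downstream to the warning and control devices.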
- FIG. 4 shows a basic image for explaining the present invention.
- A lane marking 20 is drawn on a road changing from a straight section to a curved section, and another vehicle 11 is traveling in front of the vehicle 10, which is equipped with the lane marking information integration device 100, the lane marking sensor 200, the target detection sensor 300, the positioning sensor 400, and the lane marking information utilization device 500.
- Four lane markings 21, 22, 23, and 24 are drawn as the lane markings 20, partitioning three lanes 25, 26, and 27.
- An example is shown in which the vehicle 10 and the other vehicle 11 are traveling in the central lane 26 among the three lanes 25, 26, and 27.
- The lane marking sensor 200 detects the lane marking, the lane marking information acquisition unit 101 and the lane marking recognition unit 102 recognize the vehicle-relative position of the lane marking, and the recognition result is passed to the extension lane marking generation unit 108.
- The extension lane marking generation unit 108 has a function of generating an extension lane marking by extending the lane marking ahead of the own vehicle using the recognition lane marking output by the lane marking recognition unit 102. Specifically, the received recognition lane marking is first projected from the own-vehicle coordinate system into the global coordinate system outside the own vehicle, and then converted into an approximate curve such as a straight line or a circle using the least squares method or the like. This complements the lane marking in cases where the marking itself is faint, where the marking lies outside the detection range of the lane marking sensor 200, or where the marking cannot be detected due to backlight such as sunlight. The result is output to the output division line construction unit 111 in the subsequent stage.
- 30A is a detected lane marking, and 30B is a recognition lane marking redrawn based on the detection result.
- The extension lane marking 40, shown by a broken line in the figure, is a lane marking simply extended ahead of the own vehicle based on the shape of the recognition lane marking 30B.
- For the extension, a known method such as geometric calculation is used. Therefore, in a section where the road shape changes from straight to curved, the extended marking near the vehicle matches the actual lane marking on the road, but the extension cannot always express a shape that fits the entire actual road, including the markings in the distance.
- is a conceptual diagram in which the target detection sensor 300 and the positioning sensor 400 are used to calculate the relative position of the other vehicle and the position and posture of the own vehicle in the global coordinate system, the traveling locus 50 of the other vehicle 11 traveling ahead of the own vehicle is recognized to generate the in-lane position of the other vehicle 11 and the locus lane marking 60, and the recognition result is passed to the locus accumulation unit 107.
- the in-lane position estimation unit 109 estimates the position of the other vehicle 11 within the lane, using the lane marking information recognized by the lane marking recognition unit 102 and the travel locus information of the other vehicle 11 accumulated by the locus accumulation unit 107.
- the locus lane marking generation unit 110 generates the locus lane marking 60 based on the positional relationship between the traveling locus 50 of the other vehicle 11 and the recognition lane marking 30B of the own vehicle. Specifically, the in-lane position estimation unit 109 obtains the distance between the traveling locus 50 of the other vehicle 11 and the recognition lane marking 30B of the own vehicle for each fixed section (d1 to dn) and takes the average.
- the locus lane marking 60 is obtained by using this average as an offset value from the traveling locus 50 and shifting the traveling locus by it.
- the locus lane marking 60 can be used as a substitute for the recognition lane marking 30B that the own vehicle could not detect.
- the in-lane position estimation is not limited to using the average; the distance in each section may instead be obtained by learning. Further, for a section in which the lane width changes rather than staying constant, the offset value may be increased or decreased linearly by observing the change in distance.
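The offset-averaging step described above can be sketched as a few lines of code. This is an illustrative 2-D simplification, not the patent's implementation: the per-section distances d1..dn are assumed to be given, the y axis stands in for the lane-width direction, and the function name is hypothetical.

```python
def trajectory_lane_line(trajectory, section_distances):
    """Average the per-section distances d1..dn between the other
    vehicle's traveling locus and the recognition lane marking, then
    shift the locus laterally by that offset to form the locus lane
    marking that substitutes for an undetected lane marking."""
    offset = sum(section_distances) / len(section_distances)
    # shift each trajectory point in the (simplified) lateral direction
    return [(x, y + offset) for (x, y) in trajectory]
```

A variable offset (e.g. linearly interpolated where the lane width changes, as the text allows) would replace the single averaged value.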
- the section covered by the locus lane marking 60 may, for example, extend from the point where the recognition lane marking ends to the rear end of the other vehicle 11. In the example shown in FIG. 6, the locus lane marking 60 on the inside of the curve is obtained from the distance between the recognition lane marking 30B on the inside of the curve and the traveling locus 50 of the other vehicle 11, but a locus lane marking on the outside of the curve can also be obtained from the distance between the recognition lane marking 30B on the outside of the curve (see FIG. 5) and the traveling locus 50 of the other vehicle 11.
- FIG. 7 is a conceptual diagram of the processing of the output lane marking construction unit 111.
- a different reliability is set for each small section of a single lane marking.
- it represents a state in which one lane marking is composed of multiple pieces of lane marking information, one for each of the sections 70A, 70B, 70C, and 70D.
- a method of selecting which of the multiple pieces of lane marking information to output will be described with reference to the processing flow of the output lane marking construction unit 111 in FIG. 8A and the main processing in FIG. 8B.
- the lane marking sensor 200 of the vehicle 10 cannot recognize the lane markings 22 and 23 ahead of the recognition lane marking 30B due to, for example, backlight or faintness.
- the lane marking to which the lane marking information is attached is divided into a plurality of sections (P22). The lane marking information in each section is then examined, and which information to output is determined for each combination of recognition lane marking, extension lane marking, and locus lane marking (P23 to P25). Here, the reliability of the lane marking information is assumed to decrease in the order of recognition lane marking, locus lane marking, and extension lane marking.
- the reliability of each section and the lane marking to be adopted are determined based on the combination table of FIG. 8B. When a recognition lane marking exists, it ranks highest in reliability, so only the recognition lane marking is adopted and high reliability information is given. When both the extension lane marking and the locus lane marking exist and the distance between them is equal to or less than a threshold value, the locus lane marking is adopted and medium reliability information is given. If the distance between the two is larger than the threshold value, the other vehicle 11 may have crossed the lane marking due to a lane change or the like, or the shape of the lane marking on the road may have changed suddenly.
- since the lines do not match in that case, low reliability information is attached to both pieces of lane marking information, and the more reliable of the two is then adopted. Further, when only the locus lane marking exists, the locus lane marking is adopted with low reliability information, and when only the extension lane marking exists, the extension lane marking is likewise adopted with low reliability information. Three levels of reliability information (high, medium, and low) are used here, but the reliability may be graded more finely to improve the reliability of the entire lane marking; the levels are not limited to these.
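The selection rules just described can be restated as a small decision function. This is a sketch of the combination table of FIG. 8B under stated assumptions: the threshold value and all names are illustrative, and the "lines disagree" branch resolves to the locus (trajectory) line because the text ranks it above the extension line.

```python
def select_marking(has_recognition, has_extension, has_trajectory,
                   ext_traj_gap=None, gap_threshold=0.5):
    """Return (adopted line, reliability) for one section, following
    the reliability order: recognition > trajectory > extension.
    gap_threshold is a placeholder value, not from the patent."""
    if has_recognition:
        return ("recognition", "high")          # recognition alone, high
    if has_extension and has_trajectory:
        if ext_traj_gap is not None and ext_traj_gap <= gap_threshold:
            return ("trajectory", "medium")     # lines agree, medium
        # lines disagree (lane change or sudden shape change): both are
        # low reliability; the more reliable one (trajectory) is adopted
        return ("trajectory", "low")
    if has_trajectory:
        return ("trajectory", "low")            # trajectory alone, low
    if has_extension:
        return ("extension", "low")             # extension alone, low
    return (None, None)                         # no information
```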
- each lane marking 30B, 40, 60 is divided into small sections 70A to 70D at predetermined intervals.
- first, at a given moment, the lane markings (recognition, extension, and locus lane markings) that represent the same line are grouped together.
- the separation distances of the lane markings in the lane width direction are compared in a round-robin manner, and lane markings that are closest to each other with a separation distance equal to or less than a threshold value (for example, half the lane width) are placed in one group.
- since the extension lane marking is extended from the recognition lane marking, the two can be regarded as belonging to the same group.
- the separation distance may be calculated as the average of the distance errors between the compared lane markings, obtained for each fixed range.
- in the example of FIG. 7, viewed from the center of the vehicle 10, the recognition lane marking 30B on the right side of the own vehicle, the extension lane marking 40 on the right side, and the locus lane marking 60 form one group, and the lane marking 30B on the left side and the lane marking 40 on the left side of the own vehicle form another group.
- next, the recognition lane marking and the extension lane marking are searched within a certain range from the start point of the lane markings in the extension direction, and it is determined whether a locus lane marking fits within the direction perpendicular to the extension direction (the lane-marking width direction). When such a locus lane marking exists, the separation distance between it and the extension lane marking is calculated; section 70B is set for the portion where this distance is equal to or less than a threshold value (for example, about twice the lane marking width), and section 70C is set for the portion where the distance exceeds the threshold value.
- section 70A is set for the portion where only the recognition lane marking 30B exists, and section 70D for the portion where only the extension lane marking 40 exists. Since the appropriate threshold values for these section divisions vary with the sensor used and the lane width, they may be changed according to the sensor characteristics and the actual driving environment.
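The assignment of sections 70A to 70D can be sketched as a labeling pass over longitudinal samples. This is an illustrative reading of the rules above: each sample records which lines are present (with `None` for an absent line), and the threshold and all names are placeholders.

```python
def label_sections(samples, gap_threshold):
    """Label each longitudinal sample with a section type:
    70A: only the recognition line exists;
    70B: extension and locus lines both exist and agree (gap <= threshold);
    70C: both exist but disagree (gap > threshold);
    70D: only the extension line remains.
    samples: [(has_recognition, ext_offset_or_None, traj_offset_or_None)]."""
    labels = []
    for has_recog, ext, traj in samples:
        if has_recog and ext is None and traj is None:
            labels.append("70A")
        elif ext is not None and traj is not None:
            labels.append("70B" if abs(ext - traj) <= gap_threshold else "70C")
        elif ext is not None:
            labels.append("70D")
        else:
            labels.append(None)  # no usable line at this sample
    return labels
```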
- as a method of dividing the lane markings into small sections, the lane markings may be divided at regular intervals in the traveling direction of the own vehicle from the origin of the own-vehicle coordinate system, or each line may be divided at regular intervals strictly along the lane marking, and the lane markings included in each divided area may then be selected.
- alternatively, the area may be divided in a grid pattern based on the origin of the own-vehicle coordinate system, and the lane marking information that falls in each cell may be combined.
- here, the distance threshold is assumed to be based on the lane marking width, with twice the lane marking width set as the threshold value.
- as a result, in FIG. 7, the recognition lane marking 30B with high reliability information is set in section 70A, the locus lane marking 60 with medium reliability information in section 70B, the locus lane marking 60 with low reliability information in section 70C, and the extension lane marking 40 with low reliability information in section 70D.
- here, the locus lane marking 60 is adopted as the lane marking of section 70B, but in this section an integrated lane marking, such as one obtained by averaging the coordinates of the locus lane marking 60 and the extension lane marking 40, may be adopted instead.
- FIGS. 9A and 9B show examples of scenes in which the effect of the present invention is remarkable.
- FIG. 9A shows a case where the vehicle 10 is traveling in the driving lane 26 toward a certain destination and must change lanes to the right lane 27, but the lane markings 22 and 23 ahead cannot be detected for some reason. Since the other vehicle 11 is traveling ahead, the locus lane marking 60 can be generated, and together with the extension lane marking 40, the lane marking 23' on the right side of the vehicle 10 can be generated. This makes it possible to change lanes to the right lane 27.
- in FIG. 9B, when the vehicle 10 is traveling in a curved section and must keep its lane, the left locus lane marking 60 can be generated because the other vehicle 11 travels ahead even in a situation where the lane markings 22 and 23 in front cannot be detected. The vehicle 10 can therefore sound a lane departure warning to the driver and continue lane keeping control, reducing the risk of lane departure.
- FIG. 10 will be described as another utilization scene.
- FIG. 10 is a scene in which another vehicle 12 is traveling from the front to the rear in the oncoming lane next to the traveling lane in which the vehicle 10 is traveling.
- three lane markings 121, 122, and 123 are drawn as the lane markings 20, and an example is shown in which the own vehicle 10 travels in the traveling lane 124 between the lane marking 122, which is the center line, and the lane marking 121 on its left, while the other vehicle 12, an oncoming vehicle, travels in the oncoming lane 125 between the lane marking 122 and the lane marking 123 on its right.
- the locus lane marking 60 having medium reliability information is generated for the section where the locus lane marking 60 obtained from the traveling locus 50 of the other vehicle 12 and the extension lane marking 40 obtained from the recognition lane marking of the vehicle 10 overlap. Therefore, even in a situation where the vehicle 10 cannot detect the forward lane markings 121 and 122, it is possible to execute lane keeping control and prevent the vehicle 10 from departing from the traveling lane 124.
- a history storage unit 112 is newly provided between the lane marking recognition unit 102 and the extension lane marking generation unit 108.
- the history storage unit 112 accumulates, as a history, the past recognition lane markings detected by the lane marking recognition unit 102 of the vehicle 10 and the own-vehicle position and posture information from the self-position/posture estimation unit 106, and uses the accumulated information to generate a fourth lane marking 41 obtained by extending the recognition lane marking 30B to the rear of the own vehicle (hereinafter, the fourth lane marking generated by extending the recognition lane marking rearward is called a history lane marking).
- the locus lane marking generation unit 110 uses the recognition lane marking 30B and the traveling locus 50 of the other vehicle 12 to generate a locus lane marking 60 extending toward the rear of the vehicle 10.
- the output lane marking construction unit 111 uses the history lane marking 41 and the locus lane marking 60 to construct, behind the vehicle 10, an output lane marking serving as the boundary between the own lane in which the own vehicle is traveling and the adjacent lane.
- using the information on the output lane marking behind the own vehicle constructed by the output lane marking construction unit 111, it is determined whether the traveling lane of another vehicle traveling behind the vehicle 10 is the own lane or the adjacent lane.
- in the overall processing flow, this processing is located between the recognition lane marking generation of processing P4 and the extension lane marking generation of P5.
- as an explanation using FIG. 12, consider a scene in which the vehicle 13A is traveling in front of the vehicle 10 and the vehicle 13B is approaching from behind the vehicle 10.
- three lane markings 131, 132, and 133 are drawn as the lane markings 130 dividing the two lanes, and an example is shown in which the own vehicle 10 travels in the lane 134 between the lane marking 131 and the lane marking 132, and the other vehicles 13A and 13B travel in the lane 135 between the lane marking 132 and the lane marking 133.
- the vehicle 10 detects part of the forward lane marking and sets it as the recognition lane marking 30B, and the history storage unit 112 can generate the history lane marking 41 extending to the rear of the vehicle 10. Based on the recognition lane marking 30B and the traveling locus 50 of the other vehicle 13A, the locus lane marking 60 extending toward the rear of the own vehicle can also be obtained. Then, when the distance between the history lane marking 41 and the locus lane marking 60 is equal to or less than the threshold value, an output lane marking having medium reliability information can be generated.
- based on the output lane marking having medium reliability information, the lane marking information utilization device 500 can determine whether the lane in which the trailing vehicle 13B is traveling is the own lane 134 or the adjacent lane 135. That is, it can be used to judge the danger when the vehicle 10 executes a lane change.
- as the output lane marking having medium reliability information, the history lane marking generated from the history of actually detected lane markings is more reliable than the locus lane marking, so the history lane marking may be adopted, or an integrated lane marking combining both lane markings may be adopted.
- the third embodiment shown in FIG. 13 has the same block configuration as the first embodiment, but since it targets a plurality of other vehicles, combination items are added to the lane marking selection processing of the output lane marking construction unit 111 in the lane marking information integration device 100.
- FIG. 13 is a scene in which a plurality of other vehicles 14A and 14B are traveling in front of the vehicle 10.
- the vehicle 10 is equipped with a lane marking sensor 200, a target detection sensor 300, and the like, and can detect surrounding lane markings and targets.
- the lane marking sensor 200 can detect not only the lane markings of the own lane 25 but also the lane markings 23 and 24 of the adjacent lanes 26 and 27.
- the locus lane markings 60 and 61 are generated based on the traveling locus 50A of the other vehicle 14A and the recognition lane marking 30B, and the locus lane marking 62 is generated based on the traveling locus 50B of the other vehicle 14B and the recognition lane marking 30B.
- the lane marking information integration device 100 can estimate the lane marking of the section 80 in which the locus lane markings 61 and 62 overlap from the traveling loci 50A and 50B of the other vehicles 14A and 14B.
- the locus lane marking generation unit 110 compares the plurality of locus lane markings with each other, and when adjacent locus lane markings are within a threshold distance of each other, adopts one of them, or an integrated lane marking combining them, as the locus lane marking.
- if the locus lane markings 61 and 62 in the section 80 are within the threshold distance, a plurality of lane markings overlap there in the order of reliability, so medium reliability information is given and one locus lane marking is adopted. Alternatively, as in FIG. 7, an integrated lane marking in which the plurality of lane markings are combined may be adopted. As a result, it becomes possible to change lanes to the adjacent lane, and by capturing a plurality of lanes around the own vehicle, the possibility of matching with the lanes described on a map can be improved.
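The merge step for overlapping locus lane markings can be sketched as follows. This is an illustrative simplification with placeholder names and threshold: each locus lane marking is a list of lateral offsets sampled at common longitudinal positions, and integration is done here by coordinate averaging, one of the options the text mentions.

```python
def integrate_trajectory_lines(lines, threshold=0.5):
    """Where locus lane markings from different vehicles run within
    `threshold` of each other (mean absolute gap), merge them by
    averaging their lateral offsets; otherwise keep them separate."""
    merged = []
    for offsets in lines:
        for i, existing in enumerate(merged):
            gap = sum(abs(a - b) for a, b in zip(offsets, existing)) / len(existing)
            if gap <= threshold:
                # integrate the two overlapping lines by averaging
                merged[i] = [(a + b) / 2.0 for a, b in zip(existing, offsets)]
                break
        else:
            merged.append(list(offsets))  # distinct line: keep as-is
    return merged
```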
- the present invention is not limited to the above-described embodiments, and various design changes can be made without departing from the spirit of the present invention described in the claims.
- the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to embodiments including all the described configurations.
- it is possible to replace a part of the configuration of one embodiment with the configuration of another embodiment, and it is also possible to add the configuration of another embodiment to the configuration of one embodiment.
- 30B recognition lane marking (first lane marking), 40 extension lane marking (second lane marking), 41 history lane marking (fourth lane marking), 60 locus lane marking (third lane marking), 100 lane marking information integration device (lane marking recognition device), 102 lane marking recognition unit (first lane marking generation unit), 104 other-vehicle state estimation unit, 106 self-position/posture estimation unit, 108 extension lane marking generation unit (second lane marking generation unit), 110 locus lane marking generation unit (third lane marking generation unit), 111 output lane marking construction unit, 112 history storage unit
Abstract
Description
FIG. 1 is a hardware configuration diagram showing an embodiment of the lane marking recognition device according to the present invention. The lane marking information integration device 100 of the present embodiment, to which the lane marking recognition device according to the present invention is applied, is mounted on the vehicle 10 and constitutes part of an advanced driver assistance system (ADAS) or an automated driving (AD) system.
Details of the functions provided in the lane marking information integration device 100 will be described later.
FIG. 4 shows a basic image for explaining the present invention. Here, as an example, a lane marking 20 is drawn on a road that changes from a straight section to a curved section, and another vehicle 11 is traveling in front of the vehicle 10 equipped with the lane marking information integration device 100, the lane marking sensor 200, the target detection sensor 300, the positioning sensor 400, and the lane marking information utilization device 500. In FIG. 4, four lane markings 21, 22, 23, and 24 are drawn as the lane markings 20, dividing three lanes 25, 26, and 27. An example is shown in which the vehicle 10 and the other vehicle 11 are traveling in the central lane 26 of the three lanes 25, 26, and 27.
Here, each lane marking 30B, 40, 60 is divided into small sections 70A to 70D at predetermined intervals. First, at a given moment, the lane markings (recognition, extension, and locus lane markings) that represent the same line are grouped together. For example, the separation distances of the lane markings in the lane width direction are compared in a round-robin manner, and lane markings that are closest to each other with a separation distance equal to or less than a threshold value (for example, half the lane width) are placed in one group. Since the extension lane marking is extended from the recognition lane marking, the two can be regarded as belonging to the same group. The separation distance may be calculated as the average of the distance errors between the compared lane markings, obtained for each fixed range. In the example shown in FIG. 7, viewed from the center of the vehicle 10, the recognition lane marking 30B on the right side of the own vehicle, the extension lane marking 40 on the right side, and the locus lane marking 60 form one group, and the lane marking 30B on the left side and the lane marking 40 on the left side form another group.
In the lane marking information integration device 100 of the second embodiment shown in FIG. 11, a history storage unit 112 is newly provided between the lane marking recognition unit 102 and the extension lane marking generation unit 108. The history storage unit 112 accumulates, as a history, the past recognition lane markings detected by the lane marking recognition unit 102 of the vehicle 10 and the own-vehicle position and posture information from the self-position/posture estimation unit 106, and uses the accumulated information to generate a fourth lane marking 41 obtained by extending the recognition lane marking 30B to the rear of the own vehicle (hereinafter, the fourth lane marking generated by extending the recognition lane marking rearward is called a history lane marking).
The third embodiment shown in FIG. 13 has the same block configuration as the first embodiment, but since it targets a plurality of other vehicles, combination items are added to the lane marking selection processing of the output lane marking construction unit 111 in the lane marking information integration device 100.
Claims (7)
- a lane marking information acquisition unit that acquires lane marking information around an own vehicle detected by a lane marking sensor mounted on the own vehicle;
a target information acquisition unit that acquires target information around the own vehicle detected by a target detection sensor mounted on the own vehicle;
an other-vehicle state estimation unit that estimates a state of another vehicle based on the target information;
a first lane marking generation unit that recognizes a lane marking from the lane marking information acquired by the lane marking information acquisition unit and generates the recognized lane marking as a first lane marking;
a second lane marking generation unit that estimates a lane marking by extending the first lane marking and generates the estimated lane marking as a second lane marking;
a third lane marking generation unit that estimates a lane marking based on a positional relationship between a traveling locus of the other vehicle estimated by the other-vehicle state estimation unit and the first lane marking, and generates the estimated lane marking as a third lane marking; and
an output lane marking construction unit that constructs an output lane marking to be output as a lane marking by using at least one of the first lane marking, the second lane marking, and the third lane marking;
a lane marking recognition device comprising the above. - The lane marking recognition device according to claim 1, wherein the output lane marking construction unit divides each of the first lane marking, the second lane marking, and the third lane marking into a plurality of sections and selects, based on reliability, one of the first lane marking, the second lane marking, and the third lane marking as the lane marking in each section.
- The lane marking recognition device according to claim 2, wherein the output lane marking construction unit attaches reliability information to each of the information on the first lane marking, the information on the second lane marking, and the information on the third lane marking, and selects a lane marking with high reliability from among the first lane marking, the second lane marking, and the third lane marking as the lane marking in each section.
- a positioning information acquisition unit that acquires positioning information detected by a positioning sensor mounted on the own vehicle; a self-position/posture estimation unit that estimates a position and posture of the own vehicle based on the positioning information; and
a history storage unit that accumulates, as a history, the position and posture of the own vehicle estimated by the self-position/posture estimation unit and the information on the first lane marking generated by the first lane marking generation unit, and generates a fourth lane marking by extending the first lane marking to the rear of the own vehicle,
wherein the output lane marking construction unit constructs, using the third lane marking and the fourth lane marking, an output lane marking behind the own vehicle that serves as a boundary between the own lane in which the own vehicle is traveling and an adjacent lane adjacent to the own lane; the lane marking recognition device according to claim 1. - The other-vehicle state estimation unit estimates states of a plurality of other vehicles around the own vehicle,
the third lane marking generation unit estimates a plurality of third lane markings based on positional relationships between the respective traveling loci of the plurality of other vehicles estimated by the other-vehicle state estimation unit and the first lane marking, and
the output lane marking construction unit constructs an output lane marking to be output as a lane marking by using at least one of the first lane marking, the second lane marking, and the plurality of third lane markings; the lane marking recognition device according to claim 1. - The lane marking recognition device according to claim 5, wherein the third lane marking generation unit compares the plurality of third lane markings with each other and, when adjacent third lane markings are within a threshold distance of each other, adopts one of the plurality of third lane markings, or a lane marking obtained by integrating the plurality of third lane markings, as the third lane marking.
- A vehicle control device that determines, using information on the output lane marking behind the own vehicle constructed by the lane marking recognition device according to claim 4, whether the traveling lane of another vehicle traveling behind the own vehicle is the own lane or the adjacent lane.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/254,637 US20240005673A1 (en) | 2020-12-16 | 2021-09-21 | Dividing line recognition device |
JP2022569716A JP7470214B2 (ja) | 2020-12-16 | 2021-09-21 | 区画線認識装置 |
DE112021005227.6T DE112021005227T5 (de) | 2020-12-16 | 2021-09-21 | Trennlinienerkennungsvorrichtung |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020208174 | 2020-12-16 | ||
JP2020-208174 | 2020-12-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022130720A1 true WO2022130720A1 (ja) | 2022-06-23 |
Family
ID=82059695
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/034493 WO2022130720A1 (ja) | 2020-12-16 | 2021-09-21 | 区画線認識装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240005673A1 (ja) |
JP (1) | JP7470214B2 (ja) |
DE (1) | DE112021005227T5 (ja) |
WO (1) | WO2022130720A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010221909A (ja) * | 2009-03-24 | 2010-10-07 | Hitachi Automotive Systems Ltd | 走行環境認識装置および車両制御装置 |
JP2015001773A (ja) * | 2013-06-13 | 2015-01-05 | ボッシュ株式会社 | 車線推定装置 |
JP2019051808A (ja) * | 2017-09-14 | 2019-04-04 | トヨタ自動車株式会社 | 運転支援装置 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7156924B2 (ja) | 2018-11-29 | 2022-10-19 | 株式会社Soken | 車線境界設定装置、車線境界設定方法 |
-
2021
- 2021-09-21 JP JP2022569716A patent/JP7470214B2/ja active Active
- 2021-09-21 WO PCT/JP2021/034493 patent/WO2022130720A1/ja active Application Filing
- 2021-09-21 DE DE112021005227.6T patent/DE112021005227T5/de active Pending
- 2021-09-21 US US18/254,637 patent/US20240005673A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010221909A (ja) * | 2009-03-24 | 2010-10-07 | Hitachi Automotive Systems Ltd | 走行環境認識装置および車両制御装置 |
JP2015001773A (ja) * | 2013-06-13 | 2015-01-05 | ボッシュ株式会社 | 車線推定装置 |
JP2019051808A (ja) * | 2017-09-14 | 2019-04-04 | トヨタ自動車株式会社 | 運転支援装置 |
Also Published As
Publication number | Publication date |
---|---|
DE112021005227T5 (de) | 2023-08-31 |
JPWO2022130720A1 (ja) | 2022-06-23 |
JP7470214B2 (ja) | 2024-04-17 |
US20240005673A1 (en) | 2024-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109484404B (zh) | 车辆控制装置、车辆控制方法及存储介质 | |
CN107451521B (zh) | 车辆车道图估算 | |
US10074279B1 (en) | Inference-aware motion planning | |
US10730503B2 (en) | Drive control system | |
CN106996793B (zh) | 地图更新判定系统 | |
EP3644294B1 (en) | Vehicle information storage method, vehicle travel control method, and vehicle information storage device | |
CN110546696B (zh) | 用于自主车辆的用于自动生成和更新数据集的方法 | |
CN101326511B (zh) | 用于检测或预测车辆超车的方法 | |
GB2614379A (en) | Systems and methods for vehicle navigation | |
US20150149076A1 (en) | Method for Determining a Course of a Traffic Lane for a Vehicle | |
US20190278285A1 (en) | Vehicle control device, vehicle control method, and storage medium | |
US20210070289A1 (en) | Vehicle control device, vehicle control method, and storage medium | |
JP2019128612A (ja) | 車両制御装置、車両制御方法、およびプログラム | |
JP7005326B2 (ja) | 路側物認識装置 | |
US10501077B2 (en) | Inter-vehicle distance estimation method and inter-vehicle distance estimation device | |
WO2022130720A1 (ja) | 区画線認識装置 | |
JP2023054084A (ja) | 車両用ステレオカメラ装置 | |
Xu et al. | Context-aware tracking of moving objects for distance keeping | |
US20220383646A1 (en) | Mobile object control device, mobile object control method, and storage medium | |
JP4844812B2 (ja) | 車両間情報通信システム | |
US11989950B2 (en) | Information processing apparatus, vehicle system, information processing method, and storage medium | |
JP7216695B2 (ja) | 周囲車両監視装置及び周囲車両監視方法 | |
JP2022139009A (ja) | 運転支援装置、運転支援方法及びプログラム | |
JP7291015B2 (ja) | 周囲物体認識方法及び周囲物体認識装置 | |
JP2018128906A (ja) | 車両制御装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21906085 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022569716 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18254637 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21906085 Country of ref document: EP Kind code of ref document: A1 |