WO2022254788A1 - Rear white line estimation device, target recognition device, and method - Google Patents
Rear white line estimation device, target recognition device, and method
- Publication number
- WO2022254788A1 (PCT/JP2022/004439)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- white line
- vehicle
- azimuth
- unit
- estimating
- Prior art date
Classifications
- G06V20/588: Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
- B60W30/16: Control of distance between vehicles, e.g. keeping a distance to a preceding vehicle (adaptive cruise control)
- B60W30/18163: Lane change; overtaking manoeuvres
- B60W40/06: Road conditions (estimation of non-directly measurable driving parameters related to ambient conditions)
- G06T7/70: Determining position or orientation of objects or cameras (image analysis)
- G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G08G1/16: Anti-collision systems (traffic control systems for road vehicles)
- B60W2420/403: Image sensing, e.g. optical camera
- B60W2552/53: Road markings, e.g. lane marker or crosswalk
- B60W2554/80: Spatial relation or speed relative to objects
- G06T2207/30252: Vehicle exterior; vicinity of vehicle
- G06V2201/07: Target detection
Definitions
- The present invention relates to a rear white line estimation device, a target recognition device, and a method for grasping and recognizing the position of a target behind the own vehicle in relation to the white line behind the vehicle.
- In operating a vehicle, safety behind the vehicle must be ensured, which requires detecting a following vehicle as a target with a radar device and confirming its position.
- For example, when the own vehicle is traveling on a road with a plurality of lanes, it may be difficult for the radar device to determine in which lane a following vehicle is traveling. On a road curving to the left, for instance, a following vehicle traveling in the lane to the right of the own vehicle may be positioned directly behind the own vehicle, making it impossible to determine the lane in which the following vehicle is traveling.
- Patent Document 1 is known as a driving support device that recognizes a following vehicle with a radar device and makes it possible to determine the lane in which the following vehicle is traveling.
- The driving support device described in Patent Document 1 calculates the lane shape in a section extending a predetermined distance rearward from the current position of the own vehicle, based on a lane information group consisting of a plurality of items of lane information that indicate the relative positions of the own vehicle and the lane and are detected from a plurality of images continuously captured by an imaging device during a predetermined data acquisition period from the present into the past, and on the travel trajectory of the own vehicle during the data acquisition period calculated from the detection results of the vehicle speed sensor and the yaw rate sensor during that period.
- In Patent Document 1, the detection results of the vehicle speed sensor and the yaw rate sensor are thus used to calculate the lane shape in the section extending a predetermined distance rearward from the current position of the own vehicle. It is also known that the same can be achieved using GNSS.
- Vehicles equipped with ADAS (Advanced Driver Assistance Systems) are generally fitted with a sensor that detects the white line ahead of the vehicle, but not with a sensor that detects the white line behind it, so the rear white line must be estimated by other means.
- However, map-dependent GNSS cannot be used in tunnels, built-up areas, and the like, and yaw rate integration suffers from an accumulated error that grows when changing lanes.
- To address this, the rear white line estimation device according to the present invention comprises: a white line detection unit that detects the white line ahead of the own vehicle; a white-line-relative azimuth generation unit that obtains, at each predetermined time, the azimuth of the own vehicle relative to the white line based on the detection result of the white line detection unit; and a relative azimuth estimation unit that estimates, based on the azimuth of the own vehicle obtained by the white-line-relative azimuth generation unit, the azimuth relative to the azimuth of the own vehicle at a certain point.
- The target recognition device according to the present invention comprises: a white line detection unit that detects the white line ahead of the own vehicle; a white-line-relative azimuth generation unit that obtains, at each predetermined time, the azimuth of the own vehicle relative to the white line based on the detection result of the white line detection unit; a relative azimuth estimation unit that estimates, based on the azimuth of the own vehicle obtained by the white-line-relative azimuth generation unit, the azimuth relative to the azimuth of the own vehicle at a certain point; a target detection unit that detects a target behind the own vehicle; a movement distance estimation unit that estimates the movement distance of the own vehicle; an own-vehicle trajectory estimation unit that estimates the movement trajectory of the own vehicle relative to the white line based on the relative azimuth at each predetermined time estimated by the relative azimuth estimation unit and the movement distance at each predetermined time estimated by the movement distance estimation unit; a white line position estimation unit that estimates the position of a white line outside the current detection range of the white line detection unit based on past detection results of the white line detection unit and the movement trajectory; and a target position identification unit that identifies the positional relationship between the target and the white line outside the detection range based on the position of the white line outside the detection range estimated by the white line position estimation unit and the position of the target detected by the target detection unit.
- Likewise, the rear white line estimation method uses a computer to detect the white line ahead of the own vehicle, obtain the azimuth of the own vehicle relative to the white line at each predetermined time based on the detection result of the white line, and estimate, based on the azimuth of the own vehicle, the azimuth relative to the azimuth of the own vehicle at a certain point.
- The target recognition method uses a computer to detect the white line ahead of the own vehicle, obtain the azimuth of the own vehicle relative to the white line at each predetermined time based on the detection result of the white line, estimate, based on the azimuth of the own vehicle, the azimuth relative to the azimuth of the own vehicle at a certain point, detect a target behind the own vehicle, estimate the movement distance of the own vehicle, estimate the movement trajectory of the own vehicle relative to the white line based on the estimated relative azimuth at each predetermined time and the estimated movement distance at each predetermined time, estimate the position of a white line outside the current detection range of white line detection based on past white line detection results and the movement trajectory, and identify the positional relationship between the target and the white line outside the detection range based on the estimated position of the white line outside the detection range and the detected position of the target.
- FIG. 2 is a flowchart showing the processing of a trajectory point sequence derivation unit 30.
- FIG. 3 is a diagram showing the characteristics of each azimuth detection method.
- FIG. 4 is a flowchart showing the details of processing in an object position identification unit 40.
- FIG. 5 is a flowchart showing the processing of the rear white line estimation described in Example 2.
- FIG. 1 is a diagram showing an example of the overall configuration of a target object recognition device according to the present invention.
- The target object recognition device 10 of FIG. 1 takes the outputs of a plurality of sensors S and finally outputs the object position and the lane to which the object belongs.
- These sensors S and their outputs are: latitude, longitude, and positioning status from the GNSS receiver S1; vehicle speed from the speed sensor S2; yaw rate from the yaw rate sensor S3; the forward white line from the front camera S4; and the object (target) position from the rear radar S5.
- These sensor inputs are time-series data sampled at regular intervals, and each sample includes the time at which it was taken. The amount of data may be reduced by treating the transmission timing of the sensor as the time, or by providing sensor input only when an object to be detected by the sensor is actually present.
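To make the timestamped sensor inputs above concrete, a minimal sketch of one possible in-memory representation follows; the class and field names are illustrative assumptions and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class SensorSample:
    """One timestamped sample from the sensors S1-S5 (illustrative layout)."""
    t: float                                         # sampling time [s]
    gnss: Optional[Tuple[float, float, int]] = None  # (lat, lon, fix quality) from S1
    speed: Optional[float] = None                    # vehicle speed [m/s] from S2
    yaw_rate: Optional[float] = None                 # yaw rate [rad/s] from S3
    white_line_pts: Optional[List[Tuple[float, float]]] = None  # forward white line from S4
    rear_targets: Optional[List[Tuple[float, float]]] = None    # radar targets from S5
```

Fields left as None model the data-saving option of only providing a sensor's value when something was actually detected.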
- The target object recognition device 10 is part of a driving support device installed in a vehicle capable of automated driving, and implements the driving support device's function of outputting information on whether a lane change is permissible.
- The target object recognition device 10 also includes a control unit implemented as a computer in which a CPU, ROM, RAM, input/output devices, and the like are connected to a bus, and this control unit controls the processing described below.
- The processing functions of the target object recognition device 10 in FIG. 1 are composed of a distance/azimuth conversion unit 20, a trajectory point sequence derivation unit 30 that obtains the movement trajectory of the own vehicle as a point sequence, and an object position identification unit 40 that obtains the position of the object behind the vehicle and the lane to which it belongs.
- The trajectory point sequence derivation unit 30, which obtains the vehicle's movement trajectory as a trajectory point sequence, receives distance information and azimuth information from the output of the distance/azimuth conversion unit 20.
- The distance information can be obtained from the latitude, longitude, and positioning status from GNSS (S1) and from the vehicle speed from the speed sensor S2. The azimuth information can be obtained from the latitude, longitude, and positioning status from GNSS (S1), from the yaw rate from the yaw rate sensor S3, and from the point sequence group of the forward white line from the front camera S4.
- The latitude, longitude, and positioning state from GNSS (S1) are accumulated in the accumulation unit 21a, from which the GNSS distance information 21b, the GNSS azimuth accuracy estimation information 21c, and the GNSS azimuth information 21d are extracted.
- The GNSS distance information 21b is provided to the movement distance estimation unit 31 of the trajectory point sequence derivation unit 30, and the GNSS azimuth accuracy estimation information 21c and GNSS azimuth information 21d are provided to the azimuth selection unit 32 of the trajectory point sequence derivation unit 30.
- The cumulative distance information 22b obtained by integrating the vehicle speed from the speed sensor S2 is also given to the movement distance estimation unit 31 of the trajectory point sequence derivation unit 30.
- The yaw rate from the yaw rate sensor S3 is integrated to obtain the cumulative azimuth information 23d, and the azimuth accuracy is estimated to obtain the azimuth accuracy estimation information 23c.
- This azimuth accuracy is high while the travelled distance and the azimuth displacement are small, and decreases as the distance or the displacement grows.
- The cumulative azimuth information 23d and the azimuth accuracy estimation information 23c are given to the azimuth selection unit 32 of the trajectory point sequence derivation unit 30.
- The distance/azimuth conversion unit 20 also newly generates white-line-relative azimuth information 24d from the point sequence group of the forward white line from the front camera S4, estimates the azimuth accuracy at this time to obtain azimuth accuracy estimation information 24c, and holds the white line history in the white line history holding unit 24e.
- The white-line-relative azimuth information 24d and the azimuth accuracy estimation information 24c are given to the azimuth selection unit 32 of the trajectory point sequence derivation unit 30.
- The azimuth accuracy in this case is high when the change in the white line is small, and decreases when the change in the white line is large.
- The method of calculating the white-line-relative azimuth information 24d is described in detail in the second embodiment.
- The trajectory point sequence derivation unit 30 estimates the movement distance in the movement distance estimation unit 31 from the GNSS distance information 21b included in the GNSS information and from the cumulative distance information 22b obtained by integrating the vehicle speed.
- The idea of movement distance estimation is to use the difference between GNSS positions when the GNSS conditions are good, and, when use of GNSS is inappropriate because the positioning state has deteriorated due to terrain, tunnels, buildings, or other obstacles or due to radio wave conditions, to output as the distance the cumulative distance information 22b obtained by time-integrating the vehicle speed.
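As a hedged illustration of this idea, the sketch below estimates the per-frame movement distance, falling back from the GNSS position difference to time-integrated vehicle speed when the positioning state is poor; the function name, the local metric projection, and the fallback flag are assumptions, not elements disclosed in the patent.

```python
import math

def movement_distance(prev_gnss, cur_gnss, fix_ok: bool,
                      speed_mps: float, dt_s: float) -> float:
    """Per-frame travelled distance [m].

    prev_gnss / cur_gnss: (x, y) positions already projected into a local
    metric frame (e.g. ENU); fix_ok: True while the GNSS positioning state
    is good enough to trust the position difference.
    """
    if fix_ok and prev_gnss is not None and cur_gnss is not None:
        dx = cur_gnss[0] - prev_gnss[0]
        dy = cur_gnss[1] - prev_gnss[1]
        return math.hypot(dx, dy)      # GNSS position difference (cf. 21b)
    return speed_mps * dt_s            # vehicle-speed integration (cf. 22b)
```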
- In the azimuth selection unit 32, the trajectory point sequence derivation unit 30 preferably selects and outputs one of the azimuths derived from GNSS, from the yaw rate, and from the forward white line point sequence group.
- The azimuth accuracy information (21c, 23c, 24c) obtained together with the GNSS, yaw rate, and forward white line point sequence azimuths serves as the criterion for this selection.
- FIG. 2 shows a flowchart of the processing of the trajectory point sequence derivation unit 30.
- This flow is executed repeatedly at each predetermined control cycle, as shown in processing step St0.
- First, the azimuth accuracy information (21c, 23c, 24c) is input.
- In processing step St2, it is determined whether GNSS is good and the speed is high, or whether GNSS is good and the change in the cumulative azimuth 23d is small; if either condition holds, the GNSS azimuth 21d is selected as the azimuth information.
- If the condition is not satisfied (no) in the judgment of processing step St2, the judgment of processing step St4 or St5 is performed.
- In processing step St4, when the cumulative azimuth 23d changes greatly and the azimuth information 24d based on the forward white line point sequence group is usable, the white-line-relative azimuth 24d is selected as the azimuth information in processing step St6.
- In processing step St5, when the cumulative azimuth 23d changes little and the azimuth based on the forward white line point sequence group cannot be used, the yaw-rate-based azimuth 23d is selected as the azimuth information in processing step St7.
- If none of the determinations in processing steps St2, St4, and St5 hold (no), then in processing step St8 the selection result of the previous control cycle is preferably maintained for the time being as a fallback.
- The various conditions for the above determinations may be detected directly, or the azimuth accuracy information (21c, 23c, 24c) may be used.
- The selected azimuth information is passed, together with the distance information from the movement distance estimation unit 31, to the own vehicle trajectory estimation unit 33, which is the next processing stage. After this selection, the process returns to processing step St0 and waits for the start of the next control cycle.
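A minimal sketch of the selection logic of FIG. 2 follows; the boolean inputs stand in for the accuracy information (21c, 23c, 24c), and the function and argument names are assumptions.

```python
def select_azimuth(gnss_ok: bool, speed_high: bool,
                   cum_azimuth_change_large: bool, white_line_usable: bool,
                   az_gnss: float, az_white_line: float,
                   az_yaw_rate: float, az_previous: float) -> float:
    """Pick one azimuth source per control cycle (cf. steps St2 to St8)."""
    # St2: GNSS good and (high speed or small cumulative-azimuth change)
    # -> GNSS azimuth 21d
    if gnss_ok and (speed_high or not cum_azimuth_change_large):
        return az_gnss
    # St4 -> St6: large cumulative-azimuth change and usable forward white
    # line -> white-line-relative azimuth 24d
    if cum_azimuth_change_large and white_line_usable:
        return az_white_line
    # St5 -> St7: small change and white line not usable -> yaw-rate azimuth 23d
    if not cum_azimuth_change_large and not white_line_usable:
        return az_yaw_rate
    # St8: otherwise keep the previous cycle's selection
    return az_previous
```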
- The selection shown in FIG. 2 reflects the characteristics of each azimuth detection method summarized in FIG. 3.
- In FIG. 3, each row corresponds to an azimuth estimation method (GNSS, yaw rate, or the forward white line point sequence group), and the columns list the situations in which the method cannot be used and the conditions under which it is highly accurate:

Method | Unusable situations | High-accuracy conditions
---|---|---
GNSS | Tunnels, built-up areas | High-speed driving with good positioning
Yaw rate | None (usable at any time) | Small distance and small displacement; accuracy drops, and use becomes unsuitable, as distance or displacement grows
Forward white line point sequence | Environments with no white line | Small change in the white line
- The own vehicle trajectory estimation unit 33 in the trajectory point sequence derivation unit 30 performs its processing at each control cycle, for example every 50 ms (hereinafter a control cycle is sometimes referred to as a frame), using the distance information and the azimuth information.
- In each frame, the azimuth selected from among the three kinds of azimuths (from GNSS, from the yaw rate, and from the forward white line point sequence group) is stored.
- The current position and azimuth are accumulated from the azimuth and the movement distance.
- Accumulating the azimuth here means updating the current position and azimuth using the difference from the reference azimuth stored in the same manner in the previous control cycle.
- The accumulated current position and azimuth are obtained as the information of the trajectory point sequence.
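The accumulation described above can be sketched as follows, assuming the pose is kept in a 2-D ground-fixed frame and updated from the per-frame azimuth difference and movement distance; the data layout and function name are illustrative, not the patent's own implementation.

```python
import math

def accumulate_trajectory(trajectory, d_azimuth_rad: float, distance_m: float):
    """Append the new pose to the trajectory point sequence.

    trajectory: list of (x, y, heading) poses in ground-fixed coordinates;
    d_azimuth_rad: azimuth change since the previous control cycle (frame);
    distance_m: distance travelled since the previous frame.
    """
    if not trajectory:
        trajectory.append((0.0, 0.0, 0.0))   # the first pose defines the frame
        return trajectory
    x, y, heading = trajectory[-1]
    heading += d_azimuth_rad
    x += distance_m * math.cos(heading)
    y += distance_m * math.sin(heading)
    trajectory.append((x, y, heading))
    return trajectory
```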
- The object position identification unit 40 receives the trajectory point sequence information on the current position and azimuth from the own vehicle trajectory estimation unit 33, the white line history information 24e that holds the point sequences of the forward white line captured by the front camera S4, and the information on the position of the object detected by the rear radar S5.
- The object position identification unit 40 takes as inputs the position of the target (radar target) detected by the rear radar in vehicle-centered coordinates, the difference between the current vehicle azimuth and the azimuth stored in the same way in the previous cycle (the relative azimuth), and the current position of the own vehicle in ground-fixed coordinates (the vehicle trajectory).
- The vehicle-centered coordinates are a coordinate system referenced to the position and orientation of the own vehicle.
- The vehicle-centered coordinates therefore change as the vehicle moves or changes direction.
- The ground-fixed coordinates are a coordinate system referenced to a specific position outside the vehicle, such as a point on the road.
- Ground-fixed coordinates are basically a coordinate system that is constant regardless of the movement of the vehicle. However, as when UTM is used, a plurality of ground-fixed coordinate systems may be switched according to the movement of the own vehicle to reduce errors caused by the fact that the earth's surface is not flat.
- The position and orientation serving as the reference of the ground-fixed coordinates can be chosen arbitrarily; for example, the vehicle-centered coordinates at a certain specific time may be adopted as the ground-fixed coordinates, with the reference left unchanged as the vehicle subsequently moves.
- The ground-fixed coordinates in this example do not require the vehicle's position on the earth, unlike UTM, so they can be used even in situations where GNSS is not available at all.
- The calculation load may also be reduced by assuming that the road surface is flat, omitting the z-axis, and representing the coordinates in two dimensions.
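For illustration only, a 2-D conversion between vehicle-centered and ground-fixed coordinates (with the z-axis omitted as described above) might look like the following; the pose representation (x, y, heading) is an assumption.

```python
import math

def vehicle_to_ground(pt_vehicle, vehicle_pose):
    """Project a point from vehicle-centered to ground-fixed coordinates.

    pt_vehicle: (x, y) with x pointing forward and y to the left of the
    own vehicle; vehicle_pose: (X, Y, heading) in ground-fixed coordinates.
    """
    px, py = pt_vehicle
    X, Y, th = vehicle_pose
    c, s = math.cos(th), math.sin(th)
    return (X + c * px - s * py, Y + s * px + c * py)

def ground_to_vehicle(pt_ground, vehicle_pose):
    """Inverse projection, used when comparing positions in a past frame's
    vehicle-centered coordinates."""
    gx, gy = pt_ground
    X, Y, th = vehicle_pose
    dx, dy = gx - X, gy - Y
    c, s = math.cos(th), math.sin(th)
    return (c * dx + s * dy, -s * dx + c * dy)
```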
- The position of the radar target is projected onto the ground-fixed coordinates in processing step St12.
- In processing step St13, the frame of the vehicle trajectory in which the white line was detected at the position closest to the ground-fixed coordinates of the radar target is extracted, and in processing step St14 the radar target and the white line are compared in the vehicle-centered coordinates of that frame to identify the lane.
- In this way, by using the relative azimuth from a specific point obtained from the azimuth relative to the white line, the azimuth error is reduced and the lane can be identified.
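Putting steps St12 to St14 together, a hedged sketch of the lane identification could look like the following, reusing the coordinate helpers from the previous sketch; the stored-frame layout and the left/right line fields are assumptions rather than the patent's data structures.

```python
def identify_lane(target_vehicle_xy, current_pose, frames):
    """frames: list of dicts with 'pose' ((x, y, heading) in ground-fixed
    coordinates) and 'left_y'/'right_y' (lateral positions of the own lane's
    left and right lines in that frame's vehicle-centered coordinates)."""
    # St12: project the radar target into ground-fixed coordinates
    tg = vehicle_to_ground(target_vehicle_xy, current_pose)
    # St13: extract the stored frame closest to the target's position
    best = min(frames, key=lambda f: (f['pose'][0] - tg[0]) ** 2 +
                                     (f['pose'][1] - tg[1]) ** 2)
    # St14: compare the target and the white lines in that frame's coordinates
    _, ty = ground_to_vehicle(tg, best['pose'])
    if best['right_y'] <= ty <= best['left_y']:
        return 'own lane'
    return 'left lane' if ty > best['left_y'] else 'right lane'
```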
- The execution of the processing in this embodiment may be limited to cases where the object position and its lane are actually needed, such as when there is a following vehicle posing a collision risk, so as to save computer resources.
- The present invention may also be implemented when a sensor capable of detecting the rear white line, such as a rear camera, is installed; by comparing the detection result of that sensor with the rear white line estimated by this processing, accuracy may be improved or redundancy may be added.
- FIG. 5 shows a specific example of the processing flow of the rear white line estimation.
- The processing flow of FIG. 5 can be roughly divided into the vehicle azimuth θ calculation part St30, the movement distance and movement trajectory estimation part St40, and the lane identification part St50.
- The vehicle azimuth θ calculation part St30 corresponds to the processing of the white-line-relative azimuth generation 24d in FIG. 1.
- The lane identification part St50 corresponds to the processing of the object position identification unit 40 in FIG. 1.
- In processing step St31, in each frame (control cycle: 50 ms cycle), the plurality of time-series white line point sequences detected by the sensor S4 (front camera) are acquired (as arrays of x, y arrays).
- In processing step St32, an approximation parameter is obtained for each white line point sequence, and the azimuth is derived from it using Atan2.
- Atan2 is a standard C language function that takes the y-coordinate and the x-coordinate of a point as arguments, in that order, and returns the angle in radians between the x-axis and the straight line connecting the origin to that point.
- When it is determined in processing step St33 that no white line has been detected, the process proceeds to processing step St35 and, since the white line cannot be seen, the azimuth of the previous frame is assumed to continue. A value integrated from the yaw rate may also be used when the road curves or the white line is not detected.
- In processing step St32 an example of linear approximation is shown, but a polynomial approximation may also be used as the approximation curve.
- In that case, the relative angle can similarly be obtained by differentiation.
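The azimuth θ of the own vehicle relative to the white line in step St32 can be sketched as follows, using a least-squares line fit through the white line point sequence and Atan2 as described above; the use of NumPy and the sign convention are assumptions.

```python
import math
import numpy as np

def white_line_azimuth(points_xy, prev_theta: float) -> float:
    """Azimuth of the own vehicle relative to the white line (cf. St31-St35).

    points_xy: white line point sequence [(x, y), ...] in vehicle-centered
    coordinates (x forward); prev_theta: the previous frame's azimuth, kept
    when no white line is detected (St33 -> St35).
    """
    if points_xy is None or len(points_xy) < 2:    # St33: no usable white line
        return prev_theta                          # St35: keep previous azimuth
    xs = np.array([p[0] for p in points_xy])
    ys = np.array([p[1] for p in points_xy])
    slope, _intercept = np.polyfit(xs, ys, 1)      # St32: linear approximation
    # Angle of the white line in vehicle coordinates; the vehicle's azimuth
    # relative to the line is its negative.
    return -math.atan2(slope, 1.0)
```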
- The movement distance and movement trajectory estimation part St40 in the processing flow of FIG. 5 calculates the movement distance and the movement azimuth in each frame and integrates them.
- The movement distance is obtained as the product of the speed and the elapsed time since the previous frame,
- and the movement azimuth is obtained by adding the product of the yaw rate and the elapsed time since the previous frame to the vehicle azimuth θ.
- This calculation method is simple and contains an error, but the error can be reduced by subdividing the elapsed time since the previous frame.
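A minimal sketch of the per-frame update in St40, including the subdivision of the elapsed time mentioned above, might look like this; the number of substeps and the names are assumptions.

```python
import math

def integrate_motion(x: float, y: float, heading: float,
                     speed: float, yaw_rate: float, dt: float,
                     substeps: int = 5):
    """Advance the own-vehicle pose by one frame, splitting dt into substeps
    to reduce the error of the simple speed*time / yaw_rate*time integration."""
    h = dt / substeps
    for _ in range(substeps):
        heading += yaw_rate * h              # movement azimuth update
        x += speed * h * math.cos(heading)   # movement distance = speed * time
        y += speed * h * math.sin(heading)
    return x, y, heading
```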
- In the lane identification part St50, the white line position at the target position is obtained and the lane is identified. Specifically, for example, approximated curve parameters (a quadratic approximation plus a value range) are calculated and recorded for each frame, and the position at which the white line near the target can be detected with the highest quality is determined; for example, if the position 20 m ahead is the easiest to capture, the corresponding target position is 20 m behind. Using this, the frame in which the own vehicle was at that position is found, and the target is compared with the white line position recorded at that time to identify the lane.
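As an illustrative sketch of this lookup, each frame records the quadratic white line parameters and the own-vehicle pose, and when a target appears the recorded frame nearest to the target is retrieved and the white line evaluated there; the record format, the lane width, and the coarse lane-offset output are assumptions, and ground_to_vehicle is the helper sketched earlier.

```python
def lane_offset_of_rear_target(target_ground_xy, records, lane_width=3.5):
    """records: list of dicts with 'pose' ((x, y, heading) in ground-fixed
    coordinates) and 'coef' (a, b, c) of the white line y = a*x**2 + b*x + c
    in that frame's vehicle-centered coordinates."""
    # Find the recorded frame whose own-vehicle position is closest to the target
    best = min(records, key=lambda r: (r['pose'][0] - target_ground_xy[0]) ** 2 +
                                      (r['pose'][1] - target_ground_xy[1]) ** 2)
    tx, ty = ground_to_vehicle(target_ground_xy, best['pose'])
    a, b, c = best['coef']
    line_y = a * tx * tx + b * tx + c       # recorded white line at the target's x
    # Coarse lane offset relative to the recorded line (0: same lane,
    # +-1: adjacent lane); purely illustrative
    return round((ty - line_y) / lane_width)
```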
- According to the rear white line estimation method of the second embodiment, a third method is provided in addition to estimating the rear white line from GNSS and from the yaw rate, and the methods can be combined so that the best one is used according to the advantages and disadvantages of each.
Claims (13)
1. A rear white line estimation device comprising: a white line detection unit that detects a white line ahead of the own vehicle; a white-line-relative azimuth generation unit that obtains, at each predetermined time, the azimuth of the own vehicle relative to the white line based on the detection result of the white line detection unit; and a relative azimuth estimation unit that estimates, based on the azimuth of the own vehicle obtained by the white-line-relative azimuth generation unit, the azimuth relative to the azimuth of the own vehicle at a certain point.
2. The rear white line estimation device according to claim 1, wherein the detection result of the white line is a point sequence of the white line obtained at a fixed cycle, the point sequence of the white line is expressed as a function, and the azimuth is obtained from the positions of the white line point sequences at different times.
3. A target recognition device comprising: a white line detection unit that detects a white line ahead of the own vehicle; a white-line-relative azimuth generation unit that obtains, at each predetermined time, the azimuth of the own vehicle relative to the white line based on the detection result of the white line detection unit; a relative azimuth estimation unit that estimates, based on the azimuth of the own vehicle obtained by the white-line-relative azimuth generation unit, the azimuth relative to the azimuth of the own vehicle at a certain point; a target detection unit that detects a target behind the own vehicle; a movement distance estimation unit that estimates the movement distance of the own vehicle; an own-vehicle trajectory estimation unit that estimates the movement trajectory of the own vehicle relative to the white line based on the relative azimuth at each predetermined time estimated by the relative azimuth estimation unit and the movement distance at each predetermined time estimated by the movement distance estimation unit; a white line position estimation unit that estimates the position of a white line outside the current detection range of the white line detection unit based on past detection results of the white line detection unit and the movement trajectory; and a target position identification unit that identifies the positional relationship between the target and the white line outside the detection range based on the position of the white line outside the detection range estimated by the white line position estimation unit and the position of the target detected by the target detection unit.
4. The target recognition device according to claim 3, wherein the detection result of the white line is a periodically output point sequence of the white line, the point sequence of the white line is expressed as a function, and the azimuth is obtained from the positions of the white line point sequences at different times.
5. The target recognition device according to claim 3, wherein the own-vehicle trajectory estimation unit uses the change in the relative azimuth to estimate the movement trajectory of the own vehicle relative to the white line.
6. The target recognition device according to claim 3, wherein the relative azimuth estimation unit comprises an azimuth selection unit that, when estimating, based on the azimuth of the own vehicle, the azimuth relative to the azimuth of the own vehicle at a certain point, selects one of the azimuth obtained by the white-line-relative azimuth generation unit, the azimuth obtained from GNSS, and the azimuth obtained from the yaw rate.
7. The target recognition device according to claim 6, wherein the azimuth selection unit normally selects the azimuth obtained from GNSS and, in situations where the use of GNSS is inappropriate, selects either the azimuth obtained by the white-line-relative azimuth generation unit or the azimuth obtained from the yaw rate.
8. The target recognition device according to claim 6 or claim 7, wherein the azimuth selection unit selects the azimuth obtained from GNSS when a first condition is satisfied, namely that GNSS is good and the speed is high, or that GNSS is good and the change in the cumulative azimuth is small; selects the azimuth based on the point sequence group of the forward white line when the first condition is not satisfied, the change in the cumulative azimuth is large, and the azimuth based on the point sequence group of the forward white line is usable; and selects the azimuth based on the yaw rate when the first condition is not satisfied, the change in the cumulative azimuth is small, and the azimuth based on the point sequence group of the forward white line is not usable.
9. The target recognition device according to claim 3, wherein the position and azimuth of the own vehicle are stored in association with each other.
10. The target recognition device according to claim 3, wherein the white line position estimation unit records the positions of white lines detected in the past and uses approximation parameters for the records.
11. The target recognition device according to claim 3, wherein the white line position estimation unit uses the own-vehicle position on the movement trajectory at the time when the white line closest to the target was captured and the recorded white line.
12. A rear white line estimation method using a computer, comprising: detecting a white line ahead of the own vehicle; obtaining, at each predetermined time, the azimuth of the own vehicle relative to the white line based on the detection result of the white line; and estimating, based on the azimuth of the own vehicle, the azimuth relative to the azimuth of the own vehicle at a certain point.
13. A target recognition method using a computer, comprising: detecting a white line ahead of the own vehicle; obtaining, at each predetermined time, the azimuth of the own vehicle relative to the white line based on the detection result of the white line; estimating, based on the azimuth of the own vehicle, the azimuth relative to the azimuth of the own vehicle at a certain point; detecting a target behind the own vehicle; estimating the movement distance of the own vehicle; estimating the movement trajectory of the own vehicle relative to the white line based on the estimated relative azimuth at each predetermined time and the estimated movement distance at each predetermined time; estimating the position of a white line outside the current detection range of white line detection based on past white line detection results and the movement trajectory; and identifying the positional relationship between the target and the white line outside the detection range based on the estimated position of the white line outside the detection range and the detected position of the target.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/566,585 US20240282123A1 (en) | 2021-06-04 | 2022-02-04 | Rearward white line inference device, target recognition device, and method |
JP2023525374A JPWO2022254788A1 (ja) | 2021-06-04 | 2022-02-04 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-094127 | 2021-06-04 | ||
JP2021094127 | 2021-06-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022254788A1 (ja) | 2022-12-08 |
Family
ID=84324098
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/004439 WO2022254788A1 (ja) | 2021-06-04 | 2022-02-04 | 後方白線推定装置、物標認識装置並びに方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240282123A1 (ja) |
JP (1) | JPWO2022254788A1 (ja) |
WO (1) | WO2022254788A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120062747A1 (en) * | 2010-07-20 | 2012-03-15 | Gm Global Technology Operations, Inc. | Lane fusion system using forward-view and rear-view cameras |
WO2015194371A1 (ja) * | 2014-06-19 | 2015-12-23 | 日立オートモティブシステムズ株式会社 | 物体認識装置及びそれを用いた車両走行制御装置 |
JP2019046150A (ja) * | 2017-09-01 | 2019-03-22 | 株式会社Subaru | 走行支援装置 |
JP2019194037A (ja) * | 2018-05-01 | 2019-11-07 | 三菱電機株式会社 | 区画線認識装置 |
2022
- 2022-02-04 US US18/566,585: patent US20240282123A1 (en), active Pending
- 2022-02-04 JP JP2023525374A: patent JPWO2022254788A1 (ja), active Pending
- 2022-02-04 WO PCT/JP2022/004439: patent WO2022254788A1 (ja), active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20240282123A1 (en) | 2024-08-22 |
JPWO2022254788A1 (ja) | 2022-12-08 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22815547; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 2023525374; Country of ref document: JP
| WWE | Wipo information: entry into national phase | Ref document number: 18566585; Country of ref document: US
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22815547; Country of ref document: EP; Kind code of ref document: A1