WO2018168961A1 - Own-position estimating device - Google Patents

Own-position estimating device Download PDF

Info

Publication number
WO2018168961A1
WO2018168961A1 (PCT/JP2018/010068)
Authority
WO
WIPO (PCT)
Prior art keywords
lane
information
vehicle
position estimation
self
Prior art date
Application number
PCT/JP2018/010068
Other languages
French (fr)
Japanese (ja)
Inventor
雄一 南口
健司 三宅
竜巳 杉山
和美 伊佐治
稔 岡田
謙太 高橋
Original Assignee
DENSO Corporation (株式会社デンソー)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2017248745A external-priority patent/JP6870604B2/en
Application filed by DENSO Corporation (株式会社デンソー)
Publication of WO2018168961A1 publication Critical patent/WO2018168961A1/en
Priority to US16/568,606 priority Critical patent/US11639853B2/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10Map spot or coordinate position indicators; Map reading aids

Definitions

  • This disclosure relates to a self-position estimation apparatus.
  • The self-localization device described in Patent Document 1 below is known as an apparatus for estimating the self-position of a vehicle.
  • The device described in Patent Document 1 improves localization accuracy by using existing road infrastructure, such as white lines and road signs, to supplement position calculation based on GPS (Global Positioning System), inertial devices, and vehicle speed pulses.
  • Specifically, the azimuth angle of a white line appearing in an image captured by a camera is calculated, and a Kalman filter performs error estimation based on the difference between the azimuth angle of the white line stored in an azimuth database and the azimuth angle calculated from the image.
  • In Patent Document 1, since an image captured by a camera is used, the error cannot be estimated correctly when a clear image cannot be obtained, such as in bad weather. In particular, when position estimation at the lane level is required, the technique of Patent Document 1 cannot cope. Advanced driving support and automated driving require identifying the lane and the travel position within the lane, so more accurate self-position estimation is needed.
  • This disclosure aims to provide a self-position estimation device capable of highly accurate position estimation at the lane level.
  • The present disclosure relates to a self-position estimation device comprising: a map information acquisition unit (201) that acquires map information including lane information identifying lanes in which a vehicle can travel; an in-lane position detection unit (202) that detects in-lane position information specifying the position within the lane in which the host vehicle is traveling; an absolute position estimation unit (203) that estimates absolute position information specifying the absolute position of the host vehicle and its error; and a host vehicle position estimation unit (204) that estimates the position of the host vehicle corresponding to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information.
  • Based on the correlation between the in-lane position, the absolute position, and its error, the host vehicle position estimation unit determines whether there are lane candidates, i.e., which of the lanes specified by the lane information the in-lane position may correspond to, and estimates the position of the host vehicle on the map based on this determination.
  • In this way, the position of the host vehicle can be estimated taking into account both the in-lane position and the absolute position for each lane candidate.
  • FIG. 1 is a block configuration diagram illustrating a functional configuration of the self-position estimation apparatus according to the embodiment.
  • FIG. 2 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 3 is a block configuration diagram illustrating a functional configuration of the self-position estimation apparatus according to the embodiment.
  • FIG. 4 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 5 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 6 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 7 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 8 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 9 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 10 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 11 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 12 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 13 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 14 is a diagram for describing self-position estimation according to the present embodiment.
  • the self-position estimation apparatus 10 is configured as a computer including a calculation unit such as a CPU, a storage unit such as a RAM and a ROM, and an interface unit for exchanging data with various sensors as hardware components. Subsequently, functional components of the self-position estimation apparatus 10 will be described.
  • The self-position estimation apparatus 10 includes a self-position measurement unit 101, a vehicle momentum measurement unit 102, a white line recognition unit 103, a surrounding environment measurement unit 104, a route information acquisition unit 105, a dead reckoning unit 106, a position estimation unit 108, a map information acquisition unit 109, a travel lane estimation unit 110, and a map information storage unit 120.
  • the self-position measuring unit 101 is a part for measuring the position of the own vehicle by GNSS (Global Navigation Satellite System).
  • the self-position measuring unit 101 calculates a vehicle measurement position that is a navigation measurement position of the vehicle based on navigation signals received from a plurality of navigation satellites.
  • The self-position measuring unit 101 outputs the calculated host vehicle measurement position to the dead reckoning unit 106 and the position estimation unit 108.
  • the vehicle momentum measuring unit 102 is a part that receives signals from sensors such as an acceleration sensor, a vehicle speed sensor, and a gyro sensor and measures the amount of movement of the vehicle.
  • The vehicle momentum measuring unit 102 outputs momentum information such as the vehicle speed, azimuth angle, yaw rate, and acceleration to the dead reckoning unit 106 and the position estimation unit 108.
  • the white line recognition unit 103 is a part for recognizing a white line that divides a lane using image data captured by the camera.
  • the white line recognition unit 103 outputs information on the presence or absence of a white line and information on the type of white line to the position estimation unit 108.
  • the surrounding environment measurement unit 104 is a part that measures the weather and satellite arrangement information.
  • the surrounding environment measurement unit 104 outputs weather and satellite arrangement information to the position estimation unit 108.
  • the route information acquisition unit 105 is a part that acquires the destination of the vehicle and the route to the destination from the navigation system.
  • the route information acquisition unit 105 outputs information indicating the destination and the route to the travel lane estimation unit 110.
  • The dead reckoning unit 106 calculates the position of the host vehicle, as the host vehicle gyro position, in places where positioning by GNSS alone is difficult, based on the host vehicle measurement position output from the self-position measurement unit 101 and the momentum information output from the vehicle momentum measurement unit 102.
  • The dead reckoning unit 106 outputs the calculated host vehicle gyro position to the position estimation unit 108.
  • The map information acquisition unit 109 acquires map information including lane information identifying lanes in which the vehicle can travel.
  • the map information acquisition unit 109 reads the map information stored in the map information storage unit 120 and outputs the read map information to the position estimation unit 108 and the travel lane estimation unit 110.
  • The position estimation unit 108 estimates a corrected vehicle position, i.e., a corrected position of the host vehicle, based on the map information and the host vehicle measurement position and/or the host vehicle gyro position.
  • The position estimation unit 108 estimates the corrected vehicle position by superimposing the likelihood of the map information on that of the host vehicle measurement position and/or the host vehicle gyro position.
  • As the likelihood, a probability distribution representing the certainty may be used, or a single numerical value representing the certainty may be used.
  • a solid white line SLa is provided on the left side in the traveling direction of the lane L1.
  • a broken line white line BLa is provided between the lane L1 and the lane L2.
  • a broken line white line BLb is provided between the lane L2 and the lane L3.
  • a solid white line SLb is provided on the right side in the traveling direction of the lane L3.
  • the lane center line L1c is a line indicating the center of the lane L1.
  • the lane center line L2c is a line indicating the center of the lane L2.
  • the lane center line L3c is a line indicating the center of the lane L3.
  • the map probability distribution PDm of the lane center line L1c, the lane center line L2c, and the lane center line L3c is regarded as the likelihood of the map information.
  • the host vehicle is at the host vehicle measurement position Pa.
  • the host vehicle is traveling along the lane L1 from the host vehicle measurement position Pa.
  • the own vehicle gyro position may be used instead of the own vehicle measurement position.
  • the position estimation unit 108 estimates the corrected vehicle position Pb by superimposing the vehicle position probability distribution PDca and the map information probability distribution PDm at the vehicle measurement position Pa at the first estimation timing.
  • the corrected host vehicle position Pb is corrected to the lane center line L1c side by the distance d1 as compared with the case where the correction is not performed from the host vehicle measurement position Pa.
  • the position estimation unit 108 estimates the corrected vehicle position Pc by superimposing the vehicle position probability distribution PDcb at the corrected vehicle position Pb and the map information probability distribution PDm at the next estimation timing.
  • the corrected vehicle position Pc is corrected to the lane center line L1c side by the distance d2 as compared to the case where no correction is made from the corrected vehicle position Pb.
  • The map probability distribution is not limited to the probability distribution of the lane center line; any probability distribution indicating the certainty of the map information may be used. Further, a map probability distribution offset according to the driver's driving habits or the road shape may be used.
  • the road shape includes information such as the road width and the presence or absence of an adjacent lane.
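The correction from Pa to Pb to Pc described above can be sketched as the product of two one-dimensional Gaussian distributions (the vehicle position distribution and the map probability distribution of the lane center line). The patent does not give an explicit fusion formula, so this is a minimal illustrative model with hypothetical variances:

```python
def fuse_gaussians(mu_a, var_a, mu_b, var_b):
    """Product of two 1-D Gaussians: returns the fused mean and variance."""
    var = (var_a * var_b) / (var_a + var_b)
    mu = (mu_a * var_b + mu_b * var_a) / (var_a + var_b)
    return mu, var

# Lateral coordinate across the road in metres; lane center line L1c at 0.0.
lane_centre = 0.0          # mean of the map probability distribution PDm
map_var = 1.0              # spread of PDm (hypothetical value)

# First estimation timing: measured position Pa lies 0.8 m off the centre.
pa, pa_var = 0.8, 1.0
pb, pb_var = fuse_gaussians(pa, pa_var, lane_centre, map_var)
d1 = pa - pb               # correction toward the lane center line

# Next estimation timing: fuse the corrected position Pb with PDm again.
pc, pc_var = fuse_gaussians(pb, pb_var, lane_centre, map_var)
d2 = pb - pc               # a further, smaller pull toward the centre
```

Each fusion both moves the estimate toward the lane center line and shrinks its variance, which mirrors how Pb and Pc are progressively corrected in the text.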
  • the self-position estimation apparatus 20 is configured as a computer including a calculation unit such as a CPU, a storage unit such as a RAM and a ROM, and an interface unit for exchanging data with various sensors as hardware components. Subsequently, functional components of the self-position estimation apparatus 20 will be described.
  • The self-position estimation apparatus 20 includes a map information acquisition unit 201, a lane position detection unit 202, an absolute position estimation unit 203, a vehicle position estimation unit 204, a comparison target detection unit 205, and a map information storage unit 211.
  • The map information acquisition unit 201 acquires map information including lane information identifying lanes in which the vehicle can travel.
  • the map information acquisition unit 201 reads the map information stored in the map information storage unit 211 and outputs the read map information to the own vehicle position estimation unit 204.
  • the in-lane position detection unit 202 is a part that detects in-lane position information that specifies an in-lane position that is a position in the lane in which the host vehicle is traveling.
  • the in-lane position detection unit 202 detects in-lane position information based on the surrounding environment and lane conditions captured by the camera.
  • the in-lane position detection unit 202 outputs in-lane position information to the own vehicle position estimation unit 204.
  • the in-lane position detection unit 202 identifies the in-lane position from the lateral deviation and turning angle of the host vehicle, and generates in-lane position information.
  • the absolute position estimation unit 203 shown in FIG. 3 is a part that estimates absolute position information that specifies the absolute position of the host vehicle and its error.
  • the absolute position estimation unit 203 outputs the estimated absolute position information to the own vehicle position estimation unit 204.
  • the absolute position estimation unit 203 can estimate the absolute position information by various methods.
  • the absolute position estimation unit 203 can estimate absolute position information that specifies the absolute position of the host vehicle and its error by GNSS.
  • The absolute position estimation unit 203 can calculate a host vehicle measurement position, i.e., a navigation measurement position of the host vehicle, based on navigation signals received from a plurality of navigation satellites, and can estimate the absolute position information based on this measurement position.
  • the absolute position estimation unit 203 can also receive signals from sensors such as an acceleration sensor, a vehicle speed sensor, and a gyro sensor, and measure the amount of movement of the host vehicle.
  • The absolute position estimation unit 203 can also estimate the absolute position information by calculating, as the host vehicle gyro position, the position of the host vehicle in places where positioning by GNSS alone is difficult, based on the host vehicle measurement position and the information indicating the movement amount of the host vehicle.
  • FIG. 4 shows an example of a mode estimated by the absolute position estimation unit 203.
  • the absolute position estimation unit 203 estimates “candidate 1” and “candidate 2” as absolute position information.
  • For candidate 1, the current position X^i_t is estimated based on the movement amount of the host vehicle from the previous position X^i_{t-1}.
  • The current position X^i_t is position information including an estimation error.
  • For candidate 2, the current position X^j_t is estimated taking into account the movement amount of the host vehicle from the previous position X^j_{t-1}.
  • The current position X^j_t is position information including an estimation error.
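The dead-reckoning propagation of the two candidates in FIG. 4 can be sketched as follows. The lane offset, error values, step count, and the root-sum-square error-growth model are all hypothetical assumptions for illustration; the patent does not specify them:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """One absolute-position candidate X^i_t with a scalar error radius (simplified)."""
    x: float
    y: float
    err: float  # 1-sigma position error in metres

def propagate(c, dx, dy, motion_err):
    """Advance a candidate by the measured movement amount; errors accumulate."""
    return Candidate(c.x + dx, c.y + dy, (c.err ** 2 + motion_err ** 2) ** 0.5)

# Two candidates as in FIG. 4, separated laterally by roughly one lane width.
cand1 = Candidate(0.0, 0.0, 0.5)
cand2 = Candidate(3.5, 0.0, 0.5)
step_err = 0.2              # hypothetical per-step odometry/gyro error

for _ in range(10):         # ten dead-reckoning steps without a GNSS fix
    cand1 = propagate(cand1, 0.0, 1.0, step_err)
    cand2 = propagate(cand2, 0.0, 1.0, step_err)

# The estimation error grows with every step, which is why the in-lane
# position observation is needed to keep the lane-level estimate sharp.
```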
  • The host vehicle position estimation unit 204 shown in FIG. 3 estimates the position of the host vehicle corresponding to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information.
  • Based on the correlation between the in-lane position, the absolute position, and its error, the host vehicle position estimation unit 204 determines whether there are lane candidates, i.e., which of the lanes specified by the lane information the in-lane position may correspond to, and estimates the position of the host vehicle on the map based on this determination. If no lane candidate exists, it may be determined that a sensor is abnormal.
  • "Gyro estimation candidate 1" and "Gyro estimation candidate 2" shown in FIG. 6 correspond to "candidate 1" and "candidate 2" shown in FIG. 4; the estimation errors of "candidate 1" and "candidate 2" in FIG. 4 are superimposed on the map information.
  • the “camera observation lateral deviation” shown in FIG. 6 is obtained by superimposing the “lateral deviation” shown in FIG. 5 on the map information for each lane.
  • P_{t-1} is the covariance matrix at the previous time step.
  • y_{t-1} is the lateral position at the previous time step.
  • θ_{t-1} is the attitude angle at the previous time step.
  • The lateral position y_t and the attitude angle θ_t are calculated by the following equation, which includes a system noise term.
  • The covariance matrix P_t estimated by the gyro sensor is calculated by the following equation using the covariance matrix P_{t-1} at the previous time step.
  • M is the covariance matrix of the system noise, and is set based on the error specification of the gyro sensor.
  • The superscript T denotes matrix transposition.
  • the information shown in FIG. 6 is sensor-fused by a Kalman filter.
  • the Kalman gain K is calculated by the following equation.
  • Q is an observation error matrix, and is set from the error characteristics of the observed values at the positions in the lane.
  • H is the 2×2 identity matrix.
  • Z_t is a vector in which the lateral position and the attitude angle detected by the gyro sensor are stacked vertically.
  • ΔZ is the difference between the camera observation of the in-lane position and attitude angle and the estimation result Z_t from the gyro sensor.
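The sensor fusion described above can be sketched with the standard Kalman update, using the quantities the text defines: H as the 2×2 identity, Q as the camera observation error, and the gyro-predicted covariance P_t. The exact equations appear only as figures in the original publication, so the predict/update form below is a conventional reconstruction, and all numerical values are hypothetical:

```python
import numpy as np

# State: [lateral position y, attitude angle theta] relative to the lane.
x = np.array([0.30, 0.02])           # gyro-based prediction Z_t (hypothetical)
P = np.array([[0.20, 0.0],           # predicted covariance P_t, already
              [0.0, 0.01]])          # including the system-noise covariance M

H = np.eye(2)                        # observation matrix: 2x2 identity, as stated
Q = np.array([[0.05, 0.0],           # observation error of the camera-based
              [0.0, 0.002]])         # in-lane position / attitude angle

z = np.array([0.10, 0.00])           # camera observation of (y, theta)
dZ = z - H @ x                       # innovation: camera minus gyro estimate

# Kalman gain K = P H^T (H P H^T + Q)^-1, then the standard update step.
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + Q)
x = x + K @ dZ                       # fused in-lane position and attitude angle
P = (np.eye(2) - K @ H) @ P          # fused covariance (smaller than before)
```

The fused covariance is smaller than either input's, reflecting how combining the gyro prediction with the camera observation sharpens the in-lane estimate.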
  • the lane corresponding candidates are lane 1 and lane 2.
  • When there are a plurality of lane candidates, the comparison target detection unit 205 is provided in order to reject some of them.
  • The comparison target detection unit 205 detects comparison target information, i.e., information different from the information used for detecting the in-lane position information and estimating the absolute position information.
  • As the comparison target information, line type information for distinguishing lanes, road shape information in the map information, position information from GPS, and the like are used.
  • The host vehicle position estimation unit 204 determines whether the selection condition is satisfied based on the lane candidates and the comparison target information, and selects among the lane candidates accordingly.
  • line type information is used as comparison target information.
  • The comparison target detection unit 205 detects, as line type information, a solid line on the left side and a broken line on the right side. Based on this line type recognition result, the host vehicle position estimation unit 204 estimates that the host vehicle is in the left-side lane in the traveling direction, rejects the right-side lane as a candidate, and completes the selection of lane candidates. In this case, the remaining lane candidate is the left-side lane in the traveling direction.
  • GPS information is used as comparison target information when both lanes are lane candidates.
  • The comparison target detection unit 205 acquires, as GPS information, position information including an error indicated by the broken-line circle in the figure. Based on the GPS information, the host vehicle position estimation unit 204 estimates that the host vehicle is in the left-side lane in the traveling direction, rejects the right-side lane as a candidate, and completes the selection of lane candidates. In this case, the remaining lane candidate is the left-side lane in the traveling direction.
  • map information is used as comparison target information when candidates exist in both lanes.
  • The comparison target detection unit 205 detects, as the comparison target information, the road shape from the map information.
  • The host vehicle position estimation unit 204 rejects the left-side lane in the traveling direction as a candidate because, as time elapses, the left-side candidate no longer overlaps with the map information, and completes the selection of lane candidates.
  • The remaining lane candidate is the right-side lane in the traveling direction and, after the lane change, the left-side lane in the traveling direction.
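The line-type-based rejection described above can be sketched as follows. The lane names, the dictionary of expected boundary line types, and the matching rule are hypothetical illustrations, not the patent's actual data structures:

```python
# Map line type information: for each lane candidate, the line types the map
# expects on its left and right boundaries (hypothetical two-lane road).
MAP_LANES = {
    "left_lane":  {"left": "solid",  "right": "dashed"},
    "right_lane": {"left": "dashed", "right": "solid"},
}

def select_candidates(candidates, observed_left, observed_right):
    """Keep only lane candidates whose map line types match the camera-detected
    running line type information (the comparison target information)."""
    kept = []
    for lane in candidates:
        expected = MAP_LANES[lane]
        if expected["left"] == observed_left and expected["right"] == observed_right:
            kept.append(lane)
    return kept

# The camera sees a solid line on the left and a dashed line on the right,
# so the right-side lane candidate is rejected, as in the example above.
remaining = select_candidates(["left_lane", "right_lane"], "solid", "dashed")
```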
  • The host vehicle position estimation unit 204 can notify that the reliability of the estimation result is low when there are a plurality of lane candidates or when the absolute position error is greater than or equal to a predetermined value. As shown in FIG. 13, when there is one candidate and the error variance is small, the reliability is determined to be high. Even with a single candidate, if the error variance is large, the reliability is determined to be low. When there are a plurality of candidates, the reliability is determined to be low even if the error variance is small, because the lane cannot be identified. When there are a plurality of candidates and the error variance is large, the reliability is likewise determined to be low.
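The four reliability cases of FIG. 13 can be captured in a small decision function; the variance threshold below is a hypothetical placeholder, since the patent only states "small" versus "large":

```python
def reliability(num_candidates, error_variance, variance_threshold=1.0):
    """Reliability of the self-position estimate, per the four cases of FIG. 13:
    high only when exactly one lane candidate remains and its error variance is
    small; in every other combination the lane cannot be pinned down reliably."""
    if num_candidates == 1 and error_variance < variance_threshold:
        return "high"
    return "low"

# One candidate with a tight error distribution: the lane is identified.
assert reliability(1, 0.2) == "high"
# Multiple candidates, or a wide error distribution, lower the reliability.
assert reliability(3, 0.2) == "low"
assert reliability(1, 2.5) == "low"
```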
  • FIG. 14A shows the position recognition result in the lane.
  • FIG. 14B shows the recognition result of the vehicle position.
  • FIG. 14C shows the number of lane candidates.
  • FIG. 14D shows the reliability.
  • the position in the lane is recognized.
  • The position of the host vehicle is recognized in the right lane.
  • the position in the lane is not recognized. For this reason, the error in recognizing the position of the host vehicle is large, and the reliability is low.
  • the in-lane position recognition is performed again, so that the error in the position recognition of the host vehicle is reduced and the reliability is also increased.
  • the position in the lane is not recognized, and the reliability is low again.
  • As described above, the self-position estimation device 20 includes: a map information acquisition unit 201 that acquires map information including lane information identifying lanes in which the vehicle can travel; an in-lane position detection unit 202 that detects in-lane position information specifying the position within the lane in which the host vehicle is traveling; an absolute position estimation unit 203 that estimates absolute position information specifying the absolute position of the host vehicle and its error; and a host vehicle position estimation unit 204 that estimates the position of the host vehicle corresponding to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information.
  • Based on the correlation between the in-lane position, the absolute position, and its error, the host vehicle position estimation unit 204 determines whether there are lane candidates, i.e., which of the lanes specified by the lane information the in-lane position may correspond to, and estimates the position of the host vehicle on the map based on this determination.
  • In this way, the position of the host vehicle can be estimated taking into account both the in-lane position and the absolute position for each lane candidate.
  • The host vehicle position estimation unit 204 determines lane candidates based on the degree of overlap between the in-lane position including its error distribution and the absolute position including its error distribution. By superimposing the two error distributions, the position of the host vehicle can be estimated with the errors properly reflected.
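The overlap determination between the in-lane position and the absolute position (each with an error distribution) can be sketched with a simple k-sigma interval test. The interval test, the lane geometry, and all numerical values are assumptions for illustration; the patent does not specify the overlap criterion:

```python
def overlaps(mu_a, sigma_a, mu_b, sigma_b, k=2.0):
    """True if the k-sigma intervals of two 1-D position estimates intersect."""
    return abs(mu_a - mu_b) <= k * (sigma_a + sigma_b)

def lane_candidates(lane_centres, inlane_offset, inlane_sigma, abs_pos, abs_sigma):
    """A lane is a candidate when the in-lane observation, placed into that lane,
    overlaps the absolute-position estimate within their combined errors."""
    cands = []
    for i, centre in enumerate(lane_centres):
        hypothesised = centre + inlane_offset    # in-lane position mapped to lane i
        if overlaps(hypothesised, inlane_sigma, abs_pos, abs_sigma):
            cands.append(i)
    return cands

# Hypothetical road: two lanes with centres 3.5 m apart. With a tight absolute
# position only one lane remains a candidate; with a large absolute-position
# error, both lanes become candidates and the reliability drops.
cands = lane_candidates([0.0, 3.5], inlane_offset=0.4, inlane_sigma=0.1,
                        abs_pos=0.6, abs_sigma=0.5)
```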
  • When a plurality of lane candidates are determined, the host vehicle position estimation unit 204 continues estimating the position of the host vehicle for each of the candidates until the selection condition is satisfied, and then selects among them. Once the selection condition is satisfied, the determined lane candidates can be selected or rejected, reducing the load of estimating the host vehicle position for a plurality of candidates.
  • The self-position estimation device 20 further includes a comparison target detection unit 205 that detects comparison target information different from the information used for detecting the in-lane position information and estimating the absolute position information.
  • The host vehicle position estimation unit 204 determines whether the selection condition is satisfied based on the lane candidates and the comparison target information, and selects among the lane candidates accordingly.
  • Since the lane candidates are selected based on the comparison target information, they can be evaluated from a different point of view. By increasing the reliability of the comparison target information, unnecessary lane candidates can be rejected.
  • The comparison target detection unit 205 detects, as the comparison target information, running line type information: line type information detected from an image captured by an imaging device provided in the host vehicle, identifying the line type of at least one boundary line of the lane in which the host vehicle is traveling. When the running line type information does not match the map line type information that specifies the line types of the lane boundary lines included in the lane information, the host vehicle position estimation unit 204 determines that the selection condition is satisfied and rejects the non-matching lane candidate.
  • For example, if the map line type information specifies broken lines on both the right and left sides as viewed from the host vehicle, while the running line type information specifies a solid line on the right side and a broken line on the left side, the map line type information and the running line type information are inconsistent; it is therefore determined that the selection condition is satisfied, and the corresponding lane candidate is rejected. Since the running line type information detected from the captured image is considered highly reliable, unnecessary lane candidates can be rejected.
  • The absolute position estimation unit 203 estimates the absolute position information based on the detection result of the gyro sensor, while the comparison target detection unit 205 detects the comparison target information based on navigation signals received from a plurality of navigation satellites.
  • The host vehicle position estimation unit 204 determines that the selection condition is satisfied when a lane candidate does not overlap with the position of the host vehicle specified by the comparison target information, and rejects the non-overlapping candidate.
  • Since the comparison target information is generated from navigation signals received from a plurality of navigation satellites, its reliability can exceed that of the gyro-based absolute position information when signal reception is good, so unnecessary lane candidates can be rejected.
  • The host vehicle position estimation unit 204 determines that the selection condition is satisfied when there are a plurality of lane candidates and at least one of them does not overlap with the lane information, and rejects the non-overlapping candidates.
  • By keeping the overlapping lane candidates and rejecting those that do not overlap, unnecessary lane candidates can be eliminated.
  • The host vehicle position estimation unit 204 notifies that the reliability of the estimation result is low when there are a plurality of lane candidates or when the absolute position error is greater than or equal to a predetermined value. This notification allows the user to recognize the decrease in reliability.

Abstract

A host vehicle position estimating unit (204) of an own-position estimating device (20) assesses the presence or absence of a lane correspondence candidate indicating whether a within-lane position corresponds to any lane specified by lane information, on the basis of mutual relationships between the within-lane position, an absolute position, and errors therein, and estimates the position of a host vehicle corresponding to map information on the basis of the assessment result.

Description

自己位置推定装置Self-position estimation device 関連出願の相互参照Cross-reference of related applications
 本出願は、2017年12月26日に出願された日本国特許出願2017-248745号と、2017年12月26日に出願された日本国特許出願2017-248744号と、2017年3月16日に出願された日本国特許出願2017-051066号と、に基づくものであって、その優先権の利益を主張するものであり、その特許出願の全ての内容が、参照により本明細書に組み込まれる。 This application consists of Japanese Patent Application No. 2017-248745 filed on December 26, 2017, Japanese Patent Application No. 2017-248744 filed on December 26, 2017, and March 16, 2017. Patent application No. 2017-051066 filed in Japan and claims the benefit of its priority, the entire contents of which are incorporated herein by reference .
 本開示は、自己位置推定装置に関する。 This disclosure relates to a self-position estimation apparatus.
 車両の自己位置推定をするものとして下記特許文献1に記載の自己位置標定装置が知られている。下記特許文献1に記載の自己位置標定装置は、GPS(Global Positioning System)や慣性装置や車速パルスを用いた位置算出に対し、道路上の白線や道路標識といった既存の道路インフラを利用して、標定精度を上げるものである。 2. Description of the Related Art A self-localization device described in Patent Document 1 below is known as an apparatus for estimating a self-position of a vehicle. The self-localization device described in Patent Document 1 below uses existing road infrastructure such as white lines and road signs on the road for position calculation using GPS (Global Positioning System), inertial devices, and vehicle speed pulses. This will increase the orientation accuracy.
 Specifically, the azimuth angle of a white line captured in a camera image is calculated, and a Kalman filter performs error estimation based on the difference between the white-line azimuth angle stored in an azimuth database and the white-line azimuth angle calculated from the image.
JP 2008-249639 A
 Because Patent Document 1 relies on images captured by a camera, errors cannot be estimated correctly when clear images cannot be obtained, for example in bad weather. In particular, when position estimation at the lane level is required, the technique of Patent Document 1 cannot cope. Advanced driving assistance and automated driving require identifying the lane and the travel position within the lane, so more accurate self-position estimation is needed.
 An object of the present disclosure is to provide a self-position estimation device capable of highly accurate position estimation at the lane level.
 The present disclosure is a self-position estimation device comprising: a map information acquisition unit (201) that acquires map information including lane information identifying lanes in which a vehicle can travel; an in-lane position detection unit (202) that detects in-lane position information identifying the in-lane position, i.e., the position of the host vehicle within the lane in which it is traveling; an absolute position estimation unit (203) that estimates absolute position information identifying the absolute position of the host vehicle and its error; and a host vehicle position estimation unit (204) that estimates the position of the host vehicle relative to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information. Based on the mutual relationship between the in-lane position, the absolute position, and its error, the host vehicle position estimation unit determines the presence or absence of lane candidates, i.e., which of the lanes identified by the lane information the in-lane position may correspond to, and estimates the position of the host vehicle relative to the map information based on the determination result.
 Since the presence or absence of lane candidates, i.e., which of the lanes the in-lane position may correspond to, is determined, the position of the host vehicle can be estimated taking into account both the in-lane position and the absolute position for each lane candidate.
 The reference numerals in parentheses in the Summary and the Claims indicate correspondence with the embodiments described later, and do not limit the Summary or the Claims to those embodiments.
FIG. 1 is a block diagram showing the functional configuration of a self-position estimation device according to an embodiment.
FIG. 2 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 3 is a block diagram showing the functional configuration of a self-position estimation device according to an embodiment.
FIG. 4 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 5 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 6 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 7 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 8 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 9 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 10 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 11 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 12 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 13 is a diagram for explaining self-position estimation according to the present embodiment.
FIG. 14 is a diagram for explaining self-position estimation according to the present embodiment.
 Hereinafter, the present embodiment will be described with reference to the accompanying drawings. To facilitate understanding, identical components are given the same reference numerals in the drawings wherever possible, and redundant description is omitted.
 A self-position estimation device 10 according to the present embodiment will be described with reference to FIG. 1. In terms of hardware, the self-position estimation device 10 is configured as a computer comprising a processing unit such as a CPU, storage such as RAM and ROM, and an interface unit for exchanging data with various sensors. Its functional components are described next.
 The self-position estimation device 10 comprises a self-position measurement unit 101, a vehicle momentum measurement unit 102, a white line recognition unit 103, a surrounding environment measurement unit 104, a route information acquisition unit 105, a dead reckoning unit 106, a position estimation unit 108, a map information acquisition unit 109, a travel lane estimation unit 110, and a map information storage unit 120.
 The self-position measurement unit 101 measures the position of the host vehicle using GNSS (Global Navigation Satellite System). It calculates the host vehicle measurement position, i.e., the navigation-derived position of the host vehicle, based on navigation signals received from a plurality of navigation satellites, and outputs the calculated host vehicle measurement position to the dead reckoning unit 106 and the position estimation unit 108.
 The vehicle momentum measurement unit 102 receives signals from sensors such as an acceleration sensor, a vehicle speed sensor, and a gyro sensor, and measures the motion of the host vehicle. It outputs motion information such as vehicle speed, azimuth angle, yaw rate, and acceleration to the dead reckoning unit 106 and the position estimation unit 108.
 The white line recognition unit 103 recognizes the white lines that delimit lanes using image data captured by a camera, and outputs information on the presence or absence of white lines and on their types to the position estimation unit 108.
 The surrounding environment measurement unit 104 measures information on the weather and the satellite constellation, and outputs this information to the position estimation unit 108.
 The route information acquisition unit 105 acquires the vehicle's destination and the route to that destination from the navigation system, and outputs information indicating the destination and route to the travel lane estimation unit 110.
 The dead reckoning unit 106 calculates the position of the host vehicle in places where positioning by GNSS alone is difficult, referred to as the host vehicle gyro position, based on the host vehicle measurement position output from the self-position measurement unit 101 and the motion information output from the vehicle momentum measurement unit 102. The dead reckoning unit 106 outputs the calculated host vehicle gyro position to the position estimation unit 108.
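As a rough illustration of how a dead-reckoning step might combine speed and yaw-rate measurements, consider the following sketch (the actual update used by the dead reckoning unit 106 is not specified in this document; simple Euler integration is assumed here):

```python
import math

def dead_reckoning_step(x, y, heading, speed, yaw_rate, dt):
    """Propagate a 2-D pose one time step from speed and yaw rate.

    x, y: position [m]; heading: azimuth [rad]; speed: [m/s];
    yaw_rate: [rad/s]; dt: time step [s].  Euler integration.
    """
    heading_new = heading + yaw_rate * dt
    x_new = x + speed * dt * math.cos(heading)
    y_new = y + speed * dt * math.sin(heading)
    return x_new, y_new, heading_new

# Example: driving straight east at 10 m/s for one 0.1 s step
pose = dead_reckoning_step(0.0, 0.0, 0.0, 10.0, 0.0, 0.1)
print(pose)  # (1.0, 0.0, 0.0)
```

Repeating this step between GNSS fixes is what lets the position be carried forward through tunnels or urban canyons, at the cost of an error that grows with each step.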
 The map information acquisition unit 109 acquires map information including lane information indicating lanes in which the vehicle can travel. It reads the map information stored in the map information storage unit 120 and outputs the read map information to the position estimation unit 108 and the travel lane estimation unit 110.
 The position estimation unit 108 estimates the corrected host vehicle position based on the map information and the host vehicle measurement position and/or the host vehicle gyro position. It superimposes the likelihood of the map information on the likelihood of the host vehicle measurement position and/or the host vehicle gyro position to estimate the corrected position. The likelihood of the map information and the likelihood of the absolute vehicle position may each be expressed as a probability distribution or as a numerical value.
 An example of how the position estimation unit 108 estimates the corrected host vehicle position will be described with reference to FIG. 2. In FIG. 2, three lanes L1, L2, and L3 are defined. A solid white line SLa runs on the left side of lane L1 in the traveling direction, a dashed white line BLa between lanes L1 and L2, a dashed white line BLb between lanes L2 and L3, and a solid white line SLb on the right side of lane L3. The lane center lines L1c, L2c, and L3c indicate the centers of lanes L1, L2, and L3, respectively.
 In the example shown in FIG. 2, the map probability distribution PDm over the lane center lines L1c, L2c, and L3c represents the likelihood of the map information. Initially, the host vehicle is assumed to be at the host vehicle measurement position Pa and, for convenience of explanation, to be traveling along lane L1 from there. The host vehicle gyro position may be used in place of the host vehicle measurement position.
 At the first estimation timing, the position estimation unit 108 superimposes the vehicle position probability distribution PDca at the host vehicle measurement position Pa on the map probability distribution PDm to estimate the corrected host vehicle position Pb. Compared to the uncorrected position Pa, the corrected position Pb is shifted toward the lane center line L1c by a distance d1.
 At the next estimation timing, the position estimation unit 108 superimposes the vehicle position probability distribution PDcb at the corrected position Pb on the map probability distribution PDm to estimate the corrected host vehicle position Pc. Compared to the uncorrected position Pb, the corrected position Pc is shifted toward the lane center line L1c by a distance d2.
 The map probability distribution is not limited to a distribution over the lane center lines; any probability distribution expressing the likelihood of the map information may be used. A map probability distribution offset according to the driver's habits or the road shape may also be used. Road shape includes information such as road width and the presence or absence of adjacent lanes.
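The superposition of the two distributions described above can be illustrated, for the one-dimensional lateral-position case and under the assumption that both likelihoods are Gaussian (an assumption made here for illustration; the document does not restrict the distributions), by the standard product-of-Gaussians formula:

```python
def fuse_gaussians(mu_map, var_map, mu_meas, var_meas):
    """Fuse a map prior N(mu_map, var_map) with a measured position
    N(mu_meas, var_meas): returns the mean and variance of their
    product, i.e. the corrected lateral position and its uncertainty."""
    var = 1.0 / (1.0 / var_map + 1.0 / var_meas)
    mu = var * (mu_map / var_map + mu_meas / var_meas)
    return mu, var

# Lane center at 0.0 m (variance 1.0); measured lateral position 0.8 m
# (variance 1.0): the corrected position is pulled toward the center.
mu, var = fuse_gaussians(0.0, 1.0, 0.8, 1.0)
print(mu, var)  # 0.4 0.5
```

With equal variances the correction lands halfway, mirroring how Pb and Pc in FIG. 2 are shifted toward the lane center line by d1 and d2.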
 Next, a self-position estimation device 20, which is a modification of the self-position estimation device 10 according to the present embodiment, will be described with reference to FIG. 3. In terms of hardware, the self-position estimation device 20 is configured as a computer comprising a processing unit such as a CPU, storage such as RAM and ROM, and an interface unit for exchanging data with various sensors. Its functional components are described next.
 The self-position estimation device 20 comprises a map information acquisition unit 201, an in-lane position detection unit 202, an absolute position estimation unit 203, a host vehicle position estimation unit 204, a comparison target detection unit 205, and a map information storage unit 211.
 The map information acquisition unit 201 acquires map information including lane information indicating lanes in which the vehicle can travel. It reads the map information stored in the map information storage unit 211 and outputs the read map information to the host vehicle position estimation unit 204.
 The in-lane position detection unit 202 detects in-lane position information identifying the in-lane position, i.e., the position within the lane in which the host vehicle is traveling. It detects this information based on the surrounding environment and lane conditions captured by a camera, and outputs the in-lane position information to the host vehicle position estimation unit 204.
 As shown in FIG. 5, the in-lane position detection unit 202 identifies the in-lane position from the lateral deviation and the heading angle of the host vehicle, and generates the in-lane position information.
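One way such a lateral deviation could be derived from camera output is sketched below (a hypothetical computation; the document does not detail how unit 202 obtains these quantities): given the detected distances to the left and right lane lines, the offset from the lane center follows directly.

```python
def in_lane_position(dist_left, dist_right):
    """Given camera-detected distances [m] from the vehicle to the left
    and right lane lines, return (lateral_deviation, lane_width).
    Positive deviation means the vehicle is right of the lane center."""
    lane_width = dist_left + dist_right
    lateral_deviation = (dist_left - dist_right) / 2.0
    return lateral_deviation, lane_width

# Vehicle 2.0 m from the left line and 1.5 m from the right line:
dev, width = in_lane_position(2.0, 1.5)
print(dev, width)  # 0.25 3.5
```

The heading angle would come from the slope of the detected lane lines in the image, completing the in-lane position information of FIG. 5.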
 The absolute position estimation unit 203 shown in FIG. 3 estimates absolute position information identifying the absolute position of the host vehicle and its error, and outputs the estimated absolute position information to the host vehicle position estimation unit 204. The absolute position estimation unit 203 can estimate the absolute position information by various methods.
 The absolute position estimation unit 203 can estimate the absolute position information, i.e., the absolute position of the host vehicle and its error, by GNSS. It calculates the host vehicle measurement position, the navigation-derived position of the host vehicle, based on navigation signals received from a plurality of navigation satellites, and estimates the absolute position information from that measurement position.
 The absolute position estimation unit 203 can also receive signals from sensors such as an acceleration sensor, a vehicle speed sensor, and a gyro sensor, and measure the motion of the host vehicle. Based on the host vehicle measurement position and the information indicating the motion of the host vehicle, it calculates the host vehicle gyro position, i.e., the position of the host vehicle in places where positioning by GNSS alone is difficult, and estimates the absolute position information based on this gyro position.
 FIG. 4 shows an example of estimation by the absolute position estimation unit 203. As shown in FIG. 4, the absolute position estimation unit 203 estimates two pieces of absolute position information, "candidate 1" and "candidate 2". For candidate 1, the current position X_i(t) is estimated from the previous position X_i(t-1) based on the motion of the host vehicle; X_i(t) is position information that includes an estimation error. Likewise, for candidate 2, the current position X_j(t) is estimated from the previous position X_j(t-1) taking the motion of the host vehicle into account; X_j(t) also includes an estimation error.
 The host vehicle position estimation unit 204 shown in FIG. 3 estimates the position of the host vehicle relative to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information. Based on the mutual relationship between the in-lane position, the absolute position, and its error, it determines the presence or absence of lane candidates, i.e., which of the lanes identified by the lane information the in-lane position may correspond to, and estimates the position of the host vehicle relative to the map information based on the determination result. If no lane candidate exists, it may be judged that a sensor is malfunctioning.
 An example of how the host vehicle position estimation unit 204 estimates the position of the host vehicle will be described with reference to FIGS. 6 and 7. "Gyro estimation candidate 1" and "gyro estimation candidate 2" in FIG. 6 correspond to "candidate 1" and "candidate 2" in FIG. 4, with the estimation errors of candidates 1 and 2 superimposed on the map information. The "camera-observed lateral deviation" in FIG. 6 is the "lateral deviation" of FIG. 5 superimposed on the map information for each lane.
 The calculation of the estimated position and its error from the gyro sensor, used as the absolute position information, will be described with reference to FIG. 8. In FIG. 8, P_{t-1} is the covariance matrix at the previous time, y_{t-1} is the lateral position at the previous time, and φ_{t-1} is the attitude angle at the previous time. The lateral position y_t and the attitude angle φ_t are calculated by the following equation, where ε is the system noise.
Figure JPOXMLDOC01-appb-M000001
 The covariance matrix P_t of the estimate from the gyro sensor is calculated from the covariance matrix P_{t-1} at the previous time by the following equation. M is the covariance matrix of the system noise ε, set from the error specifications of the gyro sensor, and T denotes the matrix transpose.
Figure JPOXMLDOC01-appb-M000002
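Assuming the standard linear propagation form P_t = F P_{t-1} F^T + M (the transition matrix F and all numeric values below are illustrative; the document shows the equation only as an image), the covariance update can be sketched as:

```python
import numpy as np

# State: [lateral position y, attitude angle phi]
F = np.array([[1.0, 0.5],   # hypothetical transition: y drifts with phi
              [0.0, 1.0]])
M = np.diag([0.01, 0.001])  # system-noise covariance from gyro error spec

P_prev = np.diag([0.04, 0.004])  # covariance at the previous time
P_t = F @ P_prev @ F.T + M       # propagated covariance
print(P_t)
```

Each propagation step adds M, so the uncertainty ellipse in FIG. 6 grows until a camera observation shrinks it again.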
 The information shown in FIG. 6 is fused by a Kalman filter. The Kalman gain K is calculated by the following equation. Q is the observation error matrix, set from the error characteristics of the observed in-lane position, and H is the 2×2 identity matrix.
Figure JPOXMLDOC01-appb-M000003
 Using the Kalman gain K, the estimation result and the covariance matrix are calculated by the following equations. Z_t is the vector formed by vertically stacking the lateral position and attitude angle estimated from the gyro sensor, and ΔZ is the difference between the camera observation of the in-lane position and attitude angle and the gyro estimate Z_t.
Figure JPOXMLDOC01-appb-M000004
Figure JPOXMLDOC01-appb-M000005
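With H equal to the identity, the textbook Kalman-filter forms K = P_t H^T (H P_t H^T + Q)^{-1}, Z = Z_t + K ΔZ, and P = (I - K H) P_t match the description above; the equations themselves appear only as images, so the following is a sketch with illustrative numbers:

```python
import numpy as np

H = np.eye(2)                 # observation matrix: 2x2 identity
P_t = np.diag([0.05, 0.005])  # propagated gyro covariance
Q = np.diag([0.01, 0.002])    # camera observation error covariance

# Kalman gain
K = P_t @ H.T @ np.linalg.inv(H @ P_t @ H.T + Q)

Z_gyro = np.array([0.6, 0.02])  # gyro estimate: [lateral pos, attitude]
Z_cam = np.array([0.4, 0.01])   # camera observation of the same state
dZ = Z_cam - Z_gyro             # innovation

Z_fused = Z_gyro + K @ dZ            # fused in-lane estimate
P_fused = (np.eye(2) - K @ H) @ P_t  # updated (shrunk) covariance
print(Z_fused)
```

Because the gyro covariance here is larger than the camera's, the fused estimate moves most of the way toward the camera observation, and P_fused is smaller than P_t in both components.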
 The gyro-based estimate calculated in this way is then associated with the in-lane position observed by the camera. FIG. 9 shows examples of this association. In the example of FIG. 9(A), the error of the gyro estimate is small, so the host vehicle is estimated to be in lane 2, and the association with the camera's in-lane position observation can be made in lane 2. In this case, the lane candidate is lane 2.
 In contrast, in the example of FIG. 9(B), the error of the gyro estimate is large, so neither the possibility that the host vehicle is in lane 1 nor the possibility that it is in lane 2 can be ruled out. There is therefore both a candidate in which the host vehicle is in lane 1 and a candidate in which it is in lane 2. In this case, the lane candidates are lanes 1 and 2.
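The candidate determination can be pictured as an interval-overlap test: any lane whose lateral extent overlaps the error interval of the gyro estimate remains a candidate. This is a sketch under the assumption of a simple symmetric error interval; the document does not specify the exact criterion.

```python
def lane_candidates(gyro_pos, gyro_err, lanes):
    """gyro_pos: estimated lateral position [m]; gyro_err: half-width of
    its error interval [m]; lanes: dict name -> (left_edge, right_edge).
    Returns the names of lanes whose extent overlaps the error interval."""
    lo, hi = gyro_pos - gyro_err, gyro_pos + gyro_err
    return [name for name, (left, right) in lanes.items()
            if left < hi and right > lo]

lanes = {"lane1": (0.0, 3.5), "lane2": (3.5, 7.0)}
print(lane_candidates(5.0, 0.5, lanes))  # ['lane2'] -- small error, FIG. 9(A)
print(lane_candidates(3.6, 1.0, lanes))  # ['lane1', 'lane2'] -- large error, FIG. 9(B)
```

An empty result would correspond to the no-candidate case above, where a sensor malfunction may be suspected.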
 In the present embodiment, the comparison target detection unit 205 is provided so that, when a plurality of lane candidates exists, some of them can be rejected. The comparison target detection unit 205 detects comparison target information, i.e., information different from that used to detect the in-lane position information and to estimate the absolute position information. Line type information identifying the lane boundaries, road shape information from the map information, GPS position information, and the like are used as comparison target information. The host vehicle position estimation unit 204 judges whether a selection condition is satisfied based on the lane candidates and the comparison target information, and selects among the lane candidates.
 In the example shown in FIG. 10, both lanes are lane candidates, and line type information is used as the comparison target information. The comparison target detection unit 205 detects a solid line on the left and a dashed line on the right. Based on the recognized line types, the host vehicle position estimation unit 204 estimates that the host vehicle is in the left lane in the traveling direction, rejects the right lane as a candidate, and thereby selects among the lane candidates. In this case, the remaining lane candidate is the left lane in the traveling direction.
 In the example shown in FIG. 11, both lanes are lane candidates, and GPS information is used as the comparison target information. The comparison target detection unit 205 acquires position information including the error indicated by the dashed circle in the figure as the GPS information. Based on the GPS information, the host vehicle position estimation unit 204 estimates that the host vehicle is in the left lane in the traveling direction, rejects the right lane as a candidate, and thereby selects among the lane candidates. In this case, the remaining lane candidate is the left lane in the traveling direction.
 In the example shown in FIG. 12, candidates exist in both lanes, and the map information is used as the comparison target information. The comparison target detection unit 205 detects the road shape from the map information as the comparison target information. Because the candidate on the left side in the traveling direction ceases to overlap the map information as time passes, the host vehicle position estimation unit 204 rejects the left lane as a candidate and selects among the lane candidates. In this case, the remaining lane candidate is the right lane in the traveling direction, and after the lane change it becomes the left lane.
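For the line-type case of FIG. 10, the pruning step could be sketched as follows (the lane names and the line-type map schema are hypothetical; the document does not specify how line types are stored in the map):

```python
def prune_by_line_type(candidates, observed, map_line_types):
    """candidates: list of lane names; observed: (left, right) line types
    seen by the camera; map_line_types: lane name -> (left, right) line
    types from the map.  Keep only candidates consistent with the
    observation."""
    return [lane for lane in candidates
            if map_line_types[lane] == observed]

map_line_types = {"left_lane": ("solid", "dashed"),
                  "right_lane": ("dashed", "solid")}
kept = prune_by_line_type(["left_lane", "right_lane"],
                          ("solid", "dashed"), map_line_types)
print(kept)  # ['left_lane']
```

The GPS-based and road-shape-based selections of FIGS. 11 and 12 follow the same pattern, with a different consistency test in place of the line-type comparison.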
 When there are a plurality of lane candidates, or when the error of the absolute position is greater than or equal to a predetermined value, the host vehicle position estimation unit 204 can notify that the reliability of the estimation result is low. As shown in FIG. 13, when there is one candidate and the error variance is small, the reliability is judged to be high. Even with a single candidate, if the error variance is large, the reliability is judged to be low. With multiple candidates, the lane cannot be identified even if the error variance is small, so the reliability is judged to be low; with multiple candidates and a large error variance, the reliability is likewise judged to be low.
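The four cases of FIG. 13 reduce to a single rule: reliability is high only when there is exactly one candidate and the variance is small. A minimal sketch (the variance threshold is illustrative, not from the document):

```python
def reliability(num_candidates, error_variance, var_threshold=0.1):
    """Return 'high' only for a single lane candidate with small
    variance, mirroring the four cases of FIG. 13; otherwise 'low'."""
    if num_candidates == 1 and error_variance < var_threshold:
        return "high"
    return "low"

assert reliability(1, 0.05) == "high"  # one candidate, small variance
assert reliability(1, 0.50) == "low"   # one candidate, large variance
assert reliability(2, 0.05) == "low"   # two candidates, small variance
assert reliability(2, 0.50) == "low"   # two candidates, large variance
```

The "low" result is what drives the notification to the user described above.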
 The operation of the self-position estimation device 20 will be described with reference to FIG. 14. FIG. 14(A) shows the in-lane position recognition result, FIG. 14(B) the recognition result of the host vehicle position, FIG. 14(C) the number of lane candidates, and FIG. 14(D) the reliability.
 At time t1, the in-lane position is being recognized, and the host vehicle position is recognized as being in the right lane. At time t2, the in-lane position is not recognized, so the error in the host vehicle position recognition grows and the reliability drops. At time t3, in-lane position recognition resumes, so the position recognition error shrinks and the reliability rises again. At time t4, the in-lane position is again not recognized, and the reliability drops again.
 At time t5, the gyro estimation error has grown large, and it cannot be determined which lane the host vehicle is in. At time t6, both lanes are treated as lane candidates, and from time t6 to time t8 estimation continues with the two lane candidates.
 At time t9, the right-lane candidate deviates from the map information, so the right-lane candidate is rejected. From time t10 onward, the host vehicle has changed from the left lane to the right lane, so estimation continues with the right lane as the lane candidate.
 The self-position estimation device 20 according to the present embodiment includes: a map information acquisition unit 201 that acquires map information including lane information specifying lanes in which a vehicle can travel; an in-lane position detection unit 202 that detects in-lane position information specifying the in-lane position, that is, the position of the host vehicle within the lane in which it is traveling; an absolute position estimation unit 203 that estimates absolute position information specifying the absolute position of the host vehicle and its error; and a host vehicle position estimation unit 204 that estimates the position of the host vehicle corresponding to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information. The host vehicle position estimation unit 204 determines, based on the correlation between the in-lane position and the absolute position and its error, whether there are lane candidates, that is, lanes specified by the lane information to which the in-lane position may correspond, and estimates the position of the host vehicle corresponding to the map information based on this determination result.
 Since the device determines whether there are lane candidates to which the in-lane position may correspond, the position of the host vehicle can be estimated by taking into account both the in-lane position and the absolute position for each lane candidate.
 In the present embodiment, the host vehicle position estimation unit 204 determines the lane candidates based on the degree of overlap between the in-lane position, including its error distribution, and the absolute position, including its error distribution. By superimposing the two positions with their error distributions taken into account, the position of the host vehicle can be estimated in a way that reflects the errors.
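One way to score the degree of overlap of two error distributions, sketched here for the one-dimensional lateral case with Gaussian errors, is the Bhattacharyya coefficient (1.0 for identical distributions, near 0 for well-separated ones). The Gaussian model, the function names, and the acceptance threshold are illustrative assumptions, not details of the embodiment:

```python
import math

def gaussian_overlap(mu1: float, var1: float, mu2: float, var2: float) -> float:
    """Bhattacharyya coefficient between two 1-D Gaussians N(mu1, var1)
    and N(mu2, var2): exp(-D_B), where D_B is the Bhattacharyya distance."""
    var_avg = 0.5 * (var1 + var2)
    dist = (0.125 * (mu1 - mu2) ** 2 / var_avg
            + 0.5 * math.log(var_avg / math.sqrt(var1 * var2)))
    return math.exp(-dist)

def is_lane_candidate(lane_center: float, lane_var: float,
                      abs_pos: float, abs_var: float,
                      threshold: float = 0.1) -> bool:
    # A lane is kept as a candidate when the lateral error distribution of the
    # in-lane position and that of the absolute position overlap sufficiently.
    return gaussian_overlap(lane_center, lane_var, abs_pos, abs_var) >= threshold
```

A lane whose center lies several standard deviations away from the absolute position estimate scores far below the threshold and is not treated as a candidate.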
 In the present embodiment, when a plurality of lane candidates are determined, the host vehicle position estimation unit 204 continues estimating the host vehicle position for each of the plurality of lane candidates until a selection condition is satisfied, and performs selection among the plurality of lane candidates once the selection condition is satisfied. When the selection condition is satisfied, candidates among the plurality of determined lane candidates can be selected and rejected, so the load of estimating the host vehicle position for the plurality of lane candidates can be reduced.
 The self-position estimation device 20 according to the present embodiment further includes a comparison target detection unit 205 that detects comparison target information different from the information used for detecting the in-lane position information and for estimating the absolute position information. The host vehicle position estimation unit 204 determines, based on the lane candidates and the comparison target information, whether the selection condition is satisfied, and performs the selection of the lane candidates.
 Since the lane candidates are selected based on the comparison target information, the selection can be performed from a viewpoint different from that used to determine the lane candidates. By using comparison target information of high reliability, unnecessary lane candidates can be rejected.
 In the present embodiment, the comparison target detection unit 205 detects, as the comparison target information, traveling line type information: line type information detected from an image captured by an imaging device provided on the host vehicle, which specifies the line type of at least one lane boundary line delimiting the lane in which the host vehicle is traveling. The host vehicle position estimation unit 204 determines that the selection condition is satisfied when the map line type information, which specifies the line type of the lane boundary line included in the lane information, does not match the traveling line type information, and rejects the lane candidate for which the mismatch occurs.
 For example, if the map line type information specifies broken lines on both the right and left sides as viewed from the host vehicle, whereas the traveling line type information specifies a continuous line on the right side and a broken line on the left side, the map line type information and the traveling line type information do not match, so the selection condition is determined to be satisfied and the corresponding lane candidate is rejected. Since the traveling line type information detected from the image captured by the imaging device is considered highly reliable, unnecessary lane candidates can be rejected.
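The line-type comparison in the example above amounts to an equality check between the map's (left, right) boundary line types for each candidate lane and the pair observed by the camera. The lane identifiers, the string encoding of line types, and the three-lane layout below are hypothetical illustration data:

```python
# Hypothetical map line type information: (left boundary, right boundary) per lane.
MAP_LINE_TYPES = {
    "lane_1": ("solid", "dashed"),   # leftmost lane
    "lane_2": ("dashed", "dashed"),  # middle lane
    "lane_3": ("dashed", "solid"),   # rightmost lane
}

def select_by_line_type(candidates, observed):
    """Keep only the lane candidates whose map line types match the
    (left, right) line types observed by the camera; mismatches are rejected."""
    return [lane for lane in candidates if MAP_LINE_TYPES[lane] == observed]
```

With candidates `lane_2` and `lane_3` and a camera observation of a broken line on the left and a continuous line on the right, only `lane_3` survives; `lane_2` is rejected for the mismatch on its right boundary.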
 In the present embodiment, the absolute position estimation unit 203 estimates the absolute position information based on the detection result of a gyro sensor, and the comparison target detection unit 205 detects the comparison target information based on navigation signals received from a plurality of navigation satellites. The host vehicle position estimation unit 204 determines that the selection condition is satisfied when a lane candidate no longer overlaps the host vehicle position specified by the comparison target information, and rejects that lane candidate.
 Since the comparison target information is generated from navigation signals received from a plurality of navigation satellites, it can be more reliable than the absolute position information based on the gyro sensor detection result as long as the navigation signal reception conditions are good, and unnecessary lane candidates can thus be rejected.
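One plausible reading of "no longer overlaps" is that a candidate's lateral position falls outside an n-sigma uncertainty band around the satellite-based fix. The function name, the dictionary representation of candidates, and the 3-sigma choice are illustrative assumptions rather than details of the embodiment:

```python
def select_by_gnss(candidates, gnss_lateral, gnss_sigma, n_sigma=3.0):
    """Reject lane candidates whose lateral center lies outside the n-sigma
    band of the satellite-based position fix. `candidates` maps a lane name
    to its lateral center; units are hypothetical (meters)."""
    return {lane: center for lane, center in candidates.items()
            if abs(center - gnss_lateral) <= n_sigma * gnss_sigma}
```

For two adjacent 3.5 m lanes with centers at -1.75 m and +1.75 m and a fix of +1.6 m with sigma 0.5 m, the left-lane candidate falls outside the 1.5 m band and is rejected, while the right-lane candidate is retained.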
 In the present embodiment, the host vehicle position estimation unit 204 determines that the selection condition is satisfied when a plurality of lane candidates exist and at least one of them no longer overlaps the lane information, and rejects the lane candidate that no longer overlaps.
 When a plurality of lane candidates exist and some of them no longer overlap the lane information included in the map information, the overlapping lane candidates are retained and the non-overlapping lane candidates are rejected, so unnecessary lane candidates can be eliminated.
 In the present embodiment, the host vehicle position estimation unit 204 issues a notification that the reliability of the estimation result is low when there are a plurality of lane candidates or when the absolute position error is equal to or greater than a predetermined value. By notifying that the reliability of the estimation result is low, the user can be made aware of the decrease in reliability.
 The present embodiment has been described above with reference to specific examples. However, the present disclosure is not limited to these specific examples. Design modifications appropriately made to these specific examples by those skilled in the art are also included within the scope of the present disclosure as long as they possess the features of the present disclosure. The elements included in each of the specific examples described above, as well as their arrangement, conditions, shape, and the like, are not limited to those illustrated and can be changed as appropriate. The combinations of elements included in the specific examples described above can be changed as appropriate as long as no technical contradiction arises.

Claims (8)

  1.  A self-position estimation device comprising:
     a map information acquisition unit (201) that acquires map information including lane information specifying lanes in which a vehicle can travel;
     an in-lane position detection unit (202) that detects in-lane position information specifying an in-lane position, which is the position of the host vehicle within the lane in which the host vehicle is traveling;
     an absolute position estimation unit (203) that estimates absolute position information specifying the absolute position of the host vehicle and its error; and
     a host vehicle position estimation unit (204) that estimates the position of the host vehicle corresponding to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information,
     wherein the host vehicle position estimation unit determines, based on a correlation between the in-lane position and the absolute position and its error, whether there is a lane candidate, that is, a lane specified by the lane information to which the in-lane position corresponds, and estimates the position of the host vehicle corresponding to the map information based on the determination result.
  2.  The self-position estimation device according to claim 1, wherein the host vehicle position estimation unit determines the lane candidate based on a degree of overlap between the in-lane position including its error distribution and the absolute position including its error distribution.
  3.  The self-position estimation device according to claim 1 or 2, wherein, when a plurality of lane candidates are determined, the host vehicle position estimation unit continues estimating the position of the host vehicle for the plurality of lane candidates until a selection condition is satisfied, and performs selection among the plurality of lane candidates once the selection condition is satisfied.
  4.  The self-position estimation device according to claim 3, further comprising a comparison target detection unit (205) that detects comparison target information different from information used for detecting the in-lane position information and for estimating the absolute position information,
     wherein the host vehicle position estimation unit determines whether the selection condition is satisfied based on the lane candidate and the comparison target information, and performs the selection of the lane candidate.
  5.  The self-position estimation device according to claim 4, wherein the comparison target detection unit detects, as the comparison target information, traveling line type information, which is line type information detected from an image captured by an imaging device provided on the host vehicle and specifies the line type of at least one lane boundary line delimiting the lane in which the host vehicle is traveling, and
     the host vehicle position estimation unit determines that the selection condition is satisfied when map line type information specifying the line type of the lane boundary line included in the lane information does not match the traveling line type information, and rejects the lane candidate for which the mismatch occurs.
  6.  The self-position estimation device according to claim 4, wherein the absolute position estimation unit estimates the absolute position information based on a detection result of a gyro sensor,
     the comparison target detection unit detects the comparison target information based on navigation signals received from a plurality of navigation satellites, and
     the host vehicle position estimation unit determines that the selection condition is satisfied when the lane candidate no longer overlaps the position of the host vehicle specified by the comparison target information, and rejects the lane candidate that no longer overlaps.
  7.  The self-position estimation device according to claim 3, wherein the host vehicle position estimation unit determines that the selection condition is satisfied when a plurality of lane candidates exist and at least one of the plurality of lane candidates no longer overlaps the lane information included in the map information, and rejects the lane candidate that no longer overlaps.
  8.  The self-position estimation device according to any one of claims 1 to 7, wherein the host vehicle position estimation unit issues a notification that the reliability of the estimation result is low when there are a plurality of lane candidates or when the error of the absolute position is equal to or greater than a predetermined value.
PCT/JP2018/010068 2017-03-16 2018-03-14 Own-position estimating device WO2018168961A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/568,606 US11639853B2 (en) 2017-03-16 2019-09-12 Self-localization estimation device

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2017051066 2017-03-16
JP2017-051066 2017-03-16
JP2017-248745 2017-12-26
JP2017248745A JP6870604B2 (en) 2017-03-16 2017-12-26 Self-position estimator
JP2017-248744 2017-12-26
JP2017248744A JP6693496B2 (en) 2017-03-16 2017-12-26 Self-location estimation device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/568,606 Continuation US11639853B2 (en) 2017-03-16 2019-09-12 Self-localization estimation device

Publications (1)

Publication Number Publication Date
WO2018168961A1 true

Family

ID=63522498

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2018/010068 WO2018168961A1 (en) 2017-03-16 2018-03-14 Own-position estimating device
PCT/JP2018/010057 WO2018168956A1 (en) 2017-03-16 2018-03-14 Own-position estimating device

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010057 WO2018168956A1 (en) 2017-03-16 2018-03-14 Own-position estimating device

Country Status (1)

Country Link
WO (2) WO2018168961A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3882885A4 (en) * 2019-03-12 2022-11-23 Hitachi Astemo, Ltd. Vehicle control device
US11796324B2 (en) 2019-03-12 2023-10-24 Hitachi Astemo, Ltd. Vehicle control device
JP2021012086A (en) * 2019-07-05 2021-02-04 Toyota Motor Corporation Lane estimation apparatus
JP7120170B2 (en) 2019-07-05 2022-08-17 Toyota Motor Corporation Lane estimation device
WO2022015715A1 (en) 2020-07-13 2022-01-20 The Trustees Of The University Of Pennsylvania Compositions useful for treatment of charcot-marie-tooth disease
WO2022221276A1 (en) 2021-04-12 2022-10-20 The Trustees Of The University Of Pennsylvania Compositions useful for treating spinal and bulbar muscular atrophy (sbma)
WO2023133574A1 (en) 2022-01-10 2023-07-13 The Trustees Of The University Of Pennsylvania Compositions and methods useful for treatment of c9orf72-mediated disorders

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08233584A (en) * 1995-02-27 1996-09-13 Nippon Telegr & Teleph Corp <Ntt> Device for detecting position of travelling object
JP2006112878A (en) * 2004-10-14 2006-04-27 Alpine Electronics Inc Navigation device
US20120271540A1 (en) * 2009-10-22 2012-10-25 Krzysztof Miksa System and method for vehicle navigation using lateral offsets
WO2017029734A1 * 2015-08-19 2017-02-23 Mitsubishi Electric Corporation Lane recognition device and lane recognition method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4832489B2 (en) * 2008-09-25 2011-12-07 クラリオン株式会社 Lane judgment device
JP4934167B2 (en) * 2009-06-18 2012-05-16 クラリオン株式会社 Position detection apparatus and position detection program
JP2012127845A (en) * 2010-12-16 2012-07-05 Yupiteru Corp On-vehicle electronic device and program
EP3106836B1 (en) * 2015-06-16 2018-06-06 Volvo Car Corporation A unit and method for adjusting a road boundary


Also Published As

Publication number Publication date
WO2018168956A1 (en) 2018-09-20

Similar Documents

Publication Publication Date Title
JP6870604B2 (en) Self-position estimator
WO2018168961A1 (en) Own-position estimating device
Atia et al. A low-cost lane-determination system using GNSS/IMU fusion and HMM-based multistage map matching
CN105937912B (en) The map data processing device of vehicle
EP2664894B1 (en) Navigation apparatus
US11287524B2 (en) System and method for fusing surrounding V2V signal and sensing signal of ego vehicle
US8775063B2 (en) System and method of lane path estimation using sensor fusion
US8452535B2 (en) Systems and methods for precise sub-lane vehicle positioning
EP2054699B1 (en) Rerouting in vehicle navigation systems
US11599121B2 (en) Method for localizing a more highly automated vehicle (HAF), in particular a highly automated vehicle, and a vehicle system
JP2019532292A (en) Autonomous vehicle with vehicle location
WO2007000912A1 (en) Vehicle and lane recognizing device
EP3644016A1 (en) Localization using dynamic landmarks
JP2016080460A (en) Moving body
US20220113139A1 (en) Object recognition device, object recognition method and program
US20210278217A1 (en) Measurement accuracy calculation device, self-position estimation device, control method, program and storage medium
KR102134841B1 (en) System and method for researching localization of autonomous vehicle
CN114670840A (en) Dead angle estimation device, vehicle travel system, and dead angle estimation method
JP6507841B2 (en) Preceding vehicle estimation device and program
JP2016218015A (en) On-vehicle sensor correction device, self-position estimation device, and program
KR20210073281A (en) Method and apparatus for estimating motion information
EP4001844A1 (en) Method and apparatus with localization
US11287281B2 (en) Analysis of localization errors in a mobile object
JP2018169319A (en) Vehicle travel lane estimation device
Tsogas et al. Using digital maps to enhance lane keeping support systems

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18768509

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18768509

Country of ref document: EP

Kind code of ref document: A1