WO2018168961A1 - Self-position estimation device - Google Patents

Self-position estimation device

Info

Publication number
WO2018168961A1
WO2018168961A1 · PCT/JP2018/010068 · JP2018010068W
Authority
WO
WIPO (PCT)
Prior art keywords
lane
information
vehicle
position estimation
self
Prior art date
Application number
PCT/JP2018/010068
Other languages
English (en)
Japanese (ja)
Inventor
雄一 南口
健司 三宅
竜巳 杉山
和美 伊佐治
稔 岡田
謙太 高橋
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2017248744A external-priority patent/JP6693496B2/ja
Application filed by 株式会社デンソー filed Critical 株式会社デンソー
Publication of WO2018168961A1 publication Critical patent/WO2018168961A1/fr
Priority to US16/568,606 priority Critical patent/US11639853B2/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10: Map spot or coordinate position indicators; Map reading aids

Definitions

  • This disclosure relates to a self-position estimation apparatus.
  • A self-position estimation device described in Patent Document 1 below is known as an apparatus for estimating the self-position of a vehicle.
  • The device described in Patent Document 1 improves the accuracy of position calculation using GPS (Global Positioning System), inertial devices, and vehicle speed pulses by making use of existing road infrastructure such as white lines and road signs.
  • Specifically, the azimuth angle of a white line appearing in an image captured by a camera is calculated, and error estimation is performed by a Kalman filter based on the difference between the azimuth angle of the white line stored in an azimuth database and the azimuth angle of the white line calculated from the image.
  • However, since the technique of Patent Document 1 relies on images captured by a camera, the error cannot be estimated correctly when a clear image cannot be obtained, such as in bad weather. In particular, when position estimation at the lane level is required, the technique described in Patent Document 1 cannot cope. Advanced driving support and automated driving require specifying the lane and the driving position within the lane, so more accurate self-position estimation is needed.
  • This disclosure aims to provide a self-position estimation device capable of highly accurate position estimation at the lane level.
  • The present disclosure relates to a self-position estimation device comprising a map information acquisition unit (201) that acquires map information including lane information identifying lanes in which a vehicle can travel, an in-lane position detection unit (202) that detects in-lane position information specifying the position within the lane in which the host vehicle is traveling, an absolute position estimation unit (203) that estimates absolute position information specifying the absolute position of the host vehicle and its error, and a host vehicle position estimation unit (204) that estimates the position of the host vehicle relative to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information.
  • The host vehicle position estimation unit determines, based on the correlation between the in-lane position, the absolute position, and their errors, whether there is any lane candidate, that is, which of the lanes specified by the lane information the in-lane position may correspond to, and estimates the position of the host vehicle relative to the map information based on the determination result.
  • In this way, the position of the host vehicle can be estimated taking into account both the in-lane position and the absolute position for each lane candidate.
  • FIG. 1 is a block configuration diagram illustrating a functional configuration of the self-position estimation apparatus according to the embodiment.
  • FIG. 2 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 3 is a block configuration diagram illustrating a functional configuration of the self-position estimation apparatus according to the embodiment.
  • FIG. 4 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 5 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 6 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 7 is a diagram for explaining self-position estimation according to the present embodiment.
  • FIG. 8 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 9 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 10 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 11 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 12 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 13 is a diagram for describing self-position estimation according to the present embodiment.
  • FIG. 14 is a diagram for describing self-position estimation according to the present embodiment.
  • the self-position estimation apparatus 10 is configured as a computer including a calculation unit such as a CPU, a storage unit such as a RAM and a ROM, and an interface unit for exchanging data with various sensors as hardware components. Subsequently, functional components of the self-position estimation apparatus 10 will be described.
  • The self-position estimation apparatus 10 includes a self-position measurement unit 101, a vehicle momentum measurement unit 102, a white line recognition unit 103, a surrounding environment measurement unit 104, a route information acquisition unit 105, a dead reckoning unit 106, a position estimation unit 108, a map information acquisition unit 109, and a travel lane estimation unit 110.
  • the self-position measuring unit 101 is a part for measuring the position of the own vehicle by GNSS (Global Navigation Satellite System).
  • the self-position measuring unit 101 calculates a vehicle measurement position that is a navigation measurement position of the vehicle based on navigation signals received from a plurality of navigation satellites.
  • The self-position measuring unit 101 outputs the calculated host vehicle measurement position to the dead reckoning unit 106 and the position estimation unit 108.
  • The vehicle momentum measuring unit 102 is a part that receives signals from sensors such as an acceleration sensor, a vehicle speed sensor, and a gyro sensor and measures the amount of movement of the host vehicle.
  • The vehicle momentum measuring unit 102 outputs momentum information such as the vehicle speed, azimuth angle, yaw rate, and acceleration to the dead reckoning unit 106 and the position estimation unit 108.
  • the white line recognition unit 103 is a part for recognizing a white line that divides a lane using image data captured by the camera.
  • the white line recognition unit 103 outputs information on the presence or absence of a white line and information on the type of white line to the position estimation unit 108.
  • the surrounding environment measurement unit 104 is a part that measures the weather and satellite arrangement information.
  • the surrounding environment measurement unit 104 outputs weather and satellite arrangement information to the position estimation unit 108.
  • the route information acquisition unit 105 is a part that acquires the destination of the vehicle and the route to the destination from the navigation system.
  • the route information acquisition unit 105 outputs information indicating the destination and the route to the travel lane estimation unit 110.
  • The dead reckoning unit 106 is a part that calculates, as the host-vehicle gyro position, the position of the host vehicle in places where positioning by GNSS alone is difficult, based on the host vehicle measurement position output from the self-position measuring unit 101 and the momentum information output from the vehicle momentum measuring unit 102.
  • The dead reckoning unit 106 outputs the calculated host-vehicle gyro position to the position estimation unit 108.
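The dead-reckoning step described above can be sketched as follows. This is a minimal illustration under an assumed planar motion model; the function name and all numeric values are hypothetical, not taken from the patent.

```python
import math

def dead_reckon(x, y, heading, speed, yaw_rate, dt):
    """Advance the host-vehicle pose one time step from the wheel-speed
    and gyro measurements, starting from the last GNSS fix
    (simple planar model; all names and values are illustrative)."""
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Integrate 1 s of straight driving at 10 m/s from an origin fix.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = dead_reckon(*pose, speed=10.0, yaw_rate=0.0, dt=0.1)
```

Between GNSS fixes the pose is advanced from the momentum information alone, which is why the error of the gyro position grows until the next absolute fix.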
  • The map information acquisition unit 109 is a part that acquires map information including lane information identifying lanes in which the vehicle can travel.
  • the map information acquisition unit 109 reads the map information stored in the map information storage unit 120 and outputs the read map information to the position estimation unit 108 and the travel lane estimation unit 110.
  • The position estimation unit 108 is a part that estimates a corrected vehicle position, that is, a corrected position of the host vehicle, based on the map information and the vehicle measurement position and/or the vehicle gyro position.
  • The position estimation unit 108 estimates the corrected vehicle position by superimposing the accuracies of the map information and of the vehicle measurement position and/or the vehicle gyro position.
  • As this accuracy, either a probability distribution representing the certainty or a numerical value representing the certainty may be used.
  • a solid white line SLa is provided on the left side in the traveling direction of the lane L1.
  • a broken line white line BLa is provided between the lane L1 and the lane L2.
  • a broken line white line BLb is provided between the lane L2 and the lane L3.
  • a solid white line SLb is provided on the right side in the traveling direction of the lane L3.
  • the lane center line L1c is a line indicating the center of the lane L1.
  • the lane center line L2c is a line indicating the center of the lane L2.
  • the lane center line L3c is a line indicating the center of the lane L3.
  • the map probability distribution PDm of the lane center line L1c, the lane center line L2c, and the lane center line L3c is regarded as the likelihood of the map information.
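One way to realize such a map probability distribution PDm is an unnormalized mixture of Gaussian bumps centered on the lane center lines. The sketch below assumes a 3.5 m lane width and an illustrative spread; none of these values come from the patent.

```python
import math

# Lateral positions (m) of the lane center lines L1c, L2c, L3c,
# assuming a 3.5 m lane width; the width and spread are illustrative.
LANE_CENTERS = [1.75, 5.25, 8.75]
SIGMA = 0.8

def map_likelihood(y):
    """Map probability distribution PDm: high near a lane center line,
    low between lanes (unnormalized mixture of Gaussian bumps)."""
    return sum(math.exp(-(y - c) ** 2 / (2 * SIGMA ** 2))
               for c in LANE_CENTERS)
```

The likelihood peaks at each lane center and dips at the lane boundaries, which is what pulls the corrected position toward a lane center line in the figures.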
  • the host vehicle is at the host vehicle measurement position Pa.
  • the host vehicle is traveling along the lane L1 from the host vehicle measurement position Pa.
  • the own vehicle gyro position may be used instead of the own vehicle measurement position.
  • the position estimation unit 108 estimates the corrected vehicle position Pb by superimposing the vehicle position probability distribution PDca and the map information probability distribution PDm at the vehicle measurement position Pa at the first estimation timing.
  • the corrected host vehicle position Pb is corrected to the lane center line L1c side by the distance d1 as compared with the case where the correction is not performed from the host vehicle measurement position Pa.
  • the position estimation unit 108 estimates the corrected vehicle position Pc by superimposing the vehicle position probability distribution PDcb at the corrected vehicle position Pb and the map information probability distribution PDm at the next estimation timing.
  • the corrected vehicle position Pc is corrected to the lane center line L1c side by the distance d2 as compared to the case where no correction is made from the corrected vehicle position Pb.
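The repeated correction toward the lane center line can be illustrated by superimposing (multiplying) two Gaussian beliefs: the fused mean moves toward the lane center by a step (d1, then a smaller d2) while the variance shrinks. A sketch with illustrative numbers:

```python
def fuse(mean_v, var_v, mean_m, var_m):
    """Superimpose two Gaussian beliefs (vehicle position probability
    distribution and map probability distribution) by taking their
    product; the fused mean lies between the two inputs, pulled toward
    the more certain one."""
    var = 1.0 / (1.0 / var_v + 1.0 / var_m)
    mean = var * (mean_v / var_v + mean_m / var_m)
    return mean, var

# Measured lateral position Pa = 1.0 m, lane center line L1c at 1.75 m
# (all numbers illustrative).  Each fusion pulls the estimate toward
# the lane center by a shrinking correction: first d1, then d2 < d1.
pb, var_b = fuse(1.0, 1.0, 1.75, 1.0)   # corrected position Pb
pc, var_c = fuse(pb, var_b, 1.75, 1.0)  # corrected position Pc
```

With equal variances the first step moves halfway toward the center (d1 = 0.375 m here) and the second step moves less (d2 = 0.125 m), mirroring the successive corrections Pb and Pc in the description.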
  • The map probability distribution is not limited to the probability distribution of the lane center lines; any probability distribution indicating the certainty of the map information may be used. A map probability distribution offset according to the driver's driving habit or the road shape may also be used.
  • the road shape includes information such as the road width and the presence or absence of an adjacent lane.
  • the self-position estimation apparatus 20 is configured as a computer including a calculation unit such as a CPU, a storage unit such as a RAM and a ROM, and an interface unit for exchanging data with various sensors as hardware components. Subsequently, functional components of the self-position estimation apparatus 20 will be described.
  • The self-position estimation apparatus 20 includes a map information acquisition unit 201, an in-lane position detection unit 202, an absolute position estimation unit 203, a host vehicle position estimation unit 204, a comparison target detection unit 205, and a map information storage unit 211.
  • The map information acquisition unit 201 is a part that acquires map information including lane information identifying lanes in which the vehicle can travel.
  • the map information acquisition unit 201 reads the map information stored in the map information storage unit 211 and outputs the read map information to the own vehicle position estimation unit 204.
  • the in-lane position detection unit 202 is a part that detects in-lane position information that specifies an in-lane position that is a position in the lane in which the host vehicle is traveling.
  • the in-lane position detection unit 202 detects in-lane position information based on the surrounding environment and lane conditions captured by the camera.
  • the in-lane position detection unit 202 outputs in-lane position information to the own vehicle position estimation unit 204.
  • the in-lane position detection unit 202 identifies the in-lane position from the lateral deviation and turning angle of the host vehicle, and generates in-lane position information.
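A minimal sketch of deriving the in-lane position from camera measurements, assuming the camera reports the distances to the left and right lane boundaries and a heading error relative to the lane; the function and all values are illustrative assumptions:

```python
import math

def in_lane_position(dist_left, dist_right, heading_error_deg):
    """Derive the in-lane position from camera measurements: the
    distances (m) to the left and right lane boundaries give the
    lateral deviation from the lane center, and the heading error
    relative to the lane direction gives the turning angle."""
    lane_width = dist_left + dist_right
    lateral_offset = (dist_left - dist_right) / 2.0  # + = right of center
    turning_angle = math.radians(heading_error_deg)
    return lateral_offset, lane_width, turning_angle

# 2.0 m to the left line, 1.5 m to the right line, 3 degrees of yaw.
offset, width, angle = in_lane_position(2.0, 1.5, heading_error_deg=3.0)
```

Here the vehicle sits 0.25 m right of the lane center in a 3.5 m lane; the offset and turning angle together form the in-lane position information passed to the host vehicle position estimation unit.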
  • the absolute position estimation unit 203 shown in FIG. 3 is a part that estimates absolute position information that specifies the absolute position of the host vehicle and its error.
  • the absolute position estimation unit 203 outputs the estimated absolute position information to the own vehicle position estimation unit 204.
  • the absolute position estimation unit 203 can estimate the absolute position information by various methods.
  • the absolute position estimation unit 203 can estimate absolute position information that specifies the absolute position of the host vehicle and its error by GNSS.
  • The absolute position estimation unit 203 can calculate a host vehicle measurement position, which is the navigation measurement position of the host vehicle, based on navigation signals received from a plurality of navigation satellites, and can estimate the absolute position information based on that host vehicle measurement position.
  • the absolute position estimation unit 203 can also receive signals from sensors such as an acceleration sensor, a vehicle speed sensor, and a gyro sensor, and measure the amount of movement of the host vehicle.
  • The absolute position estimation unit 203 can also calculate, based on the host vehicle measurement position and the information indicating the amount of movement of the host vehicle, the position of the host vehicle as the host-vehicle gyro position in places where positioning by GNSS alone is difficult, and estimate the absolute position information from it.
  • FIG. 4 shows an example of a mode estimated by the absolute position estimation unit 203.
  • the absolute position estimation unit 203 estimates “candidate 1” and “candidate 2” as absolute position information.
  • Candidate 1 estimates the current position X^i_t based on the amount of movement of the host vehicle from the previous position X^i_{t-1}.
  • The current position X^i_t is position information including an estimation error.
  • Candidate 2 estimates the current position X^j_t taking into account the amount of movement of the host vehicle from the previous position X^j_{t-1}.
  • The current position X^j_t is position information including an estimation error.
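The propagation of the two candidates, with the estimation error growing at each step, can be sketched as follows (a simplified one-dimensional model; the process-noise value is an illustrative assumption):

```python
def propagate(pos, var, delta, step_var):
    """Advance one absolute-position candidate (e.g. candidate 1 from
    X^1_{t-1} to X^1_t) by the measured movement delta; the estimation
    error variance grows by the per-step process noise step_var."""
    return pos + delta, var + step_var

# Two candidates propagated independently; both errors grow each step.
c1, c2 = (10.0, 0.25), (13.5, 0.25)
for _ in range(5):
    c1 = propagate(*c1, delta=1.0, step_var=0.04)
    c2 = propagate(*c2, delta=1.0, step_var=0.04)
```

Because both candidates accumulate the same movement but start from different previous positions, they remain distinct hypotheses until extra information (line types, GPS, road shape) rejects one of them.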
  • The host vehicle position estimation unit 204 shown in FIG. 3 is a part that estimates the position of the host vehicle relative to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information.
  • The host vehicle position estimation unit 204 determines, based on the correlation between the in-lane position, the absolute position, and their errors, whether there is any lane candidate, that is, which of the lanes specified by the lane information the in-lane position may correspond to, and estimates the position of the host vehicle relative to the map information based on the determination result. If there is no lane candidate, it may be determined that a sensor is abnormal.
  • "Gyro estimation candidate 1" and "gyro estimation candidate 2" shown in FIG. 6 correspond to "candidate 1" and "candidate 2" shown in FIG. 4; the estimation errors of "candidate 1" and "candidate 2" in FIG. 4 are superimposed on the map information.
  • the “camera observation lateral deviation” shown in FIG. 6 is obtained by superimposing the “lateral deviation” shown in FIG. 5 on the map information for each lane.
  • P_{t-1} is the covariance matrix at the previous time.
  • y_{t-1} is the lateral position at the previous time.
  • θ_{t-1} is the attitude angle at the previous time.
  • The lateral position y_t and the attitude angle θ_t are calculated by the following equation.
  • ω is the system noise.
  • The covariance matrix P_t estimated by the gyro sensor is calculated by the following equation using the covariance matrix P_{t-1} at the previous time.
  • M is the covariance matrix of the system noise ω, and is set based on the error specification of the gyro sensor.
  • T denotes matrix transposition.
  • the information shown in FIG. 6 is sensor-fused by a Kalman filter.
  • the Kalman gain K is calculated by the following equation.
  • Q is the observation error matrix, set from the error characteristics of the observed values of the in-lane position.
  • H is a 2 × 2 identity matrix.
  • Z_t is a vector in which the lateral position and the attitude angle estimated by the gyro sensor are arranged vertically.
  • ΔZ is the difference between the camera observation of the in-lane position and attitude angle and the estimation result Z_t by the gyro sensor.
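With H the 2 × 2 identity matrix, as stated above, the standard Kalman update reduces to K = P(P + Q)^-1, x' = x + K ΔZ, P' = (I - K)P. The sketch below implements that reduced form with illustrative numbers; it is an interpretation of the described sensor fusion, not code reproduced from the patent:

```python
def mat_mul(a, b):
    """Multiply two 2x2 matrices (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_inv(a):
    """Invert a 2x2 matrix."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [[a[1][1] / det, -a[0][1] / det],
            [-a[1][0] / det, a[0][0] / det]]

def kalman_update(x, P, z, Q):
    """Fuse the gyro-predicted state x = [lateral position, attitude
    angle] (covariance P) with the camera observation z.  With H the
    2x2 identity: K = P (P + Q)^-1, x' = x + K * dZ, P' = (I - K) P,
    where dZ = z - x is the camera/gyro difference."""
    S = [[P[i][j] + Q[i][j] for j in range(2)] for i in range(2)]
    K = mat_mul(P, mat_inv(S))
    dz = [z[0] - x[0], z[1] - x[1]]
    x_new = [x[i] + K[i][0] * dz[0] + K[i][1] * dz[1] for i in range(2)]
    I_K = [[(1.0 if i == j else 0.0) - K[i][j] for j in range(2)]
           for i in range(2)]
    P_new = mat_mul(I_K, P)
    return x_new, P_new

# Equal prediction and observation uncertainty: the fused state lands
# midway between gyro prediction and camera observation, and the
# covariance is halved.
x, P = kalman_update([0.5, 0.02], [[0.1, 0.0], [0.0, 0.01]],
                     [0.3, 0.00], [[0.1, 0.0], [0.0, 0.01]])
```

When the gyro covariance P grows (long dead-reckoning stretches), K approaches the identity and the camera observation dominates; when the camera is unreliable (large Q), the gyro prediction dominates.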
  • the lane corresponding candidates are lane 1 and lane 2.
  • When there are a plurality of lane candidates, the comparison target detection unit 205 is used to reject some of the lane candidates.
  • The comparison target detection unit 205 is a part that detects comparison target information, which is information different from that used for detecting the in-lane position information and estimating the absolute position information.
  • As the comparison target information, line type information for distinguishing lanes, road shape information in the map information, position information by GPS, and the like are used.
  • the own vehicle position estimating unit 204 determines whether or not the selection condition is satisfied based on the lane candidate and the comparison target information, and executes the selection of the lane candidate.
  • line type information is used as comparison target information.
  • The comparison target detection unit 205 detects, as line type information, a solid line on the left side and a broken line on the right side. Based on this line type recognition result, the host vehicle position estimation unit 204 estimates that the host vehicle is in the lane on the left side in the traveling direction, rejects the lane on the right side in the traveling direction as a candidate, and executes the selection of lane candidates. In this case, the remaining lane candidate is the lane on the left side in the traveling direction.
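The line-type selection described above amounts to keeping only the candidates whose map line types match the camera observation; a sketch with hypothetical lane names and line types:

```python
def select_by_line_type(candidates, observed, map_line_types):
    """Keep only the lane candidates whose (left, right) boundary line
    types in the map match the line types observed by the camera.
    Lane names and line types are illustrative."""
    return [lane for lane in candidates
            if map_line_types[lane] == observed]

map_line_types = {"left_lane": ("solid", "dashed"),
                  "right_lane": ("dashed", "solid")}
# The camera observes a solid line on the left and a dashed line on
# the right, so only the left lane in the traveling direction survives.
remaining = select_by_line_type(["left_lane", "right_lane"],
                                ("solid", "dashed"), map_line_types)
```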
  • GPS information is used as comparison target information when both lanes are lane candidates.
  • The comparison target detection unit 205 acquires, as GPS information, position information including an error, indicated by the broken-line circle in the figure. Based on the GPS information, the host vehicle position estimation unit 204 estimates that the host vehicle is in the lane on the left side in the traveling direction, rejects the lane on the right side in the traveling direction as a candidate, and executes the selection of lane candidates. In this case, the remaining lane candidate is the lane on the left side in the traveling direction.
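The GPS-based selection can be sketched as an overlap test between each candidate's error region and the GPS error circle; the circular-error model and all numbers are illustrative assumptions:

```python
import math

def overlaps(candidate, gps, candidate_err, gps_err):
    """A lane candidate is kept only if its error region overlaps the
    GPS error circle, i.e. the distance between the two position
    estimates does not exceed the sum of their error radii
    (a simplified circular-error model)."""
    return math.hypot(candidate[0] - gps[0],
                      candidate[1] - gps[1]) <= candidate_err + gps_err

gps_fix = (0.0, 0.0)
left_candidate = (1.0, 0.0)   # inside the combined error: kept
right_candidate = (4.5, 0.0)  # outside the combined error: rejected
kept = [c for c in (left_candidate, right_candidate)
        if overlaps(c, gps_fix, candidate_err=1.0, gps_err=2.0)]
```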
  • map information is used as comparison target information when candidates exist in both lanes.
  • The comparison target detection unit 205 detects the road shape in the map information as the comparison target information.
  • The host vehicle position estimation unit 204 rejects the lane on the left side in the traveling direction as a candidate because, as time elapses, the candidate on the left side no longer overlaps with the map information, and executes the selection of lane candidates.
  • The remaining lane candidate is the lane on the right side in the traveling direction and, after the lane change, the lane on the left side in the traveling direction.
  • The host vehicle position estimation unit 204 can notify that the reliability of the estimation result is low when there are a plurality of lane candidates or when the absolute position error is greater than or equal to a predetermined value. As shown in FIG. 13, when there is one candidate and the error variance is small, the reliability is determined to be high. Even when there is only one candidate, if the error variance is large, the reliability is determined to be low. When there are a plurality of candidates, the reliability is determined to be low even if the error variance is small, because the lane cannot be specified; if, in addition, the error variance is large, the reliability is likewise determined to be low.
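The reliability determination of FIG. 13 reduces to a simple rule on the number of candidates and the error variance; a sketch, with an illustrative variance threshold:

```python
def reliability(num_candidates, error_variance, threshold=1.0):
    """Reliability rule described above: high only when exactly one
    lane candidate remains AND its error variance is small; a single
    candidate with large variance, or multiple candidates, gives low
    reliability.  The variance threshold is illustrative."""
    return ("high" if num_candidates == 1 and error_variance < threshold
            else "low")
```

All four cells of the FIG. 13 table fall out of this rule: only (one candidate, small variance) yields high reliability.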
  • FIG. 14A shows the position recognition result in the lane.
  • FIG. 14B shows the recognition result of the vehicle position.
  • FIG. 14C shows the number of lane candidates.
  • FIG. 14D shows the reliability.
  • the position in the lane is recognized.
  • the position of the host vehicle is recognized on the right lane.
  • the position in the lane is not recognized. For this reason, the error in recognizing the position of the host vehicle is large, and the reliability is low.
  • the in-lane position recognition is performed again, so that the error in the position recognition of the host vehicle is reduced and the reliability is also increased.
  • the position in the lane is not recognized, and the reliability is low again.
  • The self-position estimation device 20 includes a map information acquisition unit 201 that acquires map information including lane information identifying lanes in which the vehicle can travel, an in-lane position detection unit 202 that detects in-lane position information specifying the position within the lane in which the host vehicle is traveling, an absolute position estimation unit 203 that estimates absolute position information specifying the absolute position of the host vehicle and its error, and a host vehicle position estimation unit 204 that estimates the position of the host vehicle relative to the lane information included in the map information, based on the map information, the in-lane position information, and the absolute position information.
  • The host vehicle position estimation unit 204 determines, based on the correlation between the in-lane position, the absolute position, and their errors, whether there is any lane candidate, that is, which of the lanes specified by the lane information the in-lane position may correspond to, and estimates the position of the host vehicle relative to the map information based on the determination result.
  • In this way, the position of the host vehicle can be estimated taking into account both the in-lane position and the absolute position for each lane candidate.
  • The host vehicle position estimation unit 204 determines lane candidates based on the degree of overlap between the in-lane position including its error distribution and the absolute position including its error distribution. By superimposing the error distributions of both the in-lane position and the absolute position, the position of the host vehicle can be estimated with those errors reflected.
  • When a plurality of lane candidates are determined, the host vehicle position estimation unit 204 continues estimating the position of the host vehicle for each lane candidate until the selection condition is satisfied, and selects among the lane candidates once the selection condition is satisfied. Because the determined lane candidates can then be kept or rejected, the load of estimating the host vehicle position for a plurality of lane candidates can be reduced.
  • the self-position estimation apparatus 20 further includes a comparison target detection unit 205 that detects comparison target information different from information used for detection of in-lane position information and absolute position information.
  • the own vehicle position estimating unit 204 determines whether or not the selection condition is satisfied based on the lane candidate and the comparison target information, and executes the selection of the lane candidate.
  • Since the lane candidates are selected based on the comparison target information, they can be evaluated from a different point of view. By using comparison target information of high reliability, unnecessary lane candidates can be rejected.
  • The comparison target detection unit 205 detects, as the comparison target information, running line type information: line type information detected from an image captured by an imaging device provided in the host vehicle, identifying the line type of at least one boundary line of the lane in which the host vehicle is traveling. When the map line type information identifying the line type of a lane boundary line included in the lane information does not match the running line type information, the host vehicle position estimation unit 204 determines that the selection condition is satisfied and rejects the lane candidate that does not match.
  • For example, when the map line type information specifies broken lines on both the right and left sides as viewed from the host vehicle, but the line types specified by the running line type information are a solid line on the right side and a broken line on the left side, the map line type information and the running line type information are inconsistent; it is therefore determined that the selection condition is satisfied, and the corresponding lane candidate is rejected. Since the running line type information detected from an image captured by the imaging device is considered highly reliable, unnecessary lane candidates can be rejected.
  • The absolute position estimation unit 203 estimates the absolute position information based on the detection result of a gyro sensor, while the comparison target detection unit 205 detects the comparison target information based on navigation signals received from a plurality of navigation satellites.
  • The host vehicle position estimation unit 204 determines that the selection condition is satisfied when a lane candidate and the position of the host vehicle specified by the comparison target information do not overlap, and rejects the lane candidate that does not overlap.
  • Since the comparison target information is generated from navigation signals received from a plurality of navigation satellites, its reliability can exceed that of the absolute position information based on the gyro sensor when the reception of the navigation signals is good, so unnecessary lane candidates can be rejected.
  • The host vehicle position estimation unit 204 determines that the selection condition is satisfied when there are a plurality of lane candidates and at least one of them does not overlap with the lane information, and rejects the lane candidates that do not overlap.
  • By keeping the lane candidates that overlap with the lane information and rejecting those that do not, unnecessary lane candidates can be rejected.
  • the own vehicle position estimation unit 204 notifies that the reliability of the estimation result is low when there are a plurality of lane candidates or when the absolute position error is greater than or equal to a predetermined value. By notifying that the reliability of the estimation result is low, the user can recognize the decrease in the reliability.

Abstract

A host vehicle position estimation unit (204) of a self-position estimation device (20) determines the presence or absence of a lane candidate, indicating whether an in-lane position corresponds to a lane specified by lane information, based on the mutual relationships between the in-lane position, an absolute position, and their errors, and estimates the position of a host vehicle relative to map information based on the determination result.
PCT/JP2018/010068 2017-03-16 2018-03-14 Self-position estimation device WO2018168961A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/568,606 US11639853B2 (en) 2017-03-16 2019-09-12 Self-localization estimation device

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2017051066 2017-03-16
JP2017-051066 2017-03-16
JP2017-248744 2017-12-26
JP2017248744A JP6693496B2 (ja) 2017-03-16 2017-12-26 Self-position estimation device
JP2017248745A JP6870604B2 (ja) 2017-03-16 2017-12-26 Self-position estimation device
JP2017-248745 2017-12-26

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/568,606 Continuation US11639853B2 (en) 2017-03-16 2019-09-12 Self-localization estimation device

Publications (1)

Publication Number Publication Date
WO2018168961A1 true WO2018168961A1 (fr) 2018-09-20

Family

ID=63522498

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2018/010068 WO2018168961A1 (fr) 2017-03-16 2018-03-14 Self-localization estimation device
PCT/JP2018/010057 WO2018168956A1 (fr) 2017-03-16 2018-03-14 Dispositif d'estimation de position propre

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010057 WO2018168956A1 (fr) 2017-03-16 2018-03-14 Self-localization estimation device

Country Status (1)

Country Link
WO (2) WO2018168961A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7223629B2 (ja) * 2019-05-13 2023-02-16 Hitachi Astemo, Ltd. In-vehicle system, external environment recognition sensor, and electronic control unit

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08233584A (ja) * 1995-02-27 1996-09-13 Nippon Telegr & Teleph Corp <Ntt> Mobile object position detection device
JP2006112878A (ja) * 2004-10-14 2006-04-27 Alpine Electronics Inc Navigation device
US20120271540A1 (en) * 2009-10-22 2012-10-25 Krzysztof Miksa System and method for vehicle navigation using lateral offsets
WO2017029734A1 (fr) * 2015-08-19 2017-02-23 Mitsubishi Electric Corporation Lane recognition device and lane recognition method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4832489B2 (ja) * 2008-09-25 2011-12-07 Clarion Co., Ltd. Lane determination device
JP4934167B2 (ja) * 2009-06-18 2012-05-16 Clarion Co., Ltd. Position detection device and position detection program
JP2012127845A (ja) * 2010-12-16 2012-07-05 Yupiteru Corp In-vehicle electronic device and program
EP3106836B1 (fr) * 2015-06-16 2018-06-06 Volvo Car Corporation Unit and method for adjusting a road boundary

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3882885A4 (fr) * 2019-03-12 2022-11-23 Hitachi Astemo, Ltd. Vehicle control device
US11796324B2 (en) 2019-03-12 2023-10-24 Hitachi Astemo, Ltd. Vehicle control device
JP2021012086A (ja) * 2019-07-05 2021-02-04 Toyota Motor Corporation Lane estimation device
JP7120170B2 (ja) Lane estimation device
WO2022015715A1 (fr) 2020-07-13 2022-01-20 The Trustees Of The University Of Pennsylvania Compositions useful for treatment of Charcot-Marie-Tooth disease
WO2022221276A1 (fr) 2021-04-12 2022-10-20 The Trustees Of The University Of Pennsylvania Compositions useful for treatment of spinal and bulbar muscular atrophy (SBMA)
WO2023133574A1 (fr) 2022-01-10 2023-07-13 The Trustees Of The University Of Pennsylvania Compositions and methods useful for the treatment of C9orf72-mediated disorders

Also Published As

Publication number Publication date
WO2018168956A1 (fr) 2018-09-20

Similar Documents

Publication Publication Date Title
JP6870604B2 (ja) Self-localization estimation device
WO2018168961A1 (fr) Self-localization estimation device
Atia et al. A low-cost lane-determination system using GNSS/IMU fusion and HMM-based multistage map matching
CN105937912B (zh) Map data processing device for vehicle
EP2664894B1 (fr) Navigation device
US11287524B2 (en) System and method for fusing surrounding V2V signal and sensing signal of ego vehicle
US8775063B2 (en) System and method of lane path estimation using sensor fusion
US8452535B2 (en) Systems and methods for precise sub-lane vehicle positioning
EP2054699B1 (fr) Rerouting in vehicle navigation systems
US11599121B2 (en) Method for localizing a more highly automated vehicle (HAF), in particular a highly automated vehicle, and a vehicle system
JP2019532292A (ja) Autonomous vehicle with vehicle position identification
WO2007000912A1 (fr) Vehicle and lane recognition device
EP3644016A1 (fr) Localization using dynamic landmarks
JP2016080460A (ja) Moving body
US20220113139A1 (en) Object recognition device, object recognition method and program
US20210278217A1 (en) Measurement accuracy calculation device, self-position estimation device, control method, program and storage medium
KR102134841B1 (ko) Position estimation system of autonomous vehicle and position estimation method thereof
CN114670840A (zh) Blind spot estimation device, vehicle travel system, and blind spot estimation method
JP6507841B2 (ja) Preceding vehicle estimation device and program
JP2016218015A (ja) In-vehicle sensor correction device, self-localization estimation device, and program
KR20210073281A (ko) Method and apparatus for estimating motion information
EP4001844A1 (fr) Method and apparatus for localization
CN113196109A (zh) Method for determining integrity ranges
US11287281B2 (en) Analysis of localization errors in a mobile object
JP2018169319A (ja) Travel lane estimation device for vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18768509

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18768509

Country of ref document: EP

Kind code of ref document: A1