WO2020021596A1 - Vehicle position estimation device and vehicle position estimation method - Google Patents

Vehicle position estimation device and vehicle position estimation method

Info

Publication number
WO2020021596A1
WO2020021596A1 (PCT/JP2018/027495)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
road feature
vehicle position
estimated
Prior art date
Application number
PCT/JP2018/027495
Other languages
English (en)
Japanese (ja)
Inventor
雄治 五十嵐
優子 大田
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2018/027495
Publication of WO2020021596A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • The present invention relates to a vehicle position estimating device for estimating the current position of a vehicle.
  • A technique has been disclosed in which a road sign is detected from image data obtained by photographing the surroundings of a vehicle, and the position of the vehicle on the road is estimated based on the relative position of the vehicle with respect to the detected road sign and the position of that road sign included in map information.
  • An imaging sensor such as a monocular camera has the advantage that it can recognize the type of an imaging target, such as a road sign, and is relatively inexpensive.
  • However, position detection using an imaging sensor has a detection error of about several meters in the traveling direction of the vehicle, making high-accuracy position detection difficult. It is therefore unsuitable for applications that require the position and azimuth of the vehicle to be detected with high accuracy, such as automated driving and preventive safety technology.
  • On the other hand, optical ranging sensors such as LiDAR (Light Detection and Ranging) sensors and stereo cameras offer high position detection accuracy.
  • Such sensors output point cloud information, which includes the distance to each detection target together with its reflection intensity or luminance/color.
  • However, processing this point cloud information places a heavy load on the CPU (Central Processing Unit), and position detection can take several seconds, so it is unsuitable for applications requiring immediate responsiveness, such as automated driving and preventive safety technology.
  • The present invention has been made to solve the above problems, and an object thereof is to provide a vehicle position estimating device that achieves both high-accuracy position detection and immediate responsiveness.
  • The vehicle position estimating device includes a vehicle position estimating unit that calculates first estimated own-vehicle position information, indicating the position and azimuth of the own vehicle on a map, based on absolute positioning information of the own vehicle obtained by satellite positioning and vehicle sensor information obtained from a vehicle sensor of the own vehicle, and a road feature position/azimuth detection unit that selects a road feature to be detected, detects its relative position and relative azimuth from optical ranging information, and calculates second estimated own-vehicle position information indicating the position and azimuth of the own vehicle on the map.
  • Since the detection processing is performed only on the optical ranging information within a range that includes the position of the road feature estimated from the first estimated own-vehicle position information, the time required for the detection processing can be reduced. As a result, a vehicle position estimating device having both high-accuracy position detection and immediate responsiveness can be realized.
  • FIG. 1 is a block diagram illustrating the configuration of a vehicle position estimation system according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of the detectable range of the optical ranging sensor device. FIG. 3 is a diagram showing an example of deriving the detectable area of a road feature. FIG. 4 is a diagram showing an example of the target range of the road feature detection processing. FIG. 5 is a flowchart illustrating the operation of the vehicle position estimation device according to the embodiment of the present invention.
  • FIGS. 6 and 7 are diagrams each illustrating an example of the hardware configuration of the vehicle position estimation device.
  • FIG. 1 is a block diagram showing a configuration of a vehicle position estimation system according to an embodiment of the present invention.
  • As shown in FIG. 1, the vehicle position estimation system includes a satellite positioning device 1, a vehicle sensor information output device 2, an imaging sensor device 3, an optical ranging sensor device 4, a high-accuracy map database 5, and a vehicle position estimation device 6.
  • a vehicle equipped with the vehicle position estimation system is referred to as “own vehicle”.
  • The satellite positioning device 1 calculates the absolute position (latitude, longitude, altitude) and absolute azimuth of the own vehicle by satellite positioning based on positioning signals transmitted by GNSS (Global Navigation Satellite System) satellites, such as GPS (Global Positioning System) satellites, and outputs the calculated absolute position and absolute azimuth of the own vehicle as "absolute positioning information".
  • The vehicle sensor information output device 2 outputs "vehicle sensor information" obtained from vehicle sensors mounted on the own vehicle, such as a vehicle speed sensor, a gyroscope, a steering angle sensor, and an air pressure sensor. The vehicle sensor information is assumed to include at least one of travel speed, rotation angle, steering angle, and air pressure.
  • The imaging sensor device 3 includes an imaging device (camera sensor) such as a monocular camera, and detects the position, shape, type, and the like of features around the road where the own vehicle is located (hereinafter, "road features") from images of the vicinity of the own vehicle. Road features include not only three-dimensional features such as traffic signs and traffic lights but also planar features such as lane markings and stop lines drawn on the road surface. The position of a road feature detected by the imaging sensor device 3 is a position relative to the own vehicle. The imaging sensor device 3 outputs information on the detected position, shape, type, and the like of each road feature, together with information on its detection accuracy, as "imaging sensor information".
  • The optical ranging sensor device 4 includes an optical ranging sensor such as a LiDAR sensor or a stereo camera, and outputs "optical ranging information", which includes the distance to obstacles existing around the own vehicle and their reflection intensity or luminance/color.
  • In the case of LiDAR, an object that reflects light is detected as an obstacle. In the case of a stereo camera, an object whose distance can be determined from parallax images (images captured from a plurality of different viewpoints) is detected as an obstacle.
  • The detectable range of the optical ranging sensor device 4 is preset in the vehicle position estimating device 6 as a parameter. Since this range varies with the performance of the optical ranging sensor device 4, it is preferably determined by experiment or the like.
  • FIG. 2 shows an example of the detectable range of the optical distance measuring sensor device 4.
  • In FIG. 2, with the own-vehicle position as the origin, the X axis is set in the traveling direction (heading direction) of the own vehicle, the Y axis in the lateral direction, and the Z axis in the height direction.
  • the own vehicle position as the origin corresponds to the own vehicle position estimated by the vehicle position estimating device 6, and is, for example, the center of gravity position, the center position, or the head position of the own vehicle.
  • As shown in FIG. 2, the detectable range of the optical ranging sensor device 4 can be defined as the space determined by a detectable distance L1, a detectable angle θ1 on the XY plane, and a detectable angle θ2 on the ZX plane. Here, the detectable distance is assumed to be the same on the XY plane and the ZX plane, but different detectable distances may be set for the two planes.
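The detectable-range test described above can be sketched as a small predicate. This is only an illustration: the patent defines the range geometrically (distance L1, angle θ1 on the XY plane, angle θ2 on the ZX plane), while the function name, the vehicle-frame convention, and the use of half-angles on either side of the travel axis are assumptions.

```python
import math

def in_detectable_range(point, L1, theta1, theta2):
    """Check whether a point (x, y, z) in the vehicle frame (X: travel
    direction, Y: lateral, Z: height, origin at the own-vehicle position)
    lies inside the sensor's detectable range.

    L1     -- detectable distance
    theta1 -- total detectable angle on the XY plane, in radians
    theta2 -- total detectable angle on the ZX plane, in radians
    """
    x, y, z = point
    if math.sqrt(x * x + y * y + z * z) > L1:
        return False                       # beyond the detectable distance
    if abs(math.atan2(y, x)) > theta1 / 2:
        return False                       # outside the horizontal field
    if abs(math.atan2(z, x)) > theta2 / 2:
        return False                       # outside the vertical field
    return True
```

Such a predicate would be evaluated with the detectable-range parameters preset in the vehicle position estimating device 6.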
  • the high-precision map database 5 is a database in which data of high-precision maps is stored.
  • The high-precision map includes detailed information on road features, such as information indicating the lane shape of the road (the position, shape, and type of lane markings and the position of stop lines) and the position, shape, type, and orientation of traffic signs and traffic signals installed around the road.
  • The high-accuracy map database 5 need not be mounted on the own vehicle; it may be, for example, a server that distributes high-accuracy map data to the vehicle position estimating device 6 by communication.
  • In FIG. 1, the high-accuracy map database 5 is depicted as a separate block from the vehicle position estimating device 6, but it may be provided inside the vehicle position estimating device 6. Further, the high-accuracy map database 5 and the vehicle position estimating device 6 may be provided in a navigation system of the own vehicle.
  • the vehicle position estimating device 6 includes a vehicle position estimating unit 61 and a road feature position / azimuth detecting unit 62, as shown in FIG.
  • The vehicle position estimating unit 61 estimates the position and azimuth of the own vehicle on the high-accuracy map based on the absolute positioning information output from the satellite positioning device 1 (the absolute position and absolute azimuth of the own vehicle), the vehicle sensor information output from the vehicle sensor information output device 2 (the travel speed, rotation angle, steering angle, and air pressure of the own vehicle), and the high-accuracy map data stored in the high-accuracy map database 5, and outputs the result as "first estimated own-vehicle position information". The absolute positioning information and the vehicle sensor information are input to the vehicle position estimating unit 61 at regular intervals, and the vehicle position estimating unit 61 outputs the first estimated own-vehicle position information at regular intervals.
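As a rough sketch of what the vehicle position estimating unit 61 might do internally, the following combines dead reckoning from vehicle sensor information with a periodic absolute satellite fix. The patent does not specify the fusion method; the planar motion model, the function names, and the fixed-gain blend are assumptions for illustration only.

```python
import math

def propagate_pose(pose, speed, yaw_rate, dt):
    """Dead-reckon one step: advance (x, y, heading) using vehicle sensor
    information (travel speed and yaw rate) under a minimal planar model."""
    x, y, heading = pose
    heading += yaw_rate * dt
    x += speed * dt * math.cos(heading)
    y += speed * dt * math.sin(heading)
    return (x, y, heading)

def fuse_with_fix(pose, satellite_fix, gain=0.1):
    """Pull the dead-reckoned pose toward an absolute satellite fix
    (x, y, heading). A fixed-gain blend stands in for whatever filter
    the real unit 61 uses."""
    return tuple(p + gain * (f - p) for p, f in zip(pose, satellite_fix))
```

At each regular interval, the pose would be propagated with the latest vehicle sensor information and, when a fix arrives, blended toward the absolute positioning information.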
  • The road feature position/azimuth detecting unit 62 selects a road feature to be detected from the optical ranging information output by the optical ranging sensor device 4, based on the first estimated own-vehicle position information output by the vehicle position estimating unit 61 (the position and azimuth of the own vehicle on the high-accuracy map) and on at least one of the high-accuracy map stored in the high-accuracy map database 5 and the imaging sensor information output by the imaging sensor device 3 (the position, shape, type, and the like of road features detected by the imaging sensor device 3).
  • The road feature position/azimuth detecting unit 62 then detects the relative position and relative azimuth of the selected road feature with respect to the own vehicle from the optical ranging information output by the optical ranging sensor device 4. Further, based on the detected relative position and relative azimuth of the road feature and the high-precision map data, the road feature position/azimuth detecting unit 62 estimates the position and azimuth of the own vehicle on the high-precision map. The estimated position and azimuth are output from the road feature position/azimuth detecting unit 62 as "second estimated own-vehicle position information".
  • As the road feature to be detected, the road feature position/azimuth detection unit 62 selects one whose relative position and relative azimuth can be detected from the own vehicle at high speed and with high accuracy, that is, one for which the optical ranging information has high density and accuracy. Specifically, a road feature that is close to the own vehicle (short detection distance) and large as seen from the own vehicle (large detectable area) is selected as the detection target. For example, if road feature detection processing based on the first estimated own-vehicle position information and the high-precision map or the imaging sensor information finds a cylindrical pole 10 cm in diameter and a disk-shaped road sign 30 cm in radius at the same distance from the own vehicle, the road feature position/azimuth detecting unit 62 selects the road sign of the two as the detection target.
  • The road feature position/azimuth detection unit 62 may calculate the detectable size of a road feature based on the positional relationship between the own vehicle and the road feature. For example, as shown in FIG. 3, when the angle (detection angle) between the surface of a rectangular road feature of area S and the detection direction of the optical ranging sensor device 4 is θa, the detectable area of the road feature is S × sin θa.
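The detectable-area rule and the selection criterion can be illustrated as follows. The S × sin θa formula is from the text (FIG. 3); scoring candidates by detectable area divided by distance is an assumed way to combine the two stated preferences (near and large), since the patent does not give an explicit formula.

```python
import math

def detectable_area(surface_area, detection_angle):
    """Detectable area of a planar road feature of surface area S seen at
    detection angle theta_a: S * sin(theta_a), as in FIG. 3."""
    return surface_area * math.sin(detection_angle)

def select_road_feature(candidates):
    """Choose the detection target among (name, distance_m, detectable_area_m2)
    tuples, preferring near and large features."""
    return max(candidates, key=lambda c: c[2] / c[1])
```

With a thin pole and a disk-shaped sign at the same distance, the sign's larger detectable area makes it the selected target, matching the example in the text.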
  • When detecting the relative position and relative azimuth of the road feature selected as the detection target, the road feature position/azimuth detecting unit 62 performs the detection processing only on the part of the optical ranging information (point cloud information) output by the optical ranging sensor device 4 that lies within the range where the road feature is estimated to be located. The position of the road feature is already known at the stage of selecting the detection target. Limiting the range of the optical ranging information subjected to the road feature detection processing in this way reduces the calculation load of the detection processing, so the position and azimuth of the road feature can be detected at high speed.
  • The range of the optical ranging information subjected to the road feature detection processing is the range corresponding to the estimated position and shape of the road feature, expanded at its outer edge by an area whose width corresponds to a predetermined margin. For example, for a disk-shaped road sign, the range of the optical ranging information subjected to the detection processing is defined as a circle of radius R enclosing the shape of the road sign plus an area of margin m added to its outer edge, that is, the circle of radius R + m shown in FIG. 4.
  • The margin m is preferably a value equal to or larger than the estimation error e of the first estimated own-vehicle position information.
  • This is because the range where the road feature is estimated to be located is obtained as a position relative to the own-vehicle position on the high-accuracy map indicated by the first estimated own-vehicle position information. Consequently, the same error as that contained in the first estimated own-vehicle position information also appears in the estimated position of the road feature.
  • The information on the estimation error e may be included in the first estimated own-vehicle position information, or may be a value obtained in advance by experiment or the like.
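Restricting the point cloud to the circle of radius R + m might look like the following sketch. The filtering itself and the rule m ≥ e follow the text; the function name and the planar (2-D) treatment of points are simplifying assumptions.

```python
import math

def points_in_search_range(points, center, R, e, extra=0.0):
    """Keep only the 2-D points inside the road-feature search circle.

    center -- estimated feature position (x, y), derived from the first
              estimated own-vehicle position and the map
    R      -- radius of the circle enclosing the feature's shape
    e      -- estimation error of the first estimated position; the margin
              m is taken as e + extra so that m >= e, as the text recommends
    """
    limit = R + e + extra
    return [p for p in points
            if math.hypot(p[0] - center[0], p[1] - center[1]) <= limit]
```

Only the points returned here would be passed to the shape-model detection step, which is where the reduction in calculation load comes from.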
  • In the detection processing, an optimal detection model is applied according to the shape information (circle, rectangle, cylinder, etc.) included in the road feature information of the high-precision map. For example, when the shape information of a road feature indicates a cylinder, the cylinder may be detected using the cylinder model of PCL (Point Cloud Library).
  • As described above, the vehicle position estimating device 6 of the present embodiment achieves both high-accuracy position detection and immediate responsiveness. In addition, limiting the range of the optical ranging information subjected to the road feature detection processing reduces the calculation load of the detection processing, so the position of the own vehicle can be estimated with high accuracy.
  • The detection processing may still take a certain amount of time (200 to 300 milliseconds may be required). Therefore, especially while the own vehicle is traveling, a difference may arise between the position and azimuth of the own vehicle indicated by the second estimated own-vehicle position information and the current position and azimuth of the own vehicle.
  • Therefore, the vehicle position estimating unit 61 obtains the error in the first estimated own-vehicle position information using the second estimated own-vehicle position information and, based on that error, corrects the current (latest) first estimated own-vehicle position information to obtain a more accurate position and azimuth of the own vehicle. Specifically, the vehicle position estimating unit 61 acquires from the road feature position/azimuth detecting unit 62 the second estimated own-vehicle position information together with the acquisition time of the optical ranging information used for its calculation, obtains the difference between the first estimated own-vehicle position information at that same time and the acquired second estimated own-vehicle position information, and regards the obtained difference as the error of the first estimated own-vehicle position information. The vehicle position estimating unit 61 then calculates a more accurate position and azimuth of the own vehicle by adding this error to the latest first estimated own-vehicle position information.
  • To make this possible, the vehicle position estimating unit 61 must hold the first estimated own-vehicle position information, together with its calculation times, from a certain time in the past (200 to 300 ms or more before) up to the present.
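The delay compensation can be sketched as follows: the error measured at the (older) acquisition time of the optical ranging data is applied to the newest first estimate. The buffering of timestamped first estimates follows the text; the dictionary-based history and the additive pose correction are simplifying assumptions.

```python
def correct_latest_pose(first_history, second_pose, t_scan):
    """Correct the newest first estimate using a delayed second estimate.

    first_history -- dict: calculation time -> first estimated pose
                     (x, y, heading); unit 61 must keep 200-300 ms or more
    second_pose   -- second estimated pose computed from the ranging data
    t_scan        -- acquisition time of that optical ranging data
    """
    # Error of the first estimate, observed at the older scan time.
    error = tuple(s - f
                  for s, f in zip(second_pose, first_history[t_scan]))
    # Apply the same error to the newest first estimate.
    latest = first_history[max(first_history)]
    return tuple(p + d for p, d in zip(latest, error))
```

Because the error is assumed to change slowly relative to the 200-300 ms processing delay, adding the old error to the newest estimate yields a usable current pose.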
  • FIG. 5 is a flowchart showing the operation of the vehicle position estimation device 6.
  • Next, the process by which the vehicle position estimating device 6 estimates the own-vehicle position (the own-vehicle position estimation process) will be described. The flow of FIG. 5 is repeatedly executed after the vehicle position estimating device 6 is activated.
  • First, the vehicle position estimating unit 61 calculates first estimated own-vehicle position information, including the position and azimuth of the own vehicle on the high-precision map, based on the absolute positioning information output by the satellite positioning device 1 and the vehicle sensor information output by the vehicle sensor information output device 2 (step S100).
  • Next, the road feature position/azimuth detecting unit 62 acquires, from the high-accuracy map database 5, information on the road features that lie within the detectable range of the optical ranging sensor device 4 among the road features existing around the own-vehicle position indicated by the first estimated own-vehicle position information (step S101).
  • Next, the road feature position/azimuth detecting unit 62 selects a road feature to be detected based on the first estimated own-vehicle position information calculated in step S100 and on one or both of the road feature information acquired in step S101 and the imaging sensor information output by the imaging sensor device 3 (step S102).
  • Next, the road feature position/azimuth detecting unit 62 detects the relative position and relative azimuth, as seen from the own vehicle, of the road feature selected in step S102 by performing detection processing on the optical ranging information (point cloud information) output from the optical ranging sensor device 4 (step S103). This detection processing is applied only to the optical ranging information within the range where the road feature is estimated to be located.
  • Next, the road feature position/azimuth detection unit 62 calculates second estimated own-vehicle position information, including the position and azimuth of the own vehicle on the high-precision map, based on the road feature information acquired in step S101 and the relative position and relative azimuth of the road feature detected in step S103 (step S104).
  • Finally, the vehicle position estimating unit 61 obtains the difference between the second estimated own-vehicle position information calculated in step S104 and the first estimated own-vehicle position information corresponding to the acquisition time of the optical ranging information used for that calculation, regards this difference as the error of the first estimated own-vehicle position information, and corrects the latest first estimated own-vehicle position information by adding the error to it (step S105). The current position and azimuth of the own vehicle can be obtained from the corrected first estimated own-vehicle position information.
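Steps S100 to S105 can be tied together as one iteration of an estimation loop. Every name here is illustrative; each stage is injected as a callable so that only the flow of FIG. 5, not any particular implementation, is being shown.

```python
def estimate_position_step(first_estimate, select_feature, detect_feature,
                           second_estimate, correct):
    """One iteration of the own-vehicle position estimation process
    (FIG. 5). Each stage is passed in as a callable; all names are
    illustrative, not from the patent."""
    first = first_estimate()                     # S100 (map lookup of S101 is
                                                 # assumed to happen inside
                                                 # select_feature)
    feature = select_feature(first)              # S102: choose a road feature
    rel_pose = detect_feature(feature)           # S103: relative position/azimuth
    second = second_estimate(feature, rel_pose)  # S104: second estimate
    return correct(first, second)                # S105: correct latest estimate
```

Running this repeatedly, with `first_estimate` producing a fresh estimate each cycle, mirrors the loop described for FIG. 5.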
  • FIGS. 6 and 7 are diagrams each illustrating an example of the hardware configuration of the vehicle position estimating device 6.
  • Each function of the components of the vehicle position estimation device 6 shown in FIG. 1 is realized by, for example, the processing circuit 70 shown in FIG. 6. That is, the vehicle position estimating device 6 includes the processing circuit 70 for calculating first estimated own-vehicle position information indicating the position and azimuth of the own vehicle on a map based on the absolute positioning information of the own vehicle obtained by satellite positioning and the vehicle sensor information obtained from the vehicle sensor of the own vehicle; selecting a road feature to be detected based on the first estimated own-vehicle position information and one or both of the road feature information included in the map data and the imaging sensor information obtained from the imaging sensor of the own vehicle; detecting the relative position and relative azimuth of the selected road feature from the optical ranging information; and calculating second estimated own-vehicle position information indicating the position and azimuth of the own vehicle on the map based on the detected relative position and relative azimuth of the road feature.
  • The processing circuit 70 may be dedicated hardware, or it may be configured using a processor that executes a program stored in a memory (also called a CPU (Central Processing Unit), processing device, arithmetic device, microprocessor, microcomputer, or DSP (Digital Signal Processor)).
  • The processing circuit 70 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • Each function of the components of the vehicle position estimating device 6 may be realized by an individual processing circuit, or the functions may be realized by one processing circuit.
  • FIG. 7 shows an example of a hardware configuration of the vehicle position estimating device 6 when the processing circuit 70 is configured using the processor 71 that executes a program.
  • the functions of the components of the vehicle position estimation device 6 are realized by software or the like (software, firmware, or a combination of software and firmware).
  • Software and the like are described as programs and stored in the memory 72.
  • The processor 71 implements the function of each unit by reading and executing the program stored in the memory 72. That is, the vehicle position estimating device 6 includes the memory 72 for storing a program which, when executed by the processor 71, consequently executes the process of calculating first estimated own-vehicle position information indicating the position and azimuth of the own vehicle on a map based on the absolute positioning information of the own vehicle obtained by satellite positioning and the vehicle sensor information obtained from the vehicle sensor of the own vehicle; the process of selecting a road feature to be detected and detecting its relative position and relative azimuth from the optical ranging information obtained from the optical ranging sensor of the own vehicle; and the process of calculating second estimated own-vehicle position information indicating the position and azimuth of the own vehicle on the map based on the detected relative position and relative azimuth of the road feature. In other words, this program causes a computer to execute the procedures and methods of operation of the components of the vehicle position estimating device 6.
  • The memory 72 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive); a magnetic disk, flexible disk, optical disc, compact disc, mini disc, or DVD (Digital Versatile Disc) and its drive device; or any storage medium to be used in the future.
  • The present invention is not limited to this configuration; some components of the vehicle position estimating device 6 may be realized by dedicated hardware while other components are realized by software or the like.
  • For example, the functions of some components can be realized by the processing circuit 70 as dedicated hardware, while the functions of other components can be realized by the processor 71, as the processing circuit 70, reading and executing a program stored in the memory 72.
  • the vehicle position estimating apparatus 6 can realize the above-described functions by hardware, software, or the like, or a combination thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The invention concerns a vehicle position estimation device (6) in which a vehicle position estimation unit (61) calculates first estimated vehicle position information indicating the position and orientation of a vehicle on a map based on absolute positioning information of the vehicle obtained by satellite positioning and vehicle sensor information obtained from a vehicle sensor of the vehicle. A road feature position/orientation detection unit (62) selects a road feature to be detected based on the first estimated vehicle position information and at least one of road feature information included in map data and imaging sensor information obtained from an imaging sensor of the vehicle. Further, the road feature position/orientation detection unit (62) detects the relative position and relative orientation of the selected road feature with respect to the vehicle from optical ranging information obtained from an optical ranging sensor of the vehicle, and calculates second estimated vehicle position information indicating the position and orientation of the vehicle on the map based on the detected relative position and relative orientation of the road feature.
PCT/JP2018/027495 2018-07-23 2018-07-23 Vehicle position estimation device and vehicle position estimation method WO2020021596A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/027495 WO2020021596A1 (fr) 2018-07-23 2018-07-23 Vehicle position estimation device and vehicle position estimation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/027495 WO2020021596A1 (fr) 2018-07-23 2018-07-23 Vehicle position estimation device and vehicle position estimation method

Publications (1)

Publication Number Publication Date
WO2020021596A1 (fr) 2020-01-30

Family

ID=69181450

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/027495 WO2020021596A1 (fr) 2018-07-23 2018-07-23 Vehicle position estimation device and vehicle position estimation method

Country Status (1)

Country Link
WO (1) WO2020021596A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115132061A (zh) * 2021-03-25 2022-09-30 本田技研工业株式会社 Map generation device, map generation system, map generation method, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008129867A (ja) * 2006-11-21 2008-06-05 Toyota Motor Corp Driving support device
JP2016176769A (ja) * 2015-03-19 2016-10-06 クラリオン株式会社 Information processing device and vehicle position detection method



Similar Documents

Publication Publication Date Title
US10788830B2 (en) Systems and methods for determining a vehicle position
US10384679B2 (en) Travel control method and travel control apparatus
WO2016203515A1 (fr) Travel lane determination device and travel lane determination method
US20180154901A1 (en) Method and system for localizing a vehicle
CN111507130B (zh) Lane-level positioning method and system, computer device, vehicle, and storage medium
CN111025308B (zh) Vehicle positioning method, device, system, and storage medium
JP7113134B2 (ja) Vehicle control device
WO2018212292A1 (fr) Information processing device, control method, program, and storage medium
US11983890B2 (en) Method and apparatus with motion information estimation
JP2018189463A (ja) Vehicle position estimation device and program
EP3994043A1 (fr) Lateral offset obtained for ADAS or AD features
CN114670840A (zh) Blind spot estimation device, vehicle travel system, and blind spot estimation method
US11908206B2 (en) Compensation for vertical road curvature in road geometry estimation
JP2023527898A (ja) Method and device for processing sensor data
WO2020021596A1 (fr) Vehicle position estimation device and vehicle position estimation method
WO2021033312A1 (fr) Information output device, automated driving device, and information output method
JP7209912B2 (ja) Driving support control device and driving support control method
CN114755663A (zh) Vehicle sensor extrinsic parameter calibration method and device, and computer-readable storage medium
US20230020069A1 (en) A camera system for a mobile device, a method for localizing a camera and a method for localizing multiple cameras
JP2018185156A (ja) Target position estimation method and target position estimation device
WO2018212290A1 (fr) Information processing device, control method, program, and storage medium
JP6929493B2 (ja) Map data processing system and map data processing method
RU2777308C1 (ru) Self-position estimation method and self-position estimation device
EP3835724B1 (fr) Self-localization estimation method and self-localization estimation device
US12018946B2 (en) Apparatus, method, and computer program for identifying road being traveled

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18927800

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18927800

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP