WO2017009923A1 - Self-position estimation device and self-position estimation method - Google Patents
Self-position estimation device and self-position estimation method
- Publication number
- WO2017009923A1 (PCT/JP2015/070008)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target position
- target
- self
- straight line
- vehicle
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/04—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by terrestrial means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- The present invention relates to a self-position estimation device and a self-position estimation method for estimating the self position of a vehicle.
- Patent Literature 1 is based on the premise that the sensor information collated with the map information for self-position estimation lies within a predetermined distance of the most recent self position.
- When a straight path continues for a predetermined distance or more and the target information consists only of straight lines parallel to the path, a degree of freedom remains along the direction of those lines, so there is a risk that the accuracy of self-position estimation decreases.
- In view of this, an object of the present invention is to provide a self-position estimation apparatus and a self-position estimation method that can improve the accuracy of self-position estimation.
- The self-position estimation device selects, from the straight lines obtained from surrounding targets, the target position data to be used for self-position estimation based on the angles formed by mutually intersecting straight lines, and estimates the self position of the vehicle by collating the selected target position data with the positions of targets in the map information.
- In this way, a self-position estimation apparatus and a self-position estimation method that improve the accuracy of self-position estimation can be provided.
- FIG. 1 is a block diagram illustrating an example of a configuration of a self-position estimation apparatus according to an embodiment of the present invention.
- FIG. 2 is an example illustrating a laser range finder and a camera mounted on a vehicle.
- FIG. 3 is a flowchart for explaining the processing flow of the self-position estimation apparatus according to the embodiment of the present invention.
- FIG. 4 is an example illustrating an environment in which a vehicle on which a self-position estimation apparatus according to an embodiment of the present invention is mounted travels.
- FIG. 5 is a diagram illustrating a region specified by a region specifying unit included in the self-position estimation apparatus according to the embodiment of the present invention.
- FIG. 6 is a diagram for explaining processing by the target position detection unit provided in the self-position estimation apparatus according to the embodiment of the present invention.
- FIG. 7 is a diagram for explaining processing by the movement amount detection unit provided in the self-position estimation apparatus according to the embodiment of the present invention.
- FIG. 8 is a diagram for explaining processing by the straight line extraction unit provided in the self-position estimation apparatus according to the embodiment of the present invention.
- FIG. 9 is an example illustrating straight line information extracted by the straight line extraction unit while the vehicle travels.
- FIG. 10 is a table showing the straight line information extracted by the straight line extraction unit and the acquisition time.
- FIG. 11 is a table illustrating a state in which priorities are set based on angles formed by straight lines intersecting each other for combinations of straight line information.
- FIG. 12 is a table for explaining a state in which priorities are set according to the angle formed by the straight lines intersecting each other and the acquisition time for each straight line information.
- FIG. 1 is a diagram for explaining the configuration of the self-position estimation apparatus according to the present embodiment.
- The self-position estimation apparatus according to the present embodiment includes an ambient sensor group 1, a processing device 3, a storage device 4, and a vehicle sensor group 5.
- The self-position estimation apparatus according to the present embodiment is mounted on a vehicle V (see FIG. 2) and estimates the self position of the vehicle V.
- As position and attitude information of the vehicle, a total of three degrees of freedom on a two-dimensional plane are estimated: the position in the east-west direction (X-axis direction; X coordinate [m]), the position in the north-south direction (Y-axis direction; Y coordinate [m]), and the azimuth angle θ of the vehicle (yaw angle [rad]).
- The ambient sensor group 1 includes, for example, a plurality of laser range finders (LRFs) 101 and 102, which detect the distance to an object from the reflection of emitted laser light, and a plurality of cameras 201 and 202, which capture digital images that can be processed. The ambient sensor group 1 thus comprises a plurality of sensors that each detect targets existing around the vehicle V.
- FIG. 2 is an example illustrating a state in which the surrounding sensor group 1 is mounted on the vehicle V.
- The LRFs 101 and 102 can be mounted, for example, near the front fenders on the left and right sides of the vehicle V.
- The LRFs 101 and 102 scan over a predetermined scanning angle (for example, 90°), with the front-rear direction D of the vehicle V as the rotation axis, so that the locus of the emitted laser beam forms a plane perpendicular to the road surface.
- The LRFs 101 and 102 can thereby detect targets such as curbs that exist to the left and right of the vehicle V.
- The LRFs 101 and 102 sequentially output the detected target shapes to the processing device 3 as detection results.
- The cameras 201 and 202 can be mounted, for example, on the door mirrors on the left and right sides of the vehicle V.
- The cameras 201 and 202 capture images using a solid-state image sensor such as a CCD or CMOS.
- The cameras 201 and 202 photograph the road surface to the side of the vehicle V.
- The cameras 201 and 202 sequentially output the captured images to the processing device 3.
- The storage device 4 is a map information storage unit that stores map information 41 including the positions of targets existing around the road.
- The storage device 4 can be composed of a semiconductor memory, a magnetic disk, or the like.
- The targets recorded in the map information 41 include, for example, road markings such as stop lines, pedestrian crossings, pedestrian crossing notices and lane markings, structures such as curbs, and various other features that can be detected by the sensors of the ambient sensor group 1.
- In the map information 41, even a target that actually has height, such as a curb, is described only by position information on a two-dimensional plane. Position information for curbs, white lines and the like is defined as a collection of straight-line pieces, each holding the two-dimensional positions of its two end points.
- Where the shape in the real environment is a curve, the map information 41 describes it as straight-line information approximated by a broken line on the two-dimensional plane.
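- As a minimal sketch of this map representation (the class name, field names, and coordinate values below are illustrative assumptions, not taken from the patent), the map can be held as a list of two-dimensional segments, with curves approximated by consecutive pieces:

```python
from dataclasses import dataclass

@dataclass
class MapSegment:
    # Two-dimensional end points of one straight-line piece of a target.
    x1: float
    y1: float
    x2: float
    y2: float
    kind: str  # e.g. "curb", "lane_marking", "stop_line"

# A curved curb approximated by a broken line of two segments,
# plus one lane marking; the coordinates are made-up example values.
map_info = [
    MapSegment(0.0, 0.0, 10.0, 0.2, "curb"),
    MapSegment(10.0, 0.2, 18.0, 1.5, "curb"),
    MapSegment(0.0, 3.5, 20.0, 3.5, "lane_marking"),
]
```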
- The vehicle sensor group 5 includes a GPS receiver 51, an accelerator sensor 52, a steering sensor 53, a brake sensor 54, a vehicle speed sensor 55, an acceleration sensor 56, a wheel speed sensor 57, and other sensors 58 such as a yaw rate sensor.
- The sensors 51 to 58 are connected to the processing device 3 and sequentially output their detection results to the processing device 3.
- The processing device 3 can calculate the approximate position of the vehicle V in the map information 41 using the detection results of the vehicle sensor group 5, and can calculate the odometry, which indicates the amount of movement of the vehicle V per unit time.
- The processing device 3 includes a target position detection unit 31, a movement amount detection unit 32, a target position accumulation unit 33, a straight line extraction unit 34, a target position selection unit 35, and a self-position estimation unit 36.
- The processing device 3 can be configured, for example, as a microcontroller: an integrated circuit including a central processing unit (CPU), memory, and input/output interfaces.
- The CPU executes a computer program preinstalled in the microcontroller, thereby realizing the information processing units (31 to 36) that constitute the processing device 3.
- The units constituting the processing device 3 may be implemented as a single piece of hardware or as separate pieces of hardware.
- The microcontroller may also serve as an electronic control unit (ECU) used for other control of the vehicle V, such as automatic driving control.
- The target position detection unit 31 detects the position, relative to the vehicle V, of each target existing around the vehicle V, based on the detection results of at least one of the LRFs 101 and 102 and the cameras 201 and 202.
- The position detected by the target position detection unit 31 is a position in the vehicle coordinate system.
- In the vehicle coordinate system, for example, the center of the rear wheel axle of the vehicle V is the origin, the forward direction is the positive x-axis direction, the left direction is the positive y-axis direction, and the upward direction is the positive z-axis direction.
- Conversion formulas from the coordinate systems of the LRFs 101 and 102 and the cameras 201 and 202 to the vehicle coordinate system are set in the target position detection unit 31 in advance, as are the road surface parameters in the vehicle coordinate system.
- The movement amount detection unit 32 detects the odometry, i.e. the movement amount of the vehicle V per unit time, based on the detection results of at least one of the sensors included in the vehicle sensor group 5.
- The movement amount of the vehicle V is detected as a movement amount in the odometry coordinate system.
- The target position accumulation unit 33 converts the target positions detected by the target position detection unit 31, based on the movement amount detected by the movement amount detection unit 32, and accumulates them as target position data.
- The straight line extraction unit 34 extracts straight line information from the target position data accumulated by the target position accumulation unit 33.
- The target position selection unit 35 selects target position data from the straight lines indicated by the straight line information extracted by the straight line extraction unit 34, based on the angle of 90° or less among the angles formed by mutually intersecting straight lines.
- The self-position estimation unit 36 estimates the self position of the vehicle V by collating the target position data selected by the target position selection unit 35 with the positions of targets in the map information 41.
- The target position selection unit 35 sets a priority for each piece of straight line information such that pairs whose angle of 90° or less, among the angles formed by mutually intersecting straight lines, is larger receive a higher priority, and selects target position data in descending order of the set priority.
- The target position selection unit 35 selects, from the target position data stored in the target position accumulation unit 33, at least the target position data corresponding to the pair of straight line information whose angle of 90° or less, among the angles formed by mutually intersecting straight lines, is the largest.
- The target position selection unit 35 also sets a priority for each piece of straight line information such that newer acquisition times of the corresponding target position data give a higher priority, and selects target position data in descending order of the set priority.
- The priority set for each piece of straight line information is updated sequentially, based on the acquisition times of the target position data and on the angles formed by the extension lines of the straight line information.
- The target position accumulation unit 33 determines which target position data to retain based on the priorities set for the straight line information by the target position selection unit 35: target position data corresponding to high-priority straight line information is retained preferentially, and target position data corresponding to low-priority straight line information is deleted preferentially.
- In step S10, the target position detection unit 31 detects the positions of targets existing around the vehicle V based on the detection results of the ambient sensor group 1.
- The target position detection unit 31 acquires the detection results of the LRFs 101 and 102 and the cameras 201 and 202, and detects, in the vehicle coordinate system, the positions of road markings such as lane markings and stop lines and of structures such as curbs and buildings.
- FIG. 4 is an example illustrating an environment in which the vehicle V travels when performing self-position estimation.
- The laser light emitted from the LRF 101 strikes the road surface, including the curb 61, along a line 64.
- The target position detection unit 31 extracts, from the direction and distance of the emitted laser light, the locations where the measured shape changes sharply as the position of the curb 61, and detects that position (x, y, z) in the vehicle coordinate system. Since a road surface can be assumed to exist vertically below the LRFs 101 and 102, the curb can be detected by extracting the points whose height changes greatly relative to the road surface.
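- A minimal sketch of this height-jump extraction (the function name and the 0.10 m threshold are assumptions for illustration, not values given in the patent):

```python
import numpy as np

def detect_curb_candidates(scan_xyz, height_jump=0.10):
    """Return scan points where the height profile steps sharply.

    scan_xyz: (N, 3) array of points in the vehicle coordinate system,
    ordered along the scan plane. Neighbouring points whose height
    differs by more than height_jump metres are taken as curb edges.
    """
    dz = np.abs(np.diff(scan_xyz[:, 2]))   # height change between neighbours
    edges = np.where(dz > height_jump)[0]  # indices with a large step
    return scan_xyz[edges]
```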
- The target position detection unit 31 detects the white lines 62 and 63 existing on both sides of the vehicle V from the luminance information of the images captured by the cameras 201 and 202.
- The target position detection unit 31 detects, in the grayscale images captured by the cameras 201 and 202, patterns whose luminance changes in the order dark, bright, dark, and takes the center of each bright section as the white line 62 or 63.
- The positions (x, y, z) of the white lines 62 and 63 in the vehicle coordinate system can then be detected from the positional relationship between the cameras 201 and 202 and the road surface.
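- A sketch of the dark-bright-dark search on one grayscale image row (the contrast threshold is an assumed tuning value; mapping a pixel column back to a road-surface position would use the camera-to-road geometry mentioned above):

```python
import numpy as np

def white_line_centers(row, min_contrast=60):
    """Return the centre columns of bright runs in one grayscale row.

    A run is counted where the luminance rises above the row average
    by more than min_contrast (dark -> bright) and later falls back
    (bright -> dark); the centre of each run approximates the white
    line centre described above.
    """
    bright = row > row.mean() + min_contrast
    centers, start = [], None
    for i, b in enumerate(bright):
        if b and start is None:
            start = i                             # dark -> bright edge
        elif not b and start is not None:
            centers.append((start + i - 1) // 2)  # centre of the bright run
            start = None
    if start is not None:                         # run touching the row end
        centers.append((start + len(row) - 1) // 2)
    return centers
```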
- The positions (x, y, z) in the vehicle coordinate system detected in step S10 are treated as two-dimensional data after the height information (z-axis component) is excluded.
- FIGS. 5(a) to 5(d) show the position 71 of the curb 61 and the positions 72 and 73 of the white lines 62 and 63 in the vehicle coordinate system, as detected by the target position detection unit 31 during times t1 to t4 in the example shown in FIG. 4; t1 is the oldest time and t4 the newest.
- In step S20, the movement amount detection unit 32 calculates the position of the vehicle V in the odometry coordinate system by integrating the movement amounts of the vehicle V calculated from the detection results of the vehicle sensor group 5.
- For example, the position of the vehicle V at the time when power is supplied to the self-position estimation apparatus, or when the process is reset, may be taken as the origin, with the azimuth angle of the vehicle V set to 0°.
- The integration of the movement amount of the vehicle V is performed in the odometry coordinate system.
- FIG. 6 is a diagram showing a result of integrating the movement amount of the vehicle V calculated based on the detection result by the vehicle sensor group 5 in the examples shown in FIGS. 5 (a) to 5 (d).
- The movement amount detection unit 32 calculates the position (Xo, Yo) of the vehicle V in the odometry coordinate system.
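- A minimal dead-reckoning sketch of this integration (a simple Euler step; the patent does not specify the integration scheme, so the form and names here are assumptions):

```python
import math

def integrate_odometry(pose, v, yaw_rate, dt):
    """One odometry update in the odometry coordinate system.

    pose: (Xo, Yo, theta); v: vehicle speed [m/s] from the vehicle
    speed / wheel speed sensors; yaw_rate [rad/s] from the yaw rate
    sensor; dt [s]: sampling period.
    """
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += yaw_rate * dt
    return (x, y, theta)
```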
- FIG. 7 is a diagram showing the target position data converted into the odometry coordinate system in the examples shown in FIGS. 5 and 6.
- The target position accumulation unit 33 converts the target positions detected in step S10 into the odometry coordinate system based on the movement amount detected in step S20, and accumulates them as target position data.
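- The conversion itself is a standard two-dimensional rigid transform; a sketch (names illustrative), given the vehicle pose in the odometry frame at the detection time:

```python
import math

def vehicle_to_odometry(point, pose):
    """Convert a target position (x, y) from the vehicle coordinate
    system into the odometry coordinate system, using the vehicle
    pose (Xo, Yo, theta) at the detection time."""
    px, py = point
    x, y, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    return (x + c * px - s * py, y + s * px + c * py)
```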
- The straight line extraction unit 34 extracts straight line information from the target position data accumulated by the target position accumulation unit 33.
- The straight line extraction unit 34 extracts straight lines from the target position data acquired between times t0 and t0 + Δt, where Δt is the unit time.
- The number of target position data points that can be acquired during the unit time Δt is determined by the sampling periods of the LRFs 101 and 102 and of the cameras 201 and 202.
- The straight line extraction unit 34 divides the target position data into the right side and the left side of the vehicle and estimates the straight line parameters for each side.
- FIG. 8 is a diagram showing the straight line information N1, N2, and N3 extracted by the straight line extraction unit 34 in the example shown in FIG. 7.
- Each straight line can be estimated by obtaining its optimum parameters a, b, and c (for a line model such as ax + by + c = 0).
- The straight line extraction unit 34 obtains the sum of the distances between an estimated straight line and the points used for its estimation, and does not extract the straight line information if this sum exceeds a predetermined threshold.
- The straight line extraction unit 34 computes, for each point used in the estimation, the closest point on the straight line, selects the two such points that are farthest apart, and extracts them as the end points of the straight line information: 71a and 71d are extracted as the end points of straight line information N1, 72a and 72d as the end points of N2, and 73a and 73d as the end points of N3.
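- The following sketch combines the three steps above: fitting a line ax + by + c = 0, rejecting fits whose summed point-to-line distance exceeds a threshold, and taking the extreme foot points as end points. The total-least-squares estimator and the threshold value are assumed details that the patent leaves open:

```python
import numpy as np

def fit_line_segment(points, max_residual_sum=0.5):
    """Fit ax + by + c = 0 to 2-D target position data and return
    ((a, b, c), end_point_1, end_point_2), or None if the points are
    not straight enough to extract straight line information."""
    mean = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - mean)   # total least squares
    direction, normal = vt[0], vt[1]          # along-line / normal unit vectors
    a, b = normal
    c = -float(normal @ mean)
    residuals = np.abs(points @ normal + c)   # perpendicular distances
    if residuals.sum() > max_residual_sum:
        return None                           # sum over threshold: no extraction
    t = (points - mean) @ direction           # positions along the line
    p1 = mean + t.min() * direction           # the two extreme foot points
    p2 = mean + t.max() * direction           # become the segment end points
    return (float(a), float(b), c), p1, p2
```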
- The target position accumulation unit 33 accumulates the point group (target position data) used for the estimation of each straight line, the straight line information, and the acquisition time t0 + Δt in association with one another.
- The target position selection unit 35 sets the priority of each piece of straight line information such that a larger intersection angle arg (the angle of 90° or less between the lines) gives a higher priority.
- When the priorities by angle are equal, the target position selection unit 35 may set the priority of each piece of straight line information such that newer acquisition times of the target position data give higher priority.
- FIG. 9 is a diagram illustrating a case where five pieces of straight line information a to e are extracted by the straight line extraction unit 34 while the vehicle V travels.
- The straight line information a, b, c, d, and e is acquired at times t1, t2, t3, t4, and t5, respectively, where t1 is the oldest time and t5 the newest.
- As shown in FIG. 11, the target position selection unit 35 first obtains, for each combination of the straight line information a to e, the angle of 90° or less formed by their extension lines, and sets the priorities so that larger angles rank higher. The priority for each combination of the straight line information a to e is thereby determined as shown in FIG. 11.
- The target position selection unit 35 then raises the priority of each piece of straight line information the newer the acquisition time of its straight line information or target position data. For example, the straight line information a, b, and d, given the highest priority in FIG. 11, were acquired at times t1, t2, and t4 respectively, so the per-line priority becomes d, b, a from the highest, as shown in FIG. 12. The straight line information e and c, given the next highest priority after a, b, and d in FIG. 11, are ranked in the same way.
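- A sketch of this two-stage ordering, angle first and recency second; scoring each line by its best pairing is an assumed reading of the example above:

```python
import math

def pair_angle(d1, d2):
    """Angle of 90 degrees or less between two line directions d1, d2
    (unit vectors), i.e. arg_ij in the description."""
    a = math.degrees(math.acos(max(-1.0, min(1.0, d1[0] * d2[0] + d1[1] * d2[1]))))
    return 180.0 - a if a > 90.0 else a

def prioritize(lines):
    """lines: list of (line_id, direction, acquisition_time).
    Returns line ids ordered by the largest intersection angle each
    line achieves with any other line, ties broken by newer
    acquisition time."""
    best = {lid: 0.0 for lid, _, _ in lines}
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            ang = pair_angle(lines[i][1], lines[j][1])
            best[lines[i][0]] = max(best[lines[i][0]], ang)
            best[lines[j][0]] = max(best[lines[j][0]], ang)
    times = {lid: t for lid, _, t in lines}
    return sorted(best, key=lambda lid: (best[lid], times[lid]), reverse=True)
```

- Keeping only the leading entries of this ordering when storage is limited mirrors how the target position accumulation unit 33 prunes low-priority data, as described in steps S40 and S50.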
- In step S50, the target position accumulation unit 33 stores the straight line information and the target position data in association with each other, according to the priorities set in step S40.
- When the storage capacity is limited, the target position accumulation unit 33 may delete straight line information starting from the lowest priority.
- In step S60, the self-position estimation unit 36 estimates the self position of the vehicle V by collating the target position data stored in the target position accumulation unit 33 with the positions of targets in the map information 41. That is, the self-position estimation unit 36 estimates the position and attitude angle with a total of three degrees of freedom: the position of the vehicle V in the east-west direction (X coordinate), the position in the north-south direction (Y coordinate), and the azimuth angle (yaw angle θ).
- The self-position estimation unit 36 matches the target position data, for example the end points at both ends of a lane marking, to the target positions included in the map information 41, using them as evaluation points.
- Since target position data closer to the vehicle V (the ambient sensor group 1) is less affected by odometry error, the self-position estimation unit 36 can densify the evaluation points near the vehicle V by linear interpolation while using fewer evaluation points far from the vehicle V.
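- The patent does not name the optimiser used for the collation in step S60; as an assumed stand-in, a brute-force search around the odometry pose that minimises the summed distance from the evaluation points to the nearest map segments illustrates the idea:

```python
import math
import numpy as np

def seg_dist(p, a, b):
    """Distance from point p to the segment a-b (2-D numpy arrays)."""
    ab, ap = b - a, p - a
    t = np.clip(ap @ ab / (ab @ ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def estimate_pose(eval_pts, map_segs, guess, span=(1.0, 1.0, math.radians(3)), steps=7):
    """Search (X, Y, yaw) near the odometry guess for the pose that
    minimises the summed point-to-map distance of the evaluation
    points. eval_pts: (N, 2) array in the vehicle frame; map_segs:
    list of (a, b) end-point pairs; span and steps are assumed
    search-window values."""
    best, best_cost = guess, float("inf")
    for x in np.linspace(guess[0] - span[0], guess[0] + span[0], steps):
        for y in np.linspace(guess[1] - span[1], guess[1] + span[1], steps):
            for th in np.linspace(guess[2] - span[2], guess[2] + span[2], steps):
                c, s = math.cos(th), math.sin(th)
                world = eval_pts @ np.array([[c, s], [-s, c]]) + np.array([x, y])
                cost = sum(min(seg_dist(p, a, b) for a, b in map_segs) for p in world)
                if cost < best_cost:
                    best, best_cost = (x, y, th), cost
    return best
```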
- Self-position estimation can thus be performed using the target position data corresponding to the pair of straight line information whose angle of 90° or less, among the angles formed by intersecting straight lines, is the largest. The self-position estimation apparatus according to the present embodiment can thereby select the most reliable data from among the target position data needed to estimate the self position uniquely.
- Each piece of straight line information is prioritized so that pairs with a larger angle of 90° or less among the angles formed by intersecting straight lines receive a higher priority. Priorities are thus assigned to the target position data in order of usefulness for self-position estimation, and the accuracy of self-position estimation of the vehicle can be improved.
- A priority is also set for each piece of straight line information such that newer acquisition times of the corresponding target position data give higher priority.
- The self-position estimation apparatus according to the present embodiment can thus select the most recent target position data, which contributes greatly to self-position estimation, and can improve the accuracy of self-position estimation of the vehicle.
- The target position accumulation unit 33 determines the target position data to be accumulated based on the set priorities.
- When data must be deleted for reasons such as limited storage capacity, the self-position estimation apparatus according to the present embodiment deletes data in ascending order of priority, and can thus continue to hold the target position data necessary for self-position estimation.
- As described above, a self-position estimation apparatus and a self-position estimation method can be provided that improve the accuracy of self-position estimation by selecting the data used for self-position estimation based on the angles formed by straight lines obtained from surrounding targets.
- V vehicle; 4 storage device (map information storage unit); 31 target position detection unit; 32 movement amount detection unit; 33 target position accumulation unit; 34 straight line extraction unit; 35 target position selection unit; 36 self-position estimation unit; 41 map information
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Aviation & Aerospace Engineering (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
FIG. 1 is a diagram explaining the configuration of the self-position estimation apparatus according to the present embodiment. The self-position estimation apparatus according to the present embodiment includes an ambient sensor group 1, a processing device 3, a storage device 4, and a vehicle sensor group 5. It is mounted on a vehicle V (see FIG. 2) and estimates the self position of the vehicle V.
An example of the self-position estimation method using the self-position estimation apparatus according to the present embodiment is described with reference to the flowchart of FIG. 3.
a_ij = cos⁻¹((V_i · V_j) / (|V_i| |V_j|))
arg_ij = 180° − a_ij (when a_ij exceeds 90°)
arg_ij = a_ij (when a_ij is 90° or less)
4 storage device (map information storage unit)
31 target position detection unit
32 movement amount detection unit
33 target position accumulation unit
34 straight line extraction unit
35 target position selection unit
36 self-position estimation unit
41 map information
Claims (6)
- 1. A self-position estimation device comprising: a target position detection unit that detects the positions of targets existing around a vehicle; a movement amount detection unit that detects the movement amount of the vehicle; a target position accumulation unit that accumulates, as target position data, the target positions detected by the target position detection unit, based on the movement amount detected by the movement amount detection unit; a straight line extraction unit that extracts straight lines from the target position data accumulated by the target position accumulation unit; a target position selection unit that selects the target position data from the straight lines extracted by the straight line extraction unit, based on the magnitude of the angles formed by mutually intersecting straight lines; a map information storage unit that stores map information including the positions of targets; and a self-position estimation unit that estimates the self position of the vehicle by collating the target position data selected by the target position selection unit with the positions of the targets in the map information.
- 2. The self-position estimation device according to claim 1, wherein the target position selection unit selects at least the target position data corresponding to the pair of straight line information whose angle of 90° or less, among the angles formed by mutually intersecting straight lines, is the largest.
- 3. The self-position estimation device according to claim 1 or 2, wherein the target position selection unit sets a priority for each piece of straight line information such that pairs whose angle of 90° or less among the angles formed by mutually intersecting straight lines is larger receive a higher priority, and selects target position data in descending order of the set priority.
- 4. The self-position estimation device according to claim 3, wherein the target position selection unit sets a priority for each piece of straight line information such that newer acquisition times of the corresponding target position data give a higher priority, and selects target position data in descending order of the set priority.
- 5. The self-position estimation device according to claim 3 or 4, wherein the target position accumulation unit determines the target position data to be accumulated based on the priorities.
- 6. A self-position estimation method comprising: detecting the positions of targets existing around a vehicle; detecting the movement amount of the vehicle; converting the detected target positions based on the detected movement amount and accumulating them as target position data; extracting straight lines from the accumulated target position data; selecting the target position data from the extracted straight lines based on the magnitude of the angles formed by mutually intersecting straight lines; and estimating the self position of the vehicle by collating the selected target position data with the positions of targets in map information.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/070008 WO2017009923A1 (ja) | 2015-07-13 | 2015-07-13 | Self-position estimation device and self-position estimation method |
RU2018105102A RU2669652C1 (ru) | 2015-07-13 | 2015-07-13 | Own-position estimation device and own-position estimation method |
US15/743,853 US10145693B2 (en) | 2015-07-13 | 2015-07-13 | Own-position estimation device and own-position estimation method |
MX2018000438A MX366083B (es) | 2015-07-13 | 2015-07-13 | Own-position estimation device and own-position estimation method |
CN201580081657.8A CN107850446B (zh) | 2015-07-13 | 2015-07-13 | Self-position estimation device and self-position estimation method |
BR112018000704-8A BR112018000704B1 (pt) | 2015-07-13 | 2015-07-13 | Own-position estimation device and own-position estimation method |
EP15898239.7A EP3324152B1 (en) | 2015-07-13 | 2015-07-13 | Own-position estimating device and own-position estimating method |
KR1020187000810A KR101887335B1 (ko) | 2015-07-13 | 2015-07-13 | Self-position estimation device and self-position estimation method |
JP2017528030A JP6477882B2 (ja) | 2015-07-13 | 2015-07-13 | Self-position estimation device and self-position estimation method |
CA2992006A CA2992006C (en) | 2015-07-13 | 2015-07-13 | Own-position estimation device and own-position estimation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/070008 WO2017009923A1 (ja) | 2015-07-13 | 2015-07-13 | Self-position estimation device and self-position estimation method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017009923A1 true WO2017009923A1 (ja) | 2017-01-19 |
Family
ID=57757193
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/070008 WO2017009923A1 (ja) | 2015-07-13 | 2015-07-13 | Self-position estimation device and self-position estimation method |
Country Status (10)
Country | Link |
---|---|
US (1) | US10145693B2 (ja) |
EP (1) | EP3324152B1 (ja) |
JP (1) | JP6477882B2 (ja) |
KR (1) | KR101887335B1 (ja) |
CN (1) | CN107850446B (ja) |
BR (1) | BR112018000704B1 (ja) |
CA (1) | CA2992006C (ja) |
MX (1) | MX366083B (ja) |
RU (1) | RU2669652C1 (ja) |
WO (1) | WO2017009923A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2019109331A (ja) * | 2017-12-18 | 2019-07-04 | Pioneer Corporation | Map data structure |
- JP2019109332A (ja) * | 2017-12-18 | 2019-07-04 | Pioneer Corporation | Map data structure |
- JP2020507072A (ja) * | 2017-01-27 | 2020-03-05 | Kaarta, Inc. | Laser scanner with real-time online ego-motion estimation |
US11398075B2 (en) | 2018-02-23 | 2022-07-26 | Kaarta, Inc. | Methods and systems for processing and colorizing point clouds and meshes |
US11506500B2 (en) | 2016-03-11 | 2022-11-22 | Kaarta, Inc. | Aligning measured signal data with SLAM localization data and uses thereof |
US11567201B2 (en) | 2016-03-11 | 2023-01-31 | Kaarta, Inc. | Laser scanner with real-time, online ego-motion estimation |
US11573325B2 (en) | 2016-03-11 | 2023-02-07 | Kaarta, Inc. | Systems and methods for improvements in scanning and mapping |
US11585662B2 (en) | 2016-03-11 | 2023-02-21 | Kaarta, Inc. | Laser scanner with real-time, online ego-motion estimation |
US11815601B2 (en) | 2017-11-17 | 2023-11-14 | Carnegie Mellon University | Methods and systems for geo-referencing mapping systems |
US11830136B2 (en) | 2018-07-05 | 2023-11-28 | Carnegie Mellon University | Methods and systems for auto-leveling of point clouds and 3D models |
US12014533B2 (en) | 2018-04-03 | 2024-06-18 | Carnegie Mellon University | Methods and systems for real or near real-time point cloud map data confidence evaluation |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- MX364577B (es) * | 2015-08-28 | 2019-05-02 | Nissan Motor | Vehicle position estimation device, vehicle position estimation method |
- KR101847836B1 (ko) * | 2015-12-24 | 2018-04-11 | Hyundai Motor Company | Road boundary detection system and method, and vehicle using the same |
US10625746B2 (en) * | 2016-07-26 | 2020-04-21 | Nissan Motor Co., Ltd. | Self-position estimation method and self-position estimation device |
US10317901B2 (en) | 2016-09-08 | 2019-06-11 | Mentor Graphics Development (Deutschland) Gmbh | Low-level sensor fusion |
US10678240B2 (en) | 2016-09-08 | 2020-06-09 | Mentor Graphics Corporation | Sensor modification based on an annotated environmental model |
US10585409B2 (en) * | 2016-09-08 | 2020-03-10 | Mentor Graphics Corporation | Vehicle localization with map-matched sensor measurements |
US11067996B2 (en) | 2016-09-08 | 2021-07-20 | Siemens Industry Software Inc. | Event-driven region of interest management |
US10650270B2 (en) * | 2017-04-21 | 2020-05-12 | X Development Llc | Methods and systems for simultaneous localization and calibration |
US20180314253A1 (en) | 2017-05-01 | 2018-11-01 | Mentor Graphics Development (Deutschland) Gmbh | Embedded automotive perception with machine learning classification of sensor data |
US11145146B2 (en) | 2018-01-31 | 2021-10-12 | Mentor Graphics (Deutschland) Gmbh | Self-diagnosis of faults in an autonomous driving system |
US10553044B2 (en) | 2018-01-31 | 2020-02-04 | Mentor Graphics Development (Deutschland) Gmbh | Self-diagnosis of faults with a secondary system in an autonomous driving system |
- WO2020008221A1 (ja) * | 2018-07-04 | 2020-01-09 | Nissan Motor Co., Ltd. | Driving assistance method and driving assistance device |
- WO2021205193A1 (ja) * | 2020-04-08 | 2021-10-14 | Nissan Motor Co., Ltd. | Map information correction method, driving assistance method, and map information correction device |
- US11872965B2 (en) * | 2020-05-11 | 2024-01-16 | Hunter Engineering Company | System and method for gyroscopic placement of vehicle ADAS targets |
- JP7241839B1 (ja) | 2021-10-06 | 2023-03-17 | Mitsubishi Electric Corporation | Self-position estimation device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2007303841A (ja) * | 2006-05-08 | 2007-11-22 | Toyota Central Res & Dev Lab Inc | Vehicle position estimation device |
- JP2012194860A (ja) * | 2011-03-17 | 2012-10-11 | Murata Mach Ltd | Traveling vehicle |
- JP2013068482A (ja) * | 2011-09-21 | 2013-04-18 | Nec Casio Mobile Communications Ltd | Azimuth correction system, terminal device, server device, azimuth correction method, and program |
- JP2013156034A (ja) * | 2012-01-26 | 2013-08-15 | Toyota Motor Corp | Vehicle travel road identification device and vehicle travel road identification method |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7287884B2 (en) * | 2002-02-07 | 2007-10-30 | Toyota Jidosha Kabushiki Kaisha | Vehicle operation supporting device and vehicle operation supporting system |
- JP4437556B2 (ja) * | 2007-03-30 | 2010-03-24 | Aisin AW Co., Ltd. | Feature information collection device and feature information collection method |
- JP2008250906A (ja) | 2007-03-30 | 2008-10-16 | Sogo Keibi Hosho Co Ltd | Mobile robot, self-position correction method, and self-position correction program |
- CN101968940B (zh) * | 2009-07-28 | 2012-08-22 | 钰程科技股份有限公司 | Handheld device with positioning and photographing capability and geo-positioning method therefor |
- JP5761162B2 (ja) * | 2012-11-30 | 2015-08-12 | Toyota Motor Corporation | Vehicle position estimation device |
- KR102027771B1 (ko) * | 2013-01-31 | 2019-10-04 | Electronics and Telecommunications Research Institute | Vehicle-speed-adaptive obstacle detection device and method |
- JP6233706B2 (ja) * | 2013-04-02 | 2017-11-22 | Panasonic IP Management Co., Ltd. | Autonomous mobile device and self-position estimation method for autonomous mobile device |
- CN103398717B (zh) * | 2013-08-22 | 2016-04-20 | 成都理想境界科技有限公司 | Panoramic map database acquisition system and vision-based positioning and navigation method |
- JP2016176769A (ja) * | 2015-03-19 | 2016-10-06 | Clarion Co., Ltd. | Information processing device and vehicle position detection method |
- 2015
- 2015-07-13 EP EP15898239.7A patent/EP3324152B1/en active Active
- 2015-07-13 CA CA2992006A patent/CA2992006C/en active Active
- 2015-07-13 WO PCT/JP2015/070008 patent/WO2017009923A1/ja active Application Filing
- 2015-07-13 RU RU2018105102A patent/RU2669652C1/ru active
- 2015-07-13 CN CN201580081657.8A patent/CN107850446B/zh active Active
- 2015-07-13 MX MX2018000438A patent/MX366083B/es active IP Right Grant
- 2015-07-13 KR KR1020187000810A patent/KR101887335B1/ko active IP Right Grant
- 2015-07-13 US US15/743,853 patent/US10145693B2/en active Active
- 2015-07-13 BR BR112018000704-8A patent/BR112018000704B1/pt active IP Right Grant
- 2015-07-13 JP JP2017528030A patent/JP6477882B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007303841A (ja) * | 2006-05-08 | 2007-11-22 | Toyota Central Res & Dev Lab Inc | Vehicle position estimation device |
JP2012194860A (ja) * | 2011-03-17 | 2012-10-11 | Murata Mach Ltd | Traveling vehicle |
JP2013068482A (ja) * | 2011-09-21 | 2013-04-18 | Nec Casio Mobile Communications Ltd | Azimuth correction system, terminal device, server device, azimuth correction method, and program |
JP2013156034A (ja) * | 2012-01-26 | 2013-08-15 | Toyota Motor Corp | Vehicle travel road identification device and vehicle travel road identification method |
Non-Patent Citations (1)
Title |
---|
See also references of EP3324152A4 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11573325B2 (en) | 2016-03-11 | 2023-02-07 | Kaarta, Inc. | Systems and methods for improvements in scanning and mapping |
US11585662B2 (en) | 2016-03-11 | 2023-02-21 | Kaarta, Inc. | Laser scanner with real-time, online ego-motion estimation |
US11506500B2 (en) | 2016-03-11 | 2022-11-22 | Kaarta, Inc. | Aligning measured signal data with SLAM localization data and uses thereof |
US11567201B2 (en) | 2016-03-11 | 2023-01-31 | Kaarta, Inc. | Laser scanner with real-time, online ego-motion estimation |
JP2020507072A (ja) * | 2017-01-27 | 2020-03-05 | Kaarta, Inc. | Laser scanner with real-time online ego-motion estimation |
JP7141403B2 (ja) | 2017-01-27 | 2022-09-22 | Kaarta, Inc. | Laser scanner with real-time online ego-motion estimation |
US11815601B2 | 2017-11-17 | 2023-11-14 | Carnegie Mellon University | Methods and systems for geo-referencing mapping systems |
JP7251918B2 (ja) | 2017-12-18 | 2023-04-04 | GeoTechnologies, Inc. | Vehicle position estimation device and vehicle position estimation system |
JP2019109331A (ja) * | 2017-12-18 | 2019-07-04 | Pioneer Corporation | Map data structure |
JP2019109332A (ja) * | 2017-12-18 | 2019-07-04 | Pioneer Corporation | Map data structure |
US11398075B2 (en) | 2018-02-23 | 2022-07-26 | Kaarta, Inc. | Methods and systems for processing and colorizing point clouds and meshes |
US12014533B2 (en) | 2018-04-03 | 2024-06-18 | Carnegie Mellon University | Methods and systems for real or near real-time point cloud map data confidence evaluation |
US11830136B2 (en) | 2018-07-05 | 2023-11-28 | Carnegie Mellon University | Methods and systems for auto-leveling of point clouds and 3D models |
Also Published As
Publication number | Publication date |
---|---|
JPWO2017009923A1 (ja) | 2018-06-14 |
US20180202815A1 (en) | 2018-07-19 |
CN107850446B (zh) | 2019-01-01 |
CN107850446A (zh) | 2018-03-27 |
CA2992006A1 (en) | 2017-01-19 |
BR112018000704A2 (ja) | 2018-09-18 |
KR20180016567A (ko) | 2018-02-14 |
US10145693B2 (en) | 2018-12-04 |
EP3324152A4 (en) | 2018-08-08 |
EP3324152B1 (en) | 2019-04-03 |
EP3324152A1 (en) | 2018-05-23 |
BR112018000704B1 (pt) | 2022-11-29 |
CA2992006C (en) | 2018-08-21 |
RU2669652C1 (ru) | 2018-10-12 |
KR101887335B1 (ko) | 2018-08-09 |
MX2018000438A (es) | 2018-05-17 |
MX366083B (es) | 2019-06-27 |
JP6477882B2 (ja) | 2019-03-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6477882B2 (ja) | Self-position estimation device and self-position estimation method | |
CA2987373C (en) | Position estimation device and position estimation method | |
US9740942B2 (en) | Moving object location/attitude angle estimation device and moving object location/attitude angle estimation method | |
JP6222353B2 (ja) | 物標検出装置及び物標検出方法 | |
JP5966747B2 (ja) | 車両走行制御装置及びその方法 | |
JP6881464B2 (ja) | 自己位置推定方法及び自己位置推定装置 | |
JP6567659B2 (ja) | レーン検出装置およびレーン検出方法 | |
JP7077910B2 (ja) | 区画線検出装置及び区画線検出方法 | |
JP2007300181A (ja) | 周辺認識装置、周辺認識方法、プログラム | |
JP5834933B2 (ja) | 車両位置算出装置 | |
JP6139465B2 (ja) | 物体検出装置、運転支援装置、物体検出方法、および物体検出プログラム | |
JP6834401B2 (ja) | 自己位置推定方法及び自己位置推定装置 | |
JP6838365B2 (ja) | 自己位置推定方法及び自己位置推定装置 | |
JP7261588B2 (ja) | 信号機認識方法及び信号機認識装置 | |
JP2018073275A (ja) | 画像認識装置 | |
CN109309785B (zh) | 拍摄控制装置以及拍摄控制方法 | |
JP7169075B2 (ja) | 撮像制御装置および撮像制御方法 | |
JP5891802B2 (ja) | 車両位置算出装置 | |
JP6564682B2 (ja) | 対象物検出装置、対象物検出方法、及び、対象物検出プログラム | |
JP2021163220A (ja) | 情報処理装置 | |
JP2021163263A (ja) | 情報処理装置 | |
JP2004220281A (ja) | 障害物検出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15898239; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017528030; Country of ref document: JP; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 2992006; Country of ref document: CA. Ref document number: 20187000810; Country of ref document: KR; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 15743853; Country of ref document: US. Ref document number: MX/A/2018/000438; Country of ref document: MX |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2018105102; Country of ref document: RU. Ref document number: 2015898239; Country of ref document: EP |
| REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112018000704 |
| ENP | Entry into the national phase | Ref document number: 112018000704; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20180112 |