EP3521962B1 - Self-position estimation method and self-position estimation device - Google Patents

Self-position estimation method and self-position estimation device

Info

Publication number
EP3521962B1
EP3521962B1 (application EP16917637.7A)
Authority
EP
European Patent Office
Prior art keywords
target
position data
self
target position
moving object
Prior art date
Legal status
Active
Application number
EP16917637.7A
Other languages
German (de)
French (fr)
Other versions
EP3521962A4 (en)
EP3521962A1 (en)
Inventor
Yasuhito Sano
Chikao Tsuchiya
Takuya Nanri
Hiroyuki Takano
Current Assignee
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Publication of EP3521962A4 publication Critical patent/EP3521962A4/en
Publication of EP3521962A1 publication Critical patent/EP3521962A1/en
Application granted granted Critical
Publication of EP3521962B1 publication Critical patent/EP3521962B1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0272Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0088Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Definitions

  • the present invention relates to a self-position estimation method and a self-position estimation device.
  • There has been known a technology of estimating a self-position of an autonomous mobile robot (refer to Patent Literature 1).
  • In Patent Literature 1, a result (surrounding environment information) of detecting a movable region of a mobile robot by means of a sensor is restricted to a region predetermined with reference to the mobile robot, and this restricted surrounding environment information is compared with an environmental map previously stored in the mobile robot, thereby estimating the self-position of the robot.
  • Further examples describing methods for estimating a self-position of a vehicle can be found in Patent Literature 2 and Patent Literature 3.
  • Patent Literature 4 describes a method for enhancing pre-stored map data using data obtained from a moving vehicle.
  • The present invention has been made in light of the above-mentioned problem, and an object of the present invention is to provide a self-position estimation method and a self-position estimation device which improve the estimation accuracy of the self-position by eliminating target position data estimated to have large errors in the relative position.
  • As a result, the estimation accuracy of the self-position can be improved.
  • the self-position estimation device includes a surrounding sensor group 1, a processing unit 3, a storage unit 4, and a vehicle sensor group 5.
  • the self-position estimation device is mounted in a vehicle V (refer to Fig. 2 ), and is configured to estimate a self-position of the vehicle V.
  • In the present embodiment, three degrees of freedom in total are estimated on the two-dimensional plane: a position in the east-west direction (X axial direction) (X coordinate [m]) and a position in the north-south direction (Y axial direction) (Y coordinate [m]) as the self-position of the vehicle V, and an azimuth angle θ of the vehicle (yaw angle [rad]) as the attitude angle data of the vehicle V.
  • the surrounding sensor group 1 includes a plurality of Laser Range Finders (LRFs) 101 and 102 and a plurality of cameras 201 and 202, for example.
  • the Laser Range Finders (LRFs) 101 and 102 are respectively configured to detect a distance and azimuth to a target by receiving light reflected from the target to which laser light is emitted.
  • The cameras 201 and 202 are configured to capture the surroundings of the vehicle V and obtain digital images on which image processing can be performed.
  • the surrounding sensor group 1 is composed of a plurality of sensors respectively configured to detect targets present in surroundings of the vehicle V.
  • the surrounding sensor group 1 may include sonar and/or radar.
  • The targets present in the surroundings of the vehicle V include: targets indicating traveling lane boundaries on the traveling lane in the surroundings of the vehicle V, e.g. white lines, curbs, median strips, guardrails, and reflectors; road surface markings, e.g. stop lines, pedestrian crossings, and speed limit markings; and road structures, e.g. road signs, traffic lights, and utility poles.
  • Fig. 2 shows an example illustrating a state where the surrounding sensor group 1 is mounted in the vehicle V.
  • the LRFs 101 and 102 can be respectively mounted near front fenders of both sides of the vehicle V, for example.
  • The LRFs 101 and 102 are configured to scan laser light over a predetermined scan angle (e.g., 90 degrees) so that the track of the emitted laser light forms, for example, a plane perpendicular to the road surface, with the front-back direction D of the vehicle V as the rotation axis. Consequently, the LRFs 101 and 102 can detect targets, such as curbs, which are present in the right-left direction of the vehicle V.
  • the LRFs 101 and 102 are configured to sequentially output a shape of the detected target to the processing unit 3 as a detection result.
  • the cameras 201 and 202 can be respectively mounted in door mirrors of both sides of the vehicle V, for example.
  • the cameras 201 and 202 are configured to capture an image by means of solid state imaging elements, e.g. a Charge-Coupled Device (CCD) and a Complementary Metal-Oxide Semiconductor (CMOS), for example.
  • The cameras 201 and 202 are configured to capture the road surface in a lateral direction of the vehicle V, for example.
  • the cameras 201 and 202 are configured to sequentially output the captured image to the processing unit 3.
  • the storage unit 4 is a map information storage unit configured to store map information 41 including position information on targets present on a road or around the road.
  • the storage unit 4 can be composed by including a semiconductor memory, a magnetic disk, or the like.
  • The targets (landmarks) recorded in the map information 41 include, for example, road markings such as stop lines, pedestrian crossings, advance notices of pedestrian crossings, and section lines; structures such as curbs; and various other facilities that can be detected by the surrounding sensor group 1.
  • Even for a target that actually has a three-dimensional structure, such as a curb, the map information 41 describes only position information on a two-dimensional plane.
  • In the map information 41, the position information on targets such as curbs and white lines is defined as an aggregate of pieces of linear information, each having two-dimensional position information on both end points.
  • When the shape in the real environment is a curve, it is described in the map information 41 as linear information on the two-dimensional plane approximated by a polygonal line.
  • the vehicle sensor group 5 includes a GPS receiver 51, an accelerator sensor 52, a steering sensor 53, a brake sensor 54, a vehicle speed sensor 55, an acceleration sensor 56, a wheel speed sensor 57, and other sensors, such as a yaw rate sensor.
  • Each sensor 51 to 57 is connected to the processing unit 3 and is configured to sequentially output various detection results to the processing unit 3.
  • The processing unit 3 can calculate the position of the vehicle V in the map information 41, or can calculate the odometry indicating the moved amount of the vehicle V per unit time, by using the detection results of the vehicle sensor group 5. Various schemes can be considered for measuring the moved amount of the vehicle V, e.g. an odometry measurement method based on the rotational frequency of the tires, an inertial measurement method using a gyroscope or an acceleration sensor, a method of receiving radio waves from satellites such as a Global Navigation Satellite System (GNSS), and Simultaneous Localization and Mapping (SLAM), which estimates the moved amount from changes in the measurement values of external sensors; any of these methods may be used.
  • the processing unit 3 includes: a target position detection unit 31, a moved amount estimation unit 32, a target position storing unit 33, a straight line extracting unit 34, a target position selection unit 35, a self-position estimation unit 36, and a target attribute estimation unit 37.
  • the processing unit 3 can be composed by including a microcontroller which is an integrated circuit provided with a Central Processing Unit (CPU), a memory, an input / output interface (I/F), and the like, for example.
  • a plurality of information processing units (31 to 37) constituting the processing unit 3 are realized by the CPU executing a computer program preinstalled in the microcontroller.
  • Each unit constituting the processing unit 3 may be composed by including integrated hardware or may be composed by including discrete hardware.
  • the microcontroller may also be used as an Electronic Control Unit (ECU) used for other control in regard of the vehicle V, e.g. automatic driving control, for example.
  • a "self-position estimation circuit” is provided therein by including the moved amount estimation unit 32, the target position storing unit 33, the straight line extracting unit 34, the target position selection unit 35, the self-position estimation unit 36, and the target attribute estimation unit 37.
  • the target position detection unit 31 detects a relative position between a target present in the surroundings of the vehicle V and the vehicle V on the basis of a detection result of at least any one of the LRFs 101 and 102 and the cameras 201 and 202.
  • the relative position detected by the target position detection unit 31 is a position in a vehicle coordinate system.
  • the vehicle coordinate system may adopt the center of a rear wheel axle of the vehicle V as an origin point, a forward direction of the vehicle V as a positive direction of the x-axis, a leftward direction of the vehicle V as a positive direction of the y-axis, and an upward direction as a positive direction of the z-axis.
  • a conversion formula from the coordinate system (sensor coordinate system) of the LRFs 101 and 102 and the cameras 201 and 202 to the vehicle coordinate system is previously set in the target position detection unit 31.
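  • As an illustration of such a conversion, the following is a minimal sketch (not the patent's formula) that maps detected 2D target points from a sensor coordinate system into the vehicle coordinate system; the mounting parameters mount_xy and mount_yaw are hypothetical calibration values.

```python
import numpy as np

def sensor_to_vehicle(points_sensor, mount_xy, mount_yaw):
    """Convert 2D target points from the sensor frame to the vehicle frame.

    mount_xy: sensor origin in the vehicle frame (rear-axle center as origin,
    x forward, y left); mount_yaw: sensor yaw relative to the vehicle x-axis [rad].
    """
    c, s = np.cos(mount_yaw), np.sin(mount_yaw)
    rot = np.array([[c, -s], [s, c]])                  # sensor -> vehicle rotation
    return np.asarray(points_sensor) @ rot.T + np.asarray(mount_xy)
```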
  • a "target detection sensor” is provided therein by including the vehicle sensor group 5 and the target position detection unit 31.
  • The moved amount estimation unit 32 detects an odometry, which is the moved amount of the vehicle V per unit time, on the basis of the detection results of at least any one of the sensors included in the vehicle sensor group 5.
  • the moved amount of the vehicle V is detected as a moved amount in the odometry coordinate system.
  • The target position storing unit 33 stores, as target position data, the position obtained by moving the relative position of the target detected by the target position detection unit 31 by the moved amount of the vehicle V detected by the moved amount estimation unit 32, in a primary storage unit in the processing unit 3 or in the storage unit 4.
  • the straight line extracting unit 34 extracts linear information from the target position data stored in the target position storing unit 33.
  • the target attribute estimation unit 37 estimates an attribute of the target on the basis of a detection result of at least any one of the LRFs 101 and 102 and the cameras 201 and 202.
  • The target position selection unit 35 selects target position data on the basis of the reliability of the relative position of the target position data with respect to the vehicle V.
  • the target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V on the basis of the linear information extracted by the straight line extracting unit 34 and the attribute of the target estimated by the target attribute estimation unit 37.
  • the self-position estimation unit 36 estimates a self-position which is a current position of the vehicle V by comparing the selected target position data with the map information including the position information on the target present on the road or around the road.
  • In Step S01, the self-position estimation device measures the surroundings of the vehicle V using the surrounding sensor group 1.
  • In Step S03, the surrounding sensor group 1 detects targets present in the surroundings of the vehicle V.
  • In Step S05, the target position detection unit 31 estimates a position of the target with respect to the LRFs 101 and 102 and the cameras 201 and 202 (i.e., a relative position of the target in the sensor coordinate system) on the basis of the detection result of at least any one of the LRFs 101 and 102 and the cameras 201 and 202. For example, in the case of the cameras 201 and 202, the relationship between a position in the image and the actual distance may be measured in advance. Alternatively, a motion stereo method can be utilized. The estimation method is not limited to these, and other known methods can also be utilized. If another sensor (e.g., sonar, LRF, or radar) capable of obtaining distance information is used, the obtained value may be utilized directly.
  • Fig. 4 is an example illustrating an environment in which the vehicle V travels when the self-position estimation is executed.
  • In the example shown in Fig. 4, a road surface including the curb 61 is irradiated with laser light emitted from the LRF 101, as shown by the line 64.
  • The target position detection unit 31 extracts a place where the change of shape is large as the position of the curb 61, on the basis of the direction and distance of the emitted laser light, and thereby detects its position in the sensor coordinate system. Since it can be assumed that there is always a road surface vertically below the LRFs 101 and 102, the curb 61 can be detected by extracting points where the height changes greatly relative to the road surface.
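  • A minimal sketch of this height-change extraction, assuming one LRF scan line is given as (lateral distance, height) pairs and using a hypothetical 10 cm jump threshold:

```python
import numpy as np

def detect_curb_index(scan_points, height_jump=0.10):
    """Return the index of the largest height step in one scan line if it exceeds
    the assumed threshold, taken as the curb position; otherwise return None."""
    heights = np.asarray(scan_points, dtype=float)[:, 1]
    steps = np.abs(np.diff(heights))                   # height change between neighbours
    idx = int(np.argmax(steps))
    return idx if steps[idx] >= height_jump else None
```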
  • the target position detection unit 31 detects white lines 62 and 63 which are present at both sides of the vehicle V, respectively on the basis of brightness information of the images captured by the cameras 201 and 202.
  • For example, the target position detection unit 31 detects a pattern in which the luminance changes in the order of a dark portion, a bright portion, and a dark portion on the basis of the gray-scale image captured by the camera (201, 202), and thereby can detect the center of the bright portion as the white line (62, 63).
  • the positions of the white lines 62 and 63 in the sensor coordinate system can be respectively detected on the basis of a positional relationship between the cameras 201, 202 and the road surface.
  • the position in the sensor coordinate system detected in Step S05 is hereinafter handled as two-dimensional data from which the height information is excluded.
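  • The dark-bright-dark luminance pattern described above can be sketched as follows for a single normalized gray-scale image row; the function name and the brightness threshold are assumptions, since the patent does not specify the detector at this level of detail.

```python
import numpy as np

def white_line_center(row, threshold=0.6):
    """Find the centre column of the first bright run bounded by dark pixels
    (dark -> bright -> dark) in one gray-scale row; assumes the row starts on
    dark road surface. Returns None if no such run exists."""
    bright = np.asarray(row, dtype=float) >= threshold
    edges = np.flatnonzero(np.diff(bright.astype(int)))    # columns where dark/bright flips
    if len(edges) >= 2 and bright[edges[0] + 1]:
        start, end = edges[0] + 1, edges[1]
        return (start + end) // 2
    return None
```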
  • the target position detection unit 31 converts the relative position of the target in the sensor coordinate system into a relative position of the target in the vehicle coordinate system using the conversion formula previously set therein.
  • Figs. 5(a) to 5(d) are diagrams respectively showing positions 71 of the curb 61 and target position data 72 and 73 of the white lines 62 and 63 in the vehicle coordinate system detected by the target position detection unit 31 during time t1 to time t4, in the example shown in Fig. 4.
  • Time t1 is the oldest time, and time t4 is the newest time.
  • the moved amount estimation unit 32 integrates the moved amount of the vehicle V calculated on the basis of the detection result from the vehicle sensor group 5, and thereby calculates a position of the vehicle V in the odometry coordinate system.
  • For the odometry coordinate system, the position of the vehicle V at the time when power is supplied to the self-position estimation device, or when the processing is reset, may be taken as the origin point, with the azimuth angle of the vehicle V set to 0 degrees.
  • the integration of the moved amount of the vehicle V is executed in the odometry coordinate system.
  • Fig. 6 is a diagram showing a result of integrating a moved amount (M1, M2, M3, and M4) of the vehicle V calculated on the basis of a detection result by a vehicle sensor group 5, in the example shown in Figs. 5(a) to 5(d) .
  • The moved amount includes a change in position and attitude (θ: yaw angle) on the two-dimensional coordinate system.
  • the moved amount estimation unit 32 calculates a position (Xo, Yo) of the vehicle V in the odometry coordinate system.
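  • The integration into the odometry coordinate system can be sketched as simple dead reckoning; the per-step motion (dx, dy, dyaw) is assumed to come from the vehicle sensor group 5, and the exact integration scheme is an assumption rather than the patent's own formula.

```python
import math

def integrate_odometry(pose_odom, step):
    """Accumulate one motion step, given in the current vehicle frame, into the
    odometry-frame pose (Xo, Yo, yaw)."""
    x, y, yaw = pose_odom
    dx, dy, dyaw = step
    x += dx * math.cos(yaw) - dy * math.sin(yaw)       # rotate the step into the odometry frame
    y += dx * math.sin(yaw) + dy * math.cos(yaw)
    return (x, y, yaw + dyaw)
```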
  • The target position storing unit 33 stores, as target position data, the position obtained by moving the relative position of the target in the vehicle coordinate system detected by the target position detection unit 31 by the moved amount of the vehicle V detected by the moved amount estimation unit 32.
  • Fig. 7 is a diagram showing target position data (71a to 71d, 72a to 72d, and 73a to 73d) converted into an odometry coordinate system, in the example shown in Figs. 5 and 6 .
  • the target position storing unit 33 converts the position of the target in the sensor coordinate system measured in the past (t1, t2, t3, ...) into a position of the target in the odometry coordinate system on the basis of the moved amount (M1, M2, M3, M4) of the vehicle V, and stores the converted position data as target position data therein.
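  • A sketch of this storing step: each relative detection is carried into the odometry coordinate system using the vehicle pose at the time of detection (for example, the pose produced by the hypothetical integrate_odometry sketch above).

```python
import math

def target_to_odometry(target_xy_vehicle, vehicle_pose_odom):
    """Transform one target position from the vehicle frame at detection time into
    the odometry frame, so that detections from different times can be stored together."""
    x, y, yaw = vehicle_pose_odom
    tx, ty = target_xy_vehicle
    return (x + tx * math.cos(yaw) - ty * math.sin(yaw),
            y + tx * math.sin(yaw) + ty * math.cos(yaw))
```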
  • The target position selection unit 35 extracts target position data (71a to 71d, 72a to 72d, and 73a to 73d) indicating traveling lane boundaries from the plurality of stored target position data, and calculates, for the extracted target position data, the reliability of the relative position with respect to the vehicle V.
  • the straight line extracting unit 34 extracts linear information (N1, N2, and N3) on the basis of the target position data (71a to 71d, 72a to 72d, and 73a to 73d) stored in the target position storing unit 33.
  • a linear approximation is applied to the detection result of the white line (target position data).
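  • The patent does not specify the fitting method; a common choice, shown here only as an assumption, is a total-least-squares line fit of the stored boundary points.

```python
import numpy as np

def fit_line(points):
    """Fit a 2D line to target position data; returns (point on the line, unit direction)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)           # principal axis = line direction
    return centroid, vt[0]
```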
  • the target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V in accordance with a difference between the distance from the vehicle V to the target obtained on the basis of the relative position of the target and the assumed distance from the vehicle V to the target. The target position selection unit 35 determines that the reliability is higher as the aforementioned difference is smaller.
  • the straight line extracting unit 34 approximates straight lines (N2 and N3) with respect to the target position data (72 and 73) indicating the traveling lane boundary shown in Fig. 9 .
  • the target position selection unit 35 measures the respective distances in a vehicle width direction from the vehicle V to the respective straight lines (N2 and N3).
  • If it is assumed that the traveling lane width is 4 meters and the vehicle V is traveling in the center of the traveling lane, the respective assumed distances from the center of the vehicle V to the respective white lines are 2 meters.
  • When the absolute value of the difference between the distance from the vehicle V to the straight line (N2, N3) and the assumed distance (2 meters) is 1 meter or more, allowing for a deviation of the position of the vehicle V within the traveling lane, a detection error, and the like, it is determined that there is a high possibility that the distance from the vehicle V to the straight line (N2, N3) is inaccurate.
  • Here, the absolute value of the difference between the distance WL and the assumed distance (2 meters) is less than 1 meter, but the absolute value of the difference between the distance LR and the assumed distance (2 meters) is equal to or more than 1 meter.
  • Therefore, the target position selection unit 35 evaluates the reliability of the target position data 73 indicating the right-hand side traveling lane boundary as low, and evaluates the reliability of the target position data 72 indicating the left-hand side traveling lane boundary as high.
  • The target position selection unit 35 eliminates the target position data 73 whose reliability is evaluated as low, and adopts only the target position data 72 whose reliability is evaluated as high.
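  • A minimal sketch of this check, assuming a 4-meter lane (2-meter assumed distance to each white line) and the 1-meter tolerance mentioned above; the labels and signed distances are illustrative, not taken from the patent.

```python
def select_by_assumed_distance(lateral_distances, assumed=2.0, tol=1.0):
    """Keep boundary lines whose measured lateral distance from the vehicle differs
    from the assumed distance by less than `tol`; distances are signed (left +, right -)."""
    return {label: d for label, d in lateral_distances.items()
            if abs(abs(d) - assumed) < tol}

# Example: {'left': 1.9, 'right': -3.2} -> only 'left' is adopted.
```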
  • The above-mentioned method of determining reliability based on the difference from the assumed distance is applicable not only to traveling lane boundaries, e.g. white lines and curbs, but also to other targets.
  • For example, road structures, e.g. road signs, traffic lights, and utility poles, are present at side strips.
  • the assumed distance can be set on the basis of the traveling lane width, and therefore the difference between the relative distance to the road structure detected from the vehicle V and the assumed distance can be calculated.
  • the target position selection unit 35 further determines reliability based on the attribute of the target with respect to the target position data 72 selected on the basis of the above-mentioned difference with the assumed distance. More specifically, the target position selection unit 35 determines reliability of the target position data 72 on the basis of the attribute of the target estimated by the target attribute estimation unit 37, and further narrows down the target position data to be used for the self-position estimation.
  • For a solid white line, since the target position data can be obtained steadily, the detection accuracy (i.e., reliability) of the relative position is relatively high.
  • By referring to the map information, it is possible to specify in advance whether the detected white line is a solid line or a dashed line.
  • When the white line positioned at one side of the vehicle V is a solid line and the white line positioned at the other side is a dashed line, it is determined that the reliability of the target position data indicating the white line positioned on the one side (the solid line) is relatively high, even if the detection errors on both sides or the respective distances from the vehicle V are approximately the same. Consequently, the target position data indicating the white line positioned on the one side is selected.
  • The type of the white line is merely one example of a target attribute, and other attributes of the targets can also be used.
  • Regarding the color of a section line, white lines are easier to detect than yellow lines, and therefore the reliability of white lines is determined to be higher.
  • Reliabilities may also be compared between different types of targets. For example, comparing stop lines and pedestrian crossings with each other, since the number of characteristic parts of a pedestrian crossing is larger than that of a stop line, the reliability of pedestrian crossings is determined to be higher.
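  • One way to realize the attribute-based ordering above (solid over dashed, white over yellow, pedestrian crossing over stop line) is a lookup table of evaluation points; the numeric weights below are assumptions made only for illustration.

```python
# Hypothetical attribute scores; a larger value means higher assumed reliability.
ATTRIBUTE_SCORE = {
    "white_solid": 4, "yellow_solid": 3, "white_dashed": 2, "yellow_dashed": 1,
    "pedestrian_crossing": 3, "stop_line": 2,
}

def more_reliable(attribute_a, attribute_b):
    """Return whichever attribute the table rates as more reliable."""
    score = ATTRIBUTE_SCORE.get
    return attribute_a if score(attribute_a, 0) >= score(attribute_b, 0) else attribute_b
```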
  • the target position selection unit 35 further determines the reliability of the target position data on the basis of a time period when the target position data can be continuously detected, with regard to the target position data selected on the basis of the above-mentioned attribute of the target.
  • Targets present in general environments cannot always be continuously detected with constant reliability due to aging degradation, occlusion, and other effects.
  • In addition, a detection obtained at only a single point in time is always uncertain. Therefore, the information on the white line or the traveling lane boundary is evaluated together with its detection time period. Only when it has been continuously detected for a certain time period (e.g., 10 seconds) or more is it determined that its reliability is high and that this target position data should be selected.
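  • A sketch of the continuous-detection check, using the 10-second figure mentioned above; the frame-gap tolerance max_gap and the function name are assumptions.

```python
def continuously_detected(detection_times, min_duration=10.0, max_gap=0.5):
    """True if the most recent unbroken run of detections (timestamps in seconds)
    has lasted at least `min_duration` seconds."""
    if not detection_times:
        return False
    ts = sorted(detection_times)
    start = ts[0]
    for prev, cur in zip(ts, ts[1:]):
        if cur - prev > max_gap:                       # detection interrupted; restart the run
            start = cur
    return ts[-1] - start >= min_duration
```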
  • the target position selection unit 35 further determines the reliability of the target position data on the basis of distribution of errors when the target position data of the traveling lane boundary is linearly approximated, with respect to the target position data selected on the basis of the above-mentioned continuous detection time period. In other words, the target position selection unit 35 determines the reliability of the target position data on the basis of the linear information (approximation straight line) extracted by the straight line extracting unit 34, and further narrows down the target position data to be used for the self-position estimation.
  • The target position selection unit 35 determines whether or not a plurality of parallel traveling lane boundaries are detected as target position data indicating traveling lane boundaries (e.g., white lines) that specify the traveling lane on which the vehicle V is traveling. When a plurality of parallel traveling lane boundaries are detected, the target position selection unit 35 evaluates as high the reliability of the target position data within a range that can be approximated by a straight line among the detection results of the white lines (target position data), and selects the highly evaluated target position data as target position data to be used for the self-position estimation. For example, as shown in Fig. 10, target position data (72j, 72k, 73j, and 73k) indicating a plurality of parallel traveling lane boundaries, which specify the traveling lane on which the vehicle V is traveling, is detected.
  • the straight line extracting unit 34 applies a linear approximation to the target position data indicating the traveling lane boundary.
  • the target position selection unit 35 selects the target position data (72j and 73j) included in the range LA which can be approximated with the straight lines, among the target position data (72j, 72k, 73j, and 73k). At this time, the target position selection unit 35 expands the range LA which can be approximated with the straight lines with respect to the vehicle V.
  • For example, a section in which 80% or more of the target position data lie within a range from -15 cm to +15 cm of the approximate straight line is set as the range LA that can be approximated with the straight lines.
  • The target position data (72k and 73k) not included in the range LA, which can be approximated with the straight lines, is eliminated.
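  • The 80% / ±15 cm criterion can be sketched as follows, reusing the output of the hypothetical fit_line sketch above (a point on the approximate line and its unit direction):

```python
import numpy as np

def fits_straight_range(points, line_point, line_dir, tol=0.15, ratio=0.80):
    """True if at least `ratio` of the target position data lie within `tol` metres
    (perpendicular distance) of the approximate straight line."""
    pts = np.asarray(points, dtype=float)
    normal = np.array([-line_dir[1], line_dir[0]])     # unit normal to the line
    dist = np.abs((pts - np.asarray(line_point)) @ normal)
    return float(np.mean(dist <= tol)) >= ratio
```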
  • In the example shown in Fig. 11, the straight line extracting unit 34 executes a curve approximation instead of the linear approximation (straight line approximation).
  • the target position selection unit 35 highly evaluates the reliability of the target position data (72m and 73m) included in the range LB which can be approximated with the curved lines (N2 and N3), and selects the highly-evaluated target position data as target position data to be used for the self-position estimation.
  • The target position data (72n and 73n) not included in the range LB, which can be approximated with the curved lines, is eliminated.
  • As described above, the target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V in the order of (1) the difference between the distance from the vehicle V to the target and the assumed distance, (2) the attribute of the target, (3) the continuous detection time period, and (4) the distribution of errors when the target position data indicating the traveling lane boundary is linearly approximated.
  • However, the present invention is not limited to this example; the order of the reliability determinations can be changed arbitrarily, or only some of the determination processes (1) to (4) may be executed. Furthermore, a comprehensive evaluation may be performed by quantifying each reliability determination. For example, in each reliability determination, evaluation points may be given in multiple stages and added together to calculate a total evaluation point. In this way, the reliability of the detected target can be quantified and determined.
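  • As one example of such a quantified, comprehensive evaluation, staged evaluation points for the four checks could be added as below; the weights are purely illustrative assumptions, not values taken from the patent.

```python
def total_reliability(distance_ok, attribute_score, detected_seconds, fit_ratio):
    """Add staged evaluation points for checks (1)-(4) into one reliability score."""
    score = 0
    score += 2 if distance_ok else 0                   # (1) assumed-distance check
    score += attribute_score                           # (2) target attribute
    score += min(int(detected_seconds // 10), 3)       # (3) continuous detection time
    score += int(4 * fit_ratio)                        # (4) agreement with the approximate line
    return score
```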
  • the self-position estimation unit 36 compares the target position data selected by the target position selection unit 35 with the position of the target in the map information 41. In other words, the position of the target in the map information 41 and the target position data determined so as to have high reliability by the target position selection unit 35 are matched with each other.
  • The self-position estimation unit 36 estimates the self-position of the vehicle V by executing the above-mentioned comparison (map matching) of the positions of the targets. More specifically, the self-position estimation unit 36 estimates a position and an attitude angle with three degrees of freedom in total, composed of a position in the east-west direction of the vehicle V (X coordinate), a position in the north-south direction (Y coordinate), and an azimuth angle (yaw angle θ). A known self-position estimation method may be used as the method of estimating the position on the map. Proceeding to Step S17, the self-position estimation unit 36 outputs the estimated self-position of the vehicle V.
  • an Iterative Closest Point (ICP) algorithm can be used for the comparison in Step S13.
  • For the matching, the self-position estimation unit 36 uses, as evaluation points, the end points at both ends of the line segments among the positions of the targets included in the map information 41.
  • The target position data is less affected by odometry errors the closer it is to the vehicle V (the surrounding sensor group 1).
  • Therefore, the self-position estimation unit 36 can increase the number of evaluation points for targets in the vicinity of the vehicle V by linearly interpolating the targets, and can decrease the number of evaluation points for targets far from the vehicle V.
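  • A sketch of how evaluation points could be distributed along the map's line segments by linear interpolation, denser near the vehicle and sparser far away, as described above; the spacing values and the 10-meter boundary are assumptions.

```python
import numpy as np

def evaluation_points(p0, p1, vehicle_xy, near=10.0, step_near=0.5, step_far=2.0):
    """Sample points along a map segment (p0-p1); segments near the vehicle get a
    finer spacing than segments far from it."""
    p0, p1 = np.asarray(p0, dtype=float), np.asarray(p1, dtype=float)
    midpoint = (p0 + p1) / 2.0
    step = step_near if np.linalg.norm(midpoint - np.asarray(vehicle_xy)) < near else step_far
    n = max(int(np.linalg.norm(p1 - p0) / step), 1)
    t = np.linspace(0.0, 1.0, n + 1)[:, None]
    return p0 * (1.0 - t) + p1 * t                     # (n+1) x 2 array of evaluation points
```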
  • As described above, since the target position data is selected on the basis of the reliability of its relative position with respect to the vehicle V, the target position data estimated to have large errors in the relative position can be eliminated, and the estimation accuracy of the self-position is thereby improved.
  • The target position selection unit 35 determines that the reliability of the relative position of the target position data with respect to the vehicle V is higher as the difference between the distance from the vehicle V to the target and the assumed distance is smaller. As a result, since the target position data estimated to have a large error in the relative position can be appropriately eliminated, the estimation accuracy of the self-position can be improved.
  • The target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V on the basis of the attribute of the target. For example, comparing a solid white line and a dashed white line with each other, the target position selection unit 35 determines that the solid line, from which target position data can be obtained steadily, is more reliable than the dashed line. Accordingly, since the target position data estimated to have large errors in the relative position can be appropriately identified, the estimation accuracy of the self-position can be improved.
  • the target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V on the basis of the time period when the target position data can be continuously detected. As a result, it is possible to stably and accurately estimate the self-position.
  • the target position selection unit 35 selects target position data having high reliability of the relative position with respect to the vehicle V, when target position data indicating a plurality of parallel traveling lane boundaries, which specify a traveling lane on which the vehicle V travels, is detected. Consequently, the traveling lane boundary having high accuracy of position can be selected, and the accuracy of the self-position estimation becomes higher.
  • The target position selection unit 35 determines that the reliability of the relative position of the target position data indicating the traveling lane boundary with respect to the vehicle V is higher as the error from the approximate line obtained when approximating the traveling lane boundary is smaller. Consequently, a traveling lane boundary whose detected position has high accuracy can be selected, and the estimation accuracy of the self-position becomes even higher.
  • The moving object is not limited to a moving object that travels on land, such as the vehicle V, but includes vessels, aircraft, spacecraft, and other moving objects.
  • The functions described in the above embodiment may be implemented in one or more processing circuits.
  • a processing circuit includes a programmed processing device such as a processing device including an electric circuit.
  • the processing device includes an Application Specific Integrated Circuit (ASIC) and/or a device such as a conventional circuit component, configured to execute the functions described in the respective embodiments.

Description

  • The present invention relates to a self-position estimation method and a self-position estimation device.
  • There has been known a technology of estimating a self-position of an autonomous mobile robot (refer to Patent Literature 1). In Patent Literature 1, a result (surrounding environment information) of detecting a movable region of a mobile robot by means of a sensor is restricted to a region predetermined with reference to the mobile robot, and this restricted surrounding environment information is compared with an environmental map previously stored in the mobile robot, thereby estimating the self-position of the robot. Further examples describing methods for estimating a self-position of a vehicle can be found in Patent Literature 2 and Patent Literature 3. Moreover, Patent Literature 4 describes a method for enhancing pre-stored map data using data obtained from a moving vehicle.
  • Incidentally, in order to estimate the self-position of a vehicle, white lines positioned on both sides of the vehicle in the vehicle width direction are used in some cases. Generally, when both white lines are detected simultaneously, errors may be included in the detected positions of the white lines. In particular, the position of the white lines in the vehicle width direction with respect to the vehicle may be steadily offset due to a calibration error or the like. For this reason, the estimation result of the self-position becomes unstable, or the estimation accuracy of the self-position is reduced.
  • The present invention has been made in light of the above-mentioned problem, and an object of the present invention is to provide a self-position estimation method and a self-position estimation device which improve the estimation accuracy of the self-position by eliminating target position data estimated to have large errors in the relative position.
  • This is achieved by the features of the independent claims.
  • According to the self-position estimation method according to one aspect of the present invention, since the target position data estimated to have large errors in the relative position can be eliminated, the estimation accuracy of the self-position can be improved.
    • [Fig. 1] Fig. 1 is a block diagram showing an example of a configuration of a self-position estimation device according to an embodiment.
    • [Fig. 2] Fig. 2 is a perspective diagram showing a state where a surrounding sensor group 1 is mounted in a vehicle V.
    • [Fig. 3] Fig. 3 is a flow chart showing an example of a self-position estimation method using the self-position estimation device shown in Fig. 1.
    • [Fig. 4] Fig. 4 is a perspective diagram showing an environment in which the vehicle V travels when the self-position estimation is executed.
    • [Fig. 5] Figs. 5(a) to 5(d) are diagrams respectively showing positions 71 of curbs 61 and target position data 72 and 73 of white lines 62 and 63 in a vehicle coordinate system detected by the target position detector 31 during time t1 to time t4, in the example shown in Fig. 4.
    • [Fig. 6] Fig. 6 is a diagram showing a result of integrating a moved amount of the vehicle V calculated on the basis of a detection result by a vehicle sensor group 5, in the example shown in Figs. 5(a) to 5(d).
    • [Fig. 7] Fig. 7 is a diagram showing target position data converted into an odometry coordinate system, in the example shown in Figs. 5 and 6.
    • [Fig. 8] Fig. 8 is a conceptual diagram showing linear information (N1, N2, and N3) extracted from the target position data (71a to 71d, 72a to 72d, and 73a to 73d).
    • [Fig. 9] Fig. 9 is a diagram showing straight lines (N2 and N3) approximated to the target position data (72 and 73) indicating the traveling lane boundaries.
    • [Fig. 10] Fig. 10 is a diagram showing an aspect that traveling lane boundaries (72j, 72k, 73j, and 73k) which can be linearly approximated are detected, indicating traveling lane boundaries specifying a traveling lane on which the vehicle V is traveling.
    • [Fig. 11] Fig. 11 is a diagram showing an aspect that traveling lane boundaries (72m, 72n, 73m, and 73n) which can be curvilinearly approximated are detected, indicating traveling lane boundaries specifying a traveling lane on which the vehicle V is traveling.
  • An embodiment will now be explained with reference to the drawings. In the description of the drawings, the identical or similar reference numeral is attached to the identical or similar part, and an explanation thereof is omitted.
  • With reference to Fig. 1, a configuration of a self-position estimation device according to the present embodiment will now be explained. The self-position estimation device according to the present embodiment includes a surrounding sensor group 1, a processing unit 3, a storage unit 4, and a vehicle sensor group 5. The self-position estimation device according to the present embodiment is mounted in a vehicle V (refer to Fig. 2), and is configured to estimate a self-position of the vehicle V.
  • In the present embodiment, the device is configured to estimate three degrees of freedom in total on the two-dimensional plane: a position in the east-west direction (X axial direction) (X coordinate [m]) and a position in the north-south direction (Y axial direction) (Y coordinate [m]) as the self-position of the vehicle V to be estimated, and an azimuth angle θ of the vehicle (yaw angle [rad]) as the attitude angle data of the vehicle V to be estimated.
  • The surrounding sensor group 1 includes a plurality of Laser Range Finders (LRFs) 101 and 102 and a plurality of cameras 201 and 202, for example. The Laser Range Finders (LRFs) 101 and 102 are respectively configured to detect a distance and azimuth to a target by receiving light reflected from the target to which laser light is emitted. The cameras 201 and 202 are configured to capture the surroundings of the vehicle V and obtain digital images on which image processing can be performed. Thus, the surrounding sensor group 1 is composed of a plurality of sensors respectively configured to detect targets present in the surroundings of the vehicle V. In addition to these sensors, the surrounding sensor group 1 may include sonar and/or radar. The targets present in the surroundings of the vehicle V include: targets indicating traveling lane boundaries on the traveling lane in the surroundings of the vehicle V, e.g. white lines, curbs, median strips, guardrails, and reflectors; road surface markings, e.g. stop lines, pedestrian crossings, and speed limit markings; and road structures, e.g. road signs, traffic lights, and utility poles.
  • Fig. 2 shows an example illustrating a state where the surrounding sensor group 1 is mounted in the vehicle V. The LRFs 101 and 102 can be respectively mounted near the front fenders on both sides of the vehicle V, for example. The LRFs 101 and 102 are configured to scan laser light over a predetermined scan angle (e.g., 90 degrees) so that the track of the emitted laser light forms, for example, a plane perpendicular to the road surface, with the front-back direction D of the vehicle V as the rotation axis. Consequently, the LRFs 101 and 102 can detect targets, such as curbs, which are present in the right-left direction of the vehicle V. The LRFs 101 and 102 are configured to sequentially output the shape of the detected target to the processing unit 3 as a detection result.
  • The cameras 201 and 202 can be respectively mounted in door mirrors of both sides of the vehicle V, for example. The cameras 201 and 202 are configured to capture an image by means of solid state imaging elements, e.g. a Charge-Coupled Device (CCD) and a Complementary Metal-Oxide Semiconductor (CMOS), for example. The cameras 201 and 202 are configured to capture a road surface of a lateral direction of the vehicle V, for example. The cameras 201 and 202 are configured to sequentially output the captured image to the processing unit 3.
  • Returning to Fig. 1, the storage unit 4 is a map information storage unit configured to store map information 41 including position information on targets present on a road or around the road. The storage unit 4 can be composed by including a semiconductor memory, a magnetic disk, or the like. The targets (landmarks) recorded in the map information 41 include, for example, road markings such as stop lines, pedestrian crossings, advance notices of pedestrian crossings, and section lines; structures such as curbs; and various other facilities that can be detected by the surrounding sensor group 1. Even for a target that actually has a three-dimensional structure, such as a curb, the map information 41 describes only position information on a two-dimensional plane. In the map information 41, the position information on targets such as curbs and white lines is defined as an aggregate of pieces of linear information, each having two-dimensional position information on both end points. When the shape in the real environment is a curve, it is described in the map information 41 as linear information on the two-dimensional plane approximated by a polygonal line.
  • The vehicle sensor group 5 includes a GPS receiver 51, an accelerator sensor 52, a steering sensor 53, a brake sensor 54, a vehicle speed sensor 55, an acceleration sensor 56, a wheel speed sensor 57, and other sensors, such as a yaw rate sensor. Each of the sensors 51 to 57 is connected to the processing unit 3 and is configured to sequentially output various detection results to the processing unit 3. The processing unit 3 can calculate the position of the vehicle V in the map information 41, or can calculate the odometry indicating the moved amount of the vehicle V per unit time, by using the detection results of the vehicle sensor group 5. Various schemes can be considered for measuring the moved amount of the vehicle V, e.g. an odometry measurement method based on the rotational frequency of the tires, an inertial measurement method using a gyroscope or an acceleration sensor, a method of receiving radio waves from satellites such as a Global Navigation Satellite System (GNSS), and Simultaneous Localization and Mapping (SLAM), which estimates the moved amount from changes in the measurement values of external sensors; any of these methods may be used.
  • The processing unit 3 includes: a target position detection unit 31, a moved amount estimation unit 32, a target position storing unit 33, a straight line extracting unit 34, a target position selection unit 35, a self-position estimation unit 36, and a target attribute estimation unit 37. The processing unit 3 can be composed by including a microcontroller which is an integrated circuit provided with a Central Processing Unit (CPU), a memory, an input / output interface (I/F), and the like, for example. In this case, a plurality of information processing units (31 to 37) constituting the processing unit 3 are realized by the CPU executing a computer program preinstalled in the microcontroller. Each unit constituting the processing unit 3 may be composed by including integrated hardware or may be composed by including discrete hardware. The microcontroller may also be used as an Electronic Control Unit (ECU) used for other control in regard of the vehicle V, e.g. automatic driving control, for example. A "self-position estimation circuit" is provided therein by including the moved amount estimation unit 32, the target position storing unit 33, the straight line extracting unit 34, the target position selection unit 35, the self-position estimation unit 36, and the target attribute estimation unit 37.
  • The target position detection unit 31 detects a relative position between a target present in the surroundings of the vehicle V and the vehicle V on the basis of a detection result of at least any one of the LRFs 101 and 102 and the cameras 201 and 202. The relative position detected by the target position detection unit 31 is a position in a vehicle coordinate system. The vehicle coordinate system may adopt the center of a rear wheel axle of the vehicle V as an origin point, a forward direction of the vehicle V as a positive direction of the x-axis, a leftward direction of the vehicle V as a positive direction of the y-axis, and an upward direction as a positive direction of the z-axis. Moreover, a conversion formula from the coordinate system (sensor coordinate system) of the LRFs 101 and 102 and the cameras 201 and 202 to the vehicle coordinate system is previously set in the target position detection unit 31. A "target detection sensor" is provided therein by including the vehicle sensor group 5 and the target position detection unit 31.
  • The moved amount estimation unit 32 detects an odometry, which is the moved amount of the vehicle V per unit time, on the basis of the detection results of at least any one of the sensors included in the vehicle sensor group 5. The moved amount of the vehicle V is detected as a moved amount in the odometry coordinate system. The target position storing unit 33 stores, as target position data, the position obtained by moving the relative position of the target detected by the target position detection unit 31 by the moved amount of the vehicle V detected by the moved amount estimation unit 32, in a primary storage unit in the processing unit 3 or in the storage unit 4.
  • The straight line extracting unit 34 extracts linear information from the target position data stored in the target position storing unit 33. The target attribute estimation unit 37 estimates an attribute of the target on the basis of a detection result of at least any one of the LRFs 101 and 102 and the cameras 201 and 202. The target position selection unit 35 selects target position data on the basis of the reliability of the relative position of the target position data with respect to the vehicle V. The target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V on the basis of the linear information extracted by the straight line extracting unit 34 and the attribute of the target estimated by the target attribute estimation unit 37. The self-position estimation unit 36 estimates the self-position, which is the current position of the vehicle V, by comparing the selected target position data with the map information including the position information on the targets present on the road or around the road.
  • With reference to Fig. 3, an example of a self-position estimation method using the self-position estimation device shown in Fig. 1 will now be explained. First, in Step S01, the self-position estimation device measures surroundings of the vehicle V using the surrounding sensor group 1.
  • Proceeding to Step S03, the sensors of the surrounding sensor group 1 respectively detect targets present in the surroundings of the vehicle V. Proceeding to Step S05, the target position detection unit 31 estimates a position of the target with respect to the LRFs 101 and 102 and the cameras 201 and 202 (i.e., a relative position of the target in the sensor coordinate system) on the basis of the detection result of at least any one of the LRFs 101 and 102 and the cameras 201 and 202. For example, in the case of the cameras 201 and 202, a relationship between the position in an image and the actual distance may be measured in advance. Alternatively, a motion stereo method may be utilized. The estimation method is not limited to these methods, and other known methods can also be utilized. If another sensor (e.g., sonar, LRF, or radar) capable of obtaining distance information is used, the obtained value may be utilized directly.
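As a rough, non-authoritative sketch of the first option above (a previously measured relationship between image position and actual distance), the snippet below interpolates a hypothetical calibration table that maps an image row to a ground distance. The table values, function name, and query are illustrative assumptions, not taken from the patent.

```python
# Sketch only: convert an image row to a ground distance via a pre-measured
# calibration table (values below are hypothetical), with linear interpolation.
from bisect import bisect_left

# (pixel_row, distance_m) pairs measured in advance, sorted by row
CALIBRATION = [(420, 30.0), (450, 15.0), (500, 8.0), (600, 4.0), (700, 2.0)]

def row_to_distance(pixel_row: float) -> float:
    """Linearly interpolate the calibrated row-to-distance mapping."""
    rows = [r for r, _ in CALIBRATION]
    dists = [d for _, d in CALIBRATION]
    if pixel_row <= rows[0]:
        return dists[0]
    if pixel_row >= rows[-1]:
        return dists[-1]
    i = bisect_left(rows, pixel_row)
    r0, r1 = rows[i - 1], rows[i]
    d0, d1 = dists[i - 1], dists[i]
    t = (pixel_row - r0) / (r1 - r0)
    return d0 + t * (d1 - d0)

print(row_to_distance(550))  # 6.0 m for this hypothetical table
```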
  • Fig. 4 is an example illustrating an environment in which the vehicle V travels when the self-position estimation is executed. In the example shown in Fig. 4, a road surface including a curb 61 is irradiated with laser light emitted from the LRF 101, as shown by the line 64. The target position detection unit 31 extracts a place where the change in shape is large as the position of the curb 61, on the basis of the direction and the distance of the emitted laser light, and thereby detects the position in the sensor coordinate system. Since it can be assumed that there is always a road surface vertically below the LRFs 101 and 102, the curb 61 can be detected by extracting a point where the height changes greatly compared with the road surface.
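A minimal sketch of this kind of height-discontinuity check follows; it is not the patent's exact algorithm. The scan format, the 10 cm jump threshold, and the example values are assumptions.

```python
# Sketch: flag curb candidates in one LRF scan line as points where the height
# jumps sharply compared with the neighbouring road-surface points.
def find_curb_candidates(scan, height_jump=0.10):
    """scan: list of (lateral_y, height_z); return indices of large height steps."""
    candidates = []
    for i in range(1, len(scan)):
        (_, z_prev), (_, z_cur) = scan[i - 1], scan[i]
        if abs(z_cur - z_prev) >= height_jump:
            candidates.append(i)
    return candidates

# Example: a flat road surface with a ~15 cm step at index 5
scan = [(-2.0, 0.00), (-1.6, 0.01), (-1.2, 0.00), (-0.8, 0.01), (-0.4, 0.00),
        (0.0, 0.15), (0.4, 0.16), (0.8, 0.15)]
print(find_curb_candidates(scan))  # -> [5]
```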
  • Moreover, the target position detection unit 31 detects white lines 62 and 63, which are present at both sides of the vehicle V, on the basis of brightness information of the images captured by the cameras 201 and 202. For example, the target position detection unit 31 detects a pattern in which the luminance changes in the order of a dark portion, a bright portion, and a dark portion on the basis of the grayscale images captured by the cameras 201 and 202, and thereby can detect the center of the bright portion as the white line (62, 63). The positions of the white lines 62 and 63 in the sensor coordinate system can be respectively detected on the basis of the positional relationship between the cameras 201, 202 and the road surface. The position in the sensor coordinate system detected in Step S05 is hereinafter handled as two-dimensional data from which the height information is excluded.
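The following sketch illustrates one possible realization of the dark-bright-dark luminance search along a grayscale image row, reporting the center of each bright run as a white-line candidate. The thresholds and width limits are assumed values, not specified by the patent.

```python
# Sketch: find white-line candidate columns as centers of bright runs that are
# bounded by dark pixels on both sides (dark -> bright -> dark pattern).
def white_line_centers(row, bright=180, dark=80, min_width=3, max_width=40):
    centers = []
    i = 0
    while i < len(row):
        if row[i] >= bright:
            start = i
            while i < len(row) and row[i] >= bright:
                i += 1
            width = i - start
            left_dark = start > 0 and row[start - 1] <= dark
            right_dark = i < len(row) and row[i] <= dark
            if left_dark and right_dark and min_width <= width <= max_width:
                centers.append((start + i - 1) / 2.0)  # center of the bright run
        else:
            i += 1
    return centers

row = [60] * 20 + [220] * 6 + [60] * 30 + [210] * 5 + [60] * 20
print(white_line_centers(row))  # -> [22.5, 58.0]
```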
  • Proceeding to Step S07, the target position detection unit 31 converts the relative position of the target in the sensor coordinate system into a relative position of the target in the vehicle coordinate system using the conversion formula previously set therein.
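A hedged sketch of such a sensor-to-vehicle conversion is shown below: a 2D rigid transform defined by the mounting position and yaw of the sensor relative to the rear-axle origin of the vehicle coordinate system. The mounting values are example assumptions, not the conversion formula actually set in the target position detection unit 31.

```python
# Sketch: convert a detected point from the sensor coordinate system into the
# vehicle coordinate system (origin at the rear-axle center, x forward, y left).
import math

def sensor_to_vehicle(x_s, y_s, mount_x, mount_y, mount_yaw):
    c, s = math.cos(mount_yaw), math.sin(mount_yaw)
    x_v = mount_x + c * x_s - s * y_s
    y_v = mount_y + s * x_s + c * y_s
    return x_v, y_v

# Example: a camera assumed to be mounted 1.5 m ahead of the rear axle, facing forward
print(sensor_to_vehicle(10.0, -1.8, mount_x=1.5, mount_y=0.0, mount_yaw=0.0))  # (11.5, -1.8)
```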
  • Figs. 5(a) to 5(d) are diagrams respectively showing target position data 71 of the curb 61 and target position data 72 and 73 of the white lines 62 and 63 in the vehicle coordinate system, detected by the target position detection unit 31 at times t1 to t4 in the example shown in Fig. 4. Time t1 is the oldest time, and time t4 is the newest time.
  • Next, in Step S07, the moved amount estimation unit 32 integrates the moved amount of the vehicle V calculated on the basis of the detection result from the vehicle sensor group 5, and thereby calculates the position of the vehicle V in the odometry coordinate system. The odometry coordinate system may adopt, as its origin point, the position of the vehicle V at the time when power is supplied to the self-position estimation device or when the processing is reset, with the azimuth angle of the vehicle V at that time set to 0 degrees. The integration of the moved amount of the vehicle V is executed in the odometry coordinate system.
  • Fig. 6 is a diagram showing a result of integrating the moved amounts (M1, M2, M3, and M4) of the vehicle V calculated on the basis of the detection results of the vehicle sensor group 5, in the example shown in Figs. 5(a) to 5(d). Each moved amount includes a change in position and attitude (θ: yaw angle) on the two-dimensional coordinate system. In this manner, the moved amount estimation unit 32 calculates a position (Xo, Yo) of the vehicle V in the odometry coordinate system.
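As a sketch under stated assumptions (per-step moved amounts expressed as (dx, dy, dθ) in the vehicle frame at each step), the following snippet accumulates moved amounts such as M1 to M4 into a pose in the odometry coordinate system. The interface and the example values are illustrative only.

```python
# Sketch: dead-reckoning integration of per-step moved amounts into an odometry pose.
import math

def integrate_odometry(moves, pose=(0.0, 0.0, 0.0)):
    """moves: iterable of (dx, dy, dtheta) in the vehicle frame; returns (Xo, Yo, theta)."""
    x, y, th = pose
    for dx, dy, dth in moves:              # e.g. M1, M2, M3, M4 in Fig. 6
        x += dx * math.cos(th) - dy * math.sin(th)
        y += dx * math.sin(th) + dy * math.cos(th)
        th += dth
    return x, y, th

M = [(1.0, 0.0, 0.05), (1.0, 0.0, 0.05), (1.0, 0.1, 0.0), (1.0, 0.0, -0.02)]
print(integrate_odometry(M))               # accumulated (Xo, Yo, theta)
```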
  • In Step S07, the target position storing unit 33 stores a position where the relative position of the target in the vehicle coordinate system detected by the target position detection unit 31 is moved by the moved amount of the vehicle V detected by the moved amount estimation unit 32, as target position data.
  • Fig. 7 is a diagram showing target position data (71a to 71d, 72a to 72d, and 73a to 73d) converted into an odometry coordinate system, in the example shown in Figs. 5 and 6. Thus, the target position storing unit 33 converts the position of the target in the sensor coordinate system measured in the past (t1, t2, t3, ...) into a position of the target in the odometry coordinate system on the basis of the moved amount (M1, M2, M3, M4) of the vehicle V, and stores the converted position data as target position data therein.
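A rough sketch of this conversion, under the same assumptions as the odometry sketch above, is given below: a target detected at a relative position in the vehicle frame at time t is mapped into the odometry coordinate system using the vehicle pose at that time and stored. Names and values are illustrative.

```python
# Sketch: move a target's relative position into the odometry frame using the
# vehicle pose (Xo, Yo, theta) at the detection time, then store it.
import math

def to_odometry_frame(x_rel, y_rel, vehicle_pose):
    xo, yo, th = vehicle_pose
    xt = xo + x_rel * math.cos(th) - y_rel * math.sin(th)
    yt = yo + x_rel * math.sin(th) + y_rel * math.cos(th)
    return xt, yt

stored_target_data = []
pose_at_t1 = (12.3, 4.5, 0.12)             # hypothetical odometry pose at time t1
stored_target_data.append(to_odometry_frame(8.0, 1.9, pose_at_t1))
print(stored_target_data)
```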
  • Proceeding to Step S09, the target position selection unit 35 extracts target position data (71a to 71d, 72a to 72d, and 73a to 73d) indicating a traveling lane boundary from the plurality of stored target position data, and calculates, for the extracted target position data (71a to 71d, 72a to 72d, and 73a to 73d), the reliability of the relative position with respect to the vehicle V.
  • First, as shown in Fig. 8, the straight line extracting unit 34 extracts linear information (N1, N2, and N3) on the basis of the target position data (71a to 71d, 72a to 72d, and 73a to 73d) stored in the target position storing unit 33. A linear approximation is applied to the detection result of the white line (target position data). Moreover, the target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V in accordance with a difference between the distance from the vehicle V to the target obtained on the basis of the relative position of the target and the assumed distance from the vehicle V to the target. The target position selection unit 35 determines that the reliability is higher as the aforementioned difference is smaller.
  • For example, the straight line extracting unit 34 approximates straight lines (N2 and N3) to the target position data (72 and 73) indicating the traveling lane boundaries shown in Fig. 9. The target position selection unit 35 measures the respective distances in the vehicle width direction from the vehicle V to the respective straight lines (N2 and N3). When the traveling lane width is 4 meters and the vehicle V is traveling in the center of the traveling lane, the assumed distance from the center of the vehicle V to each white line is 2 meters. When the absolute value of the difference between the distance from the vehicle V to the straight line (N2, N3) and the assumed distance (2 meters) is 1 meter or more, a margin which allows for a deviation of the position of the vehicle V within the traveling lane, a detection error, and the like, it is determined that there is a high possibility that the distance from the vehicle V to the straight line (N2, N3) is inaccurate. In the example shown in Fig. 9, the absolute value of the difference between the distance WL and the assumed distance (2 meters) is less than 1 meter, but the absolute value of the difference between the distance to the right-hand straight line N3 and the assumed distance (2 meters) is equal to or more than 1 meter. Accordingly, the target position selection unit 35 evaluates the reliability of the target position data 73 indicating the right-hand traveling lane boundary as low, and evaluates the reliability of the target position data 72 indicating the left-hand traveling lane boundary as high. The target position selection unit 35 eliminates the target position data 73 whose reliability is evaluated as low, and adopts only the target position data 72 whose reliability is evaluated as high.
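The check in this example can be sketched as follows; the 4 m lane width and 1 m threshold are taken from the text above, while the function name and the measured distances are illustrative assumptions.

```python
# Sketch: keep a lane-boundary line only if its measured lateral distance differs
# from the assumed distance (half the lane width) by less than the threshold.
def select_reliable_boundaries(measured_lateral_distances, lane_width=4.0, threshold=1.0):
    assumed = lane_width / 2.0                   # 2 m when driving in the lane center
    selected = {}
    for name, dist in measured_lateral_distances.items():
        if abs(abs(dist) - assumed) < threshold:  # small difference -> high reliability
            selected[name] = dist
    return selected

# Hypothetical measurements: left line at 1.8 m, right line at 3.4 m (likely inaccurate)
print(select_reliable_boundaries({"left_N2": 1.8, "right_N3": -3.4}))  # keeps only left_N2
```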
  • The above-mentioned reliability determination method based on the difference from the assumed distance is applicable not only to traveling lane boundaries, e.g. white lines and curbs, but also to other targets. For example, road structures such as road signs, traffic lights, and utility poles are present at side strips. Accordingly, an assumed distance can be set on the basis of the traveling lane width, and the difference between the relative distance from the vehicle V to the detected road structure and the assumed distance can be calculated.
  • The target position selection unit 35 further determines reliability based on the attribute of the target with respect to the target position data 72 selected on the basis of the above-mentioned difference with the assumed distance. More specifically, the target position selection unit 35 determines reliability of the target position data 72 on the basis of the attribute of the target estimated by the target attribute estimation unit 37, and further narrows down the target position data to be used for the self-position estimation.
  • For example, since the detectable region of a solid line is larger than that of a dashed line even when both are white lines, it can be determined that the detection accuracy (i.e., reliability) of the relative position of the solid line is relatively high. By referring to the map information, it is possible to specify in advance whether the detected white line is a solid line or a dashed line. When it turns out that the white line positioned at one side of the vehicle V is a solid line and the white line positioned at the other side is a dashed line, it is determined that the reliability of the target position data indicating the white line positioned on the one side is relatively high, even if the detection errors of both sides and the respective distances from the vehicle V are approximately the same. Consequently, the target position data indicating the white line positioned on the one side is selected.
  • The type of the white line is merely one example of an attribute of the targets, and other attributes of the targets can also be applied. For example, regarding the color of a section line, white lines are easier to detect than yellow lines, and therefore the reliability of white lines is determined to be high. Moreover, reliabilities may also be determined between different types of targets. For example, comparing stop lines and pedestrian crossings with each other, since the number of characteristic parts of a pedestrian crossing is larger than that of a stop line, the reliability of pedestrian crossings is determined to be high.
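A minimal sketch of such attribute-based ranking is shown below. The attribute names and numeric scores are invented for illustration; only the relative ordering (solid over dashed, white over yellow, crosswalk over stop line) reflects the description above.

```python
# Sketch: rank target attributes by an assumed reliability score.
ATTRIBUTE_SCORE = {
    "solid_white_line": 4,
    "dashed_white_line": 3,
    "pedestrian_crossing": 3,
    "yellow_line": 2,
    "stop_line": 1,
}

def more_reliable(attr_a: str, attr_b: str) -> str:
    """Return the attribute whose target position data should be preferred."""
    return attr_a if ATTRIBUTE_SCORE[attr_a] >= ATTRIBUTE_SCORE[attr_b] else attr_b

print(more_reliable("solid_white_line", "dashed_white_line"))  # solid_white_line
print(more_reliable("stop_line", "pedestrian_crossing"))       # pedestrian_crossing
```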
  • The target position selection unit 35 further determines the reliability of the target position data on the basis of a time period when the target position data can be continuously detected, with regard to the target position data selected on the basis of the above-mentioned attribute of the target.
  • Not only white lines but also other targets present in general environments cannot always be continuously detected with constant reliability, due to aging degradation, occlusion, and other effects. Moreover, in a case of adopting a sensor fusion system that covers different directions by means of a plurality of sensors, the detection in only a certain direction may be consistently uncertain. Therefore, the information on the white line or the traveling lane boundary is evaluated together with the detection time period thereof. Then, only when it is continuously detected for a certain time period (e.g., 10 seconds) or more, it is determined that the reliability thereof is high and this target position data should be selected.
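One way to implement this continuity test is sketched below, assuming per-frame detection flags with timestamps; the 10-second threshold comes from the example above, the data format is an assumption.

```python
# Sketch: accept a boundary only if it has been detected without interruption
# for at least min_duration seconds.
def continuously_detected(detections, min_duration=10.0):
    """detections: list of (timestamp_s, detected_bool), oldest first."""
    run_start = None
    for t, detected in detections:
        if detected:
            if run_start is None:
                run_start = t
            if t - run_start >= min_duration:
                return True
        else:
            run_start = None                 # the continuous run is broken
    return False

frames = [(t * 0.5, True) for t in range(25)]  # 12 s of uninterrupted detections
print(continuously_detected(frames))           # True
```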
  • The target position selection unit 35 further determines the reliability of the target position data on the basis of distribution of errors when the target position data of the traveling lane boundary is linearly approximated, with respect to the target position data selected on the basis of the above-mentioned continuous detection time period. In other words, the target position selection unit 35 determines the reliability of the target position data on the basis of the linear information (approximation straight line) extracted by the straight line extracting unit 34, and further narrows down the target position data to be used for the self-position estimation.
  • The target position selection unit 35 determines whether or not a plurality of parallel traveling lane boundaries are detected as target position data indicating traveling lane boundaries (e.g., white lines) which specify the traveling lane on which the vehicle V is traveling. Moreover, when a plurality of parallel traveling lane boundaries are detected, the target position selection unit 35 highly evaluates the reliability of the target position data in a range which can be approximated with the straight line among the detection results of the white lines (target position data), and selects the highly-evaluated target position data as target position data to be used for the self-position estimation. For example, as shown in Fig. 10, target position data (72j, 72k, 73j, and 73k) indicating a plurality of parallel traveling lane boundaries, which specify the traveling lane on which the vehicle V is traveling, are detected. The straight line extracting unit 34 applies a linear approximation to the target position data indicating the traveling lane boundary. The target position selection unit 35 selects the target position data (72j and 73j) included in the range LA which can be approximated with the straight lines, among the target position data (72j, 72k, 73j, and 73k). At this time, the target position selection unit 35 expands the range LA, which can be approximated with the straight lines, outward from the vehicle V. For example, a section in which 80% or more of the target position data have a minimum distance to the approximate line within a range from -15 cm to +15 cm is set as the range LA that can be approximated with the straight lines. On the other hand, the target position data (72k and 73k) which are not included in the range LA that can be approximated with the straight lines are eliminated.
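A rough sketch of this range expansion is given below. The ±15 cm tolerance and 80% inlier ratio come from the example above; the 5 m section length, the choice to fit the line on near-vehicle points, and all names and data are assumptions.

```python
# Sketch: fit a line to the boundary data, then grow the accepted range LA outward
# from the vehicle section by section while >= 80% of the points in the section lie
# within +/-15 cm of the fitted line.
import numpy as np

def approximable_range(points, section_len=5.0, tol=0.15, min_inlier_ratio=0.8, fit_range=10.0):
    pts = np.asarray(points)                         # (x_forward, y_lateral) in vehicle frame
    near = pts[pts[:, 0] <= fit_range]               # assumption: fit on near-vehicle points
    a, b = np.polyfit(near[:, 0], near[:, 1], 1)     # linear approximation y = a*x + b
    x_end = 0.0
    while True:
        mask = (pts[:, 0] >= x_end) & (pts[:, 0] < x_end + section_len)
        section = pts[mask]
        if len(section) == 0:
            break
        inlier_ratio = np.mean(np.abs(section[:, 1] - (a * section[:, 0] + b)) <= tol)
        if inlier_ratio < min_inlier_ratio:
            break                                    # stop expanding the range LA here
        x_end += section_len
    return x_end                                     # forward extent of the range LA

xs = np.arange(0.0, 30.0, 0.5)
ys = 0.02 * xs + 2.0
ys[xs > 20.0] += 0.6                                 # boundary data drifts off beyond 20 m
print(approximable_range(np.column_stack([xs, ys])))  # -> 20.0
```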
  • In addition, as shown in Fig. 11, when the traveling lane on which the vehicle V is traveling is a curve section, the approximate line is not always a straight line. In this case, the straight line extracting unit 34 executes a curve approximation instead of the linear approximation (straight line approximation). The target position selection unit 35 highly evaluates the reliability of the target position data (72m and 73m) included in the range LB which can be approximated with the curved lines (N2 and N3), and selects the highly-evaluated target position data as target position data to be used for the self-position estimation. On the other hand, the target position data (72n and 73n) which are not included in the range LB that can be approximated with the curved lines are eliminated.
  • In the present embodiment, there has been shown an example in which the target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V in the order of (1) the difference between the distance from the vehicle V to the target and the assumed distance, (2) the attribute of the target, (3) the continuous detection time period, and (4) the distribution of errors when the target position data indicating the traveling lane boundary is linearly approximated. The present invention is not limited to such an example, and the sequence of the reliability determinations can be arbitrarily changed. Alternatively, only a part of the determination processes (1) to (4) may be executed. Furthermore, a comprehensive evaluation may be executed by quantifying each reliability determination. For example, in each reliability determination, evaluation points may be given in multiple stages and added together, and thereby a total evaluation point may be calculated. Consequently, the reliability of the detected target can be quantified and determined.
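The comprehensive evaluation mentioned above could look like the hedged sketch below: each of the four checks returns evaluation points in multiple stages, and their sum is the total evaluation point for a piece of target position data. All thresholds, scores, and names are invented for illustration.

```python
# Sketch: combine the four reliability checks into one total evaluation point.
def distance_points(diff_m):        # (1) difference from the assumed distance
    return 3 if diff_m < 0.3 else 2 if diff_m < 0.6 else 1 if diff_m < 1.0 else 0

def attribute_points(attribute):    # (2) attribute of the target
    return {"solid_white_line": 3, "dashed_white_line": 2, "yellow_line": 1}.get(attribute, 0)

def duration_points(seconds):       # (3) continuous detection time period
    return 3 if seconds >= 10 else 1 if seconds >= 5 else 0

def fit_error_points(rms_error_m):  # (4) error distribution of the linear approximation
    return 3 if rms_error_m < 0.05 else 2 if rms_error_m < 0.15 else 0

def total_evaluation(diff_m, attribute, seconds, rms_error_m):
    return (distance_points(diff_m) + attribute_points(attribute)
            + duration_points(seconds) + fit_error_points(rms_error_m))

print(total_evaluation(0.2, "solid_white_line", 12.0, 0.04))   # 12 -> high reliability
print(total_evaluation(1.4, "yellow_line", 3.0, 0.20))         # 1  -> low reliability
```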
  • Next, proceeding to Step S13, the self-position estimation unit 36 compares the target position data selected by the target position selection unit 35 with the position of the target in the map information 41. In other words, the position of the target in the map information 41 and the target position data determined so as to have high reliability by the target position selection unit 35 are matched with each other.
  • Proceeding to Step S15, the self-position estimation unit 36 estimates the self-position of the vehicle V by executing the above-mentioned comparison (map matching) of the positions of the target. More specifically, the self-position estimation unit 36 estimates a position and an attitude angle with a total of three degrees of freedom, composed of a position in the east-west direction of the vehicle V (X coordinate), a position in the north-south direction thereof (Y coordinate), and an azimuth angle (yaw angle θ). A known self-position estimation method may be used as the method of estimating the position on the map. Proceeding to Step S17, the self-position estimation unit 36 outputs the estimated self-position of the vehicle V.
  • In addition, an Iterative Closest Point (ICP) algorithm can be used for the comparison in Step S13. At this time, for section lines, for example, the self-position estimation unit 36 matches the endpoints at both ends thereof as evaluation points, among the positions of the target included in the map information 41. Moreover, since the target position data is less affected by odometry errors the closer it is to the vehicle V (surrounding sensor group 1), the self-position estimation unit 36 can increase the number of evaluation points for targets in the vicinity of the vehicle V by linearly interpolating the target, and can decrease the number of evaluation points for targets far from the vehicle V.
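The distance-dependent density of evaluation points can be sketched as below: points are linearly interpolated between the two endpoints of a section line, more densely when the line is close to the vehicle. The spacing rule and all numeric values are assumptions; the full ICP step itself is omitted.

```python
# Sketch: generate ICP evaluation points along a section line, denser near the vehicle.
import numpy as np

def evaluation_points(endpoint_a, endpoint_b, near_spacing=0.5, far_spacing=2.0, far_dist=30.0):
    a, b = np.asarray(endpoint_a, float), np.asarray(endpoint_b, float)
    mid_dist = np.linalg.norm((a + b) / 2.0)      # distance of the segment from the vehicle
    ratio = min(mid_dist / far_dist, 1.0)
    spacing = near_spacing + ratio * (far_spacing - near_spacing)
    n = max(int(np.linalg.norm(b - a) / spacing), 1)
    t = np.linspace(0.0, 1.0, n + 1)              # both endpoints are always included
    return a + t[:, None] * (b - a)

print(len(evaluation_points((2.0, 1.8), (12.0, 1.8))))    # near segment -> many points
print(len(evaluation_points((80.0, 1.8), (90.0, 1.8))))   # far segment  -> few points
```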
  • As mentioned above, according to the embodiments, the following operations and effects can be obtained.
  • Since the target position data is selected on the basis of the reliability of the relative position of the target position data with respect to the vehicle V, the target position data estimated to have large errors in the relative position can be eliminated, and thereby the estimation accuracy of the self-position is improved.
  • The target position selection unit 35 determines that the reliability of the relative position of the target position data with respect to the vehicle V is higher as the difference between the distance from the vehicle V to the target and the assumed distance is smaller. As a result, since the target position data estimated to have a large error in the relative position can be appropriately eliminated, the estimation accuracy of the self-position can be improved.
  • The target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V on the basis of the attribute of the target. For example, comparing the solid line and the dashed line of white lines with each other, the target position selection unit 35 determines that the solid line, for which target position data can be obtained steadily, is more reliable than the dashed line. Accordingly, since the target position data estimated to have large errors in the relative position can be appropriately identified, the estimation accuracy of the self-position can be improved.
  • The target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V on the basis of the time period when the target position data can be continuously detected. As a result, it is possible to stably and accurately estimate the self-position.
  • The target position selection unit 35 selects target position data having high reliability of the relative position with respect to the vehicle V, when target position data indicating a plurality of parallel traveling lane boundaries, which specify a traveling lane on which the vehicle V travels, is detected. Consequently, the traveling lane boundary having high accuracy of position can be selected, and the accuracy of the self-position estimation becomes higher.
  • The target position selection unit 35 determines that the reliability of the relative position, with respect to the vehicle V, of the target position data indicating the traveling lane boundary is higher as the error from the approximate line when approximating the traveling lane boundary is smaller. Consequently, the traveling lane boundary whose position is detected with high accuracy can be selected, and the estimation accuracy of the self-position becomes even higher.
  • The embodiments of the present invention have been described above; this disclosure, including the associated description and drawings, is to be construed as illustrative and not restrictive. From this disclosure, a variety of alternative embodiments, working examples, and operational techniques will be apparent to those skilled in the art.
  • The moving object is not limited to the vehicle V, which moves on land; the moving object also includes vessels, aircraft, spacecraft, and other moving objects.
  • The functions described in the respective embodiments may be implemented in one or more processing circuits. Such a processing circuit includes a programmed processing device such as a processing device including an electric circuit. Moreover, the processing device includes an Application Specific Integrated Circuit (ASIC) and/or a device such as a conventional circuit component, configured to execute the functions described in the respective embodiments.
  • REFERENCE SIGNS LIST
  • 1
    Surrounding sensor group (Target detection sensor)
    31
    Target position detection unit (Target detection sensor)
    32
    Moved amount estimation unit (Self-position estimation circuit)
    33
    Target position storing unit (Self-position estimation circuit)
    34
    Straight line extracting unit (Self-position estimation circuit)
    35
    Target position selection unit (Self-position estimation circuit)
    36
    Self-position estimation unit (Self-position estimation circuit)
    37
    Target attribute estimation unit (Self-position estimation circuit)
    41
    Map information
    61
    Curb (Target)
    62, 63
    White line (Target)
    72j, 72k, 72m, 72n
    Target position data
    73j, 73k, 73n, 73m
    Target position data
    M1 to M4
    Moved amount of moving object
    N1, N2, N3
    Line approximating a plurality of target position data
    V
    Vehicle (Moving object)

Claims (7)

  1. A self-position estimation method using a target detection sensor (1, 31) and a self-position estimation circuit (32, 33, 34, 35, 36, 37), wherein
    the target detection sensor (1, 31) is mounted in a moving object (V), the target detection sensor (1, 31) is configured to detect relative positions between targets (61, 62, 63) present in surroundings of the moving object (V) and the moving object (V);
    the self-position estimation circuit (32, 33, 34, 35, 36, 37) is configured (i) to convert the relative positions into positions of the targets in an odometry coordinate system on the basis of the moved amount and (ii) to store the converted position data as target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m); and
    the self-position estimation method comprises:
    extracting target position data (71a to 71d, 72a to 72d, and 73a to 73d) indicating a traveling lane boundary on the basis of a plurality of the stored target position data;
    selecting target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) to be compared with map information (41) from the target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) on the basis of reliability of the relative position of the extracted target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) with respect to the moving object (V); and
    comparing the selected target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) with map information (41) including position information on the target (61, 62, 63) present on a road or around the road, thereby estimating a self-position which is a current position of the moving object (V).
  2. The self-position estimation method according to claim 1, wherein the reliability is determined such that the reliability is higher as a difference between a distance from the moving object (V) to the target (61, 62, 63) obtained from the relative position and an assumed distance from the moving object (V) to the target (61, 62, 63) is smaller.
  3. The self-position estimation method according to claim 1 or 2, wherein the reliability is determined on the basis of an attribute of the target (61, 62, 63).
  4. The self-position estimation method according to any one of claims 1 to 3, wherein the reliability is determined on the basis of a time period when the target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) can be continuously detected.
  5. The self-position estimation method according to any one of claims 1 to 4, wherein the target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) having high reliability of the relative position with respect to the moving object (V) is selected, when target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) indicating a plurality of parallel traveling lane boundaries, which specify a traveling lane on which the moving object (V) travels, is detected.
  6. The self-position estimation method according to claim 5, wherein the reliability is determined such that the reliability of the relative position of the target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) indicating the traveling lane boundary with respect to the moving object (V) is higher, as an error between a line approximating a plurality of the target position data (N1, N2, N3) indicating the traveling lane boundary and the plurality of the target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) is smaller.
  7. A self-position estimation device comprising:
    a target detection sensor (1, 31) mounted in a moving object (V), the target detection sensor (1, 31) configured to detect relative positions between targets (61, 62, 63) present in surroundings of the moving object (V) and the moving object (V);
    the self-position estimation device further comprising
    a self-position estimation circuit (32, 33, 34, 35, 36, 37) configured
    (i) to convert the relative positions into positions of the targets in an odometry coordinate system on the basis of the moved amount;
    (ii) to store the converted position data as target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m);
    (iii) to extract target position data (71a to 71d, 72a to 72d, and 73a to 73d) indicating a traveling lane boundary on the basis of a plurality of the stored target position data;
    (iv) to select target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) to be compared with map information (41) from the target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) on the basis of reliability of the relative position of the extracted target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) with respect to the moving object (V); and
    (v) to compare the selected target position data (72j, 72k, 72m, 72n, 73j, 73k, 73n, 73m) with map information (41) including the position information on the target (61, 62, 63) present on a road or around the road, thereby estimating a self-position which is a current position of the moving object (V).
EP16917637.7A 2016-09-27 2016-09-27 Self-position estimation method and self-position estimation device Active EP3521962B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/078428 WO2018061084A1 (en) 2016-09-27 2016-09-27 Self-position estimation method and self-position estimation device

Publications (3)

Publication Number Publication Date
EP3521962A4 EP3521962A4 (en) 2019-08-07
EP3521962A1 EP3521962A1 (en) 2019-08-07
EP3521962B1 true EP3521962B1 (en) 2021-07-21

Family

ID=61759367

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16917637.7A Active EP3521962B1 (en) 2016-09-27 2016-09-27 Self-position estimation method and self-position estimation device

Country Status (10)

Country Link
US (1) US11321572B2 (en)
EP (1) EP3521962B1 (en)
JP (1) JP6881464B2 (en)
KR (1) KR20190045220A (en)
CN (1) CN109791408B (en)
BR (1) BR112019006057B1 (en)
CA (1) CA3038643A1 (en)
MX (1) MX2019002985A (en)
RU (1) RU2720140C1 (en)
WO (1) WO2018061084A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6421782B2 (en) * 2016-04-22 2018-11-14 トヨタ自動車株式会社 Peripheral information collection system
US11157752B2 (en) * 2017-03-29 2021-10-26 Pioneer Corporation Degraded feature identification apparatus, degraded feature identification system, degraded feature identification method, degraded feature identification program, and computer-readable recording medium recording degraded feature identification program
US10750953B1 (en) 2018-05-11 2020-08-25 Arnold Chase Automatic fever detection system and method
US11294380B2 (en) 2018-05-11 2022-04-05 Arnold Chase Passive infra-red guidance system
US10467903B1 (en) 2018-05-11 2019-11-05 Arnold Chase Passive infra-red pedestrian detection and avoidance system
US11062608B2 (en) 2018-05-11 2021-07-13 Arnold Chase Passive infra-red pedestrian and animal detection and avoidance system
US11847838B2 (en) * 2018-09-25 2023-12-19 Hitachi Astemo, Ltd. Recognition device
JP7332403B2 (en) * 2019-09-11 2023-08-23 株式会社東芝 Position estimation device, mobile control system, position estimation method and program
JP7409330B2 (en) 2021-01-28 2024-01-09 トヨタ自動車株式会社 Self-position estimation accuracy verification method, self-position estimation system
JP2023039626A (en) * 2021-09-09 2023-03-22 日立Astemo株式会社 On-vehicle processing device, vehicle control device and self-position estimation method

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3029360B2 (en) 1993-05-21 2000-04-04 三菱電機株式会社 Automotive white line detector
US6581005B2 (en) * 2000-11-30 2003-06-17 Nissan Motor Co., Ltd. Vehicle position calculation apparatus and method
KR100495635B1 (en) * 2002-09-02 2005-06-16 엘지전자 주식회사 Method for correcting position error in navigation system
JP4392389B2 (en) * 2005-06-27 2009-12-24 本田技研工業株式会社 Vehicle and lane recognition device
US7912633B1 (en) * 2005-12-01 2011-03-22 Adept Mobilerobots Llc Mobile autonomous updating of GIS maps
JP4978099B2 (en) 2006-08-03 2012-07-18 トヨタ自動車株式会社 Self-position estimation device
JP2008250906A (en) 2007-03-30 2008-10-16 Sogo Keibi Hosho Co Ltd Mobile robot, and self-location correction method and program
JP4985166B2 (en) * 2007-07-12 2012-07-25 トヨタ自動車株式会社 Self-position estimation device
JP4254889B2 (en) * 2007-09-06 2009-04-15 トヨタ自動車株式会社 Vehicle position calculation device
JP4950858B2 (en) * 2007-11-29 2012-06-13 アイシン・エィ・ダブリュ株式会社 Image recognition apparatus and image recognition program
US9026315B2 (en) * 2010-10-13 2015-05-05 Deere & Company Apparatus for machine coordination which maintains line-of-site contact
JP5142047B2 (en) * 2009-02-26 2013-02-13 アイシン・エィ・ダブリュ株式会社 Navigation device and navigation program
US8306269B2 (en) * 2009-03-12 2012-11-06 Honda Motor Co., Ltd. Lane recognition device
JP5206740B2 (en) * 2010-06-23 2013-06-12 株式会社デンソー Road shape detection device
KR101472615B1 (en) * 2010-12-21 2014-12-16 삼성전기주식회사 System and method for warning lane departure
JP2012194860A (en) * 2011-03-17 2012-10-11 Murata Mach Ltd Traveling vehicle
WO2014038041A1 (en) * 2012-09-06 2014-03-13 株式会社東芝 Position detection device, position detection method and position detection program
JP6325806B2 (en) 2013-12-06 2018-05-16 日立オートモティブシステムズ株式会社 Vehicle position estimation system
EP2918974B1 (en) * 2014-03-11 2019-01-16 Volvo Car Corporation Method and system for determining a position of a vehicle
JP6185418B2 (en) 2014-03-27 2017-08-23 トヨタ自動車株式会社 Runway boundary line detector
KR102255432B1 (en) * 2014-06-17 2021-05-24 팅크웨어(주) Electronic apparatus and control method thereof
JP6397663B2 (en) * 2014-06-18 2018-09-26 シャープ株式会社 Self-propelled electronic device
KR20160002178A (en) * 2014-06-30 2016-01-07 현대자동차주식회사 Apparatus and method for self-localization of vehicle
JP6189815B2 (en) 2014-10-29 2017-08-30 株式会社Soken Traveling line recognition system
WO2016093028A1 (en) 2014-12-08 2016-06-16 日立オートモティブシステムズ株式会社 Host vehicle position estimation device
EP3032221B1 (en) * 2014-12-09 2022-03-30 Volvo Car Corporation Method and system for improving accuracy of digital map data utilized by a vehicle
CN111351494A (en) * 2015-02-10 2020-06-30 御眼视觉技术有限公司 Navigation system and computer readable medium
CN105718860B (en) * 2016-01-15 2019-09-10 武汉光庭科技有限公司 Localization method and system based on driving safety map and binocular Traffic Sign Recognition
CN105929364B (en) * 2016-04-22 2018-11-27 长沙深之瞳信息科技有限公司 Utilize the relative position measurement method and measuring device of radio-positioning
US10346797B2 (en) * 2016-09-26 2019-07-09 Cybernet Systems, Inc. Path and load localization and operations supporting automated warehousing using robotic forklifts or other material handling vehicles

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
BR112019006057B1 (en) 2022-12-06
US11321572B2 (en) 2022-05-03
MX2019002985A (en) 2019-07-04
KR20190045220A (en) 2019-05-02
WO2018061084A1 (en) 2018-04-05
CN109791408B (en) 2022-04-26
US20200019792A1 (en) 2020-01-16
BR112019006057A2 (en) 2019-06-18
RU2720140C1 (en) 2020-04-24
CN109791408A (en) 2019-05-21
EP3521962A4 (en) 2019-08-07
EP3521962A1 (en) 2019-08-07
JP6881464B2 (en) 2021-06-02
JPWO2018061084A1 (en) 2019-07-04
CA3038643A1 (en) 2018-04-05

Similar Documents

Publication Publication Date Title
EP3521962B1 (en) Self-position estimation method and self-position estimation device
JP7301909B2 (en) Determination of yaw error from map data, lasers, and cameras
JP6533619B2 (en) Sensor calibration system
EP3306429B1 (en) Position estimation device and position estimation method
Tao et al. Mapping and localization using GPS, lane markings and proprioceptive sensors
US9151626B1 (en) Vehicle position estimation system
US9863775B2 (en) Vehicle localization system
CN111856491B (en) Method and apparatus for determining geographic position and orientation of a vehicle
KR102627453B1 (en) Method and device to estimate position
KR20200028648A (en) Method for adjusting an alignment model for sensors and an electronic device performing the method
GB2370706A (en) Determining the position of a vehicle
US20200133301A1 (en) Method and System for Determining an Accurate Position of An Autonomous Vehicle
JP6834401B2 (en) Self-position estimation method and self-position estimation device
JP6989284B2 (en) Vehicle position estimation device and program
JP6838365B2 (en) Self-position estimation method and self-position estimation device
US10249056B2 (en) Vehicle position estimation system
CN114730014A (en) Object recognition device and object recognition method
WO2022196709A1 (en) Autonomous vehicle
JP6972528B2 (en) Self-position estimation method, mobile vehicle travel control method, self-position estimation device, and mobile vehicle travel control device
JP2023053891A (en) Own position estimation apparatus and own position estimation method
JP2022112754A (en) Self-traveling vehicle

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190412

A4 Supplementary search report drawn up and despatched

Effective date: 20190703

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200324

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210310

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602016061142

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1413157

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210815

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20210721

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1413157

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210721

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211122

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211021

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211021

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211022

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602016061142

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20210930

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

26N No opposition filed

Effective date: 20220422

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210927

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210927

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210930

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210721

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20160927

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230823

Year of fee payment: 8

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602016061142

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G05D0001020000

Ipc: G05D0001430000

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230822

Year of fee payment: 8

Ref country code: DE

Payment date: 20230822

Year of fee payment: 8