WO2017037752A1 - Vehicle position estimation device and vehicle position estimation method - Google Patents
Vehicle position estimation device and vehicle position estimation method
- Publication number
- WO2017037752A1 (PCT/JP2015/004382)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- target
- target position
- point
- turning point
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/14—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by recording the course traversed by the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
Definitions
- the present invention relates to a vehicle position estimation device and a vehicle position estimation method.
- the prior art described in Patent Document 1 is a method in which an autonomously moving mobile robot estimates its own position from its amount of movement and corrects the estimated self-position by comparing a passage detected by laser scanning with previously acquired map information.
- the detected passage and map information are treated as two-dimensional data in plan view, and only the data within a predetermined range from the current position is used for collation.
- a white line extending in a straight line serves as a reference in the vehicle width direction but not in the traveling direction, so the position in the traveling direction cannot be verified against it.
- a vehicle position estimation device detects the positions of targets existing around the vehicle, detects the amount of movement of the vehicle, and accumulates the target positions, based on the amount of movement, as target position data.
- map information including the position of the target is acquired, and the position of the vehicle is estimated by collating the target position in the map information with the target position data.
- the turning point of the vehicle is detected from the moving amount of the vehicle. Then, at least target position data in a range that goes back by a predetermined first set distance from the current position and target position data in a range that goes back by a predetermined second set distance from the turning point are held.
- since the target position data in the range traced back from the current position by the first set distance and the target position data in the range traced back from the turning point by the second set distance are retained, the self-position can be estimated by collating this target position data with the map information, with the turning point serving as a base point.
- the data amount of the target position data can be adjusted appropriately.
- FIG. 1 is a configuration diagram of a vehicle position estimation device.
- the vehicle position estimation device 11 estimates the self position of the vehicle, and includes a radar device 12, a camera 13, a map database 14, a sensor group 15, and a controller 16.
- FIG. 2 is a diagram illustrating the arrangement of the radar apparatus and the camera.
- FIG. 3 is a diagram illustrating a scanning range of the radar apparatus and an imaging range of the camera.
- the radar device 12 is composed of, for example, an LRF (Laser Range Finder), measures the distance and azimuth of an object existing on the side of the vehicle 21, and outputs the measured data to the controller 16.
- the radar device 12 is provided at a total of two locations on the left side surface and the right side surface of the vehicle 21.
- the radar device provided on the left side surface of the vehicle 21 is referred to as a left radar device 12L
- the radar device provided on the right side surface of the vehicle 21 is referred to as a right radar device 12R.
- the left radar device 12L scans from below toward the left of the vehicle
- the right radar device 12R scans from below toward the right.
- each has a rotation axis in the longitudinal direction of the vehicle body and scans in a direction perpendicular to the axis.
- the curbstone 23 is provided on the shoulder of the road along the traveling lane as a boundary line between the roadway and the sidewalk.
- the camera 13 is a wide-angle camera using, for example, a CCD (Charge Coupled Device) image sensor, images the side of the vehicle 21, and outputs the captured data to the controller 16.
- the cameras 13 are provided at two locations in the vehicle 21, that is, a left door mirror and a right door mirror.
- the camera provided on the left door mirror is referred to as the left side camera 13L
- the camera provided on the right door mirror is referred to as the right side camera 13R.
- the left side camera 13L images the road surface 22 on the left side of the vehicle 21, and the right side camera 13R images the road surface 22 on the right side of the vehicle 21. Thereby, the traffic dividing line 24 beside the vehicle body is detected.
- the traffic division line 24 is a lane marking such as a white line painted on the road surface 22 in order to mark a travel lane (vehicle traffic zone) that the vehicle 21 should pass through, and is marked along the travel lane.
- the traffic dividing line 24 is drawn three-dimensionally for convenience. However, since the thickness can actually be regarded as zero, it is assumed to be the same plane as the road surface 22.
- the map database 14 acquires road map information.
- the road map information includes position information of the curb 23 and the traffic dividing line 24.
- the curb 23 is an object having a height, but is acquired as two-dimensional data in plan view.
- the curbstone 23 and the traffic dividing line 24 are converted into data as sets of straight lines; each straight line is acquired as the position information of its two end points, and curved sections are approximated by broken lines (polylines) and treated as straight lines.
- the map database 14 may be a storage medium that stores road map information for a car navigation system, or the map information may be acquired from the outside through a communication system such as wireless communication (road-to-vehicle or vehicle-to-vehicle communication is also possible).
- the map database 14 may periodically acquire the latest map information and update the held map information. Moreover, the map database 14 may accumulate
- the sensor group 15 includes, for example, a GPS receiver, an accelerator sensor, a steering angle sensor, a brake sensor, a vehicle speed sensor, an acceleration sensor, a wheel speed sensor, a yaw rate sensor, and the like, and outputs each detected data to the controller 16.
- the GPS receiver acquires current position information of the vehicle 21.
- the accelerator sensor detects the amount of operation of the accelerator pedal.
- the steering angle sensor detects the operation amount of the steering wheel.
- the brake sensor detects the operation amount of the brake pedal and the pressure in the brake booster.
- the vehicle speed sensor detects the vehicle speed.
- the acceleration sensor detects acceleration / deceleration and lateral acceleration in the vehicle longitudinal direction.
- the wheel speed sensor detects the wheel speed of each wheel.
- the yaw rate sensor detects the yaw rate of the vehicle.
- the controller 16 is composed of, for example, an ECU (Electronic Control Unit) including a CPU, a ROM, a RAM, and the like; a program that executes various arithmetic processes such as the vehicle position estimation process described later is recorded in the ROM. Note that a dedicated controller 16 may be provided for the vehicle position estimation process, or the controller may be shared with controllers used for other control.
- the controller 16 includes a target position detection unit 31, a movement amount detection unit 32, a target position storage unit 33, a turning point detection unit 34, and a self-position estimation unit 35 as functional block configurations.
- the target position detection unit 31 detects the positions of targets such as the curb stone 23 and the traffic lane markings 24 existing around the vehicle as a relative position with respect to the vehicle in a vehicle coordinate system based on the vehicle.
- FIG. 4 is a diagram illustrating a vehicle coordinate system.
- the vehicle coordinate system is a two-dimensional coordinate in a plan view.
- the center of the rear wheel axle of the vehicle 21 is the origin O
- the front-rear direction is the X VHC axis
- the left-right direction is the Y VHC axis.
- Expressions for converting the coordinate system of the radar device 12 and the coordinate system of the camera 13 into the vehicle coordinate system are obtained in advance. Further, the parameters of the road surface 22 in the vehicle coordinate system are also known.
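The conversion from a sensor coordinate system into the vehicle coordinate system is, in plan view, a 2D rigid transform. A minimal sketch follows; the mounting position and orientation values used here are hypothetical, not taken from the patent.

```python
import math

def sensor_to_vehicle(p_sensor, mount_xy, mount_yaw):
    """Map a 2D point from a sensor frame into the vehicle coordinate
    system (origin at the rear-axle centre, X_VHC forward, Y_VHC left).
    mount_xy / mount_yaw are the sensor's pose in the vehicle frame;
    the concrete values below are illustrative assumptions."""
    c, s = math.cos(mount_yaw), math.sin(mount_yaw)
    x, y = p_sensor
    return (mount_xy[0] + c * x - s * y,
            mount_xy[1] + s * x + c * y)

# e.g. a left-side sensor assumed at (1.0, 0.8) on the body, facing
# left (+90 degrees); a point 2 m ahead of the sensor maps to a point
# 2.8 m to the left of the rear axle.
p_vhc = sensor_to_vehicle((2.0, 0.0), (1.0, 0.8), math.pi / 2)
```

The same rotation-plus-translation form covers both the radar device 12 and the camera 13 once each device's mounting pose is calibrated.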
- the radar device 12 performs laser scanning on the road surface 22 toward the outside in the vehicle width direction, and detects a position where a large change (step) in the height occurs as an end point on the roadway side in the width direction of the curb stone 23. That is, the position of the curb 23 is detected from the three-dimensional data and projected onto the two-dimensional vehicle coordinate system.
- the detection point of the curb stone 23 is Pc and is displayed with a black diamond.
- the road surface 22 is imaged by the camera 13, and a pattern in which the luminance changes from dark to bright and then from bright to dark along the left-right direction of the vehicle body is extracted from the grayscale image; the traffic dividing line 24 is thereby detected. For example, the center point in the width direction of the traffic dividing line 24 is taken as the detection point.
- the image data captured by the camera 13 is overhead-converted, and the traffic marking line 24 is detected from the overhead image and projected onto the vehicle coordinate system.
- the detection point of the traffic dividing line 24 is set to Pw, and is displayed with a black circle.
- the movement amount detection unit 32 detects odometry, which is the movement amount of the vehicle 21 in a unit time, from various information detected by the sensor group 15. By integrating this, the traveling locus of the vehicle can be calculated as an odometry coordinate system.
- FIG. 5 is a diagram showing an odometry coordinate system. In the odometry coordinate system, for example, the vehicle position at the time when the system is turned on or off is set as the coordinate origin, and the vehicle body posture (azimuth angle) at that time is set to 0 degrees.
- a travel locus is detected by accumulating three parameters, the vehicle coordinate position [X ODM , Y ODM ] and the vehicle body posture [θ ODM ], for each calculation cycle.
- in FIG. 5, the coordinate positions of the vehicle and the vehicle body postures at time points t1 to t4 are depicted.
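The accumulation of the three odometry parameters per calculation cycle can be sketched with a simple dead-reckoning motion model; the model and variable names are illustrative assumptions, not taken from the patent.

```python
import math

def integrate_odometry(pose, v, yaw_rate, dt):
    """Advance one odometry cycle.  pose is (X_ODM, Y_ODM, theta_ODM)
    with theta in radians; v is vehicle speed [m/s], yaw_rate [rad/s],
    dt the cycle time [s].  A minimal dead-reckoning sketch."""
    x, y, th = pose
    x += v * dt * math.cos(th)   # advance along the current heading
    y += v * dt * math.sin(th)
    th += yaw_rate * dt          # update the vehicle body posture
    return (x, y, th)

# origin at system start, heading 0; integrate 10 s straight at 10 m/s
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_odometry(pose, 10.0, 0.0, 0.1)
```

Summing these per-cycle increments is exactly what lets the travel locus be drawn in the odometry coordinate system, and is also why the cumulative error mentioned later grows with travel distance.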
- the accumulated target position data may be coordinate-converted every time with the current vehicle position as the origin. That is, the target position data may be accumulated in the same coordinate system.
- the target position storage unit 33 stores the travel locus based on the movement amount detected by the movement amount detection unit 32 and the position of the target detected by the target position detection unit 31 in association with each other in the odometry coordinate system.
- FIG. 6 is a diagram illustrating target positions in the vehicle coordinate system. Here, the positions of the targets detected by the target position detection unit 31 in the vehicle coordinate system from time t1 to t4 are shown. As targets, the detection point Pc of the curb 23 on the left side of the vehicle 21, the detection point Pw of the traffic dividing line 24 on the left side of the vehicle 21, and the detection point Pw of the traffic dividing line 24 on the right side of the vehicle 21 are detected. Due to the movement and posture change of the vehicle 21, the position of each target in the vehicle coordinate system changes from moment to moment.
- FIG. 7 is a diagram in which the travel locus based on the movement amount of the vehicle is associated with the target positions. The position of each target at each time point is projected onto the odometry coordinate system according to the coordinate position and vehicle body posture of the vehicle at time points t1 to t4. That is, at each time point, the detection point Pc of the curb 23 on the left side of the vehicle 21, the detection point Pw of the traffic dividing line 24 on the left side, and the detection point Pw of the traffic dividing line 24 on the right side are projected.
- whether each target is on the left side or the right side of the vehicle 21 is determined from the sign of its Y VHC coordinate in the vehicle coordinate system. After dividing the point group into the left and right sides of the vehicle 21 in this way, straight lines are extracted and their parameters a, b, and c are obtained.
- the straight line L23 is extracted from the detection point Pc of the curb 23 existing on the left side of the vehicle 21 at time points t1 to t4. Further, from time t1 to t4, a straight line L24 is extracted from the detection point Pw of the traffic dividing line 24 existing on the left side of the vehicle 21. Further, from time t1 to time t4, a straight line L24 is extracted from the detection point Pw of the traffic dividing line 24 existing on the right side of the vehicle 21.
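The patent does not specify how the parameters a, b, c are computed; presuming they describe a line a·x + b·y + c = 0, one common choice for fitting a divided point group is a total-least-squares fit. This sketch, including the function name, is an assumption:

```python
import numpy as np

def fit_line(points):
    """Fit a*x + b*y + c = 0 to 2D points by total least squares:
    the unit normal (a, b) is the right singular vector of the
    centred point set with the smallest singular value."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]                       # unit normal to the line
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

# e.g. curb detection points lying on the horizontal line y = 2
a, b, c = fit_line([(0.0, 2.0), (1.0, 2.0), (3.0, 2.0)])
```

Total least squares is preferable here to ordinary regression of y on x because curb and lane lines can run nearly parallel to either vehicle axis.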
- the turning point detection unit 34 refers to the travel locus and, going back from the current position Pn, detects the first point where the turning angle θt of the vehicle becomes equal to or larger than a predetermined set angle θ1 as the turning point Pt1.
- the turning angle θt of the vehicle is the amount of change in posture up to the current vehicle body posture in the odometry coordinate system, that is, the angle difference of the vehicle body with respect to the current vehicle body orientation.
- the initial value of the set angle θ1 is, for example, 60 degrees.
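A minimal sketch of this backward search over the travel locus; the data layout (a list of headings sampled along the locus, in degrees) and names are assumptions, and angle wraparound is ignored for brevity.

```python
def find_turning_point(headings, theta1):
    """Go back along the travel locus from the current (last) pose and
    return the index of the first point whose heading differs from the
    current heading by at least theta1 [deg], i.e. turning angle
    theta_t >= theta_1.  Returns None if no such point exists."""
    th_now = headings[-1]
    for i in range(len(headings) - 1, -1, -1):
        if abs(headings[i] - th_now) >= theta1:
            return i
    return None

# headings along the locus: straight, a 70-degree turn, then straight
idx = find_turning_point([0.0, 10.0, 40.0, 70.0, 70.0, 70.0], 60.0)
```

With θ1 = 60 degrees the search stops at the sample whose heading differs from the current heading by 60 degrees, mirroring how P2 rather than P1 is chosen in FIG. 9.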
- the set angle ⁇ 1 is made variable according to the length of the straight line extracted by the target position accumulating unit 33.
- the set angle ⁇ 1 is reduced as the traveling lane is a straight line from the current position and the straight line distance L is longer.
- the straight line distance L is obtained, for example, by referring to a set of straight lines extracted by the target position accumulating unit 33 and determining how far the current straight line can be regarded as the same straight line.
- the straight line distance L is long, there is a target that is the reference in the vehicle width direction in the traveling direction, but the target that is the reference in the traveling direction is far from the current position, and the accumulation error of odometry increases. The self-position estimation accuracy will deteriorate.
- FIG. 8 is a map used for setting the set angle θ1 according to the straight-line distance L.
- the horizontal axis is the straight-line distance L
- the vertical axis is the set angle θ1.
- L1 larger than 0 and L2 larger than L1 are determined in advance.
- for the set angle θ1, θMIN larger than 0 and θMAX larger than θMIN are predetermined.
- θMAX is, for example, 60 degrees
- θMIN is, for example, 30 degrees.
- when the straight-line distance L is in the range of L1 to L2, the set angle θ1 decreases from θMAX toward θMIN as L becomes longer.
- when the straight-line distance L is equal to or larger than L2, the set angle θ1 is held at θMIN.
- FIG. 9 is a diagram illustrating the setting of the turning point Pt1.
- Points P1 and P2 are located in a region retroactive from the current position Pn.
- the turning angle ⁇ t at the point P1 is set to 35 degrees, and the turning angle ⁇ t at the point P2 is set to 65 degrees. Therefore, when the set angle ⁇ 1 is 60 degrees, it is the point P2 that first goes back from the current position Pn and the turning angle ⁇ t is equal to or larger than the set angle ⁇ 1, and this point P2 is detected as the turning point Pt1.
- the set angle ⁇ 1 When the set angle ⁇ 1 is 30 degrees, it is the point P1 that first goes back from the current position Pn and the turning angle ⁇ t becomes equal to or larger than the set angle ⁇ 1, and this point P1 is detected as the turning point Pt1.
- the point P1 may be set as the turning point Pt1
- the point P2 may be set as the turning point Pt2, and both may be treated as turning points. That is, it suffices to hold the target position data within the set distance ranges around the vehicle's current position and before each turning point.
- a point where the turning angle θt is equal to or larger than the set angle θ1 may be due to meandering, for example when avoiding an obstacle, so the turning point may be determined using the average turning angle θt AVE.
- a point where the turning angle θt is equal to or larger than the set angle θ1 is selected as a turning point candidate Pp, and the average turning angle θt AVE over a predetermined set section centered on the turning point candidate Pp is obtained.
- the set section is obtained by going a predetermined distance α before and after the turning point candidate Pp, that is, the section from the point [Pp-α] to the point [Pp+α].
- the predetermined distance α is, for example, 10 m.
- FIG. 10 is a diagram for explaining meandering determination.
- (A) in the figure shows the case where the turning point candidate Pp is selected by turning the vehicle
- (b) in the figure shows the case where the turning point candidate Pp is selected by meandering of the vehicle.
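The meandering determination can be sketched as averaging the turning angle over the set section around the candidate; the index-based sampling and all names here are assumptions.

```python
def is_genuine_turn(turn_angles, cand, half_window, theta2):
    """turn_angles[i] is the turning angle theta_t [deg] at sample i
    along the locus.  Average |theta_t| over the set section
    [cand - half_window, cand + half_window] and compare with the set
    angle theta_2: a sustained turn keeps a large average, while a
    brief avoidance manoeuvre (meandering) does not."""
    lo = max(0, cand - half_window)
    hi = min(len(turn_angles) - 1, cand + half_window)
    section = turn_angles[lo:hi + 1]
    avg = sum(abs(a) for a in section) / len(section)
    return avg >= theta2

# sustained turn: large turning angles throughout the section
turning = is_genuine_turn([60, 62, 65, 63, 61], cand=2, half_window=2, theta2=45)
# meander: a single spike surrounded by near-zero angles
meander = is_genuine_turn([2, 3, 65, 4, 1], cand=2, half_window=2, theta2=45)
```

This reproduces the contrast drawn in FIG. 10: the same peak turning angle passes as a turning point in case (a) and is rejected in case (b).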
- the target position accumulating unit 33 holds the target position data in the range from the current position Pn back to the point [Pn-D1], retroactive by a predetermined set distance D1, and in the range from the turning point Pt1 back to the point [Pt1-D2], retroactive by a predetermined set distance D2.
- the other target position data, that is, the data in the range from the point [Pn-D1] to the turning point Pt1 and the data before the point [Pt1-D2], is deleted or thinned out. Depending on the amount of data that can be accumulated in the target position accumulating unit 33, it is not necessary to delete all of this data.
- the accumulated data amount may instead be adjusted by retaining target position data thinned out at a predetermined distance interval.
- the set distance D1 is 20 m, for example.
- the initial value of the set distance D2 is 20 m, for example.
- FIG. 11 is a diagram showing a section that holds target position data and a section that deletes or thins out target position data.
- the range held before the turning point is the range traced back from the turning point Pt1 by the set distance D2.
- the smaller the target number N, the larger the set distance D2 is made, so that a sufficient number of targets is easily secured.
- FIG. 12 is a map used for setting the set distance D2 according to the target number N.
- the horizontal axis is the target number N
- the vertical axis is the set distance D2.
- N1 larger than 0 and N2 larger than N1 are predetermined.
- for the set distance D2, DMIN larger than 0 and DMAX larger than DMIN are predetermined.
- DMIN is, for example, 20 m
- DMAX is, for example, 40 m.
- the target number N is the number of detection points, but it may instead be converted into a cumulative straight-line length.
- when the target number N is equal to or larger than N2, the set distance D2 is held at DMIN.
- when the target number N is in the range of N1 to N2, the set distance D2 increases from DMIN toward DMAX as N becomes smaller.
- the target position accumulating unit 33 sequentially (automatically) deletes the target position data before the point [Pn-D3], which is back from the current position Pn by a predetermined set distance D3.
- FIG. 13 is a diagram illustrating the set distance D3.
- the travel locus detected in the odometry coordinate system has a larger cumulative error as the travel distance becomes longer, which affects the self-position estimation. Therefore, the set distance D3 is set to a distance at which the cumulative error may become large, for example 100 m. Hence, even if the turning point Pt1 lies before the point [Pn-D3], the target position data before the point [Pn-D3] is deleted or thinned out.
- the self-position estimating unit 35 estimates the self-position of the vehicle 21 in the map coordinate system by collating the target position data accumulated in the target position accumulating unit 33 and the map information stored in the map database 14.
- the map coordinate system is a two-dimensional coordinate in plan view, and the east-west direction is an X MAP axis and the north-south direction is a Y MAP axis.
- the vehicle body posture (azimuth angle) is expressed as a counterclockwise angle with the east direction being 0 degrees.
- three parameters, the coordinate position [X MAP , Y MAP ] of the vehicle and the vehicle body posture [θ MAP ], are estimated.
- for map matching, for example, an ICP (Iterative Closest Point) algorithm is used.
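The patent names ICP for this collation without detailing it. A minimal point-to-point 2D ICP sketch, using brute-force nearest neighbours and an SVD (Kabsch) rigid alignment, is given below under those assumptions; it is not the patented implementation.

```python
import numpy as np

def icp_2d(src, dst, iters=30):
    """Minimal point-to-point ICP: repeatedly match each source point
    to its nearest destination point, then solve the best rigid 2D
    transform between the matched sets by SVD (Kabsch).  Returns a
    3x3 homogeneous matrix mapping src into the dst frame."""
    T = np.eye(3)
    cur = np.asarray(src, dtype=float).copy()
    dst = np.asarray(dst, dtype=float)
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        nn = dst[d.argmin(axis=1)]
        # best-fit rotation and translation for the matched pairs
        mu_c, mu_n = cur.mean(axis=0), nn.mean(axis=0)
        H = (cur - mu_c).T @ (nn - mu_n)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:       # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_n - R @ mu_c
        cur = cur @ R.T + t
        step = np.eye(3)
        step[:2, :2], step[:2, 2] = R, t
        T = step @ T
    return T

# demo: recover a small known pose offset between two scans of an
# L-shaped target (two intersecting straight lines)
ang = np.deg2rad(3.0)
Rt = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
src = np.vstack([np.c_[np.arange(0.0, 4.5, 0.5), np.zeros(9)],
                 np.c_[np.zeros(8), np.arange(0.5, 4.5, 0.5)]])
dst = src @ Rt.T + np.array([0.1, 0.05])
T = icp_2d(src, dst)
```

Note that with two intersecting straight lines in the data the pose is pinned down uniquely, whereas a single straight line lets ICP slide freely along it, which is exactly the degeneracy the embodiment's retention of pre-turn data is designed to avoid.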
- step S101 corresponds to the processing in the target position detection unit 31: the positions of targets such as the curb stone 23 and the traffic lane marking 24 existing around the vehicle are detected as relative positions with respect to the vehicle in the vehicle coordinate system based on the vehicle. That is, the detection point Pc of the curb 23 detected by the radar device 12 and the detection point Pw of the traffic dividing line 24 detected by the camera 13 are detected in the vehicle coordinate system.
- the subsequent step S102 corresponds to the processing in the movement amount detection unit 32, and detects odometry, which is the movement amount of the vehicle 21 per unit time, from the various information detected by the sensor group 15. By integrating this, the travel locus of the vehicle can be calculated in the odometry coordinate system. That is, in the odometry coordinate system, three parameters, the vehicle coordinate position [X ODM , Y ODM ] and the vehicle body posture [θ ODM ], are accumulated for each calculation cycle.
- the subsequent step S103 corresponds to the processing in the target position accumulation unit 33, and associates the travel locus based on the movement amount detected by the movement amount detection unit 32 and the position of the target detected by the target position detection unit 31. Accumulate in the odometry coordinate system.
- the target position data detected at each time is shifted by the amount the vehicle has moved during the time elapsed up to the present, and the detection points of the curb 23 and the traffic dividing line 24 are projected onto the odometry coordinate system according to the vehicle coordinate position and vehicle body posture at each time and accumulated. However, the target position data before the point [Pn-D3], which is back from the current position Pn by the predetermined set distance D3, is sequentially deleted.
- the subsequent step S104 corresponds to the processing in the turning point detection unit 34: it refers to the travel locus and, going back from the current position Pn, detects the first point where the turning angle θt of the vehicle becomes equal to or larger than the set angle θ1 as the turning point Pt1.
- the set angle θ1 is reduced as the straight-line distance L, over which the traveling lane extends straight from the current position Pn, becomes longer. Further, the final turning point Pt1 is determined after making the meandering determination.
- a point where the turning angle θt is equal to or larger than the set angle θ1 is selected as a turning point candidate Pp, and the average turning angle θt AVE in the set section from the point [Pp-α] to the point [Pp+α], centered on the turning point candidate Pp, is obtained.
- when the average turning angle θt AVE is equal to or larger than a set angle θ2, it is determined that the vehicle 21 is turning, and the turning point candidate Pp is detected as the final turning point Pt1.
- the subsequent step S105 corresponds to the processing in the target position accumulating unit 33: the target position data in the range going back from the current position Pn by the set distance D1 and in the range going back from the turning point Pt1 to the point [Pt1-D2] by the set distance D2 is retained, and the other target position data is deleted or thinned out.
- the range held before the turning point is traced back from the turning point Pt1 by the set distance D2; the smaller the number N of targets such as the curb stone 23 and the traffic dividing line 24 that the target position detection unit 31 can detect, the larger the set distance D2 is made.
- the subsequent step S106 corresponds to the processing in the self-position estimating unit 35: by collating the target position data accumulated in the target position accumulating unit 33 with the map information stored in the map database 14, the self-position of the vehicle 21 in the map coordinate system is estimated. That is, three parameters, the vehicle coordinate position [X MAP , Y MAP ] and the vehicle body posture [θ MAP ], are estimated in the map coordinate system. The above is the vehicle position estimation process.
- in the present embodiment, the method of estimating the self-position has been illustrated using map information created only from targets that can be described as two-dimensional data in plan view, such as the curbstone 23 and the traffic dividing line 24, which are relatively easy to detect compared with other targets. Note that, in order to further increase the self-position estimation accuracy, map information having three-dimensional (length, width, height) data of structures may be used; the present embodiment can be applied in that case as well.
- target position data extending in a straight line does not serve as a reference in the traveling direction, so it cannot be accurately collated with the map information in that direction. That is, if only target position data extending in a straight line remains, the data serving as a reference in the traveling direction is lost, and the self-position cannot be obtained uniquely. Therefore, in order to uniquely determine the self-position, at least one combination of two intersecting straight lines is required.
- target position data sufficient to uniquely determine the self-position cannot be obtained only by sensing from the current position. Therefore, using the movement amount information of the vehicle 21, a certain amount of past target position data is accumulated in the odometry coordinate system, and the self-position can be estimated by collating the odometry coordinate system onto which the target position data is projected with the map coordinate system in which the target positions are stored in advance.
- However, in the odometry coordinate system the accumulation error increases as the travel distance increases. Therefore, old target position data is sequentially deleted. When only the target position data within a predetermined range from the current position is retained, for example, only straight target position data remains on a road extending in a straight line.
- FIG. 15 is a diagram illustrating the case of retaining only the target position data within a predetermined range. Here, only the target position data of targets present within a predetermined range from the current position is retained, and the section in which the target position data is retained is indicated by a thick dotted line.
- (a) in the figure shows a point in time just past the curve, where the target position data from before entering the curve is still retained. That is, since a combination of two intersecting straight lines can be detected, the self-position can be obtained uniquely.
- In (b), the target position data from before entering the curve has been deleted as the vehicle moved further forward. That is, since two intersecting straight lines cannot be detected and only one straight line can be detected, the self-position can no longer be obtained uniquely.
- FIG. 16 is a diagram illustrating the concept of the embodiment.
- the section holding the target position data is indicated by a thick dotted line.
- (a) in the figure shows a point in time just past the curve, where the target position data from before entering the curve is also retained. That is, since a combination of two intersecting straight lines can be detected, the self-position can be obtained uniquely.
- (b) in the figure shows that, although the vehicle has moved further forward, the target position data within a predetermined distance traced back from the current position of the vehicle and the target position data from before entering the curve are retained. That is, since a combination of two intersecting straight lines can be detected, the self-position can be obtained uniquely.
- The positions of targets such as the curb 23 and the traffic lane line 24 present around the vehicle are detected as relative positions with respect to the vehicle in the vehicle coordinate system based on the vehicle (step S101). The odometry, which is the movement amount of the vehicle 21 per unit time, is detected from the various information detected by the sensor group 15 and integrated to calculate the travel locus of the vehicle in the odometry coordinate system (step S102). Further, the travel locus based on the detected movement amount and the detected target positions are associated with each other and accumulated in the odometry coordinate system (step S103).
- A point where the turning angle θt of the vehicle becomes equal to or larger than the set angle θ1 is detected from the travel locus as the turning point Pt1 (step S104). Then, the target position data in the range from the current position Pn back to the predetermined distance D1 and in the range from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2 is retained, and the other target position data is deleted or thinned out (step S105). Then, the self-position of the vehicle 21 in the map coordinate system is estimated by collating the target position data accumulated in the target position accumulation unit 33 with the map information stored in the map database 14 (step S106).
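Steps S104-S105 can be sketched as below. This is a simplified illustration, assuming the travel locus is sampled as (distance-along-path, heading) pairs; the function name and data layout are hypothetical and not part of the patent.

```python
def prune_target_data(trajectory, targets, theta1, d1, d2):
    """Keep target position data near the current position and near the most
    recent turning point; drop the rest (steps S104-S105, simplified).

    trajectory: list of (s, heading) samples, s = distance along the path,
                ordered oldest -> newest; heading in radians.
    targets: list of (s, point) pairs giving each target's path position.
    """
    s_now, h_now = trajectory[-1]
    # S104: scan backwards for the first point whose heading differs from
    # the current heading by at least theta1 -- the turning point Pt1.
    pt1 = None
    for s, h in reversed(trajectory[:-1]):
        if abs(h_now - h) >= theta1:
            pt1 = s
            break
    # S105: retain data within D1 of the current position, or within D2
    # behind the turning point; everything else would be deleted/thinned.
    kept = []
    for s, p in targets:
        if s_now - d1 <= s <= s_now:
            kept.append(p)
        elif pt1 is not None and pt1 - d2 <= s <= pt1:
            kept.append(p)
    return kept, pt1
```

With a locus that runs straight, then turns roughly 90 degrees, the scan back from the current heading finds the turn, and only targets near the vehicle or just before the turn survive.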
- Since the target position data from the current position Pn back to the predetermined distance D1 and from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2 is retained, the target position data and the map information can be collated with the turning point Pt1 as a base point, and the self-position can be estimated uniquely.
- Since the other target position data, in the section [D1-Pt1] and before the point [Pt1-D2], is deleted or thinned out, an increase in the data volume of the target position data can be suppressed and the data volume can be adjusted appropriately.
- In the odometry coordinate system, this also prevents the accumulation error of the odometry from growing and affecting the self-position estimation accuracy.
- The set angle θ1 is reduced as the traveling lane traced back from the current position Pn is straight and the straight-line distance L is longer. That is, the longer the straight-line distance L, the smaller the set angle θ1, so it becomes easier to detect the turning point Pt1 at a nearer point when tracing back from the current position, which makes it easier to retain a target serving as a reference in the traveling direction. If the turning point Pt1 can be detected at a nearer point, the data volume for accumulating the target position data can be reduced. In addition, the accumulation error in the odometry coordinate system can be reduced, and a decrease in the self-position estimation accuracy can be suppressed.
- The set distance D2 is lengthened as the target number N of curbs 23, traffic lane lines 24, and the like detected by the target position detection unit 31 within the range traced back from the turning point Pt1 by the set distance D2 decreases.
- By lengthening the set distance D2 according to the target number N in this way, a necessary and sufficient target number N can be secured while an unnecessary increase in the data volume of the target position data is suppressed. Therefore, the self-position estimation accuracy can be ensured.
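The D2 map described here (D2 held at DMIN for large N, growing toward DMAX as N falls) might be sketched as the piecewise-linear function below. The 20 m / 40 m endpoints are the example values given later in the text; the breakpoints `n1`, `n2` and the function name are hypothetical placeholders.

```python
def set_distance_d2(n_targets, n1=5, n2=20, d_min=20.0, d_max=40.0):
    """Piecewise-linear map: the fewer targets N detected behind the
    turning point, the longer the retention distance D2.
    D2 = d_max for N <= n1, d_min for N >= n2, linear in between."""
    if n_targets <= n1:
        return d_max
    if n_targets >= n2:
        return d_min
    frac = (n_targets - n1) / (n2 - n1)  # 0 at n1, 1 at n2
    return d_max - frac * (d_max - d_min)
```

The same shape (a clamped linear ramp between two predetermined breakpoints) also appears in the maps for the set angle θ1 described elsewhere in the document.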
- A final turning point Pt1 is then determined. Specifically, a point where the turning angle θt becomes equal to or larger than the set angle θ1 is selected as a turning point candidate Pp, and the average turning angle θtAVE in the set section centered on the turning point candidate Pp, that is, from the point [Pp-α] to the point [Pp+α], is obtained.
- When the average turning angle θtAVE is equal to or larger than the set angle θ2, it is determined that the vehicle 21 is turning, and the turning point candidate Pp is detected as the final turning point Pt1.
- When the average turning angle θtAVE is less than the set angle θ2, it is determined that the vehicle 21 is meandering, the turning point candidate Pp is excluded from the candidates, and the next turning point candidate Pp is searched for further back. This reduces erroneous determination due to meandering travel, so the turning point Pt1 can be determined easily and accurately, and an adverse effect on the self-position estimation accuracy can be suppressed.
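The meander check above can be sketched as follows. This is an illustrative simplification: the turning-angle samples are indexed along the path and α is expressed as a number of samples, whereas the text defines α as a distance (e.g. 10 m).

```python
def confirm_turning_point(samples, p_idx, alpha, theta2):
    """Meander check: the average |turning angle| over the window
    [p_idx - alpha, p_idx + alpha] must reach theta2; otherwise the
    candidate Pp at p_idx is rejected as meandering."""
    lo = max(0, p_idx - alpha)
    hi = min(len(samples), p_idx + alpha + 1)
    window = samples[lo:hi]
    avg = sum(abs(a) for a in window) / len(window)
    return avg >= theta2
```

A sustained turn keeps the whole window near the threshold, while a meander places one large sample among near-zero neighbours and pulls the average below θ2.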
- The turning angle θt is detected from the change in the traveling direction of the vehicle; specifically, the angular difference of the vehicle body relative to the current vehicle body orientation is detected. This reduces the influence of the accumulation error in the odometry coordinate system and of posture changes such as when avoiding an obstacle, so the turning angle θt can be detected accurately. Further, at the stage where the target positions are associated with the travel locus and accumulated (step S103), the target position data before the point [Pn-D3], traced back from the current position Pn by the predetermined distance D3, is sequentially deleted. This suppresses an increase in the data volume of the target position data.
- the target position data before the point [Pt1-D2] is deleted or thinned out, but the present invention is not limited to this.
- A point where the turning angle θt of the vehicle becomes equal to or larger than the set angle θ1 within the range traced back from the turning point Pt1 by the set distance D2 is detected as the turning point Pt2.
- The target position data in the range from the current position Pn to the set distance D1 and in the range from the turning point Pt1 to the turning point Pt2 is retained.
- The other target position data, in the section [D1-Pt1] and before the turning point Pt2, is deleted or thinned out.
- FIG. 17 is a diagram showing, based on the turning point Pt2, the section in which the target position data is retained and the section in which it is deleted or thinned out.
- 《Application Example 2》 In the first embodiment, only one turning point Pt1 is detected; however, the present invention is not limited to this, and a plurality of turning points may be detected. For example, in the range from the current position Pn back to the point [Pn-D3] traced back by the set distance D3, all points where the turning angle θt of the vehicle becomes equal to or larger than the set angle θ1 are detected and, in order of closeness to the current position Pn, taken as turning points Pt1, Pt2, Pt3, and so on. The target position data from each turning point back to the point traced back by the set distance D2 may then be retained.
- FIG. 18 is a diagram illustrating a state in which a plurality of turning points are detected.
- turning points Pt1, Pt2, Pt3, and Pt4 are detected.
- Here, all the target position data in the range from the current position Pn back to the point [Pn-D1] traced back by the set distance D1, the range from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2, the range from the turning point Pt2 back to the point [Pt2-D2], the range from the turning point Pt3 back to the point [Pt3-D2], and the range from the turning point Pt4 back to the point [Pt4-D2] is retained.
- The other target position data, that is, the range from the point [Pn-D1] to the turning point Pt1, the range from the point [Pt1-D2] to the turning point Pt2, the range from the point [Pt2-D2] to the turning point Pt3, the range from the point [Pt3-D2] to the turning point Pt4, and the data before the point [Pt4-D2], is deleted or thinned out.
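The retention intervals for several turning points might be expressed as below. Treating the retained sections as intervals of path distance is an illustrative simplification; the helper names are hypothetical.

```python
def retention_ranges(pt_list, s_now, d1, d2):
    """Intervals of path distance whose target position data is retained
    when several turning points Pt1, Pt2, ... are used.

    pt_list: turning points ordered nearest-to-farthest from s_now.
    Returns (lo, hi) intervals, newest first.
    """
    ranges = [(s_now - d1, s_now)]          # around the current position
    for pt in pt_list:
        ranges.append((pt - d2, pt))        # behind each turning point
    return ranges

def is_retained(s, ranges):
    """True if path position s falls in any retained interval."""
    return any(lo <= s <= hi for lo, hi in ranges)
```

Everything outside these intervals corresponds to the ranges listed above that are deleted or thinned out.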
- The set angle θ1 may also be made variable according to the amount of target position data detected by the target position detection unit 31 within the range traced back from the current position Pn by the set distance D1. For example, the set angle θ1 is reduced as the target number N of curbs 23, traffic lane lines 24, and the like that the target position detection unit 31 could detect in that range decreases.
- The smaller the target number N, the smaller the set angle θ1, so even a gentler turn is easily detected as a turning point, and turning points are set at more locations.
- This increases the locations where the target position data is retained, so the number of targets necessary for estimating the self-position can be secured. If more turning points can be detected, the target position data necessary for estimating the self-position can be retained while the data volume for accumulating the other target position data is reduced, so the accumulation error in the odometry coordinate system can be reduced and a decrease in the self-position estimation accuracy can be suppressed.
- FIG. 19 is a map used for setting the set angle θ1 according to the target number N.
- In this map, the horizontal axis is the target number N and the vertical axis is the set angle θ1. For the target number N, N3 larger than 0 and N4 larger than N3 are predetermined. For the set angle θ1, θMIN larger than 0 and θMAX larger than θMIN are predetermined. θMAX is, for example, 60 degrees and θMIN is, for example, 30 degrees.
- The target number N is the number of detection points, but it may be converted into a cumulative length of straight lines.
- When the target number N is N4 or more, the set angle θ1 is kept at θMAX.
- When the target number N is in the range from N3 to N4, the smaller the target number N, the smaller the set angle θ1 becomes within the range from θMAX to θMIN.
- the target position detection unit 31 and the process of step S101 correspond to a “target position detection unit”.
- the movement amount detection unit 32 and the process of step S102 correspond to a “movement amount detection unit”.
- the target position accumulation unit 33 and the processes of steps S103 and S105 correspond to the "target position accumulation unit".
- the map database 14 corresponds to a “map information acquisition unit”.
- the turning point detection unit 34 and the process of step S104 correspond to a “turning point detection unit”.
- the self-position estimation unit 35 and the process of step S106 correspond to a “self-position estimation unit”.
- the set distance D1 corresponds to the “first set distance”.
- the set angle θ1 corresponds to the "first set angle".
- the set distance D2 corresponds to the “second set distance”.
- the set angle θ2 corresponds to the "second set angle".
- the set distance D3 corresponds to the “third set distance”.
- The vehicle position estimation device detects the positions of targets present around the vehicle, detects the movement amount of the vehicle, and accumulates the target positions as target position data based on the detected movement amount.
- Map information including the positions of the targets is stored in advance in the map database 14, and the self-position of the vehicle is estimated by collating the target position data with the map information. Further, the turning point Pt1 of the vehicle is detected from the movement amount of the vehicle.
- Target position data in the range from the current position Pn to the set distance D1 and in the range from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2 is retained.
- Since the target position data from the current position Pn to the set distance D1 and from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2 is retained, the target position data and the map information can be collated with the turning point Pt1 as a base point, and the self-position can be estimated uniquely.
- Since the other target position data, in the section [D1-Pt1] and before the point [Pt1-D2], is deleted or thinned out, an increase in the data volume of the target position data can be suppressed and the data volume can be adjusted appropriately.
- The vehicle position estimation device detects, as the turning point Pt1, a point at which the turning angle θt becomes equal to or larger than the set angle θ1, and reduces the set angle θ1 as fewer targets are detected within the range traced back from the current position Pn by the set distance D1. Setting turning points at more locations in this way secures the number of targets necessary for estimating the self-position.
- The vehicle position estimation device detects, as the turning point Pt1, a point at which the turning angle θt becomes equal to or larger than the set angle θ1, and reduces the set angle θ1 as the traveling lane traced back from the current position Pn is straight and the straight-line distance is longer. This makes it easier to detect the turning point Pt1 at a nearer point and to retain a target serving as a reference in the traveling direction.
- The vehicle position estimation device lengthens the set distance D2 as the target number N detected by the target position detection unit 31 in the range traced back from the turning point Pt1 by the set distance D2 decreases. In this way, the smaller the target number N, the longer the set distance D2, so the number of targets necessary for estimating the self-position can be secured while an unnecessary increase in the data volume of the target position data is suppressed.
- The vehicle position estimation device detects, as the turning point Pt1, a point in the set section [(Pp-α) to (Pp+α)] whose average turning angle θtAVE is equal to or larger than the set angle θ2. By detecting the turning point candidate Pp as the final turning point Pt1 only when the average turning angle θtAVE is equal to or larger than the set angle θ2, erroneous determination due to meandering can be reduced, and the turning point can be determined easily and accurately.
- The vehicle position estimation device detects the turning point Pt1 according to the change in the traveling direction of the vehicle. By calculating the turning angle θt from the change in the traveling direction and then detecting the turning point Pt1, the influence of the accumulation error in the odometry coordinate system and of posture changes such as when avoiding an obstacle can be reduced, and the turning point Pt1 can be detected accurately.
- The vehicle position estimation device deletes or thins out the target position data before the point [Pn-D3] traced back from the current position Pn by the set distance D3. Deleting or thinning out the target position data before the point [Pn-D3] in this way suppresses an increase in the data volume of the target position data.
- The vehicle position estimation method detects the positions of targets present around the vehicle, detects the movement amount of the vehicle, and accumulates the target positions as target position data based on the detected movement amount. Further, the turning point Pt1 of the vehicle is detected from the movement amount of the vehicle. Then, the target position data in the range from the current position Pn to the set distance D1 and in the range from the turning point Pt1 to the point [Pt1-D2] traced back by the set distance D2 is retained. Further, map information including the positions of the targets is acquired from the map database 14, and the self-position of the vehicle is estimated by collating the accumulated target position data with the target positions in the map information.
- Since the target position data from the current position Pn to the set distance D1 and from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2 is retained, the target position data and the map information can be collated with the turning point Pt1 as a base point, and the self-position can be estimated uniquely.
- Since the other target position data, in the section [D1-Pt1] and before the point [Pt1-D2], is deleted or thinned out, an increase in the data volume of the target position data can be suppressed and the data volume can be adjusted appropriately.
Abstract
Description
An object of the present invention is to make it possible to appropriately adjust the data volume of the target position data while maintaining estimation of the self-position.
《Configuration》
FIG. 1 is a configuration diagram of the vehicle position estimation device.
The vehicle position estimation device 11 estimates the self-position of a vehicle and includes a radar device 12, a camera 13, a map database 14, a sensor group 15, and a controller 16.
FIG. 2 is a diagram showing the arrangement of the radar device and the camera.
FIG. 3 is a diagram showing the scanning range of the radar device and the imaging range of the camera.
The sensor group 15 includes, for example, a GPS receiver, an accelerator sensor, a steering angle sensor, a brake sensor, a vehicle speed sensor, an acceleration sensor, wheel speed sensors, and a yaw rate sensor, and outputs each detected data item to the controller 16. The GPS receiver acquires current position information of the vehicle 21. The accelerator sensor detects the operation amount of the accelerator pedal. The steering angle sensor detects the operation amount of the steering wheel. The brake sensor detects the operation amount of the brake pedal and the pressure inside the brake booster. The vehicle speed sensor detects the vehicle speed. The acceleration sensor detects the longitudinal acceleration/deceleration and the lateral acceleration of the vehicle. The wheel speed sensors detect the wheel speed of each wheel. The yaw rate sensor detects the yaw rate of the vehicle.
The controller 16 includes, as functional blocks, a target position detection unit 31, a movement amount detection unit 32, a target position accumulation unit 33, a turning point detection unit 34, and a self-position estimation unit 35.
FIG. 4 is a diagram showing the vehicle coordinate system.
The vehicle coordinate system is a two-dimensional coordinate system in plan view, with the origin O at, for example, the center of the rear wheel axle of the vehicle 21, the longitudinal direction as the XVHC axis, and the lateral direction as the YVHC axis. Expressions for converting the coordinate system of the radar device 12 and the coordinate system of the camera 13 into the vehicle coordinate system are obtained in advance. The parameters of the road surface 22 in the vehicle coordinate system are also assumed to be known.
The camera 13 images the road surface 22, and the traffic lane line 24 is detected by extracting, from the grayscale image, a pattern in which the luminance changes from dark to light and from light to dark along the lateral direction of the vehicle body. For example, the center point of the traffic lane line 24 in its width direction is detected. That is, the image data captured by the camera 13 is subjected to bird's-eye-view conversion, the traffic lane line 24 is detected from the bird's-eye-view image, and it is projected onto the vehicle coordinate system. Here, the detection point of the traffic lane line 24 is denoted Pw and shown as a black dot.
FIG. 5 is a diagram showing the odometry coordinate system.
In the odometry coordinate system, the vehicle position at the time when, for example, the system is powered on or off is taken as the coordinate origin, and the vehicle body posture (azimuth angle) at that time is set to 0 degrees. In this odometry coordinate system, the travel locus is detected by accumulating three parameters, the coordinate position [XODM, YODM] of the vehicle and the vehicle body posture [θODM], every calculation cycle. FIG. 5 depicts the coordinate position and vehicle body posture of the vehicle at times t1 to t4. Note that the accumulated target position data may instead be coordinate-converted each time with the current vehicle position as the origin; it suffices that the target position data are accumulated in the same coordinate system.
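The dead-reckoning integration described above can be sketched as follows. This is an illustrative sketch, assuming the per-cycle motion is available as a travelled distance and a heading change; the function name is hypothetical.

```python
import math

def integrate_odometry(steps, pose=(0.0, 0.0, 0.0)):
    """Dead-reckon the travel locus in the odometry frame by integrating
    per-cycle motion (distance ds, heading change dtheta).
    The origin is the pose when the system powers on."""
    x, y, th = pose
    locus = [pose]
    for ds, dth in steps:
        th += dth                     # update vehicle body posture
        x += ds * math.cos(th)        # advance along the new heading
        y += ds * math.sin(th)
        locus.append((x, y, th))
    return locus
```

Because each step builds on the previous pose, any per-step error compounds, which is the accumulation error the document addresses by discarding old data.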
FIG. 6 is a diagram showing target positions in the vehicle coordinate system.
Here, the positions in the vehicle coordinate system of the targets detected by the target position detection unit 31 at times t1 to t4 are shown. As targets, the detection points Pc of the curb 23 present to the left of the vehicle 21, the detection points Pw of the traffic lane line 24 present to the left of the vehicle 21, and the detection points Pw of the traffic lane line 24 present to the right of the vehicle 21 are detected. The position of each target in the vehicle coordinate system changes moment by moment as the vehicle 21 is displaced and its posture changes.
That is, the positions of the targets at each time are projected onto the odometry coordinate system in accordance with the coordinate position and vehicle body posture of the vehicle at times t1 to t4. At each time, the detection point Pc of the curb 23 to the left of the vehicle 21, the detection point Pw of the traffic lane line 24 to the left of the vehicle 21, and the detection point Pw of the traffic lane line 24 to the right of the vehicle 21 are projected.
Assuming the unit time Δt is 0.2 seconds, the radar device 12 operates at 25 Hz, and the camera 13 operates at 30 Hz, five points of data can be acquired for the curb 23 and six points of data for the traffic lane line 24 during the unit time Δt. Whether each target is on the left or right side of the vehicle 21 is determined by whether its YVHC coordinate in the vehicle coordinate system is positive or negative. After dividing the point groups into the right and left sides of the vehicle 21 in this way, the parameters a, b, and c are obtained.
Here, at times t1 to t4, a straight line L23 is extracted from the detection points Pc of the curb 23 present to the left of the vehicle 21. Likewise, a straight line L24 is extracted from the detection points Pw of the traffic lane line 24 to the left of the vehicle 21, and another straight line L24 from the detection points Pw of the traffic lane line 24 to the right of the vehicle 21.
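One way the line parameters a, b, c mentioned above might be obtained from a point group is a total-least-squares fit of a*x + b*y + c = 0. This is a stand-in sketch, since the document does not specify the fitting method.

```python
import math

def fit_line(points):
    """Fit a*x + b*y + c = 0 by total least squares: (a, b) is the unit
    normal to the principal axis of the centred point cloud."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    syy = sum((p[1] - my) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    # Angle of the principal axis of the 2x2 scatter matrix.
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    a, b = -math.sin(theta), math.cos(theta)  # normal to that axis
    c = -(a * mx + b * my)                    # line passes through centroid
    return a, b, c
```

Fitting each side's point group separately (after the left/right split by the sign of YVHC) yields the straight lines L23 and L24 used for collation.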
The turning angle θt of the vehicle is the amount of posture change up to the current vehicle body posture in the odometry coordinate system, and is therefore the angular difference of the vehicle body relative to the current vehicle body orientation. The initial value of the set angle θ1 is, for example, 60 degrees. However, the set angle θ1 is made variable according to the length of the straight line extracted by the target position accumulation unit 33.
In this map, the horizontal axis is the straight-line distance L and the vertical axis is the set angle θ1. For the straight-line distance L, L1 larger than 0 and L2 larger than L1 are predetermined. For the set angle θ1, θMIN larger than 0 and θMAX larger than θMIN are predetermined. θMAX is, for example, 60 degrees and θMIN is, for example, 30 degrees. When the straight-line distance L is in the range from L1 to L2, the longer the straight-line distance L, the smaller the set angle θ1 becomes within the range from θMAX to θMIN. When the straight-line distance L is L2 or more, the set angle θ1 is kept at θMIN.
Suppose there are points P1 and P2 in the region traced back from the current position Pn, with a turning angle θt of 35 degrees at point P1 and 65 degrees at point P2. When the set angle θ1 is 60 degrees, the first point, going back from the current position Pn, at which the turning angle θt becomes equal to or larger than the set angle θ1 is point P2, and this point P2 is detected as the turning point Pt1. When the set angle θ1 is 30 degrees, the first such point is P1, and this point P1 is detected as the turning point Pt1. Note that point P1 may be set as the turning point Pt1 and point P2 as the turning point Pt2, with both set as turning points. In short, it suffices to retain the target position data around the current self-position of the vehicle and within the set distance range before a turning point.
First, a point where the turning angle θt becomes equal to or larger than the set angle θ1 is selected as a turning point candidate Pp, and the average turning angle θtAVE in a predetermined set section centered on this turning point candidate Pp is obtained. The set section extends a predetermined α before and after the turning point candidate Pp, that is, from the point [Pp-α] to the point [Pp+α]. The predetermined α is, for example, 10 m.
FIG. 10 is a diagram explaining meander determination.
(a) in the figure shows a case where the turning point candidate Pp is selected because the vehicle is turning, and (b) shows a case where the turning point candidate Pp is selected because the vehicle is meandering. By performing the above meander determination, the turning point candidate Pp is detected as the final turning point Pt1 in case (a) and is excluded from the candidates in case (b).
FIG. 11 is a diagram showing the section in which the target position data is retained and the section in which the target position data is deleted or thinned out.
The set distance D2 is lengthened as the target number N of curbs 23, traffic lane lines 24, and the like that the target position detection unit 31 could detect within the range traced back by the set distance D2 from the turning point Pt1 is smaller. Lengthening the set distance D2 as the target number N decreases in this way makes it easier to secure the target number N.
In this map, the horizontal axis is the target number N and the vertical axis is the set distance D2. For the target number N, N1 larger than 0 and N2 larger than N1 are predetermined. For the set distance D2, DMIN larger than 0 and DMAX larger than DMIN are predetermined. DMIN is, for example, 20 m and DMAX is, for example, 40 m. The target number N is the number of detection points, but it may be converted into a cumulative length of straight lines. When the target number N is N2 or more, the set distance D2 is kept at DMIN. When the target number N is in the range from N1 to N2, the smaller the target number N, the larger the set distance D2 becomes within the range from DMIN to DMAX.
FIG. 13 is a diagram showing the set distance D3.
In the travel locus detected in the odometry coordinate system, the cumulative error grows as the travel distance increases and affects self-position estimation; the set distance D3 is therefore set as a distance beyond which the cumulative error may become large, for example 100 m. Accordingly, even when the turning point Pt1 lies before the point [Pn-D3], the target position data before the point [Pn-D3] is deleted or thinned out.
The map coordinate system is a two-dimensional coordinate system in plan view, with the east-west direction as the XMAP axis and the north-south direction as the YMAP axis. The vehicle body posture (azimuth angle) is expressed as a counterclockwise angle with east as 0 degrees. In this map coordinate system, the three parameters of the coordinate position [XMAP, YMAP] of the vehicle and the vehicle body posture [θMAP] are estimated. For this collation (map matching), for example, an ICP (Iterative Closest Point) algorithm is used. When collating straight lines with each other, the end points on both sides are used as evaluation points, and interpolation can be performed when the interval between the end points is wide.
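A minimal 2D ICP loop of the kind referenced here might look as follows. This is an illustrative sketch (nearest-neighbour pairing plus a closed-form rigid fit), not the patent's implementation, and it omits the end-point and interpolation handling described above.

```python
import math

def icp_2d(source, target, iterations=20):
    """Align 2D source points to target points: pair each source point
    with its nearest target point, solve the best rigid motion in closed
    form, apply it, and repeat. Returns the aligned source points."""
    pts = list(source)
    for _ in range(iterations):
        pairs = [(p, min(target, key=lambda q: (q[0] - p[0]) ** 2
                                             + (q[1] - p[1]) ** 2))
                 for p in pts]
        mpx = sum(p[0] for p, _ in pairs) / len(pairs)
        mpy = sum(p[1] for p, _ in pairs) / len(pairs)
        mqx = sum(q[0] for _, q in pairs) / len(pairs)
        mqy = sum(q[1] for _, q in pairs) / len(pairs)
        # Cross-covariance terms of the centred point sets.
        sxx = sum((p[0] - mpx) * (q[0] - mqx) for p, q in pairs)
        syy = sum((p[1] - mpy) * (q[1] - mqy) for p, q in pairs)
        sxy = sum((p[0] - mpx) * (q[1] - mqy) for p, q in pairs)
        syx = sum((p[1] - mpy) * (q[0] - mqx) for p, q in pairs)
        dth = math.atan2(sxy - syx, sxx + syy)  # optimal rotation
        c, s = math.cos(dth), math.sin(dth)
        # Rotate about the source centroid, translate to the target centroid.
        pts = [(c * (p[0] - mpx) - s * (p[1] - mpy) + mqx,
                s * (p[0] - mpx) + c * (p[1] - mpy) + mqy) for p in pts]
    return pts
```

The accumulated rotation and translation recovered this way correspond to the three estimated parameters [XMAP, YMAP] and [θMAP].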
FIG. 14 is a flowchart showing the vehicle position estimation process.
First, step S101 corresponds to the processing in the target position detection unit 31: the positions of targets such as the curb 23 and the traffic lane line 24 present around the vehicle are detected as relative positions with respect to the vehicle in the vehicle coordinate system based on the vehicle. That is, the detection points Pc of the curb 23 detected by the radar device 12 and the detection points Pw of the traffic lane line 24 detected by the camera 13 are detected in the vehicle coordinate system.
The subsequent step S103 corresponds to the processing in the target position accumulation unit 33: the travel locus based on the movement amount detected by the movement amount detection unit 32 and the target positions detected by the target position detection unit 31 are associated with each other and accumulated in the odometry coordinate system. That is, the target position data detected at each time is shifted by the movement amount of the vehicle over the time elapsed up to the present, and the target position data of the curb 23, the traffic lane line 24, and the like are projected onto and accumulated in the odometry coordinate system in accordance with the coordinate position and vehicle body posture of the vehicle at each time. However, the target position data before the point [Pn-D3], traced back from the current position Pn by the predetermined set distance D3, is uniformly and sequentially deleted.
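The uniform D3-based pruning at the end of step S103 can be sketched as a one-line filter over (path-distance, point) records; this is illustrative only.

```python
def prune_old(data, s_now, d3):
    """Uniformly drop target position data recorded more than D3 of
    travel before the current position (end of step S103).
    data: list of (s, point) pairs, s = distance along the path."""
    return [(s, p) for s, p in data if s >= s_now - d3]
```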
In addition, the final turning point Pt1 is determined after performing meander determination. That is, a point where the turning angle θt becomes equal to or larger than the set angle θ1 is selected as a turning point candidate Pp, and the average turning angle θtAVE in the set section from the point [Pp+α] to the point [Pp-α] centered on this turning point candidate Pp is obtained. When the average turning angle θtAVE is equal to or larger than the set angle θ2, it is determined that the vehicle 21 is turning, and the turning point candidate Pp is detected as the final turning point Pt1. When the average turning angle θtAVE is less than the set angle θ2, it is determined that the vehicle 21 is meandering, the turning point candidate Pp is excluded from the candidates, and the next turning point candidate Pp is searched for further back.
The subsequent step S106 corresponds to the processing in the self-position estimation unit 35: the self-position of the vehicle 21 in the map coordinate system is estimated by collating the target position data accumulated in the target position accumulation unit 33 with the map information stored in the map database 14. That is, in the map coordinate system, the three parameters of the coordinate position [XMAP, YMAP] of the vehicle and the vehicle body posture [θMAP] are estimated.
The above is the vehicle position estimation process.
First, the technical idea of the first embodiment will be described.
The self-position of the vehicle 21 is estimated by collating the positions of targets such as the curb 23 detected by the radar device 12 and the traffic lane line 24 detected by the camera 13 with the positions of the targets stored in advance as map information. This embodiment illustrates a method in which map information is created only from targets, such as the curb 23 and the traffic lane line 24, that are comparatively easy to detect relative to other targets and can be described as two-dimensional data in plan view, and the self-position is estimated using it. Note that, to further increase the self-position estimation accuracy, map information having three-dimensional (length, width, height) data of structures may be used; this embodiment can be applied in that case as well.
Here, the case where only the target position data of targets present within a predetermined range from the current position is retained is shown, and the section in which the target position data is retained is indicated by a thick dotted line. (a) in the figure shows a point in time just past the curve, where the target position data from before entering the curve is still retained. That is, since a combination of two intersecting straight lines can be detected, the self-position can be obtained uniquely. In (b) in the figure, the target position data from before entering the curve has been deleted because the vehicle has moved further forward. That is, since two intersecting straight lines cannot be detected and only one straight line can be detected, the self-position can no longer be obtained uniquely.
FIG. 16 is a diagram showing the concept of the embodiment.
Here, the section in which the target position data is retained is indicated by a thick dotted line. (a) in the figure shows a point in time just past the curve, where the target position data from before entering the curve is also retained. That is, since a combination of two intersecting straight lines can be detected, the self-position can be obtained uniquely. In (b) in the figure, although the vehicle has moved further forward, the target position data within a predetermined distance traced back from the current position of the vehicle and the target position data from before entering the curve are retained. That is, since a combination of two intersecting straight lines can be detected, the self-position can be obtained uniquely.
First, the positions of targets such as the curb 23 and the traffic lane line 24 present around the vehicle are detected as relative positions with respect to the vehicle in the vehicle coordinate system based on the vehicle (step S101). The odometry, which is the movement amount of the vehicle 21 per unit time, is detected from the various information detected by the sensor group 15 and integrated to calculate the travel locus of the vehicle in the odometry coordinate system (step S102). Further, the travel locus based on the detected movement amount and the detected target positions are associated with each other and accumulated in the odometry coordinate system (step S103).
Further, at the stage where the target positions are associated with the travel locus and accumulated (step S103), the target position data before the point [Pn-D3], traced back from the current position Pn by the predetermined set distance D3, is sequentially deleted. This suppresses an increase in the data volume of the target position data.
In the first embodiment, the target position data before the point [Pt1-D2] is deleted or thinned out, but the invention is not limited to this. For example, a point where the turning angle θt of the vehicle becomes equal to or larger than the set angle θ1 within the range traced back by the set distance D2 from the turning point Pt1 is detected as a turning point Pt2. Then, the target position data in the range from the current position Pn to the set distance D1 and in the range from the turning point Pt1 to the turning point Pt2 is retained. The other target position data, in the section [D1-Pt1] and before the turning point Pt2, is deleted or thinned out. Note that at least one base point is sufficient for collating the target position data with the map information. Therefore, since it suffices that the turning point Pt1 is included in the retained target position data, the turning point Pt2 may be deleted or thinned out.
FIG. 17 is a diagram showing, based on the turning point Pt2, the section in which the target position data is retained and the section in which the target position data is deleted or thinned out.
In the first embodiment, only one turning point Pt1 is detected, but the invention is not limited to this, and a plurality of turning points may be detected. For example, all points where the turning angle θt of the vehicle becomes equal to or larger than the set angle θ1 are detected in the range from the current position Pn back to the point [Pn-D3] traced back by the set distance D3, and are taken, in order of closeness to the current position Pn, as turning points Pt1, Pt2, Pt3, and so on. The target position data from each turning point back to the point traced back by the set distance D2 may then be retained. By detecting a plurality of turning points and retaining the target position data in the range traced back by the set distance D2 from each turning point in this way, the target position data and the map information can be collated with each turning point as a base point, so the self-position estimation accuracy can be improved.
Here, the turning points Pt1, Pt2, Pt3, and Pt4 are detected. In this case, all the target position data in the range from the current position Pn back to the point [Pn-D1] traced back by the set distance D1, the range from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2, the range from the turning point Pt2 back to the point [Pt2-D2], the range from the turning point Pt3 back to the point [Pt3-D2], and the range from the turning point Pt4 back to the point [Pt4-D2] is retained. The other target position data, that is, the range from the point [Pn-D1] to the turning point Pt1, the range from the point [Pt1-D2] to the turning point Pt2, the range from the point [Pt2-D2] to the turning point Pt3, the range from the point [Pt3-D2] to the turning point Pt4, and the data before the point [Pt4-D2], is deleted or thinned out.
In this map, the horizontal axis is the target number N and the vertical axis is the set angle θ1. For the target number N, N3 larger than 0 and N4 larger than N3 are predetermined. For the set angle θ1, θMIN larger than 0 and θMAX larger than θMIN are predetermined. θMAX is, for example, 60 degrees and θMIN is, for example, 30 degrees. The target number N is the number of detection points, but it may be converted into a cumulative length of straight lines. When the target number N is N4 or more, the set angle θ1 is kept at θMAX. When the target number N is in the range from N3 to N4, the smaller the target number N, the smaller the set angle θ1 becomes within the range from θMAX to θMIN.
In the first embodiment, the target position detection unit 31 and the process of step S101 correspond to the "target position detection unit". The movement amount detection unit 32 and the process of step S102 correspond to the "movement amount detection unit". The target position accumulation unit 33 and the processes of steps S103 and S105 correspond to the "target position accumulation unit". The map database 14 corresponds to the "map information acquisition unit". The turning point detection unit 34 and the process of step S104 correspond to the "turning point detection unit". The self-position estimation unit 35 and the process of step S106 correspond to the "self-position estimation unit". The set distance D1 corresponds to the "first set distance". The set angle θ1 corresponds to the "first set angle". The set distance D2 corresponds to the "second set distance". The set angle θ2 corresponds to the "second set angle". The set distance D3 corresponds to the "third set distance".
Next, the effects of the main parts of the first embodiment will be described.
(1) The vehicle position estimation device according to the first embodiment detects the positions of targets present around the vehicle, detects the movement amount of the vehicle, and accumulates the target positions as target position data based on the detected movement amount. Map information including the positions of the targets is stored in advance in the map database 14, and the self-position of the vehicle is estimated by collating the target position data with the map information. The turning point Pt1 of the vehicle is detected from the movement amount of the vehicle. Then, the target position data in the range from the current position Pn to the set distance D1 and in the range from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2 is retained.
Since the target position data from the current position Pn to the set distance D1 and from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2 is retained in this way, the target position data and the map information can be collated with the turning point Pt1 as a base point, and the self-position can be estimated uniquely. In addition, since the other target position data, in the section [D1-Pt1] and before the point [Pt1-D2], is deleted or thinned out, an increase in the data volume of the target position data can be suppressed and the data volume can be adjusted appropriately.
By reducing the set angle θ1 as the target number N decreases in this way, the turning point Pt1 is set at more points, the locations where the target position data is retained can be increased, and the number of retained targets can be increased. The target position data that secures the self-position estimation accuracy can be increased while the data volume for accumulating the other target position data is reduced.
By reducing the set angle θ1 as the straight-line distance L becomes longer in this way, it becomes easier to detect the turning point Pt1 at a nearer point when tracing back from the current position Pn, which makes it easier to retain a target serving as a reference in the traveling direction, so the data volume for accumulating the target position data can be reduced. In addition, the accumulation error in the odometry coordinate system can be reduced, and a decrease in the self-position estimation accuracy can be suppressed.
By reducing the set angle θ1 as the target number N decreases in this way, it becomes easier to detect the turning point Pt1 at a nearer point when tracing back from the current position Pn, so the data volume for accumulating the target position data can be reduced.
By detecting the turning point candidate Pp as the final turning point Pt1 when the average turning angle θtAVE is equal to or larger than the set angle θ2 in this way, erroneous determination due to meandering travel can be reduced, and the turning point can be determined easily and accurately.
By calculating the turning angle θt according to the change in the traveling direction of the vehicle and detecting the turning point Pt1 in this way, the influence of the accumulation error in the odometry coordinate system and the influence of posture changes such as when avoiding an obstacle can be reduced, and the turning point Pt1 can be detected accurately.
By deleting or thinning out the target position data before the point [Pn-D3] in this way, an increase in the data volume of the target position data can be suppressed.
Since the target position data from the current position Pn to the set distance D1 and from the turning point Pt1 back to the point [Pt1-D2] traced back by the set distance D2 is retained in this way, the target position data and the map information can be collated with the turning point Pt1 as a base point, and the self-position can be estimated uniquely. In addition, since the other target position data, in the section [D1-Pt1] and before the point [Pt1-D2], is deleted or thinned out, an increase in the data volume of the target position data can be suppressed and the data volume can be adjusted appropriately.
12 Radar device
13 Camera
14 Map database
15 Sensor group
16 Controller
21 Vehicle
22 Road surface
23 Curb
24 Traffic lane line
31 Target position detection unit
32 Movement amount detection unit
33 Target position accumulation unit
34 Turning point detection unit
35 Self-position estimation unit
Claims (8)
- A vehicle position estimation device comprising: a target position detection unit that detects positions of targets present around a vehicle; a movement amount detection unit that detects a movement amount of the vehicle; a target position accumulation unit that accumulates the positions of the targets detected by the target position detection unit as target position data based on the movement amount detected by the movement amount detection unit; a map information acquisition unit that acquires map information including the positions of the targets; a self-position estimation unit that estimates a self-position of the vehicle by collating the target position data accumulated in the target position accumulation unit with the positions of the targets in the map information acquired by the map information acquisition unit; and a turning point detection unit that detects a turning point of the vehicle from the movement amount of the vehicle, wherein the target position accumulation unit retains at least the target position data in a range traced back from a current position by a predetermined first set distance and the target position data in a range traced back from the turning point by a predetermined second set distance.
- The vehicle position estimation device according to claim 1, wherein the turning point detection unit detects, as the turning point, a point at which a turning angle of the vehicle becomes equal to or larger than a predetermined first set angle, and reduces the first set angle as fewer of the targets are detected by the target position detection unit in the range traced back from the current position by the first set distance.
- The vehicle position estimation device according to claim 1 or 2, wherein the turning point detection unit detects, as the turning point, a point at which the turning angle of the vehicle becomes equal to or larger than the predetermined first set angle, and reduces the first set angle as the traveling lane traced back from the current position is straight and the straight-line distance is longer.
- The vehicle position estimation device according to any one of claims 1 to 3, wherein the target position accumulation unit lengthens the second set distance as fewer of the targets are detected by the target position detection unit in the range traced back from the turning point by the second set distance.
- The vehicle position estimation device according to any one of claims 1 to 4, wherein the turning point detection unit detects, as the turning point, a point within a predetermined set section that includes a point at which the turning angle of the vehicle becomes equal to or larger than the first set angle and in which the average turning angle becomes equal to or larger than a predetermined second set angle.
- The vehicle position estimation device according to any one of claims 1 to 5, wherein the turning point detection unit detects the turning point based on a change in the traveling direction of the vehicle.
- The vehicle position estimation device according to any one of claims 1 to 6, wherein the target position accumulation unit deletes the target position data before a point traced back from the current position by a predetermined third set distance.
- A vehicle position estimation method in which a target position detection unit detects positions of targets present around a vehicle; a movement amount detection unit detects a movement amount of the vehicle; a target position accumulation unit accumulates the positions of the targets detected by the target position detection unit as target position data based on the movement amount detected by the movement amount detection unit; a turning point detection unit detects a turning point of the vehicle from the movement amount of the vehicle; the target position accumulation unit retains at least the target position data in a range traced back from a current position by a predetermined first set distance and the target position data in a range traced back from the turning point by a predetermined second set distance; a map information acquisition unit acquires map information including the positions of the targets; and a self-position estimation unit estimates a self-position of the vehicle by collating the target position data accumulated in the target position accumulation unit with the positions of the targets in the map information acquired by the map information acquisition unit.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/004382 WO2017037752A1 (ja) | 2015-08-28 | 2015-08-28 | 車両位置推定装置、車両位置推定方法 |
JP2017537032A JP6418332B2 (ja) | 2015-08-28 | 2015-08-28 | 車両位置推定装置、車両位置推定方法 |
RU2018110826A RU2687103C1 (ru) | 2015-08-28 | 2015-08-28 | Устройство оценки положения транспортного средства и способ оценки положения транспортного средства |
CN201580082771.2A CN107949768B (zh) | 2015-08-28 | 2015-08-28 | 车辆位置推定装置、车辆位置推定方法 |
BR112018003728-1A BR112018003728B1 (pt) | 2015-08-28 | 2015-08-28 | Dispositivo de estimativa de posição de veículo e método de estimativa de posição de veículo |
KR1020187008177A KR101926322B1 (ko) | 2015-08-28 | 2015-08-28 | 차량 위치 추정 장치, 차량 위치 추정 방법 |
CA2997171A CA2997171C (en) | 2015-08-28 | 2015-08-28 | Vehicle position estimation device, vehicle position estimation method |
US15/755,794 US10267640B2 (en) | 2015-08-28 | 2015-08-28 | Vehicle position estimation device, vehicle position estimation method |
MX2018002266A MX364578B (es) | 2015-08-28 | 2015-08-28 | Dispositivo de estimacion de posicion de vehiculo, metodo de estimacion de posicion de vehiculo. |
EP15902878.6A EP3343173B1 (en) | 2015-08-28 | 2015-08-28 | Vehicle position estimation device, vehicle position estimation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/004382 WO2017037752A1 (ja) | 2015-08-28 | 2015-08-28 | 車両位置推定装置、車両位置推定方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017037752A1 true WO2017037752A1 (ja) | 2017-03-09 |
Family
ID=58187176
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/004382 WO2017037752A1 (ja) | 2015-08-28 | 2015-08-28 | 車両位置推定装置、車両位置推定方法 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10267640B2 (ja) |
EP (1) | EP3343173B1 (ja) |
JP (1) | JP6418332B2 (ja) |
KR (1) | KR101926322B1 (ja) |
CN (1) | CN107949768B (ja) |
BR (1) | BR112018003728B1 (ja) |
CA (1) | CA2997171C (ja) |
MX (1) | MX364578B (ja) |
RU (1) | RU2687103C1 (ja) |
WO (1) | WO2017037752A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019075132A (ja) * | 2018-11-29 | 2019-05-16 | 株式会社ゼンリン | 走行支援装置、プログラム |
US11214250B2 (en) | 2017-04-27 | 2022-01-04 | Zenrin Co., Ltd. | Travel support device and non-transitory computer-readable medium |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3343174B1 (en) * | 2015-08-28 | 2020-04-15 | Nissan Motor Co., Ltd. | Vehicle position estimation device, vehicle position estimation method |
US9766344B2 (en) * | 2015-12-22 | 2017-09-19 | Honda Motor Co., Ltd. | Multipath error correction |
MX2019000759A (es) * | 2016-07-26 | 2019-06-20 | Nissan Motor | Self-position estimation method and self-position estimation device |
JP6740470B2 (ja) * | 2017-05-19 | 2020-08-12 | Pioneer Corporation | Measurement device, measurement method, and program |
CN111337010B (zh) * | 2018-12-18 | 2022-05-03 | Beijing Horizon Robotics Technology R&D Co., Ltd. | Positioning method and positioning device for movable device, and electronic device |
DE102018133441A1 (de) * | 2018-12-21 | 2020-06-25 | Volkswagen Aktiengesellschaft | Method and system for determining landmarks in the surroundings of a vehicle |
DE102019201222A1 (de) * | 2019-01-31 | 2020-08-06 | Robert Bosch GmbH | Method for determining a position of a vehicle in a digital map |
JP7167751B2 (ja) * | 2019-02-08 | 2022-11-09 | Aisin Corporation | Object detection device |
JP2020154384A (ja) * | 2019-03-18 | 2020-09-24 | Isuzu Motors Limited | Collision probability calculation device, collision probability calculation system, and collision probability calculation method |
US11378652B2 (en) * | 2019-09-03 | 2022-07-05 | GM Global Technology Operations LLC | Enhancement of vehicle radar system robustness based on elevation information |
US11263347B2 (en) * | 2019-12-03 | 2022-03-01 | Truata Limited | System and method for improving security of personally identifiable information |
JP7421923B2 (ja) * | 2019-12-23 | 2024-01-25 | Faurecia Clarion Electronics Co., Ltd. | Position estimation device and position estimation method |
CN114396957B (zh) * | 2022-02-28 | 2023-10-13 | Chongqing Changan Automobile Co., Ltd. | Positioning pose calibration method based on matching of vision and map lane lines, and automobile |
CN114563006B (zh) * | 2022-03-17 | 2024-03-19 | Changsha Huilian Intelligent Technology Co., Ltd. | Vehicle global positioning method and device based on reference line matching |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08334363A (ja) * | 1995-06-09 | 1996-12-17 | Zanavy Informatics:Kk | Current position calculation device |
JP2008241446A (ja) * | 2007-03-27 | 2008-10-09 | Clarion Co Ltd | Navigation device and control method therefor |
JP2011191239A (ja) * | 2010-03-16 | 2011-09-29 | Mazda Motor Corp | Mobile body position detection device |
JP2012194860A (ja) * | 2011-03-17 | 2012-10-11 | Murata Mach Ltd | Traveling vehicle |
JP2013068482A (ja) * | 2011-09-21 | 2013-04-18 | Nec Casio Mobile Communications Ltd | Azimuth correction system, terminal device, server device, azimuth correction method, and program |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2018085C1 (ru) * | 1992-01-03 | 1994-08-15 | Nizhny Novgorod Institute of Architecture and Civil Engineering | Device for determining the coordinates of a moving object |
US5550538A (en) * | 1993-07-14 | 1996-08-27 | Zexel Corporation | Navigation system |
JP3578511B2 (ja) * | 1995-04-21 | 2004-10-20 | Xanavi Informatics Corporation | Current position calculation device |
US5941934A (en) | 1995-06-09 | 1999-08-24 | Xanavi Informatics Corporation | Current position calculating device |
JP3658519B2 (ja) * | 1999-06-28 | 2005-06-08 | Hitachi, Ltd. | Automobile control system and automobile control device |
JP2008250906A (ja) | 2007-03-30 | 2008-10-16 | Sogo Keibi Hosho Co Ltd | Mobile robot, self-position correction method, and self-position correction program |
TW200900655A (en) * | 2007-06-21 | 2009-01-01 | Mitac Int Corp | Navigation device and method calibrated by map position-matching |
JP5746695B2 (ja) * | 2010-06-29 | 2015-07-08 | Honda Motor Co., Ltd. | Vehicle travel path estimation device |
US9424468B2 (en) * | 2010-09-08 | 2016-08-23 | Toyota Jidosha Kabushiki Kaisha | Moving object prediction device, hypothetical movable object prediction device, program, moving object prediction method and hypothetical movable object prediction method |
JP6233706B2 (ja) * | 2013-04-02 | 2017-11-22 | Panasonic IP Management Co., Ltd. | Autonomous mobile device and self-position estimation method for autonomous mobile device |
2015
- 2015-08-28 MX MX2018002266A patent/MX364578B/es active IP Right Grant
- 2015-08-28 RU RU2018110826A patent/RU2687103C1/ru active
- 2015-08-28 JP JP2017537032A patent/JP6418332B2/ja active Active
- 2015-08-28 WO PCT/JP2015/004382 patent/WO2017037752A1/ja active Application Filing
- 2015-08-28 US US15/755,794 patent/US10267640B2/en active Active
- 2015-08-28 CN CN201580082771.2A patent/CN107949768B/zh active Active
- 2015-08-28 CA CA2997171A patent/CA2997171C/en active Active
- 2015-08-28 EP EP15902878.6A patent/EP3343173B1/en active Active
- 2015-08-28 KR KR1020187008177A patent/KR101926322B1/ko active IP Right Grant
- 2015-08-28 BR BR112018003728-1A patent/BR112018003728B1/pt active IP Right Grant
Non-Patent Citations (1)
Title |
---|
See also references of EP3343173A4 * |
Also Published As
Publication number | Publication date |
---|---|
MX364578B (es) | 2019-05-02 |
CN107949768A (zh) | 2018-04-20 |
CA2997171C (en) | 2019-10-22 |
EP3343173A1 (en) | 2018-07-04 |
KR20180044354A (ko) | 2018-05-02 |
BR112018003728B1 (pt) | 2022-08-30 |
JP6418332B2 (ja) | 2018-11-07 |
MX2018002266A (es) | 2018-03-23 |
CA2997171A1 (en) | 2017-03-09 |
KR101926322B1 (ko) | 2018-12-06 |
EP3343173B1 (en) | 2020-07-22 |
BR112018003728A2 (ja) | 2018-09-25 |
US20180328742A1 (en) | 2018-11-15 |
JPWO2017037752A1 (ja) | 2018-02-22 |
EP3343173A4 (en) | 2018-11-07 |
CN107949768B (zh) | 2018-10-12 |
RU2687103C1 (ru) | 2019-05-07 |
US10267640B2 (en) | 2019-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6418332B2 (ja) | Vehicle position estimation device, vehicle position estimation method | |
JP6414337B2 (ja) | Vehicle position estimation device, vehicle position estimation method | |
US10713504B2 (en) | Estimating friction based on image data | |
US10260889B2 (en) | Position estimation device and position estimation method | |
US11526173B2 (en) | Traveling trajectory correction method, traveling control method, and traveling trajectory correction device | |
US11243080B2 (en) | Self-position estimation method and self-position estimation device | |
JP6020729B2 (ja) | Vehicle position and attitude angle estimation device and vehicle position and attitude angle estimation method | |
US10970870B2 (en) | Object detection apparatus | |
US20220291015A1 (en) | Map generation apparatus and vehicle position recognition apparatus | |
US11867526B2 (en) | Map generation apparatus | |
US12123739B2 (en) | Map generation apparatus | |
US20220254056A1 (en) | Distance calculation apparatus and vehicle position estimation apparatus | |
US20220291013A1 (en) | Map generation apparatus and position recognition apparatus | |
CN115107778A (zh) | Map generation device | |
JP2020190415A (ja) | Vehicle state estimation system, vehicle state estimation method, and vehicle state estimation program | |
JP2020190413A (ja) | Vehicle traveling state estimation system, vehicle traveling state estimation method, and vehicle traveling state estimation program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15902878 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017537032 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2018/002266 Country of ref document: MX |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15755794 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2997171 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20187008177 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2018110826 Country of ref document: RU Ref document number: 2015902878 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112018003728 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112018003728 Country of ref document: BR Kind code of ref document: A2 Effective date: 20180226 |