WO2016189732A1 - Self-Position Estimation Device and Self-Position Estimation Method - Google Patents
Self-Position Estimation Device and Self-Position Estimation Method
- Publication number: WO2016189732A1 (international application PCT/JP2015/065415)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- self
- vehicle
- movement amount
- moving body
- target
- Prior art date
Classifications
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
- G01C21/3644—Landmark guidance, e.g. using POIs or conspicuous other objects
- G05D1/02—Control of position or course in two dimensions
Description
- The present invention relates to a self-position estimation apparatus and a self-position estimation method for estimating the self-position of a moving body.
- As a conventional technique of this kind, Patent Document 1 is disclosed. In Patent Document 1, the movement amount calculated from wheel speed pulses, a gyroscope, and the like and the past history of sensing results are accumulated and connected, and the result is adjusted so as to match the map, thereby estimating the position and posture angle of the moving body.
- In this technique, the influence of the moving speed of the moving body is eliminated by using a history of sensing results over a constant distance, so that the self-position can be estimated stably.
- An object of the present invention is to provide a self-position estimation apparatus capable of stably and accurately estimating the self-position of a moving body, and a method thereof.
- A self-position estimation apparatus and method according to the present invention detect the target positions of targets existing around a moving body and detect the movement amount of the moving body. The detected target positions are then moved by the movement amount and accumulated as target position data, and map information including the target positions of targets existing on the map is acquired. Thereafter, the self-position of the moving body is estimated by collating the target position data within a predetermined range, set based on the movement history up to the current position of the moving body, with the target positions included in the map information.
- FIG. 1 is a block diagram illustrating a configuration of a self-position estimation system including a self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 2 is a diagram showing the mounting positions of the laser range finder and the camera on the vehicle.
- FIG. 3 is a flowchart showing a processing procedure of self-position estimation processing by the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 4 is a diagram for explaining a coordinate system employed in the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 5 is a diagram for explaining a detection method by the laser range finder of the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 6 is a diagram for explaining a white line detection method by the camera of the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 7 is a diagram showing a target detection result by the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 8 is a diagram for explaining a method of estimating a movement amount detection error by the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 9 is a diagram for explaining an extraction range setting method by the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 10 is a diagram for explaining the link node information acquired by the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 11 is a diagram for explaining an extraction range setting method by the self-position estimation apparatus according to the fifth modification of the first embodiment of the present invention.
- FIG. 12 is a diagram for explaining a conventional self-position estimation technique.
- FIG. 13 is a diagram for explaining a conventional self-position estimation technique.
- FIG. 14 is a diagram for explaining a conventional self-position estimation technique.
- FIG. 15 is a diagram for explaining an extraction range setting method by the self-position estimation apparatus according to the first embodiment of the present invention.
- FIG. 16 is a flowchart showing a processing procedure of self-position estimation processing by the self-position estimation apparatus according to the second embodiment of the present invention.
- FIG. 1 is a block diagram illustrating a configuration of a self-position estimation system including a self-position estimation apparatus according to the present embodiment.
- the self-position estimation system according to this embodiment includes an ECU 1, a camera 2, a three-dimensional map database 3, a vehicle sensor group 4, and a laser range finder 5.
- the ECU 1 is an electronic control unit configured by a ROM, a RAM, an arithmetic circuit, and the like, and a program for realizing the self-position estimation apparatus 10 according to the present embodiment is stored in the ROM.
- The ECU 1 may also serve as an ECU used for other controls.
- The camera 2 (2a, 2b) uses a solid-state image sensor such as a CCD.
- The cameras 2a and 2b are installed on the left and right door mirrors of the vehicle and are directed so as to image the road surface below the vehicle.
- The captured images are transmitted to the ECU 1.
- the 3D map database 3 is a storage unit that stores map information including target positions of targets existing on the map, and stores, for example, 3D position information of the surrounding environment including road display.
- The targets recorded in the map information include road markings such as lane markings, stop lines, pedestrian crossings, and road surface symbols, as well as structures on or near the road surface such as curbs and buildings.
- Each piece of map information, such as a white line, is defined as an aggregate of edges. A long straight line is divided, for example, every 1 m, so there is no extremely long edge; for a straight line, each edge holds three-dimensional position information indicating both end points of the line.
- For a curve, each edge holds three-dimensional position information indicating both end points and the center point of the curve. Further, since the 3D map database 3 stores the node/link information of a general navigation system, the ECU 1 can perform route guidance to a destination and record past travel routes.
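The edge-aggregate representation described above can be sketched in code. The type names, fields, and splitting helper below are illustrative stand-ins, not the actual schema of the 3D map database 3.

```python
import math
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point3D = Tuple[float, float, float]  # (X, Y, Z) in the absolute map frame

@dataclass
class MapEdge:
    start: Point3D                    # one end point of the edge
    end: Point3D                      # the other end point
    center: Optional[Point3D] = None  # center point, held only for curved edges

def split_line(start: Point3D, end: Point3D, max_len: float = 1.0) -> List[MapEdge]:
    """Divide a long straight line into edges no longer than max_len,
    mirroring the 'divided every 1 m' rule so no edge is extremely long."""
    dx, dy, dz = (e - s for s, e in zip(start, end))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    n = max(1, math.ceil(length / max_len))
    pts = [(start[0] + dx * i / n, start[1] + dy * i / n, start[2] + dz * i / n)
           for i in range(n + 1)]
    return [MapEdge(pts[i], pts[i + 1]) for i in range(n)]
```

A white line from (0, 0, 0) to (5, 0, 0) would thus be stored as five 1 m edges, each carrying the 3D positions of its two end points.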
- The vehicle sensor group 4 includes a GPS receiver 41, an accelerator sensor 42, a steering sensor 43, a brake sensor 44, a vehicle speed sensor 45, an acceleration sensor 46, a wheel speed sensor 47, and a yaw rate sensor 48.
- the vehicle sensor group 4 is connected to the ECU 1 and supplies various detection values detected by the sensors 41 to 48 to the ECU 1.
- the ECU 1 uses the output value of the vehicle sensor group 4 to calculate the approximate position of the vehicle or to calculate odometry indicating the amount of movement of the vehicle per unit time.
- The laser range finder 5 (5a, 5b) is attached so as to be able to scan to the left and right of the vehicle body, as shown in FIG. 2.
- The self-position estimation device 10 functions as a control unit that executes the self-position estimation processing, and estimates the position and attitude angle of the vehicle by matching the target positions of targets existing around the vehicle against the map information stored in the three-dimensional map database 3.
- The self-position estimation device 10 executes a self-position estimation program stored in the ROM, thereby operating as a target position detection unit 12, a movement amount detection unit 14, a target position accumulation unit 16, a map information acquisition unit 18, and a self-position estimation unit 20.
- the target position detection unit 12 detects the target position of the target existing around the vehicle from the scan result of the laser range finder 5. Further, the target position detection unit 12 may detect the target position from the image of the camera 2 or may detect the target position using both the camera 2 and the laser range finder 5.
- the movement amount detection unit 14 detects odometry, which is the movement amount of the vehicle, based on information from the vehicle sensor group 4 mounted on the vehicle.
- The target position accumulation unit 16 moves the target positions detected by the target position detection unit 12 in previously executed control cycles by the movement amount detected by the movement amount detection unit 14, and accumulates them as target position data.
- the map information acquisition unit 18 acquires map information from the three-dimensional map database 3, and the acquired map information includes the target position of the target existing on the map.
- The self-position estimation unit 20 estimates the detection error of the movement amount detected by the movement amount detection unit 14 based on past movement amount changes in the travel history of the vehicle (corresponding to the movement history of the moving body). The self-position estimation unit 20 then sets a predetermined range (an extraction range A, described later) based on the estimated detection error, extracts from the target position accumulation unit 16 the target position data included in the extraction range A, and estimates the self-position of the vehicle by collating the extracted target position data with the target positions included in the map information.
- Although this embodiment describes application to a vehicle, the invention is also applicable to other moving bodies such as aircraft and ships.
- In such cases, the self-position of the moving body can be estimated by matching against terrain or buildings as the surrounding environment, instead of road marking information.
- Two coordinate systems shown in FIG. 4 are used when estimating the position of the vehicle: an absolute coordinate system centered on the origin of the map information, and a relative space coordinate system centered on the center of the rear wheel axle of the vehicle.
- the origin of map information is the origin O
- the east-west direction is the X axis
- the north-south direction is the Y axis
- the vertically upward direction is the Z axis.
- the azimuth angle (yaw angle) θ [rad] of the vehicle heading is represented as a counterclockwise angle with the east direction (X-axis direction) as zero.
- the center of the rear wheel axle of the vehicle is the origin o
- the longitudinal direction of the vehicle is the x axis
- the vehicle width direction is the y axis
- the vertical direction is the z axis.
- The estimated self-position of the vehicle consists of the position in the east-west direction (X coordinate [m]), the position in the north-south direction (Y coordinate [m]), and, as the posture angle, the azimuth angle θ (yaw angle [rad]): a position and posture angle of three degrees of freedom in total are estimated.
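The conversion between the two coordinate systems, given the three-degree-of-freedom pose (X, Y, θ), is a plane rotation plus a translation. A minimal sketch (the function name is illustrative):

```python
import math

def relative_to_absolute(pose, point):
    """Convert a point (x, y) in the vehicle's rear-axle relative space
    coordinate system into the absolute map coordinate system, given the
    vehicle pose (X [m], Y [m], theta [rad] counterclockwise from east)."""
    X, Y, theta = pose
    x, y = point
    return (X + x * math.cos(theta) - y * math.sin(theta),
            Y + x * math.sin(theta) + y * math.cos(theta))
```

For example, a target 1 m ahead of a vehicle at (10, 20) facing due north (θ = π/2) lies at roughly (10, 21) in the absolute frame.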
- the self-position estimation process described below is continuously performed at intervals of about 100 msec, for example.
- the target position detection unit 12 detects the target position of the target existing around the vehicle. Specifically, the target position detection unit 12 acquires the scan result of the laser range finder 5 and extracts the target positions of road markings such as lane markings and stop lines, and structures such as curbs and buildings.
- The target positions extracted here are relative plane position coordinates (xj(t), yj(t)) [m] with respect to the vehicle in the relative space coordinate system. The index j runs over the extracted road markings and curbs, and t represents the current time (cycle).
- The relative plane position coordinates of a road marking are obtained as follows: when the reflection intensity of an irradiated laser point is equal to or greater than a threshold value, it is determined that the laser struck a road marking, and the relative plane position coordinates (xj(t), yj(t)) are calculated from the distance and angle of that laser point with respect to the vehicle.
- The threshold value may be obtained in advance by experiments or the like and stored in the ECU 1; in this embodiment it is set to 120.
- For a curb, as shown in FIG. 5, the laser is irradiated onto the road surface including the curb 51 along a line 53, and a portion 55 where the detection result changes greatly is detected as the curb, from which the relative plane position coordinates (xj(t), yj(t)) are calculated.
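The intensity-based marking extraction and the height-change curb detection might look like the following sketch. The reflection-intensity threshold of 120 is from the text; the scan format, the curb height-jump threshold, and the function names are assumptions.

```python
def extract_road_markings(scan, intensity_threshold=120.0):
    """scan is a list of (x, y, intensity) laser returns in the vehicle
    frame; keep the (x, y) of returns whose reflection intensity reaches
    the threshold (the embodiment uses 120), treated as road markings."""
    return [(x, y) for x, y, inten in scan if inten >= intensity_threshold]

def detect_curb(heights, jump=0.10):
    """Return indices where consecutive road-surface height samples along
    the scan line change by more than `jump` metres (illustrative value),
    i.e. the 'greatly changing portion' treated as a curb edge."""
    return [i for i in range(1, len(heights))
            if abs(heights[i] - heights[i - 1]) > jump]
```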
- In step S10, the target position detection unit 12 may also acquire an image from the camera 2 and extract the relative plane position coordinates (xj(t), yj(t)) of road markings such as lane markings from the camera image.
- a method for extracting the relative plane position coordinates of the road marking from the image of the camera 2 will be described with reference to FIG.
- FIG. 6A shows an image 60a of the camera 2a that captures the lower left direction of the vehicle and an image 60b of the camera 2b that captures the lower right direction of the vehicle. In these images 60a and 60b, the target position detection unit 12 detects white lines in the regions 61a and 61b.
- Binarization processing is performed on the regions 61a and 61b to extract a range 63 with high luminance values, and a high-luminance portion 65 is detected as shown in FIG. 6B. Then, as shown in FIG. 6C, the center-of-gravity position 67 of the high-luminance portion 65 is calculated, and the relative plane position coordinates (xj(t), yj(t)) of the center-of-gravity position 67 are obtained from the internal parameters (camera model) and external parameters (vehicle mounting position) of the camera 2.
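The binarization-and-centroid step can be sketched as follows. The luminance threshold and the plain nested-list image format are illustrative; the final projection from image coordinates to road-plane coordinates via the camera's internal and external parameters is omitted.

```python
def white_line_centroid(region, threshold=200):
    """region is a 2D list of grey levels; pixels at or above `threshold`
    (illustrative value) form the high-luminance white-line portion, and
    the centroid of those pixels is returned in image coordinates
    (row, col), or None if no pixel passes."""
    coords = [(r, c) for r, row in enumerate(region)
              for c, v in enumerate(row) if v >= threshold]
    if not coords:
        return None
    n = len(coords)
    return (sum(r for r, _ in coords) / n, sum(c for _, c in coords) / n)
```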
- Note that the scan results of the laser range finder 5 are not yet accumulated immediately after the vehicle engine is started (or, for an electric vehicle, immediately after the driving power is turned on). Therefore, in this embodiment, when the engine (or the driving power) is turned off, the scan results within the extraction range A [m], described later, from the current position are extracted and recorded in the memory or a storage medium of the ECU 1. The recorded scan results are then read when the engine is next started (or the driving power is turned on).
- The relative plane position coordinates (xj(t), yj(t)) of lane markings, curbs, buildings, and the like extracted in step S10 continue to be recorded by the processing described later until the vehicle engine is turned off (or, for an electric vehicle, until the driving power is turned off).
- the movement amount detection unit 14 detects odometry, which is the movement amount of the vehicle from one cycle before to the current time t, based on the sensor information obtained from the vehicle sensor group 4.
- This odometry is the amount of movement of the vehicle per unit time.
- The yaw angle change amount Δθ(t) [rad] is obtained by multiplying the yaw rate γ [rad/s] acquired from the yaw rate sensor 48 by the 100 msec calculation cycle of the ECU 1.
- The translational movement amount ΔL(t) [m] is obtained by multiplying the vehicle speed V [m/s] acquired from the vehicle speed sensor 45 by the 100 msec calculation cycle of the ECU 1. Furthermore, when calculating the odometry, the tire parameters of each wheel may be measured and a two-wheel model or the like may be used to estimate the side-slip angle or side slip of the vehicle body.
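The per-cycle odometry arithmetic above amounts to two multiplications; a minimal sketch, assuming the 100 msec cycle:

```python
def odometry_step(v, gamma, dt=0.1):
    """One-cycle movement amount as described above: the translational
    amount dL(t) = V * dt and the yaw angle change dtheta(t) = gamma * dt,
    with V [m/s] from the vehicle speed sensor 45, gamma [rad/s] from the
    yaw rate sensor 48, and dt the 100 msec calculation cycle of the ECU 1."""
    return v * dt, gamma * dt
```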
- The target position accumulation unit 16 moves the target positions detected by the target position detection unit 12 by the movement amount detected by the movement amount detection unit 14, and accumulates them as target position data. More specifically, the relative plane position coordinates (xj(t), yj(t)) of targets such as lane markings and curbs acquired in step S10 of each previous cycle are moved by the odometry obtained in step S20. In other words, using the odometry information, the target positions acquired in the past by the laser range finder 5 and/or the camera 2 are converted into the relative space coordinate system whose origin is the current center of the rear wheel axle of the vehicle.
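The re-expression of previously stored points in the current rear-axle frame can be sketched as a rigid-motion update. This is a simplified model that ignores side slip; the function and argument names are illustrative.

```python
import math

def shift_accumulated(points, dL, dtheta):
    """Re-express previously accumulated target positions (x, y) in the
    rear-axle frame after the vehicle moved forward by dL [m] and yawed
    by dtheta [rad] during one cycle."""
    c, s = math.cos(dtheta), math.sin(dtheta)
    shifted = []
    for x, y in points:
        xt, yt = x - dL, y                      # translate by the motion
        shifted.append((c * xt + s * yt,        # rotate into the new heading
                        -s * xt + c * yt))      # (inverse rotation by dtheta)
    return shifted
```

For example, a point 2 m ahead ends up 1 m ahead after the vehicle drives straight forward 1 m, and a point directly ahead ends up to the vehicle's right after a 90 degree left turn in place.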
- FIG. 7B shows the target position data accumulated when the vehicle travels along the route indicated by the arrow 71 on the road, seen from above, shown in FIG. 7A. That is, FIG. 7B is an example in which the target positions detected by the laser range finder 5 in step S10 have been moved by the movement amounts detected in step S20 and accumulated.
- In FIG. 7B, the positions of curbs, where the laser detection result changes greatly, are displayed together with points of high road-surface reflection intensity such as lane markings, stop lines, and pedestrian crossings.
- In step S40, the self-position estimation unit 20 estimates the detection error of the vehicle movement amount based on past movement amount changes in the travel history of the vehicle, and sets, based on the estimated detection error, a predetermined range (extraction range A) for extracting target position data from the target position accumulation unit 16.
- Specifically, the detection error of the movement amount is estimated from the movement amount change between the current position of the vehicle and a position a predetermined time earlier, or a position a predetermined distance back, in the travel history.
- As shown in FIG. 8, the self-position estimation unit 20 compares, from the travel history, the vehicle self-position 80 (X(t), Y(t), θ(t)) calculated in step S50 of the previous cycle with the vehicle self-position 82 (X(t - T), Y(t - T), θ(t - T)) at time T [s] before the present, and obtains the movement amount change between them.
- The self-position here means the position in the east-west direction of the absolute coordinate system (X coordinate [m]), the position in the north-south direction (Y coordinate [m]), and, as posture angle information, the counterclockwise azimuth angle θ (yaw angle [rad]) with the east direction as zero.
- Suppose the vehicle self-position 82 at time T [s] before has shifted, as viewed along the vehicle direction of the current self-position 80, by an amount with absolute value Δy [m] in the vehicle width direction; this shift is the past movement amount change.
- When the absolute value Δy [m] of the shift amount, the past movement amount change in the vehicle width direction, is equal to or greater than the threshold value yth [m], the self-position estimation unit 20 can estimate that the detection error of the movement amount is large.
- Conversely, when Δy is less than yth, the self-position estimation unit 20 can estimate that the detection error of the movement amount is small.
- This is because movement in the vehicle width direction suggests a large turning change such as a lane change, a right or left turn, or travel on a curved road, situations in which odometry errors are likely to accumulate; in such cases the detection error of the movement amount can be estimated to be large.
- When the detection error of the movement amount is estimated to be small, the self-position estimation unit 20 enlarges the extraction range A [m] used when extracting the target position data in step S50 described later, setting it, for example, to 200 m.
- When the detection error is estimated to be large, the self-position estimation unit 20 reduces the extraction range A [m], setting it, for example, to 100 m. Further, as shown in FIG. 9, the extraction range A may be reduced continuously as the shift amount Δy, the past movement amount change, increases.
- In other words, the extraction range A is set smaller as the movement amount detection error estimated from the past movement amount change in the travel history becomes larger. When the detection error is estimated to be large from the past movement amount change, reducing the extraction range A reduces the accumulation of odometry errors.
- the threshold value yth may be set to 50 m, for example.
- The reduction width of the extraction range A is determined by verifying the matching state of step S50 in advance through experiments and simulations and choosing an optimum value; values other than a reduction from 200 m to 100 m may also be used.
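One plausible reading of the threshold rule and its continuous variant (FIG. 9) is the following sketch; the linear taper between 200 m and 100 m is an assumption, as the text only states that A decreases continuously as Δy grows.

```python
def extraction_range(delta_y, y_th=50.0, a_max=200.0, a_min=100.0):
    """Set the extraction range A [m] from the past lateral movement
    change delta_y [m]: full range a_max below the threshold y_th,
    reduced range a_min at or above it, with a linear taper in between
    for the continuous variant."""
    if delta_y <= 0.0:
        return a_max
    if delta_y >= y_th:
        return a_min
    return a_max - (a_max - a_min) * (delta_y / y_th)
```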
- As a modification, the extraction range A is first set to 200 m, and the number of times the vehicle has passed through an intersection within the extraction range A is counted as the travel history from the link node information registered in the 3D map database 3.
- When the vehicle has not passed through an intersection, the extraction range A is kept at 200 m.
- When the vehicle has passed through one or more intersections, the extraction range A may be set to 100 m on the assumption that the detection error of the movement amount becomes large.
- In addition, the map information along the travel route of the vehicle may be referred to in order to estimate the past movement amount change and thereby estimate the movement amount detection error.
- When the travel route includes a point where odometry errors are likely to accumulate, such as an intersection involving a right or left turn, a stop and start, or a steep section, the extraction range A is reduced so that the accumulation of odometry errors can be reduced.
- In step S50, the self-position estimation unit 20 extracts the target position data included in the extraction range A set in step S40 from the target position data stored in the target position accumulation unit 16, and estimates the self-position of the vehicle by collating the extracted target position data with the target positions included in the map information.
- Specifically, the self-position estimation unit 20 extracts, from the target position data such as lane markings and curbs accumulated in step S30, the target position data within the extraction range A set in step S40. The target position data are extracted going back from the current time t while integrating the movement amount ΔL(t) calculated in step S20, until the integrated value exceeds the extraction range A.
- Then, the self-position of the vehicle in the absolute coordinate system is estimated by matching the extracted target position data against the target positions of the map information stored in the 3D map database 3. That is, a position and posture angle of three degrees of freedom in total are estimated: the position in the east-west direction (X coordinate), the position in the north-south direction (Y coordinate), and the azimuth angle (yaw angle θ).
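The backward extraction by integrated movement amount in step S50 can be sketched as follows; the per-cycle data layout and names are illustrative.

```python
def extract_recent(targets_per_cycle, dL_per_cycle, A):
    """Walk back from the current time t, integrating the per-cycle
    movement amounts dL(t), and keep each cycle's target position data
    until the integrated distance exceeds the extraction range A [m]."""
    extracted, travelled = [], 0.0
    for targets, dL in zip(reversed(targets_per_cycle), reversed(dL_per_cycle)):
        extracted.extend(targets)
        travelled += dL
        if travelled > A:
            break
    return extracted
```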
- the self-position estimation process according to the present embodiment ends.
- For the matching, an ICP (Iterative Closest Point) algorithm can be used.
- In the matching, lane markings are matched using the end points at both ends as evaluation points.
- Since target position data closer to the vehicle (camera 2) are less affected by odometry error, the number of evaluation points is increased near the vehicle by linear interpolation and decreased far from the vehicle.
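The distance-dependent density of evaluation points might be implemented along these lines; the step sizes and the 10 m near/far boundary are invented for illustration, as the text gives no numbers.

```python
import math

def evaluation_points(p0, p1, near_step=0.2, far_step=1.0, near_dist=10.0):
    """Sample evaluation points along a lane-marking segment (p0, p1) in
    the vehicle frame: densely (near_step) within near_dist of the vehicle
    at the origin, sparsely (far_step) beyond."""
    length = math.dist(p0, p1)
    if length == 0.0:
        return [p0]
    pts, d = [], 0.0
    while d <= length:
        t = d / length
        p = (p0[0] + (p1[0] - p0[0]) * t, p0[1] + (p1[1] - p0[1]) * t)
        pts.append(p)
        d += near_step if math.dist((0.0, 0.0), p) < near_dist else far_step
    return pts
```

A segment starting at the vehicle thus receives many more evaluation points than an equally long segment far ahead.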
- In the link node information, as shown in FIG. 10, branching links (arrows) and nodes (circles) are recorded, and attribute information allowing straight travel, right turns, left turns, merging, and branching to be distinguished is assigned to each link. Therefore, by detecting which links the vehicle has passed through and referring to this link node information, the right/left turn information of the vehicle can be obtained.
- When the number of right/left turns is zero, the detection error of the movement amount is estimated to be small, and the extraction range A is set large, for example to 200 m.
- When one or more right/left turns are counted, the extraction range A is reduced; it may also be changed continuously according to the number of right/left turns.
- As another modification, the extraction range A is first set to 200 m, and the number of times the vehicle has changed lanes within the extraction range A is counted from the link node information registered in the 3D map database 3. As shown in FIG. 10, the link node information holds link information individually for each lane of a multi-lane road, so the number of lane changes can be counted by referring to it.
- When the number of lane changes in the travel history is zero, the past movement amount change is small, so the movement amount detection error is estimated to be small and the extraction range A is set large, for example to 200 m.
- When the number of lane changes in the travel history is one or more, the past movement amount change is large, so the movement amount detection error is estimated to be large and the extraction range A is set small, for example to 100 m.
- As yet another modification, the extraction range A is set to 200 m, and the number of times the vehicle has branched or merged within the extraction range A is counted from the link node information registered in the 3D map database 3.
- To determine branching or merging, as shown in FIG. 10, it suffices to refer to the attribute information given to each link of the link node information.
- Although the merge and branch links are not shown in the figure, attribute information that distinguishes straight travel, right turns, left turns, merging, and branching is given to each link, so it can be determined whether the vehicle has branched or merged.
- When the number of branchings or mergings is zero, the movement amount detection error is estimated to be small and the extraction range A is set large, for example to 200 m.
- When the number of branchings or mergings in the travel history is one or more, the past movement amount change is large, so the movement amount detection error is estimated to be large and the extraction range A is set small, for example to 180 m.
- As a further modification, the extraction range A is set to 200 m, and the radius of curvature of the curve traveled by the vehicle within the extraction range A is detected from the link node information registered in the 3D map database 3. Since the radius of curvature is recorded in each link of the link node information, it can be detected by identifying the links on which the vehicle has traveled.
- When the radius of curvature of the curve traveled by the vehicle in the travel history is larger than 50 m, the past movement amount change is small, so the movement amount detection error is estimated to be small and the extraction range A is set large, for example to 200 m.
- When the radius of curvature of the curve traveled by the vehicle in the travel history is 50 m or less, the past movement amount change is large, so the movement amount detection error is estimated to be large and the extraction range A is set small, for example to 100 m. The extraction range A may also be changed continuously according to the radius of curvature.
- As a result, the extraction range A is reduced so that the accumulation of odometry errors is suppressed.
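The optional continuous variant mentioned above could, for example, interpolate the range linearly below the 50 m threshold. The linear form and the function name are assumptions; the text only states that A may vary continuously with the radius:

```python
def extraction_range_from_curvature(radius_m, r_th=50.0,
                                    a_max=200.0, a_min=100.0):
    # Radii above the 50 m threshold leave the full 200 m range; below it
    # the range shrinks (here linearly, one possible choice) toward 100 m.
    if radius_m > r_th:
        return a_max
    return a_min + (a_max - a_min) * (radius_m / r_th)
```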
- As a fifth modification, the past movement amount change may be estimated using a plurality of the travel-history forms of the above-described embodiment and the first to fourth modifications, the detection error of the movement amount estimated, and the extraction range A set accordingly.
- In this case, as shown in FIG. 11, a reduction amount ΔA of the extraction range A is set for each error factor assumed to involve a large past movement amount change, such as the deviation amount Δy and the number of right and left turns in the travel history, and when the detection error is estimated to be large for an error factor, its reduction amount ΔA is subtracted from the extraction range A.
- For example, in FIG. 11, the reduction amount ΔA is set to 20 m when the deviation amount Δy in the travel history is 50 m or more, the number of intersection passages is three or more, or the number of right and left turns is two or more.
- The reduction amount ΔA is set to 10 m when the number of branchings or mergings, or the number of lane changes, is one or more.
- The reduction amount ΔA is set to 50 m when the radius of curvature of the curve traveled by the vehicle in the travel history is 50 m or less. When there are a plurality of error factors for which the detection error of the movement amount is estimated to be large, the extraction range A is set by subtracting the sum of their reduction amounts ΔA from the preset value of 200 m.
- For example, when the travel history contains two or more right or left turns and one lane change, 30 m, the sum of the respective reduction amounts of 20 m and 10 m, is subtracted from 200 m and the extraction range A is set to 170 m.
- If the sum of the reduction amounts ΔA becomes too large and the extraction range A becomes too small, the matching executed in step S50 becomes difficult, so the minimum value of the extraction range A is set, for example, to 100 m.
- The reduction amount ΔA is set to an optimal value by verifying the matching state of step S50 in advance through experiments and simulations.
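The FIG. 11 scheme amounts to subtracting one reduction ΔA per triggered error factor from the 200 m preset and clamping at the 100 m minimum. A sketch using the values quoted above; the string labels for the factors are invented for illustration:

```python
# Reduction amounts per error factor, following the values quoted from
# FIG. 11; the dictionary keys are invented labels, not from the patent.
REDUCTIONS = {
    "deviation_dy_50m_or_more": 20.0,
    "intersections_3_or_more": 20.0,
    "turns_2_or_more": 20.0,
    "branch_or_merge_1_or_more": 10.0,
    "lane_changes_1_or_more": 10.0,
    "curve_radius_50m_or_less": 50.0,
}

def extraction_range(triggered_factors, a_preset=200.0, a_min=100.0):
    # Sum the reductions of all triggered factors and clamp so that the
    # range never falls below the minimum needed for matching in step S50.
    total = sum(REDUCTIONS[f] for f in triggered_factors)
    return max(a_preset - total, a_min)
```

With two or more right/left turns and one lane change triggered, this reproduces the 170 m example from the text.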
- For example, FIG. 12(a) shows a bird's-eye conversion of the images of fisheye cameras that capture the downward directions on the left and right of the vehicle.
- Although the current bird's-eye image 121 in FIG. 12(a) is small, FIG. 12(b) shows the result of accumulating the history of past movement amounts and sensing results (bird's-eye images) onto the current bird's-eye image 121.
- When the concatenated part 123 is added, a bird's-eye image of a certain past section can be obtained. Therefore, even if there is an obstacle such as a parked vehicle, the self-position can be estimated using past sensing results.
- Here, the conventional self-position estimation technology always uses the history of sensing results over a fixed distance in order to eliminate the influence of the moving speed of the moving body and to perform self-position estimation stably. This is because, if a history of sensing results over a fixed time were used, the result would be the same as using only the current sensing results when the moving body is extremely slow or stopped. Conversely, as the moving speed increases, the area obtained by concatenating the history of sensing results becomes wider, and the computational load of the adjustment for matching against the map information increases.
- FIG. 14 is a diagram showing the result of this comparison.
- The black circles in the figure indicate the positions of the white-line edges detected moment by moment by the laser range finder mounted on the vehicle, and the white circles show the result of concatenating the past sensing results by the conventional self-position estimation technique.
- In contrast, in the present embodiment, the detection error of the movement amount is estimated from the travel history of the vehicle, and the range of data extracted from the accumulated target position data is reduced as the estimated detection error increases. For example, as shown in FIG. 15, the extraction range 151 is widened when traveling straight, where the detection error of the movement amount is small, and the extraction range 153 is narrowed when right and left turns are repeated, where the detection error is large.
- As described above in detail, in the self-position estimation device according to the present embodiment, a predetermined range (extraction range A) of the target position data is set based on the travel history,
- and the self-position of the vehicle is estimated by collating the target position data extracted from this predetermined range with the target positions included in the map information.
- Consequently, in a situation where the detection error is small, the predetermined range (extraction range A) is widened so that a larger number of target position data can be collated with the target positions of the map information. Therefore, based on the travel history, the self-position can be estimated with high accuracy and stability not only in situations where the detection error of the movement amount is small but also in situations where it becomes large.
- Further, in the self-position estimation device according to the present embodiment, the predetermined range is narrowed as the number of right and left turns of the vehicle in the travel history increases.
- A right or left turn at an intersection involves not only turning of the vehicle but also acceleration and deceleration before and after it, so the past movement amount change is large and the detection error of the movement amount in the longitudinal direction of the vehicle body becomes large. Therefore, when the number of right and left turns, at which detection errors of the movement amount readily accumulate, is large, the accumulation of detection errors can be reduced by narrowing the predetermined range (extraction range A).
- As a result, the self-position can be estimated with high accuracy and stability.
- Further, in the self-position estimation device according to the present embodiment, the predetermined range is narrowed as the number of lane changes of the vehicle in the travel history increases.
- When a vehicle changes lanes, the past movement amount change is large and a nonlinear sideslip motion of the vehicle body occurs, so it is difficult to estimate the movement amount with high accuracy and the detection error of the movement amount increases. Therefore, when the number of lane changes, at which detection errors of the movement amount readily accumulate, is large, the accumulation of detection errors can be reduced by narrowing the predetermined range (extraction range A).
- As a result, the self-position can be estimated with high accuracy and stability.
- Further, in the self-position estimation device according to the present embodiment, the predetermined range is narrowed as the number of branchings or mergings of the vehicle in the travel history increases.
- At a branch or merge, a large past movement amount change such as a lane change or a turn occurs, producing behavior that enlarges the detection error of the movement amount. Therefore, when the number of branchings or mergings, at which detection errors of the movement amount readily accumulate, is large, the accumulation of detection errors can be reduced by narrowing the predetermined range (extraction range A).
- As a result, the self-position can be estimated with high accuracy and stability.
- Further, in the self-position estimation device according to the present embodiment, the predetermined range is narrowed as the radius of curvature of the curve traveled by the vehicle in the travel history is smaller.
- On a curve with a small radius of curvature, the past movement amount change is large and a nonlinear sideslip motion of the vehicle body occurs, making it difficult to estimate the movement amount with high accuracy. Therefore, when the vehicle travels on a curve with a small radius of curvature, where detection errors of the movement amount readily accumulate, the accumulation of detection errors can be reduced by narrowing the predetermined range (extraction range A), and as a result the self-position can be estimated with high accuracy and stability.
- In the first embodiment, the detection error of the movement amount was estimated from the travel history of the vehicle in step S40. The present embodiment differs in that, in step S140, the predetermined range (extraction range A) is set with attention to the vehicle behavior in the travel history.
- Steps S10 to S30 and step S50 are the same as those of the first embodiment shown in FIG. 3, so their detailed description is omitted.
- In step S140, the self-position estimation unit 20 estimates the detection error of the movement amount of the vehicle from the travel history, and, based on the estimated detection error, sets the predetermined range (extraction range A) for extracting target position data from the target position accumulation unit 16.
- In the present embodiment, the detection error of the movement amount is estimated from the vehicle behavior, which is the past movement amount change in the travel history; as a specific example of the vehicle behavior, the detection error of the movement amount is estimated from the turning amount of the vehicle.
- First, from the travel history, the self-position estimation unit 20 compares the attitude angle θ(t) of the vehicle calculated in step S50 one cycle before with the attitude angle θ(t−T) of the vehicle a time T [s] earlier, and obtains the turning amount dθ [rad], which is a change in the movement amount. When the absolute value of the turning amount dθ, the past movement amount change, is equal to or greater than a threshold value dθth, the self-position estimation unit 20 can estimate that the detection error of the movement amount is large.
- Conversely, when it is less than the threshold value dθth, the self-position estimation unit 20 can estimate that the detection error of the movement amount is small.
- Here, the threshold value dθth may be set, for example, to 1.745 [rad] ≈ 100 [deg].
- When the turning amount, which is a change in the movement amount of the vehicle, is large, it can be estimated that the vehicle has made a turn, changed lanes, or traveled on a curved road, where odometry errors readily accumulate, and therefore the detection error of the movement amount can be estimated to be large.
- When the detection error of the movement amount is estimated to be small, the self-position estimation unit 20 enlarges the extraction range A [m] used when extracting the target position data in step S50, for example, to 200 m.
- When the detection error of the movement amount is estimated to be large, the self-position estimation unit 20 reduces the extraction range A [m] and sets it, for example, to 100 m.
- The extraction range A may also be changed so as to decrease continuously as the turning amount, the past movement amount change in the travel history, increases. That is, the extraction range A is set smaller as the detection error of the movement amount estimated from the past movement amount change in the travel history increases.
- As a result, the extraction range A is reduced so that the accumulation of odometry errors is suppressed.
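The turning-amount test just described can be sketched as follows; the function name and interface are assumed, and the thresholds are the example values from the text:

```python
def extraction_range_from_turning(theta_now, theta_past,
                                  d_theta_th=1.745,  # rad, ~100 deg
                                  a_large=200.0, a_small=100.0):
    # dθ = θ(t) − θ(t−T); a large |dθ| suggests turns, lane changes, or
    # curved roads where odometry errors accumulate, so shrink A.
    d_theta = abs(theta_now - theta_past)
    return a_small if d_theta >= d_theta_th else a_large
```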
- The turning amount dθ may also be obtained by integrating the absolute value of the change Δθ(t) of the yaw angle at each moment from the current time t of the travel history back to the time T [s] earlier. In this case, it is possible to detect that the actual turning amount is large even when the attitude angle apparently returns to its original value, as during slalom driving.
- Alternatively, instead of the integral of the turning amount, the maximum value ωabs [rad/s] of the turning speed (yaw rate) within the extraction range A may be detected, and the detection error of the movement amount may be estimated to be large when ωabs is equal to or greater than a threshold value ωth [rad/s].
- Here, the threshold value ωth may be set, for example, to 0.39 [rad/s] ≈ 20 [deg/s].
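The two alternatives just described, integrating |Δθ| so that slalom driving is not missed, and thresholding the peak yaw rate, might look like this. This is a sketch; the sampled-yaw-rate interface is an assumption:

```python
def turning_error_is_large(yaw_rates, dt,
                           d_theta_th=1.745,   # integral threshold, rad
                           omega_th=0.39):     # peak yaw-rate threshold, rad/s
    # Variant (a): integrate |yaw rate| * dt over the window, which stays
    # large even if the attitude angle returns to its starting value.
    integrated = sum(abs(w) * dt for w in yaw_rates)
    # Variant (b): compare the peak |yaw rate| against the threshold.
    peak = max(abs(w) for w in yaw_rates)
    return integrated >= d_theta_th or peak >= omega_th
```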
- In this case, the extraction range A is first set to 200 m, and the maximum absolute value αabs [m/s²] of the measured value α [m/s²] of the acceleration sensor 46, which measures the acceleration in the longitudinal direction of the vehicle, is detected.
- When αabs is less than a threshold value αth, the extraction range A is set large, for example, to 200 m.
- When αabs is equal to or greater than the threshold value αth, the extraction range A is set small, for example, to 100 m.
- Here, the threshold value αth may be set, for example, to 0.2 [m/s²]. At this time, the extraction range A may be changed continuously according to αabs.
- As a result, the extraction range A is reduced so that the accumulation of odometry errors is suppressed.
- A multi-axis sensor may be used as the acceleration sensor 46, and the accelerations in the vehicle width direction and the vertical direction may also be measured so that the determination is made from the combined component.
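The vehicle-speed-change rule above reduces to a threshold on the peak longitudinal acceleration magnitude. A sketch under assumed names, using the example threshold of 0.2 m/s²:

```python
def extraction_range_from_accel(accel_samples,
                                alpha_th=0.2,        # m/s^2, example threshold
                                a_large=200.0, a_small=100.0):
    # alpha_abs is the largest |α| measured by the longitudinal acceleration
    # sensor within the window; a large value implies a large speed change
    # and hence a large movement-amount detection error.
    alpha_abs = max(abs(a) for a in accel_samples)
    # A could also be varied continuously with alpha_abs, as noted above.
    return a_small if alpha_abs >= alpha_th else a_large
```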
- Further, the extraction range A may be set by the method described in the first embodiment by determining from the vehicle behavior, for example, that the vehicle has turned right or left at an intersection or has changed lanes.
- As described above in detail, the self-position estimation device according to the present embodiment estimates that the detection error of the movement amount is larger as the turning amount of the vehicle, the past movement amount change in the travel history, is larger, and reduces the predetermined range (extraction range A).
- When the vehicle turns, the detection error of the movement amount grows not only in the turning direction but also in the vehicle width direction owing to tire slip. Therefore, when the turning amount, at which detection errors of the movement amount readily accumulate, is large, the accumulation of detection errors can be reduced by narrowing the predetermined range (extraction range A), and the self-position can thereby be estimated with high accuracy and stability.
- Further, the self-position estimation device according to the present embodiment estimates that the detection error of the movement amount is larger as the change in the vehicle speed in the travel history is larger, and reduces the predetermined range (extraction range A).
- When the vehicle speed changes greatly, the detection error of the movement amount in the longitudinal direction of the vehicle body becomes large. Therefore, when, based on the travel history, the vehicle speed change, at which detection errors of the movement amount readily accumulate, is large, the accumulation of detection errors can be reduced by narrowing the predetermined range (extraction range A).
- As a result, the self-position can be estimated with high accuracy and stability.
- In the embodiments described above, a vehicle has been described as an example of the moving body, but as long as the moving body is equipped with at least one camera or laser range finder and a sensor for measuring odometry, the present invention can also be applied to aircraft and ships.
- In the embodiments, the position and attitude angle in three degrees of freedom of the vehicle are obtained, but it is also possible to estimate the position and attitude angle in six degrees of freedom.
Abstract
Description
[Configuration of the Self-Position Estimation System]
FIG. 1 is a block diagram showing the configuration of a self-position estimation system including the self-position estimation device according to the present embodiment. As shown in FIG. 1, the self-position estimation system according to the present embodiment includes an ECU 1, a camera 2, a three-dimensional map database 3, a vehicle sensor group 4, and a laser range finder 5.
Next, the procedure of the self-position estimation processing according to the present embodiment will be described with reference to the flowchart of FIG. 3. In the present embodiment, the two coordinate systems shown in FIG. 4 are used when estimating the self-position of the vehicle: an absolute coordinate system centered on the origin of the map information, and a relative spatial coordinate system whose origin is the center of the rear wheel axle of the vehicle. In the absolute coordinate system, the origin of the map information is the origin O, the east-west direction is the X axis, the north-south direction is the Y axis, and the vertically upward direction is the Z axis. In this absolute coordinate system, the azimuth angle (yaw angle) θ [rad] in which the vehicle is heading is expressed as a counterclockwise angle with the east direction (X-axis direction) as zero. The relative spatial coordinate system takes the center of the rear wheel axle of the vehicle as the origin o, the longitudinal direction of the vehicle as the x axis, the vehicle width direction as the y axis, and the vertically upward direction as the z axis.
As a first modification of the present embodiment, when the detection error of the movement amount is estimated from the travel history of the vehicle in step S40, as a specific example of the travel history, the larger the number of right and left turns of the vehicle, the larger the past movement amount change and hence the larger the detection error of the movement amount is estimated to be, and the extraction range A is narrowed. In this case, the extraction range A is first set to 200 m, and the number of times the vehicle turned right or left at an intersection within the extraction range A is counted from the link/node information registered in the three-dimensional map database 3.
As a second modification, as a specific example of the travel history of step S40, the larger the number of lane changes of the vehicle, the larger the past movement amount change and hence the larger the detection error of the movement amount is estimated to be, and the extraction range A is narrowed. In this case, the extraction range A is first set to 200 m, and the number of times the vehicle changed lanes within the extraction range A is counted from the link/node information registered in the three-dimensional map database 3. As shown in FIG. 10, the link/node information sets link information individually for each lane of a road with a plurality of lanes, so the number of lane changes can be counted by referring to the link/node information.
As a third modification, as a specific example of the travel history of step S40, the larger the number of branchings or mergings of the vehicle, the larger the past movement amount change and hence the larger the detection error of the movement amount is estimated to be, and the extraction range A is reduced. In this case, the extraction range A is first set to 200 m, and the number of times the vehicle branched or merged within the extraction range A is counted from the link/node information registered in the three-dimensional map database 3. To determine branching or merging, the attribute information given to each link of the link/node information may be referred to, as shown in FIG. 10. Although merging and branching links are not illustrated in FIG. 10, each link is given attribute information from which going straight, turning right, turning left, merging, and branching can be determined, so by referring to this attribute information it can be determined whether the vehicle branched or merged.
As a fourth modification, as a specific example of the travel history of step S40, the smaller the radius of curvature of the curve traveled by the vehicle, the larger the past movement amount change and hence the larger the detection error of the movement amount is estimated to be, and the extraction range A is narrowed. In this case, the extraction range A is first set to 200 m, and the radius of curvature of the curve traveled by the vehicle within the extraction range A is detected from the link/node information registered in the three-dimensional map database 3. Since the radius of curvature is recorded in each link of the link/node information, it can be detected by identifying the links on which the vehicle traveled.
As a fifth modification, the past movement amount change may be estimated using a plurality of the travel-history forms of the above-described embodiment and the first to fourth modifications, the detection error of the movement amount estimated, and the extraction range A set accordingly. In this case, as shown in FIG. 11, a reduction amount ΔA of the extraction range A is set for each error factor assumed to involve a large past movement amount change, such as the deviation amount Δy and the number of right and left turns in the travel history, and when the detection error is estimated to be large for an error factor, its reduction amount ΔA is subtracted from the extraction range A. For example, in FIG. 11, the reduction amount ΔA is set to 20 m when the deviation amount Δy in the travel history is 50 m or more, the number of intersection passages is three or more, or the number of right and left turns is two or more, and to 10 m when the number of branchings or mergings, or the number of lane changes, is one or more. When the radius of curvature of the curve traveled by the vehicle in the travel history is 50 m or less, the reduction amount ΔA is set to 50 m. When there are a plurality of error factors for which the detection error of the movement amount is estimated to be large, the extraction range A is set by subtracting the sum of their reduction amounts ΔA from the preset 200 m. For example, when the travel history contains two or more right or left turns and one lane change, 30 m, the sum of the respective reduction amounts of 20 m and 10 m, is subtracted from 200 m and the extraction range A is set to 170 m. If the sum of the reduction amounts ΔA becomes too large and the extraction range A becomes too small, the matching executed in step S50 becomes difficult, so the minimum value of the extraction range A is set, for example, to 100 m. The reduction amount ΔA is set to an optimal value by verifying the matching state of step S50 in advance through experiments and simulations.
Next, the effects of the self-position estimation device according to the present embodiment will be described. First, regarding the conventional self-position estimation technology, the self-position was conventionally estimated by accumulating and concatenating the movement amount and a history of sensing results. For example, FIG. 12(a) shows a bird's-eye conversion of the images of fisheye cameras that capture the downward directions on the left and right of the vehicle. Although the current bird's-eye image 121 in FIG. 12(a) is small, when, as shown in FIG. 12(b), the part 123 obtained by accumulating and concatenating the history of past movement amounts and sensing results (bird's-eye images) is added to the current bird's-eye image 121, a bird's-eye image of a certain past section can be obtained. Therefore, even if there is an obstacle such as a parked vehicle, the self-position can be estimated using past sensing results.
Next, a self-position estimation device according to a second embodiment of the present invention will be described with reference to the drawings. Since the configuration of the self-position estimation system according to the present embodiment is the same as that of the first embodiment, its detailed description is omitted.
The procedure of the self-position estimation processing according to the present embodiment will be described with reference to the flowchart of FIG. 16. In the first embodiment, the detection error of the movement amount was estimated from the travel history of the vehicle in step S40. The present embodiment differs from the first embodiment in that, in step S140, the predetermined range (extraction range A) is set with attention to the vehicle behavior in the travel history. Steps S10 to S30 and step S50 are the same as those of the first embodiment in FIG. 3, so their detailed description is omitted.
As a sixth modification, as a specific example of the vehicle behavior of step S140, the larger the change in the vehicle speed, the larger the detection error of the movement amount is estimated to be, and the extraction range A is narrowed. In this case, the extraction range A is first set to 200 m, and the maximum absolute value αabs [m/s²] of the measured value α [m/s²] of the acceleration sensor 46, which measures the acceleration in the longitudinal direction of the vehicle, is detected.
As described above in detail, the self-position estimation device according to the present embodiment estimates that the detection error of the movement amount is larger as the turning amount of the vehicle, the past movement amount change in the travel history, is larger, and reduces the predetermined range (extraction range A). When the vehicle turns, the detection error of the movement amount grows not only in the turning direction but also in the vehicle width direction owing to tire slip. Therefore, when, based on the travel history, the turning amount of the vehicle, at which detection errors of the movement amount readily accumulate, is large, the accumulation of detection errors can be reduced by narrowing the predetermined range (extraction range A), and the self-position can thereby be estimated with high accuracy and stability.
2, 2a, 2b Camera
3 Three-dimensional map database
4 Vehicle sensor group
5, 5a, 5b Laser range finder
10 Self-position estimation device
12 Target position detection unit
14 Movement amount detection unit
16 Target position accumulation unit
18 Map information acquisition unit
20 Self-position estimation unit
41 GPS receiver
42 Accelerator sensor
43 Steering sensor
44 Brake sensor
45 Vehicle speed sensor
46 Acceleration sensor
47 Wheel speed sensor
48 Yaw rate sensor
Claims (9)
- A self-position estimation device for estimating a self-position of a moving body, comprising:
a target position detection unit configured to detect target positions of targets existing around the moving body;
a movement amount detection unit configured to detect a movement amount of the moving body;
a target position accumulation unit configured to move the target positions detected by the target position detection unit by the movement amount detected by the movement amount detection unit and to accumulate them as target position data;
a map information acquisition unit configured to acquire map information including target positions of targets existing on a map; and
a self-position estimation unit configured to estimate the self-position of the moving body by collating the target position data in a predetermined range, set based on a movement history up to a current position of the moving body, with the target positions included in the map information.
- The self-position estimation device according to claim 1, wherein the self-position estimation unit reduces the predetermined range as a past change in the movement amount of the moving body in the movement history is larger.
- The self-position estimation device according to claim 1 or 2, wherein the moving body is a vehicle, and
the self-position estimation unit reduces the predetermined range as the number of right and left turns of the vehicle in the movement history is larger.
- The self-position estimation device according to any one of claims 1 to 3, wherein the moving body is a vehicle, and
the self-position estimation unit reduces the predetermined range as the number of lane changes of the vehicle in the movement history is larger.
- The self-position estimation device according to any one of claims 1 to 4, wherein the moving body is a vehicle, and
the self-position estimation unit reduces the predetermined range as the number of branchings or mergings of the vehicle in the movement history is larger.
- The self-position estimation device according to any one of claims 1 to 5, wherein the moving body is a vehicle, and
the self-position estimation unit reduces the predetermined range as the radius of curvature of a curve traveled by the vehicle in the movement history is smaller.
- The self-position estimation device according to any one of claims 1 to 6, wherein the self-position estimation unit reduces the predetermined range as a turning amount of the moving body in the movement history is larger.
- The self-position estimation device according to any one of claims 1 to 7, wherein the self-position estimation unit reduces the predetermined range as a change in the moving speed of the moving body in the movement history is larger.
- A self-position estimation method for estimating a self-position of a moving body, comprising:
detecting, by a control unit mounted on the moving body, target positions of targets existing around the moving body;
detecting, by the control unit, a movement amount of the moving body;
moving, by the control unit, the detected target positions by the detected movement amount and accumulating them as target position data;
acquiring, by the control unit, map information including target positions of targets existing on a map; and
estimating, by the control unit, the self-position of the moving body by collating the target position data in a predetermined range, set based on a movement history up to a current position of the moving body, with the target positions included in the map information.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580080282.3A CN107615201B (zh) | 2015-05-28 | 2015-05-28 | 自身位置估计装置及自身位置估计方法 |
EP15893366.3A EP3306429B1 (en) | 2015-05-28 | 2015-05-28 | Position estimation device and position estimation method |
RU2017146057A RU2668459C1 (ru) | 2015-05-28 | 2015-05-28 | Устройство оценки положения и способ оценки положения |
CA2987373A CA2987373C (en) | 2015-05-28 | 2015-05-28 | Position estimation device and position estimation method |
KR1020177034796A KR101880013B1 (ko) | 2015-05-28 | 2015-05-28 | 자기 위치 추정 장치 및 자기 위치 추정 방법 |
US15/577,156 US10260889B2 (en) | 2015-05-28 | 2015-05-28 | Position estimation device and position estimation method |
MX2017015167A MX364590B (es) | 2015-05-28 | 2015-05-28 | Dispositivo de estimacion de posicion y metodo de estimacion de posicion. |
JP2017520184A JP6384604B2 (ja) | 2015-05-28 | 2015-05-28 | 自己位置推定装置及び自己位置推定方法 |
PCT/JP2015/065415 WO2016189732A1 (ja) | 2015-05-28 | 2015-05-28 | 自己位置推定装置及び自己位置推定方法 |
BR112017025513A BR112017025513A2 (pt) | 2015-05-28 | 2015-05-28 | dispositivo de estimação de posição e método de estimação de posição |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/065415 WO2016189732A1 (ja) | 2015-05-28 | 2015-05-28 | 自己位置推定装置及び自己位置推定方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016189732A1 true WO2016189732A1 (ja) | 2016-12-01 |
Family
ID=57393976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/065415 WO2016189732A1 (ja) | 2015-05-28 | 2015-05-28 | 自己位置推定装置及び自己位置推定方法 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10260889B2 (ja) |
EP (1) | EP3306429B1 (ja) |
JP (1) | JP6384604B2 (ja) |
KR (1) | KR101880013B1 (ja) |
CN (1) | CN107615201B (ja) |
BR (1) | BR112017025513A2 (ja) |
CA (1) | CA2987373C (ja) |
MX (1) | MX364590B (ja) |
RU (1) | RU2668459C1 (ja) |
WO (1) | WO2016189732A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018212280A1 (ja) * | 2017-05-19 | 2018-11-22 | パイオニア株式会社 | 測定装置、測定方法およびプログラム |
WO2018212283A1 (ja) * | 2017-05-19 | 2018-11-22 | パイオニア株式会社 | 測定装置、測定方法およびプログラム |
JP2019028617A (ja) * | 2017-07-27 | 2019-02-21 | 株式会社ゼンリン | 移動体制御システム及び移動体制御方法 |
JP2019061703A (ja) * | 2018-11-29 | 2019-04-18 | 株式会社ゼンリン | 走行支援装置、プログラム |
WO2019187750A1 (ja) * | 2018-03-28 | 2019-10-03 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
JP2019207214A (ja) * | 2018-05-30 | 2019-12-05 | クラリオン株式会社 | 情報処理装置 |
US11161506B2 (en) | 2017-04-27 | 2021-11-02 | Zenrin Co., Ltd. | Travel support device and non-transitory computer-readable medium |
WO2022208617A1 (ja) * | 2021-03-29 | 2022-10-06 | パイオニア株式会社 | 地図データ構造、記憶装置、情報処理装置、制御方法、プログラム及び記憶媒体 |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
MX368749B (es) * | 2015-07-31 | 2019-10-15 | Nissan Motor | Metodo de control de desplazamiento y aparato de control de desplazamiento. |
US10508923B2 (en) | 2015-08-28 | 2019-12-17 | Nissan Motor Co., Ltd. | Vehicle position estimation device, vehicle position estimation method |
JP6932058B2 (ja) * | 2017-10-11 | 2021-09-08 | 日立Astemo株式会社 | 移動体の位置推定装置及び位置推定方法 |
CN110243357B (zh) * | 2018-03-07 | 2021-09-10 | 杭州海康机器人技术有限公司 | 一种无人机定位方法、装置、无人机及存储介质 |
WO2019185165A1 (en) * | 2018-03-30 | 2019-10-03 | Toyota Motor Europe | System and method for adjusting external position information of a vehicle |
KR102420476B1 (ko) * | 2018-05-25 | 2022-07-13 | 에스케이텔레콤 주식회사 | 차량의 위치 추정 장치, 차량의 위치 추정 방법, 및 이러한 방법을 수행하도록 프로그램된 컴퓨터 프로그램을 저장하는 컴퓨터 판독가능한 기록매체 |
WO2020080088A1 (ja) * | 2018-10-15 | 2020-04-23 | 三菱電機株式会社 | 情報処理装置 |
GB2596708B (en) * | 2019-03-07 | 2024-01-10 | Mobileye Vision Technologies Ltd | Aligning road information for navigation |
KR102634443B1 (ko) * | 2019-03-07 | 2024-02-05 | 에스케이텔레콤 주식회사 | 차량용 센서의 보정 정보 획득 장치 및 방법 |
CN112149659B (zh) * | 2019-06-27 | 2021-11-09 | 浙江商汤科技开发有限公司 | 定位方法及装置、电子设备和存储介质 |
CN110647149B (zh) * | 2019-09-30 | 2022-09-16 | 长春工业大学 | 一种agv调度和交叉口分流控制方法 |
US20210247506A1 (en) * | 2020-02-12 | 2021-08-12 | Aptiv Technologies Limited | System and method of correcting orientation errors |
US11731639B2 (en) * | 2020-03-03 | 2023-08-22 | GM Global Technology Operations LLC | Method and apparatus for lane detection on a vehicle travel surface |
DE102020115746A1 (de) | 2020-06-15 | 2021-12-16 | Man Truck & Bus Se | Verfahren zum Beurteilen einer Genauigkeit einer Positionsbestimmung einer Landmarke, sowie Bewertungssystem |
DE102022004316A1 (de) | 2021-12-08 | 2023-06-15 | Mercedes-Benz Group AG | System und Verfahren zur Landmarkenextraktion |
DE102022002921A1 (de) | 2022-08-11 | 2024-02-22 | Mercedes-Benz Group AG | System für die Wegbestätigung eines Fahrzeugs und Verfahren davon |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08247775A (ja) * | 1995-03-15 | 1996-09-27 | Toshiba Corp | 移動体の自己位置同定装置および自己位置同定方法 |
JPH09152344A (ja) * | 1995-12-01 | 1997-06-10 | Fujitsu Ten Ltd | 車両位置検出装置 |
JP2007309757A (ja) * | 2006-05-17 | 2007-11-29 | Toyota Motor Corp | 対象物認識装置 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3898655A (en) * | 1974-01-14 | 1975-08-05 | Bendix Corp | Variable range cut-off system for dual frequency CW radar |
DE4324531C1 (de) * | 1993-07-21 | 1994-12-01 | Siemens Ag | Verfahren zur Erstellung einer Umgebungskarte und zur Bestimmung einer Eigenposition in der Umgebung durch eine selbstbewegliche Einheit |
US6023653A (en) | 1995-11-30 | 2000-02-08 | Fujitsu Ten Limited | Vehicle position detecting apparatus |
US7287884B2 (en) * | 2002-02-07 | 2007-10-30 | Toyota Jidosha Kabushiki Kaisha | Vehicle operation supporting device and vehicle operation supporting system |
JP2008250906A (ja) | 2007-03-30 | 2008-10-16 | Sogo Keibi Hosho Co Ltd | 移動ロボット、自己位置補正方法および自己位置補正プログラム |
US8655588B2 (en) * | 2011-05-26 | 2014-02-18 | Crown Equipment Limited | Method and apparatus for providing accurate localization for an industrial vehicle |
JP5761162B2 (ja) * | 2012-11-30 | 2015-08-12 | トヨタ自動車株式会社 | 車両位置推定装置 |
KR102027771B1 (ko) * | 2013-01-31 | 2019-10-04 | 한국전자통신연구원 | 차량 속도 적응형 장애물 검출 장치 및 방법 |
CN103728635B (zh) * | 2013-12-27 | 2017-01-18 | 苍穹数码技术股份有限公司 | 基于虚拟电子地标的高可靠定位预测方法及系统 |
US9347779B1 (en) * | 2014-12-10 | 2016-05-24 | Here Global B.V. | Method and apparatus for determining a position of a vehicle based on driving behavior |
2015
- 2015-05-28 KR KR1020177034796A patent/KR101880013B1/ko active IP Right Grant
- 2015-05-28 JP JP2017520184A patent/JP6384604B2/ja active Active
- 2015-05-28 US US15/577,156 patent/US10260889B2/en active Active
- 2015-05-28 CN CN201580080282.3A patent/CN107615201B/zh active Active
- 2015-05-28 RU RU2017146057A patent/RU2668459C1/ru active
- 2015-05-28 WO PCT/JP2015/065415 patent/WO2016189732A1/ja active Application Filing
- 2015-05-28 MX MX2017015167A patent/MX364590B/es active IP Right Grant
- 2015-05-28 CA CA2987373A patent/CA2987373C/en active Active
- 2015-05-28 BR BR112017025513A patent/BR112017025513A2/pt not_active Application Discontinuation
- 2015-05-28 EP EP15893366.3A patent/EP3306429B1/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08247775A (ja) * | 1995-03-15 | 1996-09-27 | Toshiba Corp | 移動体の自己位置同定装置および自己位置同定方法 |
JPH09152344A (ja) * | 1995-12-01 | 1997-06-10 | Fujitsu Ten Ltd | 車両位置検出装置 |
JP2007309757A (ja) * | 2006-05-17 | 2007-11-29 | Toyota Motor Corp | 対象物認識装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3306429A4 * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11161506B2 (en) | 2017-04-27 | 2021-11-02 | Zenrin Co., Ltd. | Travel support device and non-transitory computer-readable medium |
JPWO2018212280A1 (ja) * | 2017-05-19 | 2020-03-19 | パイオニア株式会社 | 測定装置、測定方法およびプログラム |
WO2018212283A1 (ja) * | 2017-05-19 | 2018-11-22 | パイオニア株式会社 | 測定装置、測定方法およびプログラム |
US11519727B2 (en) | 2017-05-19 | 2022-12-06 | Pioneer Corporation | Measurement device, measurement method and program |
JP2022034051A (ja) * | 2017-05-19 | 2022-03-02 | パイオニア株式会社 | 測定装置、測定方法およびプログラム |
WO2018212280A1 (ja) * | 2017-05-19 | 2018-11-22 | パイオニア株式会社 | 測定装置、測定方法およびプログラム |
JPWO2018212283A1 (ja) * | 2017-05-19 | 2020-03-19 | パイオニア株式会社 | 測定装置、測定方法およびプログラム |
JP7020813B2 (ja) | 2017-07-27 | 2022-02-16 | 株式会社ゼンリン | 移動体制御システム及びプログラム |
JP2019028617A (ja) * | 2017-07-27 | 2019-02-21 | 株式会社ゼンリン | 移動体制御システム及び移動体制御方法 |
JPWO2019187750A1 (ja) * | 2018-03-28 | 2021-01-07 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
WO2019187750A1 (ja) * | 2018-03-28 | 2019-10-03 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
US11472419B2 (en) | 2018-03-28 | 2022-10-18 | Hitachi Astemo, Ltd. | Vehicle control device |
WO2019230098A1 (ja) * | 2018-05-30 | 2019-12-05 | クラリオン株式会社 | 情報処理装置 |
CN112204348A (zh) * | 2018-05-30 | 2021-01-08 | 歌乐株式会社 | 信息处理装置 |
JP2019207214A (ja) * | 2018-05-30 | 2019-12-05 | クラリオン株式会社 | 情報処理装置 |
JP7137359B2 (ja) | 2018-05-30 | 2022-09-14 | フォルシアクラリオン・エレクトロニクス株式会社 | 情報処理装置 |
JP2019061703A (ja) * | 2018-11-29 | 2019-04-18 | 株式会社ゼンリン | 走行支援装置、プログラム |
WO2022208617A1 (ja) * | 2021-03-29 | 2022-10-06 | パイオニア株式会社 | 地図データ構造、記憶装置、情報処理装置、制御方法、プログラム及び記憶媒体 |
Also Published As
Publication number | Publication date |
---|---|
EP3306429A4 (en) | 2018-07-11 |
MX364590B (es) | 2019-05-02 |
RU2668459C1 (ru) | 2018-10-01 |
CN107615201A (zh) | 2018-01-19 |
CN107615201B (zh) | 2018-11-20 |
MX2017015167A (es) | 2018-04-13 |
KR20180004206A (ko) | 2018-01-10 |
US20180172455A1 (en) | 2018-06-21 |
EP3306429B1 (en) | 2019-09-25 |
CA2987373C (en) | 2018-12-04 |
CA2987373A1 (en) | 2016-12-01 |
JPWO2016189732A1 (ja) | 2018-03-22 |
JP6384604B2 (ja) | 2018-09-05 |
BR112017025513A2 (pt) | 2018-08-07 |
US10260889B2 (en) | 2019-04-16 |
KR101880013B1 (ko) | 2018-07-18 |
EP3306429A1 (en) | 2018-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6384604B2 (ja) | 自己位置推定装置及び自己位置推定方法 | |
US11124163B2 (en) | Method for controlling travel of vehicle, and device for controlling travel of vehicle | |
RU2692097C1 (ru) | Устройство и способ задания позиции остановки транспортного средства | |
JP6418332B2 (ja) | 車両位置推定装置、車両位置推定方法 | |
US10890453B2 (en) | Vehicle localization device | |
US11526173B2 (en) | Traveling trajectory correction method, traveling control method, and traveling trajectory correction device | |
CN109564098B (zh) | 自身位置推定方法及自身位置推定装置 | |
RU2735720C1 (ru) | Способ оценки транспортного средства, способ корректировки маршрута движения, устройство оценки транспортного средства и устройство корректировки маршрута движения | |
JP2004531424A (ja) | 車用の感知装置 | |
JP6020729B2 (ja) | 車両位置姿勢角推定装置及び車両位置姿勢角推定方法 | |
US11042759B2 (en) | Roadside object recognition apparatus | |
WO2016194168A1 (ja) | 走行制御装置及び方法 | |
JP6941178B2 (ja) | 自動運転制御装置及び方法 | |
JP7182963B2 (ja) | 移動体検知システム及び移動体検知方法 | |
US11156466B2 (en) | Lane determination device | |
JP6784633B2 (ja) | 車両の走行制御装置 | |
JP2023144778A (ja) | 区画線認識装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15893366 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017520184 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2987373 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15577156 Country of ref document: US Ref document number: MX/A/2017/015167 Country of ref document: MX |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20177034796 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017146057 Country of ref document: RU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015893366 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112017025513 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112017025513 Country of ref document: BR Kind code of ref document: A2 Effective date: 20171128 |