WO2018061084A1 - Self-position estimation method and self-position estimation device - Google Patents
Self-position estimation method and self-position estimation device
- Publication number
- WO2018061084A1 (PCT/JP2016/078428)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords: target, self, target position, position data, vehicle
Classifications
- G06V20/588: Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
- G06T7/70: Determining position or orientation of objects or cameras
- G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
- G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T2207/30244: Camera pose
- G06T2207/30256: Lane; Road marking
- G06V2201/07: Target detection
- G01C21/30: Map- or contour-matching
- G05D1/0088: Control characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
- G05D1/02: Control of position or course in two dimensions
- G05D1/0212: Control in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory
- G05D1/0246: Control using optical position detecting means, using a video camera in combination with image processing means
- G05D1/0248: Control using a video camera in combination with image processing means, in combination with a laser
- G05D1/0272: Control using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
- G05D1/0274: Control using internal positioning means using mapping information stored in a memory device
- G05D1/0278: Control using signals provided by a source external to the vehicle, using satellite positioning signals, e.g. GPS
Definitions
- the present invention relates to a self-position estimation method and a self-position estimation apparatus.
- A technique for estimating the self-position of an autonomously movable robot is known (see Patent Document 1).
- In Patent Document 1, the result of detecting with a sensor a region in which the mobile robot can move (surrounding environment information) is limited to a predetermined region referenced to the mobile robot.
- The self-position is then estimated by collating this information with a stored environment map.
- In estimating the self-position of a vehicle, white lines located on both sides of the vehicle in the vehicle width direction may be used.
- In this case, an error is included in the detected position of each white line.
- For example, the position of a white line in the vehicle width direction with respect to the vehicle may be constantly offset due to a calibration error or the like. As a result, the self-position estimation result becomes unstable, or the self-position estimation accuracy decreases.
- The present invention has been made in view of the above problems, and an object thereof is to provide a self-position estimation method and a self-position estimation apparatus that improve self-position estimation accuracy by eliminating target position data estimated to have a large relative position error.
- The self-position estimation method detects the relative position between a moving body and a target existing around the moving body, moves the detected relative position by the movement amount of the moving body, and stores the result as target position data. The target position data is selected based on the reliability of its relative position to the moving body, and the self-position, which is the current position of the moving body, is estimated by collating the selected target position data with map information including position information of targets existing on or around the road.
- According to this self-position estimation method, target position data estimated to have a large relative position error can be eliminated, so the self-position estimation accuracy can be improved.
- FIG. 1 is a block diagram illustrating an example of the configuration of the self-position estimation apparatus according to the embodiment.
- FIG. 2 is a perspective view showing a state where the surrounding sensor group 1 is mounted on the vehicle V.
- FIG. 3 is a flowchart showing an example of a self-position estimation method using the self-position estimation apparatus of FIG.
- FIG. 4 is a perspective view showing an environment in which the vehicle V travels when performing self-position estimation.
- FIGS. 5(a) to 5(d) are diagrams showing the position 71 of the curb 61 and the target position data 72 and 73 of the white lines 62 and 63 in the vehicle coordinate system, detected by the target position detection unit 31 at times t1 to t4 in the example shown in FIG. 4.
- FIG. 6 is a diagram showing a result of integrating the movement amount of the vehicle V calculated based on the detection result by the vehicle sensor group 5 in the examples shown in FIGS. 5 (a) to 5 (d).
- FIG. 7 is a diagram showing the target position data converted into the odometry coordinate system in the examples shown in FIGS. 5 and 6.
- FIG. 8 is a conceptual diagram showing straight line information (N1, N2, N3) extracted from target position data (71a to 71d, 72a to 72d, 73a to 73d).
- FIG. 9 is a diagram showing straight lines (N2, N3) approximated to the target position data (72, 73) indicating the road boundary.
- FIG. 10 is a diagram illustrating a state in which target position data (72j, 72k, 73j, 73k) indicating road boundaries that identify the road on which the vehicle V is traveling is detected, part of which can be approximated by straight lines.
- FIG. 11 is a diagram illustrating a state in which target position data (72m, 72n, 73m, 73n) indicating road boundaries that identify the road on which the vehicle V is traveling is detected, part of which can be approximated by curves.
- the self-position estimation apparatus includes a surrounding sensor group 1, a processing device 3, a storage device 4, and a vehicle sensor group 5.
- the self-position estimation apparatus according to the present embodiment is mounted on a vehicle V (see FIG. 2) and estimates the self-position of the vehicle V.
- As self-position information, the apparatus estimates a total of three degrees of freedom on a two-dimensional plane: the position in the east-west direction (X-axis direction, X coordinate [m]), the position in the north-south direction (Y-axis direction, Y coordinate [m]), and the attitude angle, i.e. the azimuth angle θ (yaw angle [rad]) of the vehicle.
- the ambient sensor group 1 includes, for example, a plurality of laser range finders (LRF) 101 and 102 and a plurality of cameras 201 and 202.
- Laser range finders (LRF) 101 and 102 detect the distance and direction to the target by receiving the reflected light of the irradiated laser light from the target.
- the cameras 201 and 202 capture the surroundings of the vehicle V and acquire digital images that can be processed.
- the ambient sensor group 1 includes a plurality of sensors that respectively detect the targets existing around the vehicle V.
- the ambient sensor group 1 may include sonar and radar.
- The targets existing around the vehicle V include targets indicating road boundaries, such as white lines, curbs, median strips, guardrails, and reflectors present on the road around the vehicle V; road markings such as stop lines, pedestrian crossings, and speed limit markings; and road structures such as signs, traffic lights, and utility poles.
- FIG. 2 is an example illustrating a state in which the surrounding sensor group 1 is mounted on the vehicle V.
- the LRFs 101 and 102 can be mounted, for example, near the front fenders on the left and right sides of the vehicle V, respectively.
- The LRFs 101 and 102 each scan a laser beam over a predetermined scanning angle (for example, 90°) so that the locus of the emitted laser beam forms a plane perpendicular to the road surface, with the front-rear direction D of the vehicle V as the rotation axis.
- the LRFs 101 and 102 can detect a target such as a curb that exists in the left-right direction of the vehicle V.
- the LRFs 101 and 102 sequentially output the detected target shape to the processing device 3 as a detection result.
- the cameras 201 and 202 can be mounted on door mirrors on both the left and right sides of the vehicle V, for example.
- the cameras 201 and 202 capture an image with a solid-state image sensor such as a CCD or CMOS, for example.
- the cameras 201 and 202 photograph a road surface on the side of the vehicle V.
- the cameras 201 and 202 sequentially output the captured images to the processing device 3.
- the storage device 4 is a map information storage unit that stores map information 41 including position information of targets existing on or around the road.
- the storage device 4 can be composed of a semiconductor memory, a magnetic disk, or the like.
- Targets (landmarks) recorded in the map information 41 include, in addition to road markings such as stop lines, pedestrian crossings, pedestrian crossing notices, and lane markings, and structures such as curbs, various other features that can be detected by the surrounding sensor group 1.
- In the map information 41, even a target that actually has a three-dimensional shape, such as a curb, is described only by position information on a two-dimensional plane. Position information of curbs, white lines, and the like is defined by a collection of straight-line segments, each having two-dimensional position information for both of its end points.
- When the shape in the real environment is a curve, the map information 41 describes it as straight-line information on a two-dimensional plane approximating the curve by a polyline.
- the vehicle sensor group 5 includes a GPS receiver 51, an accelerator sensor 52, a steering sensor 53, a brake sensor 54, a vehicle speed sensor 55, an acceleration sensor 56, a wheel speed sensor 57, and other sensors such as a yaw rate sensor.
- Each of the sensors 51 to 57 is connected to the processing device 3, and sequentially outputs various detection results to the processing device 3.
- the processing device 3 can calculate the position of the vehicle V in the map information 41 using each detection result of the vehicle sensor group 5 or calculate an odometry indicating the amount of movement of the vehicle V per unit time.
- The movement amount of the vehicle V can be measured by various known methods: odometry based on tire rotation counts, inertial measurement using a gyroscope and an acceleration sensor, reception of radio waves from satellites such as GNSS (Global Navigation Satellite System), or external sensing such as SLAM (Simultaneous Localization and Mapping).
- The processing device 3 includes a target position detection unit 31, a movement amount estimation unit 32, a target position accumulation unit 33, a straight line extraction unit 34, a target position selection unit 35, a self-position estimation unit 36, and a target attribute estimation unit 37.
- The processing device 3 can be configured by, for example, a microcontroller, i.e. an integrated circuit including a central processing unit (CPU), a memory, an input/output I/F, and the like. In this case, the plurality of information processing units (31 to 37) constituting the processing device 3 are realized by the CPU executing a computer program installed in advance in the microcontroller.
- Each part constituting the processing device 3 may be composed of integral hardware or of separate pieces of hardware.
- the microcontroller may also be used as an electronic control unit (ECU) used for other control related to the vehicle V such as automatic driving control.
- The “self-position estimation circuit” includes the movement amount estimation unit 32, the target position accumulation unit 33, the straight line extraction unit 34, the target position selection unit 35, the self-position estimation unit 36, and the target attribute estimation unit 37.
- The target position detection unit 31 detects the relative position between the vehicle V and a target existing around the vehicle V based on the detection result of at least one of the LRFs 101 and 102 and the cameras 201 and 202.
- the relative position detected by the target position detection unit 31 is a position in the vehicle coordinate system.
- the center of the rear wheel axle of the vehicle V may be the origin
- the forward direction may be the positive direction of the x axis
- the left direction may be the positive direction of the y axis
- the upward direction may be the positive direction of the z axis.
- the “target detection sensor” includes the vehicle sensor group 5 and the target position detection unit 31.
- the movement amount estimation unit 32 detects odometry, which is the movement amount of the vehicle V per unit time, based on detection result information of at least one of the sensors included in the vehicle sensor group 5.
- the movement amount of the vehicle V is detected as a movement amount in the odometry coordinate system.
- The target position accumulation unit 33 moves the relative position of the target detected by the target position detection unit 31 by the movement amount of the vehicle V detected by the movement amount estimation unit 32, and accumulates the result as target position data in a primary storage device in the processing device 3 or in the storage device 4.
- the straight line extraction unit 34 extracts straight line information from the target position data accumulated by the target position accumulation unit 33.
- the target attribute estimation unit 37 estimates target attributes based on the detection results of at least one of the LRFs 101 and 102 and the cameras 201 and 202.
- the target position selection unit 35 selects target position data based on the reliability of the relative position of the target position data to the vehicle.
- Specifically, the target position selection unit 35 judges the reliability of the relative position of the target position data with respect to the vehicle V based on the straight line information extracted by the straight line extraction unit 34 and the target attribute estimated by the target attribute estimation unit 37.
- The self-position estimation unit 36 estimates the self-position, which is the current position of the vehicle V, by collating the selected target position data with map information including position information of targets existing on or around the road.
- In step S01, the self-position estimation apparatus measures the surroundings of the vehicle V using the surrounding sensor group 1.
- the surrounding sensor group 1 detects each target existing around the vehicle V.
- In step S03, the target position detection unit 31 estimates the relative position of each target in the sensor coordinate system, i.e. the position relative to the LRFs 101 and 102 and the cameras 201 and 202, based on the detection result of at least one of them.
- For the cameras 201 and 202, for example, the relationship between the position in the image and the distance may be measured in advance, or the motion stereo method can be used; other known methods may also be applied. For sensors that directly acquire distance information (sonar, LRF, radar), the measured values may be used as they are.
- FIG. 4 is an example illustrating an environment in which the vehicle V travels when performing self-position estimation.
- the laser light emitted from the LRF 101 is applied to the road surface including the curb 61 like a line 64.
- For example, the target position detection unit 31 extracts, as the position of the curb 61, a place where the shape changes sharply in the direction and distance of the irradiated laser light, and detects that position in the sensor coordinate system. Since it can be assumed that a road surface always exists vertically below the LRFs 101 and 102, the curb 61 can be detected by extracting the point whose height changes greatly compared with the road surface.
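- As a concrete illustration of this idea (not the patent's actual implementation), the following minimal Python sketch scans one ordered LRF height profile for the first large height step; the function name, the 10 cm threshold, and the (lateral distance, height) representation are assumptions made for the example.

```python
import numpy as np

def detect_curb(scan, height_jump=0.10):
    """Return the lateral position of the first sharp height step in one
    LRF profile, or None if the profile looks like flat road surface.

    scan: (N, 2) array of (lateral_distance, height) points ordered along
    the scan line; height_jump: step threshold in metres (assumed value).
    """
    dz = np.abs(np.diff(scan[:, 1]))  # height change between neighbouring points
    idx = int(np.argmax(dz))
    return float(scan[idx, 0]) if dz[idx] > height_jump else None
```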
- the target position detection unit 31 detects white lines 62 and 63 existing on both sides of the vehicle V from the luminance information of the images captured by the cameras 201 and 202.
- Specifically, the target position detection unit 31 detects, in the grayscale images captured by the cameras 201 and 202, a pattern in which the luminance changes in the order dark, bright, dark, and treats the center of the bright part as the position of the white lines 62 and 63.
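- A minimal sketch of this dark-bright-dark search on a single grayscale image row follows; the luminance threshold and the assumption that the row starts and ends on dark road surface are illustrative, as the text does not specify them.

```python
import numpy as np

def white_line_center(row, bright_min=160):
    """Centre column of the first dark -> bright -> dark run in one
    grayscale image row, or None. Assumes the row starts and ends on
    dark road surface; `bright_min` is an illustrative threshold.
    """
    bright = (np.asarray(row) >= bright_min).astype(int)
    edges = np.flatnonzero(np.diff(bright))  # [rise, fall, rise, fall, ...]
    if len(edges) < 2:
        return None
    rise, fall = edges[0], edges[1]          # first bright run
    return (rise + fall + 1) // 2            # centre of the bright part
```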
- The positions in the sensor coordinate system detected in step S03 are treated as two-dimensional data after the height information is excluded.
- Proceeding to step S05, the target position detection unit 31 converts the relative position of the target in the sensor coordinate system into the relative position in the vehicle coordinate system using a preset conversion formula.
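- A minimal sketch of such a conversion formula for a 2-D point, assuming the sensor extrinsics (mounting position and yaw in the rear-axle-centred vehicle frame) are known from calibration; the function name and the example mounting values are illustrative.

```python
import numpy as np

def sensor_to_vehicle(p_sensor, mount_xy=(1.2, 0.9), mount_yaw=np.pi / 2):
    """Rotate and translate a 2-D point from the sensor frame into the
    vehicle frame (origin at rear axle centre, x forward, y left)."""
    c, s = np.cos(mount_yaw), np.sin(mount_yaw)
    R = np.array([[c, -s], [s, c]])
    return R @ np.asarray(p_sensor, dtype=float) + np.asarray(mount_xy)

# e.g. a left-mounted camera seeing a white line 2 m ahead in its own frame
print(sensor_to_vehicle((2.0, 0.0)))   # -> point in vehicle coordinates
```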
- FIGS. 5(a) to 5(d) show the position 71 of the curb 61 and the target position data 72 and 73 of the white lines 62 and 63 in the vehicle coordinate system, detected by the target position detection unit 31 at times t1 to t4 in the example shown in FIG. 4.
- t1 is the oldest time and t4 is the newest time.
- the movement amount estimation unit 32 calculates the position of the vehicle V in the odometry coordinate system by integrating the movement amount of the vehicle V calculated based on the detection result by the vehicle sensor group 5.
- the position of the vehicle V at the time when power is supplied to the self-position estimation apparatus or when the process is reset may be set as the origin, and the azimuth angle of the vehicle V may be set to 0 °.
- the integration of the movement amount of the vehicle V is performed in the odometry coordinate system.
- FIG. 6 is a diagram showing the result of integrating the movement amounts (M1, M2, M3, M4) of the vehicle V calculated based on the detection results of the vehicle sensor group 5 in the examples shown in FIGS. 5(a) to 5(d).
- Each movement amount includes the change in position and orientation (θ: yaw angle) on the two-dimensional coordinates.
- the movement amount estimation unit 32 calculates the position (Xo, Yo) of the vehicle V in the odometry coordinate system.
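- A minimal dead-reckoning sketch of this integration, assuming the per-step travel distance (e.g. from the wheel speed sensor) and heading change (e.g. yaw rate multiplied by the time step) are available; the midpoint-heading update is one common choice, not necessarily the patent's.

```python
import numpy as np

def integrate_odometry(pose, d_dist, d_yaw):
    """Advance the odometry pose (Xo, Yo, theta) by one step of travelled
    distance d_dist [m] and heading change d_yaw [rad]."""
    x, y, th = pose
    th_mid = th + 0.5 * d_yaw                      # midpoint heading
    return (x + d_dist * np.cos(th_mid),
            y + d_dist * np.sin(th_mid),
            th + d_yaw)

pose = (0.0, 0.0, 0.0)                             # origin set at power-on
for d_dist, d_yaw in [(1.0, 0.00), (1.0, 0.05), (1.0, 0.05)]:  # e.g. M1..M3
    pose = integrate_odometry(pose, d_dist, d_yaw)
print(pose)                                        # vehicle position (Xo, Yo, theta)
```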
- Proceeding to step S07, the target position accumulation unit 33 moves the relative position of the target in the vehicle coordinate system, detected by the target position detection unit 31, by the movement amount of the vehicle V detected by the movement amount estimation unit 32, and accumulates the result as target position data.
- FIG. 7 is a diagram showing the target position data (71a to 71d, 72a to 72d, 73a to 73d) converted into the odometry coordinate system in the examples shown in FIGS. 5 and 6.
- That is, the target position accumulation unit 33 converts the target positions measured at past times (t1, t2, t3, ...) into positions in the odometry coordinate system based on the movement amounts (M1, M2, M3, M4), and accumulates these as target position data.
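- The conversion described here amounts to applying, to each vehicle-frame detection, the odometry pose the vehicle had at the detection time. A sketch follows; the data values and names are illustrative, not from the patent.

```python
import numpy as np

def to_odometry(p_vehicle, pose):
    """Transform a vehicle-frame point into the odometry frame using the
    odometry pose (Xo, Yo, theta) at the detection time."""
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    px, py = p_vehicle
    return (x + c * px - s * py, y + s * px + c * py)

poses = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.05)]        # odometry poses at t1, t2
detections = [[(5.0, 1.8)], [(5.0, 1.78)]]         # white-line points per time
target_position_data = [to_odometry(p, pose)
                        for pose, pts in zip(poses, detections) for p in pts]
print(target_position_data)
```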
- In step S09, the target position selection unit 35 extracts target position data (71a to 71d, 72a to 72d, 73a to 73d) indicating road boundaries from the plurality of accumulated target position data.
- Then, the reliability of the relative position with respect to the vehicle V is calculated for the extracted target position data (71a to 71d, 72a to 72d, 73a to 73d).
- First, the straight line extraction unit 34 extracts straight line information (N1, N2, N3) from the target position data (71a to 71d, 72a to 72d, 73a to 73d) accumulated by the target position accumulation unit 33; for example, a linear approximation is performed on the white line detection results (target position data). The target position selection unit 35 then judges the reliability of the relative position of the target position data to the vehicle V according to the difference between the distance from the vehicle V to the target obtained from the relative position and the assumed distance from the vehicle V to the target: the smaller the difference, the higher the reliability.
- The straight line extraction unit 34 approximates straight lines (N2, N3) to the target position data (72, 73) indicating the road boundaries, as shown in FIG. 9.
- the target position selection unit 35 measures the distance in the vehicle width direction from the vehicle V to the straight line (N2, N3).
- For example, assume that the assumed distance from the center of the vehicle V to a white line is 2 m.
- If the absolute value of the difference between the distance from the vehicle V to a straight line (N2, N3) and the assumed distance (2 m) is 1 m or more, allowing for displacement of the vehicle V within the lane, detection error, and the like, the distance to that straight line is determined to be highly likely to be inaccurate.
- In this example, the target position selection unit 35 evaluates the reliability of the target position data 73 indicating the right road boundary as low, and evaluates the reliability of the target position data 72 indicating the left road boundary as high.
- The target position selection unit 35 excludes the target position data 73 evaluated as having low reliability, and selects only the target position data 72 evaluated as having high reliability.
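- A sketch of this check for one detected boundary, fitting a line to the vehicle-frame points and comparing its lateral offset with the assumed distance (2 m) and tolerance (1 m) from the example above; in practice `assumed` and `tol` would come from the road layout, and the names here are illustrative.

```python
import numpy as np

def lateral_distance(points):
    """Fit y = a*x + b to boundary points in the vehicle frame (x forward,
    y left) and return the perpendicular distance of the line from the
    vehicle origin."""
    pts = np.asarray(points, dtype=float)
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
    return abs(b) / np.hypot(a, 1.0)

def is_reliable(points, assumed=2.0, tol=1.0):
    """Keep the boundary only if its distance is close to the assumed one."""
    return abs(lateral_distance(points) - assumed) < tol

left  = [(0.0, 1.9), (2.0, 1.95), (4.0, 2.0)]    # near the assumed 2 m: kept
right = [(0.0, -3.4), (2.0, -3.3), (4.0, -3.5)]  # 1 m or more off: excluded
print(is_reliable(left), is_reliable(right))     # True False
```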
- the reliability determination method based on the difference from the assumed distance described above can be applied not only to the road boundary such as white lines and curbs, but also to other targets.
- road structures such as signs, traffic lights, and utility poles exist in the roadside zone. Therefore, the assumed distance can be set from the running road width, and the difference between the relative distance from the vehicle V to the detected road structure and the assumed distance can be calculated.
- The target position selection unit 35 then further judges reliability based on the attributes of the targets, for the target position data 72 selected based on the difference from the assumed distance. Specifically, the target position selection unit 35 judges the reliability of the target position data 72 based on the target attribute estimated by the target attribute estimation unit 37, and further narrows down the target position data used for self-position estimation.
- Even among white lines, a solid line can be detected over a larger area than a broken line, so the detection accuracy of its relative position, i.e. its reliability, can be judged to be higher.
- From the map information, it can be specified in advance whether a detected white line is a solid line or a broken line. If the white line on one side of the vehicle V is a solid line while the white line on the other side turns out to be a broken line, the target position data indicating the solid white line is judged to be more reliable, even when the detection errors and the distances from the vehicle V are similar, and that target position data is selected.
- The type of white line is just one example of a target attribute, and other attributes can be used in the same way. For example, since a white line is easier to detect than a yellow line, the reliability of the white line is judged to be higher. Reliability may also be compared between different kinds of targets; for example, between a stop line and a pedestrian crossing, the pedestrian crossing has more characteristic parts, so it can be judged more reliable.
- Next, for the target position data selected based on the target attributes described above, the target position selection unit 35 further judges reliability based on the time during which the target position data could be continuously detected.
- Then, for the target position data selected based on the continuous detection time, the target position selection unit 35 further judges reliability based on the error distribution obtained when the target position data of a road boundary is linearly approximated. In other words, the target position selection unit 35 judges the reliability of the target position data based on the straight line information (approximate lines) extracted by the straight line extraction unit 34, and further narrows down the target position data used for self-position estimation.
- Specifically, the target position selection unit 35 determines whether a plurality of parallel road boundaries is detected as target position data indicating road boundaries (for example, white lines) that identify the road on which the vehicle V is traveling.
- the straight line extraction unit 34 performs linear approximation on the target position data indicating the road boundary.
- As shown in FIG. 10, the target position selection unit 35 selects, from the target position data (72j, 72k, 73j, 73k), the target position data (72j, 73j) included in the range LA that can be approximated by straight lines. In doing so, the target position selection unit 35 extends the linearly approximable range LA outward from the vehicle V as the reference. For example, a section in which the minimum distance from the target position data to the approximate line is within ±15 cm for 80% or more of the target position data is set as the range LA. Target position data (72k, 73k) not included in the range LA is excluded.
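- A sketch of growing the linearly approximable range LA outward from the vehicle with the thresholds given above (±15 cm, 80%); the incremental prefix-fitting strategy and function name are assumptions for the example.

```python
import numpy as np

def linear_range(points, max_dev=0.15, min_ratio=0.8):
    """Return the longest prefix of boundary points (ordered by distance
    from the vehicle) that a fitted line still approximates well: at
    least `min_ratio` of the points within `max_dev` [m] of the line."""
    pts = np.asarray(points, dtype=float)
    n_best = min(2, len(pts))
    for n in range(3, len(pts) + 1):
        a, b = np.polyfit(pts[:n, 0], pts[:n, 1], 1)
        dev = np.abs(pts[:n, 1] - (a * pts[:n, 0] + b)) / np.hypot(a, 1.0)
        if np.mean(dev <= max_dev) >= min_ratio:
            n_best = n                 # section still linearly approximable
    return pts[:n_best]                # data outside LA (72k, 73k) is dropped
```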
- When the road boundary is curved, as shown in FIG. 11, the straight line extraction unit 34 performs curve approximation instead of linear approximation.
- The target position selection unit 35 evaluates highly the reliability of the target position data (72m, 73m) included in the range LB that can be approximated by the curves (N2, N3), and selects it as the target position data used for self-position estimation.
- target position data (72n, 73n) not included in the range LB that can be approximated by the curves (N2, N3) is excluded.
- In the above description, the target position selection unit 35 judges the reliability of the relative position of the target position data with respect to the vehicle V in the order of (1) the difference from the assumed distance from the vehicle V to the target, (2) the attribute of the target, (3) the continuous detection time, and (4) the error distribution obtained when the target position data indicating a road boundary is linearly approximated.
- The present invention is not limited to this; the order of the reliability judgments can be changed arbitrarily, and only some of the judgments (1) to (4) may be carried out.
- Each reliability judgment may also be quantified and combined into a comprehensive evaluation. For example, evaluation scores may be given in multiple stages for each reliability judgment, and an overall score may be calculated by summing the evaluation scores. In this way, the reliability of detected targets can be judged after being quantified.
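- One way to realise such a comprehensive evaluation is a staged score per criterion summed into a total; every threshold, weight, and attribute name below is an assumption for illustration, not a value from the patent.

```python
def reliability_score(dist_diff, attribute, detect_time, fit_error):
    """Sum staged scores for checks (1)-(4); higher means more reliable."""
    score = 0
    score += 2 if dist_diff < 0.5 else 1 if dist_diff < 1.0 else 0       # (1) [m]
    score += {"solid_white": 2, "broken_white": 1, "yellow_line": 0}.get(
        attribute, 0)                                                    # (2)
    score += 2 if detect_time > 5.0 else 1 if detect_time > 1.0 else 0   # (3) [s]
    score += 2 if fit_error < 0.05 else 1 if fit_error < 0.15 else 0     # (4) [m]
    return score

# a target could then be kept when, say, reliability_score(...) >= 5
print(reliability_score(0.1, "solid_white", 8.0, 0.03))   # 8: very reliable
```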
- the self-position estimating unit 36 collates the target position data selected by the target position selecting unit 35 with the position of the target in the map information 41. In other words, the position of the target in the map information 41 and the target position data determined to be highly reliable by the target position selection unit 35 are matched.
- The self-position estimation unit 36 estimates the self-position of the vehicle V through the above-described target position collation (map matching). That is, it estimates a total of three degrees of freedom, comprising the position in the east-west direction (X coordinate), the position in the north-south direction (Y coordinate), and the azimuth angle (yaw angle θ) of the vehicle V.
- a known self-position estimation method may be used as a method for estimating the position on the map. Proceeding to step S17, the self-position estimating unit 36 outputs the estimated self-position of the vehicle V.
- For lane markings and the like, the self-position estimation unit 36 uses as evaluation points, for example, the end points at both ends of the target positions included in the map information 41.
- Since target position data closer to the vehicle V (the surrounding sensor group 1) is less affected by odometry error, the self-position estimation unit 36 can increase the number of evaluation points near the vehicle V by supplementing them with linear interpolation, and can reduce the number of evaluation points far from the vehicle V.
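- To make the collation step concrete, the sketch below aligns the selected target position data with map evaluation points by a coarse grid search over (X, Y, yaw); a production system would use a proper optimiser (e.g. ICP or a particle filter), and all names and search spans here are illustrative assumptions.

```python
import numpy as np

def match_pose(points, map_pts, pose0, span=(1.0, 1.0, 0.05), steps=9):
    """Search around the odometry pose pose0 = (X, Y, yaw) for the pose
    whose transform best aligns the selected target position data
    (vehicle-side points) with the map evaluation points."""
    points, map_pts = np.asarray(points, float), np.asarray(map_pts, float)
    best, best_cost = pose0, np.inf
    for dx in np.linspace(-span[0], span[0], steps):
        for dy in np.linspace(-span[1], span[1], steps):
            for dth in np.linspace(-span[2], span[2], steps):
                x, y, th = pose0[0] + dx, pose0[1] + dy, pose0[2] + dth
                c, s = np.cos(th), np.sin(th)
                world = points @ np.array([[c, s], [-s, c]]) + (x, y)
                # cost: sum of distances to the nearest map evaluation point
                cost = np.linalg.norm(
                    world[:, None, :] - map_pts[None, :, :],
                    axis=2).min(axis=1).sum()
                if cost < best_cost:
                    best, best_cost = (x, y, th), cost
    return best
```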
- As described above, since the target position data is selected based on the reliability of its relative position to the vehicle V, target position data estimated to have a large relative position error can be eliminated, and the self-position estimation accuracy improves.
- the target position selection unit 35 determines that the reliability of the relative position of the target position data to the vehicle is higher as the difference between the distance from the vehicle V to the target and the assumed distance is smaller. As a result, target position data that is estimated to have a large relative position error can be appropriately excluded, so that the self-position estimation accuracy is improved.
- The target position selection unit 35 judges the reliability of the relative position of the target position data with respect to the vehicle V based on the attributes of the targets. For example, when a solid white line and a broken white line are compared, the solid line, for which target position data can be acquired constantly, is judged to be more reliable than the broken line. Target position data estimated to have a large relative position error can therefore be identified appropriately, and the self-position estimation accuracy improves.
- the target position selection unit 35 determines the reliability of the relative position of the target position data with respect to the vehicle V based on the time when the target position data can be detected continuously. As a result, it is possible to stably and accurately estimate the self position.
- When target position data indicating a plurality of parallel road boundaries that identify the road on which the vehicle V travels is detected, the target position selection unit 35 selects the target position data whose relative position to the vehicle V has high reliability. Thereby, road boundaries with high position accuracy are selected, and the self-position estimation accuracy is further increased.
- the target position selection unit 35 determines that the reliability of the relative position of the target position data indicating the road boundary with the vehicle V is higher as the error from the approximate line when the road boundary is approximated is smaller. As a result, a road boundary with high detection position accuracy is selected, and the self-position estimation accuracy is further increased.
- the moving body is not limited to the vehicle V as a moving object moving on land, but includes a ship, an aircraft, a spacecraft, and other moving objects.
- The processing circuit includes a programmed processing device, such as a processing device including an electrical circuit, as well as devices such as application-specific integrated circuits (ASICs) and conventional circuit components arranged to perform the functions described in the embodiments.
- 31 Target position detection unit (target detection sensor)
- 32 Movement amount estimation unit (self-position estimation circuit)
- 33 Target position accumulation unit (self-position estimation circuit)
- 34 Straight line extraction unit (self-position estimation circuit)
- 35 Target position selection unit (self-position estimation circuit)
- 36 Self-position estimation unit (self-position estimation circuit)
- 37 Target attribute estimation unit (self-position estimation circuit)
- 41 Map information
- 61 Step (target)
- 62, 63 White lines (targets)
- 72j, 72k, 72m, 72n Target position data
- 73j, 73k, 73m, 73n Target position data
- M1 to M4 Movement amounts of the moving body
- N1, N2, N3 Lines approximating a plurality of target position data
- V Vehicle (moving body)
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Optics & Photonics (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Claims (7)
- A self-position estimation method using: a target detection sensor mounted on a moving body and configured to detect the relative position between the moving body and a target existing around the moving body; and a self-position estimation circuit configured to accumulate, as target position data, positions obtained by moving the relative position by the movement amount of the moving body, and to estimate the self-position, which is the current position of the moving body, by collating the accumulated target position data with map information including position information of targets existing on or around a road, the method comprising: selecting the target position data based on the reliability of the relative position of the target position data to the moving body; and estimating the self-position by collating the selected target position data with the map information.
- The self-position estimation method according to claim 1, wherein the reliability is judged to be higher as the difference between the distance from the moving body to the target obtained from the relative position and an assumed distance from the moving body to the target is smaller.
- The self-position estimation method according to claim 1 or 2, wherein the reliability is judged based on an attribute of the target.
- The self-position estimation method according to any one of claims 1 to 3, wherein the reliability is judged based on the time during which the target position data could be continuously detected.
- The self-position estimation method according to any one of claims 1 to 4, wherein, when target position data indicating a plurality of parallel road boundaries that identify the road on which the moving body travels is detected, the target position data whose relative position to the moving body has high reliability is selected.
- The self-position estimation method according to claim 5, wherein the reliability of the relative position, to the moving body, of the target position data indicating the road boundary is judged to be higher as the error between a line approximating the plurality of target position data indicating the road boundary and the plurality of target position data is smaller.
- A self-position estimation device comprising: a target detection sensor mounted on a moving body and configured to detect the relative position between the moving body and a target existing around the moving body; and a self-position estimation circuit configured to accumulate, as target position data, positions obtained by moving the relative position by the movement amount of the moving body, to select the target position data based on the reliability of the relative position of the target position data to the moving body, and to estimate the self-position, which is the current position of the moving body, by collating the selected target position data with map information including position information of the target existing on a map.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680089623.8A CN109791408B (zh) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
US16/336,366 US11321572B2 (en) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
PCT/JP2016/078428 WO2018061084A1 (ja) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
BR112019006057-0A BR112019006057B1 (pt) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
KR1020197007722A KR20190045220A (ko) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
CA3038643A CA3038643A1 (en) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
RU2019112711A RU2720140C1 (ru) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
JP2018541754A JP6881464B2 (ja) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
EP16917637.7A EP3521962B1 (en) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
MX2019002985A MX2019002985A (es) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/078428 WO2018061084A1 (ja) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018061084A1 true WO2018061084A1 (ja) | 2018-04-05 |
Family
ID=61759367
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/078428 WO2018061084A1 (ja) | 2016-09-27 | 2016-09-27 | Self-position estimation method and self-position estimation device |
Country Status (10)
Country | Link |
---|---|
US (1) | US11321572B2 (ja) |
EP (1) | EP3521962B1 (ja) |
JP (1) | JP6881464B2 (ja) |
KR (1) | KR20190045220A (ja) |
CN (1) | CN109791408B (ja) |
BR (1) | BR112019006057B1 (ja) |
CA (1) | CA3038643A1 (ja) |
MX (1) | MX2019002985A (ja) |
RU (1) | RU2720140C1 (ja) |
WO (1) | WO2018061084A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220049972A1 (en) * | 2016-04-22 | 2022-02-17 | Toyota Jidosha Kabushiki Kaisha | Surrounding information collection system and surrounding information acquisition apparatus |
EP4036523A1 (en) | 2021-01-28 | 2022-08-03 | Toyota Jidosha Kabushiki Kaisha | Self-position estimation accuracy verification method and self-position estimation system |
- WO2023037570A1 (ja) * | 2021-09-09 | 2023-03-16 | 日立Astemo株式会社 | In-vehicle processing device, vehicle control device, and self-position estimation method |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11157752B2 (en) * | 2017-03-29 | 2021-10-26 | Pioneer Corporation | Degraded feature identification apparatus, degraded feature identification system, degraded feature identification method, degraded feature identification program, and computer-readable recording medium recording degraded feature identification program |
US10467903B1 (en) * | 2018-05-11 | 2019-11-05 | Arnold Chase | Passive infra-red pedestrian detection and avoidance system |
US11062608B2 (en) | 2018-05-11 | 2021-07-13 | Arnold Chase | Passive infra-red pedestrian and animal detection and avoidance system |
US10750953B1 (en) | 2018-05-11 | 2020-08-25 | Arnold Chase | Automatic fever detection system and method |
US11294380B2 (en) | 2018-05-11 | 2022-04-05 | Arnold Chase | Passive infra-red guidance system |
- DE112019003433B4 * | 2018-09-25 | 2024-08-22 | Hitachi Astemo, Ltd. | Detection device |
- JP7332403B2 (ja) * | 2019-09-11 | 2023-08-23 | 株式会社東芝 | Position estimation device, moving body control system, position estimation method, and program |
- JP7486355B2 (ja) * | 2020-06-18 | 2024-05-17 | 古野電気株式会社 | Target detection system for ships, target detection method for ships, reliability estimation device, and program |
- JP2023053891A (ja) * | 2021-10-01 | 2023-04-13 | 三菱電機株式会社 | Self-position estimation device and self-position estimation method |
- JP7241839B1 (ja) | 2021-10-06 | 2023-03-17 | 三菱電機株式会社 | Self-position estimation device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPH06333192A (ja) * | 1993-05-21 | 1994-12-02 | Mitsubishi Electric Corp | White line detection device for automobiles |
- JP2012194860A (ja) * | 2011-03-17 | 2012-10-11 | Murata Mach Ltd | Traveling vehicle |
- JP2015191372A (ja) * | 2014-03-27 | 2015-11-02 | トヨタ自動車株式会社 | Road boundary lane marking detection device |
- JP2016091045A (ja) * | 2014-10-29 | 2016-05-23 | 株式会社日本自動車部品総合研究所 | Traffic lane marking recognition system |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- US6581005B2 (en) * | 2000-11-30 | 2003-06-17 | Nissan Motor Co., Ltd. | Vehicle position calculation apparatus and method |
- KR100495635B1 (ko) * | 2002-09-02 | 2005-06-16 | 엘지전자 주식회사 | Position error correction method for a navigation system |
- JP4392389B2 (ja) * | 2005-06-27 | 2009-12-24 | 本田技研工業株式会社 | Vehicle and lane recognition device |
- US7912633B1 (en) * | 2005-12-01 | 2011-03-22 | Adept Mobilerobots Llc | Mobile autonomous updating of GIS maps |
- JP4978099B2 (ja) * | 2006-08-03 | 2012-07-18 | トヨタ自動車株式会社 | Self-position estimation device |
- JP2008250906A (ja) | 2007-03-30 | 2008-10-16 | Sogo Keibi Hosho Co Ltd | Mobile robot, self-position correction method, and self-position correction program |
- JP4985166B2 (ja) * | 2007-07-12 | 2012-07-25 | トヨタ自動車株式会社 | Self-position estimation device |
- JP4254889B2 (ja) * | 2007-09-06 | 2009-04-15 | トヨタ自動車株式会社 | Vehicle position calculation device |
- JP4950858B2 (ja) * | 2007-11-29 | 2012-06-13 | アイシン・エィ・ダブリュ株式会社 | Image recognition device and image recognition program |
- US9026315B2 (en) * | 2010-10-13 | 2015-05-05 | Deere & Company | Apparatus for machine coordination which maintains line-of-site contact |
- JP5142047B2 (ja) * | 2009-02-26 | 2013-02-13 | アイシン・エィ・ダブリュ株式会社 | Navigation device and navigation program |
- US8306269B2 (en) * | 2009-03-12 | 2012-11-06 | Honda Motor Co., Ltd. | Lane recognition device |
- JP5206740B2 (ja) * | 2010-06-23 | 2013-06-12 | 株式会社デンソー | Road shape detection device |
- KR101472615B1 (ko) * | 2010-12-21 | 2014-12-16 | 삼성전기주식회사 | Lane departure warning system and method |
- CN103827632B (zh) * | 2012-09-06 | 2017-02-15 | 株式会社东芝 | Position detection device and position detection method |
- JP6325806B2 (ja) * | 2013-12-06 | 2018-05-16 | 日立オートモティブシステムズ株式会社 | Vehicle position estimation system |
- EP2918974B1 (en) * | 2014-03-11 | 2019-01-16 | Volvo Car Corporation | Method and system for determining a position of a vehicle |
- KR102255432B1 (ko) * | 2014-06-17 | 2021-05-24 | 팅크웨어(주) | Electronic device and control method thereof |
- JP6397663B2 (ja) * | 2014-06-18 | 2018-09-26 | シャープ株式会社 | Self-propelled electronic device |
- KR20160002178A (ko) * | 2014-06-30 | 2016-01-07 | 현대자동차주식회사 | Apparatus and method for recognizing the position of the own vehicle |
- WO2016093028A1 (ja) * | 2014-12-08 | 2016-06-16 | 日立オートモティブシステムズ株式会社 | Own-vehicle position estimation device |
- EP3032221B1 (en) * | 2014-12-09 | 2022-03-30 | Volvo Car Corporation | Method and system for improving accuracy of digital map data utilized by a vehicle |
- WO2016130719A2 (en) * | 2015-02-10 | 2016-08-18 | Amnon Shashua | Sparse map for autonomous vehicle navigation |
- CN105718860B (zh) * | 2016-01-15 | 2019-09-10 | 武汉光庭科技有限公司 | Positioning method and system based on a driving-safety map and binocular traffic sign recognition |
- CN105929364B (zh) * | 2016-04-22 | 2018-11-27 | 长沙深之瞳信息科技有限公司 | Relative position measurement method and measurement device using radio positioning |
- US10346797B2 (en) * | 2016-09-26 | 2019-07-09 | Cybernet Systems, Inc. | Path and load localization and operations supporting automated warehousing using robotic forklifts or other material handling vehicles |
-
2016
- 2016-09-27 BR BR112019006057-0A patent/BR112019006057B1/pt active IP Right Grant
- 2016-09-27 US US16/336,366 patent/US11321572B2/en active Active
- 2016-09-27 CA CA3038643A patent/CA3038643A1/en not_active Abandoned
- 2016-09-27 MX MX2019002985A patent/MX2019002985A/es unknown
- 2016-09-27 WO PCT/JP2016/078428 patent/WO2018061084A1/ja unknown
- 2016-09-27 RU RU2019112711A patent/RU2720140C1/ru active
- 2016-09-27 CN CN201680089623.8A patent/CN109791408B/zh active Active
- 2016-09-27 EP EP16917637.7A patent/EP3521962B1/en active Active
- 2016-09-27 JP JP2018541754A patent/JP6881464B2/ja active Active
- 2016-09-27 KR KR1020197007722A patent/KR20190045220A/ko not_active Application Discontinuation
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPH06333192A (ja) * | 1993-05-21 | 1994-12-02 | Mitsubishi Electric Corp | White line detection device for automobiles |
- JP2012194860A (ja) * | 2011-03-17 | 2012-10-11 | Murata Mach Ltd | Traveling vehicle |
- JP2015191372A (ja) * | 2014-03-27 | 2015-11-02 | トヨタ自動車株式会社 | Road boundary lane marking detection device |
- JP2016091045A (ja) * | 2014-10-29 | 2016-05-23 | 株式会社日本自動車部品総合研究所 | Traffic lane marking recognition system |
Non-Patent Citations (1)
Title |
---|
See also references of EP3521962A4 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220049972A1 (en) * | 2016-04-22 | 2022-02-17 | Toyota Jidosha Kabushiki Kaisha | Surrounding information collection system and surrounding information acquisition apparatus |
EP4036523A1 (en) | 2021-01-28 | 2022-08-03 | Toyota Jidosha Kabushiki Kaisha | Self-position estimation accuracy verification method and self-position estimation system |
US11912290B2 (en) | 2021-01-28 | 2024-02-27 | Toyota Jidosha Kabushiki Kaisha | Self-position estimation accuracy verification method and self-position estimation system |
- WO2023037570A1 (ja) * | 2021-09-09 | 2023-03-16 | 日立Astemo株式会社 | In-vehicle processing device, vehicle control device, and self-position estimation method |
Also Published As
Publication number | Publication date |
---|---|
US20200019792A1 (en) | 2020-01-16 |
CN109791408A (zh) | 2019-05-21 |
EP3521962A1 (en) | 2019-08-07 |
EP3521962B1 (en) | 2021-07-21 |
JPWO2018061084A1 (ja) | 2019-07-04 |
JP6881464B2 (ja) | 2021-06-02 |
KR20190045220A (ko) | 2019-05-02 |
BR112019006057B1 (pt) | 2022-12-06 |
CN109791408B (zh) | 2022-04-26 |
RU2720140C1 (ru) | 2020-04-24 |
MX2019002985A (es) | 2019-07-04 |
CA3038643A1 (en) | 2018-04-05 |
EP3521962A4 (en) | 2019-08-07 |
BR112019006057A2 (pt) | 2019-06-18 |
US11321572B2 (en) | 2022-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2018061084A1 (ja) | Self-position estimation method and self-position estimation device | |
- KR102503388B1 (ko) | Method for avoiding objects during autonomous driving navigation | |
- RU2668459C1 (ru) | Position estimation device and position estimation method | |
- JP6477882B2 (ja) | Self-position estimation device and self-position estimation method | |
- JP4343536B2 (ja) | Sensing device for vehicles | |
- RU2687103C1 (ru) | Vehicle position estimation device and vehicle position estimation method | |
- JPWO2017149813A1 (ja) | Sensor calibration system | |
- JP6838365B2 (ja) | Self-position estimation method and self-position estimation device | |
- Moras et al. | Drivable space characterization using automotive lidar and georeferenced map information | |
- JP6834401B2 (ja) | Self-position estimation method and self-position estimation device | |
- JP2023075184A (ja) | Output device, control method, program, and storage medium | |
- US10970870B2 (en) | Object detection apparatus | |
- JP5375249B2 (ja) | Movement route planning device, moving body control device, and moving body | |
- JP7083768B2 (ja) | Recognition device, vehicle control device, recognition method, and program | |
- JP6604052B2 (ja) | Road boundary estimation device and road boundary estimation method | |
- JP6972528B2 (ja) | Self-position estimation method, travel control method for moving body, self-position estimation device, and travel control device for moving body | |
- JP2024102524A (ja) | Lane marking recognition device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16917637 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2018541754 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20197007722 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 3038643 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112019006057 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 2016917637 Country of ref document: EP Effective date: 20190429 |
|
ENP | Entry into the national phase |
Ref document number: 112019006057 Country of ref document: BR Kind code of ref document: A2 Effective date: 20190327 |