WO2015015939A1 - Vehicle position and attitude angle estimation device and vehicle position and attitude angle estimation method - Google Patents
Vehicle position and attitude angle estimation device and vehicle position and attitude angle estimation method
- Publication number
- WO2015015939A1 (PCT/JP2014/066311)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- distribution range
- angle
- particle
- yaw
- Prior art date
Classifications
- B60W40/114 — Estimation of non-directly measurable driving parameters related to vehicle motion: yaw movement
- B60W40/105 — Estimation of non-directly measurable driving parameters related to vehicle motion: speed
- G01C21/1656 — Dead reckoning by integrating acceleration or speed (inertial navigation) combined with passive imaging devices, e.g. cameras
- G01C21/3602 — Navigation input using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles, using a camera
- G06T7/277 — Analysis of motion involving stochastic approaches, e.g. using Kalman filters
- G06T7/70 — Determining position or orientation of objects or cameras
- B60W2420/403 — Image sensing, e.g. optical camera
- B60W2520/12 — Lateral speed
- B60W2520/14 — Yaw
- G05D1/02 — Control of position or course in two dimensions
- G06T2207/30244 — Camera pose
- G06T2207/30248 — Vehicle exterior or interior
- G06T2207/30252 — Vehicle exterior; vicinity of vehicle
Definitions
- the present invention relates to a vehicle position / orientation angle estimation apparatus and method for estimating a vehicle position and attitude angle using a particle filter.
- Patent Document 1 discloses a technique for calculating the position and attitude angle of a moving body using a particle filter.
- In this technique, a plurality of particles are scattered in the vicinity of the position and attitude angle of the moving body calculated by odometry.
- The particle that most closely matches the measurement values of a laser sensor mounted on the moving body is then taken as the true position and attitude angle of the moving body.
- The existence distribution range over which the particles are scattered is calculated from the history of correction amounts, i.e., by how much the position and attitude angle calculated with the particle filter have been corrected.
- An object of the present invention is to provide a vehicle position and attitude angle estimation apparatus and method capable of accurately estimating the position and attitude angle of a vehicle even as the traveling state changes.
- To this end, the vehicle position and attitude angle estimation apparatus and method set a predetermined particle existence distribution range for a particle filter. Particles are scattered within the set existence distribution range, and the position and attitude angle of the vehicle are estimated from an image of the surrounding environment of the vehicle; when the vehicle speed increases, the set existence distribution range is expanded in the vehicle width direction.
- FIG. 1 is a block diagram showing a configuration of a vehicle position / posture angle estimation system including a vehicle position / posture angle estimation apparatus according to the first embodiment of the present invention.
- FIG. 2 is a flowchart showing a processing procedure of vehicle position / posture angle estimation processing by the vehicle position / posture angle estimation apparatus according to the first embodiment of the present invention.
- FIG. 3 is a diagram schematically illustrating the turning center of the vehicle and the side slip angle of each tire during a constant-speed circular turn.
- FIG. 4 is a diagram for explaining a method of expanding the particle existence distribution range in the front-rear direction of the vehicle according to the speed by the vehicle position / orientation angle estimation apparatus according to the first embodiment of the present invention.
- FIG. 5 is a diagram for explaining a method of expanding the particle existence distribution range in the vehicle width direction according to the speed by the vehicle position / orientation angle estimation apparatus according to the first embodiment of the present invention.
- FIG. 6 is a diagram for explaining a method of expanding the particle existence distribution range in the yaw angle direction of the vehicle according to the speed by the vehicle position / orientation angle estimation apparatus according to the first embodiment of the present invention.
- FIG. 7 is a block diagram illustrating a configuration of a vehicle position / posture angle estimation system including a vehicle position / posture angle estimation apparatus according to the second embodiment of the present invention.
- FIG. 8 is a diagram for explaining the tire lateral force and the yaw moment when the vehicle steers.
- FIG. 9 is a diagram for explaining a method of expanding the presence distribution range of particles in the vehicle width direction and the yaw angle direction according to the yaw rate by the vehicle position / posture angle estimation device according to the second embodiment of the present invention.
- FIG. 10 is a block diagram showing a configuration of a vehicle position / orientation angle estimation system including a vehicle position / orientation angle estimation apparatus according to the third embodiment of the present invention.
- FIG. 11 illustrates a method of expanding the particle existence distribution range in the vehicle front-rear direction, the vehicle width direction, and the yaw angle direction according to the steering angle by the vehicle position / posture angle estimation apparatus according to the third embodiment of the present invention.
- FIG. 12 is a diagram for explaining the particle scattering method used by the vehicle position / posture angle estimation apparatus according to the present invention.
- FIG. 1 is a block diagram showing a configuration of a vehicle position / posture angle estimation system equipped with a vehicle position / posture angle estimation apparatus according to the present embodiment.
- the vehicle position / posture angle estimation system according to the present embodiment includes an ECU 1, a camera 2, a three-dimensional map database 3, and a vehicle sensor group 4.
- the ECU 1 is an electronic control unit configured by a ROM, a RAM, an arithmetic circuit, and the like, and includes the vehicle position / posture angle estimation device 10 according to the present embodiment.
- the ECU 1 may also be used as an ECU used for other controls.
- the camera 2 uses a solid-state imaging device such as a CCD, for example.
- the camera 2 is installed on the front bumper of the vehicle so that the optical axis is horizontal and the front of the vehicle can be imaged.
- the captured image is transmitted to the ECU 1.
- the 3D map database 3 stores three-dimensional position information on edges in the surrounding environment, including road surface markings.
- At a minimum, three-dimensional information on the position and direction of lane markings and curbs indicating road edges is recorded. In addition, road surface markings such as white lines, stop lines, pedestrian crossings, and road marks, as well as edge information of structures such as buildings, are recorded.
- Each item of map information, such as a road edge, is defined as an aggregate of edges. A long straight line is divided, for example, every 1 m, so there is no extremely long edge. For a straight line, each edge holds three-dimensional position information for the two end points of the line; for a curve, each edge holds three-dimensional position information for the end points and the center point of the curve.
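As one way to picture the map layout described above, the following Python sketch (all names hypothetical, not from the patent) stores straight edges by their two end points, stores curves with an extra center point, and splits long straight lines into segments of at most 1 m:

```python
import math
from dataclasses import dataclass
from typing import Optional, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class MapEdge:
    """One edge segment of the 3D map (hypothetical layout).

    Straight edges carry only their two end points; curved edges
    additionally carry a center point, as described in the text."""
    start: Point3D
    end: Point3D
    center: Optional[Point3D] = None  # set only for curves

    @property
    def is_curve(self) -> bool:
        return self.center is not None

def split_straight_edge(start, end, max_len=1.0):
    """Split a long straight edge into segments no longer than max_len,
    mirroring the 1 m subdivision mentioned in the text."""
    dx, dy, dz = (end[i] - start[i] for i in range(3))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    n = max(1, math.ceil(length / max_len))
    pts = [tuple(start[i] + (end[i] - start[i]) * k / n for i in range(3))
           for k in range(n + 1)]
    return [MapEdge(pts[k], pts[k + 1]) for k in range(n)]
```

A 3 m road-edge line would thus become three 1 m `MapEdge` segments, none of them extremely long.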
- the vehicle sensor group 4 includes a GPS receiver 41, an accelerator sensor 42, a steering sensor 43, a brake sensor 44, a vehicle speed sensor 45, an acceleration sensor 46, a wheel speed sensor 47, and a yaw rate sensor 48.
- the vehicle sensor group 4 is connected to the ECU 1 and supplies various detection values detected by the sensors 41 to 48 to the ECU 1.
- the ECU 1 uses the output value of the vehicle sensor group 4 to calculate the approximate position of the vehicle or to calculate odometry indicating the amount of movement of the vehicle per unit time.
- the vehicle position / posture angle estimation device 10 is a device for estimating the position and attitude angle of a vehicle by matching an image of the surrounding environment of the vehicle against three-dimensional map data. By executing a specific program, the ECU 1 functions as the edge image calculation unit 12, the odometry calculation unit 14, the vehicle speed detection unit 15, the particle existence distribution range setting unit 16, the particle scattering unit 18, the projection image creation unit 20, the likelihood calculation unit 22, and the position / orientation angle estimation unit 24.
- the edge image calculation unit 12 acquires an image obtained by capturing the surrounding environment of the vehicle from the camera 2, detects an edge from the image, and calculates an edge image.
- the image captured by the camera 2 captures at least a lane line and a curb indicating the road edge as road surface information necessary for estimating the position and posture angle of the host vehicle.
- a road surface display such as a white line, a stop line, a pedestrian crossing, and a road surface mark may be captured.
- the odometry calculation unit 14 calculates odometry, which is the amount of movement of the vehicle per unit time, using various sensor values obtained from the vehicle sensor group 4.
- the vehicle speed detection unit 15 detects the speed of the vehicle by acquiring the sensor value measured by the vehicle speed sensor 45.
- the particle existence distribution range setting unit 16 sets a predetermined particle existence distribution range in the vicinity of the position and attitude angle moved by the odometry calculated by the odometry calculation unit 14, and corrects the existence distribution range according to the traveling state of the vehicle. Specifically, as shown in FIG. 12, the particle P of the position and attitude angle of the vehicle V(t1) estimated one loop earlier and the surrounding particles P1 to P5 are moved by the odometry, and the particle existence distribution range is set and corrected around them. In the present embodiment, as the vehicle speed increases while traveling, the particle existence distribution range is expanded in the vehicle front-rear direction, the vehicle width direction, and the yaw angle direction.
- the particle scattering unit 18 scatters particles within the particle existence distribution range set by the particle existence distribution range setting unit 16. As shown in FIG. 12, the particle scattering unit 18 sets particles P10 to P15 in order to estimate the position and attitude angle of the vehicle V(t2) at the next time step.
- the projection image creation unit 20 creates a projection image for each of the particles dispersed by the particle scattering unit 18. For example, three-dimensional position information such as edges stored in the three-dimensional map database 3 is projected and converted so as to be an image captured by a camera from the position and posture angle of each particle to create a projection image.
- the likelihood calculation unit 22 compares the projection image created by the projection image creation unit 20 with the edge image calculated by the edge image calculation unit 12, and calculates the likelihood for each of the particles.
- This likelihood is an index indicating how likely the position and posture angle of each particle is relative to the actual vehicle position and posture angle, and the higher the degree of coincidence between the projected image and the edge image, The likelihood is set to be high.
- the position / orientation angle estimation unit 24 estimates the position and orientation angle of the vehicle based on the likelihood calculated by the likelihood calculation unit 22. For example, the position and posture angle of the particle having the highest likelihood are calculated as the estimation result of the actual position and posture angle of the vehicle.
- In the present embodiment, the position (front-rear, vehicle width, and vertical directions) and attitude angle (roll, pitch, and yaw) of the vehicle, six degrees of freedom in total, are obtained.
- However, the present technology can also be applied when estimating a position (front-rear and lateral directions) and attitude angle (yaw) with only three degrees of freedom, as in an automatic guided vehicle without suspension used in a factory.
- In such a vehicle, the vertical position and the roll and pitch attitude angles are fixed, so these parameters may be measured in advance and used.
- Thus, in the present embodiment, a total of six degrees of freedom are estimated: the front-rear direction, the vehicle width direction, and the vertical direction as position information, and roll, pitch, and yaw as attitude angle information.
- Here, roll is the rotation about the vehicle front-rear axis, pitch is the rotation about the vehicle width axis, and yaw is the rotation about the vehicle vertical axis (see FIG. 12).
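The six-degree-of-freedom state described above can be held in one small structure; a minimal Python sketch (names such as `VehiclePose` are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class VehiclePose:
    """6-DoF vehicle state used by the particle filter (illustrative
    names): position in metres, attitude angles in radians."""
    x: float = 0.0      # front-rear (longitudinal) position
    y: float = 0.0      # vehicle-width (lateral) position
    z: float = 0.0      # vertical position
    roll: float = 0.0   # rotation about the front-rear axis
    pitch: float = 0.0  # rotation about the vehicle-width axis
    yaw: float = 0.0    # rotation about the vertical axis

    def planar(self):
        """Reduce to the 3-DoF case (x, y, yaw) mentioned for factory
        guided vehicles without suspension, where z, roll, and pitch
        are fixed."""
        return (self.x, self.y, self.yaw)
```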
- The vehicle position / posture angle estimation process described below is performed continuously, for example at intervals of about 100 msec.
- In step S110, the edge image calculation unit 12 calculates an edge image from the image captured by the camera 2.
- An edge in the present embodiment refers to a portion where the luminance of a pixel changes sharply.
- As the edge detection method, for example, the Canny method can be used; various other methods, such as differential edge detection, may also be used.
- the edge image calculation unit 12 extracts the direction of brightness change of the edge, the color near the edge, and the like from the image of the camera 2.
- This makes it possible, in steps S160 and S170 described later, to set the likelihood using information other than edges recorded in the 3D map database 3 and to calculate the position and attitude angle of the host vehicle.
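As a toy illustration of the edge-image step, the sketch below implements a crude differential edge detection of the kind mentioned above in pure Python; a production system would more likely use the Canny detector from an image-processing library, so treat this only as a stand-in:

```python
def edge_image(gray, threshold=50):
    """Mark a pixel as an edge when the horizontal or vertical
    luminance difference to its neighbour exceeds `threshold`
    (a crude differential edge detection; threshold value is an
    illustrative assumption)."""
    h, w = len(gray), len(gray[0])
    edges = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            # forward differences, clamped at the image border
            gx = abs(gray[r][min(c + 1, w - 1)] - gray[r][c])
            gy = abs(gray[min(r + 1, h - 1)][c] - gray[r][c])
            if max(gx, gy) > threshold:
                edges[r][c] = 1
    return edges
```

On an image with a sharp dark-to-bright boundary, only the pixels at the boundary are marked, which is the binary edge image the later matching steps consume.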
- In step S120, the odometry calculation unit 14 calculates odometry, i.e., the amount the vehicle has moved since step S120 of the previous loop, based on the sensor values obtained from the vehicle sensor group 4. In the first loop after the program starts, the odometry is calculated as zero.
- As one method of calculating the odometry, restricting the vehicle motion to a plane, the odometry calculation unit 14 calculates the turning amount (turning angle) in the yaw angle direction from the difference between the encoder values of the wheel speed sensors 47 for the left and right wheels.
- The average movement amount is obtained from the encoder values of the wheel speed sensors 47 of the wheels, and the change in position is obtained by applying the cosine and sine of the yaw turning angle to this movement amount.
- the movement amount and the rotation amount per unit time may be calculated from the wheel speed and yaw rate of each wheel.
- the wheel speed may be substituted by the difference between the vehicle speed and the positioning value of the GPS receiver 41, and the yaw rate may be substituted by the steering angle.
- Various calculation methods can be considered as the odometry calculation method, but any method may be used as long as odometry can be calculated.
- Alternatively, odometry may be obtained in accordance with the Ackermann steering geometry (Masato Abe, "Automobile Motion and Control", Sankaido, Chapter 3).
- It is even better to calculate it using the equations of motion of a linear two-wheel model that can take the tire side slip angles into account (ibid., Chapter 3, Section 3.2.1, p. 56, equations (3.12) and (3.13)).
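The planar wheel-encoder odometry described above (yaw increment from the left/right travel difference, translation from the average travel using the cosine and sine of the heading) might be sketched as follows; the mid-point heading update and all names are illustrative assumptions, not the patent's exact formulation:

```python
import math

def planar_odometry(pose, d_left, d_right, track_width):
    """One planar dead-reckoning step.

    pose: (x, y, yaw) in metres / radians
    d_left, d_right: left/right wheel travel since the last loop [m]
    track_width: distance between left and right wheels [m]
    """
    x, y, yaw = pose
    d_yaw = (d_right - d_left) / track_width   # turning amount (angle)
    d_move = 0.5 * (d_left + d_right)          # average movement amount
    mid_yaw = yaw + 0.5 * d_yaw                # mid-point heading (assumption)
    x += d_move * math.cos(mid_yaw)
    y += d_move * math.sin(mid_yaw)
    return (x, y, yaw + d_yaw)
```

Equal wheel travels give a straight-line update, while a pure travel difference rotates the pose in place, matching the qualitative behaviour described in the text.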
- In step S130, the particle existence distribution range setting unit 16 moves the position and attitude angle of each particle estimated in step S170 of the previous loop by the odometry calculated in step S120.
- the data from the GPS receiver 41 included in the vehicle sensor group 4 is used as the initial position information.
- Alternatively, the vehicle position and attitude angle calculated last before the vehicle previously stopped may be stored and used as the initial position and attitude angle information.
- the existence distribution range of particles is set in the vicinity of the position and posture angle of the vehicle moved by the odometry in consideration of the vehicle dynamics and the running state.
- the particle presence distribution range is expanded in the vehicle front-rear direction, the vehicle width direction, and the yaw angle direction.
- FIG. 3 is a diagram schematically showing the turning center of a vehicle and a side slip angle of each tire during a constant-speed circular turn.
- FIG. 3 (a) shows a case of a steady circular turn at an extremely low speed.
- the presence distribution range of the particles is expanded in the vehicle front-rear direction, the vehicle width direction, and the yaw angle direction.
- the existence distribution range in the front-rear direction may be continuously changed according to the vehicle speed.
- Here, Vlgth, Rlg_min, and Rlg_max are set to, for example, 20 [km/h], 0.5 [m], and 1.0 [m], respectively.
- Vlgth is a speed at which an error in the longitudinal direction of the vehicle increases.
- Rlg_min is a low-speed particle existence distribution range with little error in the front-rear direction of the vehicle, and is set in advance by obtaining an appropriate value through experiments and simulations.
- Rlg_max is a high-speed particle existence distribution range in which an error in the front-rear direction of the vehicle increases, and is set in advance by obtaining an appropriate value through experiments and simulations.
- the existence distribution range of particles other than the front-rear direction is set as follows.
- Specifically, from the position and attitude angle moved by the odometry, the ranges are set to ±0.5 [m] in the vehicle width direction, ±0.1 [m] in the vertical direction, ±0.5 [deg] in the roll direction, ±0.5 [deg] in the pitch direction, and ±3.0 [deg] in the yaw direction.
- When the vehicle speed is low, the existence distribution range is set within ±Rlt_min [m] in the vehicle width direction from the position moved by the odometry.
- When the vehicle speed increases, the existence distribution range is widened to ±Rlt_max [m] in the vehicle width direction.
- the existence distribution range in the vehicle width direction may be continuously changed according to the vehicle speed.
- Here, Vltth, Rlt_min, and Rlt_max are set to, for example, 20 [km/h], 0.2 [m], and 0.5 [m], respectively.
- Vltth is a speed at which the vehicle is caused to move in the vehicle width direction due to a centrifugal force acting on the vehicle and a side slip angle on each wheel.
- Rlt_min is a low-speed particle distribution range in which no movement in the vehicle width direction occurs in the vehicle, and is set in advance by obtaining an appropriate value through experiments and simulations.
- Rlt_max is an existing distribution range of particles at a high speed at which movement in the vehicle width direction occurs in the vehicle, and is set in advance by obtaining an appropriate value through experiments and simulations.
- When the vehicle speed is low, the existence distribution range is set within ±Ryw_min [rad] in the vehicle yaw angle direction from the attitude angle moved by the odometry.
- When the vehicle speed increases, the existence distribution range is widened to ±Ryw_max [rad] in the yaw angle direction.
- the existence distribution range in the yaw angle direction may be continuously changed according to the vehicle speed.
- Here, Vywth, Ryw_min, and Ryw_max are set to, for example, 10 [km/h], 0.02 [rad], and 0.05 [rad], respectively.
- Vywth is a speed at which centrifugal force acting on the vehicle gives each wheel a side slip angle, causing motion of the vehicle in the yaw angle direction.
- Ryw_min is a low-speed particle existence distribution range in which no movement in the yaw angle direction occurs in the vehicle, and is set in advance by obtaining an appropriate value through experiments and simulations.
- Ryw_max is the existence distribution range of particles at a high speed at which movement in the yaw angle direction occurs in the vehicle, and is set in advance by obtaining an appropriate value through experiments and simulations.
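Putting the example thresholds above together, a minimal sketch of the speed-dependent range setting (using the sample values Vlgth = Vltth = 20 km/h and Vywth = 10 km/h; the hard threshold switch is one reading of the text, which also allows varying the ranges continuously with speed):

```python
def distribution_range(speed_kmh):
    """Speed-dependent particle existence distribution ranges using the
    example values from the text.

    Returns half-widths (longitudinal [m], lateral [m], yaw [rad])."""
    lg = 1.0 if speed_kmh >= 20.0 else 0.5     # Rlg_max / Rlg_min
    lt = 0.5 if speed_kmh >= 20.0 else 0.2     # Rlt_max / Rlt_min
    yw = 0.05 if speed_kmh >= 10.0 else 0.02   # Ryw_max / Ryw_min
    return (lg, lt, yw)
```

At 15 km/h only the yaw range has widened; above 20 km/h all three ranges take their high-speed values.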
- In step S140, the particle scattering unit 18 scatters particles within the existence distribution range set in step S130.
- For the six degree-of-freedom parameters that define the position and attitude angle of each particle, random values are set using a random number table or the like within the range (upper and lower limits) set in step S130.
- In the present embodiment, 500 particles are always created.
- particles may be dispersed using the technique disclosed in Patent Document 1.
- That is, how much the position and attitude angle of the vehicle moved by the odometry in step S130 are corrected by the position and attitude angle calculated in step S170 is computed, and the mean and variance used to scatter the particles are set according to this correction amount.
- the range in which the particles are dispersed is the presence distribution range set in step S130.
- the existence distribution range may also be obtained using the technique disclosed in Patent Document 1 and ORed with the existence distribution range set in step S130. Further, the number of particles to be dispersed may be dynamically determined according to the presence distribution range set in step S130 using the technique disclosed in Patent Document 1.
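Step S140 might then be sketched as below; uniform sampling within the upper and lower limits is an assumption (the text only requires random values within the range), and the 500-particle count follows the example in the text:

```python
import random

def scatter_particles(center, half_widths, n=500, seed=0):
    """Draw n particles uniformly inside the box [center - hw,
    center + hw] along each of the six degree-of-freedom axes.

    center: 6-tuple (x, y, z, roll, pitch, yaw) moved by odometry
    half_widths: 6-tuple of range half-widths set in step S130
    """
    rng = random.Random(seed)  # fixed seed only for reproducibility
    return [
        tuple(c + rng.uniform(-hw, hw) for c, hw in zip(center, half_widths))
        for _ in range(n)
    ]
```

Each returned 6-tuple is one predicted position and attitude angle candidate for the evaluation steps that follow.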
- In step S150, the projection image creation unit 20 creates a projection image (virtual image) for each of the plurality of particles scattered in step S140.
- three-dimensional position information such as an edge stored in the three-dimensional map database 3 is projected and converted into a camera image at each predicted position and posture angle candidate to create a projection image for evaluation.
- the evaluation point group projected on the projection image is compared with the edge on the edge image calculated in step S110 in step S160 described later.
- The projection conversion requires extrinsic parameters indicating the position of the camera 2 and the intrinsic parameters of the camera 2.
- The extrinsic parameters may be calculated from each predicted position and attitude angle candidate by measuring the relative position from the vehicle to the camera 2 in advance.
- The intrinsic parameters may be calibrated in advance.
- If, in step S110, the luminance change direction of edges, the color near edges, and the like were extracted from the camera image, it is desirable to use this information when creating the projection image.
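As a toy illustration of the projection conversion in step S150, the sketch below projects one 3D map point through a pinhole camera for a planar pose candidate; the camera axis convention, parameter names, and the planar simplification are all assumptions, since a real implementation would use the full calibrated extrinsic and intrinsic parameters:

```python
import math

def project_point(point_w, cam_pose, fx, fy, cx, cy):
    """Project a 3D world point into a candidate camera image.

    point_w: (x, y, z) world point from the 3D map
    cam_pose: planar camera pose (x, y, yaw); camera assumed at z = 0
    fx, fy, cx, cy: pinhole intrinsics (focal lengths, principal point)
    Convention assumed: x forward (optical axis), y left, z up.
    Returns pixel (u, v), or None if the point is behind the camera."""
    x, y, yaw = cam_pose
    dx, dy = point_w[0] - x, point_w[1] - y
    # world -> camera frame: rotate by -yaw
    xf = math.cos(yaw) * dx + math.sin(yaw) * dy   # forward (depth)
    yl = -math.sin(yaw) * dx + math.cos(yaw) * dy  # leftward offset
    zu = point_w[2]                                # upward offset
    if xf <= 0.0:
        return None                                # behind the image plane
    u = cx - fx * yl / xf   # leftward offset moves the pixel left
    v = cy - fy * zu / xf   # upward offset moves the pixel up
    return (u, v)
```

Projecting every evaluation point of the map this way, once per particle, yields the per-particle projection image compared against the edge image in step S160.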
- In step S160, the likelihood calculation unit 22 compares the edge image calculated in step S110 with the projection image created in step S150 for each of the plurality of particles scattered in step S140, and, based on the comparison result, calculates a likelihood for each particle, i.e., each predicted position and attitude angle candidate.
- the likelihood is an index indicating how likely each predicted position and posture angle candidate is to the actual vehicle position and posture angle.
- the likelihood calculating unit 22 sets the likelihood to be higher as the matching degree between the projection image and the edge image is higher. An example of how to obtain this likelihood will be described below.
- Specifically, for each evaluation point, a pixel position is specified on the projection image and it is determined whether an edge exists at that position; it is then determined whether an edge also exists at the same pixel position on the edge image.
- If edges exist at the same position in both images, a likelihood contribution of 1 (unitless) is assigned; otherwise, 0 is assigned.
- This is done for all evaluation points, and the number of coinciding evaluation points, i.e., the sum of these contributions, is used as the likelihood.
- Finally, normalization is performed so that the likelihoods sum to 1. Many other methods of calculating the likelihood are conceivable, and any of them may be used.
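A minimal sketch of the matching-count likelihood and normalization described above, assuming binary edge maps. The function name `particle_likelihoods` and the fallback to a uniform distribution when no evaluation point matches are illustrative choices, not part of the patent:

```python
import numpy as np

def particle_likelihoods(projection_images, edge_image):
    """Likelihood of each particle = number of pixels that are edges in
    both its projection image and the measured edge image, normalized so
    the values sum to 1."""
    likelihoods = []
    for proj in projection_images:
        # 1 where an edge exists in both images at the same pixel, else 0.
        matches = np.logical_and(proj > 0, edge_image > 0)
        likelihoods.append(matches.sum())
    likelihoods = np.asarray(likelihoods, dtype=float)
    total = likelihoods.sum()
    if total == 0:
        # Assumed fallback: uniform weights when nothing matched.
        return np.full(len(likelihoods), 1.0 / len(likelihoods))
    return likelihoods / total
```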
- The position/attitude angle estimation unit 24 calculates the final vehicle position and attitude angle using the plurality of predicted position and attitude angle candidates with the likelihood information calculated in step S160. For example, the position/attitude angle estimation unit 24 takes the predicted position and attitude angle candidate with the highest likelihood as the actual position and attitude angle of the vehicle. Alternatively, a weighted average of the predicted positions and attitude angles may be computed using the likelihood of each candidate, and those values used as the final vehicle position and attitude angle. Once the estimation results have been calculated, the vehicle position and attitude angle estimation processing according to the present embodiment ends.
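Both estimation options mentioned above (highest-likelihood candidate and likelihood-weighted average) can be sketched as follows. The 6-DOF state layout and the circular averaging of the attitude angles are assumptions made for illustration, not details given in the patent:

```python
import numpy as np

def estimate_pose(particles, weights):
    """particles: N x 6 array (x, y, z, roll, pitch, yaw) in the order
    longitudinal, vehicle-width, vertical position then attitude angles;
    weights: normalized likelihoods summing to 1.

    Returns (best, average): the highest-likelihood particle and the
    likelihood-weighted average. Angles are averaged via sin/cos so that
    wrap-around at +/-pi is handled.
    """
    best = particles[np.argmax(weights)]
    pos = weights @ particles[:, :3]
    # Circular mean for the three attitude angles.
    ang = np.arctan2(weights @ np.sin(particles[:, 3:]),
                     weights @ np.cos(particles[:, 3:]))
    return best, np.concatenate([pos, ang])
```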
- In the present embodiment, the position and attitude angle of the vehicle are estimated by matching the image captured by the camera 2 mounted on the vehicle against the 3D map database 3;
- however, a laser sensor may be used instead, and the position and attitude angle of the vehicle estimated from its measured values.
- In that case, the three-dimensional map database 3 includes, for example, position information of pole-shaped obstacles such as utility poles, whose distance and azimuth from the vehicle can be measured with a laser sensor, and information on the position and shape of structures around the road, such as buildings and fences, provided as an occupancy grid map.
- An occupancy grid map represents the environment by dividing it into fine grid-like cells and assigning each cell an occupancy probability, i.e. the probability that the cell is occupied by an obstacle (Probabilistic Robotics, Chapter 9, Mainichi Communications).
- In step S150 in the flowchart of FIG. 2, the arrangement of the obstacles and structures stored in the 3D map database 3, as seen when the vehicle is at the position and attitude angle of each particle, is calculated and projected onto the occupancy grid map.
- In step S160 in the flowchart of FIG. 2, the likelihood of each particle is calculated by counting, on the occupancy grid map, the number of cells occupied by obstacles or structures stored in the 3D map database 3
- that coincide with the detections of the laser sensor.
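The cell-counting likelihood for the laser-sensor variant can be sketched as below; the function name and the use of boolean grids of equal shape are illustrative assumptions:

```python
import numpy as np

def grid_likelihood(predicted_occupied, laser_detected):
    """Count the grid cells that the map predicts as occupied for this
    particle's pose and that the laser sensor also reports as occupied.

    predicted_occupied, laser_detected: boolean occupancy grids of the
    same shape. The returned count serves as the (unnormalized)
    likelihood of the particle.
    """
    return int(np.logical_and(predicted_occupied, laser_detected).sum())
```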
- As described above, when the vehicle speed increases, the particle presence distribution range is expanded in the vehicle width direction of the vehicle. Therefore, even when the vehicle speed increases and the vehicle moves in the vehicle width direction, the particle presence distribution range can be set appropriately, and the position and attitude angle of the vehicle can be estimated accurately.
- Likewise, when the vehicle speed increases, the particle presence distribution range is expanded in the yaw angle direction of the vehicle.
- The particle presence distribution range can therefore be set appropriately in this case as well, and the position and attitude angle of the vehicle can be estimated accurately.
- FIG. 7 is a block diagram showing a configuration of a vehicle position / posture angle estimation system equipped with the vehicle position / posture angle estimation device according to the present embodiment.
- the vehicle position / posture angle estimation apparatus 10 according to the present embodiment is different from the first embodiment in that it further includes a yaw rate detection unit 75.
- The other components are the same as in the first embodiment, so they are given the same reference numerals and detailed description is omitted.
- the yaw rate detector 75 acquires the sensor value measured by the yaw rate sensor 48 to detect the yaw rate that is the rate of change in the yaw angle direction of the vehicle.
- the position / orientation angle estimation process of the vehicle according to the present embodiment is different from the first embodiment in the method for setting the particle presence distribution range executed in step S130 of FIG.
- In the first embodiment, the particle presence distribution range was set according to the speed of the vehicle;
- in the present embodiment, it is set according to the yaw rate of the vehicle.
- Specifically, the particle presence distribution range setting unit 16 moves the position and attitude angle of each particle from one loop earlier by odometry. It then sets the particle presence distribution range in the vicinity of the moved position and attitude angle; in the present embodiment, when the yaw rate of the vehicle increases, the particle presence distribution range is expanded in the vehicle width direction and the yaw angle direction of the vehicle.
- FIG. 8 is a diagram showing tire lateral force and yaw moment when the vehicle is steered.
- FIG. 8A shows a case where the vehicle travels straight
- FIG. 8B shows a case where the front wheel is steered
- FIG. 8C shows a case of steady circle turning.
- In the straight-ahead state of FIG. 8A no side slip angle arises; when the vehicle turns, each wheel acquires a side slip angle β as described with reference to FIG. 8. Consider the case where the steering wheel is turned at a high vehicle speed.
- As shown in FIG. 8B, immediately after steering, only the front wheels have a side slip angle β, and a tire lateral force Ff is generated at the front wheels. This tire lateral force Ff produces a yaw moment Yf, and the vehicle begins to turn. In this state, motion in the yaw angle direction is dominant, so the attitude angle in the yaw angle direction is prone to error.
- For this reason, when the yaw rate becomes high, the presence distribution range of the particles is expanded in the vehicle width direction and the yaw angle direction of the vehicle.
- Specifically, the presence distribution range is widened by setting it to a range of ±Rltr_max [m] in the vehicle width direction and ±Rywr_max [rad] in the yaw angle direction.
- the presence distribution range is set to be wide in consideration of an increase in errors in the vehicle width direction and the yaw angle direction caused by road surface conditions, individual differences of moving objects, and the like.
- the existence distribution range may be continuously changed according to the yaw rate.
- In the present embodiment, γth, Rltr_min, Rltr_max, Rywr_min, and Rywr_max are set to 0.15 [rad/s], 0.2 [m], 0.5 [m], 0.02 [rad], and 0.05 [rad], respectively.
- γth is the yaw rate at which turning causes the vehicle to move in the yaw angle direction or the vehicle width direction.
- Rltr_min and Rywr_min are particle existence distribution ranges at a low yaw rate at which movement in the vehicle width direction and yaw angle direction does not occur in the vehicle, and are set in advance by obtaining appropriate values through experiments and simulations.
- Rltr_max and Rywr_max are the existence distribution ranges of particles at a high yaw rate that cause the vehicle to move in the vehicle width direction and the yaw angle direction, and are set in advance by obtaining appropriate values through experiments and simulations.
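Using the example values above, the threshold-based range setting might look like the following sketch. The function name and the simple two-level switch are assumptions made here for illustration; as the text notes, the range may instead be varied continuously with the yaw rate:

```python
def yaw_rate_range(yaw_rate, gamma_th=0.15, r_ltr=(0.2, 0.5), r_ywr=(0.02, 0.05)):
    """Return the half-widths (lateral [m], yaw [rad]) of the particle
    presence distribution range for a given yaw rate [rad/s].

    Below the threshold gamma_th the narrow (min) ranges are used;
    at or above it the ranges are widened to the max values, following
    the example constants in the text.
    """
    if abs(yaw_rate) < gamma_th:
        return r_ltr[0], r_ywr[0]
    return r_ltr[1], r_ywr[1]
```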
- It is also possible to set the particle presence distribution range by using the method of the present embodiment together with the method of the first embodiment.
- In that case, the larger of the upper and lower limit values in each direction
- may be used to set the presence distribution range.
- Furthermore, the particle presence distribution range setting unit 16 may control the expansion so that the range is widened in the vehicle yaw angle direction first and then in the vehicle width direction after a predetermined time has elapsed. For example, the range is expanded in the vehicle width direction with a time delay of 0.5 [s] after it is expanded in the yaw angle direction.
- In this way, the expansion in the vehicle width direction is delayed relative to the expansion in the yaw angle direction, so the particle presence distribution range can be set appropriately in correspondence with the fact that the vehicle's position in the vehicle width direction changes with a delay relative to its attitude angle in the yaw angle direction.
- In addition, the movement of the vehicle in the yaw angle direction is delayed relative to the driver's steering because of the steering mechanism and the build-up of tire lateral force. Therefore, the timing at which the particle presence distribution range is expanded in the yaw angle direction may also be delayed relative to the steering input. That is, when the driver steers, a time delay of, for example, 0.2 [s] may be added before the particle distribution range in the yaw angle direction is expanded.
- As described above, the presence distribution range of the particles is expanded in the vehicle width direction after a predetermined time has elapsed since it was expanded in the vehicle yaw angle direction.
- This makes it possible to set the particle presence distribution range appropriately, corresponding to the fact that the vehicle's position in the vehicle width direction changes with a delay relative to its attitude angle in the yaw angle direction.
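The time-staggered expansion described above can be sketched as a function of elapsed time since the turn began. The function name, the timestamp-based formulation, and the reuse of the example constants (0.5 [s] width delay, 0.02/0.05 [rad] and 0.2/0.5 [m] ranges) are illustrative assumptions:

```python
def distribution_half_widths(t, t_turn_start, yaw_delay=0.0, width_delay=0.5,
                             yaw_rng=(0.02, 0.05), lat_rng=(0.2, 0.5)):
    """Half-widths (yaw [rad], lateral [m]) of the particle presence
    distribution range at time t [s], given that a turn began at
    t_turn_start: the yaw range widens first (after yaw_delay), and the
    lateral (vehicle width) range follows width_delay seconds later.
    """
    elapsed = t - t_turn_start
    yaw = yaw_rng[1] if elapsed >= yaw_delay else yaw_rng[0]
    lat = lat_rng[1] if elapsed >= yaw_delay + width_delay else lat_rng[0]
    return yaw, lat
```

With `yaw_delay=0.2` this would also model the steering-input delay mentioned above.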
- FIG. 10 is a block diagram illustrating a configuration of a vehicle position / posture angle estimation system equipped with the vehicle position / posture angle estimation apparatus according to the present embodiment.
- the vehicle position / posture angle estimation apparatus 10 according to the present embodiment is different from the first embodiment in that it further includes a steering angle detection unit 105.
- The other components are the same as in the second embodiment, so they are given the same reference numerals and detailed description is omitted.
- the steering angle detection unit 105 detects the steering angle of the vehicle by acquiring the sensor value measured by the steering sensor 43.
- the position / orientation angle estimation process of the vehicle according to the present embodiment is different from the first embodiment in the method for setting the particle presence distribution range executed in step S130 of FIG.
- In the first embodiment, the particle presence distribution range was set according to the speed of the vehicle, whereas in the present embodiment it is set according to the steering angle of the vehicle.
- Specifically, the particle presence distribution range setting unit 16 moves the position and attitude angle of each particle from one loop earlier by odometry. It then sets the particle presence distribution range in the vicinity of the moved position and attitude angle; when the steering angle of the vehicle increases, the particle presence distribution range is expanded in the vehicle front-rear direction, the vehicle width direction, and the yaw angle direction.
- When the steering angle detected by the steering angle detection unit 105 is less than the threshold value θth, the presence distribution range is set to ±Rlgs_min [m] in the vehicle front-rear direction and ±Rlts_min [m] in the vehicle width direction from the position moved by odometry,
- and to ±Ryws_min [rad] in the yaw angle direction from the attitude angle moved by odometry.
- When the steering angle is equal to or greater than the threshold value θth, the presence distribution range is widened by setting it to a range of ±Rlgs_max [m] in the vehicle front-rear direction and ±Rlts_max [m] in the vehicle width direction, and to ±Ryws_max [rad] in the yaw angle direction.
- the existence distribution range may be continuously changed according to the steering angle.
- In the present embodiment, θth, Rlgs_min, Rlts_min, and Ryws_min are set to 3 [rad], 0.2 [m], 0.2 [m], and 0.02 [rad], respectively.
- Rlgs_max, Rlts_max, and Ryws_max are set to 1.0 [m], 0.5 [m], and 0.05 [rad], respectively.
- θth is a steering angle at which the error increases because turning causes the vehicle to move in the yaw angle direction or the vehicle width direction.
- Rlgs_min, Rlts_min, and Ryws_min are the particle presence distribution ranges at low steering angles, at which the vehicle does not move in the vehicle width or yaw angle direction; appropriate values are obtained in advance through experiments and simulations.
- Rlgs_max, Rlts_max, and Ryws_max are the particle presence distribution ranges at high steering angles, at which the vehicle does move in the vehicle width and yaw angle directions; appropriate values are likewise obtained in advance through experiments and simulations.
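The steering-angle-based range setting with the example constants above can be sketched as follows; the function name and the two-level switch are assumptions (a continuous variation with steering angle is equally permitted by the text):

```python
def steering_ranges(steering_angle, theta_th=3.0,
                    r_min=(0.2, 0.2, 0.02), r_max=(1.0, 0.5, 0.05)):
    """Half-widths (longitudinal [m], lateral [m], yaw [rad]) of the
    particle presence distribution range for a given steering angle
    [rad], using the example values from the text: narrow below the
    threshold theta_th, wide at or above it.
    """
    return r_min if abs(steering_angle) < theta_th else r_max
```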
- It is also possible to set the particle presence distribution range by using the method of the present embodiment together with the methods of the first and second embodiments.
- In that case, the presence distribution range may be set using the largest of the upper and lower limit values in each direction.
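Combining the speed-, yaw-rate-, and steering-based methods by taking the largest limit in each direction, as described above, can be sketched as a one-liner (the function name is an illustrative choice):

```python
def combine_ranges(*ranges):
    """Given per-method tuples of half-widths (one value per direction),
    return the element-wise maximum, i.e. the largest limit in each
    direction, as the text prescribes when the methods are combined."""
    return tuple(max(vals) for vals in zip(*ranges))
```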
Abstract
Description
[Configuration of the Vehicle Position/Attitude Angle Estimation System]
FIG. 1 is a block diagram showing the configuration of a vehicle position/attitude angle estimation system equipped with the vehicle position/attitude angle estimation device according to the present embodiment. As shown in FIG. 1, the vehicle position/attitude angle estimation system according to the present embodiment includes an ECU 1, a camera 2, a three-dimensional map database 3, and a vehicle sensor group 4.
Next, the procedure of the vehicle position/attitude angle estimation processing according to the present embodiment will be described with reference to the flowchart of FIG. 2. In the present embodiment, a position and attitude angle with a total of six degrees of freedom are estimated: the front-rear, vehicle width, and vertical directions as position information, and roll, pitch, and yaw as attitude angle information. Here, roll is the rotation direction about the front-rear axis of the vehicle, pitch is the rotation direction about the vehicle width axis, and yaw is the rotation direction about the vertical axis of the vehicle (see FIG. 12).
As described above in detail, in the vehicle position/attitude angle estimation device according to the present embodiment, the particle presence distribution range is expanded in the vehicle width direction as the vehicle speed increases. As a result, the particle presence distribution range can be set appropriately even when the vehicle speed increases and the vehicle moves in the vehicle width direction, so the position and attitude angle of the vehicle can be estimated accurately.
Next, a vehicle position/attitude angle estimation device according to a second embodiment of the present invention will be described with reference to the drawings.
FIG. 7 is a block diagram showing the configuration of a vehicle position/attitude angle estimation system equipped with the vehicle position/attitude angle estimation device according to the present embodiment. As shown in FIG. 7, the vehicle position/attitude angle estimation device 10 according to the present embodiment differs from the first embodiment in that it further includes a yaw rate detection unit 75. The other components are the same as in the first embodiment, so they are given the same reference numerals and detailed description is omitted.
The vehicle position/attitude angle estimation processing according to the present embodiment differs from the first embodiment in the method of setting the particle presence distribution range executed in step S130 of FIG. 2. In the first embodiment, the particle presence distribution range was set according to the speed of the vehicle, whereas in the present embodiment it is set according to the yaw rate of the vehicle.
As described above in detail, in the vehicle position/attitude angle estimation device according to the present embodiment, the particle presence distribution range is expanded in the yaw angle direction of the vehicle as the yaw rate increases. As a result, the particle presence distribution range can be set appropriately even when the yaw rate increases and the vehicle moves in the yaw angle direction, so the position and attitude angle of the vehicle can be estimated accurately.
Next, a vehicle position/attitude angle estimation device according to a third embodiment of the present invention will be described with reference to the drawings.
FIG. 10 is a block diagram showing the configuration of a vehicle position/attitude angle estimation system equipped with the vehicle position/attitude angle estimation device according to the present embodiment. As shown in FIG. 10, the vehicle position/attitude angle estimation device 10 according to the present embodiment differs from the first embodiment in that it further includes a steering angle detection unit 105. The other components are the same as in the second embodiment, so they are given the same reference numerals and detailed description is omitted.
The vehicle position/attitude angle estimation processing according to the present embodiment differs from the first embodiment in the method of setting the particle presence distribution range executed in step S130 of FIG. 2. In the first embodiment, the particle presence distribution range was set according to the speed of the vehicle, whereas in the present embodiment it is set according to the steering angle of the vehicle.
As described above in detail, in the vehicle position/attitude angle estimation device according to the present embodiment, the presence distribution range is expanded in the vehicle width direction and the yaw angle direction of the vehicle as the steering angle increases. As a result, the particle presence distribution range can be set appropriately even when the steering angle increases and the vehicle moves in the vehicle width direction or the yaw angle direction, so the position and attitude angle of the vehicle can be estimated accurately.
2 Camera
3 Three-dimensional map database
4 Vehicle sensor group
10 Vehicle position/attitude angle estimation device
12 Edge image calculation unit
14 Odometry calculation unit
15 Vehicle speed detection unit
16 Particle presence distribution range setting unit
18 Particle scattering unit
20 Projection image creation unit
22 Likelihood calculation unit
24 Position/attitude angle estimation unit
41 GPS receiver
42 Accelerator sensor
43 Steering sensor
44 Brake sensor
45 Vehicle speed sensor
46 Acceleration sensor
47 Wheel speed sensor
48 Yaw rate sensor
75 Yaw rate detection unit
105 Steering angle detection unit
Claims (6)
- A vehicle position/attitude angle estimation device that sets a presence distribution range of particles in a predetermined range using a particle filter, scatters particles within the set presence distribution range, and estimates the position and attitude angle of a vehicle from an image obtained by capturing the surrounding environment of the vehicle, the device comprising:
a vehicle speed detection unit that detects the speed of the vehicle; and
a particle presence distribution range setting unit that expands the presence distribution range in the vehicle width direction of the vehicle when the vehicle speed detected by the vehicle speed detection unit increases. - The vehicle position/attitude angle estimation device according to claim 1, wherein the particle presence distribution range setting unit expands the presence distribution range in the yaw angle direction, which is the rotation direction about the vertical axis of the vehicle, when the speed of the vehicle increases.
- The vehicle position/attitude angle estimation device according to claim 2, further comprising a yaw rate detection unit that detects the yaw rate, which is the rate of change of the vehicle in the yaw angle direction,
wherein the particle presence distribution range setting unit expands the presence distribution range in the yaw angle direction when the yaw rate detected by the yaw rate detection unit increases. - The vehicle position/attitude angle estimation device according to claim 3, wherein the particle presence distribution range setting unit expands the presence distribution range in the yaw angle direction of the vehicle when the yaw rate of the vehicle increases, and then expands it in the vehicle width direction of the vehicle after a predetermined time has elapsed.
- The vehicle position/attitude angle estimation device according to any one of claims 1 to 4, further comprising a steering angle detection unit that detects the steering angle of the vehicle,
wherein the particle presence distribution range setting unit expands the presence distribution range in the vehicle width direction and the yaw angle direction of the vehicle when the steering angle detected by the steering angle detection unit increases. - A vehicle position/attitude angle estimation method for a vehicle position/attitude angle estimation device that sets a presence distribution range of particles in a predetermined range using a particle filter, scatters particles within the set presence distribution range, and estimates the position and attitude angle of the vehicle from an image obtained by capturing the surrounding environment of the vehicle, wherein
the vehicle position/attitude angle estimation device
detects the speed of the vehicle and,
when the speed of the vehicle increases, expands the presence distribution range in the vehicle width direction of the vehicle.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14831869.4A EP3029538B1 (en) | 2013-08-01 | 2014-06-19 | Vehicle position/bearing estimation device and vehicle position/bearing estimation method |
RU2016107125A RU2626424C1 (ru) | 2013-08-01 | 2014-06-19 | Устройство оценки позиции и угла пространственной ориентации транспортного средства и способ оценки позиции и угла пространственной ориентации транспортного средства |
US14/908,407 US10363940B2 (en) | 2013-08-01 | 2014-06-19 | Vehicle position attitude-angle estimation device and vehicle position attitude-angle estimation method |
MX2016001351A MX345393B (es) | 2013-08-01 | 2014-06-19 | Dispositivo de estimación del ángulo de actitud de la posición del vehículo y método de estimación del ángulo de actitud de la posición del vehículo. |
CN201480054274.7A CN105593776B (zh) | 2013-08-01 | 2014-06-19 | 车辆位置姿势角推定装置及车辆位置姿势角推定方法 |
BR112016002163A BR112016002163A2 (pt) | 2013-08-01 | 2014-06-19 | dispositivo para estimativa de ângulo de atitude de posição de veículo e método de estimativa de ângulo de atitude de posição de veículo |
JP2015529451A JP6020729B2 (ja) | 2013-08-01 | 2014-06-19 | 車両位置姿勢角推定装置及び車両位置姿勢角推定方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013160074 | 2013-08-01 | ||
JP2013-160074 | 2013-08-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015015939A1 true WO2015015939A1 (ja) | 2015-02-05 |
Family
ID=52431481
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/066311 WO2015015939A1 (ja) | 2013-08-01 | 2014-06-19 | 車両位置姿勢角推定装置及び車両位置姿勢角推定方法 |
Country Status (8)
Country | Link |
---|---|
US (1) | US10363940B2 (ja) |
EP (1) | EP3029538B1 (ja) |
JP (1) | JP6020729B2 (ja) |
CN (1) | CN105593776B (ja) |
BR (1) | BR112016002163A2 (ja) |
MX (1) | MX345393B (ja) |
RU (1) | RU2626424C1 (ja) |
WO (1) | WO2015015939A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019187750A1 (ja) * | 2018-03-28 | 2019-10-03 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
JP2019215853A (ja) * | 2018-06-11 | 2019-12-19 | バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド | 測位のための方法、測位のための装置、デバイス及びコンピュータ読み取り可能な記憶媒体 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160259034A1 (en) * | 2015-03-04 | 2016-09-08 | Panasonic Intellectual Property Management Co., Ltd. | Position estimation device and position estimation method |
JP6511406B2 (ja) * | 2016-02-10 | 2019-05-15 | クラリオン株式会社 | キャリブレーションシステム、キャリブレーション装置 |
CN110709302B (zh) * | 2017-06-13 | 2022-11-25 | 日立安斯泰莫株式会社 | 车辆控制装置 |
US11531354B2 (en) * | 2017-12-05 | 2022-12-20 | Sony Corporation | Image processing apparatus and image processing method |
CN109109861B (zh) * | 2018-09-24 | 2020-02-14 | 合肥工业大学 | 车道保持横向控制决策方法及车道保持横向控制决策装置 |
WO2020248210A1 (en) * | 2019-06-14 | 2020-12-17 | Bayerische Motoren Werke Aktiengesellschaft | Roadmodel manifold for 2d trajectory planner |
WO2022141240A1 (en) * | 2020-12-30 | 2022-07-07 | SZ DJI Technology Co., Ltd. | Determining vehicle positions for autonomous driving based on monocular vision and semantic map |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010224755A (ja) | 2009-03-23 | 2010-10-07 | Toyota Motor Corp | 移動体及び移動体の位置推定方法 |
JP2011040993A (ja) * | 2009-08-11 | 2011-02-24 | Nikon Corp | 被写体追尾プログラム、およびカメラ |
JP2012108798A (ja) * | 2010-11-18 | 2012-06-07 | Secom Co Ltd | 移動物体追跡装置 |
WO2013002067A1 (ja) * | 2011-06-29 | 2013-01-03 | 株式会社日立産機システム | 移動ロボット、及び移動体に搭載される自己位置姿勢推定システム |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2889882B1 (fr) * | 2005-08-19 | 2009-09-25 | Renault Sas | Procede et systeme de prediction de choc entre un vehicule et un pieton. |
DE102006019216A1 (de) * | 2006-04-21 | 2007-10-25 | Claas Selbstfahrende Erntemaschinen Gmbh | Verfahren zur Steuerung eines landwirtschaftlichen Maschinensystems |
IE20100162A1 (en) * | 2009-03-19 | 2010-09-29 | Cork Inst Technology | A location and tracking system |
CN102087109A (zh) * | 2009-12-04 | 2011-06-08 | 财团法人资讯工业策进会 | 位置估测系统、装置及其估测方法 |
CN101800890B (zh) * | 2010-04-08 | 2013-04-24 | 北京航空航天大学 | 一种高速公路监控场景下多车辆视频跟踪方法 |
US8452535B2 (en) * | 2010-12-13 | 2013-05-28 | GM Global Technology Operations LLC | Systems and methods for precise sub-lane vehicle positioning |
JP5807518B2 (ja) * | 2011-11-09 | 2015-11-10 | 富士通株式会社 | 推定装置、推定方法、および推定プログラム |
CN102768361A (zh) * | 2012-07-09 | 2012-11-07 | 东南大学 | 基于遗传粒子滤波与模糊神经网络的gps/ins组合定位方法 |
US9250324B2 (en) * | 2013-05-23 | 2016-02-02 | GM Global Technology Operations LLC | Probabilistic target selection and threat assessment method and application to intersection collision alert system |
-
2014
- 2014-06-19 BR BR112016002163A patent/BR112016002163A2/pt not_active IP Right Cessation
- 2014-06-19 US US14/908,407 patent/US10363940B2/en not_active Expired - Fee Related
- 2014-06-19 JP JP2015529451A patent/JP6020729B2/ja not_active Expired - Fee Related
- 2014-06-19 RU RU2016107125A patent/RU2626424C1/ru active
- 2014-06-19 WO PCT/JP2014/066311 patent/WO2015015939A1/ja active Application Filing
- 2014-06-19 EP EP14831869.4A patent/EP3029538B1/en not_active Not-in-force
- 2014-06-19 CN CN201480054274.7A patent/CN105593776B/zh not_active Expired - Fee Related
- 2014-06-19 MX MX2016001351A patent/MX345393B/es active IP Right Grant
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010224755A (ja) | 2009-03-23 | 2010-10-07 | Toyota Motor Corp | 移動体及び移動体の位置推定方法 |
JP2011040993A (ja) * | 2009-08-11 | 2011-02-24 | Nikon Corp | 被写体追尾プログラム、およびカメラ |
JP2012108798A (ja) * | 2010-11-18 | 2012-06-07 | Secom Co Ltd | 移動物体追跡装置 |
WO2013002067A1 (ja) * | 2011-06-29 | 2013-01-03 | 株式会社日立産機システム | 移動ロボット、及び移動体に搭載される自己位置姿勢推定システム |
Non-Patent Citations (3)
Title |
---|
MASATO ABE: "JIDOSHA NO UNDO TO SEIGYO", SANKAIDO PUBLISHING CO., LTD, article "chapter 3, section 3.2.1, P. 56, (3.12)" |
See also references of EP3029538A4 |
YUSUKE SATO: "A Study of Line Based Localization for RoboCup Middle Size League", THE ROBOTICS AND MECHATRONICS CONFERENCE '09 KOEN RONBUNSHU, THE JAPAN SOCIETY OF MECHANICAL ENGINEERS, 24 May 2009 (2009-05-24), XP008179898 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019187750A1 (ja) * | 2018-03-28 | 2019-10-03 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
JPWO2019187750A1 (ja) * | 2018-03-28 | 2021-01-07 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
US11472419B2 (en) | 2018-03-28 | 2022-10-18 | Hitachi Astemo, Ltd. | Vehicle control device |
JP2019215853A (ja) * | 2018-06-11 | 2019-12-19 | バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド | 測位のための方法、測位のための装置、デバイス及びコンピュータ読み取り可能な記憶媒体 |
US10964054B2 (en) | 2018-06-11 | 2021-03-30 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and device for positioning |
Also Published As
Publication number | Publication date |
---|---|
MX2016001351A (es) | 2016-04-07 |
JP6020729B2 (ja) | 2016-11-02 |
CN105593776A (zh) | 2016-05-18 |
JPWO2015015939A1 (ja) | 2017-03-02 |
EP3029538A1 (en) | 2016-06-08 |
MX345393B (es) | 2017-01-30 |
US20160185355A1 (en) | 2016-06-30 |
CN105593776B (zh) | 2018-01-23 |
US10363940B2 (en) | 2019-07-30 |
EP3029538A4 (en) | 2016-11-16 |
EP3029538B1 (en) | 2018-04-04 |
RU2626424C1 (ru) | 2017-07-27 |
BR112016002163A2 (pt) | 2017-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6020729B2 (ja) | 車両位置姿勢角推定装置及び車両位置姿勢角推定方法 | |
JP5962771B2 (ja) | 移動物体位置姿勢角推定装置及び移動物体位置姿勢角推定方法 | |
US10508923B2 (en) | Vehicle position estimation device, vehicle position estimation method | |
JP6384604B2 (ja) | 自己位置推定装置及び自己位置推定方法 | |
JP6418332B2 (ja) | 車両位置推定装置、車両位置推定方法 | |
CN109923028B (zh) | 中立点检测装置以及转向操纵控制系统 | |
US20140169630A1 (en) | Driving assistance apparatus and driving assistance method | |
US10162361B2 (en) | Vehicle control device | |
JP6171849B2 (ja) | 移動体位置姿勢角推定装置及び移動体位置姿勢角推定方法 | |
US11120277B2 (en) | Apparatus and method for recognizing road shapes | |
JP2013186551A (ja) | 移動物体位置姿勢推定装置及び方法 | |
US11042759B2 (en) | Roadside object recognition apparatus | |
CN111267862B (zh) | 一种依赖跟随目标的虚拟车道线构造方法和系统 | |
US20210012119A1 (en) | Methods and apparatus for acquisition and tracking, object classification and terrain inference | |
JP6044084B2 (ja) | 移動物体位置姿勢推定装置及び方法 | |
JP2019012345A (ja) | 右左折時衝突被害軽減装置 | |
GB2571588A (en) | Object classification method and apparatus | |
CN115489518A (zh) | 一种倒车控制方法和装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14831869 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015529451 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14908407 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2016/001351 Country of ref document: MX |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112016002163 Country of ref document: BR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014831869 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016107125 Country of ref document: RU Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 112016002163 Country of ref document: BR Kind code of ref document: A2 Effective date: 20160129 |