WO2013133129A1 - Moving Object Position/Posture Estimation Apparatus and Moving Object Position/Posture Estimation Method
- Publication number
- WO2013133129A1 (PCT application PCT/JP2013/055470)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- likelihood
- image
- virtual
- moving object
- posture
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/32—Structuring or formatting of map data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
- G01S3/7864—T.V. type tracking systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to a moving object position / posture estimation apparatus and a moving object position / posture estimation method for estimating the position and posture angle of a moving object.
- As a technique for calculating the position of a moving object by comparing a three-dimensional map with an image captured by a camera, the technique described in Patent Document 1 below is known.
- In this technique, an edge image is created by extracting edges from the actual video obtained by an in-vehicle camera mounted on the vehicle.
- In addition, a virtual image is created by projecting a three-dimensional map, in which the positions and shapes of edges in the surrounding environment are recorded in 3D, from the position and orientation of the in-vehicle camera.
- Then, the position and posture angle of the in-vehicle camera are adjusted so that the edge image matches the virtual image, whereby the position and posture angle of the in-vehicle camera in three-dimensional space are estimated.
- However, in Patent Document 1, even if the actual video and the virtual image match, the position error of the in-vehicle camera may be large when the matching location is far from the in-vehicle camera. Conversely, when the matching location is close to the in-vehicle camera, the error in the attitude angle of the in-vehicle camera may be large.
- The present invention has been proposed in view of the above circumstances, and its object is to provide a moving object position/posture estimation apparatus and a moving object position/posture estimation method capable of accurately estimating the position and posture angle of a moving object.
- A moving object position/orientation estimation apparatus according to the present invention estimates the position and attitude angle of a moving object, and includes an imaging unit, a contrast image acquisition unit, a likelihood setting unit, and a moving object position/posture estimation unit.
- The imaging unit images the surroundings of the moving object and acquires a captured image.
- the contrast image acquisition unit acquires a contrast image when viewed from a predetermined position and posture angle.
- The likelihood setting unit compares the captured image acquired by the imaging unit with the contrast image acquired by the contrast image acquisition unit. When a far-position pixel in the captured image matches a far-position pixel in the contrast image, it sets the posture angle likelihood of the contrast image high; when a near-position pixel in the captured image matches a near-position pixel in the contrast image, it sets the position likelihood of the contrast image high.
- The moving object position/posture estimation unit estimates the posture angle of the moving object based on the posture angle of the contrast image for which the posture angle likelihood is set high by the likelihood setting unit, and estimates the position of the moving object based on the position of the contrast image for which the position likelihood is set high by the likelihood setting unit.
- the moving object position / posture estimation method is a moving object position / posture estimation method for estimating the position and posture angle of a moving object.
- A captured image of the surroundings of the moving object is compared with a contrast image viewed from a predetermined position and posture angle.
- When a far-position pixel in the captured image matches a far-position pixel in the contrast image, the posture angle likelihood of the contrast image is set high.
- When a near-position pixel in the captured image matches a near-position pixel in the contrast image, the position likelihood of the contrast image is set high.
- The posture angle of the moving object is estimated based on the posture angle of the contrast image for which a high posture angle likelihood is set.
- The position of the moving object is estimated based on the position of the contrast image for which a high position likelihood is set.
- FIG. 1 is a block diagram showing a configuration of a moving object position / posture estimation apparatus according to the first embodiment of the present invention.
- FIG. 2A illustrates a captured image acquired by the imaging unit.
- FIG. 2B illustrates an edge image obtained by extracting edges from the captured image in FIG. 2A.
- FIG. 2C illustrates a virtual image obtained by projecting three-dimensional position information from the position and posture angle of the camera.
- FIG. 2D is a plan view showing the virtual image when the virtual position in FIG. 2C is shifted to the right.
- FIG. 3 is a flowchart showing an example of the operation procedure of the moving object position / orientation estimation apparatus according to the first embodiment of the present invention.
- FIG. 4 is a plan view for explaining an operation of moving particles (candidate points).
- FIG. 5 is a graph showing an example of the relationship between the distance from the vehicle and the position likelihood like_p.
- FIG. 6 is a graph showing an example of the relationship between the distance from the vehicle and the attitude angle likelihood like_a.
- FIG. 7 is a flowchart showing an example of an operation procedure of the moving object position / posture estimation apparatus according to the second embodiment of the present invention.
- the moving object position / orientation estimation apparatus includes an ECU (Engine Control Unit) 1, a camera (an example of an imaging unit) 2, a three-dimensional map database 3, and a vehicle sensor group 4.
- the vehicle sensor group 4 includes a GPS receiver 41, an accelerator sensor 42, a steering sensor 43, a brake sensor 44, a vehicle speed sensor 45, an acceleration sensor 46, a wheel speed sensor 47, and other sensors 48 such as a yaw rate sensor.
- the ECU 1 is actually composed of a ROM, a RAM, an arithmetic circuit, and the like.
- The ECU 1 executes processing in accordance with a computer program for moving object position/orientation estimation stored in the ROM, thereby functioning as a virtual image acquisition unit 11 (an example of a contrast image acquisition unit), a likelihood setting unit 12, and a moving object position/orientation estimation unit 13.
- The camera 2 uses a solid-state image sensor such as a CCD.
- the camera 2 is installed, for example, at a front portion (position) of the vehicle in a direction (attitude angle) in which the front of the vehicle can be imaged.
- the camera 2 captures the periphery of the vehicle at predetermined time intervals and acquires a captured image.
- the acquired captured image is supplied to the ECU 1.
- The 3D map database 3 stores three-dimensional position information, such as edges of the surrounding environment, including road surface markings.
- In addition to road surface markings such as white lines, stop lines, pedestrian crossings, and road surface marks, the 3D map database 3 stores edge information of structures such as curbs and buildings as three-dimensional position information.
- This three-dimensional position information is defined as a collection of edges. A long straight edge is divided, for example, every 1 m, so there are no extremely long edges. For a straight line, each edge holds three-dimensional position information indicating its two end points. For a curve, each edge holds three-dimensional position information indicating its two end points and its center point.
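- As a concrete illustration of this storage scheme (not part of the patent), the sketch below models one map edge as a small record holding its end points and, for curves, a center point; the class and field names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Point3D = Tuple[float, float, float]  # (x, y, z) in map coordinates, meters

@dataclass
class MapEdge:
    """One edge segment from the 3D map database (split into ~1 m pieces)."""
    start: Point3D                    # one end point of the segment
    end: Point3D                      # the other end point
    center: Optional[Point3D] = None  # center point, present only for curves

    def is_curve(self) -> bool:
        return self.center is not None

# Example: a 1 m piece of a straight white-line edge on a flat road.
edge = MapEdge(start=(10.0, 2.0, 0.0), end=(11.0, 2.0, 0.0))
```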
- the vehicle sensor group 4 is connected to the ECU 1.
- the vehicle sensor group 4 supplies various sensor values detected by the sensors 41 to 48 to the ECU 1.
- the ECU 1 uses the output value of the vehicle sensor group 4 to calculate an approximate position of the vehicle and an odometry indicating the amount of movement of the vehicle per unit time.
- the ECU 1 is an electronic control unit that estimates the position and attitude angle of the vehicle using the captured image captured by the camera 2 and the three-dimensional position information stored in the three-dimensional map database 3.
- the ECU 1 may also be used as an ECU used for other controls.
- the moving object position / orientation estimation apparatus estimates the position and attitude angle of the vehicle by comparing the captured image captured by the camera 2 with a contrast image viewed from a predetermined position and attitude angle.
- a virtual image obtained by converting 3D map data into an image captured from a virtual position and a virtual posture angle is used as an example of “a contrast image when viewed from a predetermined position and posture angle”.
- a captured image as shown in FIG. 2A is obtained and an edge image as shown in FIG. 2B is obtained.
- When the virtual position and virtual posture angle used for the projection match the actual position and posture angle of the camera 2, the virtual image obtained by projecting the three-dimensional position information is as shown in FIG. 2C.
- In this case, both the far position (A) and the near position (B) match, so it can be estimated that the virtual position and virtual posture angle that generated the virtual image correspond to the position and posture angle of the host vehicle.
- However, when the virtual position is shifted to the right, the virtual image is as shown in FIG. 2D.
- In this case, the far position (A) still matches, but the near position (B) is greatly shifted.
- Conversely, if the virtual posture angle of the virtual image in FIG. 2C is shifted (not shown) and the result is compared with the captured image in FIG. 2A, the near position (B) matches, but the far position (A) deviates greatly.
- Therefore, the moving object position/posture estimation apparatus determines that the virtual position of the virtual image is likely correct when near-position pixels in the captured image match near-position pixels in the virtual image. Conversely, it determines that the virtual posture angle of the virtual image is likely correct when far-position pixels in the captured image match far-position pixels in the virtual image.
- the operation of the moving object position / orientation estimation apparatus will be described with reference to the position / orientation estimation algorithm shown in FIG.
- In the following, the position of the vehicle is estimated with three degrees of freedom (front-rear, lateral, and vertical directions) and the attitude angle with three degrees of freedom (roll, pitch, and yaw).
- the position / orientation estimation algorithm shown in FIG. 3 is continuously performed by the ECU 1 at intervals of, for example, about 100 msec.
- In step S1, the ECU 1 acquires a video image captured by the camera 2 and calculates an edge image from a captured image contained in the video image.
- An edge in the present embodiment refers to a location where the luminance of a pixel changes sharply.
- the Canny method can be used as the edge detection method.
- the edge detection method is not limited to this, and various other methods such as differential edge detection may be used.
- the ECU 1 extracts the brightness change direction of the edge, the color near the edge, and the like from the captured image of the camera 2.
- In that case, the position likelihood and the posture angle likelihood may be set using information other than the edges recorded in the three-dimensional map database 3, and the vehicle position and posture angle may be estimated from that information.
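- A minimal sketch of the edge-image computation in step S1, using OpenCV's Canny detector; the threshold values shown are placeholders, since the patent does not specify them.

```python
import cv2

def compute_edge_image(frame):
    """Return a binary edge image from one captured camera frame (step S1)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Canny marks pixels where the luminance changes sharply; the threshold
    # values here are illustrative, not taken from the patent.
    return cv2.Canny(gray, threshold1=50, threshold2=150)
```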
- In step S2, the ECU 1 calculates the odometry, that is, the amount of movement from the vehicle position calculated one loop before by the position/orientation estimation algorithm, from the sensor values obtained from the vehicle sensor group 4. In the first loop after the algorithm starts, the odometry is set to zero.
- The ECU 1 calculates the odometry, that is, the amount the vehicle has moved per unit time, using the various sensor values obtained from the vehicle sensor group 4.
- For example, with the vehicle motion limited to a plane, the ECU 1 calculates the vehicle speed and yaw rate from the sensor values detected by the wheel speed sensors of each wheel and the steering sensor, and computes the amount of translation and rotation per unit time.
- The ECU 1 may use the vehicle speed or differences in the positioning values of the GPS receiver 41 instead of the wheel speed, and may use the steering angle instead of the yaw rate sensor.
- Various calculation methods can be considered as a method for calculating odometry, but any method may be used as long as odometry can be calculated.
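- One such calculation method can be sketched as follows: a planar dead-reckoning step driven by vehicle speed and yaw rate. The unicycle-style model and the variable names are our own assumptions, not specified in the patent.

```python
import math

def integrate_odometry(x, y, yaw, v, yaw_rate, dt):
    """Advance a planar pose by one time step of odometry.

    x, y     : position [m]
    yaw      : heading [rad]
    v        : vehicle speed [m/s] (e.g., from the wheel speed sensors)
    yaw_rate : [rad/s] (e.g., from the yaw rate sensor or steering angle)
    dt       : time step [s] (about 0.1 s for a 100 ms loop)
    """
    x += v * dt * math.cos(yaw)
    y += v * dt * math.sin(yaw)
    yaw += yaw_rate * dt
    return x, y, yaw
```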
- the ECU 1 acquires a contrast image when viewed from a predetermined position and posture angle.
- the virtual image acquisition unit 11 as an example of the contrast image acquisition unit calculates a plurality of virtual (predicted) position and virtual (predicted) posture angle candidates from the odometry calculated in step S2.
- the plurality of virtual position and virtual posture angle candidates are candidates for the vehicle position and the posture angle.
- Specifically, the virtual image acquisition unit 11 moves the vehicle position estimated in step S6 one loop before by the odometry calculated in step S2.
- the virtual image acquisition unit 11 calculates a plurality of virtual position and virtual attitude angle candidates in the vicinity of the moved vehicle position.
- When the ECU 1 has no information on the previous vehicle position, for example in the first loop, the virtual image acquisition unit 11 uses the data from the GPS receiver 41 included in the vehicle sensor group 4 as the initial position information. Alternatively, the virtual image acquisition unit 11 may store the vehicle position and posture angle last calculated when the vehicle previously stopped, and use them as the initial position and posture angle information.
- The virtual image acquisition unit 11 generates a plurality of virtual position and virtual attitude angle candidates that the true values of the vehicle position and attitude angle can take, taking into account the odometry error caused by measurement errors of the vehicle sensor group 4 and communication delays, as well as dynamic characteristics of the vehicle that the odometry cannot capture.
- The virtual position and virtual attitude angle candidates are set randomly, using a random number table or the like, within upper and lower error limits set for each of the six degree-of-freedom parameters of position and attitude angle.
- 500 candidates for the virtual position and the virtual posture angle are created.
- For example, the upper and lower error limits for the six degree-of-freedom parameters of position and attitude angle are ±0.05 [m], ±0.05 [m], ±0.05 [m], ±0.5 [deg], ±0.5 [deg], and ±0.5 [deg], in the order of front-rear direction, lateral direction, vertical direction, roll, pitch, and yaw.
- It is desirable to change, as appropriate, the number of virtual position and virtual attitude angle candidates created and the upper and lower error limits for the six degree-of-freedom parameters, by detecting or estimating the driving state of the vehicle and the road surface conditions.
- For example, during a sharp turn or a slip, the errors in the in-plane directions (front-rear direction, lateral direction, and yaw) are likely to increase, so the upper and lower error limits of these three parameters are increased.
- Note that the virtual image acquisition unit 11 may set the plurality of virtual position and virtual posture angle candidates using a so-called particle filter. In that case, the virtual image acquisition unit 11 moves the position and posture angle of each particle (candidate point), that is, each of the plurality of virtual position and virtual posture angle candidates generated in step S6 one loop before, by the odometry. Specifically, as shown in FIG. 4, the particle P representing the position and posture angle of the vehicle V(t1) estimated one loop before and the surrounding particles P1 to P5 are moved by the odometry. The virtual image acquisition unit 11 thereby sets particles P10 to P15 for estimating the position and attitude angle of the vehicle V(t2).
- The position and posture angle of each moved particle are then set as the plurality of virtual position and virtual posture angle candidates.
- The position information and attitude angle information of each particle are moved by the odometry after taking into account the odometry error caused by measurement errors of the vehicle sensor group 4 and communication delays, as well as dynamic characteristics of the vehicle that the odometry cannot capture.
- In the first loop, each particle has no position information or attitude angle information. Therefore, the detection data of the GPS receiver 41 included in the vehicle sensor group 4 may be used as the initial position information, or the position information and attitude angle information of each particle may be set from the vehicle position last estimated when the vehicle previously stopped.
- In the first loop, upper and lower error limits for the six degree-of-freedom parameters of position and posture angle are set around the vehicle position last estimated when the vehicle previously stopped, and the position information and posture angle information of each particle are set randomly within those limits using a random number table or the like.
- 500 particles are created in the case of the first loop.
- the upper and lower limits of the error with respect to the 6-degree-of-freedom parameter of each particle are ⁇ 0.05 [m], ⁇ 0.05 [m], ⁇ 0.05 [m] in the order of the front and rear direction, the horizontal direction, the vertical direction, the roll, the pitch, and the yaw. , ⁇ 0.5 [deg], ⁇ 0.5 [deg], ⁇ 0.5 [deg].
- In step S4, the virtual image acquisition unit 11 creates a virtual image (projected image) for each of the plurality of virtual position and virtual attitude angle candidates created in step S3.
- At this time, the virtual image acquisition unit 11 converts the three-dimensional position information, such as the edges stored in the three-dimensional map database 3, into a camera image as captured from the virtual position and virtual attitude angle, thereby creating the virtual image. This virtual image is an evaluation image for evaluating whether each virtual position and virtual posture angle candidate matches the actual position and posture angle of the host vehicle.
- an external parameter indicating the position of the camera 2 and an internal parameter of the camera 2 are required.
- The external parameter may be calculated from each virtual position and virtual posture angle candidate by measuring, in advance, the relative position from the vehicle position (for example, the vehicle center) to the camera 2.
- the internal parameters may be calibrated in advance.
- It is further preferable to extract the three-dimensional position information from camera images and to create the virtual images in advance.
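- The projection in step S4 is, in essence, a standard pinhole-camera projection of the map's 3D edge points. The sketch below uses OpenCV's cv2.projectPoints with extrinsics derived from one candidate pose; the helper name and the details of composing the extrinsics are assumptions on our part.

```python
import numpy as np
import cv2

def render_virtual_image(edge_points, rvec, tvec, K, dist, image_size):
    """Project 3D map edge points into a binary virtual image (step S4).

    edge_points : (N, 3) array of 3D edge points from the map database
    rvec, tvec  : camera extrinsics for one virtual position / posture
                  candidate (virtual vehicle pose composed with the
                  pre-measured vehicle-to-camera offset)
    K, dist     : pre-calibrated camera intrinsics and distortion coefficients
    """
    pts, _ = cv2.projectPoints(edge_points.astype(np.float64), rvec, tvec, K, dist)
    w, h = image_size
    virtual = np.zeros((h, w), dtype=np.uint8)
    for u, v in pts.reshape(-1, 2):
        if 0 <= u < w and 0 <= v < h:
            virtual[int(v), int(u)] = 255  # mark projected edge pixel
    return virtual
```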
- In step S5, the ECU 1 (likelihood setting unit 12) compares the edge image created in step S1 with the virtual image created in step S4, for each of the plurality of virtual position and virtual attitude angle candidates set in step S3.
- the likelihood setting unit 12 sets a position likelihood and a posture angle likelihood for each virtual position and virtual posture angle candidate based on the comparison result.
- The position likelihood is an index of how plausible each virtual position candidate is as the actual vehicle position.
- Likewise, the attitude angle likelihood is an index of how plausible each virtual attitude angle candidate is as the actual vehicle attitude angle.
- the likelihood setting unit 12 sets the position likelihood or the posture angle likelihood higher as the matching degree between the virtual image and the edge image is higher. A method for obtaining the position likelihood or the posture angle likelihood will be described later.
- Specifically, the likelihood setting unit 12 compares the captured image with the virtual image, and sets the posture angle likelihood of the virtual image high when a far-position pixel in the captured image matches a far-position pixel in the virtual image.
- Likewise, the likelihood setting unit 12 compares the captured image with the virtual image, and sets the position likelihood of the virtual image high when a near-position pixel in the captured image matches a near-position pixel in the virtual image.
- The far-position pixels and the near-position pixels may be defined, for example, according to position within the captured image and the virtual image. For example, a range extending downward by a predetermined width from the vertical midpoint of the captured image and the virtual image may be set as the image region in which far-position pixels exist.
- Similarly, a range extending upward by a predetermined width from the bottom of the captured image and the virtual image may be set as the image region in which near-position pixels exist. Alternatively, the actual distance can be used, because the position of the camera 2 is known when the three-dimensional map is projected to obtain the virtual image.
- Specifically, the likelihood setting unit 12 determines whether the edges in the virtual image and the edge image match, that is, whether an edge exists in the edge image at the pixel coordinates at which an edge exists in the virtual image.
- When the edges match, the likelihood setting unit 12 refers to the three-dimensional map database 3 and obtains the position of the matching edge portion in three-dimensional space. It then obtains the distance L (unit: m) between the position information of the virtual position and virtual attitude angle candidate being evaluated and the matching edge portion, and takes the reciprocal of the distance L as the position likelihood like_p (unit: none).
- the likelihood setting unit 12 sets the position likelihood like_p to 0 when the edges on the virtual image and the edge image do not match.
- the likelihood setting unit 12 performs this process for all the pixels on the virtual image.
- Thus, when the edge image and the virtual image match at a location close to the vehicle, the likelihood setting unit 12 sets a large position likelihood like_p for that pixel. Conversely, when they match at a location far from the vehicle, it sets a small position likelihood like_p for the pixel.
- the likelihood setting unit 12 sets the total of the position likelihood like_p of all the pixels as the position likelihood LIKE_P (unit: none) for the virtual image.
- the likelihood setting unit 12 may provide an upper limit value according to the distance from the vehicle as shown in FIG. 5 when obtaining the position likelihood like_p.
- Specifically, the likelihood setting unit 12 sets the position likelihood like_p so that it takes a predetermined upper limit value lthp when a pixel of the edge image (captured image) matches a pixel of the virtual image at a distance closer than the predetermined distance Lthp.
- Alternatively, the likelihood setting unit 12 may reduce the increment of the likelihood like_p as the distance from the vehicle becomes shorter.
- the distance Lthp is set to 1.0 [m].
- the likelihood setting unit 12 prepares a position likelihood setting map in which the relationship between the distance from the vehicle and the position likelihood like_p is described in advance.
- the ECU 1 refers to the position likelihood setting map and sets the position likelihood like_p according to the distance from the vehicle where the edge image and the virtual image match.
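- A sketch of the per-pixel position likelihood just described: the reciprocal of the distance L, capped for matches nearer than Lthp = 1.0 m. The patent does not give the cap value itself, so it is taken here as 1/Lthp for continuity; that choice is our assumption.

```python
def position_likelihood(L, Lthp=1.0):
    """Per-pixel position likelihood like_p for an edge match at distance L [m].

    like_p is the reciprocal of the distance; matches nearer than Lthp are
    capped at the upper limit lthp (cf. FIG. 5), assumed here to be 1 / Lthp.
    """
    lthp = 1.0 / Lthp                     # assumed upper limit value
    return min(1.0 / max(L, 1e-6), lthp)  # cap for L < Lthp, 1/L otherwise
```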
- Similarly, the ECU 1 also calculates the attitude angle likelihood.
- When an edge portion in the virtual image matches an edge portion in the edge image, the likelihood setting unit 12 obtains the distance L (unit: m) of the matching pixel from the vehicle, as when obtaining the position likelihood LIKE_P. The likelihood setting unit 12 then takes the distance L divided by 10 as the posture angle likelihood like_a (unit: none); when the edge portions in the virtual image and the edge image do not match, it sets the posture angle likelihood like_a to 0.
- the likelihood setting unit 12 performs the process of setting the posture angle likelihood like_a on all the pixels on the virtual image.
- the likelihood setting unit 12 sets a large posture angle likelihood like_a for the pixel when the edge image and the virtual image match in a portion far from the vehicle.
- the likelihood setting unit 12 sets a small posture angle likelihood like_a for the pixel when the edge image and the virtual image coincide with each other near the vehicle.
- the likelihood setting unit 12 sets the sum of the posture angle likelihood like_a of all pixels as the posture angle likelihood LIKE_A (unit: none) for the virtual image.
- the likelihood setting unit 12 may provide an upper limit value according to the distance from the vehicle as shown in FIG. 6 when obtaining the attitude angle likelihood like_a.
- the ECU 1 sets the posture angle likelihood like_a so that the predetermined upper limit value ltha is obtained when the pixel of the edge image (captured image) and the pixel of the virtual image coincide with each other at a distance equal to or longer than the predetermined distance Ltha.
- the ECU 1 may decrease the increment of the attitude angle likelihood like_a as the distance from the vehicle increases.
- the distance Ltha is set to 30.0 [m].
- the likelihood setting unit 12 prepares a posture angle likelihood setting map in which the relationship between the distance from the vehicle and the posture angle likelihood like_a is described in advance.
- the ECU 1 refers to the posture angle likelihood setting map and sets the posture angle likelihood like_a according to the distance from the vehicle where the edge image and the virtual image match.
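- Similarly, a sketch of the posture angle likelihood: L / 10, capped once the match is at or beyond Ltha = 30 m. The cap is taken here as Ltha / 10 for continuity, which the patent does not state explicitly.

```python
def attitude_likelihood(L, Ltha=30.0):
    """Per-pixel posture angle likelihood like_a for an edge match at L [m].

    like_a grows with distance as L / 10; matches at or beyond Ltha are
    capped at the upper limit ltha (cf. FIG. 6), assumed here to be Ltha / 10.
    """
    ltha = Ltha / 10.0          # assumed upper limit value
    return min(L / 10.0, ltha)  # cap for L >= Ltha, L/10 otherwise
```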
- the likelihood setting unit 12 obtains the position likelihood LIKE_P and the posture angle likelihood LIKE_A for each virtual image. That is, the position likelihood LIKE_P and the posture angle likelihood LIKE_A are calculated for each candidate of the virtual position and the virtual posture angle.
- Then, the likelihood setting unit 12 normalizes the position likelihoods LIKE_P and the posture angle likelihoods LIKE_A so that each sums to 1 over all the virtual position and virtual posture angle candidates.
- In step S6, the moving object position/posture estimation unit 13 calculates the final position and attitude angle of the vehicle using the plurality of virtual position and virtual posture angle candidates for which the position likelihood LIKE_P and the posture angle likelihood LIKE_A were obtained in step S5.
- the moving object position / posture estimation unit 13 estimates the actual posture angle of the vehicle based on the posture angle of the virtual image in which the posture angle likelihood LIKE_A is set high.
- the moving object position / posture estimation unit 13 estimates the actual position of the vehicle based on the position of the virtual image in which the position likelihood LIKE_P is set high.
- the moving object position / posture estimation unit 13 may calculate, for example, the virtual position and virtual posture angle of the virtual image with the highest position likelihood LIKE_P as the actual position and posture angle of the vehicle.
- the moving object position / posture estimation unit 13 may calculate the virtual position and the virtual posture angle of the virtual image set with the highest posture angle likelihood LIKE_A as the actual position and posture angle of the vehicle.
- Alternatively, the moving object position/posture estimation unit 13 may calculate the virtual position and virtual posture angle of the virtual image having the highest sum of the position likelihood LIKE_P and the posture angle likelihood LIKE_A as the actual position and posture angle of the vehicle.
- Alternatively, the moving object position/orientation estimation unit 13 may weight each virtual position and virtual attitude angle candidate according to the position likelihood LIKE_P of its virtual image and calculate the weighted average as the actual position and attitude angle of the vehicle.
- Likewise, it may weight each candidate according to the posture angle likelihood LIKE_A of its virtual image and calculate the weighted average as the actual position and attitude angle of the vehicle.
- It may also weight each candidate according to the sum of the position likelihood LIKE_P and the posture angle likelihood LIKE_A of its virtual image and calculate the weighted average of the virtual positions and virtual attitude angles as the actual position and attitude angle of the vehicle.
- When the particle filter is used, the moving object position/posture estimation unit 13 may set the virtual position of the particle having the highest position likelihood LIKE_P as the actual position of the vehicle.
- Alternatively, the moving object position/posture estimation unit 13 may weight the virtual positions of the particles according to the position likelihood LIKE_P and calculate the weighted average of the virtual positions as the actual position of the vehicle. Similarly, it may take the virtual posture angle of the particle having the highest posture angle likelihood LIKE_A as the actual posture angle of the vehicle, or weight the virtual posture angles of the particles according to the posture angle likelihood LIKE_A and calculate the weighted average of the virtual posture angles as the actual posture angle of the vehicle.
- the virtual image acquisition unit 11 performs resampling of each particle based on the position likelihood LIKE_P and the posture angle likelihood LIKE_A. That is, the plurality of virtual positions and the virtual posture angles are reset based on the posture angle likelihoods of the plurality of virtual images and the position likelihoods of the plurality of virtual images.
- the virtual image acquisition unit 11 resamples each particle around the particle having the highest sum of the position likelihood LIKE_P and the posture angle likelihood LIKE_A.
- Alternatively, the virtual image acquisition unit 11 may temporarily separate the position information and the posture angle information of each particle, resampling the particles holding only position information based on the position likelihood LIKE_P, and the particles holding only posture angle information based on the posture angle likelihood LIKE_A.
- The virtual image acquisition unit 11 may then randomly recombine the position information and the posture angle information of these particles to reconstruct particles having both position information and posture angle information.
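- A sketch of this decoupled resampling: positions are resampled by LIKE_P, posture angles by LIKE_A, and the two halves are recombined at random into full particles. Function and variable names are ours.

```python
import numpy as np

def resample_decoupled(candidates, like_p, like_a, rng=None):
    """Resample particles with position and posture angle treated separately.

    candidates : (N, 6) array, columns = (x, y, z, roll, pitch, yaw)
    like_p, like_a : (N,) normalized likelihoods used as sampling weights
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(candidates)
    pos_idx = rng.choice(n, size=n, p=like_p)  # resample positions by LIKE_P
    ang_idx = rng.choice(n, size=n, p=like_a)  # resample angles by LIKE_A
    rng.shuffle(ang_idx)                       # random recombination
    return np.hstack([candidates[pos_idx, :3], candidates[ang_idx, 3:]])
```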
- the ECU 1 can sequentially calculate the position and posture angle of the vehicle by repeatedly performing the above steps S1 to S6.
- As described above, the captured image and the virtual image are compared; when a far-position pixel in the captured image matches a far-position pixel in the virtual image, the posture angle likelihood of the virtual image is set high.
- When a near-position pixel in the captured image matches a near-position pixel in the virtual image, the position likelihood of the virtual image is set high.
- the moving object position / posture estimation apparatus estimates the actual posture angle of the moving object based on the virtual posture angle of the virtual image for which the posture angle likelihood is set high.
- the actual position of the moving object is estimated based on the virtual position of the virtual image for which the position likelihood is set high.
- According to this moving object position/posture estimation apparatus, different likelihoods are set for the position and the posture angle, and the position and posture angle can be adjusted and estimated separately according to the distance to the location where the real image and the virtual image match. Specifically, when the captured image and the virtual image match at a location far from the camera 2, the virtual posture angle is close to the true value, but the error in the virtual position may be large; hence, the posture angle likelihood is set larger while the position likelihood is not set too large. Conversely, when the captured image and the virtual image match at a location close to the camera 2, the position likelihood is set larger while the posture angle likelihood is not set too large.
- the position and orientation angle of a moving object can be estimated with high accuracy.
- When the captured image and the virtual image match at a distance equal to or greater than a predetermined distance, the posture angle likelihood of the virtual image is set so that its increment is reduced or it becomes a predetermined upper limit value.
- Likewise, when the captured image and the virtual image match within a predetermined distance, the position likelihood of the virtual image is set so that its increment is reduced or it becomes a predetermined upper limit value.
- In this embodiment, a plurality of particles are set, and the position likelihood LIKE_P and the posture angle likelihood LIKE_A are set for each particle. The position and posture angle of the vehicle are obtained based on these per-particle likelihoods, and the particles can then be resampled based on the position likelihood LIKE_P and the posture angle likelihood LIKE_A.
- In a particle filter, when the estimated order is n, increasing the estimation accuracy by a factor of a requires, in principle, increasing the number of scattered particles by a factor of a to the power n (Probabilistic Robotics, Chapter 4, Section 3; authors Sebastian Thrun, Wolfram Burgard, and Dieter Fox; translated by Ryuichi Ueda; published by Mainichi Communications Inc.).
- In this embodiment, the order is six. Therefore, doubling the estimation accuracy multiplies the calculation time by 64, and tripling it multiplies the calculation time by 729.
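- In formula form (our notation, not the patent's): with estimation order $n$ and accuracy factor $a$, the particle count $N$ must scale as

$$N' = a^{n} N, \qquad n = 6:\quad 2^{6} = 64,\quad 3^{6} = 729.$$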
- the second embodiment of the present invention a moving object position / posture estimation apparatus and a moving object position / posture estimation method for estimating each of the position and posture angle of a vehicle using different virtual images will be described.
- In the first embodiment, a plurality of candidate points (particles), each consisting of a combination of a virtual position and a virtual attitude angle, are set, and the position and attitude angle of the vehicle are determined based on the virtual image acquired for each candidate point.
- In the second embodiment, by contrast, virtual images of the region near the vehicle are acquired based on a plurality of virtual positions, and the position of the vehicle is estimated using these near-region virtual images.
- Then, virtual images of the region far from the vehicle are acquired based on a plurality of virtual posture angles, and the posture angle of the vehicle is estimated using these far-region virtual images. That is, the difference is that different virtual images are used for estimating the position and the posture angle of the vehicle.
- the hardware configuration of the moving object position / orientation estimation apparatus according to the second embodiment is the same as that shown in FIG.
- However, the software configuration of the ECU 1, that is, the operation of the virtual image acquisition unit 11, the likelihood setting unit 12, and the moving object position/posture estimation unit 13, differs as described below.
- the operation of the moving object position / orientation estimation apparatus will be described with reference to the position / orientation estimation algorithm shown in FIG.
- In the following, the position of the vehicle is estimated with three degrees of freedom (front-rear, lateral, and vertical directions) and the attitude angle with three degrees of freedom (roll, pitch, and yaw).
- the position / orientation estimation algorithm shown in FIG. 7 is continuously performed by the ECU 1 at intervals of, for example, about 100 msec.
- steps S1 to S2 are the same as steps S1 to S2 described with reference to FIG.
- After step S2, the process proceeds to step S13, and the ECU 1 (virtual image acquisition unit 11) calculates an initial predicted position and an initial predicted posture angle with six degrees of freedom from the odometry calculated in step S2.
- the virtual image acquisition unit 11 sets, as the initial predicted position, the position moved by the odometry calculated in step S2 from the vehicle position estimated in step S16 one loop before.
- In the first loop, the ECU 1 has no information on the previous vehicle position.
- In that case, the virtual image acquisition unit 11 uses the data from the GPS receiver 41 included in the vehicle sensor group 4 as the initial predicted position.
- Alternatively, the virtual image acquisition unit 11 may store the vehicle position and posture angle last calculated when the vehicle previously stopped, and use them as the initial predicted position and the initial predicted posture angle.
- In step S14, the virtual image acquisition unit 11 sets a plurality of virtual positions at and near the initial predicted position calculated in step S13.
- At this time, the virtual image acquisition unit 11 generates a plurality of virtual positions that the true value of the vehicle position can take, taking into account the odometry error caused by measurement errors of the vehicle sensor group 4 and communication delays, as well as dynamic characteristics of the vehicle that the odometry cannot capture.
- The virtual positions are set at predetermined intervals within upper and lower error limits defined for each of the three degree-of-freedom parameters of the position.
- the virtual image acquisition unit 11 creates a virtual image of a region close to the vehicle for each virtual position set in step S14 using the evaluation point projection method.
- The creation of the virtual image in step S15 is intended for estimating the position of the vehicle. Therefore, only the information near the vehicle, out of the three-dimensional position information such as edges stored in the three-dimensional map database 3, is converted, using the evaluation point projection means, into a virtual image as captured from the virtual position and virtual attitude angle.
- a virtual image is created by projecting only three-dimensional position information such as an edge within a distance of 3 m from each virtual position.
- As the virtual posture angle used in step S15, the vehicle posture angle estimated in step S19 one loop before or the initial predicted posture angle obtained in step S13 may be used.
- In step S16, the likelihood setting unit 12 compares each virtual image of the region near the vehicle created in step S15 with the captured image. Specifically, for each virtual image created in step S15, the edges included in the virtual image are compared with the edge image composed of the edges included in the captured image, and the degree of matching between the edges in the virtual image and the edges in the edge image is calculated.
- For example, the degree of matching is counted as the number of pixels whose edges match between the near-region virtual image and the edge image, that is, the number of pixel coordinates at which an edge exists in both the virtual image and the edge image.
- the likelihood setting unit 12 sets the position likelihood higher as the matching degree between the edge in the virtual image and the edge in the captured image is higher.
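- A sketch of this degree-of-matching computation: count the pixel coordinates at which both the virtual image and the edge image contain an edge. Binary images are assumed.

```python
import numpy as np

def matching_degree(virtual_edges, edge_image):
    """Count pixels where an edge exists in both binary images (step S16)."""
    return int(np.count_nonzero((virtual_edges > 0) & (edge_image > 0)))

# The virtual position whose near-region virtual image maximizes this count
# is taken as the vehicle position (or a likelihood-weighted mean is used).
```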
- Next, the moving object position/posture estimation unit 13 estimates the position of the vehicle based on the virtual position of the virtual image for which a high position likelihood is set by the likelihood setting unit 12. For example, the moving object position/posture estimation unit 13 calculates, as the actual position of the vehicle, the virtual position of the virtual image having the largest number of matching pixels among the plurality of virtual images created in step S15. Alternatively, it may weight each virtual position according to the position likelihood of its virtual image and calculate the weighted average of the virtual positions as the actual position of the vehicle.
- In this way, virtual images of the region near the vehicle are acquired based on the plurality of virtual positions, and the position of the vehicle is estimated using these near-region virtual images.
- Next, the process proceeds to step S17, and the virtual image acquisition unit 11 sets a plurality of virtual attitude angles around the initial predicted attitude angle calculated in step S13.
- At this time, the virtual image acquisition unit 11 generates a plurality of virtual attitude angles that the true value of the attitude angle of the vehicle can take, taking into account the odometry error caused by measurement errors of the vehicle sensor group 4 and communication delays, as well as dynamic characteristics of the vehicle that the odometry cannot capture.
- the virtual posture angle is set at predetermined intervals within the upper and lower limits of the error for each of the three-degree-of-freedom parameters included in the posture angle.
- For example, the upper and lower error limits for the three degree-of-freedom parameters of the posture angle are ±0.5 [deg], ±0.5 [deg], and ±0.5 [deg], in the order of roll, pitch, and yaw, and the virtual posture angles are set at regular intervals, 20 values per parameter, within these limits.
- Therefore, 20 × 20 × 20 = 8000 virtual posture angle candidates are created.
- It is desirable to change the upper and lower error limits and the predetermined interval as appropriate by detecting or estimating the driving state of the vehicle and the road surface conditions. For example, during a sudden turn or a slip, the yaw angle error is likely to increase, so it is preferable to widen the upper and lower limits of the yaw angle error; likewise, when overcoming a step, it is preferable to widen the upper and lower limits of the pitch angle error.
- step S18 the virtual image acquisition unit 11 creates a virtual image of an area far from the vehicle for each virtual attitude angle set in step S17 using the evaluation point projection method.
- The creation of the virtual image in step S18 is intended for estimating the attitude angle of the vehicle. Therefore, only the information far from the vehicle, out of the three-dimensional position information such as edges stored in the three-dimensional map database 3, is converted, using the evaluation point projection means, into a virtual image as captured from the virtual position and virtual attitude angle.
- For example, a virtual image is created by projecting only three-dimensional position information, such as edges, at a distance of 20 m or more from the initial predicted position calculated in step S13 or the vehicle position calculated in step S16.
- As the virtual position used in step S18, the vehicle position estimated in step S19 one loop before or the initial predicted position obtained in step S13 may be used. Note that projecting extremely distant information takes processing time, and at the resolution of the camera 2 it cannot be distinguished whether such distant edges match. For this reason, in this embodiment, three-dimensional position information such as edges at a distance of 50 m or more is not projected.
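- The distance gating described for steps S15 and S18 can be sketched as a simple filter on the map points before projection. The band limits follow the example values in the text (within 3 m for the near region, 20 m to 50 m for the far region); the helper name is ours.

```python
import numpy as np

def select_band(points, origin, d_min, d_max):
    """Keep 3D map points whose distance from origin lies in [d_min, d_max)."""
    d = np.linalg.norm(points - origin, axis=1)
    return points[(d >= d_min) & (d < d_max)]

# Near region for position estimation (step S15): within 3 m.
# near = select_band(map_points, virtual_position, 0.0, 3.0)
# Far region for posture angle estimation (step S18): 20 m to 50 m.
# far = select_band(map_points, predicted_position, 20.0, 50.0)
```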
- In step S19, the likelihood setting unit 12 compares each virtual image of the region far from the vehicle created in step S18 with the captured image. Specifically, for each virtual image created in step S18, the edges included in the virtual image are compared with the edge image composed of the edges included in the captured image, and the degree of matching between them is calculated. For example, the degree of matching is counted as the number of pixels whose edges match between the far-region virtual image and the edge image, that is, the number of pixel coordinates at which an edge exists in both images. Then, the likelihood setting unit 12 sets the posture angle likelihood higher as the degree of matching between the edges in the virtual image and the edges in the captured image is higher.
- Next, the moving object position/posture estimation unit 13 estimates the posture angle of the vehicle based on the virtual posture angle of the virtual image for which a high posture angle likelihood is set by the likelihood setting unit 12. For example, the moving object position/posture estimation unit 13 calculates, as the actual posture angle of the vehicle, the virtual posture angle of the virtual image having the largest number of matching pixels among the plurality of virtual images created in step S18. Alternatively, it may weight each virtual posture angle according to the posture angle likelihood of its virtual image and calculate the weighted average of the virtual posture angles as the actual posture angle of the vehicle.
- a virtual image of the vehicle far region based on a plurality of virtual posture angles can be acquired, and the posture angle of the vehicle can be estimated using the virtual image of the vehicle far region.
- the ECU 1 can sequentially calculate the position and the attitude angle of the vehicle by repeatedly performing the processing from step S1 to step S19 as described above.
- each of the position and attitude angle of the vehicle is estimated using different virtual images. Specifically, a virtual image of the vehicle vicinity region based on a plurality of virtual positions is acquired, and the position of the vehicle is estimated using the virtual image of the vehicle vicinity region. Then, a virtual image of the vehicle far region based on the plurality of virtual posture angles is acquired, and the posture angle of the vehicle is estimated using the virtual image of the vehicle far region.
- In this way, different likelihoods can be set for the position and the posture angle, and the position and posture angle can be adjusted and estimated separately according to the distance to the location where the real image and the virtual image match. Therefore, the position and posture angle of the moving object can be estimated with high accuracy.
- Also in the second embodiment, when the captured image and the virtual image match at a distance equal to or greater than a predetermined distance, the posture angle likelihood of the virtual image may be set so that its increment is reduced or it becomes a predetermined upper limit value.
- This prevents the posture angle likelihood from being determined solely by the degree of matching in extremely far portions. For this reason, even if there is noise or error in edges extracted from a portion extremely far from the camera 2, the influence can be suppressed, and the estimation error of the posture angle can be reduced.
- Likewise, when the captured image and the virtual image match within a predetermined distance, the position likelihood of the virtual image may be set so that its increment is reduced or it becomes a predetermined upper limit value. This prevents the position likelihood from becoming too large when the two images match within a certain distance from the camera 2. Therefore, even if there is noise or error in edges extracted from a portion extremely close to the camera 2, the influence can be suppressed, and the position estimation error can be reduced.
- In the embodiments described above, a vehicle is taken as an example, but the present invention is applicable to any moving object equipped with at least one camera, such as an aircraft or a ship.
- In the embodiments, the position (front-rear direction, lateral direction, vertical direction) and the posture angle (roll, pitch, yaw) of the vehicle, six degrees of freedom in total, are obtained, but the present invention is not limited to this.
- For example, the embodiments are also applicable when estimating a position (front-rear direction, lateral direction) and a posture angle (yaw) with three degrees of freedom, as for an automated guided vehicle used in a factory, which has no suspension or the like.
- For such a vehicle, the vertical position and the posture angles such as roll and pitch are fixed, so these parameters may be measured in advance or obtained by referring to the 3D map database 3.
- In the embodiments, the virtual image acquisition unit 11, which converts three-dimensional map data into an image as captured from a virtual position and a virtual posture angle to acquire a virtual image, was illustrated as the contrast image acquisition unit that acquires a contrast image viewed from a predetermined position and posture angle.
- However, the contrast image acquisition unit is not limited to this.
- For example, the contrast image acquisition unit may acquire, as the contrast image, a captured image captured by the camera 2 in the past.
- In this case, the moving object position/posture estimation unit 13 may estimate the posture angle of the vehicle based on the posture angle of the vehicle at the time when the contrast image for which the likelihood setting unit 12 set a high posture angle likelihood was captured, and may estimate the position of the vehicle based on the position of the vehicle at the time when the contrast image for which the likelihood setting unit 12 set a high position likelihood was captured.
- With this configuration as well, different likelihoods are set for the position and the posture angle, and the position and posture angle can be adjusted and estimated separately according to the distance to the location where the current captured image and the past captured image (contrast image) match. Therefore, the position and posture angle of the moving object can be estimated with high accuracy.
- According to the present invention, the posture angle likelihood is increased when the captured image matches the contrast image at a far position, and the position likelihood is increased when the captured image matches the contrast image at a near position. Therefore, the posture angle likelihood can be set from matches in the distance, where the position error is large, and the position likelihood can be set from matches in the vicinity, where the posture angle error is large. Thereby, the position and posture angle of the moving object can be estimated with high accuracy, and the present invention has industrial applicability.
Abstract
Description
(First Embodiment)
Referring to FIG. 1, the configuration of a moving object position/posture estimation apparatus according to a first embodiment of the present invention will be described. The moving object position/posture estimation apparatus according to the first embodiment includes an ECU (Engine Control Unit) 1, a camera (an example of an imaging unit) 2, a three-dimensional map database 3, and a vehicle sensor group 4. The vehicle sensor group 4 includes a GPS receiver 41, an accelerator sensor 42, a steering sensor 43, a brake sensor 44, a vehicle speed sensor 45, an acceleration sensor 46, a wheel speed sensor 47, and other sensors 48 such as a yaw rate sensor. Note that the ECU 1 is actually composed of a ROM, a RAM, an arithmetic circuit, and the like. By executing processing in accordance with a computer program for moving object position/posture estimation stored in the ROM, the ECU 1 functions as a virtual image acquisition unit 11 (an example of a comparison image acquisition unit), a likelihood setting unit 12, and a moving object position/posture estimation unit 13.
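Purely to illustrate this functional decomposition (the class and method names are assumptions, not the embodiment's API), the ECU 1 can be pictured as one controller exposing the three units:

```python
class MovingObjectPoseEstimator:
    """Sketch of ECU 1 running the position/posture estimation program."""

    def __init__(self, camera, map_db, sensors):
        self.camera = camera    # camera 2 (imaging unit)
        self.map_db = map_db    # 3D map database 3
        self.sensors = sensors  # vehicle sensor group 4

    def acquire_virtual_image(self, virtual_pose):
        """Virtual image acquisition unit 11: render the 3D map from virtual_pose."""
        raise NotImplementedError

    def set_likelihoods(self, captured_image, virtual_image):
        """Likelihood setting unit 12: compare images, return the likelihoods."""
        raise NotImplementedError

    def estimate_pose(self, candidates, likelihoods):
        """Moving object position/posture estimation unit 13."""
        raise NotImplementedError
```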
(Second Embodiment)
In a second embodiment of the present invention, a moving object position/posture estimation apparatus and a moving object position/posture estimation method in which the position and the posture angle of the vehicle are each estimated using different virtual images will be described.
2 camera (imaging unit)
3 three-dimensional map database
11 virtual image acquisition unit (comparison image acquisition unit)
12 likelihood setting unit
13 moving object position/posture estimation unit
Claims (8)
- A moving object position/posture estimation apparatus for estimating a position and a posture angle of a moving object, comprising:
an imaging unit that captures an image of the surroundings of the moving object to acquire a captured image;
a comparison image acquisition unit that acquires a comparison image as viewed from a predetermined position and posture angle;
a likelihood setting unit that compares the captured image acquired by the imaging unit with the comparison image acquired by the comparison image acquisition unit, sets a posture angle likelihood of the comparison image high when a distant-position pixel in the captured image matches a distant-position pixel in the comparison image, and sets a position likelihood of the comparison image high when a nearby-position pixel in the captured image matches a nearby-position pixel in the comparison image; and
a moving object position/posture estimation unit that estimates the posture angle of the moving object based on the posture angle of the comparison image whose posture angle likelihood is set high by the likelihood setting unit, and estimates the position of the moving object based on the position of the comparison image whose position likelihood is set high by the likelihood setting unit. - The moving object position/posture estimation apparatus according to claim 1, wherein the comparison image acquisition unit is a virtual image acquisition unit that acquires a virtual image by converting three-dimensional map data into an image as captured from a virtual position and a virtual posture angle,
the likelihood setting unit compares the captured image acquired by the imaging unit with the virtual image acquired by the virtual image acquisition unit, sets a posture angle likelihood of the virtual image high when a distant-position pixel in the captured image matches a distant-position pixel in the virtual image, and sets a position likelihood of the virtual image high when a nearby-position pixel in the captured image matches a nearby-position pixel in the virtual image, and
the moving object position/posture estimation unit estimates the posture angle of the moving object based on the virtual posture angle of the virtual image whose posture angle likelihood is set high by the likelihood setting unit, and estimates the position of the moving object based on the virtual position of the virtual image whose position likelihood is set high by the likelihood setting unit. - The moving object position/posture estimation apparatus according to claim 1, wherein the comparison image acquisition unit acquires, as the comparison image, a captured image previously captured by the imaging unit, and
the moving object position/posture estimation unit estimates the posture angle of the moving object based on the posture angle of the moving object at the time the comparison image whose posture angle likelihood is set high by the likelihood setting unit was captured, and estimates the position of the moving object based on the position of the moving object at the time the comparison image whose position likelihood is set high by the likelihood setting unit was captured.
- The moving object position/posture estimation apparatus according to any one of claims 1 to 3, wherein, among the distant-position pixels in the captured image and the comparison image, when a pixel of the captured image and a pixel of the comparison image match at a distance equal to or greater than a predetermined distance, the likelihood setting unit sets the posture angle likelihood of the comparison image such that the increment of the likelihood is reduced or limited to a predetermined upper limit.
- The moving object position/posture estimation apparatus according to any one of claims 1 to 4, wherein, among the nearby-position pixels in the captured image and the comparison image, when a pixel of the captured image and a pixel of the comparison image match at a proximity equal to or closer than a predetermined distance, the likelihood setting unit sets the position likelihood of the comparison image such that the increment of the likelihood is reduced or limited to a predetermined upper limit.
- The moving object position/posture estimation apparatus according to any one of claims 2, 4 and 5, wherein the virtual image acquisition unit sets a plurality of candidate points to each of which a virtual position and a virtual posture angle are assigned, and acquires a virtual image for each candidate point; the likelihood setting unit compares each virtual image with the captured image to set the posture angle likelihood and the position likelihood; the moving object position/posture estimation unit estimates the posture angle of the moving object based on the posture angle likelihoods of the plurality of virtual images, and estimates the position of the moving object based on the position likelihoods of the plurality of virtual images; and
the virtual image acquisition unit resets the plurality of candidate points based on the posture angle likelihoods of the plurality of virtual images and the position likelihoods of the plurality of virtual images. - The moving object position/posture estimation apparatus according to claim 1, wherein the comparison image acquisition unit sets a plurality of positions and acquires, for each of the positions, the comparison image of a region close to the moving object;
the likelihood setting unit compares each comparison image of the region close to the moving object with the captured image, and sets the position likelihood higher as the degree of matching between edges in the comparison image and edges in the captured image is higher;
the moving object position/posture estimation unit estimates the position of the moving object based on the position of the comparison image whose position likelihood is set high by the likelihood setting unit;
the comparison image acquisition unit sets a plurality of posture angles and acquires, for each of the posture angles, the comparison image of a region far from the moving object;
the likelihood setting unit compares each comparison image of the region far from the moving object with the captured image, and sets the posture angle likelihood higher as the degree of matching between edges in the comparison image and edges in the captured image is higher; and
the moving object position/posture estimation unit estimates the posture angle of the moving object based on the posture angle of the comparison image whose posture angle likelihood is set high by the likelihood setting unit. - A moving object position/posture estimation method for estimating a position and a posture angle of a moving object, comprising:
comparing a captured image of the surroundings of the moving object with a comparison image as viewed from a predetermined position and posture angle;
setting a posture angle likelihood of the comparison image high when a distant-position pixel in the captured image matches a distant-position pixel in the comparison image, and setting a position likelihood of the comparison image high when a nearby-position pixel in the captured image matches a nearby-position pixel in the comparison image; and
estimating the posture angle of the moving object based on the posture angle of the comparison image whose posture angle likelihood is set high, and estimating the position of the moving object based on the position of the comparison image whose position likelihood is set high.
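Claim 6 describes a loop in the style of a particle filter: candidate points carrying virtual poses are scored against the captured image, and the candidate set is then reset based on the resulting likelihoods. Below is a hedged sketch under the assumption of likelihood-weighted resampling with Gaussian perturbation; the claim itself does not prescribe this particular resetting rule:

```python
import random

def reset_candidate_points(candidates, likelihoods,
                           pos_spread=0.5, yaw_spread=0.05):
    """candidates: list of (x, y, yaw) virtual poses.
    likelihoods: one matching score per candidate.
    New candidates are drawn near old ones in proportion to their
    likelihood, concentrating the search around well-matching poses.
    """
    total = sum(likelihoods)
    if total <= 0.0:
        return list(candidates)  # no information: keep the old candidates
    weights = [lh / total for lh in likelihoods]
    new_candidates = []
    for _ in range(len(candidates)):
        x, y, yaw = random.choices(candidates, weights=weights, k=1)[0]
        new_candidates.append((x + random.gauss(0.0, pos_spread),
                               y + random.gauss(0.0, pos_spread),
                               yaw + random.gauss(0.0, yaw_spread)))
    return new_candidates
```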
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/383,012 US9797981B2 (en) | 2012-03-06 | 2013-02-28 | Moving-object position/attitude estimation apparatus and moving-object position/attitude estimation method |
JP2014503799A JP5804185B2 (ja) | 2012-03-06 | 2013-02-28 | 移動物体位置姿勢推定装置及び移動物体位置姿勢推定方法 |
EP13758148.4A EP2824425B1 (en) | 2012-03-06 | 2013-02-28 | Moving-object position/attitude estimation apparatus and method for estimating position/attitude of moving object |
CN201380013066.8A CN104204726B (zh) | 2012-03-06 | 2013-02-28 | 移动物体位置姿态估计装置和移动物体位置姿态估计方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-049372 | 2012-03-06 | ||
JP2012049372 | 2012-03-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013133129A1 true WO2013133129A1 (ja) | 2013-09-12 |
Family
ID=49116610
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/055470 WO2013133129A1 (ja) | 2012-03-06 | 2013-02-28 | 移動物体位置姿勢推定装置及び移動物体位置姿勢推定方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9797981B2 (ja) |
EP (1) | EP2824425B1 (ja) |
JP (1) | JP5804185B2 (ja) |
CN (1) | CN104204726B (ja) |
WO (1) | WO2013133129A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160031900A * | 2014-09-15 | 2016-03-23 | Samsung Electronics Co., Ltd. | Image capturing method and image capturing apparatus |
CN105723180A * | 2013-11-13 | 2016-06-29 | Nissan Motor Co., Ltd. | Moving body position estimation device and moving body position estimation method |
CN105809701A * | 2016-03-25 | 2016-07-27 | 成都易瞳科技有限公司 | Panoramic video attitude calibration method |
EP3147629A1 (en) * | 2014-05-20 | 2017-03-29 | Nissan Motor Co., Ltd. | Object detection device and object detection method |
JP2018077162A * | 2016-11-10 | 2018-05-17 | Denso IT Laboratory, Inc. | Vehicle position detection device, vehicle position detection method, and computer program for vehicle position detection |
JP2018194417A * | 2017-05-17 | 2018-12-06 | Soken, Inc. | Position estimation device and moving device |
JP2019168428A * | 2018-03-26 | 2019-10-03 | Koji Hasui | Movement distance measuring device, movement distance measuring method, and movement distance measuring program |
WO2019208101A1 * | 2018-04-27 | 2019-10-31 | Hitachi Automotive Systems, Ltd. | Position estimation device |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150226573A1 (en) * | 2014-02-11 | 2015-08-13 | Qualcomm Incorporated | Pedometer integrated pedestrian positioning |
KR101784183B1 (ko) * | 2014-06-17 | 2017-10-11 | Yujin Robot Co., Ltd. | Apparatus and method for recognizing the position of a mobile robot using ADoG-based feature points |
CN106488143B (zh) * | 2015-08-26 | 2019-08-16 | Liu Jin | Method, system, and imaging device for generating video data and marking objects in a video |
DE202015008708U1 (de) * | 2015-12-18 | 2017-03-21 | GM Global Technology Operations LLC (organized under the laws of the State of Delaware) | Vehicle positioning system |
CN108335329B (zh) * | 2017-12-06 | 2021-09-10 | Tencent Technology (Shenzhen) Co., Ltd. | Position detection method and device applied in an aircraft, and aircraft |
JP7010778B2 (ja) * | 2018-06-29 | 2022-01-26 | Tokai National Higher Education and Research System | Observation position estimation device, estimation method therefor, and program |
CN109003305B (zh) * | 2018-07-18 | 2021-07-20 | 江苏实景信息科技有限公司 | Positioning and attitude determination method and device |
CN108871314B (zh) * | 2018-07-18 | 2021-08-17 | 江苏实景信息科技有限公司 | Positioning and attitude determination method and device |
IL265818A (en) * | 2019-04-02 | 2020-10-28 | Ception Tech Ltd | System and method for determining the position and orientation of an object in space |
JP7204612B2 (ja) * | 2019-08-06 | 2023-01-16 | Toshiba Corporation | Position/attitude estimation device, position/attitude estimation method, and program |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005326168A * | 2004-05-12 | 2005-11-24 | Fuji Photo Film Co Ltd | Driving support system, vehicle, and driving support method |
JP2009199572A | 2008-01-25 | 2009-09-03 | Kazuo Iwane | Three-dimensional machine map, three-dimensional machine map generation device, navigation device, and automatic driving device |
US20110115902A1 (en) * | 2009-11-19 | 2011-05-19 | Qualcomm Incorporated | Orientation determination of a mobile station using side and top view images |
JP2011528819A * | 2008-06-11 | 2011-11-24 | Nokia Corporation | Camera gestures for user interface control |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB8925196D0 (en) * | 1989-11-08 | 1990-05-30 | Smiths Industries Plc | Navigation systems |
US5422828A (en) * | 1991-12-18 | 1995-06-06 | Choate; William C. | Method and system for image-sequence-based target tracking and range estimation |
US5638116A (en) * | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
JP3833786B2 (ja) * | 1997-08-04 | 2006-10-18 | Fuji Heavy Industries Ltd. | Three-dimensional self-position recognition device for a moving body |
JP3052286B2 (ja) * | 1997-08-28 | 2000-06-12 | Director General, Technical Research and Development Institute, Japan Defense Agency | Flight system and simulated visual field forming device for aircraft |
JP2001344597A (ja) * | 2000-05-30 | 2001-12-14 | Fuji Heavy Ind Ltd | Fused visual field device |
US6690883B2 (en) * | 2001-12-14 | 2004-02-10 | Koninklijke Philips Electronics N.V. | Self-annotating camera |
JP3846494B2 (ja) * | 2004-07-13 | 2006-11-15 | Nissan Motor Co., Ltd. | Moving obstacle detection device |
WO2007027847A2 (en) * | 2005-09-01 | 2007-03-08 | Geosim Systems Ltd. | System and method for cost-effective, high-fidelity 3d-modeling of large-scale urban environments |
JP2007316966A (ja) * | 2006-05-26 | 2007-12-06 | Fujitsu Ltd | Mobile robot, control method therefor, and program |
JP4870546B2 (ja) * | 2006-12-27 | 2012-02-08 | Iwane Laboratories, Ltd. | CV tag video display device with layer generation/selection function |
DE102006062061B4 (de) * | 2006-12-29 | 2010-06-10 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Device, method and computer program for determining a position based on a camera image from a camera |
JP5183071B2 (ja) * | 2007-01-22 | 2013-04-17 | Nintendo Co., Ltd. | Display control device and display control program |
US8050458B2 (en) * | 2007-06-18 | 2011-11-01 | Honda Elesys Co., Ltd. | Frontal view imaging and control device installed on movable object |
US7826666B2 (en) * | 2008-02-27 | 2010-11-02 | Honeywell International Inc. | Methods and apparatus for runway segmentation using sensor analysis |
GB0818561D0 (en) * | 2008-10-09 | 2008-11-19 | Isis Innovation | Visual tracking of objects in images, and segmentation of images |
JPWO2012133371A1 (ja) * | 2011-03-28 | 2014-07-28 | NEC Corporation | Imaging position and imaging direction estimation device, imaging device, imaging position and imaging direction estimation method, and program |
- 2013
- 2013-02-28 CN CN201380013066.8A patent/CN104204726B/zh active Active
- 2013-02-28 JP JP2014503799A patent/JP5804185B2/ja active Active
- 2013-02-28 WO PCT/JP2013/055470 patent/WO2013133129A1/ja active Application Filing
- 2013-02-28 US US14/383,012 patent/US9797981B2/en active Active
- 2013-02-28 EP EP13758148.4A patent/EP2824425B1/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005326168A * | 2004-05-12 | 2005-11-24 | Fuji Photo Film Co Ltd | Driving support system, vehicle, and driving support method |
JP2009199572A | 2008-01-25 | 2009-09-03 | Kazuo Iwane | Three-dimensional machine map, three-dimensional machine map generation device, navigation device, and automatic driving device |
JP2011528819A * | 2008-06-11 | 2011-11-24 | Nokia Corporation | Camera gestures for user interface control |
US20110115902A1 (en) * | 2009-11-19 | 2011-05-19 | Qualcomm Incorporated | Orientation determination of a mobile station using side and top view images |
Non-Patent Citations (1)
Title |
---|
See also references of EP2824425A4 |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105723180A (zh) * | 2013-11-13 | 2016-06-29 | Nissan Motor Co., Ltd. | Moving body position estimation device and moving body position estimation method |
CN105723180B (zh) * | 2013-11-13 | 2017-08-15 | Nissan Motor Co., Ltd. | Moving body position estimation device and moving body position estimation method |
EP3147629A1 (en) * | 2014-05-20 | 2017-03-29 | Nissan Motor Co., Ltd. | Object detection device and object detection method |
EP3147629A4 (en) * | 2014-05-20 | 2017-04-05 | Nissan Motor Co., Ltd | Object detection device and object detection method |
US10477093B2 (en) | 2014-09-15 | 2019-11-12 | Samsung Electronics Co., Ltd. | Method for capturing image and image capturing apparatus for capturing still images of an object at a desired time point |
WO2016043423A1 (en) * | 2014-09-15 | 2016-03-24 | Samsung Electronics Co., Ltd. | Method for capturing image and image capturing apparatus |
CN106605236A (zh) * | 2014-09-15 | 2017-04-26 | Samsung Electronics Co., Ltd. | Method for capturing image and image capturing apparatus |
KR20160031900A (ko) * | 2014-09-15 | 2016-03-23 | Samsung Electronics Co., Ltd. | Image capturing method and image capturing apparatus |
KR102232517B1 (ko) * | 2014-09-15 | 2021-03-26 | Samsung Electronics Co., Ltd. | Image capturing method and image capturing apparatus |
CN105809701A (zh) * | 2016-03-25 | 2016-07-27 | 成都易瞳科技有限公司 | Panoramic video attitude calibration method |
JP2018077162A (ja) * | 2016-11-10 | 2018-05-17 | Denso IT Laboratory, Inc. | Vehicle position detection device, vehicle position detection method, and computer program for vehicle position detection |
JP2018194417A (ja) * | 2017-05-17 | 2018-12-06 | Soken, Inc. | Position estimation device and moving device |
JP2019168428A (ja) * | 2018-03-26 | 2019-10-03 | Koji Hasui | Movement distance measuring device, movement distance measuring method, and movement distance measuring program |
WO2019208101A1 (ja) * | 2018-04-27 | 2019-10-31 | Hitachi Automotive Systems, Ltd. | Position estimation device |
JP2019191133A (ja) * | 2018-04-27 | 2019-10-31 | Hitachi Automotive Systems, Ltd. | Position estimation device |
JP7190261B2 (ja) | 2018-04-27 | 2022-12-15 | Hitachi Astemo, Ltd. | Position estimation device |
US11538241B2 (en) | 2018-04-27 | 2022-12-27 | Hitachi Astemo, Ltd. | Position estimating device |
Also Published As
Publication number | Publication date |
---|---|
CN104204726A (zh) | 2014-12-10 |
EP2824425A4 (en) | 2015-05-27 |
CN104204726B (zh) | 2017-05-31 |
EP2824425A1 (en) | 2015-01-14 |
EP2824425B1 (en) | 2017-05-17 |
JPWO2013133129A1 (ja) | 2015-07-30 |
US9797981B2 (en) | 2017-10-24 |
US20150015702A1 (en) | 2015-01-15 |
JP5804185B2 (ja) | 2015-11-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5804185B2 (ja) | Moving object position/attitude estimation apparatus and moving object position/attitude estimation method | |
EP3176013B1 (en) | Predictive suspension control for a vehicle using a stereo camera sensor | |
JP4814669B2 (ja) | Three-dimensional coordinate acquisition device | |
JP5966747B2 (ja) | Vehicle travel control device and method | |
JP4702569B2 (ja) | Image processing device for vehicle | |
JP6233345B2 (ja) | Road surface gradient detection device | |
JP5867176B2 (ja) | Moving object position/attitude estimation device and method | |
EP3032818B1 (en) | Image processing device | |
JP4943034B2 (ja) | Stereo image processing device | |
WO2017051480A1 (ja) | Image processing device and image processing method | |
JP5310027B2 (ja) | Lane recognition device and lane recognition method | |
US10614321B2 (en) | Travel lane detection method and travel lane detection device | |
JP6044084B2 (ja) | Moving object position/attitude estimation device and method | |
JP7145770B2 (ja) | Inter-vehicle distance measurement device, error model generation device, learning model generation device, and methods and programs therefor | |
JP2009139325A (ja) | Road surface detection device for vehicle | |
JP2012517055A (ja) | Method and device for operating a video-based driver assistance system in a vehicle | |
JP5891802B2 (ja) | Vehicle position calculation device | |
CN107248171B (zh) | Triangulation-based scale recovery method for monocular visual odometry | |
JP5760523B2 (ja) | Track estimation device and program | |
JP5903901B2 (ja) | Vehicle position calculation device | |
JP5330341B2 (ja) | Distance measurement device using an in-vehicle camera | |
JP2004038760A (ja) | Travel path recognition device for vehicle | |
JP6704307B2 (ja) | Movement amount calculation device and movement amount calculation method | |
JP4876676B2 (ja) | Position measurement device, method, and program, and movement amount detection device, method, and program | |
JP2022180672A (ja) | External environment recognition device, external environment recognition method, and external environment recognition program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13758148 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2013758148 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013758148 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2014503799 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14383012 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |