WO2005038710A1 - Moving body motion calculation method and device, and navigation system
- Publication number
- WO2005038710A1 (PCT/JP2004/015677; JP2004015677W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- motion
- movement
- moving body
- corresponding points
- plane
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/579—Depth or shape recovery from multiple images from motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/302—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
- B60R2300/305—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/307—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8086—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for vehicle path indication
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
Definitions
- the present invention relates to a method and an apparatus for calculating a movement of a moving body, and a navigation system.
- the present invention relates to a technique for calculating the movement of a moving body, and more particularly to a technique for calculating the movement of the moving body in a three-dimensional space using an image captured by a camera installed on the moving body.
- The main methods for calculating the movement of a moving body such as a vehicle or a self-propelled robot are methods using sensors that detect wheel rotation, steered-wheel angle, acceleration, or angular acceleration, and methods using images of the surroundings of the moving body taken by cameras installed on it.
- The method using sensors that detect wheel rotation and steered-wheel angle has the advantage of stability, because under favorable conditions such as a flat, dry road surface the error is small and detection failures are rare. However, when slip occurs between the road surface and the wheels, the error of the calculated movement becomes large in principle.
- With methods using sensors that detect acceleration or angular acceleration, the movement of the moving body can in principle be calculated even when the road surface and wheels slip or when the road surface is not flat.
- An optical flow is a vector connecting the points on two images, taken at different times, that correspond to the same point on the subject (corresponding points). Since a geometric constraint equation holds between corresponding points and the camera motion, the camera motion can be calculated when multiple corresponding points satisfying certain conditions are available.
- Non-Patent Document 1 proposes the so-called eight-point method, in which the camera motion is calculated from a set of eight or more stationary corresponding points between two images.
- With the eight-point method, however, since the computed corresponding points contain errors, it is difficult to extract eight or more corresponding points with small errors. Moreover, it is qualitatively known that the error of the camera motion may be large even when the errors of the corresponding points are relatively small.
- Non-Patent Document 2 and Patent Document 1 propose a method of calculating the camera motion and the plane equation from four or more corresponding points, assuming that all input corresponding points lie on the same plane.
- This method is an example of calculating the camera motion from fewer than eight corresponding points by assuming that the corresponding points satisfy a certain constraint. Using fewer corresponding points means that, when each corresponding point contains an error with a certain probability, the calculated camera motion has a lower probability of being affected by an erroneous point. Furthermore, when more than four corresponding points are used, the camera motion is calculated so as to minimize the squared error, so the influence of measurement errors contained in the corresponding points can be reduced.
- A method has also been proposed in which a camera motion with a small error is calculated by applying a Kalman filter that reflects the vehicle behavior and the camera arrangement to a camera motion calculated from corresponding points selected on the road surface plane. By applying such a filtering process to the calculated camera motion, errors in the camera motion calculation can be reduced.
- In another method, the camera movement is limited to three parameters, translation and rotation on a plane, and the camera motion is calculated by a search method using the geometric constraint equation between the corresponding points and the camera movement together with an optical constraint equation on the pixel values of small regions near each corresponding point.
- In Non-Patent Document 4, among the unknown parameters involved in calculating camera motion from corresponding points, rotation and translation are calculated in two separate stages; by reducing the number of unknown parameters handled in each stage, errors in the camera motion calculation are reduced.
- Patent Document 1: International Publication No. WO 97/35161 pamphlet
- Patent Document 2: Japanese Patent Application Laid-Open No. 2000-2006-160
- Patent Document 3: Japanese Patent Application Publication No. 2003-5101527
- Non-Patent Document 1: H. C. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," Nature, 293:133-135, Sept. 1981
- Non-Patent Document 2: Kenichi Kanatani, "Image Understanding," Morikita Publishing, 1990
- Non-Patent Document 3: "A Robust Method for Computing Vehicle Ego-motion"
- Non-Patent Document 4: "Recovery of Ego-Motion Using Image Stabilization"
- Kanatani's method can calculate the camera motion from four or more corresponding points on a plane, but it requires extracting, from the many corresponding points in the image, stationary corresponding points that lie on the same plane.
- the filtering process is effective as a method for reducing the effect of errors included in the camera motion calculation result.
- However, the filtering is effective only when the camera movement and the error sources follow the model assumed in advance. For example, when the vehicle runs at low speed, the effects of unpredictable disturbances such as bumps and depressions in the road surface increase, and it is difficult to design a filter that covers them. For this reason, there is a problem that the error increases when the filter parameters are not appropriate.
- Since the method of Non-Patent Document 4 separates rotation and translation into two stages, when both rotation and translation components are present to an extent that cannot be ignored, the first stage may fail to estimate its component correctly.
- In view of the above, the present invention provides a technique for obtaining the movement of a moving body on which a camera is installed from the camera images using corresponding points (optical flow), and its object is to make it possible to accurately calculate the movement of the moving body even when the corresponding points include large errors.
- Specifically, the present invention provides a method in which, when the motion of a moving body on which a camera is installed is obtained using images of the body's surroundings taken by the camera, a plurality of corresponding points are obtained from two camera images having different shooting times; assuming a predetermined plane in the image, a first motion of the moving body is obtained using the obtained corresponding points; and a second motion of the moving body is then obtained using the first motion and the corresponding points.
- By assuming a predetermined plane when obtaining the first motion and imposing the constraint that some corresponding points lie on that plane, the number of unknown parameters can be reduced, and a rough first motion with respect to the plane, largely free of gross error, can be obtained. Then, by using this first motion, a second motion with a small error can be obtained even when, for example, the plane is actually inclined.
- Preferably, the first motion is calculated by selecting q (q is an integer of 2 or more) sets of partial corresponding points, each composed of m (m is an integer of 3 or more) corresponding points, from the plurality of corresponding points; calculating a motion candidate of the moving body from each of the selected q sets using a plane equation representing the plane; evaluating the calculated q motion candidates by a predetermined evaluation method; and specifying the first motion according to the evaluation result.
- In this way, the first motion of the moving body can be calculated with a small error, without being affected by large errors such as outliers included in the corresponding points.
- As described above, the motion calculation is performed in two stages: a first motion is calculated assuming a plane, and a second motion is then calculated using the first motion. Even if the road surface is inclined, the motion of the moving body can therefore be calculated with high accuracy.
- FIG. 1 is a diagram showing a situation in the first embodiment of the present invention.
- FIG. 2 is a block diagram showing a configuration according to the first embodiment of the present invention.
- FIG. 3 is a block diagram showing a detailed configuration of the first motion calculation unit in FIG. 2.
- FIGS. 4(a) and 4(b) are examples of images taken in the situation of FIG. 1, and FIG. 4(c) is a diagram showing corresponding points.
- FIG. 5 is a diagram showing the relationship between camera movement and image change.
- FIG. 6 is a graph conceptually showing the distribution of the evaluation value Eji of the motion Mj for each corresponding point.
- FIG. 7 is an example of a composite image on which a predicted trajectory corresponding to the calculated motion is superimposed.
- Fig. 8 (a) is an example of an actual image from the camera installed in the vehicle, and
- Fig. 8 (b) is a graph showing the relationship between the moving speed of the vehicle and the corresponding point accuracy rate in the situation of Fig. 8 (a).
- Fig. 9 (a) is an image example when the vehicle is tilted
- Fig. 9 (b) is the distribution of the evaluation value Eji of the motion Mj for each corresponding point in the situation of Fig. 9 (a).
- FIG. 10 (a) shows a situation where another vehicle is present behind the vehicle
- FIG. 10 (b) is an example of an image taken in the situation of FIG. 10 (a).
- FIG. 11 is a graph showing the relationship between the ratio (R/Rtrue) of the stationary corresponding point ratio R to the true stationary corresponding point ratio Rtrue and the error EM of the camera motion.
- Figures 12 (a) and (b) are diagrams conceptually showing a search using planar parameters.
- FIG. 13 shows an example of a composite image on which a caution message or the like is superimposed.
- Figure 14 is an example of a captured image in an indoor parking lot.
- FIG. 15 is a diagram showing a representation of the movement of a moving body based on the Ackermann model.
- FIG. 16 is a configuration diagram of a navigation system according to the second embodiment of the present invention.
- Figure 17 is a graph showing the trajectory of the vehicle obtained by experiments in an indoor parking lot.
- FIG. 18 is a diagram showing another configuration for implementing the present invention.
BEST MODE FOR CARRYING OUT THE INVENTION
- According to a first aspect of the present invention, in a method for obtaining the motion of a moving body on which a camera is installed using images of the body's surroundings photographed by the camera, a plurality of corresponding points are obtained from two images having different photographing times, a first motion is obtained assuming a predetermined plane, and a second motion is obtained using the first motion and the plurality of corresponding points.
- There is provided the moving body motion calculation method according to the first aspect, wherein the plane is a road surface, a ceiling surface, or a wall surface.
- There is also provided a moving body motion calculation method in which the first motion is obtained using three corresponding points.
- the first motion calculating step includes a step of updating a plane formula that defines the plane using the first motion or the second motion obtained in the past.
- Preferably, the first motion calculation step includes: a first step of selecting, from the plurality of corresponding points, q (q is an integer of 2 or more) sets of partial corresponding points each composed of m (m is an integer of 3 or more) corresponding points; a second step of calculating a motion candidate of the moving body from each set of partial corresponding points selected in the first step, using a plane equation representing the plane; and a third step of evaluating the q motion candidates calculated in the second step by a predetermined evaluation method and specifying the first motion according to the evaluation result.
- Preferably, the first motion calculation step includes calculating, using the plurality of corresponding points and the first or second motion obtained in the past, a stationary corresponding point ratio, which is the ratio of stationary corresponding points, and the third step performs the evaluation for specifying the first motion using the calculated stationary corresponding point ratio.
- There is also provided a moving body motion calculation method according to a fifth aspect, in which the first motion calculation step includes calculating a predicted value of the movement of the moving body from the first or second motion obtained in the past, and the third step specifies the first motion in consideration of the predicted value.
- There is also provided a moving body motion calculation method according to the first aspect, in which the second motion is obtained by a search method that evaluates the plurality of corresponding points using the first motion as an initial value. Preferably, the second motion calculation step performs this evaluation using the stationary corresponding point ratio, which is the ratio of stationary corresponding points.
- There is also provided a moving body motion calculation method according to a fifth aspect, in which the moving body is a vehicle, and the second motion calculation step uses as initial values, instead of the first motion itself, the partial corresponding points and the plane equation corresponding to the first motion, and obtains the second motion by a search method that evaluates the plurality of corresponding points while tilting the plane about the axle center close to the camera.
- There is also provided a moving body motion calculation method according to the first aspect, in which the moving body has non-steered wheels whose axle direction is fixed; in the first and second motion calculation steps, the vertical axis of the coordinate system representing the motion of the moving body is set so as to be orthogonal to the straight line including the axle of the non-steered wheels, and the motion of the moving body on the road surface is obtained as an arc motion about the vertical axis. There is further provided the moving body motion calculation method according to this eleventh aspect, wherein the vertical axis passes through the center position of the non-steered wheels.
- The present invention also provides a device for obtaining the motion of a moving body on which a camera is installed, using images of the body's surroundings: the device obtains a plurality of corresponding points from two images having different shooting times, and includes a first motion calculation unit that obtains a first motion assuming a predetermined plane and a second motion calculation unit that obtains a second motion indicating the motion of the moving body.
- The present invention further provides a navigation system including a navigation device having a position information acquisition unit for acquiring position information of a moving body, and the moving body motion calculation device of the thirteenth aspect, wherein the current position of the moving body is obtained based on the position information acquired by the position information acquisition unit and the movement of the moving body obtained by the moving body motion calculation device.
- a vehicle provided with a camera is taken as an example of a moving object, and the motion of the vehicle is obtained using an image behind the vehicle captured by the camera.
- FIG. 1 is a diagram showing a situation in the present embodiment.
- A vehicle 1 as a moving body is provided with a camera 120 at its rear portion so as to photograph the area behind it.
- the camera 120 shoots the area around the rear of the vehicle and outputs a series of image sequences.
- FIG. 2 is a block diagram showing a configuration including the moving object motion calculation device according to the present embodiment.
- Reference numeral 100 denotes a moving body motion calculation device that obtains the movement of the vehicle 1 from images captured by the camera 120; 121 denotes an image synthesizing device that generates a synthesized image in which information based on the calculated motion is superimposed on the image captured by the camera 120; and 122 denotes a display that shows the synthesized image generated by the image synthesizing device 121.
- The moving body motion calculation device 100 includes: a corresponding point calculation unit 101 that obtains a plurality of corresponding points (optical flow vectors) between two consecutive images output from the camera 120; a first motion calculation unit 102 that, assuming a plane, obtains a first motion Ma indicating the motion of the vehicle 1 using the corresponding points output from the corresponding point calculation unit 101; and a second motion calculation unit 103 that obtains a second motion Mb indicating the motion of the vehicle 1 using the corresponding points output from the corresponding point calculation unit 101 and the first motion Ma output from the first motion calculation unit 102.
- The moving body motion calculation device 100, the image synthesizing device 121, and the display 122 in FIG. 2 are installed in the vehicle 1, for example, but may also be provided in a different place from the vehicle 1.
- FIG. 3 is a block diagram showing a detailed configuration of the first motion calculation unit 102 in the moving body motion calculation device 100 of FIG. 2.
- Reference numeral 111 denotes a partial corresponding point selection unit that selects a plurality of sets of partial corresponding points, each consisting of a predetermined number of corresponding points, from the corresponding points output from the corresponding point calculation unit 101; 112 denotes a motion candidate calculation unit that calculates a motion candidate of the vehicle 1 from each set of partial corresponding points selected by the partial corresponding point selection unit 111; 113 denotes a motion evaluation selection unit that evaluates the motion candidates calculated by the motion candidate calculation unit 112 by a predetermined evaluation method and specifies the first motion Ma according to the evaluation result; 114 denotes a plane equation calculation unit that outputs a plane equation representing the predetermined plane assumed when calculating the first motion Ma; 115 denotes a stationary corresponding point ratio calculation unit that outputs the stationary corresponding point ratio R, which is the ratio of stationary corresponding points; and 116 denotes a moving body motion prediction unit that calculates a predicted value of the moving body motion.
- The world coordinate system (Xw, Yw, Zw) is stationary in three-dimensional space, while the vehicle coordinate system (Xc, Yc, Zc) and the viewpoint coordinate system (X, Y, Z) are fixed to the vehicle 1. The relationship between coordinate values (x, y, z) in the viewpoint coordinate system and (xc, yc, zc) in the vehicle coordinate system can be expressed as (Equation 1) using a matrix C representing the positional relationship between the coordinate systems.
- FIGS. 4(a) and 4(b) show an example of the image sequence output from the camera 120 when the vehicle 1 moves backward in the situation of FIG. 1; FIG. 4(a) is the image captured at time t-1, and FIG. 4(b) is the image captured at time t.
- FIG. 4(c) is a diagram showing the corresponding points (arrows in the figure), i.e., the movements of the same points between the two images at time t-1 and time t.
- FIG. 5 is a diagram showing a relationship between a three-dimensional movement of the camera 120 and an image change.
- When the camera moves with motion M, a stationary point X in the world coordinate system moves, in the viewpoint coordinate system, from coordinates (x, y, z) to (x', y', z'). (Equation 3) is established between the coordinate values (x, y, z), (x', y', z') and the camera motion M.
- 5 (b) and 5 (c) are images at time t-1 and t, respectively.
- Suppose that at time t-1 the point X at viewpoint coordinates (x, y, z) appears at image coordinates (u, v), and at time t the point X at viewpoint coordinates (x', y', z') appears at image coordinates (u', v'). Then (Equation 5) holds.
- f indicates the focal length of the camera.
- From these relations, (Equation 9) is obtained. Since two equations of (Equation 9) hold for each set of corresponding points, if the plane equation and the focal length f of the camera 120 are known and three or more sets of independent (not collinear) corresponding points (u, v), (u', v') are given, the six variables of the motion M (tx, ty, tz, wx, wy, wz) of the camera 120 can be calculated.
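- The text of (Equation 5), (Equation 6) and (Equation 9) is not reproduced on this page. The following is a hedged reconstruction of the standard small-motion planar constraint consistent with the description above; the sign conventions and the exact form of the plane equation are assumptions, not taken from the source.

```latex
% Hedged reconstruction, not copied from the patent.
% Projection with focal length f ((Equation 5) analogue):
%   u = f x / z, \quad v = f y / z.
% Plane with parameters S = (a, b, c) ((Equation 6) analogue):
\[ a x + b y + c z = 1 \quad\Longrightarrow\quad \frac{1}{z} = \frac{a u + b v}{f} + c. \]
% Small camera motion M = (t_x, t_y, t_z, w_x, w_y, w_z)
% ((Equation 9) analogue, the classical instantaneous motion field):
\begin{align*}
u' - u &= \frac{u\,t_z - f\,t_x}{z} + \frac{u v}{f}\,w_x - \Bigl(f + \frac{u^2}{f}\Bigr) w_y + v\,w_z,\\
v' - v &= \frac{v\,t_z - f\,t_y}{z} + \Bigl(f + \frac{v^2}{f}\Bigr) w_x - \frac{u v}{f}\,w_y - u\,w_z.
\end{align*}
% Substituting 1/z from the plane equation leaves two linear equations per
% corresponding point in the six unknowns of M, which is why three
% non-collinear corresponding points on the plane suffice.
```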
- The vehicle coordinate system (Xc, Yc, Zc) and the viewpoint coordinate system (X, Y, Z) are assumed to be arranged as shown in FIG. 1, and the matrix C representing the positional relationship between them is measured in advance.
- The focal length f of the camera 120 is also obtained in advance; its value is the focal length in the real coordinate system multiplied by a conversion coefficient between real coordinates and image coordinates.
- The corresponding point calculation unit 101 temporarily stores the image sequence output from the camera 120 and detects n corresponding points (n is an integer of 2 or more) from two consecutive images in the sequence. The number n of corresponding points is a predetermined value.
- the two images for detecting the corresponding points do not necessarily have to be continuous, as long as the imaging times are different.
- The corresponding points may be detected by, for example, the corresponding point search method of Kanade, Lucas and Tomasi (e.g., "Detection and Tracking of Point Features", Carlo Tomasi and Takeo Kanade, Carnegie Mellon University Technical Report, CMU-CS-91-132, April 1991). Since this method is widely known, its detailed description is omitted here. An illustrative sketch is given below.
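- As an illustrative sketch only (not part of the patent), the corresponding point search of unit 101 can be approximated with the widely available pyramidal Lucas-Kanade tracker; OpenCV's implementation is used here, and all parameter values are assumptions.

```python
# Hedged sketch of corresponding-point detection between two frames using the
# Kanade-Lucas-Tomasi tracker (OpenCV). Parameter values are illustrative
# assumptions, not values from the patent.
import cv2
import numpy as np

def find_corresponding_points(img_prev, img_curr, n=200):
    """Return two arrays of matched image coordinates (time t-1 and t)."""
    gray_prev = cv2.cvtColor(img_prev, cv2.COLOR_BGR2GRAY)
    gray_curr = cv2.cvtColor(img_curr, cv2.COLOR_BGR2GRAY)
    # Select up to n trackable features in the earlier frame.
    pts_prev = cv2.goodFeaturesToTrack(gray_prev, maxCorners=n,
                                       qualityLevel=0.01, minDistance=7)
    # Track them into the later frame with pyramidal Lucas-Kanade.
    pts_curr, status, _err = cv2.calcOpticalFlowPyrLK(
        gray_prev, gray_curr, pts_prev, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1          # keep only successfully tracked points
    return pts_prev[ok].reshape(-1, 2), pts_curr[ok].reshape(-1, 2)
```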
- the plane equation calculation unit 114 outputs a predetermined plane equation.
- The plane equation used here approximates the road surface in the situation of FIG. 1 as a plane; the plane equation calculation unit 114 outputs the previously measured plane parameters S(a, b, c) of (Equation 6).
- the plane equation calculation unit 114 also has a function of updating the plane equation. This will be described later.
- the stationary corresponding point ratio calculation unit 115 outputs the stationary corresponding point ratio R.
- The stationary corresponding point ratio represents the proportion, among the n corresponding points calculated by the corresponding point calculation unit 101, of corresponding points that are stationary with respect to the world coordinates and whose correspondence is correct.
- Here, "the correspondence is correct" means that the error of the coordinate values of the corresponding point is at most a predetermined threshold. Note that it is extremely difficult to obtain the true stationary corresponding point ratio accurately in advance in real-time online processing; in practice, the stationary corresponding point ratio R is a predicted value of this proportion.
- the stationary corresponding point ratio calculation unit 115 outputs a predetermined value, for example, 0.5 as the stationary corresponding point ratio R.
- A stationary corresponding point ratio R = 0.5 means that it is predicted that the n corresponding points calculated by the corresponding point calculation unit 101 include 50% stationary corresponding points with correct correspondences. As the predetermined value, for example, the stationary corresponding point ratio may be measured a plurality of times in advance in a running state similar to that of FIG. 1, and the lowest measured value may be used.
- The stationary corresponding point ratio calculation unit 115 also has a function of newly calculating the stationary corresponding point ratio R. This will be described later.
- the moving body motion prediction unit 116 has a function of obtaining a predicted value of the motion of the vehicle 1, but here, it is assumed that no particular processing is performed.
- Using the n corresponding points Pi output from the corresponding point calculation unit 101, the plane equation S output from the plane equation calculation unit 114, and the stationary corresponding point ratio R output from the stationary corresponding point ratio calculation unit 115, the partial corresponding point selection unit 111 selects q sets of partial corresponding points, each composed of m corresponding points.
- the number m of corresponding points forming the partial corresponding points is the number of corresponding points required to calculate the movement of the moving object.
- Here, m = 3 is used, although m can be greater than 3.
- the partial corresponding point selection unit 111 calculates the number q of partial corresponding point sets.
- Here, Rr is the road-surface stationary corresponding point ratio and z is the guarantee rate.
- The road-surface stationary corresponding point ratio Rr is the probability that a corresponding point selected as part of a partial corresponding point set lies on the road surface, is stationary with respect to the world coordinates, and has a correct correspondence. If there is no correlation between the probability Rs that a corresponding point lies on the road surface and the stationary corresponding point ratio R, then Rr is the product of Rs and R. If corresponding points lying on the road surface can be selected accurately, the road-surface stationary corresponding point ratio of the selected points is equal to their stationary corresponding point ratio R.
- Here, Rr = 0.3 is used as the predetermined value. As this value, for example, the probability Rs that a selected corresponding point lies on the road surface may be measured in a running state similar to that of FIG. 1 and multiplied by the stationary corresponding point ratio R.
- The guarantee rate z is the rate at which it is statistically guaranteed that, among the q sets of partial corresponding points, there is at least one set in which all m corresponding points are correct stationary corresponding points on the road surface. Here, z = 0.999 is used. (A sketch of computing q from Rr, m and z is given below.)
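- The patent's own expression for q is not reproduced on this page; the standard RANSAC-style trial count is consistent with the description and is sketched below under that assumption.

```python
# Hedged sketch: number q of partial corresponding point sets needed so that,
# with guarantee rate z, at least one set consists entirely of correct
# stationary corresponding points on the road surface. This is the standard
# RANSAC trial-count formula; the patent's own expression is not shown in
# this text, so the formula is an assumption consistent with the description.
import math

def number_of_sets(Rr, m, z):
    p_good = Rr ** m                     # one set is all-inlier w.p. Rr^m
    return math.ceil(math.log(1.0 - z) / math.log(1.0 - p_good))

# With the example values from the text (Rr = 0.3, m = 3, z = 0.999):
print(number_of_sets(0.3, 3, 0.999))     # -> 253
```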
- Using the calculated number of sets q, the partial corresponding point selection unit 111 selects q sets of partial corresponding points, each consisting of m different corresponding points, from the n corresponding points, and outputs them. In this selection, corresponding points may be chosen at random from all n corresponding points, or corresponding points satisfying a specific condition may first be extracted and then chosen at random.
- Here, the latter is used. That is, after extracting the corresponding points in the image area where the plane of equation S can exist, q sets of partial corresponding points, each consisting of m different corresponding points, are randomly selected from the extracted points and output. For example, in FIG. 4(d), the area where the plane S can exist in the image is limited to the shaded area below the horizon.
- The road-surface stationary corresponding point ratio Rr obtained when the partial corresponding points are selected after extracting the corresponding points in the shaded area is higher than when selecting randomly from all n corresponding points; as a result, the number q of partial corresponding point sets can be made smaller.
- the motion candidate calculation unit 112 calculates q motion candidates from the q sets of partial corresponding points selected by the partial corresponding point selection unit 111 using the plane equation S.
- If a corresponding point Pi (u, v, u', v') in the image lies on the plane S(a, b, c) and the camera motion M is assumed to be small, (Equation 9) holds. In (Equation 9), if the corresponding points and the plane equation are known and the camera motion M is unknown, M can be calculated when there are three or more corresponding points.
- The motion evaluation selection unit 113 evaluates the q motion candidates calculated by the motion candidate calculation unit 112, using the n corresponding points Pi output from the corresponding point calculation unit 101 and the stationary corresponding point ratio R output from the stationary corresponding point ratio calculation unit 115. If necessary, the motion prediction value output from the moving body motion prediction unit 116 may also be used; this will be described later.
- (Equation 11) gives four independent linear equations for three unknowns; since the corresponding point Pi (ui, vi, ui', vi') calculated from the image contains errors, the three-dimensional coordinates (xi, yi, zi) are calculated using the least squares method.
- For each motion candidate Mj, the (n×R)-th smallest of the evaluation values Eji over the n corresponding points is taken as the evaluation value Ej, and the camera motion with the smallest Ej is selected as the first motion Ma.
- In FIG. 6, the evaluation values Eji are sorted in ascending order; the horizontal axis represents the corresponding point numbers (1, ..., n) after sorting, and the vertical axis represents the evaluation value Eji of the camera motion Mj for each corresponding point.
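- As a minimal sketch (not from the patent text) of the selection rule just described: each candidate Mj is scored by the (n×R)-th smallest of its per-point evaluation values, and the best-scoring candidate becomes Ma. The function point_error below is a hypothetical stand-in for the evaluation of (Equation 11)/(Equation 12), whose exact form is not reproduced here.

```python
# Hedged sketch of the motion evaluation selection unit 113: each motion
# candidate Mj is scored by the (n*R)-th smallest of its per-point evaluation
# values Eji, and the candidate with the smallest score becomes Ma.
# `point_error` is a placeholder for the patent's (Equation 11)/(Equation 12)
# evaluation, which is not reproduced in this text.
import numpy as np

def select_first_motion(candidates, points, R, point_error):
    n = len(points)
    k = max(0, int(n * R) - 1)            # index of the (n*R)-th smallest value
    best_motion, best_score = None, float("inf")
    for Mj in candidates:
        Eji = np.sort([point_error(Mj, p) for p in points])
        Ej = Eji[k]                        # quantile score, robust to outliers
        if Ej < best_score:
            best_motion, best_score = Mj, Ej
    return best_motion
```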
- As described above, the first motion calculation unit 102 calculates and outputs the first motion Ma, which is the motion of the vehicle 1, using the n corresponding points Pi between the images at time t-1 and time t.
- Next, the second motion calculation unit 103 calculates a second motion Mb indicating the motion of the vehicle 1, using the first motion Ma output from the first motion calculation unit 102 and the n corresponding points Pi output from the corresponding point calculation unit 101.
- the second motion Mb is calculated by a search method using the first motion Ma as an initial value.
- The first motion Ma (tx, ty, tz, wx, wy, wz) is set as the initial value, and a predetermined minute motion (dtx, dty, dtz, dwx, dwy, dwz) is added to or subtracted from its parameters to obtain a plurality of motion candidates Mbk. Each candidate is evaluated against the n corresponding points in the same manner as described above, the best candidate is kept, and the process is repeated; when the evaluation converges or a predetermined number of iterations is reached, the second motion calculation unit 103 outputs the motion candidate Mbk selected at that time as the second motion Mb. (A sketch of this search is given below.)
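- A minimal sketch of this local search, under the same assumptions as the previous sketch (the evaluate score being, e.g., the (n×R)-th smallest per-point error); step sizes and the iteration limit are illustrative assumptions.

```python
# Hedged sketch of the second motion calculation unit 103: starting from Ma,
# perturb each of the six motion parameters by +/- a small step, keep the best
# neighbour, and repeat until no improvement or an iteration limit is reached.
import numpy as np

def refine_motion(Ma, evaluate, step=None, max_iter=50):
    M = np.asarray(Ma, dtype=float)       # (tx, ty, tz, wx, wy, wz)
    d = np.asarray(step if step is not None else [0.01] * 6)
    best = evaluate(M)
    for _ in range(max_iter):
        improved = False
        for i in range(6):
            for sign in (+1.0, -1.0):
                cand = M.copy()
                cand[i] += sign * d[i]     # one of the perturbed candidates Mbk
                score = evaluate(cand)
                if score < best:
                    M, best, improved = cand, score, True
        if not improved:
            break                          # converged: no neighbour is better
    return M                               # second motion Mb
```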
- the moving body motion calculation apparatus 100 calculates the motion of the vehicle 1 between the time t-1 and the time t. Further, by repeating the same processing for the image sequence output from the camera 120, the movement of the vehicle 1 can be continuously calculated.
- The image synthesizing device 121 receives the image captured by the camera 120 and the second motion Mb output from the moving body motion calculation device 100, and generates a composite image on which information based on the motion is superimposed.
- The display 122 displays the composite image generated by the image synthesizing device 121.
- FIG. 7 is an example of the displayed composite image, in which a predicted trajectory TR corresponding to the second movement Mb of the vehicle 1 is superimposed on an image captured by the camera 120.
- Next, the reason why the motion of the vehicle can be accurately calculated by this embodiment will be described.
- Fig. 8 (a) is an example of an image actually taken with a camera installed behind the vehicle. The corresponding points were searched for the moving image captured in the scene shown in Fig. 8 (a) by the method described above.
- Figure 8 (b) is a graph showing the relationship between the corresponding point correct answer rate and the moving speed of the vehicle at that time. Here, the ratio of the corresponding points where the error on the image coordinates is within one pixel is defined as the corresponding point correct answer rate. In the graph of Fig. 8 (b), the values of time and moving speed are scaled appropriately.
- Even in such a situation, the first motion calculation unit 102 can accurately calculate the first motion Ma even if the corresponding point accuracy rate drops greatly. That is, with the method of this embodiment, if at least one correct set of partial corresponding points is included among the q sets, the q motion candidates Mj contain at least one correct motion. Furthermore, using the stationary corresponding point ratio R, which corresponds to the accuracy rate of the corresponding point search, each of the q motions Mj is evaluated by the (n×R)-th best evaluation value over the n corresponding points, which makes it possible to select the correct motion Ma.
- Therefore, even when the corresponding point accuracy rate decreases, the motion of the vehicle can be accurately calculated.
- In the description above, the first motion Ma is calculated using, as the predetermined plane equation, the equation of a horizontal, flat road surface measured while the vehicle is stationary. However, when the road surface is inclined, such as on a hill or a ramp, or when the posture of the vehicle changes due to the loading of occupants or luggage or the operation of the steering wheel, the predetermined plane equation measured in advance no longer matches the road surface plane on which the vehicle is actually running.
- FIG. 9(a) is an example of an image when the vehicle is tilted. In this case, the road surface in the image also leans, so that it does not match the predetermined plane equation measured in advance.
- In this case, the motion Mj calculated using (Equation 9) includes an error. Accordingly, the evaluation values Eji of the motion Mj for the n corresponding points gradually increase compared with the case where the plane equation is correct, and the evaluation value Ej also increases. Therefore, even the first motion Ma with the smallest evaluation value includes an error.
- Therefore, the second motion Mb is calculated by a search method using the first motion Ma as the initial value and using the (n×R) smallest evaluation values.
- In general, if the evaluation function is smooth between the initial value and the true value (or the best value), the true value can be obtained by a search method. The evaluation value used in this embodiment satisfies this condition in the vicinity of the true value. Therefore, even if the first motion Ma includes an error because the actual road surface no longer matches the predetermined plane equation, the second motion calculation unit 103 can calculate a second motion Mb with a small error.
- A system that monitors the area around a vehicle with a camera has the following characteristic: a camera with a wide viewing angle is installed so that the driver can monitor the vehicle's surroundings, and typically, as shown in FIG. 4, everything from the road surface near the vehicle to obstacles above the horizon appears in the image.
- the image of the camera installed in this way can be expected to reflect the road surface in almost all scenes.
- the motion of the vehicle can be calculated from three corresponding points using (Equation 9).
- By contrast, a method that does not assume a plane equation requires at least four corresponding points to obtain a motion (or a motion candidate), and is therefore more susceptible to errors in the calculation process; assuming the plane equation thus has the effect of avoiding such influences of errors.
- FIG. 10 (a) is a diagram showing a situation in which another moving vehicle 2 exists behind the own vehicle 1.
- own vehicle 1 is moving backward (right in the figure), and other vehicle 2 is moving forward (left in the figure).
- In this situation, the image shown in FIG. 10(b) is taken by the camera 120. If the motion were calculated from the corresponding points on the other vehicle 2, the calculated movement of the own vehicle 1 would contain a large error.
- In this embodiment, however, the movement of the own vehicle 1 is calculated by assuming the equation of the road surface plane. In the first motion calculation unit 102, a motion Mj obtained from corresponding points on the other vehicle 2, which is separated from the road surface plane, therefore contains a large error and its evaluation value Ej becomes large. As a result, motions obtained from corresponding points on the other vehicle 2 are not selected in the first motion calculation unit 102, and a motion consistent with the road surface plane, obtained from corresponding points on the road, is selected as the first motion Ma. Thus, the present embodiment has the effect that the movement of a moving body can be calculated correctly even when another moving object is present in the image.
- the plane equation calculation unit 114 can also update the plane equation using the first motion Ma obtained by the first motion calculation unit 102 from the past image.
- For updating, the second motion Mb may be used instead of the first motion Ma, or both may be used. The update may be performed each time the motion calculation has been performed a predetermined number of times, or when a change occurs in the road surface or the vehicle condition.
- The stationary corresponding point ratio calculation unit 115 can also calculate the stationary corresponding point ratio using the corresponding points output from the corresponding point calculation unit 101 and the first motion Ma calculated by the first motion calculation unit 102 from past images.
- In this case, the stationary corresponding point ratio calculation unit 115 evaluates the first motion Ma against the n corresponding points using (Equation 11) and (Equation 12) as described above, calculates the proportion of the n corresponding points whose evaluation value Ei is at most a predetermined threshold, and stores this as the new stationary corresponding point ratio. When the next motion calculation is executed, the stored ratio is output. By repeating the same processing, the stationary corresponding point ratio is successively updated, as sketched below.
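- A minimal sketch of this update, reusing the hypothetical point_error stand-in from the earlier sketch; the threshold value is an assumption.

```python
# Hedged sketch of the stationary corresponding point ratio update in unit 115:
# the ratio is the fraction of the n corresponding points whose evaluation
# value for the motion Ma falls below a threshold. The threshold value is an
# illustrative assumption.
def update_stationary_ratio(Ma, points, point_error, threshold=1.0):
    inliers = sum(1 for p in points if point_error(Ma, p) <= threshold)
    return inliers / len(points)           # new ratio R for the next frame
```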
- When the stationary corresponding point ratio R deviates from its true value, the calculated motion includes an error with respect to the true motion. The relationship between the stationary corresponding point ratio R, the true stationary corresponding point ratio Rtrue, and the camera motion error EM is shown in the graph of FIG. 11.
- the graph of FIG. 11 schematically shows the common tendency based on the results of experiments performed by the inventors of the present application.
- When the stationary corresponding point ratio R matches the true stationary corresponding point ratio Rtrue (the ratio R/Rtrue is 1), the camera motion error EM is smallest; the calculation accuracy of the camera motion decreases when R becomes either larger or smaller than Rtrue.
- The stationary corresponding point ratio R(t-1) calculated using the motion obtained at time t-1 is expected to be close to the true stationary corresponding point ratio Rtrue(t) in the image at time t. Therefore, by updating the stationary corresponding point ratio, a motion with a small error can be calculated.
- Alternatively, after the moving body motion at time t has been calculated using the stationary corresponding point ratio R(t-1) calculated at time t-1, the stationary corresponding point ratio R(t) may be obtained from that motion; if R(t-1) is the lower of the two, the moving body motion at time t may be recalculated using the new stationary corresponding point ratio R(t).
- the second motion Mb may be used instead of the first motion Ma, or both may be used.
- the moving body motion prediction unit 116 can predict the motion of the vehicle 1 using the first motion Ma obtained in the past by the first motion calculation unit 102, and can output the motion prediction value.
- the motion evaluation selection unit 113 calculates the first motion Ma in consideration of the motion prediction value output from the moving body motion prediction unit 116.
- For example, the moving body motion prediction unit 116 performs the prediction on the assumption that the vehicle 1 moves at substantially constant velocity and constant angular velocity. That is, the first motion Ma calculated at time t-1 is stored and output as the motion prediction value Mrt at time t.
- In the motion evaluation selection unit 113, the motion prediction value Mrt is evaluated together with the q motion candidates. Thus, even if a correct motion cannot be obtained from the q sets of partial corresponding points, the motion prediction value Mrt is selected as the first motion Ma as long as the motion of the vehicle 1 is close to constant-velocity, constant-angular-velocity motion. As a result, cases in which a large error occurs in the first motion Ma can be avoided.
- the motion prediction is performed assuming that the moving body is moving at a constant velocity and a constant angular velocity.
- the method of the motion prediction is not limited to this.
- For example, the motion prediction may be performed assuming that the moving body undergoes uniform acceleration and uniform angular acceleration motion; in this case as well, cases in which a large error occurs in the first motion Ma can be avoided.
- the motion prediction of the moving object may use the second motion Mb instead of the first motion Ma, or may use both of them.
- As described above, by assuming the plane, the number of unknown parameters can be reduced, and therefore a first motion Ma that rarely contains gross errors can be obtained. Furthermore, a second motion Mb with a small error can be obtained by a search method using the first motion Ma as the initial value.
- In this embodiment, the evaluation value Eji expressed by (Equation 11) and (Equation 12) is used to evaluate the motion candidates, but the evaluation method is not limited to this; any evaluation value may be used as long as it measures how well a corresponding point matches the camera motion.
- For example, an evaluation value based on the relationship between the inter-camera position (motion) and the corresponding points known as the epipolar constraint, or on the fundamental matrix, may be used.
- (Equation 13) is a specific example.
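- (Equation 13) itself is not reproduced on this page; as a hedged sketch, an epipolar-constraint evaluation of the kind described could take the following form, with F the fundamental matrix determined by the candidate camera motion.

```latex
% Hedged sketch of an epipolar evaluation value; (Equation 13) is not
% reproduced in this text, so this is an assumption consistent with the
% description. An ideal corresponding point pair satisfies
\[ \begin{pmatrix} u' & v' & 1 \end{pmatrix} F \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = 0, \]
% so the magnitude of the left-hand side, or a normalized variant such as the
% distance from (u', v') to the epipolar line F (u, v, 1)^T, can serve as the
% per-point evaluation value E_{ji}.
```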
- In the present embodiment, the (n×R)-th smallest of the evaluation values Eji over the n corresponding points is selected as the evaluation value Ej, but the method of specifying Ej is not limited to this.
- For example, the average of the evaluation values Eji from the smallest up to the (n×R)-th may be used as the evaluation value Ej.
- Alternatively, the method of specifying the evaluation value Ej may be changed according to the value of the stationary corresponding point ratio R. For example, if R is larger than a predetermined threshold, the (n×R)-th smallest evaluation value Eji may be used as Ej, while if R is smaller than the threshold, the average of the evaluation values from the smallest up to the (n×R)-th may be used as Ej.
- When the stationary corresponding point ratio R is relatively small, averaging a plurality of evaluation values suppresses the influence of errors contained in individual values; when R is relatively high, using a single evaluation value as Ej avoids averaging away the small differences near the (n×R)-th value, so they are reflected in Ej.
- the calculation of the second motion Mb is performed by a search method using the first motion Ma as an initial value.
- the present invention is not limited to such a method.
- For example, the first motion calculation unit 102 may output, instead of the first motion Ma itself, the partial corresponding points and the plane equation from which the first motion Ma was calculated to the second motion calculation unit 103. The second motion calculation unit 103 may then calculate the second motion Mb by another search method, using the partial corresponding points, the plane equation, and the n corresponding points output from the corresponding point calculation unit 101.
- Specifically, plane equation candidates Sk are generated by adding predetermined minute changes to the plane equation S, and from them camera motion candidates Mbk (k = 1, ..., o) are calculated using (Equation 9). The evaluation value Ebk of each motion candidate Mbk with respect to the n corresponding points is calculated by the method described above, and the motion candidate Mbk with the smallest Ebk is selected. The above processing is then repeated using the plane equation Sk corresponding to the selected Mbk as the new initial value, and is terminated when the value converges or when the number of repetitions reaches a predetermined number; the motion candidate Mbk selected at that time is output as the second motion Mb.
- When the main cause of the error in the first motion Ma is the difference between the plane equation S and the actual road surface, this method recalculates the motion based on a plane equation close to the actual road surface, so a motion with a small error can be calculated. In addition, since this method narrows the search space, it has the advantage of lighter processing.
- c is a scale term, so searching for c may not be effective.
- If the plane equation S(a, b, c) in the viewpoint coordinate system is used as the initial value and the search simply increases or decreases its parameters by small amounts, the search uses plane equations unrelated to the actual road surface fluctuation, as shown in FIG. 12(a). On the other hand, as shown in FIG. 12(b), if the plane equations Sk are generated with predetermined minute changes (da, db, dc) that tilt the plane about the axle center close to the camera 120 (the center of the rear wheel axle in the figure), the search reflects the plausible positional fluctuations relative to the road surface, so the motion can be calculated with fewer iterations and with a smaller error.
- As the predetermined minute changes, for example, various plane equations S and their minute changes (da, db, dc) may be measured in advance in a state similar to FIG. 12 and used, as sketched below.
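- A sketch of generating such plane candidates by rotating the plane about a line near the rear axle; the pivot point, axis direction, angle steps, and plane values are all illustrative assumptions, not values from the patent.

```python
# Hedged sketch: generate plane equation candidates Sk by tilting the plane
# S (written as S . x = 1) about a line through a pivot point on the road
# below the rear axle. Rodrigues' formula rotates the plane normal; the
# plane is then rescaled so the pivot stays on it.
import numpy as np

def tilt_plane(S, pivot, axis, angle):
    """Rotate the plane S.x = 1 by `angle` about the line through `pivot`
    along `axis` (pivot is assumed to lie on the original plane)."""
    n = np.asarray(S, dtype=float)
    k = np.asarray(axis, dtype=float) / np.linalg.norm(axis)
    c, s = np.cos(angle), np.sin(angle)
    # Rodrigues' rotation of the plane normal about unit axis k.
    n_rot = n * c + np.cross(k, n) * s + k * np.dot(k, n) * (1.0 - c)
    # Rescale so that the pivot point still satisfies S'.pivot = 1.
    return n_rot / np.dot(n_rot, np.asarray(pivot, dtype=float))

# Illustrative values (assumed): road plane y = -1.2 m in viewpoint
# coordinates, pivot on the road below the rear axle, tilt about the X axis.
S = (0.0, -0.833, 0.0)
pivot = (0.0, -1.2, 1.0)
candidates = [tilt_plane(S, pivot, (1.0, 0.0, 0.0), da)
              for da in (-0.02, -0.01, 0.0, 0.01, 0.02)]
```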
- In this embodiment, the movement locus of the vehicle 1 is superimposed on the image as shown in FIG. 7 using the second motion Mb output from the moving body motion calculation device 100, but the usage of the moving body motion is not limited to this.
- For example, the image synthesizing device 121 may receive the image coordinates and three-dimensional positions of the (n×R) corresponding points with the smallest evaluation values Ei for the first motion Ma. If any of these corresponding points has a three-dimensional position that lies on the path of the moving body and is within a predetermined range, a caution message as shown in FIG. 13 may be superimposed, and the corresponding point may be highlighted by surrounding it with a frame.
- In the embodiment described above, the output of the moving body motion calculation device 100 is supplied to the image synthesis device 121 and the display 122, but this does not limit the use of the moving body motion or of the output of the moving body motion calculation device 100; it may be used in combination with any other device.
- In the embodiment described above, the camera 120 is installed at the rear of the vehicle 1 and photographs the area behind the vehicle; however, the position and photographing range of the camera are not limited to this, and the camera may be installed at any position as long as it photographs the surroundings of the moving body.
- In the embodiment described above, the plane assumed for the motion calculation is the road surface, but the assumed plane is not limited to the road surface; any plane may be used as long as its position relative to the moving body can be given in advance. For example, the relative position of a ceiling surface with respect to the moving body can also be given in advance, so the ceiling surface may be assumed as the plane instead of the road surface. Likewise, a wall surface may be assumed as the plane instead of the road surface.
- Further, the vehicle coordinate system may be set so that its vertical axis is orthogonal to the straight line containing the axle of the non-steered wheels of the vehicle, and the movement of the vehicle on the road surface may be represented as an arc motion about this vertical axis.
- FIG. 15 is a diagram showing the representation of the movement of a moving body based on the so-called Ackermann model.
- In Fig. 15, for a vehicle whose front wheels are steered wheels and whose rear wheels are non-steered wheels with their axle fixed to the vehicle body, the center of the rotational component on the horizontal plane is placed on the extension line of the axle of the non-steered wheels.
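- The arc motion of the Ackermann model can be sketched as below; the planar state (x, y, heading) convention and the step inputs (arc length ds, rotation dtheta) are assumptions for illustration.

```python
import math

def ackermann_step(x, y, theta, ds, dtheta):
    """Advance the vehicle along a circular arc of length ds, rotated by
    dtheta about a vertical axis on the non-steered (rear) axle line."""
    if abs(dtheta) < 1e-9:
        # Straight-line limit: no rotational component.
        return x + ds * math.cos(theta), y + ds * math.sin(theta), theta
    r = ds / dtheta                     # turning radius
    # The center of rotation lies on the extension of the non-steered
    # axle, perpendicular to the current heading.
    cx, cy = x - r * math.sin(theta), y + r * math.cos(theta)
    theta_new = theta + dtheta
    return (cx + r * math.sin(theta_new),
            cy - r * math.cos(theta_new),
            theta_new)
```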
- In the present embodiment, a navigation system is configured by combining a moving body motion calculation device, configured in the same manner as in the first embodiment, with a navigation device.
- The navigation device here is assumed to have a function of measuring the current position using radio waves from artificial satellites, and a function of displaying the current position and guiding the user to a destination using the current position and map information.
- FIG. 16 is a configuration diagram of a navigation system including the mobile object motion calculation device according to the present embodiment.
- the same components as those in FIGS. 2 and 3 are denoted by the same reference numerals, and detailed description thereof will be omitted.
- the navigation device 130 includes a position information acquisition unit 131, a position information calculation unit 132, an image synthesis unit 133, and a map information storage unit 134.
- The position information acquisition unit 131 has a so-called GPS function and receives radio waves from a plurality of artificial satellites to acquire information on the position of the vehicle. When radio waves can be received from the satellites and position information can be acquired by the GPS function (when the GPS function is valid), the position information acquisition unit 131 outputs the acquired position information; when position information cannot be acquired by the GPS function, for example because radio waves cannot be received from the satellites (when the GPS function is invalid), it outputs information indicating that position information cannot be acquired.
- The position information calculation unit 132 calculates the current position of the vehicle from the position information received from the position information acquisition unit 131 and the moving body motion information (second motion Mb) output from the moving body motion calculation device 100.
- The map information storage unit 134 stores road, parking lot, and map information in map coordinates.
- The image synthesis unit 133 receives the current position from the position information calculation unit 132, reads the road and map information corresponding to the current position from the map information storage unit 134, converts it into an image, and outputs it.
- The operation of the navigation system in FIG. 16 will now be described. Assume first that the vehicle is outdoors, can receive radio waves from the satellites, and that the GPS function is valid. At this time, the position information acquisition unit 131 outputs the position information acquired by the GPS function, and the position information calculation unit 132 outputs the position information received from the position information acquisition unit 131 as the current position as it is and temporarily stores it.
- The image synthesis unit 133 receives the current position from the position information calculation unit 132, reads out the map information around the current position stored in the map information storage unit 134, converts it into a map image, and outputs it. This map image is displayed on the display 122.
- At this time, the position information calculation unit 132 ignores the moving body motion information input from the moving body motion calculation device 100.
- On the other hand, when the GPS function is invalid, the position information acquisition unit 131 outputs information indicating that the GPS function is invalid.
- In this case, the position information calculation unit 132 integrates the moving body motion output from the moving body motion calculation device 100 into the internally stored current position, outputs the result as the new current position, and temporarily stores it.
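- The switching behaviour of the position information calculation unit 132 can be sketched as follows; the 2-D pose representation and the (ds, dtheta) form of the motion input are assumptions, and the simple planar integration stands in for the accumulation of the second motion Mb.

```python
import math

class PositionCalculator:
    """Sketch of the position information calculation unit 132."""

    def __init__(self, x=0.0, y=0.0, theta=0.0):
        self.x, self.y, self.theta = x, y, theta    # stored current position

    def update(self, gps_fix, motion):
        """gps_fix: (x, y) when the GPS function is valid, None otherwise.
        motion: (ds, dtheta) derived from the moving body motion."""
        if gps_fix is not None:
            # GPS valid: output the acquired position as-is,
            # ignoring the moving body motion input.
            self.x, self.y = gps_fix
        else:
            # GPS invalid: integrate the moving body motion into the
            # stored current position (dead reckoning).
            ds, dtheta = motion
            self.theta += dtheta
            self.x += ds * math.cos(self.theta)
            self.y += ds * math.sin(self.theta)
        return self.x, self.y                       # new current position
```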
- The image synthesis unit 133 operates in the same way as when the GPS function is valid: it receives the current position from the position information calculation unit 132, reads out the map information around the current position stored in the map information storage unit 134, converts it into a map image, and outputs it. This map image is displayed on the display 122.
- Further, the position information calculation unit 132 calculates the current position, reads out the parking lot information stored in the map information storage unit 134 as necessary, and sends it to the plane equation calculation unit 114. The plane equation calculation unit 114 reads out and outputs the plane equation information included in the parking lot information as needed.
- When position information can again be acquired by the GPS function, the position information calculation unit 132 outputs the position information acquired by the position information acquisition unit 131 as the current position.
- FIG. 17 is a graph showing the vehicle trajectory obtained in an experiment in an indoor parking lot using the moving body motion calculation method according to the present invention.
- With the moving body motion calculation method according to the present invention, the motion of the moving body can be calculated using only images from the camera installed on the vehicle, and in addition the error can be kept small even when the road surface is inclined, so a movement trajectory with a small error can be calculated even in an indoor parking lot such as the one shown in FIG. 17.
- In general, the error of the corresponding point search tends to increase (the rate of correct corresponding points decreases) as the moving speed of the moving body increases, and if the corresponding point error becomes too large, the error in the calculated motion of the moving body also becomes large.
- For this reason, by limiting the use of the moving body motion calculation device 100 to indoor environments, where radio waves from the satellites cannot be received, and by using the position information acquired by the GPS function as the current position outdoors, where radio waves from the satellites can be received, the current position can be calculated with high accuracy.
- In this way, the method of calculating the current position is selectively switched according to whether or not position information can be acquired by the GPS function, so the current position can always be calculated in either case.
- In the above description, the switching of the current position calculation method is performed according to whether or not the current position can be acquired by the GPS function, but the switching is not limited to this; for example, the calculation method may be switched according to whether or not the determined current position is near the entrance of an indoor parking lot in the map information.
- Also, as described above, the assumed plane is not limited to the road surface and may be any plane whose position relative to the moving body can be given in advance.
- For example, suppose that the plane equation obtained when the ceiling is assumed to be a plane is stored in the map information storage unit 134 together with the map information.
- The position information calculation unit 132 then reads out the plane equation of the ceiling of the indoor parking lot from the map information storage unit 134 and outputs it to the plane equation calculation unit 114 of the first motion calculation unit 102.
- The plane equation calculation unit 114 outputs this plane equation, whereby the moving body motion calculation device 100 calculates the moving body motion based on the ceiling plane.
- In this way, the moving body motion calculation device 100 selects and uses the plane equation according to the current position, so a highly accurate moving body motion can be calculated.
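- Selecting the plane equation by current position might be sketched as below; the parking lot record layout, the center-radius containment test, and the default road-surface coefficients are illustrative assumptions.

```python
# Assumed example coefficients (a, b, c) for the default road-surface plane.
DEFAULT_ROAD_PLANE = (0.0, 0.0, -1.5)

def select_plane(current_pos, parking_lots):
    """parking_lots: list of (center_xy, radius, ceiling_plane) records
    read from the map information storage unit."""
    x, y = current_pos
    for (cx, cy), radius, ceiling_plane in parking_lots:
        if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
            return ceiling_plane    # inside the lot: assume the ceiling plane
    return DEFAULT_ROAD_PLANE       # otherwise: assume the road surface
```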
- In the above embodiments, a vehicle has been described as an example of the moving body; however, the type of the moving body is not limited to a vehicle and may be, for example, a robot.
- Each of the processing units and devices described above may be realized by hardware or by software. For example, each processing means may be realized by software using a computer having a CPU 141, a ROM, a RAM, and an image input/output function as shown in the drawing.
- As described above, the present invention can accurately determine the movement of a moving body on which a camera is installed by using camera images, and can therefore be used for location presentation services and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Navigation (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04792819A EP1580687A4 (en) | 2003-10-17 | 2004-10-15 | METHOD FOR DETERMINING MOBILE UNIT MOTION, DEVICE AND NAVIGATION SYSTEM |
JP2005514877A JP3843119B2 (ja) | 2003-10-17 | 2004-10-15 | 移動体動き算出方法および装置、並びにナビゲーションシステム |
US10/540,135 US7542834B2 (en) | 2003-10-17 | 2004-10-15 | Mobile unit motion calculating method, apparatus and navigation system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003357729 | 2003-10-17 | ||
JP2003-357729 | 2003-10-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005038710A1 true WO2005038710A1 (ja) | 2005-04-28 |
Family
ID=34463254
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/015677 WO2005038710A1 (ja) | 2003-10-17 | 2004-10-15 | 移動体動き算出方法および装置、並びにナビゲーションシステム |
Country Status (4)
Country | Link |
---|---|
US (1) | US7542834B2 (ja) |
EP (1) | EP1580687A4 (ja) |
JP (1) | JP3843119B2 (ja) |
WO (1) | WO2005038710A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007256029A (ja) * | 2006-03-23 | 2007-10-04 | Denso It Laboratory Inc | ステレオ画像処理装置 |
JP2007280387A (ja) * | 2006-03-31 | 2007-10-25 | Aisin Seiki Co Ltd | 物体移動の検出方法及び検出装置 |
JP2008282386A (ja) * | 2007-05-10 | 2008-11-20 | Honda Motor Co Ltd | 物体検出装置、物体検出方法及び物体検出プログラム |
JP2009077022A (ja) * | 2007-09-19 | 2009-04-09 | Sanyo Electric Co Ltd | 運転支援システム及び車両 |
JP2014519444A (ja) * | 2011-06-15 | 2014-08-14 | ローベルト ボツシユ ゲゼルシヤフト ミツト ベシユレンクテル ハフツング | 駐車誘導用後付けセット |
CN104678998A (zh) * | 2013-11-29 | 2015-06-03 | 丰田自动车株式会社 | 自主移动体及其控制方法 |
JP2017042247A (ja) * | 2015-08-25 | 2017-03-02 | 富士フイルム株式会社 | 基準点評価装置、方法およびプログラム、並びに位置合せ装置、方法およびプログラム |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6801244B2 (en) * | 2000-02-29 | 2004-10-05 | Kabushiki Kaisha Toshiba | Obstacle detection apparatus and method |
WO2005038710A1 (ja) * | 2003-10-17 | 2005-04-28 | Matsushita Electric Industrial Co., Ltd. | 移動体動き算出方法および装置、並びにナビゲーションシステム |
EP1849123A2 (en) * | 2005-01-07 | 2007-10-31 | GestureTek, Inc. | Optical flow based tilt sensor |
US8666661B2 (en) * | 2006-03-31 | 2014-03-04 | The Boeing Company | Video navigation |
US20090089705A1 (en) * | 2007-09-27 | 2009-04-02 | Microsoft Corporation | Virtual object navigation |
DE102008045619A1 (de) | 2008-09-03 | 2010-03-04 | Daimler Ag | Verfahren, Vorrichtung und System zur Ermittlung einer Geschwindigkeit eines Fahrzeugs |
JP5419403B2 (ja) * | 2008-09-04 | 2014-02-19 | キヤノン株式会社 | 画像処理装置 |
DE102008061060A1 (de) | 2008-12-08 | 2010-06-10 | Daimler Ag | Verfahren zur Ermittlung mindestens einer Rotationsachse eines Fahrzeugs |
DE102009016562A1 (de) | 2009-04-06 | 2009-11-19 | Daimler Ag | Verfahren und Vorrichtung zur Objekterkennung |
JP5223811B2 (ja) * | 2009-08-06 | 2013-06-26 | 株式会社日本自動車部品総合研究所 | 画像補正装置、画像補正方法、及びそれらに用いられる変換マップ作成方法 |
TWI407280B (zh) * | 2009-08-20 | 2013-09-01 | Nat Univ Tsing Hua | 自動搜尋系統及方法 |
DE102010013093A1 (de) * | 2010-03-29 | 2011-09-29 | Volkswagen Ag | Verfahren und System zur Erstellung eines Modells eines Umfelds eines Fahrzeugs |
US11405841B2 (en) | 2012-07-20 | 2022-08-02 | Qualcomm Incorporated | Using UE environmental status information to improve mobility handling and offload decisions |
WO2014076769A1 (ja) | 2012-11-13 | 2014-05-22 | 株式会社東芝 | 検出装置、方法及びプログラム |
DE102012221572A1 (de) * | 2012-11-26 | 2014-05-28 | Robert Bosch Gmbh | Autonomes Fortbewegungsgerät |
JP5962689B2 (ja) * | 2014-02-14 | 2016-08-03 | トヨタ自動車株式会社 | 自律移動体、及びその故障判定方法 |
DE102014212819A1 (de) * | 2014-07-02 | 2016-01-07 | Zf Friedrichshafen Ag | Ortspositionsabhängige Darstellung von Fahrzeugumfelddaten auf einer mobilen Einheit |
KR102337209B1 (ko) * | 2015-02-06 | 2021-12-09 | 삼성전자주식회사 | 주변 상황 정보를 통지하기 위한 방법, 전자 장치 및 저장 매체 |
IL238473A0 (en) * | 2015-04-26 | 2015-11-30 | Parkam Israel Ltd | A method and system for discovering and mapping parking areas |
WO2018023736A1 (en) * | 2016-08-05 | 2018-02-08 | SZ DJI Technology Co., Ltd. | System and method for positioning a movable object |
CN109716255A (zh) * | 2016-09-18 | 2019-05-03 | 深圳市大疆创新科技有限公司 | 用于操作可移动物体以规避障碍物的方法和系统 |
KR102313026B1 (ko) * | 2017-04-11 | 2021-10-15 | 현대자동차주식회사 | 차량 및 차량 후진 시 충돌방지 보조 방법 |
JP6840024B2 (ja) * | 2017-04-26 | 2021-03-10 | 株式会社クボタ | オフロード車両及び地面管理システム |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000275013A (ja) * | 1999-03-24 | 2000-10-06 | Mr System Kenkyusho:Kk | 視点位置姿勢の決定方法、コンピュータ装置および記憶媒体 |
JP2001266160A (ja) * | 2000-03-22 | 2001-09-28 | Toyota Motor Corp | 周辺認識方法および周辺認識装置 |
JP2003178309A (ja) * | 2001-10-03 | 2003-06-27 | Toyota Central Res & Dev Lab Inc | 移動量推定装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5777690A (en) * | 1995-01-20 | 1998-07-07 | Kabushiki Kaisha Toshiba | Device and method for detection of moving obstacles |
JP3463397B2 (ja) | 1995-02-09 | 2003-11-05 | 日産自動車株式会社 | 走行路検出装置 |
US6192145B1 (en) | 1996-02-12 | 2001-02-20 | Sarnoff Corporation | Method and apparatus for three-dimensional scene processing using parallax geometry of pairs of points |
JP3729916B2 (ja) | 1996-02-29 | 2005-12-21 | 富士通株式会社 | 3次元情報復元装置 |
AU1793301A (en) | 1999-11-26 | 2001-06-04 | Mobileye, Inc. | System and method for estimating ego-motion of a moving vehicle using successiveimages recorded along the vehicle's path of motion |
JP4425495B2 (ja) * | 2001-06-08 | 2010-03-03 | 富士重工業株式会社 | 車外監視装置 |
WO2005038710A1 (ja) * | 2003-10-17 | 2005-04-28 | Matsushita Electric Industrial Co., Ltd. | 移動体動き算出方法および装置、並びにナビゲーションシステム |
JP4107605B2 (ja) * | 2005-02-01 | 2008-06-25 | シャープ株式会社 | 移動体周辺監視装置、移動体周辺監視方法、制御プログラムおよび可読記録媒体 |
2004
- 2004-10-15 WO PCT/JP2004/015677 patent/WO2005038710A1/ja not_active Application Discontinuation
- 2004-10-15 US US10/540,135 patent/US7542834B2/en active Active
- 2004-10-15 JP JP2005514877A patent/JP3843119B2/ja not_active Expired - Lifetime
- 2004-10-15 EP EP04792819A patent/EP1580687A4/en not_active Withdrawn
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000275013A (ja) * | 1999-03-24 | 2000-10-06 | Mr System Kenkyusho:Kk | 視点位置姿勢の決定方法、コンピュータ装置および記憶媒体 |
JP2001266160A (ja) * | 2000-03-22 | 2001-09-28 | Toyota Motor Corp | 周辺認識方法および周辺認識装置 |
JP2003178309A (ja) * | 2001-10-03 | 2003-06-27 | Toyota Central Res & Dev Lab Inc | 移動量推定装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1580687A4 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007256029A (ja) * | 2006-03-23 | 2007-10-04 | Denso It Laboratory Inc | ステレオ画像処理装置 |
JP2007280387A (ja) * | 2006-03-31 | 2007-10-25 | Aisin Seiki Co Ltd | 物体移動の検出方法及び検出装置 |
JP2008282386A (ja) * | 2007-05-10 | 2008-11-20 | Honda Motor Co Ltd | 物体検出装置、物体検出方法及び物体検出プログラム |
US8300887B2 (en) | 2007-05-10 | 2012-10-30 | Honda Motor Co., Ltd. | Object detection apparatus, object detection method and object detection program |
JP2009077022A (ja) * | 2007-09-19 | 2009-04-09 | Sanyo Electric Co Ltd | 運転支援システム及び車両 |
JP2014519444A (ja) * | 2011-06-15 | 2014-08-14 | ローベルト ボツシユ ゲゼルシヤフト ミツト ベシユレンクテル ハフツング | 駐車誘導用後付けセット |
CN104678998A (zh) * | 2013-11-29 | 2015-06-03 | 丰田自动车株式会社 | 自主移动体及其控制方法 |
JP2017042247A (ja) * | 2015-08-25 | 2017-03-02 | 富士フイルム株式会社 | 基準点評価装置、方法およびプログラム、並びに位置合せ装置、方法およびプログラム |
US10242452B2 (en) | 2015-08-25 | 2019-03-26 | Fujifilm Corporation | Method, apparatus, and recording medium for evaluating reference points, and method, apparatus, and recording medium for positional alignment |
Also Published As
Publication number | Publication date |
---|---|
EP1580687A1 (en) | 2005-09-28 |
EP1580687A4 (en) | 2006-01-11 |
JP3843119B2 (ja) | 2006-11-08 |
US20060055776A1 (en) | 2006-03-16 |
US7542834B2 (en) | 2009-06-02 |
JPWO2005038710A1 (ja) | 2007-01-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005038710A1 (ja) | 移動体動き算出方法および装置、並びにナビゲーションシステム | |
US10762643B2 (en) | Method for evaluating image data of a vehicle camera | |
Zhou et al. | Ground-plane-based absolute scale estimation for monocular visual odometry | |
KR101725060B1 (ko) | 그래디언트 기반 특징점을 이용한 이동 로봇의 위치를 인식하기 위한 장치 및 그 방법 | |
KR101708659B1 (ko) | 이동 로봇의 맵을 업데이트하기 위한 장치 및 그 방법 | |
Badino | A robust approach for ego-motion estimation using a mobile stereo platform | |
CN107111879B (zh) | 通过全景环视图像估计车辆自身运动的方法和设备 | |
EP2948927B1 (en) | A method of detecting structural parts of a scene | |
KR101776621B1 (ko) | 에지 기반 재조정을 이용하여 이동 로봇의 위치를 인식하기 위한 장치 및 그 방법 | |
KR101784183B1 (ko) | ADoG 기반 특징점을 이용한 이동 로봇의 위치를 인식하기 위한 장치 및 그 방법 | |
US8401783B2 (en) | Method of building map of mobile platform in dynamic environment | |
US9014421B2 (en) | Framework for reference-free drift-corrected planar tracking using Lucas-Kanade optical flow | |
US9940725B2 (en) | Method for estimating the speed of movement of a camera | |
US11082633B2 (en) | Method of estimating the speed of displacement of a camera | |
KR101544021B1 (ko) | 3차원 지도 생성 장치 및 3차원 지도 생성 방법 | |
CN109443348A (zh) | 一种基于环视视觉和惯导融合的地下车库库位跟踪方法 | |
KR102559203B1 (ko) | 포즈 정보를 출력하는 방법 및 장치 | |
Michot et al. | Bi-objective bundle adjustment with application to multi-sensor slam | |
CN113137968B (zh) | 基于多传感器融合的重定位方法、重定位装置和电子设备 | |
Hong et al. | Visual inertial odometry using coupled nonlinear optimization | |
JP2007241326A (ja) | 移動体動き算出装置 | |
CN113744308A (zh) | 位姿优化方法、装置、电子设备、介质及程序产品 | |
JP2019032751A (ja) | カメラ運動推定装置、カメラ運動推定方法及びプログラム | |
Jurado et al. | Inertial and imaging sensor fusion for image-aided navigation with affine distortion prediction | |
Rabe | Detection of moving objects by spatio-temporal motion analysis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2005514877 Country of ref document: JP |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004792819 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2006055776 Country of ref document: US Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10540135 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWP | Wipo information: published in national office |
Ref document number: 2004792819 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2004792819 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10540135 Country of ref document: US |