JPH0719818A - Three-dimensional movement predicting device - Google Patents

Three-dimensional movement predicting device

Info

Publication number
JPH0719818A
JPH0719818A JP18677893A
Authority
JP
Japan
Prior art keywords
measured
equation
dimensional
motion
motion parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP18677893A
Other languages
Japanese (ja)
Inventor
Motohisa Hirose
Satoru Miura
Kazuo Nakazawa
Yutaka Uchimura
悟 三浦
和夫 中沢
裕 内村
素久 廣瀬
Original Assignee
Kajima Corp
鹿島建設株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kajima Corp, 鹿島建設株式会社 filed Critical Kajima Corp
Priority to JP18677893A priority Critical patent/JPH0719818A/en
Publication of JPH0719818A publication Critical patent/JPH0719818A/en
Pending legal-status Critical Current

Abstract

(57) [Summary] [Structure] A range finder 5 measures the position of an object 1 undergoing simple harmonic motion. A motion parameter estimation unit 9 estimates the motion parameters of the measured object 1 from that position data, and a position prediction unit 11 predicts the future position of the measured object 1 based on the estimated motion parameters. A robot 13 grips the measured object 1 with its manipulator 15 based on the future position information obtained from the position prediction unit 11. [Effect] The motion parameters of an object undergoing simple harmonic motion can be estimated and its future position predicted.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a three-dimensional motion prediction device that predicts the three-dimensional position of an object undergoing simple harmonic motion and supplies the predicted position information to a robot that grips the measured object.

[0002]

2. Description of the Related Art Conventionally, there are two methods for estimating the motion parameters of a moving object from moving images. The first, called the two-step estimation method, first obtains the optical flow or the inter-frame correspondences of the moving object between frames before and after its motion, and then, in the second step, estimates the motion parameters from that optical flow or correspondence. The second, called the direct estimation method, estimates the motion parameters directly from changes in image intensity, without computing optical flow.

[0003]

However, these conventional methods analyze three-dimensional motion from image information, which is two-dimensional, so the analysis is difficult when distance information is not available. Further, since they assume that the image changes only slightly between frames, it is difficult to estimate three-dimensional motion with high accuracy.

"Masanobu Yamamoto," Direct estimation method of three-dimensional motion parameter by using moving image and distance information ", IEICE Transactions J68-D, 4, pp. 562-569.
(1985/4) ”discloses a method of directly estimating a motion parameter from a moving image together with distance information.
However, in this method, the motion of the object is assumed to be a constant velocity motion, and therefore it cannot be applied to an object that does not perform such motion.

The present invention has been made in view of the above problems, and its object is to provide a three-dimensional motion prediction device that, for an object whose motion can be assumed to be simple harmonic, such as a pendulum, estimates the motion parameters of the measured object from three-dimensional measurements of its position and predicts its future position.

[0006]

In order to achieve the above object, the present invention is a three-dimensional motion prediction device comprising: analyzing means for analyzing the three-dimensional position of a measured object undergoing simple harmonic motion; means for estimating the motion parameters of the measured object from the three-dimensional position analyzed by the analyzing means; and means for predicting a future position of the measured object based on the estimated motion parameters.

[0007]

According to the present invention, the three-dimensional position of the object undergoing simple harmonic motion is analyzed, the motion parameters of the measured object are estimated from the analyzed three-dimensional position, and the future position of the measured object is predicted based on the estimated motion parameters.

[0008]

Embodiments of the present invention will now be described in detail with reference to the drawings. FIG. 1 shows the schematic configuration of a system using a three-dimensional motion prediction device according to one embodiment of the present invention. In FIG. 1, reference numeral 1 denotes a measured object undergoing simple harmonic motion (pendulum motion), and 3 denotes the three-dimensional motion prediction device. The three-dimensional motion prediction device 3 comprises a range finder 5, a motion parameter estimation unit 9, and a position prediction unit 11. The range finder 5 images the measured object 1 and obtains its three-dimensional position from the captured image.

FIG. 2 shows the relationship between the range finder 5 and the measured object 1. In FIG. 2, $X$, $Y$ is the absolute coordinate system, $X_r$, $Y_r$ is the coordinate system of the range finder 5, and $X_p$, $Y_p$ is the coordinate system of the measured object 1. In every coordinate system the Z axis points perpendicular to the page and is not shown.

When the measured object 1 undergoes simple harmonic motion, the time history of the $X_p$-axis coordinate component of its center of gravity P is as shown in FIG. 3. However, because the three-dimensional measurement takes processing time, the obtained measurement points are discrete data, shown as black dots in FIG. 3.

The motion parameter estimation unit 9 estimates the motion parameters of the center of gravity P from its temporally discrete coordinate data. The position prediction unit 11 predicts the coordinates $(x_f, y_f, z_f)$ of the center of gravity P at a future time $t_f$ based on the motion parameters obtained by the motion parameter estimation unit 9.

The future position coordinates of the measured object 1 are input to the robot 13, and the robot 13 grips the measured object 1 with its manipulator 15.

Next, the principle of motion parameter estimation in the motion parameter estimation unit 9 will be described. The motion of the center of gravity P in the coordinate system of the measured object 1 is modeled by the following equations.

$$x_b(t_i) = a_x \sin(\omega_x t_i + \phi_x) + b_x \quad (1)$$
$$y_b(t_i) = a_y \sin(\omega_y t_i + \phi_y) + b_y \quad (2)$$
$$z_b(t_i) = a_z \sin(\omega_z t_i + \phi_z) + b_z \quad (3)$$
where $x_b(t_i)$, $y_b(t_i)$, $z_b(t_i)$ are the X-, Y-, and Z-axis components of the center of gravity at time $t_i$ in the coordinate system of the measured object 1; $a_x$, $a_y$, $a_z$ are amplitudes; $\omega_x$, $\omega_y$, $\omega_z$ are frequencies; $\phi_x$, $\phi_y$, $\phi_z$ are phases; and $b_x$, $b_y$, $b_z$ are offsets from the origin.
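
Expressed in code, the model of equations (1)-(3) is three independent sinusoids sharing the time axis. The following is a minimal sketch; the function and parameter names are illustrative, not from the patent:

```python
import numpy as np

def body_position(t, a, w, ph, b):
    """Equations (1)-(3): coordinates of the center of gravity P at
    times t, in the coordinate system of the measured object.
    a, w, ph, b are length-3 arrays holding amplitude, frequency,
    phase, and offset for the X, Y, and Z components."""
    t = np.atleast_1d(t)[:, None]        # shape (N, 1) for broadcasting
    return a * np.sin(w * t + ph) + b    # shape (N, 3): x_b, y_b, z_b
```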

Since $z_b(t_i)$ can be solved by the same procedure as $x_b(t_i)$ and $y_b(t_i)$, the following description covers only the X and Y coordinate components.

As shown in FIG. 2, the coordinate system of the measured object 1 and the coordinate system of the range finder 5 are related by a rotation of $\theta$ about the Z axis, so equations (1) and (2) are expressed in the coordinate system of the range finder 5 by the following equation.

[0017]

[Equation 1]
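
The equation image itself is not reproduced here, but the preceding paragraph states that equation (4) is simply equations (1) and (2) rotated by $\theta$ about the Z axis. A sketch of that relationship follows; the direction (sign) of the rotation is an assumption, and the names are illustrative:

```python
import numpy as np

def range_finder_position(t, theta, a, w, ph, b):
    """Equation (4) as described above: the X and Y model of equations
    (1) and (2), expressed in the coordinate system of the range finder,
    which is rotated by theta about the Z axis relative to the object's
    coordinate system (rotation sign assumed)."""
    xb = a[0] * np.sin(w[0] * t + ph[0]) + b[0]   # equation (1)
    yb = a[1] * np.sin(w[1] * t + ph[1]) + b[1]   # equation (2)
    xr = np.cos(theta) * xb - np.sin(theta) * yb
    yr = np.sin(theta) * xb + np.cos(theta) * yb
    return xr, yr
```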

Here, $x_r(t_i)$ and $y_r(t_i)$ are the position data of the measured object 1 measured by the range finder 5, and the measurement times $t_i$ are known, so equation (4) is a nonlinear equation in nine unknown parameters: $\theta$, $a_x$, $a_y$, $\omega_x$, $\omega_y$, $\phi_x$, $\phi_y$, $b_x$, and $b_y$. An approximate solution for the parameters can therefore be obtained from nine measurement points by a numerical method. However, equation (4) contains multivalued functions such as $\sin()$, and depending on the initial values there is no guarantee that the approximate solution converges to the true values. Therefore, the parameters other than $\omega_x$ and $\omega_y$ are analytically eliminated from equation (4) by the following procedure, an equation in $\omega_x$ and $\omega_y$ is derived, and the motion parameters are estimated by numerically solving that equation.
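
For contrast, the direct nine-parameter fit that this paragraph warns about can be posed as an ordinary nonlinear least-squares problem. A minimal sketch using scipy is shown below; whether it converges depends entirely on the initial guess p0, which is precisely the weakness the analytic elimination of the following paragraphs avoids. All names are illustrative and the rotation sign is assumed:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(p, t, xr, yr):
    """Residuals of the rotated sinusoid model against measurements.
    p = (theta, ax, ay, wx, wy, phx, phy, bx, by): the nine unknowns."""
    theta, ax, ay, wx, wy, phx, phy, bx, by = p
    xb = ax * np.sin(wx * t + phx) + bx
    yb = ay * np.sin(wy * t + phy) + by
    xm = np.cos(theta) * xb - np.sin(theta) * yb   # model x_r(t)
    ym = np.sin(theta) * xb + np.cos(theta) * yb   # model y_r(t)
    return np.concatenate([xm - xr, ym - yr])

# t, xr, yr: nine or more measurement points from the range finder.
# p0 = np.array([...])  # the critical initial guess
# fit = least_squares(residuals, p0, args=(t, xr, yr))
```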

First, $b_x$ and $b_y$ are eliminated from equation (4) using $x(t_0)$ and $y(t_0)$.

[0020]

[Equation 2]

Here, $P_i$, $A$, and $G_i$ are defined as follows.

[0022]

[Equation 3]

Using these definitions, equation (4) is transformed into the following equation (8).

$$P_i = A G_i \quad (8)$$
Further, by the addition theorem, $G_i$ is transformed into the following equation (9).

[0025]

[Equation 4]

Here, $s_{xi}$, $s_{yi}$, $c_{xi}$, and $c_{yi}$ are defined by the following equations.

[0027]

[Equation 5]
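
Though equations (5)-(9) survive here only as image placeholders, the factorization they describe rests on the addition theorem: each $\sin(\omega t_i + \phi)$ splits into terms in $\sin(\omega t_i)$ and $\cos(\omega t_i)$, so the unknown phases separate from the $\omega$-dependent factors. A small symbolic check of that split (sympy; symbol names illustrative):

```python
import sympy as sp

w, t, phi = sp.symbols('omega t_i phi')
expanded = sp.expand_trig(sp.sin(w * t + phi))
print(expanded)   # sin(omega*t_i)*cos(phi) + sin(phi)*cos(omega*t_i)
```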

Then, $A$ is eliminated from $P_1$, $P_3$, $G_1$, and $G_3$ to obtain equation (10).

$$G_1 P_1^{-1} = G_3 P_3^{-1} \quad (10)$$
$P_1^{-1}$ and $P_3^{-1}$ are replaced as shown in equation (11).

[0030]

[Equation 6]

Next, from equation (10), equations (12) to (15) relating to $\tan\phi_x$ and $\tan\phi_y$ are obtained.

$$\tan\phi_x = -\frac{q_{x1}s_{x1} + q_{y1}s_{x2} - q_{x3}s_{x3} - q_{y3}s_{x4}}{q_{x1}c_{x1} + q_{y1}c_{x2} - q_{x3}c_{x3} - q_{y3}c_{x4}} \quad (12)$$
$$\tan\phi_x = -\frac{q_{x2}s_{x1} + q_{y2}s_{x2} - q_{x4}s_{x3} - q_{y4}s_{x4}}{q_{x2}c_{x1} + q_{y2}c_{x2} - q_{x4}c_{x3} - q_{y4}c_{x4}} \quad (13)$$
$$\tan\phi_y = -\frac{q_{x1}s_{y1} + q_{y1}s_{y2} - q_{x3}s_{y3} - q_{y3}s_{y4}}{q_{x1}c_{y1} + q_{y1}c_{y2} - q_{x3}c_{y3} - q_{y3}c_{y4}} \quad (14)$$
$$\tan\phi_y = -\frac{q_{x2}s_{y1} + q_{y2}s_{y2} - q_{x4}s_{y3} - q_{y4}s_{y4}}{q_{x2}c_{y1} + q_{y2}c_{y2} - q_{x4}c_{y3} - q_{y4}c_{y4}} \quad (15)$$
Next, equations (12) to (15) are combined, $\phi_x$ and $\phi_y$ are eliminated, and equations (16) and (17) are obtained.

$$(q_{x1}s_{x1} + q_{y1}s_{x2} - q_{x3}s_{x3} - q_{y3}s_{x4})\,(q_{x2}c_{x1} + q_{y2}c_{x2} - q_{x4}c_{x3} - q_{y4}c_{x4}) = (q_{x2}s_{x1} + q_{y2}s_{x2} - q_{x4}s_{x3} - q_{y4}s_{x4})\,(q_{x1}c_{x1} + q_{y1}c_{x2} - q_{x3}c_{x3} - q_{y3}c_{x4}) \quad (16)$$
$$(q_{x1}s_{y1} + q_{y1}s_{y2} - q_{x3}s_{y3} - q_{y3}s_{y4})\,(q_{x2}c_{y1} + q_{y2}c_{y2} - q_{x4}c_{y3} - q_{y4}c_{y4}) = (q_{x2}s_{y1} + q_{y2}s_{y2} - q_{x4}s_{y3} - q_{y4}s_{y4})\,(q_{x1}c_{y1} + q_{y1}c_{y2} - q_{x3}c_{y3} - q_{y3}c_{y4}) \quad (17)$$
Instead of solving equations (16) and (17) directly, the left side minus the right side of each is written as $F(\omega_x)$ and $F(\omega_y)$, giving equations (18) and (19), and the values $\omega_x$ and $\omega_y$ for which $F(\omega_x) = 0$ and $F(\omega_y) = 0$ are sought.

$$F(\omega_x) = (q_{x1}s_{x1} + q_{y1}s_{x2} - q_{x3}s_{x3} - q_{y3}s_{x4})\,(q_{x2}c_{x1} + q_{y2}c_{x2} - q_{x4}c_{x3} - q_{y4}c_{x4}) - (q_{x2}s_{x1} + q_{y2}s_{x2} - q_{x4}s_{x3} - q_{y4}s_{x4})\,(q_{x1}c_{x1} + q_{y1}c_{x2} - q_{x3}c_{x3} - q_{y3}c_{x4}) \quad (18)$$
$$F(\omega_y) = (q_{x1}s_{y1} + q_{y1}s_{y2} - q_{x3}s_{y3} - q_{y3}s_{y4})\,(q_{x2}c_{y1} + q_{y2}c_{y2} - q_{x4}c_{y3} - q_{y4}c_{y4}) - (q_{x2}s_{y1} + q_{y2}s_{y2} - q_{x4}s_{y3} - q_{y4}s_{y4})\,(q_{x1}c_{y1} + q_{y1}c_{y2} - q_{x3}c_{y3} - q_{y3}c_{y4}) \quad (19)$$
Equations (18) and (19) are equations in $\omega_x$ and $\omega_y$, respectively, but exchanging $\omega_x$ and $\omega_y$ yields the identical equation. This corresponds to the fact that equation (5), in which $\theta$ is an unknown, still holds when $\theta$ is replaced by $\pi/2 - \theta$ and $a_x$, $a_y$ and $\omega_x$, $\omega_y$ are exchanged: rotating the coordinate system of the range finder 5 by $\pi/2$ relative to the coordinate system of the measured object 1 has no effect on the frequencies.

Therefore, instead of obtaining $\omega_x$ and $\omega_y$ separately from equations (18) and (19), $F(\omega_x) = 0$ in equation (18) is rewritten as $F(\omega) = 0$, its two solutions closest to 0 are obtained, and these are taken as $\omega_x$ and $\omega_y$. Once $\omega_x$ and $\omega_y$ are obtained, the other motion parameters are obtained by reversing the procedure described above.

The Newton-Raphson method is used to solve $F(\omega) = 0$ in equation (18) and obtain $\omega_x$ and $\omega_y$, but depending on how the initial value is set, $\omega$ may not converge to the true value. Therefore, a procedure is taken in which $F(\omega)$ is differentiated repeatedly until the solution converges.
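
The following sketch illustrates this differentiate-and-retry idea numerically, treating F as a black-box function with derivatives taken by finite differences. It is a rough illustration of the strategy, not the patent's exact flowchart; scipy.optimize.newton, the finite-difference step, the shift delta, and the depth limit are all assumptions:

```python
from scipy.optimize import newton

def num_deriv(f, h=1e-3):
    """Central-difference derivative; h is kept coarse because the
    differences are nested and fine steps lose precision quickly."""
    return lambda w: (f(w + h) - f(w - h)) / (2.0 * h)

def root_via_derivatives(F, w0, max_order=4, delta=0.05):
    """Find a root of F near w0. If Newton-Raphson fails to converge
    on F itself, retry on F', F'', ...; once some derivative yields a
    convergent solution, reuse it (shifted by delta) as the initial
    value one level lower, walking back down to F."""
    fs = [F]
    for _ in range(max_order + 1):           # F, F', ..., up one extra level
        fs.append(num_deriv(fs[-1]))
    for n in range(max_order + 1):           # roughly steps 404-406
        try:
            w = newton(fs[n], w0, fprime=fs[n + 1], maxiter=50)
        except RuntimeError:
            continue                         # diverged: differentiate again
        for k in range(n - 1, -1, -1):       # roughly steps 407-409
            w = newton(fs[k], w + delta, fprime=fs[k + 1], maxiter=50)
        return w
    raise RuntimeError("no convergent solution found")
```

Scanning several shifted initial values with such a routine, collecting the distinct roots, and keeping the two closest to zero then gives $\omega_x$ and $\omega_y$, as described above.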

FIG. 4 is a flowchart showing the process of solving $F(\omega) = 0$ by this procedure of differentiating $F(\omega)$ until the solution converges, estimating the motion parameters of the measured object 1, and predicting its future position.

FIG. 5 is an explanatory diagram of obtaining a solution of $F(\omega) = 0$ in accordance with the flowchart of FIG. 4: FIG. 5(a) shows $F(\omega)$, FIG. 5(b) shows $F'(\omega)$, and FIG. 5(c) shows $F''(\omega)$.

As shown in FIG. 4, the position data $x_r(t_i)$ and $y_r(t_i)$ at times $t_i$ are measured (step 401), and $P_{xi}$, $P_{yi}$, and $t_i$ ($i = 1$ to $4$) are substituted into equation (18), the equation in the frequency $\omega$ (step 402). An appropriate initial value is set in $\omega_0$, and the parameters $n$ and $j$ are cleared (step 403). A first approximate solution $\omega_1^n$ of $F^n(\omega)$ is obtained by the Newton-Raphson method (step 404). $\omega_1^n$ and $\omega_0$ are compared (step 405); when $\omega_1^n < \omega_0$ does not hold (the cases of FIGS. 5(a) and 5(b)), $F^n(\omega)$ is differentiated and the process repeats from step 404 (step 406).

When $\omega_1^n < \omega_0$ is satisfied in step 405 (the case of FIG. 5(c)), a convergent solution $\omega_n^*$ of $F^n(\omega)$ is obtained (step 407). Then $\omega_n^* + \delta$ ($\delta > 0$) is set as the new initial value and $n$ is decreased by 1 (step 408), and it is determined whether $n < 0$ (step 409). Steps 407 and 408 are repeated until $n < 0$ is satisfied.

When n <0 in step 409, ω
The convergent solution ω 0 * in F (ω) is stored in 0 (j), j is increased by “1” (step 410), it is determined whether ω 0 * = 0 (step 411), and ω 0 * If = 0, then ω 0 *
With −δ as an initial value, “0” is substituted for n (step 41
2) and returns to step 404.

If $\omega_0^* = 0$ in step 411, $\omega_0(j-1)$ and $\omega_0(j-2)$ are taken as $\omega_x$ and $\omega_y$, respectively (step 413). Once $\omega_x$ and $\omega_y$ are calculated, the procedure leading to equation (18) is reversed and $\theta$, $a_x$, $a_y$, $\phi_x$, $\phi_y$, $b_x$, and $b_y$ are obtained (step 414). With the parameters $\omega_x$, $\omega_y$, $\theta$, $a_x$, $a_y$, $\phi_x$, $\phi_y$, $b_x$, and $b_y$ determined, the coordinates $x_f$ and $y_f$ of the center of gravity P of the measured object 1 at the desired time $t_f$ are calculated by equations (1) and (2) (step 415).

[0043] Described in terms of FIG. 5, starting from the initial value $\omega_0$, convergent solutions $\omega_0^*$ are obtained one after another, and the two solutions of $F(\omega) = 0$ closest to 0 are determined as $\omega_x$ and $\omega_y$.

By the same procedure, the parameters of $z_b(t_i)$ in equation (3) are obtained, and from the full set of parameters the coordinates $x_f$, $y_f$, and $z_f$ of the center of gravity P of the measured object 1 at time $t_f$ are obtained.
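
With all the parameters in hand, prediction reduces to evaluating the model at the future time and rotating back into the range finder's frame. A minimal sketch, with illustrative names and the rotation sign assumed as before:

```python
import numpy as np

def predict_position(tf, theta, a, w, ph, b):
    """Steps 414-415: evaluate equations (1)-(3) at the future time tf,
    then rotate the X and Y components by theta into the coordinate
    system of the range finder, giving the target passed to the robot."""
    xb, yb, zb = a * np.sin(w * tf + ph) + b   # equations (1)-(3)
    xf = np.cos(theta) * xb - np.sin(theta) * yb
    yf = np.sin(theta) * xb + np.cos(theta) * yb
    return xf, yf, zb   # z is unaffected by a rotation about the Z axis
```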

As described above, in this embodiment, the measured object 1 undergoing simple harmonic motion is imaged by the range finder 5, the image data is analyzed to estimate its motion parameters, and the future position of the measured object 1 can thereby be predicted.

[0046]

As described above in detail, according to the present invention, it is possible to estimate the motion parameters of an object undergoing simple harmonic motion and to predict the future position of the measured object.

[Brief description of drawings]

FIG. 1 is a diagram showing a schematic configuration of a system using a three-dimensional motion prediction device according to an embodiment of the present invention.

FIG. 2 is a diagram showing the relationship between the absolute coordinate system, the coordinate system of the range finder 5, and the coordinate system of the measured object 1.

FIG. 3 is a diagram showing a time history of the X coordinate component of the center of gravity P of the measured object 1.

FIG. 4 is a flowchart showing the processing of this embodiment.

FIG. 5 is an explanatory diagram for obtaining a solution of F (ω) = 0.

[Explanation of symbols]

1 … Measured object
3 … Three-dimensional motion prediction device
5 … Range finder
9 … Motion parameter estimation unit
11 … Position prediction unit
13 … Robot
15 … Manipulator

Continued from front page: (72) Inventor: Kazuo Nakazawa, 3-11-16 Hiyoshi, Kohoku-ku, Yokohama-shi, Kanagawa, Banno Corp. 4-C

Claims (2)

[Claims]
1. A three-dimensional motion prediction device comprising: analyzing means for analyzing a three-dimensional position of a measured object undergoing simple harmonic motion; means for estimating motion parameters of the measured object from the three-dimensional position analyzed by the analyzing means; and means for predicting a future position of the measured object based on the estimated motion parameters.
2. The three-dimensional motion prediction device according to claim 1, wherein future position information of the measured object obtained by the three-dimensional motion prediction device is input to a robot, and the robot grips the measured object.
JP18677893A 1993-06-30 1993-06-30 Three-dimensional movement predicting device Pending JPH0719818A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP18677893A JPH0719818A (en) 1993-06-30 1993-06-30 Three-dimensional movement predicting device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP18677893A JPH0719818A (en) 1993-06-30 1993-06-30 Three-dimensional movement predicting device

Publications (1)

Publication Number Publication Date
JPH0719818A true JPH0719818A (en) 1995-01-20

Family

ID=16194445

Family Applications (1)

Application Number Title Priority Date Filing Date
JP18677893A Pending JPH0719818A (en) 1993-06-30 1993-06-30 Three-dimensional movement predicting device

Country Status (1)

Country Link
JP (1) JPH0719818A (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007240344A (en) * 2006-03-09 2007-09-20 Fujitsu Ltd Dynamic shape measuring method and dynamic shape measuring device
JP2008058221A (en) * 2006-09-01 2008-03-13 Kobe Univ Velocity of high-speed moving body, estimation method of position, estimation program, and estimation system
US8140188B2 (en) 2008-02-18 2012-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Robotic system and method for observing, learning, and supporting human activities
WO2012153629A1 (en) * 2011-05-12 2012-11-15 株式会社Ihi Device and method for controlling prediction of motion
JP2012236254A (en) * 2011-05-12 2012-12-06 Ihi Corp Device and method for holding moving body
CN103517789A (en) * 2011-05-12 2014-01-15 株式会社Ihi Device and method for controlling prediction of motion
US9108321B2 (en) 2011-05-12 2015-08-18 Ihi Corporation Motion prediction control device and method
JP2012247835A (en) * 2011-05-25 2012-12-13 Ihi Corp Robot movement prediction control method and device
JP2012245568A (en) * 2011-05-25 2012-12-13 Ihi Corp Device and method for controlling and predicting motion
JP2013240847A (en) * 2012-05-18 2013-12-05 Ihi Corp Robot hand device, and control method
CN108020855A (en) * 2017-11-29 2018-05-11 安徽省通信息科技有限公司 The pose and instantaneous center of rotation combined estimation method of a kind of glide steering robot
CN108020855B (en) * 2017-11-29 2020-01-31 安徽省一一通信息科技有限公司 posture and rotation instantaneous center joint estimation method for skid-steer robot
WO2020032211A1 (en) * 2018-08-10 2020-02-13 川崎重工業株式会社 Data generating device, data generating method, data generating program, and remote operation system

Similar Documents

Publication Publication Date Title
US20200096317A1 (en) Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium
US20170294023A1 (en) Constrained key frame localization and mapping for vision-aided inertial navigation
CN102914293B (en) Messaging device and information processing method
Danescu et al. Probabilistic lane tracking in difficult road scenarios using stereovision
Kruger et al. Real-time estimation and tracking of optical flow vectors for obstacle detection
US9025857B2 (en) Three-dimensional measurement apparatus, measurement method therefor, and computer-readable storage medium
ES2700506T3 (en) Adaptive path smoothing for video stabilization
DE69926868T2 (en) Road profile detection
DE102018116111A1 (en) A uniform deep convolution neural network for the estimation of free space, the estimation of the object recognition and the object position
US8792726B2 (en) Geometric feature extracting device, geometric feature extracting method, storage medium, three-dimensional measurement apparatus, and object recognition apparatus
EP0420657B1 (en) Moving object detecting system
US5311305A (en) Technique for edge/corner detection/tracking in image frames
KR100446636B1 (en) Apparatus and method for measuring ego motion of autonomous vehicles and 3D shape of object in front of autonomous vehicles
JP4985516B2 (en) Information processing apparatus, information processing method, and computer program
KR20150032789A (en) Method for estimating ego motion of an object
US10109104B2 (en) Generation of 3D models of an environment
EP1205765B1 (en) Method of recognising conducting cables for low flying aircraft
US10083512B2 (en) Information processing apparatus, information processing method, position and orientation estimation apparatus, and robot system
Wu et al. Recovery of the 3-d location and motion of a rigid object through camera image (an Extended Kalman Filter approach)
Papanikolopoulos et al. Vision and control techniques for robotic visual tracking.
Gee et al. Discovering Planes and Collapsing the State Space in Visual SLAM.
US20090227266A1 (en) Location measurement method based on predictive filter
JP2009008662A (en) Object detection cooperatively using sensor and video triangulation
Rajagopalan et al. An MRF model-based approach to simultaneous recovery of depth and restoration from defocused images
US7062071B2 (en) Apparatus, program and method for detecting both stationary objects and moving objects in an image using optical flow