JPH0719818A  Three-dimensional movement predicting device
 Publication number
 JPH0719818A (application number JP18677893A)
 Authority
 JP
 Japan
 Prior art keywords
 measured
 equation
 dimensional
 motion
 motion parameter
 Prior art date
 Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
 Pending
Abstract
The future position of the measured object 1 is predicted on the basis of estimated motion parameters. The robot 13 grips the measured object 1 with the manipulator 15, based on the future-position information of the measured object 1 obtained from the position prediction section 11. [Effect] The motion parameters of a measured object undergoing simple vibration can be estimated, and the future position of the measured object can be predicted.
Description
[0001]
BACKGROUND OF THE INVENTION 1. Field of the Invention: The present invention relates to a three-dimensional motion prediction device that predicts the three-dimensional position of an object undergoing simple vibration and supplies the predicted three-dimensional position information to a robot that grips the measured object.
[0002]
2. Description of the Related Art: Conventionally, there are two methods for estimating the motion parameters of a moving object from moving images. The first is called the two-step estimation method: in the first step, the optical flow between the frames before and after the motion of the moving object, or the inter-frame correspondence, is obtained; in the second step, the motion parameters are estimated from that optical flow. The second is called the direct estimation method, which estimates the motion parameters directly from changes in image density, without computing optical flow.
[0003]
However, these conventional methods analyze three-dimensional motion from image information, which is two-dimensional, so the analysis was difficult when distance information was not available. Further, since the image is assumed to change only slightly between frames, it is difficult to estimate three-dimensional motion with high accuracy.
Masanobu Yamamoto, "Direct estimation method of three-dimensional motion parameters using moving images and distance information," IEICE Transactions, J68-D, No. 4, pp. 562-569 (April 1985), discloses a method of directly estimating motion parameters from a moving image together with distance information.
However, this method assumes that the object moves at constant velocity, and therefore it cannot be applied to an object that does not perform such motion.
The present invention has been made in view of the above problems, and its object is to provide a three-dimensional motion prediction device that estimates the motion parameters of a measured object, such as a pendulum, whose motion can be assumed to be simple vibration, from three-dimensional measurements of the position of the measured object, and that predicts the future position of the measured object.
[0006]
In order to achieve the above object, the present invention provides a three-dimensional motion prediction apparatus comprising: analyzing means for analyzing the three-dimensional position of an object undergoing simple vibration; means for estimating the motion parameters of the measured object from the three-dimensional position analyzed by the analyzing means; and means for predicting the future position of the measured object on the basis of the estimated motion parameters.
[0007]
According to the present invention, the three-dimensional position of the object undergoing simple vibration is analyzed, the motion parameters of the measured object are estimated from the analyzed three-dimensional position, and the future position of the measured object is predicted on the basis of the estimated motion parameters.
[0008]
Embodiments of the present invention will now be described in detail with reference to the drawings. FIG. 1 shows the schematic configuration of a system using a three-dimensional motion prediction device according to one embodiment of the present invention. In FIG. 1, 1 is a measured object that performs simple vibration (pendulum motion), and 3 is the three-dimensional motion prediction device. The three-dimensional motion prediction device 3 comprises a range finder 5, a motion parameter estimation unit 9, and a position prediction unit 11. The range finder 5 images the measured object 1 and obtains the three-dimensional position of the measured object 1 from the captured image.
FIG. 2 shows the relationship between the range finder 5 and the measured object 1. In FIG. 2, X and Y form the absolute coordinate system, X_r and Y_r the coordinate system of the range finder 5, and X_p, Y_p the coordinate system of the measured object 1. In every coordinate system the Z axis points perpendicular to the paper surface and is not shown.
When the measured object 1 undergoes simple vibration, the time history of the X_p-axis coordinate component of the center of gravity P of the measured object 1 is as shown in FIG. 3. However, because of the time required for each three-dimensional measurement, the measurement points actually obtained are discrete data, like the black dots in FIG. 3.
The motion parameter estimation unit 9 estimates the motion parameters of the center of gravity P from its temporally discrete coordinate data. The position prediction unit 11 predicts the coordinates (x_f, y_f, z_f) of the center of gravity P at a future time t_f on the basis of the motion parameters obtained by the motion parameter estimation unit 9. The future position coordinates of the measured object 1 are input to the robot 13, and the measured object 1 is gripped by the manipulator 15 of the robot 13.
Next, the principle of motion parameter estimation by the motion parameter estimation unit 9 will be described. The motion model of the center of gravity P in the coordinate system of the measured object 1 is set as in the following equation.
x^b(t_i) = a_x sin(ω_x t_i + φ_x) + b_x   (1)
y^b(t_i) = a_y sin(ω_y t_i + φ_y) + b_y   (2)
z^b(t_i) = a_z sin(ω_z t_i + φ_z) + b_z   (3)
where x^b(t_i), y^b(t_i), and z^b(t_i) are the X-, Y-, and Z-axis components at time t_i in the coordinate system of the measured object 1; a_x, a_y, a_z are amplitudes; ω_x, ω_y, ω_z are frequencies; φ_x, φ_y, φ_z are phases; and b_x, b_y, b_z are offsets from the origin.
Since z^b(t_i) can be solved by the same procedure as x^b(t_i) and y^b(t_i), the following description treats only the X and Y coordinate components.
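The motion model of equations (1) to (3), together with the future-position prediction of the position prediction unit 11, can be sketched as follows. This is a minimal illustration; the parameter values used are hypothetical, chosen only for the example.

```python
import math

def simple_vibration(t, a, omega, phi, b):
    """One coordinate of the motion model, as in equation (1):
    x(t) = a*sin(omega*t + phi) + b."""
    return a * math.sin(omega * t + phi) + b

def predict_position(t_f, params):
    """Predict the (x, y, z) position of the center of gravity at a
    future time t_f from estimated motion parameters, one
    (a, omega, phi, b) tuple per axis, per equations (1)-(3)."""
    return tuple(simple_vibration(t_f, *p) for p in params)

# Hypothetical parameters: amplitude 0.5, frequency 2 rad/s,
# phase 0, offset 1.0 on each axis.
params = [(0.5, 2.0, 0.0, 1.0)] * 3
print(predict_position(0.0, params))  # -> (1.0, 1.0, 1.0)
```

At t = 0 the sine terms vanish, so each coordinate reduces to its offset b.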
As shown in FIG. 2, the coordinate system of the measured object 1 and the coordinate system of the range finder 5 are related by a rotation of θ about the Z axis, so equations (1) and (2) are expressed in the coordinate system of the range finder 5 by the following equation.
[0017]
[Equation 1]
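Equation (4) is reproduced only as an image in the original. Assuming the standard planar rotation by θ about the Z axis suggested by FIG. 2, the change of frame would take roughly the following form; this is an illustrative sketch only, not the patent's exact expression (which is equations (1) and (2) substituted into such a rotation).

```python
import math

def to_rangefinder_frame(xb, yb, theta):
    """Rotate object-frame coordinates (x^b, y^b) by theta about the
    Z axis into the range finder frame (x^r, y^r). This is the usual
    planar rotation inferred from FIG. 2, shown for illustration."""
    xr = xb * math.cos(theta) - yb * math.sin(theta)
    yr = xb * math.sin(theta) + yb * math.cos(theta)
    return xr, yr

# A 90-degree rotation sends (1, 0) to (0, 1).
xr, yr = to_rangefinder_frame(1.0, 0.0, math.pi / 2)
print(round(xr, 9), round(yr, 9))  # -> 0.0 1.0
```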
Here x^r(t_i) and y^r(t_i) are the position data of the measured object 1 measured by the range finder 5, and the measurement times t_i are known, so equation (4) is a nonlinear equation with nine unknown parameters: θ, a_x, a_y, ω_x, ω_y, φ_x, φ_y, b_x, and b_y. An approximate solution for these parameters can therefore be obtained from nine measurement points by a numerical method. However, equation (4) contains periodic functions such as sin(), so depending on the choice of initial values there is no guarantee that the approximate solution converges to the true values. Therefore, the parameters other than ω_x and ω_y are eliminated analytically from equation (4) by the following procedure, equations in ω_x and ω_y are derived, and the motion parameters are estimated by examining the numerical solutions of these equations.
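The convergence risk mentioned here can be demonstrated with a plain Newton-Raphson iteration on a periodic function: different initial values converge to different roots, so nothing guarantees landing on the true parameter value. A minimal demonstration, using sin(ω) as a stand-in for the periodic terms in equation (4):

```python
import math

def newton(f, df, x0, tol=1e-10, max_iter=100):
    """Plain Newton-Raphson iteration: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    return x

f = lambda w: math.sin(w)   # periodic: infinitely many roots
df = lambda w: math.cos(w)

# Different initial values converge to different roots of sin(w) = 0.
r1 = newton(f, df, 0.5)   # converges to 0
r2 = newton(f, df, 2.5)   # converges to pi
print(r1, r2)
```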
First, b_x and b_y are eliminated from equation (4) by using x(t_0) and y(t_0).
[0020]
[Equation 2]
Here, P_i, A, and G_i are defined as follows.
[0022]
[Equation 3]
Equation (4) is thereby transformed into the following equation (8):
P_i = A G_i   (8)
Further, by the addition theorem, G_i is transformed into the following equation (9).
[0025]
[Equation 4]
Here, s_xi, s_yi, c_xi, and c_yi are defined by the following equations.
[0027]
[Equation 5]
Then, A is eliminated using P_1, P_3, G_1, and G_3 to obtain equation (10):
G_1 P_1^{-1} = G_3 P_3^{-1}   (10)
P_1^{-1} and P_3^{-1} are replaced as shown in equation (11).
[0030]
[Equation 6]
Next, from equation (10), equations (12) to (15) relating to tan φ_x and tan φ_y are obtained:
tan φ_x = −(q_x1 s_x1 + q_y1 s_x2 − q_x3 s_x3 − q_y3 s_x4) / (q_x1 c_x1 + q_y1 c_x2 − q_x3 c_x3 − q_y3 c_x4)   (12)
tan φ_x = −(q_x2 s_x1 + q_y2 s_x2 − q_x4 s_x3 − q_y4 s_x4) / (q_x2 c_x1 + q_y2 c_x2 − q_x4 c_x3 − q_y4 c_x4)   (13)
tan φ_y = −(q_x1 s_y1 + q_y1 s_y2 − q_x3 s_y3 − q_y3 s_y4) / (q_x1 c_y1 + q_y1 c_y2 − q_x3 c_y3 − q_y3 c_y4)   (14)
tan φ_y = −(q_x2 s_y1 + q_y2 s_y2 − q_x4 s_y3 − q_y4 s_y4) / (q_x2 c_y1 + q_y2 c_y2 − q_x4 c_y3 − q_y4 c_y4)   (15)
Next, equations (12) and (13), and equations (14) and (15), are combined to eliminate φ_x and φ_y, giving equations (16) and (17):
(q_x1 s_x1 + q_y1 s_x2 − q_x3 s_x3 − q_y3 s_x4)(q_x2 c_x1 + q_y2 c_x2 − q_x4 c_x3 − q_y4 c_x4) = (q_x2 s_x1 + q_y2 s_x2 − q_x4 s_x3 − q_y4 s_x4)(q_x1 c_x1 + q_y1 c_x2 − q_x3 c_x3 − q_y3 c_x4)   (16)
(q_x1 s_y1 + q_y1 s_y2 − q_x3 s_y3 − q_y3 s_y4)(q_x2 c_y1 + q_y2 c_y2 − q_x4 c_y3 − q_y4 c_y4) = (q_x2 s_y1 + q_y2 s_y2 − q_x4 s_y3 − q_y4 s_y4)(q_x1 c_y1 + q_y1 c_y2 − q_x3 c_y3 − q_y3 c_y4)   (17)
Instead of solving equations (16) and (17) directly, the left side minus the right side of each is defined as F(ω_x) and F(ω_y), giving equations (18) and (19), and the values ω_x and ω_y for which F(ω_x) = 0 and F(ω_y) = 0 are sought:
F(ω_x) = (q_x1 s_x1 + q_y1 s_x2 − q_x3 s_x3 − q_y3 s_x4)(q_x2 c_x1 + q_y2 c_x2 − q_x4 c_x3 − q_y4 c_x4) − (q_x2 s_x1 + q_y2 s_x2 − q_x4 s_x3 − q_y4 s_x4)(q_x1 c_x1 + q_y1 c_x2 − q_x3 c_x3 − q_y3 c_x4)   (18)
F(ω_y) = (q_x1 s_y1 + q_y1 s_y2 − q_x3 s_y3 − q_y3 s_y4)(q_x2 c_y1 + q_y2 c_y2 − q_x4 c_y3 − q_y4 c_y4) − (q_x2 s_y1 + q_y2 s_y2 − q_x4 s_y3 − q_y4 s_y4)(q_x1 c_y1 + q_y1 c_y2 − q_x3 c_y3 − q_y3 c_y4)   (19)
Equations (18) and (19) are equations in ω_x and ω_y respectively, but exchanging ω_x and ω_y yields the same equation.
This corresponds to the fact that, with θ unknown, equation (5) still holds if θ is replaced by π/2 − θ while a_x and a_y, and ω_x and ω_y, are exchanged: rotating the coordinate system of the range finder 5 by π/2 relative to the coordinate system of the measured object 1 has no effect on the frequencies.
Therefore, instead of obtaining ω_x and ω_y separately from equations (18) and (19), F(ω_x) = 0 in equation (18) is replaced by F(ω) = 0, its two solutions closest to 0 are obtained, and these are taken as ω_x and ω_y. Once ω_x and ω_y are obtained, the other motion parameters are obtained by reversing the above procedure.
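The reverse step is considerably easier than the frequency search, because once ω is fixed the model x(t) = a sin(ωt + φ) + b = (a cos φ) sin(ωt) + (a sin φ) cos(ωt) + b is linear in the unknowns a cos φ, a sin φ, and b. The following sketch illustrates this idea with three measurements; it is an illustration of the principle, not the patent's exact reverse procedure, and the parameter values are hypothetical.

```python
import math

def solve3(M, v):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with
    partial pivoting."""
    A = [row[:] + [v[i]] for i, row in enumerate(M)]
    for col in range(3):
        p = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        for r in range(3):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][3] / A[i][i] for i in range(3)]

def recover_params(omega, samples):
    """Given omega and three (t, x) measurements of
    x(t) = a*sin(omega*t + phi) + b, recover (a, phi, b) by writing
    the model as A*sin(omega*t) + B*cos(omega*t) + b, which is linear
    in A = a*cos(phi), B = a*sin(phi), and b."""
    M = [[math.sin(omega * t), math.cos(omega * t), 1.0] for t, _ in samples]
    A, B, b = solve3(M, [x for _, x in samples])
    return math.hypot(A, B), math.atan2(B, A), b

# Synthetic check with hypothetical a=0.5, phi=0.3, b=1.0, omega=2.0.
omega = 2.0
samples = [(t, 0.5 * math.sin(omega * t + 0.3) + 1.0)
           for t in (0.1, 0.5, 0.9)]
a, phi, b = recover_params(omega, samples)
print(round(a, 6), round(phi, 6), round(b, 6))  # -> 0.5 0.3 1.0
```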
Solving F (ω) = 0 in the equation (18),
The NewtonRaphson method is used to obtain ω _{x} and ω _{y} , but ω may not converge to the true value depending on how the initial values are set.
Take steps to differentiate (ω) and converge the solution.
In FIG. 4, F (ω) = 0 is solved by following a procedure in which F (ω) is differentiated and the solution converges.
6 is a flowchart showing a process of estimating a motion parameter of a and predicting a future position.
FIG. 5 is an explanatory diagram for obtaining a solution in accordance with the flowchart shown in FIG. 4 for F (ω) = 0. FIG. 5A shows F (ω), and FIG. 5B shows F ′.
(Ω), and FIG. 5C shows F ″ (ω).
As shown in FIG. 4, the position data x^r(t_i), y^r(t_i) at times t_i are measured (step 401), and P_xi, P_yi, and t_i (i = 1 to 4) are substituted into equation (18) relating to the frequency ω (step 402). An appropriate initial value is set as ω_0, and the parameters n and j are cleared (step 403). A first approximate solution ω_1^n of F^n(ω) is obtained by the Newton-Raphson method (step 404). ω_1^n is compared with ω_0 (step 405); when ω_1^n < ω_0 is not satisfied (the cases of FIGS. 5(a) and 5(b)), F^n(ω) is differentiated (step 406).
When ω_1^n < ω_0 is satisfied in step 405 (the case of FIG. 5(c)), a convergent solution ω^{n*} of F^n(ω) is obtained (step 407). Then ω^{n*} − δ (δ > 0) is taken as the initial value and n is decreased by 1 (step 408), and it is determined whether n < 0 (step 409). The processing of steps 407 and 408 is repeated until n < 0 is satisfied.
When n < 0 in step 409, the convergent solution ω^{0*} of F(ω) is stored in ω_0(j) and j is increased by 1 (step 410), and it is determined whether ω^{0*} = 0 (step 411). If ω^{0*} ≠ 0, then with ω^{0*} − δ as the initial value, 0 is substituted for n (step 412) and the process returns to step 404.
If ω^{0*} = 0 in step 411, ω_0(j−1) and ω_0(j−2) are taken as ω_x and ω_y, respectively (step 413). When ω_x and ω_y have been obtained in this way, the procedure leading to equation (18) is reversed, and θ, a_x, a_y, φ_x, φ_y, b_x, and b_y are obtained (step 414). When the parameters ω_x, ω_y, θ, a_x, a_y, φ_x, φ_y, b_x, and b_y have thus been obtained, the coordinates x_f and y_f of the center of gravity P of the measured object 1 at the desired time t_f are calculated by equations (1) and (2) (step 415).
[0043] Describing this with reference to FIG. 5: starting from the initial value ω_0, the two solutions of F(ω) = 0 closest to 0 are determined as ω^{0*}, and these are ω_x and ω_y.
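The restart-with-(ω^{0*} − δ) search of steps 410 to 412 can be sketched as follows. For brevity this sketch omits the derivative-escalation safeguard of steps 404 to 409, and the test function F and step δ below are illustrative only, not derived from equation (18).

```python
def newton(f, x0, tol=1e-12, max_iter=200, h=1e-7):
    """Newton-Raphson with a central-difference numerical derivative."""
    x = x0
    for _ in range(max_iter):
        d = (f(x + h) - f(x - h)) / (2 * h)
        if d == 0:
            break
        step = f(x) / d
        x -= step
        if abs(step) < tol:
            return x
    return None

def roots_toward_zero(F, omega0, delta):
    """Collect roots of F(omega) = 0 by repeated Newton-Raphson,
    restarting each search at (previous root - delta), in the spirit
    of steps 410-412 of FIG. 4, stopping when no new root below the
    previous one is found. Returns roots in decreasing order."""
    roots, start = [], omega0
    while start > 0:
        r = newton(F, start)
        if r is None or r <= 0 or (roots and r >= roots[-1] - 1e-6):
            break
        roots.append(r)
        start = r - delta
    return roots

# Toy F(omega) with known positive roots at 1.0 and 2.0 (illustrative).
F = lambda w: (w - 1.0) * (w - 2.0)
rs = roots_toward_zero(F, 3.0, delta=0.6)
# The two roots closest to zero play the role of omega_x, omega_y.
print([round(r, 6) for r in rs])  # -> [2.0, 1.0]
```

The choice of δ matters: too small a step can restart the search inside the basin of the root just found, which is one reason the patent's flowchart adds the derivative-based safeguard.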
In the same manner as the above procedure, the parameters relating to z^b(t_i) in equation (3) are obtained, and the coordinates x_f, y_f, and z_f of the center of gravity P of the measured object 1 at time t_f are calculated from the obtained parameters.
As described above, in this embodiment, the object 1 undergoing simple vibration is imaged by the range finder 5, the image data are analyzed to estimate the motion parameters, and the future position of the measured object 1 can thereby be predicted.
[0046]
As described above in detail, according to the present invention, it is possible to estimate the motion parameters of an object undergoing simple vibration and to predict the future position of the measured object.
FIG. 1 is a diagram showing a schematic configuration of a system using a three-dimensional motion prediction device according to an embodiment of the present invention.
FIG. 2 is a diagram showing the relationship between the absolute coordinate system, the coordinate system of the range finder 5, and the coordinate system of the measured object 1.
FIG. 3 is a diagram showing a time history of the X coordinate component of the center of gravity P of the measured object 1.
FIG. 4 is a flowchart showing the processing of this embodiment.
FIG. 5 is an explanatory diagram for obtaining a solution of F (ω) = 0.
1 ... Object to be measured 3 ... Three-dimensional motion prediction device 5 ... Range finder 9 ... Motion parameter estimation unit 11 ... Position prediction unit 13 ... Robot 15 ... Manipulator
Continuation of front page: (72) Inventor Kazuo Nakazawa, 31116 Hiyoshi, Kohoku Ward, Yokohama City, Kanagawa
Claims (2)
Priority Applications (1)
Application Number  Priority Date  Filing Date  Title 

JP18677893A JPH0719818A (en)  1993-06-30  1993-06-30  Three-dimensional movement predicting device 
Publications (1)
Publication Number  Publication Date 

JPH0719818A true JPH0719818A (en)  1995-01-20 
Family
ID=16194445
Family Applications (1)
Application Number  Title  Priority Date  Filing Date 

JP18677893A Pending JPH0719818A (en)  1993-06-30  1993-06-30  Three-dimensional movement predicting device 
Country Status (1)
Country  Link 

JP (1)  JPH0719818A (en) 

1993
 1993-06-30 JP JP18677893A patent/JPH0719818A/en active Pending
Cited By (13)
Publication number  Priority date  Publication date  Assignee  Title 

JP2007240344A (en) *  2006-03-09  2007-09-20  Fujitsu Ltd  Dynamic shape measuring method and dynamic shape measuring device 
JP2008058221A (en) *  2006-09-01  2008-03-13  Kobe Univ  Velocity of high-speed moving body, estimation method of position, estimation program, and estimation system 
US8140188B2 (en)  2008-02-18  2012-03-20  Toyota Motor Engineering & Manufacturing North America, Inc.  Robotic system and method for observing, learning, and supporting human activities 
WO2012153629A1 (en) *  2011-05-12  2012-11-15  株式会社Ihi  Device and method for controlling prediction of motion 
JP2012236254A (en) *  2011-05-12  2012-12-06  Ihi Corp  Device and method for holding moving body 
CN103517789A (en) *  2011-05-12  2014-01-15  株式会社Ihi  Device and method for controlling prediction of motion 
US9108321B2 (en)  2011-05-12  2015-08-18  Ihi Corporation  Motion prediction control device and method 
JP2012247835A (en) *  2011-05-25  2012-12-13  Ihi Corp  Robot movement prediction control method and device 
JP2012245568A (en) *  2011-05-25  2012-12-13  Ihi Corp  Device and method for controlling and predicting motion 
JP2013240847A (en) *  2012-05-18  2013-12-05  Ihi Corp  Robot hand device, and control method 
CN108020855A (en) *  2017-11-29  2018-05-11  安徽省一一通信息科技有限公司  Pose and instantaneous center of rotation combined estimation method for a glide-steering robot 
CN108020855B (en) *  2017-11-29  2020-01-31  安徽省一一通信息科技有限公司  Posture and rotation instantaneous center joint estimation method for skid-steer robot 
WO2020032211A1 (en) *  2018-08-10  2020-02-13  川崎重工業株式会社  Data generating device, data generating method, data generating program, and remote operation system 