CN101344968A - Movement compensation method for star sky background image


Info

Publication number
CN101344968A
CN101344968A
Authority
CN
China
Prior art keywords
motion
sample
estimation
point
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2008101507817A
Other languages
Chinese (zh)
Inventor
张艳宁
孙瑾秋
姜磊
段锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CNA2008101507817A priority Critical patent/CN101344968A/en
Publication of CN101344968A publication Critical patent/CN101344968A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a motion compensation method for starry-sky background images. First, based on the gray-scale and motion characteristics of the starry-sky background image, moving-background compensation is performed through two key stages: sample motion-parameter estimation and global motion-parameter estimation. Outliers are rejected directly during sample motion-vector estimation, which reduces the amount of redundant information processed by the method. Second, a combination of block matching and centroid extraction is introduced in the sample motion-parameter estimation stage. Block matching raises the speed of sample estimation, outlier rejection further reduces the amount of data to be processed, and centroid extraction improves the precision of the sample-point parameter estimates. By combining block matching with centroid extraction to achieve motion compensation against a starry-sky background, the method lowers the computational load by 50 percent and improves the compensation precision from the 1 pixel of the prior art to 0.5 pixels.

Description

Motion compensation method for starry-sky background images
Technical field
The present invention relates to an image processing method, and in particular to a motion compensation method for starry-sky background images.
Background technology
The detection of dim, small targets against a compound moving background directly determines the operating range and detection performance of an astronomical observation system, and moving-background compensation is one of the key techniques of dim-target detection.
Document " based on the movement background compensation technique of robust regression; signal Processing; 2002; 18 (1): 36-38 " discloses a kind of estimation of carrying out the sample point kinematic parameter based on the method for light stream, and employing iterates the estimation that weighted least-squares method is carried out the global motion parameter, the error of the global motion parameter estimation of having avoided impact point to be taken as sampled point and having introduced, but wherein asking for of weight function increased operand greatly, reduce the speed of motion compensation, also can only realize the compensation precision of 1 pixel.
Summary of the invention
To overcome the low speed and low precision of prior-art motion compensation, the invention provides a motion compensation method for starry-sky background images. A block-matching-based approach effectively raises the speed of sample estimation, outlier rejection reduces the amount of data to be processed, and a centroid-extraction-based method improves the precision of the sample-point parameter estimates.
The technical solution adopted by the invention to solve the technical problem is a motion compensation method for starry-sky background images, characterized by comprising the following steps:
(a) determining the effective region between adjacent frames, dividing the image evenly, selecting sample points within the resulting sub-blocks, estimating the sample-point motion vectors by combining block matching with the centroid method, removing redundant information through outlier rejection, and computing the motion vectors of the sample features;
(b) adopting a six-parameter model based on parallel projection to characterize the camera motion of the global motion model;
(c) defining the set of sample points, computing the motion parameters of all sample points, estimating the velocity field from the constructed camera-motion-model parameters, and obtaining the parameters of the global motion model by solving for the optimal solution of the global motion-model parameters;
(d) compensating the later of the two adjacent frames by bilinear interpolation.
The beneficial effects of the invention are as follows: block matching effectively raises the speed of sample estimation, outlier rejection further reduces the amount of data to be processed, and centroid extraction further improves the precision of the sample-point parameter estimates. By combining block matching with centroid extraction to realize motion compensation against a starry-sky background, the invention reduces the computational load by 50% and improves the compensation precision from the original 1 pixel to 0.5 pixels.
The invention is described in detail below with reference to the drawings and embodiments.
Description of drawings
The accompanying drawing is a flowchart of the motion compensation method for starry-sky background images according to the present invention.
Embodiment
Referring to the accompanying drawing. 1. First, sample-feature estimation is carried out.
The deep-space background appears black in the image, while fixed stars resemble dim, small moving targets and appear as point sources; the image also contains considerable noise originating from the detection device itself. Taking full account of the image characteristics and of moving-target detection against a starry-sky background, outliers are rejected directly in the sample-selection stage. This avoids using iteratively reweighted least squares to solve for the global parameters in the camera-motion-model parameter-estimation stage, dispenses with solving for the weight function, simplifies the procedure for obtaining the global parameters, effectively shortens the background-compensation time, and reduces the computational load.
The basic procedure of the sample-feature estimation stage is as follows:
(a) Determining the effective region between adjacent frames.
Because of boundary effects, when background motion occurs in the video sequence there is relative motion of the background between adjacent frames, and the images cannot overlap completely. Only if the sample points are chosen within the common region can their motion parameters be estimated effectively. Therefore, the effective region between two adjacent frames must be determined first.
The abscissa of the effective region is chosen in the range 16 ≤ x ≤ Lx − 16 and the ordinate in the range 16 ≤ y ≤ Ly − 16, where 16 indicates that the border width is set to 16 pixels, and Lx and Ly denote the image frame dimensions. The effective region of adjacent frames is thus obtained.
(b) Based on the idea of uniform sampling, the image is divided evenly into 9 blocks, and sample points are selected within the sub-blocks.
To make the early selection of sample points more representative and dispersed, so that the global motion-model parameter estimation is more accurate, and to avoid selecting so many sample points that excessive processing time is consumed, the invention borrows the idea of uniform sampling.
The current-frame image is divided evenly into 9 blocks. Borrowing the concept of stellar magnitude from astronomy, a global search is made for sample points whose gray value exceeds a given threshold, here set to 180. Points in each sub-block whose gray value exceeds 180 are taken as the feature points of interest; the gray value and the horizontal and vertical coordinates of the first such sample point found are recorded, and that point is taken as the selected sample point.
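As an illustration, this sampling step might be sketched as follows (the 3 × 3 grid, the 16-pixel border, and the gray threshold of 180 follow the text; representing the frame as a plain 2-D list of 8-bit gray values and the function name are assumptions of the sketch):

```python
def select_sample_points(frame, threshold=180, border=16):
    """Pick one bright sample point per block of a 3x3 grid.

    Only the effective region [border, H-border) x [border, W-border)
    is searched; the first pixel above `threshold` found in each block
    is recorded as (x, y, gray).
    """
    h, w = len(frame), len(frame[0])
    y_lo, y_hi = border, h - border
    x_lo, x_hi = border, w - border
    bh, bw = (y_hi - y_lo) // 3, (x_hi - x_lo) // 3
    samples = []
    for by in range(3):
        for bx in range(3):
            found = None
            for y in range(y_lo + by * bh, y_lo + (by + 1) * bh):
                for x in range(x_lo + bx * bw, x_lo + (bx + 1) * bw):
                    if frame[y][x] > threshold:
                        found = (x, y, frame[y][x])
                        break
                if found:
                    break
            if found:
                samples.append(found)
    return samples
```

Sub-blocks that contain no pixel above the threshold simply contribute no sample point.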
(c) Feature matching and motion-vector estimation.
The invention estimates the sample-point motion vectors using a technique that combines block matching with the centroid method: the minimum mean absolute difference (MAD) matching criterion is adopted, and a three-step search is used to find the best-matching block, with the block size set to 16 × 16. Meanwhile, to reduce the number of operations at each candidate position during block matching while maintaining matching precision, a successive elimination algorithm is introduced, and the registration precision is improved from 1 pixel to 0.5 pixels.
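A minimal sketch of this block-matching step (MAD criterion, three-step search with step sizes 4/2/1, 16 × 16 blocks, all from the text; the successive-elimination speed-up is omitted for brevity, and the function names are illustrative):

```python
def mad(cur, ref, cx, cy, rx, ry, n=16):
    """Mean absolute difference between the n x n block of `cur` with
    top-left corner (cx, cy) and the block of `ref` at (rx, ry)."""
    total = 0
    for dy in range(n):
        for dx in range(n):
            total += abs(cur[cy + dy][cx + dx] - ref[ry + dy][rx + dx])
    return total / (n * n)

def three_step_search(cur, ref, cx, cy, n=16):
    """Classic three-step search: at step sizes 4, 2, 1, probe the 8
    neighbours of the current best offset, keep the minimum-MAD
    candidate, and halve the step. Returns the motion vector (u, v)."""
    best_u = best_v = 0
    best = mad(cur, ref, cx, cy, cx, cy, n)
    for step in (4, 2, 1):
        base_u, base_v = best_u, best_v
        for du in (-step, 0, step):
            for dv in (-step, 0, step):
                u, v = base_u + du, base_v + dv
                rx, ry = cx + u, cy + v
                if 0 <= rx and 0 <= ry and rx + n <= len(ref[0]) and ry + n <= len(ref):
                    cost = mad(cur, ref, cx, cy, rx, ry, n)
                    if cost < best:
                        best, best_u, best_v = cost, u, v
    return best_u, best_v
```

The three-step search evaluates at most 25 candidate positions instead of the 225 of a full ±7-pixel search, which is the speed advantage the text attributes to block matching.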
(d) Outlier rejection.
Against a starry-sky background, both moving targets and background stars appear as spots in the image, but targets and background have different motion characteristics; if a moving target is chosen as a sample point, it introduces a large error into the global motion-parameter estimate. Moreover, because the deep-space background appears as black regions in the image, if such a point is chosen as a sample point, a best-matching block may not be found during search matching. We call these points outliers.
To keep outliers from introducing large errors into the global motion estimate, they are rejected at this stage. Take an 8-bit image of 1024 × 1024 as an example. First, points with gray value greater than 180 are sought, and points with gray value less than 180 are rejected as outliers. Second, the estimated motion vector is checked: if it varies within a range of 3 pixels, processing continues; otherwise, the point is marked as an outlier and rejected, and a new feature point is sought in the same sub-block for motion-vector estimation. Rejecting outliers by comparing the motion-vector magnitudes of the sample point and a reference point effectively avoids the iteratively reweighted least-squares outlier rejection mentioned in the background art, reduces the computational load by 50%, and improves computational efficiency.
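The two rejection tests described above could be sketched as follows (the gray threshold of 180 and the 3-pixel motion limit come from the text; the data layout and function name are assumptions):

```python
def reject_outliers(samples, vectors, threshold=180, max_motion=3):
    """Keep only sample points that pass both outlier tests: gray value
    above `threshold`, and estimated motion vector within `max_motion`
    pixels in each component.

    samples: list of (x, y, gray); vectors: matching list of (u, v).
    Returns the surviving (sample, vector) pairs.
    """
    kept = []
    for (x, y, gray), (u, v) in zip(samples, vectors):
        if gray <= threshold:
            continue  # dark point: deep-space background, unmatchable
        if abs(u) > max_motion or abs(v) > max_motion:
            continue  # implausibly large motion: likely a moving target
        kept.append(((x, y, gray), (u, v)))
    return kept
```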
(e) Accurate estimation of the sample-feature motion parameters.
After outlier rejection, the selected sample points all lie on fixed stars. First, the region around the sample point in frame 1 is labeled by pixel thresholding to determine the boundary of the star spot; the centroid is then computed with the moment-weighting method, as in formula (1), where x and y are the pixel coordinates, F(x, y) is the gray value at (x, y), m and n are the window dimensions, and (x0, y0) is the resulting centroid coordinate.
x0 = [ Σx=1..m Σy=1..n F²(x, y)·x ] / [ Σx=1..m Σy=1..n F²(x, y) ]
y0 = [ Σx=1..m Σy=1..n F²(x, y)·y ] / [ Σx=1..m Σy=1..n F²(x, y) ]    (1)
The motion vector (u, v) obtained by the block-matching algorithm then gives the position of this sample point in frame 2, namely (x+u, y+v), and the centroid position of the star after the motion, (x0′, y0′), is computed. Hence (û, v̂) = (x0′ − x0, y0′ − y0) is the accurately estimated sample-feature motion vector.
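A sketch of the centroid refinement of formula (1) and the resulting sub-pixel motion vector (the star-spot boundary labeling is simplified here to a fixed m × n window, and the function names are illustrative):

```python
def centroid(frame, x0, y0, m, n):
    """Squared-gray-weighted centroid (formula (1)) over the m x n
    window whose top-left corner is (x0, y0)."""
    num_x = num_y = den = 0.0
    for dy in range(n):
        for dx in range(m):
            w = float(frame[y0 + dy][x0 + dx]) ** 2
            num_x += w * (x0 + dx)
            num_y += w * (y0 + dy)
            den += w
    return num_x / den, num_y / den

def refined_vector(frame1, frame2, x, y, u, v, m=16, n=16):
    """Sub-pixel motion vector: centroid of the star spot in frame 1
    subtracted from the centroid of the matched window in frame 2
    (shifted by the block-matching vector (u, v))."""
    cx1, cy1 = centroid(frame1, x, y, m, n)
    cx2, cy2 = centroid(frame2, x + u, y + v, m, n)
    return cx2 - cx1, cy2 - cy1
```

Because the centroids are real-valued, the refined vector carries the sub-pixel precision that integer block matching alone cannot provide.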
2. Camera motion model.
Computing the global motion-model parameters of the image requires a camera motion model that characterizes the global motion. Considering computational complexity and algorithmic validity, the invention adopts a six-parameter model based on parallel projection.
When the image undergoes motions such as rotation and translation, the system can use motion-model parameter estimation to describe the motion of the three-dimensional target. A typical motion-parameter estimation model is given by formula (2).
Suppose that at time tk a point of the imaged rigid body has coordinates (X, Y, Z), and at time tk+1 this point moves to (X′, Y′, Z′). This motion can be expressed as the composition of a rotation described by a matrix R and a translation represented by a vector T:
[X′]   [r11 r12 r13] [X]   [t1]
[Y′] = [r21 r22 r23] [Y] + [t2]    (2)
[Z′]   [r31 r32 r33] [Z]   [t3]
Under parallel projection, the relation between the coordinates (x, y) of a target point in the imaging plane and the coordinates (X, Y, Z) of the three-dimensional target point is given by formula (3):
(x, y) = (X, Y),  (x′, y′) = (X′, Y′)    (3)
Formula (4) is obtained from formulas (2) and (3):
x′ = r11·x + r12·y + (r13·Z + t1)
y′ = r21·x + r22·y + (r23·Z + t2)    (4)
This can further be expressed as formula (5):
x′ = a1·x + a2·y + a3
y′ = a4·x + a5·y + a6    (5)
where a1 = r11, a2 = r12, a4 = r21, a5 = r22, a3 = r13·Z + t1, a6 = r23·Z + t2.
Therefore, the global motion-model parameters are obtained simply by finding the optimal [a1, a2, a3, a4, a5, a6].
3. Solving for the global motion-model parameters.
As in formula (6), let S denote the set of sample points and let V̂ = { V̂1, …, V̂s, …, V̂n | s ∈ S } denote the set of motion parameters computed for all sample points; Û(s, A) denotes the velocity field at sample point s estimated by the camera motion model described by formula (5), as in formula (7).
V̂ = { V̂1, …, V̂s, …, V̂n | s ∈ S }    (6)
Û(s, A) = ( ûx(s, A), ûy(s, A) ) = ( a1·xs + a2·ys + a3, a4·xs + a5·ys + a6 )    (7)
Solving the global motion-model parameters amounts to finding the optimal model parameters Copt, i.e. minimizing the model error, as in formula (8):
Copt = arg min{A ∈ R⁶} Σ{s ∈ S} ρ( V̂s − Û(s, A) )    (8)
where ρ(·) is the estimation function; the least-squares method is adopted.
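Under the least-squares choice of ρ, solving formula (8) reduces to an ordinary linear least-squares problem in [a1..a6]: each surviving sample point contributes two linear equations from formula (5). A sketch (assuming NumPy is available; the function name is illustrative):

```python
import numpy as np

def fit_global_motion(points, vectors):
    """Least-squares fit of the six-parameter model (5).

    Each sample point (x, y) with measured motion vector (u, v)
    contributes x' = a1*x + a2*y + a3 and y' = a4*x + a5*y + a6,
    where (x', y') = (x + u, y + v). Returns [a1, ..., a6].
    """
    rows, rhs = [], []
    for (x, y), (u, v) in zip(points, vectors):
        rows.append([x, y, 1, 0, 0, 0]); rhs.append(x + u)
        rows.append([0, 0, 0, x, y, 1]); rhs.append(y + v)
    a, *_ = np.linalg.lstsq(np.asarray(rows, dtype=float),
                            np.asarray(rhs, dtype=float), rcond=None)
    return a
```

At least three non-collinear sample points are needed for the six parameters to be determined; the outlier rejection of step (d) is what makes this plain least-squares fit viable without the iteratively reweighted scheme of the background art.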
4. Moving-background compensation.
Once the camera-model parameters [a1, a2, a3, a4, a5, a6] have been estimated as the global motion-model parameters, the previous frame is taken as the reference and the background of the following frame is motion-compensated.
Bilinear interpolation is adopted: I(x+u, y+v, k+1) is formed by combining the brightness values of the four pixels nearest the position (x+u, y+v), and the motion-compensated image is obtained from formula (9).
I(x+u, y+v, k+1) = (1−αx)(1−αy)·I(x0, y0, k+1) + αx(1−αy)·I(x0+1, y0, k+1)
                 + (1−αx)·αy·I(x0, y0+1, k+1) + αx·αy·I(x0+1, y0+1, k+1)    (9)
where x0 is the integer part of x+u and αx its fractional part, and y0 is the integer part of y+v and αy its fractional part.
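The interpolation of formula (9), together with the warp by the six-parameter model, might be sketched as follows (filling out-of-range positions with 0 is an assumption of the sketch, as are the function names):

```python
import math

def bilinear_sample(img, xf, yf):
    """Brightness at the fractional position (xf, yf), formula (9)."""
    x0, y0 = int(math.floor(xf)), int(math.floor(yf))
    ax, ay = xf - x0, yf - y0
    return ((1 - ax) * (1 - ay) * img[y0][x0]
            + ax * (1 - ay) * img[y0][x0 + 1]
            + (1 - ax) * ay * img[y0 + 1][x0]
            + ax * ay * img[y0 + 1][x0 + 1])

def compensate(img, a):
    """Warp the later frame back onto the reference frame using the
    six-parameter model [a1..a6]: pixel (x, y) of the output is sampled
    at (a1*x + a2*y + a3, a4*x + a5*y + a6) in `img`."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            xf = a[0] * x + a[1] * y + a[2]
            yf = a[3] * x + a[4] * y + a[5]
            if 0 <= xf < w - 1 and 0 <= yf < h - 1:
                out[y][x] = bilinear_sample(img, xf, yf)
    return out
```

Because the warp positions are fractional, the bilinear combination of the four neighbouring pixels is what allows the compensated background to line up to the claimed 0.5-pixel precision.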

Claims (1)

1. A motion compensation method for starry-sky background images, characterized by comprising the steps of:
(a) determining the effective region between adjacent frames, dividing the image evenly, selecting sample points within the resulting sub-blocks, estimating the sample-point motion vectors by combining block matching with the centroid method, removing redundant information through outlier rejection, and computing the motion vectors of the sample features;
(b) adopting a six-parameter model based on parallel projection to characterize the camera motion of the global motion model;
(c) defining the set of sample points, computing the motion parameters of all sample points, estimating the velocity field from the constructed camera-motion-model parameters, and obtaining the parameters of the global motion model by solving for the optimal solution of the global motion-model parameters;
(d) compensating the later of the two adjacent frames by bilinear interpolation.
CNA2008101507817A 2008-09-02 2008-09-02 Movement compensation method for star sky background image Pending CN101344968A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNA2008101507817A CN101344968A (en) 2008-09-02 2008-09-02 Movement compensation method for star sky background image


Publications (1)

Publication Number Publication Date
CN101344968A true CN101344968A (en) 2009-01-14

Family

ID=40246967

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2008101507817A Pending CN101344968A (en) 2008-09-02 2008-09-02 Movement compensation method for star sky background image

Country Status (1)

Country Link
CN (1) CN101344968A (en)


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102063615A (en) * 2010-12-29 2011-05-18 哈尔滨工业大学 Beacon light optimizing and recognizing denoising method based on spot noise distribution topological characteristic
CN102163334A (en) * 2011-03-04 2011-08-24 北京航空航天大学 Method for extracting video object under dynamic background based on fisher linear discriminant analysis
CN103955930A (en) * 2014-04-28 2014-07-30 中国人民解放军理工大学 Motion parameter estimation method based on gray integral projection cross-correlation function characteristics
CN103955930B (en) * 2014-04-28 2017-01-18 中国人民解放军理工大学 Motion parameter estimation method based on gray integral projection cross-correlation function characteristics
CN104966080A (en) * 2015-07-27 2015-10-07 广东东软学院 Sea surface monitoring sequence infrared image small target determination method and device
CN104966080B (en) * 2015-07-27 2018-11-23 广东东软学院 A kind of sea monitoring data sequent infrared image Weak target determination method and device
CN106470342B (en) * 2015-08-14 2020-01-17 展讯通信(上海)有限公司 Global motion estimation method and device
CN106470342A (en) * 2015-08-14 2017-03-01 展讯通信(上海)有限公司 Global motion estimating method and device
CN106791845A (en) * 2017-01-17 2017-05-31 湖南优象科技有限公司 A kind of quick parallax method of estimation for multi-view image coding
CN106791845B (en) * 2017-01-17 2019-06-14 湖南优象科技有限公司 A kind of quick parallax estimation method for multi-view image coding
CN109472810A (en) * 2018-07-10 2019-03-15 湖南科技大学 A kind of glacial ice velocities visual extraction method based on remote sensing images
CN113112406A (en) * 2021-04-12 2021-07-13 南方科技大学 Feature determination method and device, electronic equipment and storage medium
CN113112406B (en) * 2021-04-12 2023-01-31 山东迈科显微生物科技有限公司 Feature determination method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN101344968A (en) Movement compensation method for star sky background image
CN110796010B (en) Video image stabilizing method combining optical flow method and Kalman filtering
US7755619B2 (en) Automatic 3D face-modeling from video
EP2570993B1 (en) Egomotion estimation system and method
JP4467838B2 (en) Image recognition apparatus and image recognition method
CN110823358B (en) Building vibration displacement measurement method based on visual processing
US8395659B2 (en) Moving obstacle detection using images
CN101923717B (en) Method for accurately tracking characteristic points of quick movement target
KR20150032789A (en) Method for estimating ego motion of an object
US10249046B2 (en) Method and apparatus for object tracking and segmentation via background tracking
KR101703515B1 (en) Apparatus and method for target tracking of image
CN101916446A (en) Gray level target tracking algorithm based on marginal information and mean shift
Hua et al. Extended guided filtering for depth map upsampling
CN101650829B (en) Method for tracing covariance matrix based on grayscale restraint
CN114693720A (en) Design method of monocular vision odometer based on unsupervised deep learning
CN104200492A (en) Automatic detecting and tracking method for aerial video target based on trajectory constraint
CN115144828A (en) Automatic online calibration method for intelligent automobile multi-sensor space-time fusion
US20140363053A1 (en) Method and device for generating a motion field for a video sequence
CN102521846A (en) Time-space domain motion segmentation and motion estimation method based on three-dimensional video
CN104200434A (en) Non-local mean image denoising method based on noise variance estimation
CN106153041A (en) A kind of visual odometry speed-measuring method based on many depth of view information
US8351653B2 (en) Distance estimation from image motion for moving obstacle detection
Crivelli et al. From optical flow to dense long term correspondences
Sun et al. MM3DGS SLAM: Multi-modal 3D Gaussian Splatting for SLAM Using Vision, Depth, and Inertial Measurements
US8922648B2 (en) Rotation cancellation for moving obstacle detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20090114