CN101556697A - Method and system for motion target tracking based on rapid characteristic points - Google Patents

Method and system for motion target tracking based on rapid characteristic points

Info

Publication number
CN101556697A
Authority
CN
China
Prior art keywords
moving target
mentioned
motion
profile
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2008100359038A
Other languages
Chinese (zh)
Other versions
CN101556697B (en)
Inventor
张慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Baokang Electronic Control Engineering Co Ltd
Original Assignee
Shanghai Baokang Electronic Control Engineering Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Baokang Electronic Control Engineering Co Ltd
Priority to CN2008100359038A
Publication of CN101556697A
Application granted
Publication of CN101556697B
Legal status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a system for moving target tracking based on fast feature points. The method comprises the steps of receiving a trigger signal; acquiring data of a previous frame and a current frame and performing a differential operation on them; obtaining the motion outline of the moving target; and segmenting that motion outline to obtain the outline of the moving target in the previous frame and in the current frame respectively. The invention differences only the data inside a virtual coil and a surrounding region of a certain width, and extracts the vehicle outline according to preset proportions, which improves computation speed and precision; the matching range is then predicted and adjusted, so that the integrity of the moving target is preserved and the target is not easily lost.

Description

A method and system for moving target tracking based on fast feature points
Technical field
The present invention relates to a method and system for video tracking of moving targets, and in particular to a method and system for moving target tracking based on fast feature points.
Background art
Cities today are developing rapidly: urban populations and vehicle numbers are rising sharply, traffic volume grows by the day, congestion is worsening, and traffic systems are under enormous pressure. Traffic problems have become a major issue in city management and seriously hamper urban development. In particular, vehicle violations can occur anywhere at any time, which makes monitoring urban traffic very difficult. Moving target video tracking technology has therefore emerged as a means of monitoring the traffic situation of vehicles.
Most current moving target video tracking techniques follow the loop of estimation, matching, and correction. When building the initial estimation model, the usual approach is to segment the two preceding and current images separately and extract features such as the moving target outline; this approach is computationally complex and its results are highly uncertain. Because the moving targets in the two frames may not both lie inside the virtual coil, the background cannot be used to obtain a good target outline. If a single-frame segmentation algorithm for still images is used instead, the extracted outline of the moving target contains large errors, the deviation of this initial value strongly affects the subsequent estimation, and the algorithm remains complex.
In addition, when matching, the estimation model generally yields a set of estimated ranges of activity for the moving target, and the common practice is to match directly within the estimated range. Real applications, however, are rarely so ideal: vehicle speeds and directions of motion vary, the target may move outside the estimated range, and matching then fails. An alternative is to adjust the range using the matching value, but this is too time-consuming to be suitable in practical applications.
Summary of the invention
The purpose of the present invention is to provide a method and system for moving target tracking based on fast feature points, which obtains the outline of a moving target quickly in order to track that target.
To achieve this goal, the present invention proposes a method for moving target tracking based on fast feature points, comprising: receiving a trigger signal; acquiring data of the previous frame and of the current frame respectively; performing a differential operation on the data of the previous and current frames to obtain the motion outline of the moving target; and segmenting that motion outline to obtain the outline of the moving target in the previous frame and in the current frame respectively.
Further, the received trigger signal is the signal generated when a moving target enters a virtual coil.
Further, the data of the previous and current frames are the data inside the virtual coil and its surrounding region.
Further, the surrounding region of the virtual coil has a certain width.
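By way of illustration only, and not as part of the original disclosure, the differencing restricted to the virtual coil and its surrounding band could be sketched as follows in Python with OpenCV. The rectangle representation of the coil, the function name, the default band width and the binarisation threshold are assumptions of this sketch, not values taken from the specification; BGR colour frames are also assumed.

    import cv2

    def motion_outline(prev_frame, curr_frame, coil, delta_x=20, thresh=25):
        """Difference only the virtual coil and a surrounding band of width delta_x.

        coil: (x, y, w, h) rectangle of the virtual coil in image coordinates.
        Returns a binary mask of the moving region Y inside that ROI, plus the ROI origin.
        """
        x, y, w, h = coil
        # Expand the coil by delta_x on every side, clipped to the image bounds.
        x0 = max(x - delta_x, 0)
        y0 = max(y - delta_x, 0)
        x1 = min(x + w + delta_x, curr_frame.shape[1])
        y1 = min(y + h + delta_x, curr_frame.shape[0])

        prev_roi = cv2.cvtColor(prev_frame[y0:y1, x0:x1], cv2.COLOR_BGR2GRAY)
        curr_roi = cv2.cvtColor(curr_frame[y0:y1, x0:x1], cv2.COLOR_BGR2GRAY)

        # Pixel-wise absolute difference; above-threshold pixels mark the moving region.
        diff = cv2.absdiff(prev_roi, curr_roi)
        _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        return mask, (x0, y0)

Returning the ROI origin alongside the mask lets later steps map outline coordinates back into the full frame.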
Further, in the segmentation step the motion outline of the moving target is cut according to a first ratio to obtain the outline of the moving target in the previous frame, and cut according to a second ratio to obtain the outline of the moving target in the current frame.
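A minimal sketch of this proportional cut, again assuming Python/numpy and a target moving vertically through the image. Which end of the differenced region Y belongs to which frame depends on the travel direction, so the orientation handling below is illustrative only.

    def split_motion_region(mask, a_pct, b_pct, top_to_bottom=True):
        """Split the differenced region Y into previous-frame and current-frame outlines.

        a_pct and b_pct are the user-configured first and second ratios; the
        top_to_bottom flag is an illustrative stand-in for the travel direction.
        """
        ys, xs = mask.nonzero()
        if ys.size == 0:
            return None, None
        y_min, y_max = ys.min(), ys.max()
        height = y_max - y_min + 1

        if top_to_bottom:
            # Illustrative assumption: the upper part of Y belongs to the previous
            # frame and the lower part to the current frame.
            prev_outline = mask[y_min: y_min + height * a_pct // 100, :]
            curr_outline = mask[y_max - height * b_pct // 100: y_max + 1, :]
        else:
            prev_outline = mask[y_max - height * a_pct // 100: y_max + 1, :]
            curr_outline = mask[y_min: y_min + height * b_pct // 100, :]
        return prev_outline, curr_outline

Calling split_motion_region(mask, 20, 15) would reproduce the 20%/15% example given later in the embodiment.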
Further, the method further comprises extracting features from the segmented moving target outlines, feeding them into an estimation filter to estimate the position of the moving target in the next frame, and using the estimate to directly enlarge the matching range, which preserves the integrity of the target outline obtained in the next frame and benefits subsequent matching precision.
Further, the method further comprises outputting the outlines of the moving target in the previous and current frames for user monitoring.
The present invention further provides a system for moving target tracking based on fast feature points, comprising: a trigger unit for receiving a trigger signal; an acquiring unit for acquiring data of the previous frame and the current frame; an arithmetic unit for performing a differential operation on the data of the previous and current frames to obtain the motion outline of the moving target; and a cutting unit for segmenting that motion outline to obtain the outline of the moving target in the previous frame and in the current frame respectively.
Further, the trigger signal received by the trigger unit is the signal generated when a moving target enters a virtual coil.
Further, the data of the previous and current frames are the data inside the virtual coil and its surrounding region.
Further, the surrounding region of the virtual coil has a certain width.
Further, the cutting unit cuts the motion outline of the moving target according to a first ratio to obtain the outline of the moving target in the previous frame, and according to a second ratio to obtain the outline of the moving target in the current frame.
Further, the system further comprises a matching unit for extracting features from the segmented moving target outlines, feeding them into an estimation filter to estimate the position of the moving target in the next frame, enlarging the estimated range, and obtaining the motion outline of the next-frame moving target through a matching algorithm.
Further, the system further comprises an output unit that outputs the outlines of the moving target in the previous and current frames to a display unit.
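The following sketch only illustrates how the units listed above could be composed around the two earlier helper functions; the class and method names are assumptions of this sketch, not taken from the specification.

    class FastFeatureTracker:
        """Illustrative composition of trigger, acquiring, arithmetic and cutting units."""

        def __init__(self, coil, delta_x, a_pct, b_pct):
            self.coil, self.delta_x = coil, delta_x
            self.a_pct, self.b_pct = a_pct, b_pct

        def on_trigger(self, prev_frame, curr_frame):
            # Trigger unit has fired: difference the coil neighbourhood, then split by ratio.
            mask, origin = motion_outline(prev_frame, curr_frame,
                                          self.coil, self.delta_x)
            prev_outline, curr_outline = split_motion_region(mask,
                                                             self.a_pct, self.b_pct)
            return prev_outline, curr_outline, origin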
In the method and system for moving target tracking based on fast feature points of the present invention, after a trigger signal is received, the data of the previous and current frames are acquired, a differential operation is performed to obtain the motion outline of the moving target, and the outline of the moving target is then obtained by proportional segmentation. Because the difference is computed only on the data inside the virtual coil and a surrounding region of a certain width, computation speed and precision are improved; matching is then performed and the range adjusted, so that the integrity of the moving target is preserved and the target is not easily lost.
Description of drawings
Figure 1 is a flow chart of a preferred embodiment of the present invention;
Figure 2 is a functional block diagram of a preferred embodiment of the present invention.
Embodiment
To better explain the technical content of the present invention, preferred embodiments are described below with reference to the accompanying drawings.
Referring to Fig. 1, which is a flow chart of a preferred embodiment of the present invention, the invention proposes a method for moving target tracking based on fast feature points comprising the following steps. Step 10: receive a trigger signal. In a preferred embodiment, the received trigger signal is the signal generated when a moving target enters a virtual coil, i.e. in a traffic monitoring scene a trigger signal is fired automatically when a vehicle enters the virtual coil. Step 20: acquire the data of the previous frame and of the current frame, both being image frame data. In a preferred embodiment these data are the data inside the virtual coil and its surrounding region, where the surrounding region has a certain width ΔX set by the user. Step 30: perform a differential operation on the data of the previous and current frames to obtain the motion outline of the moving target. Inter-frame differencing is used to extract features such as the target outline: the difference between the two frames is obtained directly by pixel-wise comparison, and the non-zero locations of the difference image mark the detected moving target. To further improve computation speed and precision, the method does not difference the entire image, since the two frames contain too much uncertainty; instead it takes only the neighbourhood of the currently triggered virtual coil and differences this small region, which yields a relatively clean motion region Y. Step 40: segment the motion outline of the moving target to obtain the outline of the moving target in the previous frame and in the current frame respectively. The region obtained after the differencing in step 30 is larger than the actual moving target because it contains the head of the previous-frame target and the tail of the current-frame target, so for simplicity it is directly segmented by proportion. In a preferred embodiment, segmentation step 40 cuts the motion outline according to a first ratio to obtain the outline of the previous-frame target and according to a second ratio to obtain the outline of the current-frame target, i.e. Y*a% is the previous-frame moving target and Y*b% is the current-frame moving target, where the first ratio a% and the second ratio b% are set by the user. Although the result deviates somewhat when the vehicle is too fast or too slow, the deviation remains within an acceptable range. For example, if the vehicle travels from top to bottom and the coil is located in the upper half of the image, a% can take the value 20% and b% the value 15%. The user selects the ratios according to the direction of travel (top to bottom or bottom to top) and the position of the virtual coil (upper or lower half of the image), and the parameter configuration software automatically generates the corresponding ratio data and feeds it into the video detection and tracking system.
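The ratio selection described above could be tabulated as in the following sketch. Only the (top-to-bottom, upper-half) pair 20%/15% is stated in the specification; the remaining entries are placeholders that a deployment would tune.

    def select_ratios(direction, coil_half):
        """Pick (a_pct, b_pct) from travel direction and virtual-coil position."""
        table = {
            ("top_to_bottom", "upper"): (20, 15),   # example given in the description
            ("top_to_bottom", "lower"): (15, 20),   # placeholder value
            ("bottom_to_top", "upper"): (15, 20),   # placeholder value
            ("bottom_to_top", "lower"): (20, 15),   # placeholder value
        }
        return table[(direction, coil_half)]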
In a preferred embodiment, the method further comprises step 50: tracking and matching in subsequent images, using the estimated value to directly enlarge the matching range. Features are extracted from the outlines of the moving target in the previous and current frames and fed into an estimation filter, which estimates the outline position of the moving target in the next frame. When matching, the estimated range is adjusted and enlarged so that it contains the whole moving target as far as possible. Although this may slightly affect matching speed, it guarantees the integrity of the obtained target, and the target is not easily lost during tracking. Because a complete target outline has been obtained, subsequent matching precision is well assured.
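The specification does not name a particular estimation filter or matching algorithm. The sketch below assumes normalised cross-correlation template matching (OpenCV's matchTemplate) around a position predicted by whatever filter is used, with an assumed margin added on every side so that the whole target stays inside the search window even when speed or direction deviates from the estimate.

    import cv2

    def match_in_expanded_range(frame, template, predicted_xy, margin=30):
        """Search for the target outline in an enlarged window around the prediction.

        predicted_xy: (x, y) top-left corner predicted by the estimation filter.
        margin: extra pixels on every side of the predicted box (assumed value).
        """
        th, tw = template.shape[:2]
        px, py = predicted_xy
        x0 = max(px - margin, 0)
        y0 = max(py - margin, 0)
        x1 = min(px + tw + margin, frame.shape[1])
        y1 = min(py + th + margin, frame.shape[0])

        search = frame[y0:y1, x0:x1]
        if search.shape[0] < th or search.shape[1] < tw:
            return predicted_xy  # window clipped at the border; fall back to the prediction
        result = cv2.matchTemplate(search, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(result)
        return (x0 + max_loc[0], y0 + max_loc[1])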
In a preferred embodiment, the method further comprises step 60: outputting the outlines of the moving target in the previous frame and in the current frame, so that the user can conveniently monitor traffic and observe the captured moving target.
Referring to Fig. 2, which is a functional block diagram of a preferred embodiment of the present invention, the invention proposes a system for moving target tracking based on fast feature points. It comprises a trigger unit 100 for receiving a trigger signal 110; in a preferred embodiment, the trigger signal 110 received by the trigger unit 100 is the signal generated when a moving target enters a virtual coil, i.e. in a traffic monitoring scene a trigger signal is fired automatically when a vehicle enters the virtual coil, and the trigger unit 100 receives this trigger signal 110. An acquiring unit 200 acquires the data of the previous frame and of the current frame; in a preferred embodiment these data are the data inside the virtual coil and its surrounding region, where the surrounding region has a certain width ΔX set by the user. An arithmetic unit 300 performs a differential operation on the data of the previous and current frames to obtain the motion outline of the moving target. Inter-frame differencing is used to extract features such as the target outline: the difference between the two frames is obtained directly by pixel-wise comparison, and the non-zero locations of the difference image mark the detected moving target. To further improve computation speed and precision, the system does not difference the entire image, since the two frames contain too much uncertainty; instead it takes only the neighbourhood of the currently triggered virtual coil and differences this small region, which yields a relatively clean motion region Y. A cutting unit 400 segments the motion outline of the moving target to obtain the outline of the moving target in the previous frame and in the current frame respectively. The region obtained after the differencing is larger than the actual moving target because it contains the head of the previous-frame target and the tail of the current-frame target, so for simplicity it is directly segmented by proportion. In a preferred embodiment, the cutting unit 400 cuts the motion outline according to a first ratio to obtain the outline of the previous-frame target and according to a second ratio to obtain the outline of the current-frame target, i.e. Y*a% is the previous-frame moving target and Y*b% is the current-frame moving target, where the first ratio a% and the second ratio b% are set by the user. Although the result deviates somewhat when the vehicle is too fast or too slow, the deviation remains within an acceptable range.
In a preferred embodiment, the system further comprises a matching unit 500 for enlarging the estimated range and obtaining the motion outline of the next-frame moving target through a matching algorithm: features are extracted from the outlines of the moving target in the previous and current frames and fed into an estimation filter, which gives the outline range of the next-frame target. When matching, the estimated range is adjusted and enlarged so that it contains the whole moving target as far as possible. Although this may slightly affect matching speed, it guarantees the integrity of the obtained target, the target is not easily lost, and subsequent deviation is kept ever smaller.
In a preferred embodiment, the system further comprises an output unit 600 that outputs the outlines of the moving target in the previous frame and in the current frame to a display unit, so that the user can conveniently monitor traffic and observe the captured moving target.
In summary, in the method and system for moving target tracking based on fast feature points of the present invention, after a trigger signal is received, the data of the previous and current frames are acquired, a differential operation is performed to obtain the motion outline of the moving target, and the outline of the moving target is then obtained by proportional segmentation. Because the difference is computed only on the data inside the virtual coil and a surrounding region of a certain width, computation speed and precision are improved; matching is then performed and the range adjusted, so that the integrity of the moving target is preserved and the target is not easily lost.
Although the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. A person of ordinary skill in the art may make various modifications and variations without departing from the spirit and scope of the present invention; the scope of protection of the invention is therefore defined by the appended claims.

Claims (14)

1. A moving target tracking method based on fast feature points, characterized by comprising:
receiving a trigger signal;
acquiring data of a previous frame and of a current frame respectively;
performing a differential operation on the data of the previous frame and the current frame to obtain a motion outline of a moving target;
segmenting the motion outline of the moving target to obtain the outline of the moving target in the previous frame and the outline of the moving target in the current frame respectively.
2. The moving target tracking method according to claim 1, characterized in that the received trigger signal is the trigger signal generated when the moving target enters a virtual coil.
3. The moving target tracking method according to claim 2, characterized in that the data of the previous and current frames are the data inside the virtual coil and its surrounding region.
4. The moving target tracking method according to claim 3, characterized in that the surrounding region of the virtual coil has a certain width.
5. The moving target tracking method according to claim 1, characterized in that in the segmentation step the motion outline of the moving target is cut according to a first ratio to obtain the outline of the moving target in the previous frame, and cut according to a second ratio to obtain the outline of the moving target in the current frame.
6. The moving target tracking method according to claim 1, characterized in that the method further comprises extracting features from the segmented moving target outlines, feeding them into an estimation filter to estimate the position of the moving target in the next frame, and using the estimate to directly enlarge the matching range, thereby guaranteeing the integrity of the moving target outline obtained in the next frame and benefiting subsequent matching precision.
7. The moving target tracking method according to claim 1, characterized in that the method further comprises outputting the outlines of the moving target in the previous and current frames for user monitoring.
8. A moving target tracking system based on fast feature points, characterized by comprising:
a trigger unit for receiving a trigger signal;
an acquiring unit for acquiring data of a previous frame and of a current frame;
an arithmetic unit for performing a differential operation on the data of the previous frame and the current frame to obtain a motion outline of a moving target;
a cutting unit for segmenting the motion outline of the moving target to obtain the outline of the moving target in the previous frame and the outline of the moving target in the current frame respectively.
9. The moving target tracking system according to claim 8, characterized in that the trigger signal received by the trigger unit is the trigger signal generated when the moving target enters a virtual coil.
10. The moving target tracking system according to claim 9, characterized in that the data of the previous and current frames are the data inside the virtual coil and its surrounding region.
11. The moving target tracking system according to claim 10, characterized in that the surrounding region of the virtual coil has a certain width.
12. The moving target tracking system according to claim 8, characterized in that the cutting unit cuts the motion outline of the moving target according to a first ratio to obtain the outline of the moving target in the previous frame, and according to a second ratio to obtain the outline of the moving target in the current frame.
13. The moving target tracking system according to claim 8, characterized in that the system further comprises a matching unit for extracting features from the segmented moving target outlines, feeding them into an estimation filter to estimate the position of the moving target in the next frame, enlarging the estimated range, and obtaining the motion outline of the next-frame moving target through a matching algorithm.
14. The moving target tracking system according to claim 8, characterized in that the system further comprises an output unit that outputs the outlines of the moving target in the previous and current frames to a display unit.
CN2008100359038A 2008-04-10 2008-04-10 Method and system for motion target tracking based on rapid characteristic points Expired - Fee Related CN101556697B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008100359038A CN101556697B (en) 2008-04-10 2008-04-10 Method and system for motion target tracking based on rapid characteristic points

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2008100359038A CN101556697B (en) 2008-04-10 2008-04-10 Method and system for motion target tracking based on rapid characteristic points

Publications (2)

Publication Number Publication Date
CN101556697A true CN101556697A (en) 2009-10-14
CN101556697B CN101556697B (en) 2012-07-25

Family

ID=41174802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008100359038A Expired - Fee Related CN101556697B (en) 2008-04-10 2008-04-10 Method and system for motion target tracking based on rapid characteristic points

Country Status (1)

Country Link
CN (1) CN101556697B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011063616A1 (en) * 2009-11-24 2011-06-03 杭州海康威视软件有限公司 Method and apparatus for moving object autotracking
CN101739553B (en) * 2009-12-10 2012-01-11 青岛海信网络科技股份有限公司 Method for identifying target in parallax image
CN102136196A (en) * 2011-03-10 2011-07-27 北京大学深圳研究生院 Vehicle velocity measurement method based on image characteristics
WO2013053159A1 (en) * 2011-10-09 2013-04-18 青岛海信网络科技股份有限公司 Method and device for tracking vehicle
CN104410842A (en) * 2014-12-25 2015-03-11 苏州智华汽车电子有限公司 Vehicle-reversing camera dynamic-detection system and method
CN108280408A (en) * 2018-01-08 2018-07-13 北京联合大学 A kind of crowd's accident detection method based on combined tracking and generalized linear model
CN108280408B (en) * 2018-01-08 2021-11-02 北京联合大学 Crowd abnormal event detection method based on hybrid tracking and generalized linear model
CN109919053A (en) * 2019-02-24 2019-06-21 太原理工大学 A kind of deep learning vehicle parking detection method based on monitor video
CN110300264A (en) * 2019-06-28 2019-10-01 Oppo广东移动通信有限公司 Image processing method, device, mobile terminal and storage medium
CN110300264B (en) * 2019-06-28 2021-03-12 Oppo广东移动通信有限公司 Image processing method, image processing device, mobile terminal and storage medium

Also Published As

Publication number Publication date
CN101556697B (en) 2012-07-25

Similar Documents

Publication Publication Date Title
CN101556697B (en) Method and system for motion target tracking based on rapid characteristic points
Xuan et al. Object tracking in satellite videos by improved correlation filters with motion estimations
CN105005992B (en) A kind of based on the background modeling of depth map and the method for foreground extraction
US7058205B2 (en) Robust, on-line, view-based appearance models for visual motion analysis and visual tracking
Zhang et al. Fusing wearable imus with multi-view images for human pose estimation: A geometric approach
CN108351522B (en) Gaze direction mapping
CN104050712B (en) The method for building up and device of threedimensional model
CN103336954B (en) A kind of TV station symbol recognition method and apparatus in video
CN102999901A (en) Method and system for processing split online video on the basis of depth sensor
CN102254325B (en) Method and system for segmenting motion blur scene and extracting foreground
US9275284B2 (en) Method and apparatus for extraction of static scene photo from sequence of images
CN103578113A (en) Method for extracting foreground images
CN103002309A (en) Depth recovery method for time-space consistency of dynamic scene videos shot by multi-view synchronous camera
CN102542541B (en) Deep image post-processing method
CN103985128A (en) Three-dimensional matching method based on color intercorrelation and self-adaptive supporting weight
CN102169538B (en) Background modeling method based on pixel confidence
CN111402303A (en) Target tracking architecture based on KFSTRCF
CN103440669A (en) Dynamic Mean shift kernel bandwidth updating method based on compressed domain fusion
CN105550663A (en) Cinema attendance statistical method and system
CN111798486B (en) Multi-view human motion capture method based on human motion prediction
CN104658009A (en) Moving-target detection method based on video images
CN107358621A (en) Method for tracing object and device
CN104182976A (en) Field moving object fining extraction method
CN109389624B (en) Model drift suppression method and device based on similarity measurement
CN112084855A (en) Outlier elimination method for video stream based on improved RANSAC method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120725

Termination date: 20210410