CN110148156B - Symmetrical target image tracking method based on local optical flow - Google Patents
Symmetrical target image tracking method based on local optical flow
- Publication number
- CN110148156B (application CN201910356335.XA)
- Authority
- CN
- China
- Prior art keywords
- centroid
- image
- frame
- frame image
- optical flow
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/269—Analysis of motion using gradient-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20164—Salient point detection; Corner detection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to the technical field of image processing, and particularly discloses a symmetrical target image tracking method based on local optical flow, which comprises the following steps: S1, reading in one frame of an image of a target object at a time and judging whether the target object is symmetrical; if not, exiting tracking, and if so, entering the next step; and S2, deducing the motion trend of the target object from the movement of the centroid optical flow. According to the symmetrical target image tracking method based on local optical flow, many of the targets to be tracked in daily life are symmetrical in shape, so when the optical flow is calculated, the motion trend of the whole target object can be deduced from only a partial optical-flow difference; the amount of computation is correspondingly reduced, and the tracking performance and efficiency are improved.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a symmetrical target image tracking method based on local optical flow.
Background
In the field of target tracking, an object in the world coordinate system is mapped into the image coordinate system of the camera after an appropriate transformation. Tracking the object in the world coordinate system is thereby converted into processing the corresponding two-dimensional plane image in the image coordinate system. The target object may have an arbitrary shape, and computing the optical flow over the entire tracked object requires a large amount of calculation, so tracking performance becomes unsatisfactory.
Disclosure of Invention
The invention provides a symmetrical target image tracking method based on local optical flow, solving the technical problems that existing target tracking methods require a large amount of optical-flow computation, deliver poor tracking performance, and have low tracking efficiency.
In order to solve the above technical problems, the present invention provides a symmetric target image tracking method based on local optical flow, comprising the steps of:
S1, reading in one frame of an image of the target object at a time and judging whether the target object is symmetrical; if not, exiting tracking, and if so, entering the next step;
S2, deducing the motion trend of the target object from the movement of the centroid optical flow.
Specifically, the step S1 is performed throughout the entire execution of the step S2.
Specifically, the step S2 includes the steps of:
S21, judging whether the current image belongs to the first n frames, where n ≥ 2 and n ∈ Z; if so, deducing the motion trend of the target object from the centroid optical flow movement of the first n frames of images; otherwise, deducing the motion trend of the target object from the centroid optical flow movement of two consecutive images among the m frames following the first n frames, where m ≥ 2 and m ∈ Z;
S22, returning to the step S21.
Preferably, n is 2.
Further, if the current image is the first image of the first n frames, the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the first frame image and calculating the centroid of the first frame image from them, where the first frame image is the first frame captured after the machine starts capturing images;
S21-2, reading in the second frame image;
S21-3, detecting the edge contour and contour corners of the second frame image and calculating the centroid of the second frame image;
S21-4, calculating the centroid movement vector of the second frame image from the centroid of the first frame image and the centroid of the second frame image;
S21-5, deducing the motion trend of the target object from the centroid movement vector and the contour corners of the second frame image.
Further, if the current image is the second image of the first n frames, the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the second frame image and calculating the centroid of the second frame image;
S21-2, calculating the centroid movement vector of the second frame image from the centroid of the first frame image and the centroid of the second frame image, where the first frame image is the first frame captured after the machine starts capturing images;
S21-3, deducing the motion trend of the target object from the centroid movement vector and the contour corners of the second frame image.
Further, if the current image is any one of the m subsequent frames, the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the current image and calculating the centroid of the current image;
S21-2, calculating the centroid movement vector of the current image from the centroid of the current image and the centroid of the previous frame image;
S21-3, deducing the motion trend of the target object from the centroid movement vector and the contour corners of the current image.
Specifically, the centroid of one frame of image is represented as an N × N matrix; the matrix characterizes the size of the centroid, ranging from 1 × 1 to N × N in units of pixel × pixel, where N ≥ 2 and N ∈ Z.
Preferably, N ∈ [3, 7].
Most preferably, N is 3.
According to the symmetrical target image tracking method based on local optical flow, many of the targets to be tracked in daily life are symmetrical in shape, so when the optical flow is calculated, the motion trend of the whole target object can be deduced from only a partial optical-flow difference; the amount of computation is correspondingly reduced, and the tracking performance and efficiency are improved.
Drawings
FIG. 1 is a flowchart of a method for tracking a symmetric target image based on a local optical flow according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention will be described in detail below with reference to the accompanying drawings. The drawings are incorporated herein for illustration only and are not to be construed as limitations of the invention, since many variations thereof are possible without departing from the spirit and scope of the invention.
The step flow of the symmetrical target image tracking method based on the local optical flow provided by the embodiment of the invention is shown in fig. 1. In this embodiment, a method for tracking a symmetric target image based on a local optical flow includes the steps of:
S1, reading in one frame of an image of the target object at a time and judging whether the target object is symmetrical; if not, exiting tracking, and if so, entering the next step;
S2, deducing the motion trend of the target object from the movement of the centroid optical flow.
Specifically, the step S1 is performed throughout the entire execution of the step S2.
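The patent does not prescribe a concrete symmetry test for step S1. As a minimal sketch under stated assumptions (the target region is compared with its horizontal mirror; the tolerance `tol` is a hypothetical parameter, not part of the patent), the per-frame check might look like:

```python
import numpy as np

def is_left_right_symmetric(patch, tol=0.02):
    """Crude symmetry test for step S1: compare the target patch with its
    horizontal mirror; accept if the mean absolute difference is within a
    small fraction of the 8-bit intensity range."""
    p = np.asarray(patch, dtype=np.float64)
    return np.abs(p - p[:, ::-1]).mean() <= tol * 255.0
```

Vertical or rotational symmetry could be tested the same way by mirroring along the other axis or rotating the patch before the comparison.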
Specifically, the step S2 includes the steps of:
S21, judging whether the current image belongs to the first n frames, where n ≥ 2 and n ∈ Z; if so, deducing the motion trend of the target object from the centroid optical flow movement of the first n frames of images; otherwise, deducing the motion trend of the target object from the centroid optical flow movement of two consecutive images among the m frames following the first n frames, where m ≥ 2 and m ∈ Z;
S22, returning to the step S21.
Preferably, n is 2, although the embodiment is not limited to 2. The first frames are singled out because the displacement difference between the centroids of two adjacent frames must be calculated when processing the image frames. The centroid movement vector requires two adjacent frames of images (frame dropping is not considered here; even if a frame is dropped, the subsequent tracking effect is not affected), and calculating a displacement vector requires at least two centroids, so at least two frames are selected for the processing calculation. The first three frames, the first four frames, or the first N (N ≥ 4) frames may also be selected, which is feasible in principle.
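The point that at least two frames are needed follows directly from the vector arithmetic: one centroid gives no displacement. A sketch (plain NumPy; the helper name is hypothetical):

```python
import numpy as np

def centroid_movement_vectors(centroids):
    """Given per-frame centroids A1..Ak as (x, y) pairs, return the
    movement vectors P2..Pk with Pn = An - A(n-1). P1, the zero vector,
    carries no motion information, which is why at least two frames are
    required before any motion trend can be deduced."""
    cs = [np.asarray(c, dtype=np.float64) for c in centroids]
    return [cs[i] - cs[i - 1] for i in range(1, len(cs))]
```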
Specifically, the centroid of one frame of image is represented as an N × N matrix that characterizes the size of the centroid, ranging from 1 × 1 to N × N; larger matrices are relatively cumbersome to process. The unit of the centroid is pixel × pixel, with N ≥ 2 and N ∈ Z. The centroid is represented in matrix form, and other matrix sizes are possible; the centroid is not limited to a single pixel.
Preferably, N ∈ [3, 7].
Most preferably, N is 3.
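The N × N representation can be read as an N × N pixel neighbourhood centred on the centroid location. A sketch under that assumption (the patent fixes only the matrix size, not how the matrix is populated):

```python
import numpy as np

def centroid_patch(image, cx, cy, N=3):
    """Extract the N x N pixel matrix centred on the centroid (cx, cy),
    with N = 3 as the most preferred size. The patch is clipped at the
    image borders, so it may be smaller than N x N near an edge."""
    img = np.asarray(image)
    h = N // 2
    y0, y1 = max(cy - h, 0), min(cy + h + 1, img.shape[0])
    x0, x1 = max(cx - h, 0), min(cx + h + 1, img.shape[1])
    return img[y0:y1, x0:x1]
```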
Example 1
When it is determined that the current image is the first image of the first 2 frames, the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the first frame image and calculating the centroid A1 of the first frame image (its centroid movement vector P1 is the zero vector and has no computational meaning), where the first frame image is the first frame captured after the machine starts capturing images;
S21-2, reading in the second frame image;
S21-3, detecting the edge contour and contour corners of the second frame image and calculating the centroid A2 of the second frame image;
S21-4, calculating the centroid movement vector P2 = A2 - A1 of the second frame image from the centroid A1 of the first frame image and the centroid A2 of the second frame image;
S21-5, deducing the motion trend of the target object from the centroid movement vector P2 and the contour corners of the second frame image.
If symmetry is still detected subsequently (n going from 2 to 3 and beyond), the centroid movement vector is calculated as Pn = An - A(n-1); if asymmetry occurs, the loop is exited and tracking ends (if symmetry persists subsequently, processing moves to the scenario of Example 3).
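Steps S21-1 to S21-5 can be sketched as follows. As a simplifying assumption, the centroid is estimated as the mean of the detected contour corner points; a real implementation might instead use image moments over the edge contour (e.g. OpenCV's `cv2.findContours` followed by `cv2.moments`):

```python
import numpy as np

def centroid_from_corners(corners):
    """Estimate the frame's centroid A as the mean of its contour corner
    points (a simplified stand-in for steps S21-1 / S21-3)."""
    return np.mean(np.asarray(corners, dtype=np.float64), axis=0)

def movement_vector(a_prev, a_curr):
    """Centroid movement vector of step S21-4: P2 = A2 - A1."""
    return np.asarray(a_curr, dtype=np.float64) - np.asarray(a_prev, dtype=np.float64)
```

For example, a square whose corners shift by (3, 1) between frames yields a centroid movement vector of (3, 1), from which the motion trend of the whole symmetric target is deduced.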
Example 2
When the target has been determined to be symmetric, if the current image is the second frame of the first 2 frames, the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the second frame image and calculating the centroid A2 of the second frame image;
S21-2, calculating the centroid movement vector P2 = A2 - A1 of the second frame image from the centroid A1 of the first frame image and the centroid A2 of the second frame image, where the first frame image is the first frame captured after the machine starts capturing images; the centroid A1 of the first frame image (whose centroid movement vector P1 is the zero vector and has no computational meaning) was already solved when that frame was captured;
S21-3, deducing the motion trend of the target object from the centroid movement vector P2 and the contour corners of the second frame image.
If symmetry is still detected subsequently (n going from 2 to 3 and beyond), the centroid movement vector is calculated as Pn = An - A(n-1); if asymmetry occurs, the loop is exited and tracking ends (if symmetry persists subsequently, processing moves to the scenario of Example 3).
Example 3
When the target has been determined to be symmetric, if the current image is any one of the m subsequent frames (n ≥ 3), the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the current image and calculating the centroid of the current image;
S21-2, calculating the centroid movement vector Pn = An - A(n-1) of the current image from the centroid An of the current image and the centroid A(n-1) of the previous frame image (the centroids of two consecutive images are needed);
S21-3, deducing the motion trend of the target object from the centroid movement vector Pn and the contour corners of the current image.
In step S21-2, if the centroid of the previous frame image has already been determined (for example, if the current image is the third frame and the second frame was also symmetric, that centroid was already determined in Example 1 or Example 2), it is used directly. If it has not been determined, the centroid An of the current image is determined first; then the next frame image is read in (the current image becomes the previous frame image), its centroid A(n+1) is determined, and the movement vector is obtained by subtracting the two centroids (P(n+1) = A(n+1) - An).
If asymmetry occurs subsequently, the loop is exited and tracking ends (if the symmetric situation persists, processing remains in the application scenario of this example).
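Putting Examples 1 to 3 together, the whole loop can be sketched on synthetic frames. Assumptions not in the patent: a fixed intensity threshold stands in for edge/corner detection, symmetry is tested as left-right mirror equality of the foreground mask, and the centroid is the mean of foreground pixel coordinates:

```python
import numpy as np

def track(frames, threshold=128):
    """Minimal end-to-end sketch of steps S1-S2: for each frame, check the
    left-right symmetry of the thresholded target mask, compute the
    foreground centroid An, and emit Pn = An - A(n-1). Tracking ends as
    soon as an asymmetric frame is seen."""
    prev = None
    vectors = []
    for frame in frames:
        mask = np.asarray(frame) >= threshold
        if not np.array_equal(mask, mask[:, ::-1]):
            break  # asymmetry: jump out of the loop and end tracking
        ys, xs = np.nonzero(mask)
        centroid = np.array([xs.mean(), ys.mean()])
        if prev is not None:
            vectors.append(centroid - prev)  # Pn = An - A(n-1)
        prev = centroid
    return vectors
```

On a symmetric horizontal bar moving one row down per frame, each emitted vector is (0, 1); an asymmetric frame stops the loop immediately.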
According to the symmetrical target image tracking method based on local optical flow provided by the embodiments of the invention, many of the targets to be tracked in daily life are symmetrical in shape, so when the optical flow is calculated, the motion trend of the whole target object can be deduced from only a partial optical-flow difference; the amount of computation is correspondingly reduced, and the tracking performance and efficiency are improved.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to them; any change, modification, substitution, combination, or simplification that does not depart from the spirit and principle of the present invention should be construed as an equivalent and is intended to be included in the scope of the present invention.
Claims (8)
1. A symmetrical target image tracking method based on local optical flow is characterized by comprising the following steps:
S1, reading in one frame of an image of the target object at a time and judging whether the target object is symmetrical; if not, exiting tracking, and if so, entering the next step;
S2, deducing the motion trend of the target object from the movement of the centroid optical flow;
the step S1 being performed throughout the entire execution of the step S2;
the step S2 specifically includes the steps of:
S21, judging whether the current image belongs to the first n frames, where n ≥ 2 and n ∈ Z, Z being the set of integers; if so, deducing the motion trend of the target object from the centroid optical flow movement of the first n frames of images; otherwise, deducing the motion trend of the target object from the centroid optical flow movement of two consecutive images among the m frames following the first n frames, where m ≥ 2 and m ∈ Z;
S22, returning to the step S1.
2. The local optical flow-based symmetric target image tracking method as claimed in claim 1, wherein n = 2.
3. The method as claimed in claim 2, wherein if the current image is the first image of the first n frames, the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the first frame image and calculating the centroid of the first frame image from them, where the first frame image is the first frame captured after the machine starts capturing images;
S21-2, reading in the second frame image;
S21-3, detecting the edge contour and contour corners of the second frame image and calculating the centroid of the second frame image;
S21-4, calculating the centroid movement vector of the second frame image from the centroid of the first frame image and the centroid of the second frame image;
S21-5, deducing the motion trend of the target object from the centroid movement vector and the contour corners of the second frame image.
4. The method as claimed in claim 2, wherein if the current image is the second image of the first n frames, the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the second frame image and calculating the centroid of the second frame image;
S21-2, calculating the centroid movement vector of the second frame image from the centroid of the first frame image and the centroid of the second frame image, where the first frame image is the first frame captured after the machine starts capturing images;
S21-3, deducing the motion trend of the target object from the centroid movement vector and the contour corners of the second frame image.
5. The method as claimed in claim 2, wherein n = 2, and if the current image is any one of the m subsequent frames, the step S21 specifically includes the steps of:
S21-1, detecting the edge contour and contour corners of the current image and calculating the centroid of the current image;
S21-2, calculating the centroid movement vector of the current image from the centroid of the current image and the centroid of the previous frame image;
S21-3, deducing the motion trend of the target object from the centroid movement vector and the contour corners of the current image.
6. The local optical flow-based symmetric target image tracking method according to any one of claims 3-5, wherein: the centroid of one frame of image is represented as an N × N matrix; the matrix characterizes the size of the centroid, ranging from 1 × 1 to N × N in units of pixel × pixel, where N ≥ 2 and N ∈ Z.
7. The local optical flow-based symmetric target image tracking method according to claim 6, wherein: N ∈ [3, 7].
8. The local optical flow-based symmetric target image tracking method according to claim 7, wherein: n = 3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910356335.XA CN110148156B (en) | 2019-04-29 | 2019-04-29 | Symmetrical target image tracking method based on local optical flow |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910356335.XA CN110148156B (en) | 2019-04-29 | 2019-04-29 | Symmetrical target image tracking method based on local optical flow |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110148156A CN110148156A (en) | 2019-08-20 |
CN110148156B true CN110148156B (en) | 2021-05-14 |
Family
ID=67594551
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910356335.XA Active CN110148156B (en) | 2019-04-29 | 2019-04-29 | Symmetrical target image tracking method based on local optical flow |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110148156B (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109544990A (en) * | 2018-12-12 | 2019-03-29 | 惠州市德赛西威汽车电子股份有限公司 | A kind of method and system that parking position can be used based on real-time electronic map identification |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105760831B (en) * | 2015-12-07 | 2019-07-05 | 北京航空航天大学 | It is a kind of to be taken photo by plane the pedestrian tracting method of infrared video based on low latitude |
CN106204484B (en) * | 2016-07-11 | 2020-07-24 | 徐州工程学院 | Traffic target tracking method based on optical flow and local invariant features |
CN106709472A (en) * | 2017-01-17 | 2017-05-24 | 湖南优象科技有限公司 | Video target detecting and tracking method based on optical flow features |
CN106846367B (en) * | 2017-02-15 | 2019-10-01 | 北京大学深圳研究生院 | A kind of Mobile object detection method of the complicated dynamic scene based on kinematic constraint optical flow method |
CN107292911B (en) * | 2017-05-23 | 2021-03-30 | 南京邮电大学 | Multi-target tracking method based on multi-model fusion and data association |
- 2019-04-29: CN application CN201910356335.XA granted as patent CN110148156B (status: Active)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109544990A (en) * | 2018-12-12 | 2019-03-29 | 惠州市德赛西威汽车电子股份有限公司 | A kind of method and system that parking position can be used based on real-time electronic map identification |
Also Published As
Publication number | Publication date |
---|---|
CN110148156A (en) | 2019-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210020171A1 (en) | Digital Video Fingerprinting Using Motion Segmentation | |
JP5284048B2 (en) | Image processing apparatus, imaging apparatus, and image processing method | |
US8988529B2 (en) | Target tracking apparatus, image tracking apparatus, methods of controlling operation of same, and digital camera | |
US8280108B2 (en) | Image processing system, image processing method, and computer program | |
Ding et al. | Spatio-temporal recurrent networks for event-based optical flow estimation | |
US9542735B2 (en) | Method and device to compose an image by eliminating one or more moving objects | |
KR101071352B1 (en) | Apparatus and method for tracking object based on PTZ camera using coordinate map | |
Aggarwal et al. | Object tracking using background subtraction and motion estimation in MPEG videos | |
US9466095B2 (en) | Image stabilizing method and apparatus | |
JP5762250B2 (en) | Image signal processing apparatus and image signal processing method | |
EP2296095B1 (en) | Video descriptor generator | |
WO2007129591A1 (en) | Shielding-object video-image identifying device and method | |
CN110148156B (en) | Symmetrical target image tracking method based on local optical flow | |
Patro | Design and implementation of novel image segmentation and BLOB detection algorithm for real-time video surveillance using DaVinci processor | |
US12002279B2 (en) | Image processing apparatus and method, and image capturing apparatus | |
US10880457B2 (en) | Image processing apparatus, image capturing apparatus, image processing method, and storage medium | |
CN113674319B (en) | Target tracking method, system, equipment and computer storage medium | |
CN114882003A (en) | Method, medium and computing device for detecting shooting pose change of camera | |
WO2019021412A1 (en) | Location estimation device, location estimation method, and program recording medium | |
Kobayashi et al. | Pose tracking using motion state estimation for mobile augmented reality | |
JP2011259044A (en) | Image processing device and image processing method | |
JP2013246601A (en) | Image process device | |
KR101070448B1 (en) | The method for tracking object and the apparatus thereof | |
Chen et al. | Real-time people counting method with surveillance cameras implemented on embedded system | |
CN107086033B (en) | Cloud computing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||