CN112419364A - Target tracking method and system based on image feature matching - Google Patents

Target tracking method and system based on image feature matching

Info

Publication number
CN112419364A
CN112419364A
Authority
CN
China
Prior art keywords
target
frame
value
pixel value
targets
Prior art date
Legal status
Pending
Application number
CN202011232734.4A
Other languages
Chinese (zh)
Inventor
王堃
韩亚敏
Current Assignee
Wuxi Yuspace Intelligent Technology Co ltd
Original Assignee
Jiangsu Yu Space Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Yu Space Technology Co ltd filed Critical Jiangsu Yu Space Technology Co ltd
Priority to CN202011232734.4A priority Critical patent/CN112419364A/en
Publication of CN112419364A publication Critical patent/CN112419364A/en
Pending legal-status Critical Current

Classifications

    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G06V10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections, by matching or filtering
    • G06T2207/10016 Image acquisition modality: Video; Image sequence
    • G06V2201/07 Target detection
    • G06V2201/08 Detecting or categorising vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a target tracking method and a target tracking system based on image feature matching, which relate to the technical field of image processing and solve the technical problem that a plurality of targets cannot be accurately captured in a high-resolution image in real time. The method and the system can accurately capture a plurality of targets in a high-resolution image in real time, and realize real-time target detection.

Description

Target tracking method and system based on image feature matching
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a target tracking method and system based on image feature matching.
Background
Real-time speed information collection plays a very important role in traffic flow information collection, and an Intelligent Traffic System (ITS) cannot be realized without it. Hardware-based speed measurement mainly includes laser speed measurement, radar speed measurement and the like, but it requires professional equipment, is expensive to install and maintain, and is limited in application. Software-based speed measurement is mainly video-based: image processing technology is used to perform target detection on traffic flow video, so that various kinds of traffic information can be collected and rich traffic flow information provided.
Currently, methods for object detection mainly include the frame difference method and the optical flow method. The frame difference method extracts the motion region of a moving object by thresholding the temporal difference of pixels between adjacent frames of an image sequence, and can eliminate the interference of unstable factors, but it cannot detect stationary or slowly moving objects. The optical flow method reflects the change of pixel points of the imaging plane coordinate system in the time domain by calculating the optical flow value of the pixel points of the video image, then computes the correspondence between adjacent frames and obtains the displacement of the tracked target and other information. However, the basic assumptions of the optical flow method are constant luminance and a short motion time, so the method is not applicable when the illumination changes or the time interval is too long; moreover, most moving-object tracking is based on principles such as motion estimation, probability distribution and feature matching, the computational load of the optical flow method is large, and the real-time requirement of video speed detection cannot be met. Therefore, the prior art cannot accurately capture a plurality of targets in a high-resolution image in real time, and cannot meet the real-time requirement of target detection.
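For illustration only, a minimal sketch of the frame difference method described above; OpenCV, grayscale input frames and a threshold value of 25 are assumptions made here, not part of the prior art description:

    import cv2

    def frame_difference_mask(prev_gray, curr_gray, threshold=25):
        # absolute temporal difference of pixels between adjacent frames
        diff = cv2.absdiff(curr_gray, prev_gray)
        # thresholding the difference yields the motion region of the moving object
        _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
        return mask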
Disclosure of Invention
The present disclosure provides a target tracking method and system based on image feature matching, which aims to accurately capture a plurality of targets in a high-resolution image in real time and realize real-time target detection.
The technical purpose of the present disclosure is achieved by the following technical solutions:
a target tracking method based on image feature matching comprises the following steps:
acquiring continuous video images, and cutting the video images into continuous first frame pictures;
background elimination is carried out on the first frame of picture to extract a target image, and a second frame of picture is obtained;
comparing the pixel value of the second frame of picture with the historical value of the pixel value, and if the difference value between the pixel value and the historical value of the pixel value does not exceed a first preset threshold value, storing the pixel value into a potential category; if the difference between the pixel value and the historical value exceeds a first preset threshold, storing the pixel value into a potential background;
judging whether the tracked targets are the same target or not according to the potential categories and the centroid distance between the targets;
and if the target is the same target, calculating the actual displacement of the target according to the pixel displacement of the pixel values in the potential category.
Further, the determining whether the tracked targets are the same target according to the potential categories and the centroid distance between the targets includes:
determining a target detection frame in an adjacent frame of the second frame picture;
calculating the centroid distance between the target detection frames of the adjacent frames and judging whether the centroid distance is within a second preset threshold; if the centroid distance is within the second threshold, the tracked targets are the same target; if the centroid distance is not within the second threshold, the tracked targets are judged to be invalid.
Further, the background elimination method is based on the KNN algorithm.
A target tracking system based on image feature matching, comprising:
the acquisition module acquires continuous video images and cuts the video images into continuous first frame pictures;
the background elimination module is used for eliminating the background of the first frame of picture to extract a target image so as to obtain a second frame of picture;
the comparison module compares the pixel value of the second frame of picture with the historical value of the pixel value, and stores the pixel value into a potential category if the difference value between the pixel value and the historical value is within a first preset threshold value; if the difference between the pixel value and the historical value is not within the first preset threshold, the pixel value is stored into a potential background;
the judging module judges whether the tracked targets are the same target or not according to the potential categories and the centroid distance between the targets;
and the calculation module is used for calculating the actual displacement of the target according to the pixel displacement of the pixel values in the potential category if the target is the same target.
Further, the judging module is further configured to: determine a target detection frame in an adjacent frame of the second frame picture;
calculate the centroid distance between the target detection frames of the adjacent frames and judge whether the centroid distance is within a second preset threshold; if the centroid distance is within the second threshold, the tracked targets are the same target; if the centroid distance is not within the second threshold, the tracked targets are judged to be invalid.
The beneficial effect of this disclosure lies in: the target tracking method and system based on image feature matching perform background elimination on the collected video frames to obtain target image frames, compare each pixel value of a target image frame with its historical values, and store the pixel value into a potential category if the difference between the pixel value and a historical value is within a first preset threshold. In addition, whether the tracked targets are the same target is judged according to the potential categories and the centroid distance between the targets; if they are the same target, the actual displacement of the target is calculated according to the pixel displacement of the pixel values in the potential category. An accurate and stable background image can be extracted through background elimination, the target image frame is accurately separated from the original frame by means of the background image, whether the tracked targets are the same target is judged through the centroid distance, and the actual displacement of the target is finally calculated. The method and the system can accurately capture a plurality of targets in a high-resolution image in real time, and realize real-time target detection.
Drawings
FIG. 1 is a flow chart of the disclosed method;
FIG. 2 is a schematic view of the disclosed system;
FIG. 3 is a schematic view of a camera imaging model.
Detailed Description
The technical scheme of the present disclosure will be described in detail with reference to the accompanying drawings. In the description of the present disclosure, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying the number of technical features indicated; they merely distinguish between different components.
Fig. 1 is a flow chart of the method of the present disclosure. As shown in fig. 1, 100: acquiring continuous video images, and cutting the video images into continuous first frame pictures. As a specific embodiment, continuous video images are acquired so as to capture complete and stable vehicle information, and the video is then divided into a plurality of frame images, thereby preprocessing the video image sequence.
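The following is a minimal sketch of this frame-splitting step; the use of OpenCV (cv2.VideoCapture) and the file name traffic.mp4 are illustrative assumptions rather than requirements of the present disclosure:

    import cv2

    def read_frames(path):
        cap = cv2.VideoCapture(path)
        frames = []
        while True:
            ok, frame = cap.read()
            if not ok:            # no further frame can be read -> the sequence ends
                break
            frames.append(frame)  # each element is one "first frame picture"
        cap.release()
        return frames

    first_frames = read_frames("traffic.mp4")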
101: carrying out background elimination on the first frame picture to extract a target image, so as to obtain a second frame picture. As a specific embodiment, the background is eliminated by background modeling based on the KNN (K-nearest-neighbour) algorithm, which enables detection of the object and its contour and yields good initial vehicle features.
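A minimal sketch of this background-elimination step, assuming OpenCV's built-in KNN background subtractor (cv2.createBackgroundSubtractorKNN); the parameter values and the median-blur cleanup are illustrative assumptions, not values specified in the present disclosure:

    import cv2

    # KNN background model; history length and distance threshold are assumed values
    subtractor = cv2.createBackgroundSubtractorKNN(history=200,
                                                   dist2Threshold=400.0,
                                                   detectShadows=False)

    def remove_background(first_frame):
        fg_mask = subtractor.apply(first_frame)   # foreground (target) mask
        fg_mask = cv2.medianBlur(fg_mask, 5)      # suppress isolated noise pixels
        # the "second frame picture": the original frame with the background removed
        return cv2.bitwise_and(first_frame, first_frame, mask=fg_mask)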
102: comparing the pixel value of the second frame of picture with the historical value of the pixel value, and if the difference value between the pixel value and the historical value of the pixel value does not exceed a first preset threshold value, storing the pixel value into a potential category; if the difference between a pixel value and its historical value exceeds a first preset threshold, the pixel value is stored into a potential background.
As a specific embodiment, in a scene, one pixel value of the second frame picture is compared with the K historical values stored for that pixel. If the difference between the pixel value and a historical value does not exceed the first preset threshold, the pixel value and that historical value are considered a match and the pixel value is stored into a potential category; if, after all historical values have been compared, the difference still exceeds the first preset threshold, the pixel is stored as a potential background point.
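An illustrative NumPy sketch of this per-pixel comparison; the value of K, the first preset threshold and the use of grayscale values are assumptions chosen for illustration:

    import numpy as np

    K = 5                  # number of historical values stored per pixel (assumed)
    FIRST_THRESHOLD = 20   # "first preset threshold" (assumed value)

    def classify_pixels(frame_gray, history):
        # frame_gray: (H, W) current pixel values; history: (K, H, W) historical values
        diffs = np.abs(history.astype(np.int32) - frame_gray.astype(np.int32))
        matched = (diffs <= FIRST_THRESHOLD).any(axis=0)           # matches at least one historical value
        potential_category = np.where(matched, frame_gray, 0)      # kept as potential category
        potential_background = np.where(~matched, frame_gray, 0)   # stored as potential background
        return potential_category, potential_background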
103: judging whether the tracked targets are the same target according to the potential categories and the centroid distance between the targets. As a specific embodiment, it can be seen from the pixel values of the potential category that the target contour changes little between adjacent frames. In the present disclosure, the centroid represents a vehicle, and whether the selected targets are the same vehicle is judged from the feature that the centroid deviates only slightly between adjacent frames of the image sequence.
The specific process of judging whether the tracked targets are the same target according to the centroid distance comprises the following steps:
(1) determining a target detection frame in an adjacent frame of the second frame picture;
(2) calculating the centroid distance between the target detection frames of the adjacent frames and judging whether the centroid distance is within a second preset threshold; if the centroid distance is within the second threshold, the tracked targets are the same target, that is, the target detection frames in the adjacent frames track the same vehicle, and two points are displayed to represent different positions of the same vehicle; if the centroid distance is not within the second threshold, the tracked targets are judged to be invalid.
Generally, if the centroid distance is not within the second threshold, the two target detection frames are not tracking the same vehicle, or the vehicle is entering or leaving the speed measurement area and cannot be fully imaged; the distance is therefore discarded, that is, the tracked target is invalid. When no further video frame can be read, the algorithm ends and the tracking is complete.
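A minimal sketch of the centroid-distance association between target detection frames of adjacent frames; the box format (x, y, w, h) and the value of the second preset threshold are illustrative assumptions:

    import math

    SECOND_THRESHOLD = 40.0   # "second preset threshold" in pixels (assumed value)

    def centroid(box):
        x, y, w, h = box      # assumed box format: top-left corner plus width and height
        return (x + w / 2.0, y + h / 2.0)

    def same_target(box_prev, box_curr):
        (x1, y1), (x2, y2) = centroid(box_prev), centroid(box_curr)
        dist = math.hypot(x2 - x1, y2 - y1)
        return dist <= SECOND_THRESHOLD   # within threshold -> same vehicle; otherwise invalid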
104: if the targets are the same target, calculating the actual displacement of the target according to the pixel displacement of the pixel values in the potential category. A unique centroid correspondence is obtained through centroid distance matching, which facilitates accurate comparison.
The calculation of the vehicle speed depends on the actual displacement of the vehicle. The present disclosure obtains the pixel displacement of the target vehicle; the actual displacement is then obtained through camera calibration, that is, a geometric model of camera imaging is established to solve the mapping between the image coordinate system and the actual three-dimensional road coordinate system, as shown in fig. 3, so that the pixel displacement is converted into the actual displacement. In addition, the vehicle motion time can be obtained directly from the image frame rate, and the vehicle running speed is finally obtained from the motion time and the actual displacement.
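A minimal sketch of converting pixel displacement into actual displacement and then into speed, assuming that camera calibration yields a 3x3 homography H mapping image points onto the road plane in metres; H, the frame rate fps and the frame gap are illustrative inputs, not values given in the present disclosure:

    import numpy as np

    def to_road_coords(pt, H):
        # map an image point (u, v) to road-plane coordinates via the homography H
        u, v = pt
        x, y, w = H @ np.array([u, v, 1.0])
        return np.array([x / w, y / w])

    def vehicle_speed(pt_prev, pt_curr, H, fps, frame_gap=1):
        # actual displacement on the road plane (metres, under the calibration assumption)
        disp = np.linalg.norm(to_road_coords(pt_curr, H) - to_road_coords(pt_prev, H))
        dt = frame_gap / fps          # vehicle motion time from the image frame rate (seconds)
        return disp / dt              # vehicle running speed in m/s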
Fig. 2 is a schematic diagram of the system of the present disclosure. The system includes an acquisition module, a background elimination module, a comparison module, a judging module and a calculation module. The acquisition module is configured to acquire continuous video images and divide the video images into continuous first frame pictures. The background elimination module is configured to eliminate the background of the first frame picture to extract a target image, so as to obtain a second frame picture. The comparison module is configured to compare the pixel value of the second frame picture with the historical value of the pixel value; if the difference between the pixel value and the historical value does not exceed a first preset threshold, the pixel value is stored into a potential category; if the difference exceeds the first preset threshold, the pixel value is stored into a potential background. The calculation module is configured to calculate, if the tracked targets are the same target, the actual displacement of the target according to the pixel displacement of the pixel values in the potential category.
The judging module is configured to judge whether the tracked targets are the same target according to the potential categories and the centroid distance between the targets, and is further configured to: determine a target detection frame in an adjacent frame of the second frame picture; calculate the centroid distance between the target detection frames of the adjacent frames and judge whether the centroid distance is within a second preset threshold; if the centroid distance is within the second threshold, the tracked targets are the same target; if the centroid distance is not within the second threshold, the tracked targets are judged to be invalid.
The foregoing is an exemplary embodiment of the present disclosure, and the scope of the present disclosure is defined by the claims and their equivalents.

Claims (6)

1. A target tracking method based on image feature matching is characterized by comprising the following steps:
acquiring continuous video images, and cutting the video images into continuous first frame pictures;
background elimination is carried out on the first frame of picture to extract a target image, and a second frame of picture is obtained;
comparing the pixel value of the second frame of picture with the historical value of the pixel value, and if the difference value between the pixel value and the historical value of the pixel value does not exceed a first preset threshold value, storing the pixel value into a potential category; if the difference between the pixel value and the historical value exceeds a first preset threshold, storing the pixel value into a potential background;
judging whether the tracked targets are the same target or not according to the potential categories and the centroid distance between the targets;
and if the target is the same target, calculating the actual displacement of the target according to the pixel displacement of the pixel values in the potential category.
2. The target tracking method based on image feature matching as claimed in claim 1, wherein the determining whether the tracked targets are the same target according to the potential categories and the centroid distance between the targets comprises:
determining a target detection frame in an adjacent frame of the second frame picture;
calculating the centroid distance of the target detection frames of the adjacent frames, judging whether the centroid distance is within a second preset threshold value, if the centroid distance is within the second threshold value, the tracked targets are the same targets, and if the centroid distance is not within the second threshold value, judging that the tracked targets are invalid.
3. The target tracking method based on image feature matching as claimed in claim 1, wherein the background elimination method is based on the KNN algorithm.
4. An image feature matching-based target tracking system, comprising:
the acquisition module acquires continuous video images and cuts the video images into continuous first frame pictures;
the background elimination module is used for eliminating the background of the first frame of picture to extract a target image so as to obtain a second frame of picture;
the comparison module compares the pixel value of the second frame of picture with the historical value of the pixel value, and stores the pixel value into a potential category if the difference value between the pixel value and the historical value is within a first preset threshold value; if the difference between the pixel value and the historical value is not within a first preset threshold, storing the pixel value into a potential background;
the judging module judges whether the tracked targets are the same target or not according to the potential categories and the centroid distance between the targets;
and the calculation module is used for calculating the actual displacement of the target according to the pixel displacement of the pixel values in the potential category if the target is the same target.
5. The image feature matching-based target tracking system of claim 4, wherein the judging module is further configured to: determining a target detection frame in an adjacent frame of the second frame picture;
calculating the centroid distance of the target detection frames of the adjacent frames, judging whether the centroid distance is within a second preset threshold value, if the centroid distance is within the second threshold value, the tracked targets are the same targets, and if the centroid distance is not within the second threshold value, judging that the tracked targets are invalid.
6. The image feature matching-based target tracking system of claim 4, wherein the background elimination method is KNN algorithm-based background elimination.
CN202011232734.4A 2020-11-06 2020-11-06 Target tracking method and system based on image feature matching Pending CN112419364A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011232734.4A CN112419364A (en) 2020-11-06 2020-11-06 Target tracking method and system based on image feature matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011232734.4A CN112419364A (en) 2020-11-06 2020-11-06 Target tracking method and system based on image feature matching

Publications (1)

Publication Number Publication Date
CN112419364A true CN112419364A (en) 2021-02-26

Family

ID=74782090

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011232734.4A Pending CN112419364A (en) 2020-11-06 2020-11-06 Target tracking method and system based on image feature matching

Country Status (1)

Country Link
CN (1) CN112419364A (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109118516A (en) * 2018-07-13 2019-01-01 高新兴科技集团股份有限公司 A kind of target is from moving to static tracking and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GENYUAN CHENG et al.: "Real-time detection of vehicle speed based on video image", 2020 12th International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), page 2 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113673364A (en) * 2021-07-28 2021-11-19 上海影谱科技有限公司 Video violence detection method and device based on deep neural network
TWI790078B (en) * 2021-12-20 2023-01-11 財團法人工業技術研究院 Object detection method and object detection system for video

Similar Documents

Publication Publication Date Title
CN110441791B (en) Ground obstacle detection method based on forward-leaning 2D laser radar
KR101647370B1 (en) road traffic information management system for g using camera and radar
EP2426642B1 (en) Method, device and system for motion detection
CN101344965A (en) Tracking system based on binocular camera shooting
CN111723778B (en) Vehicle distance measuring system and method based on MobileNet-SSD
CN105975923B (en) Method and system for tracking human objects
CN113848545B (en) Fusion target detection and tracking method based on vision and millimeter wave radar
CN112419364A (en) Target tracking method and system based on image feature matching
CN110717445A (en) Front vehicle distance tracking system and method for automatic driving
Cheng et al. Real-time detection of vehicle speed based on video image
CN112613568B (en) Target identification method and device based on visible light and infrared multispectral image sequence
JP2011513876A (en) Method and system for characterizing the motion of an object
CN115166717A (en) Lightweight target tracking method integrating millimeter wave radar and monocular camera
JP3629935B2 (en) Speed measurement method for moving body and speed measurement device using the method
CN116862832A (en) Three-dimensional live-action model-based operator positioning method
KR101161557B1 (en) The apparatus and method of moving object tracking with shadow removal moudule in camera position and time
JP2002367077A (en) Device and method for deciding traffic congestion
CN104537690B (en) One kind is based on the united moving spot targets detection method of maximum time index
CN115512263A (en) Dynamic visual monitoring method and device for falling object
CN113553958B (en) Expressway green belt detection method and device
JPH09322153A (en) Automatic monitor
CN112561930B (en) System and method for real-time framing of target in video stream
JPH0991439A (en) Object monitor
CN113516685A (en) Target tracking method, device, equipment and storage medium
CN115994934B (en) Data time alignment method and device and domain controller

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220111

Address after: 2209-c1, No. 19, Erquan East Road, Huizhi enterprise center, Xishan District, Wuxi City, Jiangsu Province, 214000

Applicant after: Wuxi yuspace Intelligent Technology Co.,Ltd.

Address before: Room 1101, block C, Kangyuan smart port, No. 50, Jiangdong Street, Jialing, Jianye District, Nanjing City, Jiangsu Province, 210000

Applicant before: Jiangsu Yu Space Technology Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20220920

Address after: Room 1101, block C, Kangyuan smart port, No. 50, Jiangdong Street, Jialing, Jianye District, Nanjing City, Jiangsu Province, 210000

Applicant after: Jiangsu Yu Space Technology Co.,Ltd.

Address before: 2209-c1, No. 19, Erquan East Road, Huizhi enterprise center, Xishan District, Wuxi City, Jiangsu Province, 214000

Applicant before: Wuxi yuspace Intelligent Technology Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20230830

Address after: 2209-c1, No. 19, Erquan East Road, Huizhi enterprise center, Xishan District, Wuxi City, Jiangsu Province, 214000

Applicant after: Wuxi yuspace Intelligent Technology Co.,Ltd.

Address before: Room 1101, block C, Kangyuan smart port, No. 50, Jiangdong Street, Jialing, Jianye District, Nanjing City, Jiangsu Province, 210000

Applicant before: Jiangsu Yu Space Technology Co.,Ltd.