CN114067270A - Vehicle tracking method and device, computer equipment and storage medium - Google Patents

Vehicle tracking method and device, computer equipment and storage medium

Info

Publication number
CN114067270A
CN114067270A
Authority
CN
China
Prior art keywords
vehicle
monitoring
detection frame
information
monitoring devices
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111372602.6A
Other languages
Chinese (zh)
Other versions
CN114067270B (en)
Inventor
朱子威
张星明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202111372602.6A priority Critical patent/CN114067270B/en
Publication of CN114067270A publication Critical patent/CN114067270A/en
Application granted granted Critical
Publication of CN114067270B publication Critical patent/CN114067270B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G06F18/22 - Pattern recognition; Analysing; Matching criteria, e.g. proximity measures
    • G06F18/253 - Pattern recognition; Analysing; Fusion techniques of extracted features
    • G06N3/045 - Computing arrangements based on biological models; Neural networks; Architecture; Combinations of networks
    • G06N3/08 - Computing arrangements based on biological models; Neural networks; Learning methods

Abstract

The embodiment of the invention discloses a vehicle tracking method, a vehicle tracking device, computer equipment and a storage medium, wherein the vehicle tracking method comprises the following steps: acquiring vehicle running monitoring videos sent by a plurality of monitoring devices; detecting a vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm for the vehicle to obtain characteristic information of the vehicle, wherein the characteristic information comprises the image within the detection frame (bounding box) of the vehicle; extracting vehicle features (feature maps) from the detection frame image and fusing AMOC-RNN time-series features to obtain the vehicle feature set acquired by each monitoring device; carrying out similarity comparison on vehicle features in the vehicle feature sets acquired by a plurality of monitoring devices installed along the same geographical line; and identifying the same vehicle among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle. The tracking result of the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices on a road can be fed back in real time.

Description

Vehicle tracking method and device, computer equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a vehicle tracking method, a vehicle tracking device, computer equipment and a storage medium.
Background
Although a traffic video monitoring system can provide monitoring without blind spots, some emergency or illegal vehicles travelling at high speed in the video cannot be located at any time. Even if such a vehicle is spotted on a monitoring screen and reported promptly by an observer, the traffic department may not receive the report immediately, creating an information delay between the scene of a traffic accident (or a potential traffic hazard) and the traffic management department. When the same vehicle appears in different cameras along a route, the comparison can only be performed manually, so the relevant staff cannot be dispatched to the site for interception or tracking in time. For example, if a suspicious vehicle is speeding on a highway network extending in all directions, its direction of travel cannot be determined; if the position of the vehicle cannot be compared in time across the video monitoring pictures at different locations, information delay will result.
Disclosure of Invention
The embodiment of the invention provides a vehicle tracking method and device, computer equipment and a storage medium.
In order to solve the above technical problem, an embodiment of the present invention adopts the following technical solution: a vehicle tracking method is provided, comprising the steps of:
acquiring vehicle running monitoring videos sent by a plurality of monitoring devices;
detecting a vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm aiming at the vehicle to obtain characteristic information of the vehicle, wherein the characteristic information comprises a detection frame (bounding box) image of the vehicle;
extracting vehicle features from the detection frame image, and performing time series feature fusion to obtain a vehicle feature set collected by each monitoring device;
carrying out similarity comparison on vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical line;
and identifying the same vehicle in the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle.
Further, detecting the vehicle in the vehicle driving monitoring video by using a preset Fast R-CNN target detection algorithm to obtain the characteristic information of the vehicle includes:
detecting the vehicle in the vehicle running monitoring video by using a trained convolutional-neural-network-based fast target detection (Fast R-CNN) network to obtain a detection frame (bounding box) image of the vehicle;
extracting vehicle information from the detection frame image, wherein the vehicle information includes: a vehicle code, time information, position information of the monitoring device, and coordinate information of the vehicle.
Further, the extracting vehicle features from the detection frame image and performing time series feature fusion to obtain a vehicle feature set collected by each monitoring device includes:
extracting image features of each frame of detection frame image by a convolutional neural network image feature extraction method to serve as re-identification features;
pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
and taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment.
Further, after the vehicle driving monitoring videos sent by the multiple monitoring devices are obtained, the method further includes:
collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialized background model;
judging, according to the initialized background model, whether the (t+1)-th monitoring frame exhibits jitter;
and when jitter occurs, triggering a digital video de-jitter algorithm to remove the video jitter.
Further, identifying the same vehicle in the multiple monitoring devices according to the similarity comparison result to track the vehicle includes:
when a plurality of vehicle characteristics with similarity smaller than a preset value exist in the plurality of monitoring devices, judging whether the monitoring devices pointed by the plurality of vehicle characteristics are adjacent or not according to the serial numbers of the monitoring devices on the same road;
when the positions of the monitoring devices pointed to by the plurality of vehicle characteristics are adjacent, judging whether the time interval between the vehicles pointed to by the position-adjacent vehicle characteristics is smaller than a preset time period value;
and identifying a plurality of vehicles smaller than the preset time period value as the same vehicle so as to track the vehicles.
An embodiment of the present invention provides a vehicle tracking apparatus, including:
the acquisition module is used for acquiring vehicle running monitoring videos sent by a plurality of monitoring devices;
the processing module is used for detecting the vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm aiming at the vehicle to obtain the characteristic information of the vehicle, wherein the characteristic information comprises a detection frame image of the vehicle;
the processing module is further configured to extract vehicle features from the detection frame image and perform time series feature fusion to obtain a vehicle feature set acquired by each monitoring device;
the processing module is further used for carrying out similarity comparison on the vehicle characteristics in the vehicle characteristic sets acquired by the multiple monitoring devices installed along the same geographical line;
and the execution module is used for identifying the same vehicle in the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle.
Further, the processing module comprises:
the first acquisition submodule is used for detecting the vehicle in the vehicle running monitoring video by using a trained convolutional-neural-network-based fast target detection (Fast R-CNN) network to obtain a detection frame image of the vehicle;
a first processing sub-module, configured to extract vehicle information from the detection frame image, where the vehicle information includes: a vehicle code, time information, location information of a monitoring device, and coordinate information of the vehicle.
Further, the processing module comprises:
the second acquisition submodule is used for extracting image features of each frame of detection frame image as re-identification features by a convolutional neural network image feature extraction method;
the second processing submodule is used for pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
and the first execution submodule is used for taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment.
Further, the apparatus further includes:
the third acquisition submodule is used for collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialized background model;
the third processing submodule is used for judging, according to the initialized background model, whether the (t+1)-th monitoring frame exhibits jitter;
and the second execution submodule is used for triggering a digital video de-jitter algorithm to remove the video jitter when jitter occurs.
Further, the execution module includes:
the fourth processing submodule is used for judging, when a plurality of vehicle characteristics whose similarity is smaller than the preset value exist in the plurality of monitoring devices, whether the monitoring devices pointed to by the plurality of vehicle characteristics are adjacent according to the serial numbers of the monitoring devices on the same road;
the fifth processing submodule is used for judging, when the positions of the monitoring devices pointed to by the plurality of vehicle characteristics are adjacent, whether the time interval between the vehicles pointed to by the position-adjacent vehicle characteristics is smaller than a preset time period value;
and the third execution submodule is used for identifying the vehicles whose time interval is smaller than the preset time period value as the same vehicle so as to track the vehicle.
Embodiments of the present invention also provide a storage medium storing computer-readable instructions, which when executed by one or more processors, cause the one or more processors to perform the steps of the vehicle tracking method described above.
The embodiment of the invention has the beneficial effects that: similarity comparison is carried out on the vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical line, and the same vehicle is identified among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle. The tracking result of the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices on a road can thus be fed back in real time.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without paying creative efforts.
FIG. 1 is a schematic flow chart of a vehicle tracking method according to an embodiment of the present invention;
fig. 2 is a block diagram of a basic structure of a vehicle tracking device according to an embodiment of the present invention;
fig. 3 is a block diagram of a basic structure of a computer device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention.
In some of the flows described in the specification and claims of the present invention and depicted in the figures above, a number of operations appearing in a particular order are included, but it should be clearly understood that these operations may be performed out of the order in which they appear herein or in parallel. Sequence numbers such as 101 and 102 are used merely to distinguish different operations, and the numbers themselves do not represent any order of execution. In addition, the flows may include more or fewer operations, and these operations may be performed sequentially or in parallel. It should be noted that the terms "first", "second", and the like herein are used to distinguish different messages, devices, modules, etc.; they neither represent a sequential order nor require that "first" and "second" be of different types.
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of the present invention.
Examples
As will be understood by those skilled in the art, "terminal" and "terminal device" as used herein include both devices that are wireless signal receivers, which are devices having only wireless signal receivers without transmit capability, and devices that are receive and transmit hardware, which have receive and transmit hardware capable of performing two-way communication over a two-way communication link. Such an apparatus may include: a cellular or other communication device having a single line display or a multi-line display or a cellular or other communication device without a multi-line display; PCS (Personal Communications Service), which may combine voice, data processing, facsimile and/or data Communications capabilities; a PDA (Personal Digital Assistant) which may include a radio frequency receiver, a pager, internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; a conventional laptop and/or palmtop computer or other device having and/or including a radio frequency receiver. As used herein, a "terminal" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime, and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space. As used herein, a "terminal" and a "terminal Device" may also be a communication terminal, an Internet terminal, and a music/video playing terminal, and may be, for example, a PDA, an MID (Mobile Internet Device) and/or a Mobile phone with a music/video playing function, and may also be a smart tv, a set-top box, and the like.
As shown in fig. 1, fig. 1 is a basic flowchart of a vehicle tracking method according to an embodiment of the present invention, including the following steps:
s1, obtaining vehicle running monitoring videos sent by a plurality of monitoring devices;
the monitoring apparatus is a photographing device installed in a highway or a street for photographing a running vehicle, such as a camera or the like.
S2, detecting the vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm aiming at the vehicle to obtain characteristic information of the vehicle, wherein the characteristic information comprises a detection frame (bounding box) image of the vehicle;
the feature information is used to indicate information used to identify a vehicle in the vehicle driving monitoring video, such as a vehicle code of the vehicle, a face image of a driver, a license plate number, and the like, and may further include time information at the time of shooting, position information of the monitoring device, and coordinate information of the vehicle. In the embodiment of the invention, the detection frame image of the vehicle is obtained by detecting the vehicle, and the information is extracted from the detection frame.
Specifically, the method comprises the following steps:
step one, detecting the vehicle in the vehicle running monitoring video by using a trained convolutional-neural-network-based fast target detection (Fast R-CNN) network to obtain a detection frame (bounding box) image of the vehicle;
step two, extracting vehicle information from the detection frame image, wherein the vehicle information comprises: a vehicle code, time information, location information of the monitoring device, and coordinate information of the vehicle.
According to one embodiment of the invention, a trained convolutional-neural-network-based target detection method dedicated to vehicles is used to process the current video image to obtain a vehicle detection frame for each frame of image. The vehicle is then continuously tracked within the single video picture by the DeepSORT method to obtain a detection frame sequence until the vehicle disappears from the picture.
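As an illustration of this step, the following is a minimal Python sketch of running a per-frame vehicle detector over a monitoring video and collecting the detection frame (bounding box) crops together with the camera number, shooting time and coordinates. The `detect_vehicles` stub, the record field names and the use of OpenCV for decoding are illustrative assumptions and are not specified by the patent; the trained Fast R-CNN detector and the DeepSORT tracker would be plugged in where indicated.

```python
import cv2  # OpenCV is assumed to be available for video decoding


def detect_vehicles(frame):
    """Placeholder for the trained CNN-based vehicle detector (e.g. Fast R-CNN).

    Expected to return a list of (x1, y1, x2, y2, score) bounding boxes."""
    raise NotImplementedError("plug in the trained vehicle detector here")


def extract_detection_frames(video_path, camera_id):
    """Run the detector on every frame and collect per-detection records:
    the cropped detection-frame image plus camera id, time and coordinates."""
    records = []
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        time_ms = cap.get(cv2.CAP_PROP_POS_MSEC)  # shooting time within the video
        for (x1, y1, x2, y2, score) in detect_vehicles(frame):
            crop = frame[int(y1):int(y2), int(x1):int(x2)]  # detection frame image
            records.append({
                "camera_id": camera_id,    # serial number / position of the device
                "time_ms": time_ms,        # time information
                "bbox": (x1, y1, x2, y2),  # coordinate information of the vehicle
                "image": crop,
            })
    cap.release()
    return records
```

Grouping the per-frame detections into a per-vehicle detection frame sequence would be handled by the single-camera tracker (DeepSORT in the embodiment above) before the features are extracted.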
S3, extracting vehicle features from the detection frame image, and performing time series feature fusion to obtain a vehicle feature set collected by each monitoring device;
the embodiment of the invention comprises the following steps:
step one, extracting the image features of each frame's detection frame image by a convolutional neural network image feature extraction method to serve as re-identification features;
step two, pooling the re-identification features through a pooling layer in the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
and step three, taking all time sequence re-identification features in the vehicle running monitoring video as the vehicle feature set of the monitoring device.
In the embodiment of the invention, the time sequence re-identification characteristic is an AMOC-RNN time sequence characteristic. In some embodiments, the monitoring device may monitor a plurality of vehicles, and the vehicle feature set includes vehicle features of the plurality of vehicles.
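A minimal sketch of the per-frame feature extraction and time-series fusion is given below. The patent's embodiment fuses AMOC-RNN time-series features; here a generic ResNet-18 backbone with temporal average pooling is used purely as an illustrative stand-in, and the input size, backbone choice and pooling strategy are assumptions rather than the patented network.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T

# Illustrative per-frame feature extractor: a ResNet-18 backbone with its
# classifier removed. Average pooling over time stands in for the patent's
# AMOC-RNN temporal fusion.
backbone = models.resnet18(weights=None)  # trained re-identification weights assumed
backbone = torch.nn.Sequential(*list(backbone.children())[:-1])
backbone.eval()

preprocess = T.Compose([
    T.ToPILImage(),          # detection-frame crop as an HxWx3 uint8 array
    T.Resize((224, 224)),
    T.ToTensor(),
])


@torch.no_grad()
def fuse_sequence_features(bbox_images):
    """Extract a re-identification feature per detection-frame image and fuse
    the sequence over time, returning one feature vector for the vehicle."""
    per_frame = []
    for img in bbox_images:
        x = preprocess(img).unsqueeze(0)   # shape (1, 3, 224, 224)
        f = backbone(x).flatten(1)         # shape (1, 512) re-identification feature
        per_frame.append(f)
    if not per_frame:
        return None
    return torch.cat(per_frame, dim=0).mean(dim=0)  # fused time-series feature (512,)
```

Collecting the fused feature of every tracked vehicle seen by one camera yields that monitoring device's vehicle feature set.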
S4, carrying out similarity comparison on vehicle features in the vehicle feature sets collected by a plurality of monitoring devices installed along the same geographical line;
In the embodiment of the invention, the same geographical line may be the same expressway, or a plurality of cameras arranged along the same street. When the embodiment of the invention compares the similarity of the vehicle features, the Euclidean distance or the cosine similarity may be used; the cosine similarity is described below.
The feature vectors of the two images can be regarded as two line segments in space, both starting from the origin ([0, 0, ...]) and pointing in different directions; an included angle is formed between the two line segments, and the smaller the angle, the more similar the features. Assuming that the vehicle feature vectors A and B are two n-dimensional vectors, A = [A1, A2, ..., An] and B = [B1, B2, ..., Bn], the cosine of the angle θ between A and B is equal to:

$$\cos\theta = \frac{A \cdot B}{\|A\| \, \|B\|} = \frac{\sum_{i=1}^{n} A_i B_i}{\sqrt{\sum_{i=1}^{n} A_i^{2}} \, \sqrt{\sum_{i=1}^{n} B_i^{2}}}$$

The cosine of the included angle between the vehicle feature vector A and the vehicle feature vector B can be obtained with this formula. The closer the cosine value is to 1, the closer the angle is to 0 degrees, i.e. the more similar the two vectors are. After the included angle of the two vectors is obtained, the degree of similarity of the two vectors can be judged according to the size of the angle. In this way the similarity calculation between A and B is completed, and similar images of the vehicle can be found through this comparison.
And S5, identifying the same vehicle in the multiple monitoring devices according to the similarity comparison result so as to track the vehicle.
In some embodiments, there may be a plurality of vehicles with similar vehicle outlines, and in order to further accurately identify the same vehicle, the embodiment of the present invention performs accurate identification through the following steps:
step one, when a plurality of vehicle characteristics with similarity smaller than a preset value exist in the plurality of monitoring devices, judging whether the monitoring devices pointed by the plurality of vehicle characteristics are adjacent or not according to the serial numbers of the monitoring devices on the same road;
step two, when the positions of the monitoring devices pointed to by the plurality of vehicle characteristics are adjacent, judging whether the time interval between the vehicles pointed to by the position-adjacent vehicle characteristics is less than a preset time period value;
and step three, identifying the vehicles whose time interval is less than the preset time period value as the same vehicle so as to track the vehicle.
According to one embodiment of the invention, once an image of the vehicle is retrieved from any one of the different cameras, the current real-time geographic position of the vehicle can be queried according to the number assigned to the vehicle in the current video picture and the serial number of the camera that captured it.
Vehicles on different roads, that is, vehicles whose positions are not adjacent, and vehicles appearing on the same road in different time periods are thereby excluded, which ensures the accuracy of the tracked vehicle.
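The adjacency and time-window check can be sketched as follows. It assumes that monitoring devices on the same road carry consecutive integer serial numbers and that each feature record stores the camera serial number and shooting time; the record field names and the 60-second window are illustrative assumptions, not values fixed by the patent.

```python
def is_same_vehicle(rec_a, rec_b, time_window_s=60.0):
    """Decide whether two feature-matched records from cameras on the same
    road belong to the same vehicle, using the adjacency of the camera
    serial numbers and the time interval between the two sightings.

    Assumes integer camera serial numbers that are consecutive along the
    road; the 60-second window is an illustrative value."""
    adjacent = abs(rec_a["camera_id"] - rec_b["camera_id"]) == 1
    interval_s = abs(rec_a["time_ms"] - rec_b["time_ms"]) / 1000.0
    return adjacent and interval_s < time_window_s
```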
In some embodiments, in order to obtain a clear vehicle running monitoring video, the method further comprises a step of de-jittering the vehicle running monitoring video shot by the monitoring device, which comprises the following steps:
step one, collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialized background model;
step two, judging, according to the initialized background model, whether the (t+1)-th monitoring frame exhibits jitter;
and step three, when jitter occurs, triggering a digital video de-jitter algorithm to remove the video jitter.
According to one embodiment of the invention, under a single camera, after the monitoring video starts, whether the camera shakes is judged by using the pictures of the first 20 frames, and if the camera shakes, the pictures are stabilized by using a video shaking removal method.
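A minimal sketch of the jitter check is shown below: the first t frames are averaged into an initialized background model and the (t+1)-th frame is compared against it. The grayscale mean-difference criterion and the threshold are illustrative assumptions; the patent only specifies that a digital video de-jitter algorithm is triggered when jitter is detected.

```python
import cv2
import numpy as np


def build_background_model(frames):
    """Average the first t consecutive monitoring frames (in grayscale) to
    initialize the background model."""
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY).astype(np.float32) for f in frames]
    return np.mean(grays, axis=0)


def frame_is_jittered(background, frame, diff_threshold=20.0):
    """Compare the (t+1)-th frame against the background model; a large mean
    absolute grayscale difference is treated as camera jitter. The criterion
    and the threshold are illustrative assumptions."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    return float(np.mean(np.abs(gray - background))) > diff_threshold
```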
According to the vehicle tracking method provided by the embodiment of the invention, similarity comparison is carried out on the vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical line, and the same vehicle is identified among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle. The tracking result of the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices on a road can thus be fed back in real time.
The embodiment of the invention also provides a vehicle tracking device. Referring to fig. 2, fig. 2 is a block diagram of a basic structure of the vehicle tracking device according to the present embodiment.
As shown in fig. 2, a vehicle tracking apparatus includes: an acquisition module 2100, a processing module 2200, and an execution module 2300. The acquisition module 2100 is configured to acquire vehicle running monitoring videos sent by multiple monitoring devices; the processing module 2200 is configured to detect the vehicle in the vehicle running monitoring video by using a preset target detection algorithm for the vehicle, so as to obtain feature information of the vehicle, where the feature information includes a detection frame (bounding box) image of the vehicle; the processing module 2200 is further configured to extract vehicle features from the detection frame image and perform time series feature fusion to obtain the vehicle feature set collected by each monitoring device; the processing module 2200 is further configured to perform similarity comparison on vehicle features in the vehicle feature sets collected by multiple monitoring devices installed along the same geographical line; and the execution module 2300 is configured to identify the same vehicle among the multiple monitoring devices according to the similarity comparison result so as to track the vehicle.
According to the vehicle tracking device provided by the embodiment of the invention, similarity comparison is carried out on the vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical line, and the same vehicle is identified among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle. The tracking result of the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices on a road can thus be fed back in real time.
In some embodiments, the processing module comprises: the first acquisition submodule, used for detecting the vehicle in the vehicle running monitoring video by using a trained convolutional-neural-network-based fast target detection (Fast R-CNN) network to obtain a detection frame (bounding box) image of the vehicle; and the first processing submodule, configured to extract vehicle information from the detection frame image, where the vehicle information includes: a vehicle code, time information, location information of the monitoring device, and coordinate information of the vehicle.
In some embodiments, the processing module comprises: the second acquisition submodule, used for extracting the image features of each frame's detection frame image through a convolutional neural network image feature extraction method to serve as re-identification features; the second processing submodule, used for pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features; and the first execution submodule, used for taking all time sequence re-identification features in the vehicle running monitoring video as the vehicle feature set of the monitoring device.
In some embodiments, the apparatus further comprises: the third acquisition submodule, used for collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialized background model; the third processing submodule, used for judging, according to the initialized background model, whether the (t+1)-th monitoring frame exhibits jitter; and the second execution submodule, used for triggering a digital video de-jitter algorithm to remove the video jitter when jitter occurs.
In some embodiments, the execution module comprises: the fourth processing submodule, used for judging, when a plurality of vehicle features whose similarity is smaller than the preset value exist in the plurality of monitoring devices, whether the monitoring devices pointed to by the plurality of vehicle features are adjacent according to the serial numbers of the monitoring devices on the same road; the fifth processing submodule, used for judging, when the positions of the monitoring devices pointed to by the plurality of vehicle features are adjacent, whether the time interval between the vehicles pointed to by the position-adjacent vehicle features is smaller than a preset time period value; and the third execution submodule, used for identifying the vehicles whose time interval is smaller than the preset time period value as the same vehicle so as to track the vehicle.
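For orientation only, the module split described above can be sketched as a thin Python skeleton; the objects passed in and their method names are hypothetical stand-ins that simply mirror the roles of modules 2100, 2200 and 2300.

```python
class VehicleTrackingDevice:
    """Thin skeleton mirroring the acquisition / processing / execution split.

    The injected objects and their method names are hypothetical stand-ins
    for modules 2100, 2200 and 2300 described above."""

    def __init__(self, acquisition, processing, execution):
        self.acquisition = acquisition   # acquisition module 2100
        self.processing = processing     # processing module 2200
        self.execution = execution       # execution module 2300

    def run(self):
        # 1. obtain monitoring videos from multiple devices
        videos = self.acquisition.get_monitoring_videos()
        # 2. detect vehicles, extract and fuse features per device
        feature_sets = {cam: self.processing.extract_feature_set(video)
                        for cam, video in videos.items()}
        # 3. compare similarities across devices along the same line
        matches = self.processing.compare_features(feature_sets)
        # 4. identify the same vehicle across devices and track it
        return self.execution.identify_and_track(matches)
```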
In order to solve the above technical problem, an embodiment of the present invention further provides a computer device. Referring to fig. 3, fig. 3 is a block diagram of a basic structure of a computer device according to the present embodiment.
Fig. 3 schematically illustrates the internal structure of the computer device. As shown in fig. 3, the computer device includes a processor, a non-volatile storage medium, a memory, and a network interface connected through a system bus. The non-volatile storage medium of the computer device stores an operating system, a database in which control information sequences are stored, and computer readable instructions which, when executed by the processor, cause the processor to implement a vehicle tracking method. The processor of the computer device is used to provide computing and control capabilities to support the operation of the entire computer device. The memory of the computer device may store computer readable instructions that, when executed by the processor, cause the processor to perform the vehicle tracking method. The network interface of the computer device is used to connect and communicate with the terminal. Those skilled in the art will appreciate that the architecture shown in fig. 3 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects may be applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In this embodiment, the processor is configured to execute specific contents of the obtaining module 2100, the processing module 2200, and the executing module 2300 in fig. 2, and the memory stores program codes and various data required for executing the modules. The network interface is used for data transmission to and from a user terminal or a server. The memory in the present embodiment stores program codes and data necessary for executing all the submodules in the vehicle tracking method, and the server can call the program codes and data of the server to execute the functions of all the submodules.
According to the computer device provided by the embodiment of the invention, similarity comparison is carried out on the vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical line, and the same vehicle is identified among the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle, so that the tracking result of the re-identification and tracking of a plurality of vehicles monitored by a plurality of monitoring devices on a road can be fed back in real time.
The present invention also provides a storage medium having stored thereon computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the vehicle tracking method of any of the embodiments described above.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program, which can be stored in a computer-readable storage medium; when the computer program is executed, the processes of the embodiments of the methods described above can be included. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk or a read-only memory (ROM), or may be a random access memory (RAM).
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts of the figures may include multiple sub-steps or multiple stages, which are not necessarily performed at the same moment but may be performed at different moments; their order of execution is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present invention. It should be noted that, for those skilled in the art, various modifications and refinements can be made without departing from the principle of the present invention, and these modifications and refinements shall also fall within the protection scope of the present invention.

Claims (10)

1. A vehicle tracking method, comprising the steps of:
acquiring vehicle running monitoring videos sent by a plurality of monitoring devices;
detecting the vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm aiming at the vehicle to obtain the characteristic information of the vehicle, wherein the characteristic information comprises a detection frame image of the vehicle;
extracting vehicle features from the detection frame image, and performing time series feature fusion to obtain a vehicle feature set collected by each monitoring device;
carrying out similarity comparison on vehicle characteristics in the vehicle characteristic sets acquired by a plurality of monitoring devices installed along the same geographical line;
and identifying the same vehicle in the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle.
2. The vehicle tracking method according to claim 1, wherein the detecting the vehicle in the vehicle driving monitoring video by using a preset target detection algorithm for the vehicle obtains the characteristic information of the vehicle, and comprises:
detecting the vehicle in the vehicle running monitoring video by using a trained convolutional-neural-network-based fast target detection (Fast R-CNN) network to obtain a detection frame image of the vehicle;
extracting vehicle information from the detection frame image, wherein the vehicle information includes: a vehicle code, time information, location information of a monitoring device, and coordinate information of the vehicle.
3. The vehicle tracking method according to claim 1, wherein the extracting vehicle features from the detection frame image for time series feature fusion to obtain a vehicle feature set collected by each monitoring device comprises:
extracting image features of each frame of detection frame image by a convolutional neural network image feature extraction method to serve as re-identification features;
pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
and taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment.
4. The vehicle tracking method according to claim 1, wherein after acquiring the vehicle driving monitoring video transmitted by the plurality of monitoring devices, the method further comprises:
collecting a preset number t of consecutive monitoring frames from the vehicle running monitoring video as an initialized background model;
judging, according to the initialized background model, whether the (t+1)-th monitoring frame exhibits jitter;
and when jitter occurs, triggering a digital video de-jitter algorithm to remove the video jitter.
5. The vehicle tracking method according to claim 1, wherein identifying the same vehicle among the plurality of monitoring devices to track the vehicle according to the similarity comparison result comprises:
when a plurality of vehicle characteristics with the similarity smaller than a preset value exist in the plurality of monitoring devices, judging whether the monitoring devices pointed by the plurality of vehicle characteristics are adjacent or not according to the serial numbers of the monitoring devices on the same road;
when the positions of the monitoring devices pointed to by the plurality of vehicle characteristics are adjacent, judging whether the time interval between the vehicles pointed to by the position-adjacent vehicle characteristics is smaller than a preset time period value;
and identifying a plurality of vehicles smaller than the preset time period value as the same vehicle so as to track the vehicles.
6. A vehicle tracking device, comprising:
the acquisition module is used for acquiring vehicle running monitoring videos sent by a plurality of monitoring devices;
the processing module is used for detecting the vehicle in the vehicle running monitoring video by adopting a preset target detection algorithm aiming at the vehicle to obtain the characteristic information of the vehicle, wherein the characteristic information comprises a detection frame image of the vehicle;
the processing module is further configured to extract vehicle features from the detection frame image and perform time series feature fusion to obtain a vehicle feature set acquired by each monitoring device;
the processing module is further used for carrying out similarity comparison on the vehicle characteristics in the vehicle characteristic set acquired by the monitoring equipment installed on the same geographical line;
and the execution module is used for identifying the same vehicle in the plurality of monitoring devices according to the similarity comparison result so as to track the vehicle.
7. The vehicle tracking device of claim 6, wherein the processing module comprises:
the first acquisition submodule is used for detecting the vehicle in the vehicle running monitoring video by using a trained convolutional-neural-network-based fast target detection (Fast R-CNN) network to obtain a detection frame image of the vehicle;
the first processing submodule is used for extracting vehicle information from the detection frame image, wherein the vehicle information comprises: a vehicle code, time information, location information of a monitoring device, and coordinate information of the vehicle.
8. The vehicle tracking device of claim 6, wherein the processing module comprises:
the second acquisition submodule is used for extracting the vehicle image characteristics of each frame of detection frame image through a convolutional neural network image characteristic extraction method to serve as re-identification characteristics;
the second processing submodule is used for pooling the re-identification features through a pooling layer of the convolutional neural network feature extraction network to obtain fused time sequence re-identification features;
and the first execution submodule is used for taking all time sequence re-identification features in the vehicle running monitoring video as a vehicle feature set of the monitoring equipment.
9. A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to carry out the steps of the vehicle tracking method according to any one of claims 1 to 5.
10. A storage medium having stored thereon computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the vehicle tracking method of any one of claims 1 to 5.
CN202111372602.6A 2021-11-18 2021-11-18 Vehicle tracking method and device, computer equipment and storage medium Active CN114067270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111372602.6A CN114067270B (en) 2021-11-18 2021-11-18 Vehicle tracking method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111372602.6A CN114067270B (en) 2021-11-18 2021-11-18 Vehicle tracking method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114067270A (en) 2022-02-18
CN114067270B (en) 2022-09-09

Family

ID=80278364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111372602.6A Active CN114067270B (en) 2021-11-18 2021-11-18 Vehicle tracking method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114067270B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114820699A (en) * 2022-03-29 2022-07-29 小米汽车科技有限公司 Multi-target tracking method, device, equipment and medium
CN115171378A (en) * 2022-06-28 2022-10-11 武汉理工大学 Long-distance multi-vehicle high-precision detection tracking method based on roadside radar
CN117312598A (en) * 2023-11-27 2023-12-29 广东利通科技投资有限公司 Evidence obtaining method, device, computer equipment and storage medium for fee evasion auditing

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007235952A (en) * 2006-02-28 2007-09-13 Alpine Electronics Inc Method and device for tracking vehicle
JP2014191664A (en) * 2013-03-27 2014-10-06 Fujitsu Ltd Vehicle tracking program, image transmission program, server device, information processing apparatus, and vehicle tracking method
CN106599918A (en) * 2016-12-13 2017-04-26 开易(深圳)科技有限公司 Vehicle tracking method and system
CN108629791A (en) * 2017-03-17 2018-10-09 北京旷视科技有限公司 Pedestrian tracting method and device and across camera pedestrian tracting method and device
CN110443828A (en) * 2019-07-31 2019-11-12 腾讯科技(深圳)有限公司 Method for tracing object and device, storage medium and electronic device
CN111126223A (en) * 2019-12-16 2020-05-08 山西大学 Video pedestrian re-identification method based on optical flow guide features
CN111310633A (en) * 2020-02-10 2020-06-19 江南大学 Parallel space-time attention pedestrian re-identification method based on video
WO2020145882A1 (en) * 2019-01-09 2020-07-16 Hitachi, Ltd. Object tracking systems and methods for tracking a target object
CN111709328A (en) * 2020-05-29 2020-09-25 北京百度网讯科技有限公司 Vehicle tracking method and device and electronic equipment
CN112069969A (en) * 2020-08-31 2020-12-11 河北省交通规划设计院 Method and system for tracking highway monitoring video mirror-crossing vehicle
WO2020248248A1 (en) * 2019-06-14 2020-12-17 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for object tracking
CN113221750A (en) * 2021-05-13 2021-08-06 杭州飞步科技有限公司 Vehicle tracking method, device, equipment and storage medium
CN113256690A (en) * 2021-06-16 2021-08-13 中国人民解放军国防科技大学 Pedestrian multi-target tracking method based on video monitoring

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007235952A (en) * 2006-02-28 2007-09-13 Alpine Electronics Inc Method and device for tracking vehicle
JP2014191664A (en) * 2013-03-27 2014-10-06 Fujitsu Ltd Vehicle tracking program, image transmission program, server device, information processing apparatus, and vehicle tracking method
CN106599918A (en) * 2016-12-13 2017-04-26 开易(深圳)科技有限公司 Vehicle tracking method and system
CN108629791A (en) * 2017-03-17 2018-10-09 北京旷视科技有限公司 Pedestrian tracting method and device and across camera pedestrian tracting method and device
WO2020145882A1 (en) * 2019-01-09 2020-07-16 Hitachi, Ltd. Object tracking systems and methods for tracking a target object
WO2020248248A1 (en) * 2019-06-14 2020-12-17 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for object tracking
CN110443828A (en) * 2019-07-31 2019-11-12 腾讯科技(深圳)有限公司 Method for tracing object and device, storage medium and electronic device
CN111126223A (en) * 2019-12-16 2020-05-08 山西大学 Video pedestrian re-identification method based on optical flow guide features
CN111310633A (en) * 2020-02-10 2020-06-19 江南大学 Parallel space-time attention pedestrian re-identification method based on video
CN111709328A (en) * 2020-05-29 2020-09-25 北京百度网讯科技有限公司 Vehicle tracking method and device and electronic equipment
CN112069969A (en) * 2020-08-31 2020-12-11 河北省交通规划设计院 Method and system for tracking highway monitoring video mirror-crossing vehicle
CN113221750A (en) * 2021-05-13 2021-08-06 杭州飞步科技有限公司 Vehicle tracking method, device, equipment and storage medium
CN113256690A (en) * 2021-06-16 2021-08-13 中国人民解放军国防科技大学 Pedestrian multi-target tracking method based on video monitoring

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
WENQIAN LIU et al.: "multi-camera multi-object tracking", 《COMPUTER VISION AND PATTERN RECOGNITION》 *
YANGCHUN ZHU et al.: "A Structured Graph Attention Network for Vehicle Re-Identification", 《PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA》 *
徐少强: "Object Tracking and Behavior Recognition in Surveillance Video", 《China Master's and Doctoral Dissertations Full-text Database (Master), Information Science and Technology》 *
邹易: "Research on Multi-Target Tracking of Preceding Vehicles Based on Vehicle-Mounted Monocular Vision", 《China Master's and Doctoral Dissertations Full-text Database (Master), Engineering Science and Technology II》 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114820699A (en) * 2022-03-29 2022-07-29 小米汽车科技有限公司 Multi-target tracking method, device, equipment and medium
CN115171378A (en) * 2022-06-28 2022-10-11 武汉理工大学 Long-distance multi-vehicle high-precision detection tracking method based on roadside radar
CN115171378B (en) * 2022-06-28 2023-10-27 武汉理工大学 High-precision detection tracking method for long-distance multiple vehicles based on road side radar
CN117312598A (en) * 2023-11-27 2023-12-29 广东利通科技投资有限公司 Evidence obtaining method, device, computer equipment and storage medium for fee evasion auditing
CN117312598B (en) * 2023-11-27 2024-04-09 广东利通科技投资有限公司 Evidence obtaining method, device, computer equipment and storage medium for fee evasion auditing

Also Published As

Publication number Publication date
CN114067270B (en) 2022-09-09

Similar Documents

Publication Publication Date Title
CN114067270B (en) Vehicle tracking method and device, computer equipment and storage medium
CN110191424B (en) Specific suspect track generation method and apparatus
US11250054B1 (en) Dynamic partitioning of input frame buffer to optimize resources of an object detection and recognition system
US8970694B2 (en) Video processing system providing overlay of selected geospatially-tagged metadata relating to a geolocation outside viewable area and related methods
WO2016202027A1 (en) Object movement trajectory recognition method and system
US11734783B2 (en) System and method for detecting on-street parking violations
US8717436B2 (en) Video processing system providing correlation between objects in different georeferenced video feeds and related methods
JP2011521541A (en) System and method for electronic monitoring
US8363109B2 (en) Video processing system providing enhanced tracking features for moving objects outside of a viewable window and related methods
CN110659391A (en) Video detection method and device
US8933961B2 (en) Video processing system generating corrected geospatial metadata for a plurality of georeferenced video feeds and related methods
US11657623B2 (en) Traffic information providing method and device, and computer program stored in medium in order to execute method
CN107613244A (en) A kind of navigation channel monitoring objective acquisition methods and device
CN103581620A (en) Image processing apparatus, image processing method and program
CN117242489A (en) Target tracking method and device, electronic equipment and computer readable medium
EP3244344A1 (en) Ground object tracking system
Glasl et al. Video based traffic congestion prediction on an embedded system
US11488390B2 (en) Map generation device, recording medium and map generation method
Luo et al. Complete trajectory extraction for moving targets in traffic scenes that considers multi-level semantic features
KR20050034224A (en) A system for automatic parking violation regulation, parking control,and disclosure and roundup of illegal vehicles using wireless communication
CN115114302A (en) Road sign data updating method and device, electronic equipment and storage medium
CN111833253A (en) Method and device for constructing spatial topology of interest points, computer system and medium
CN113689705A (en) Method and device for detecting red light running of vehicle, computer equipment and storage medium
Wan et al. Lithuanian traffic monitoring system (LTTMS) based on Android
CN114663469A (en) Target object tracking method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant