CN111369587B - Tracking method and device

Tracking method and device

Info

Publication number
CN111369587B
Authority
CN
China
Prior art keywords
tracking
coordinates
coordinate
monitoring
video image
Prior art date
Legal status
Active
Application number
CN201910510134.0A
Other languages
Chinese (zh)
Other versions
CN111369587A
Inventor
王科
沈涛
裴建军
于建志
张�浩
刘义
陈延鸿
Current Assignee
Shenzhen city public security bureau traffic police bureau
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Shenzhen city public security bureau traffic police bureau
Hangzhou Hikvision System Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen city public security bureau traffic police bureau, Hangzhou Hikvision System Technology Co Ltd filed Critical Shenzhen city public security bureau traffic police bureau
Priority to CN201910510134.0A
Publication of CN111369587A
Application granted
Publication of CN111369587B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The embodiment of the invention provides a tracking method and apparatus. The method comprises: determining a first monitoring device whose image detection has deviated; obtaining first tracking coordinates in the video image of a target overlapping area captured by the first monitoring device, together with second tracking coordinates having the same timestamps in the video image of the target overlapping area captured by a second monitoring device; and taking the plane coordinates corresponding to the second tracking coordinates as the plane coordinates corresponding to the first tracking coordinates. A target coordinate transformation matrix of the first monitoring device is then determined from the first tracking coordinates and their corresponding plane coordinates. When tracking coordinates of a target tracking object captured by the first monitoring device are obtained, the plane coordinates of the target tracking object are determined according to the target coordinate transformation matrix. In this way, the first monitoring device whose video image has deviated can be identified and its coordinate transformation matrix recalibrated, thereby improving the accuracy of the trajectory information of tracked objects.

Description

Tracking method and device
Technical Field
The present invention relates to the field of video monitoring technologies, and in particular, to a tracking method and apparatus.
Background
In order to better address urban traffic problems, traffic managers need to monitor and track vehicles by video. Because road sections are long and the field of view of a single monitoring device is narrow, a single device can hardly cover an entire road. In the prior art, multiple monitoring devices are therefore deployed beside the road, each capturing a different road section, and the road sections captured by adjacent devices overlap, so that the plurality of monitoring devices together monitor the whole road.
An existing tracking method for multi-video monitoring works as follows: the server obtains the videos captured by all monitoring devices and then obtains the pixel coordinates of a first tracked object in each of those videos. A coordinate transformation matrix is preset in the server for every monitoring device, and for each device the server converts the pixel coordinates of the first tracked object in that device's video into plane coordinates according to the corresponding matrix. The plane coordinates may be coordinates in a Gaussian plane coordinate system. The server stores the plane coordinates of the first tracked object from the videos of all devices in association, thereby obtaining the trajectory information of the first tracked object. Because this trajectory information consists of plane coordinates in a single coordinate system, it reflects the positional relationship of the first tracked object across videos captured by different monitoring devices.
However, when a monitoring device is displaced by an external force, the pixel coordinates of the first tracked object in the video captured by that device shift accordingly. The coordinate transformation matrix preset for that device then no longer matches the device, and the plane coordinates converted from those pixel coordinates using that matrix are inaccurate, so the accuracy of the trajectory information of objects tracked on the road is low.
Disclosure of Invention
The embodiment of the invention aims to provide a tracking method and apparatus that can determine a first monitoring device whose video image has deviated and calibrate the coordinate transformation matrix of that device, thereby improving the accuracy of the trajectory information of tracked objects. The specific technical solution is as follows:
in a first aspect, a tracking method is provided, where the method is applied to a server in a multi-video monitoring system, where the multi-video monitoring system further includes a plurality of monitoring devices, and monitoring areas shot by adjacent monitoring devices have overlapping areas, and the method includes:
determining a first monitoring device whose image detection has deviated, wherein a tracking object in a video image captured by the first monitoring device and a tracking object in a video image captured by a monitoring device adjacent to the first monitoring device do not meet a preset matching condition;
acquiring a plurality of first tracking coordinates in a video image of a target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, wherein a first tracking coordinate is the pixel coordinate of a tracking object in the video image captured by the first monitoring device when only one tracking object is present in the target overlapping area between the first monitoring device and a second monitoring device;
for each first tracking coordinate, acquiring a second tracking coordinate having the same timestamp as the first tracking coordinate in the video image of the target overlapping area captured by the second monitoring device, and determining the plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and the coordinate transformation matrix of the second monitoring device;
taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
determining a target coordinate transformation matrix corresponding to the first monitoring device according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate;
and when the tracking coordinates of a target tracking object in a video image captured by the first monitoring device are obtained, determining the plane coordinates corresponding to the target tracking object according to the tracking coordinates of the target tracking object and the target coordinate transformation matrix.
Optionally, the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and the time stamp of each first tracking coordinate includes:
receiving a plurality of tracking coordinate sets sent by the first monitoring equipment, wherein each tracking coordinate set comprises at least one tracking coordinate, and the time stamps of the tracking coordinates in the same tracking coordinate set are the same;
for each tracking coordinate set, determining the tracking coordinates that fall within a preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a time stamp of the first tracking coordinates.
Optionally, the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and the time stamp of each first tracking coordinate includes:
acquiring a first video shot by the first monitoring equipment;
collecting tracking coordinates in video images of a first video according to a preset time interval, and taking shooting time of the video images to which the tracking coordinates belong as a time stamp of the tracking coordinates;
for each group of collected tracking coordinates having the same timestamp, determining the tracking coordinates that fall within a preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a time stamp of the first tracking coordinates.
Optionally, the determining of the first monitoring device whose image detection has deviated includes:
for each monitoring device, acquiring plane coordinates of a first tracking object in a video image of a first overlapping area shot by the monitoring device, and recording a time stamp of the plane coordinates of the first tracking object;
acquiring, in the video image of the first overlapping area captured by the monitoring device adjacent to this monitoring device, the plane coordinates of a second tracking object whose timestamp is identical to that of the plane coordinates of the first tracking object;
determining the distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
if a second tracking object with the distance from the first tracking object being smaller than a preset first threshold exists, judging that the first tracking object is successfully matched; if a second tracking object with the distance smaller than the preset first threshold value does not exist, judging that the first tracking object fails to match;
Determining the matching success rate corresponding to the monitoring equipment according to the proportion between the number of successfully matched first tracking objects and the total number of the first tracking objects in the video image of the first overlapping area;
and determining the first monitoring equipment with the matching success rate smaller than a preset second threshold value.
In a second aspect, there is provided a tracking apparatus, the apparatus being applied to a server in a multi-video monitoring system, the multi-video monitoring system further including a plurality of monitoring devices, wherein monitoring areas photographed by adjacent monitoring devices have overlapping areas, the apparatus comprising:
the first determining module is used for determining first monitoring equipment with deviation of image detection, wherein a tracking object in a video image shot by the first monitoring equipment and a tracking object in a video image shot by monitoring equipment adjacent to the first monitoring equipment do not meet a preset matching condition;
the first acquisition module is used for acquiring a plurality of first tracking coordinates in a video image of a target overlapping area shot by the first monitoring equipment and time stamps of the first tracking coordinates, wherein the first tracking coordinates are pixel coordinates of a tracking object in the video image shot by the first monitoring equipment when only one tracking object exists in the target overlapping area of the first monitoring equipment and the second monitoring equipment;
the second acquisition module is used for acquiring, for each first tracking coordinate, a second tracking coordinate whose timestamp is the same as that of the first tracking coordinate in the video image of the target overlapping area captured by the second monitoring device, and for determining the plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and the coordinate transformation matrix of the second monitoring device;
the second determining module is used for taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
the third determining module is used for determining a target coordinate transformation matrix corresponding to the first monitoring equipment according to the plurality of first tracking coordinates and the plane coordinates corresponding to each first tracking coordinate;
and the fourth determining module is used for determining the plane coordinates corresponding to the target tracking object according to the tracking coordinates of the target tracking object and the target coordinate conversion matrix when the tracking coordinates of the target tracking object in the video image shot by the first monitoring equipment are acquired.
Optionally, the first obtaining module includes:
the receiving unit is used for receiving a plurality of tracking coordinate sets sent by the first monitoring equipment, wherein each tracking coordinate set comprises at least one tracking coordinate, and the time stamps of the tracking coordinates in the same tracking coordinate set are the same;
a first determining unit, configured to determine, for each tracking coordinate set, the tracking coordinates within a preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and the first acquisition unit is used for taking the determined tracking coordinates as first tracking coordinates and acquiring the time stamp of the first tracking coordinates if the number of the determined tracking coordinates is 1.
Optionally, the first obtaining module includes:
the second acquisition unit is used for acquiring a first video shot by the first monitoring equipment;
the acquisition unit is used for acquiring tracking coordinates in video images of the first video according to a preset time interval, and taking the shooting time of the video image to which the tracking coordinates belong as a time stamp of the tracking coordinates;
the second determining unit is used for determining, for each group of collected tracking coordinates with the same timestamp, the tracking coordinates within a preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and the third acquisition unit is used for taking the determined tracking coordinates as first tracking coordinates and acquiring the time stamp of the first tracking coordinates if the number of the determined tracking coordinates is 1.
Optionally, the first determining module includes:
a fourth obtaining unit, configured to obtain, for each monitoring device, a plane coordinate of a first tracking object in a video image of a first overlapping area captured by the monitoring device, and record a timestamp of the plane coordinate of the first tracking object;
a fifth obtaining unit, configured to obtain, in the video image of the first overlapping area captured by a monitoring device adjacent to the monitoring device, the plane coordinate of a second tracking object whose timestamp is the same as that of the plane coordinate of the first tracking object;
a third determining unit, configured to determine a distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
the matching unit is used for judging that the first tracking object is successfully matched if a second tracking object with the distance to the first tracking object smaller than a preset first threshold exists; if a second tracking object with the distance smaller than the preset first threshold value does not exist, judging that the first tracking object fails to match;
a fourth determining unit, configured to determine a matching success rate corresponding to the monitoring device according to a ratio between the number of successfully matched first tracking objects and the total number of first tracking objects in the video image of the first overlapping area;
And a fifth determining unit, configured to determine a first monitoring device with a matching success rate smaller than a preset second threshold.
In a third aspect, an electronic device is provided that includes a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor, the instructions causing the processor to implement the method steps of the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, in which a computer program is stored which, when being executed by a processor, carries out the method steps according to the first aspect.
According to the tracking method and apparatus provided by the embodiments of the invention, the first monitoring device whose image detection has deviated can be determined, and then a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device are obtained, together with a timestamp of each first tracking coordinate, a first tracking coordinate being the pixel coordinate of the tracking object in the video image captured by the first monitoring device when only one tracking object is present in the target overlapping area between the first monitoring device and the second monitoring device. Further, for each first tracking coordinate, a second tracking coordinate with the same timestamp is obtained from the video image of the target overlapping area captured by the second monitoring device, the plane coordinate corresponding to the second tracking coordinate is determined according to the second tracking coordinate and the preset coordinate transformation matrix of the second monitoring device, and that plane coordinate is taken as the plane coordinate corresponding to the first tracking coordinate. A target coordinate transformation matrix corresponding to the first monitoring device is then determined according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate. When the tracking coordinates of a target tracking object in a video image captured by the first monitoring device are obtained, the plane coordinates corresponding to the target tracking object are determined according to the tracking coordinates of the target tracking object and the target coordinate transformation matrix. In this way, the first monitoring device whose video image has deviated can be determined and its coordinate transformation matrix calibrated, thereby improving the accuracy of the trajectory information of tracked objects.
Of course, it is not necessary for any one product or method of practicing the invention to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the invention, and other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of a multi-video monitoring system according to an embodiment of the present application;
FIG. 2 is a flowchart of a tracking method according to an embodiment of the present application;
FIG. 3 is a flowchart of a method for obtaining first tracking coordinates according to an embodiment of the present application;
FIG. 4 is a flowchart of another method for obtaining first tracking coordinates according to an embodiment of the present application;
FIG. 5 is a flowchart of a method for determining a first monitoring device whose image detection has deviated according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a tracking apparatus according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following clearly and completely describes the embodiments of the present invention with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the protection scope of the invention.
The embodiment of the application provides a tracking method applied to a server in a multi-video monitoring system. The monitoring devices are arranged beside a road, and each monitoring device captures a different road section, producing different video images. The road sections captured by adjacent monitoring devices overlap, so that the plurality of monitoring devices together monitor the whole road. As shown in fig. 1, monitoring device 1 captures a video image of monitoring area 1, monitoring device 2 captures a video image of monitoring area 2, monitoring device 3 captures a video image of monitoring area 3, and monitoring device 4 captures a video image of monitoring area 4; monitoring area 1 overlaps monitoring area 2, monitoring area 2 overlaps monitoring area 3, and monitoring area 3 overlaps monitoring area 4. It should be noted that fig. 1 only illustrates that an overlapping area exists between the monitoring areas of two adjacent monitoring devices; the number, installation position and installation angle of the monitoring devices are not particularly limited.
The server may be installed in a monitoring center and connected to each monitoring device through a network. In the embodiment of the application, the server obtains the pixel coordinates of the tracked object in the video images captured by the monitoring devices and, according to those pixel coordinates and the preset coordinate transformation matrix corresponding to each monitoring device, obtains the plane coordinates of the tracked object in each device's video image, thereby obtaining the trajectory information of the tracked object on the road. When a monitoring device whose image detection has deviated exists, the server can determine that first monitoring device and calibrate the coordinate transformation matrix corresponding to it, thereby improving the accuracy of the trajectory information of the tracked object. The tracked object may be a vehicle.
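To make this conversion pipeline concrete, the following sketch shows how a server might assemble trajectory information from per-device detections. It is illustrative only: it assumes each coordinate transformation matrix is a 3x3 perspective matrix and that detections arrive as (device_id, timestamp, u, v) tuples; all function and variable names are hypothetical, not taken from the patent.

```python
import numpy as np

def pixel_to_plane(H, u, v):
    """Apply a 3x3 coordinate transformation (perspective) matrix H to a pixel
    coordinate (u, v) and return the corresponding plane coordinate (x, y)."""
    px, py, pw = H @ np.array([u, v, 1.0])
    return px / pw, py / pw

def build_trajectory(detections, matrices):
    """detections: iterable of (device_id, timestamp, u, v) for one tracked object.
    matrices: dict mapping device_id to that device's coordinate transformation matrix.
    Returns the trajectory as a time-ordered list of (timestamp, x, y) in the common
    plane coordinate system."""
    trajectory = [(ts, *pixel_to_plane(matrices[dev], u, v))
                  for dev, ts, u, v in detections]
    trajectory.sort(key=lambda point: point[0])  # order points by timestamp
    return trajectory
```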
As shown in fig. 2, the method comprises the steps of:
in step 201, a first monitoring device in which image detection is biased is determined.
Here, a tracking object in the video image captured by the first monitoring device and a tracking object in the video image captured by a monitoring device adjacent to the first monitoring device do not meet a preset matching condition.
In practical applications, when one of the monitoring devices is reinstalled or moved by an external force, its image detection deviates; that is, the pixel coordinates of a tracking object in the video images it captures are shifted relative to earlier images. The coordinate transformation matrix corresponding to that device then no longer matches the device, and the plane coordinates converted from the pixel coordinates of the tracking object using that matrix are no longer accurate.
In implementation, for each monitoring device, the server may determine whether the tracking objects in the video images captured by that device and the tracking objects in the video images captured by the adjacent monitoring devices meet a preset matching condition; if not, the server takes that device as the first monitoring device whose image detection has deviated. For example, the server may obtain the pixel coordinates of the tracking objects in the video images captured by the device, take a tracking object appearing in the video image of the first overlapping area captured by the device as a first tracking object, and obtain the plane coordinates of the first tracking object and the timestamps of those plane coordinates, the plane coordinates being obtained from the pixel coordinates of the first tracking object and the coordinate transformation matrix corresponding to the device. Similarly, the server can obtain the plane coordinates, and their timestamps, of a second tracking object in the video image of the first overlapping area captured by the adjacent monitoring device. From the plane coordinates and timestamps of the first tracking object and the plane coordinates and timestamps of the second tracking object, the server can determine a matching success rate for the device, and then take a device whose matching success rate is smaller than a preset threshold as the first monitoring device whose image detection has deviated. The specific process by which the server determines the first monitoring device is described in detail later.
It should be noted that the first monitoring device whose image detection has deviated may also be designated by a technician.
Step 202, obtaining a plurality of first tracking coordinates and time stamps of the first tracking coordinates in a video image of a target overlapping area shot by a first monitoring device.
Here, a first tracking coordinate is the pixel coordinate of a tracking object in the video image captured by the first monitoring device when only one tracking object is present in the target overlapping area between the first monitoring device and the second monitoring device. The plurality of first tracking coordinates are not on the same straight line (i.e., they do not satisfy a linear relationship), and the first tracking coordinates may be pixel coordinates of a tracking object near an edge of the video image captured by the first monitoring device.
In implementation, the server may acquire a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate. For example, the server may acquire a first video captured by the first monitoring device and determine the first tracking coordinates and their timestamps from the video images of the first video. Alternatively, after capturing the video images of the first video, the first monitoring device may itself identify the tracking coordinates in each video image, take the capture time of the video image to which a tracking coordinate belongs as the timestamp of that tracking coordinate, and send the tracking coordinates and their timestamps to the server; the server can then determine the first tracking coordinates and their timestamps from the tracking coordinates and timestamps sent by the first monitoring device. The specific process by which the server determines the first tracking coordinates and their timestamps is described in detail later.
It should be noted that, as shown in fig. 1, the monitoring area photographed by the first monitoring device may include two overlapping areas, and the target overlapping area may be any one of the two overlapping areas. The server needs to acquire at least 4 first tracking coordinates, because the coordinate transformation matrix of the first monitoring device needs to be calculated according to at least 4 first tracking coordinates in the subsequent process, and the specific calculation process will be described in detail later.
Step 203, for each first tracking coordinate, obtaining a second tracking coordinate with the same timestamp as the first tracking coordinate in the video image of the target overlapping area shot by the second monitoring device, and determining a plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and the coordinate conversion matrix of the second monitoring device.
In an implementation, similarly, the server may further acquire a plurality of tracking coordinates in the video image of the target overlapping region captured by the second monitoring device, and a time stamp of each tracking coordinate.
For each first tracking coordinate, the server may determine, from the plurality of tracking coordinates in the video image of the target overlapping area captured by the second monitoring device, the second tracking coordinate whose timestamp is the same as that of the first tracking coordinate. A first tracking coordinate and its corresponding second tracking coordinate are the pixel coordinates of the same tracking object at the same moment in the video images captured by the first and second monitoring devices, respectively. For example, assume that the first tracking coordinate is $(u_1, v_1)$ with timestamp 10 s, and that the tracking coordinates in the video image of the target overlapping area captured by the second monitoring device are: $(u_2, v_2)$ with timestamp 8 s, $(u_3, v_3)$ with timestamp 9 s, and $(u_4, v_4)$ with timestamp 10 s. The second tracking coordinate with the same timestamp as the first tracking coordinate $(u_1, v_1)$ is then $(u_4, v_4)$.
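The timestamp matching just described can be sketched as a simple lookup. The snippet below is a minimal illustration, assuming the tracking coordinates of each device are held in a dict keyed by timestamp; the names are hypothetical.

```python
def pair_by_timestamp(first_coords, second_coords):
    """first_coords, second_coords: dicts mapping timestamp -> (u, v) for the target
    overlapping area of the first and second monitoring devices, respectively.
    Returns a list of (first (u, v), second (u, v)) pairs sharing a timestamp."""
    pairs = []
    for ts, first in first_coords.items():
        second = second_coords.get(ts)
        if second is not None:  # the second device reported a coordinate at the same time
            pairs.append((first, second))
    return pairs
```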
The server may be preset with the coordinate transformation matrix corresponding to the second monitoring device, and can determine the plane coordinate corresponding to the second tracking coordinate according to that matrix and the second tracking coordinate. A plane coordinate is a coordinate in some planar rectangular coordinate system, for example a Gaussian plane coordinate system or a planar rectangular coordinate system established by a technician for the actual road section. Since the plane coordinates of the same tracking object in video images captured by different monitoring devices at the same moment are the same or similar, and since a first tracking coordinate and its corresponding second tracking coordinate are the pixel coordinates of the same tracking object at the same moment in the video images captured by the first and second monitoring devices respectively, the server may take the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate.
For example, assume that the second tracking coordinate is $(u_4, v_4)$ and that the coordinate transformation matrix corresponding to the second monitoring device is

$$H = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & 1 \end{bmatrix}$$

Then the plane coordinate $(x_4, y_4)$ corresponding to the second tracking coordinate can be calculated according to formulas (1), (2) and (3), where

$$\begin{bmatrix} x' \\ y' \\ w \end{bmatrix} = H \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \tag{1}$$

$$x = \frac{x'}{w} = \frac{a_{11}u + a_{12}v + a_{13}}{a_{31}u + a_{32}v + 1} \tag{2}$$

$$y = \frac{y'}{w} = \frac{a_{21}u + a_{22}v + a_{23}}{a_{31}u + a_{32}v + 1} \tag{3}$$

Here $u$ is the abscissa of the pixel coordinate (i.e., the tracking coordinate), $v$ is the ordinate of the pixel coordinate, $x$ is the abscissa of the plane coordinate corresponding to the pixel coordinate, $y$ is the ordinate of that plane coordinate, and $H$ (an eight-parameter perspective matrix with $a_{33} = 1$) is the coordinate transformation matrix.
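As a cross-check of formulas (1) through (3), the following small function evaluates the mapping directly from the eight parameters; it is a sketch under the $a_{33} = 1$ assumption above, with illustrative names.

```python
def apply_transform(a11, a12, a13, a21, a22, a23, a31, a32, u, v):
    """Map a pixel coordinate (u, v) to a plane coordinate (x, y) using the eight
    parameters of the coordinate transformation matrix (a33 fixed to 1)."""
    w = a31 * u + a32 * v + 1.0          # denominator from the third row of formula (1)
    x = (a11 * u + a12 * v + a13) / w    # formula (2)
    y = (a21 * u + a22 * v + a23) / w    # formula (3)
    return x, y
```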
Step 204, taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate.
In implementation, for each first tracking coordinate, the server may take the plane coordinate corresponding to the second tracking coordinate determined for that first tracking coordinate in step 203 as the plane coordinate corresponding to the first tracking coordinate.
Step 205, determining a target coordinate transformation matrix corresponding to the first monitoring device according to the plurality of first tracking coordinates and the plane coordinates corresponding to each first tracking coordinate.
In implementation, the conversion relationship between a tracking coordinate and its corresponding plane coordinate follows from formulas (1), (2) and (3) and can be written as formulas (4) and (5). From the plurality of first tracking coordinates, the plane coordinate corresponding to each first tracking coordinate, and formulas (4) and (5), the server can calculate the perspective transformation matrix between the first tracking coordinates and their corresponding plane coordinates, that is, the target coordinate transformation matrix corresponding to the first monitoring device.
$$a_{11}u + a_{12}v + a_{13} - a_{31}ux - a_{32}vx = x \tag{4}$$

$$a_{21}u + a_{22}v + a_{23} - a_{31}uy - a_{32}vy = y \tag{5}$$
For example, assume that there are 4 first tracking coordinates, $(u_1, v_1)$, $(u_2, v_2)$, $(u_3, v_3)$ and $(u_4, v_4)$, whose corresponding plane coordinates are $(x_1, y_1)$, $(x_2, y_2)$, $(x_3, y_3)$ and $(x_4, y_4)$, respectively. Formula (4) is obtained from formulas (1) and (2), and formula (5) is obtained from formulas (1) and (3). Substituting each first tracking coordinate and its corresponding plane coordinate into formulas (4) and (5) yields the system of equations shown in formula (6), from which the values of all parameters of the coordinate transformation matrix can be calculated, giving the target coordinate transformation matrix corresponding to the first monitoring device. Because this coordinate transformation matrix has 8 unknown parameters, at least 8 equations are required to solve for all of them; since each pair of a first tracking coordinate and its corresponding plane coordinate yields 2 equations, at least 4 first tracking coordinates and their corresponding plane coordinates are required.
$$\begin{cases} a_{11}u_i + a_{12}v_i + a_{13} - a_{31}u_i x_i - a_{32}v_i x_i = x_i \\ a_{21}u_i + a_{22}v_i + a_{23} - a_{31}u_i y_i - a_{32}v_i y_i = y_i \end{cases} \quad i = 1, 2, 3, 4 \tag{6}$$
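System (6) is linear in the eight unknowns, so it can be solved with an ordinary least-squares routine once at least four non-collinear correspondences are available. The sketch below is one possible implementation under the $a_{33} = 1$ normalization used above; it is not taken from the patent, and the names are illustrative.

```python
import numpy as np

def solve_transform(pixel_coords, plane_coords):
    """pixel_coords: list of (u, v) first tracking coordinates (at least 4, not collinear).
    plane_coords: list of (x, y) plane coordinates, one per first tracking coordinate.
    Returns the 3x3 coordinate transformation matrix with a33 fixed to 1."""
    A, b = [], []
    for (u, v), (x, y) in zip(pixel_coords, plane_coords):
        # row from formula (4): a11*u + a12*v + a13 - a31*u*x - a32*v*x = x
        A.append([u, v, 1.0, 0.0, 0.0, 0.0, -u * x, -v * x])
        b.append(x)
        # row from formula (5): a21*u + a22*v + a23 - a31*u*y - a32*v*y = y
        A.append([0.0, 0.0, 0.0, u, v, 1.0, -u * y, -v * y])
        b.append(y)
    params, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    a11, a12, a13, a21, a22, a23, a31, a32 = params
    return np.array([[a11, a12, a13],
                     [a21, a22, a23],
                     [a31, a32, 1.0]])
```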
Step 206, when the tracking coordinates of the target tracking object in a video image captured by the first monitoring device are obtained, determining the plane coordinates of the target tracking object according to the tracking coordinates of the target tracking object and the target coordinate transformation matrix.
In implementation, when the tracking coordinates of the target tracking object in a video image captured by the first monitoring device are obtained, the server may determine the plane coordinates of the target tracking object according to those tracking coordinates and the target coordinate transformation matrix, and store the plane coordinates of the target tracking object so as to obtain the trajectory information of the target tracking object.
Therefore, the first monitoring device with deviation of the video image can be determined, and the coordinate transformation matrix corresponding to the first monitoring device is calibrated, so that the accuracy of tracking the track information of the object is improved.
Alternatively, the manner in which the server obtains the first tracking coordinates and the time stamp of each first tracking coordinate may be varied, and a feasible processing manner is provided herein, see fig. 3, specifically including the following steps:
step 301, receiving a plurality of tracking coordinate sets sent by a first monitoring device.
Wherein each tracking coordinate set contains at least one tracking coordinate, and the time stamps of the tracking coordinates in the same tracking coordinate set are the same.
In implementation, the first monitoring device may collect tracking coordinates in video images of the first video captured by the first monitoring device at preset time intervals, and for each tracking coordinate, take, as a timestamp of the tracking coordinate, a capturing time of the video image to which the tracking coordinate belongs. The first monitoring device takes the tracking coordinates and the time stamps of the tracking coordinates acquired each time as a tracking coordinate set, and then sends the tracking coordinate set to the server. The server may receive a plurality of sets of tracking coordinates transmitted by the first monitoring device.
Step 302, determining tracking coordinates in a coordinate range according to the coordinate range of the target overlapping area corresponding to the preset first monitoring device for each tracking coordinate set.
In implementation, a pixel coordinate range in which the target overlapping area is mapped to the video image shot by the first monitoring device is preset in the server, and for each tracking coordinate set, the server can determine tracking coordinates in the tracking coordinate set within the pixel coordinate range.
And step 303, if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a time stamp of the first tracking coordinates.
In an implementation, for each tracking coordinate set, if the number of determined tracking coordinates from the tracking coordinate set is 1, the server regards the determined tracking coordinates as first tracking coordinates and acquires a timestamp of the first tracking coordinates. Or if the number of the determined tracking coordinates is 1, the server may further determine whether the determined tracking coordinates are close to the edge of the video image shot by the first monitoring device, and if the determined tracking coordinates are close to the edge of the video image shot by the first monitoring device, the server may use the determined tracking coordinates as the first tracking coordinates.
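Steps 301 to 303 amount to filtering each timestamped coordinate set against the pixel range of the overlap region and keeping only the timestamps with exactly one hit. The sketch below illustrates this under the simplifying assumption that the target overlapping area maps to an axis-aligned pixel rectangle; the data layout and names are hypothetical.

```python
def select_first_tracking_coords(coordinate_sets, overlap_range):
    """coordinate_sets: dict mapping timestamp -> list of (u, v) tracking coordinates
    reported by the first monitoring device for that timestamp.
    overlap_range: (u_min, u_max, v_min, v_max), the pixel range that the target
    overlapping area maps to in the first device's video image.
    Returns a dict of timestamp -> (u, v) for timestamps at which exactly one
    tracking coordinate falls inside the overlap range."""
    u_min, u_max, v_min, v_max = overlap_range
    selected = {}
    for ts, coords in coordinate_sets.items():
        in_range = [(u, v) for u, v in coords
                    if u_min <= u <= u_max and v_min <= v <= v_max]
        if len(in_range) == 1:  # only one tracking object in the target overlapping area
            selected[ts] = in_range[0]
    return selected
```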
Optionally, the present application further provides another processing manner for obtaining the first tracking coordinates and the time stamps of the first tracking coordinates, referring to fig. 4, specifically including the following steps:
step 401, acquiring a first video shot by a first monitoring device.
In implementation, the first monitoring device shoots a first video of a monitoring area corresponding to the first monitoring device, and sends the first video to the server. The server can acquire a first video shot by the first monitoring device.
Step 402, collecting tracking coordinates in video images of the first video at preset time intervals, and taking shooting time of the video images to which the tracking coordinates belong as a time stamp of the tracking coordinates.
In implementation, the server may collect tracking coordinates in the video image of the first video at preset time intervals, and for each tracking coordinate, take, as a timestamp of the tracking coordinate, a capturing time of the video image to which the tracking coordinate belongs.
Step 403, determining tracking coordinates in a coordinate range according to the coordinate range of the target overlapping region corresponding to the preset first monitoring device for each group of tracking coordinates with the same time stamp.
In implementation, the coordinate range to which the target overlapping area maps in the video image captured by the first monitoring device is preset in the server. The server can divide the collected tracking coordinates into groups with identical timestamps, and then, for each group of collected tracking coordinates with the same timestamp, determine the tracking coordinates within that coordinate range.
And step 404, if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a time stamp of the first tracking coordinates.
In an implementation, for each set of tracking coordinates that are acquired with the same time stamp, if the number of determined tracking coordinates from the set of tracking coordinates is 1, the server may take the determined tracking coordinates as the first tracking coordinates and acquire the time stamp of the first tracking coordinates. Alternatively, if the number of the determined tracking coordinates is 1, the server may further determine whether the tracking coordinates are close to the edge of the video image captured by the first monitoring device, and if the tracking coordinates are close to the edge of the video image captured by the first monitoring device, the server may regard the tracking coordinates as the first tracking coordinates.
Optionally, as shown in fig. 5, the embodiment of the present application further provides a processing procedure of the monitoring device for determining that the image detection is deviated, which specifically may include the following steps:
step 501, for each monitoring device, acquiring the plane coordinates of the first tracking object in the video image of the first overlapping area shot by the monitoring device, and recording the time stamp of the plane coordinates of the first tracking object.
In implementation, for each monitoring device, the server may determine a first tracking object that appears in the video image of the first overlapping area captured by the monitoring device within a preset period of time, and acquire the pixel coordinates of the first tracking object in the video image of the first overlapping area captured by the monitoring device, and the timestamp of the pixel coordinates. The server may then determine the plane coordinates of the first tracking object according to the pixel coordinates of the first tracking object and the coordinate transformation matrix corresponding to the monitoring device, and record the time stamp of the plane coordinates of the first tracking object as the time stamp of the pixel coordinates of the first tracking object.
As shown in fig. 1, the monitoring area of a monitoring device may contain two overlapping areas, so the first overlapping area may itself consist of two overlapping areas, denoted the 1A overlapping area and the 1B overlapping area. The server may determine a first tracking object that appears in the video image of the 1A overlapping area captured by the monitoring device within a preset period of time, and then acquire the pixel coordinates of that first tracking object in the video image of the 1A overlapping area and the timestamps of those pixel coordinates. Likewise, the server may determine a first tracking object that appears in the video image of the 1B overlapping area captured by the monitoring device within the preset period of time, and then acquire the pixel coordinates of that first tracking object in the video image of the 1B overlapping area and the timestamps of those pixel coordinates.
Step 502, obtaining the plane coordinates of a second tracking object with the same time stamp as the plane coordinates of the first tracking object in the video image of the first overlapping area shot by the monitoring device adjacent to the monitoring device.
Here, the monitoring area captured by the monitoring device adjacent to this monitoring device covers the first overlapping area. The adjacent monitoring device is referred to below as the third monitoring device.
In an implementation, the server may obtain pixel coordinates of the tracking object in the video image of the first overlapping region captured by the third monitoring device, and a time stamp of each pixel coordinate. The server may determine, according to the pixel coordinates of the tracked object in the video image of the first overlapping area captured by the third monitoring device and the coordinate conversion matrix corresponding to the third monitoring device, the plane coordinates of the tracked object in the video image of the first overlapping area captured by the third monitoring device, and record the timestamp of the plane coordinates of the tracked object in the video image of the first overlapping area captured by the third monitoring device as the timestamp of the pixel coordinates of the tracked object in the video image of the first overlapping area captured by the third monitoring device. Then, for each first tracked object, the server may determine, from the plane coordinates of the tracked object in the video image of the first overlapping area captured by the third monitoring device, the plane coordinates of the second tracked object identical to the time stamp of the plane coordinates of the first tracked object.
When the first overlapping area includes the 1A overlapping area and the 1B overlapping area, the third monitoring device adjacent to the monitoring device may include a 3A monitoring device capturing the 1A overlapping area and a 3B monitoring device capturing the 1B overlapping area, and the server may acquire pixel coordinates of the tracking object in the video image of the 1A overlapping area captured by the 3A monitoring device, and a time stamp of each pixel coordinate. The server can determine the plane coordinates of the tracking object in the video image of the 1A overlapping area shot by the 3A monitoring device according to the pixel coordinates of the tracking object in the video image of the 1A overlapping area shot by the 3A monitoring device and the coordinate conversion matrix corresponding to the 3A monitoring device, and record the time stamp of the plane coordinates of the tracking object in the video image of the 1A overlapping area shot by the 3A monitoring device as the time stamp of the pixel coordinates of the tracking object in the video image of the 1A overlapping area shot by the 3A monitoring device. Then, for each first tracked object in the video image of the 1A overlapping area captured by the monitoring device, the server may determine, from the plane coordinates of the tracked object in the video image of the 1A overlapping area captured by the 3A monitoring device, the plane coordinates of a second tracked object identical to the time stamp of the plane coordinates of the first tracked object. Similarly, for each first tracked object in the video image of the 1B overlapping area captured by the monitoring device, the server may determine, from the plane coordinates of the tracked object in the video image of the 1B overlapping area captured by the 3B monitoring device, the plane coordinates of a second tracked object identical to the time stamp of the plane coordinates of the first tracked object.
In step 503, a distance between the first tracking object and the second tracking object is determined according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object.
In an implementation, for each first tracked object, the server may determine a distance between the planar coordinates of the first tracked object and the planar coordinates of the second tracked object determined from the planar coordinates of the first tracked object (i.e., a distance between the first tracked object and the second tracked object), and if there are a plurality of planar coordinates of the second tracked object determined from the planar coordinates of the first tracked object, the server may determine a distance between the planar coordinates of the first tracked object and the planar coordinates of the respective second tracked object. For example, assuming that the planar coordinates of the first tracked object are (5, 6), and the planar coordinates of the second tracked object determined from the planar coordinates (5, 6) of the first tracked object are (8, 10) and (11, 14), respectively, the distances of the first tracked object and the respective second tracked objects are 5 and 10, respectively.
When the first overlapping region includes a 1A overlapping region and a 1B overlapping region, the server may determine, for each first tracking object in the video image of the 1A overlapping region captured by the monitoring device, a distance between a plane coordinate of the first tracking object and a plane coordinate of a second tracking object determined from the plane coordinate of the first tracking object. Similarly, for each first tracked object in the video image of the 1B overlapping region captured by the monitoring device, the server may determine a distance between the planar coordinates of the first tracked object and the planar coordinates of the second tracked object determined from the planar coordinates of the first tracked object.
Step 504, if there is a second tracking object whose distance from the first tracking object is smaller than a preset first threshold, determining that the first tracking object is successfully matched; and if the second tracking object with the distance smaller than the preset first threshold value does not exist, judging that the first tracking object fails to match.
In implementation, for each first tracking object, if among the plane coordinates of the second tracking objects determined for that first tracking object there is a plane coordinate whose distance from the plane coordinate of the first tracking object is smaller than the preset first threshold, the server judges that the first tracking object is successfully matched (i.e., the first tracking object and that second tracking object are the same tracking object); if there is no such plane coordinate, the server judges that the matching of the first tracking object fails.
When the first overlapping area comprises the 1A overlapping area and the 1B overlapping area, then for each first tracking object in the video image of the 1A overlapping area captured by the monitoring device, if among the plane coordinates of the second tracking objects determined for that first tracking object there is one whose distance from the plane coordinate of the first tracking object is smaller than the preset first threshold, the server judges that the first tracking object is successfully matched; otherwise the server judges that the matching of the first tracking object fails. Similarly, for each first tracking object in the video image of the 1B overlapping area captured by the monitoring device, the server may judge whether the first tracking object is successfully matched.
Step 505, determining a matching success rate corresponding to the monitoring device according to a ratio between the number of successfully matched first tracking objects and the total number of the first tracking objects in the video image of the first overlapping area.
In implementation, the server may acquire the number N1 of the first tracking objects in the video image of the first overlapping region in the preset time period, then determine the number M1 of the successfully matched first tracking objects in the acquired first tracking objects, and use M1/N1 as the matching success rate corresponding to the monitoring device. Or acquiring the number M2 of successfully matched first tracking objects in the preset number (N2) of first tracking objects in the video image of the first overlapping region, and taking M2/N2 as the corresponding matching success rate of the monitoring equipment.
When the first overlapping area includes the 1A overlapping area and the 1B overlapping area, the server can determine the total number $N_A$ of first tracking objects and the number $M_A$ of successfully matched first tracking objects in the video image of the 1A overlapping area captured by the monitoring device, and likewise the total number $N_B$ of first tracking objects and the number $M_B$ of successfully matched first tracking objects in the video image of the 1B overlapping area. The server may then take $(M_A + M_B)/(N_A + N_B)$ as the matching success rate of the monitoring device.
Step 506, determining a first monitoring device with a matching success rate smaller than a preset second threshold.
In implementation, the server may determine, from among the monitoring devices, a first monitoring device whose matching success rate is less than a preset second threshold. If there are multiple monitoring devices whose matching success rates are smaller than the preset second threshold, the server takes the monitoring device with the smallest matching success rate as the first monitoring device.
In this way, the first monitoring device with the video image deviation can be determined from the respective monitoring devices.
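A minimal sketch of this selection step, assuming the matching success rates have already been computed per device (the dictionary layout and names are hypothetical):

    def select_first_monitoring_device(rates_by_device, second_threshold):
        # rates_by_device: mapping from device id to matching success rate
        candidates = {dev: rate for dev, rate in rates_by_device.items()
                      if rate < second_threshold}
        if not candidates:
            return None  # no device shows an image-detection deviation
        return min(candidates, key=candidates.get)  # lowest rate wins if several qualify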
Based on the same technical concept, as shown in fig. 6, the embodiment of the present invention further provides a tracking device, where the device is applied to a server in a multi-video monitoring system, and the multi-video monitoring system further includes a plurality of monitoring devices, where monitoring areas shot by adjacent monitoring devices have overlapping areas, and the device includes:
a first determining module 601, configured to determine a first monitoring device with deviation in image detection, where a tracking object in a video image captured by the first monitoring device and a tracking object in a video image captured by a monitoring device adjacent to the first monitoring device do not meet a preset matching condition;
A first obtaining module 602, configured to obtain a plurality of first tracking coordinates in a video image of a target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, where the first tracking coordinates are pixel coordinates of a tracking object in the video image captured by the first monitoring device when only one tracking object is in the target overlapping area of the first monitoring device and the second monitoring device;
a second obtaining module 603, configured to obtain, for each first tracking coordinate, a second tracking coordinate in the video image of the target overlapping area, which is captured by the second monitoring device and is the same as a timestamp of the first tracking coordinate, and determine, according to the second tracking coordinate and a coordinate transformation matrix of the second monitoring device, a plane coordinate corresponding to the second tracking coordinate;
a second determining module 604, configured to take the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
a third determining module 605, configured to determine a target coordinate transformation matrix corresponding to the first monitoring device according to the plurality of first tracking coordinates and the plane coordinates corresponding to each first tracking coordinate;
And a fourth determining module 606, configured to determine, when the tracking coordinates of the target tracking object in the video image captured by the first monitoring device are obtained, the plane coordinates corresponding to the target tracking object according to the tracking coordinates of the target tracking object and the target coordinate transformation matrix.
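For illustration only, the sketch below shows one possible realization of the calibration performed by modules 603-606. The patent does not prescribe a fitting algorithm, so modelling the coordinate conversion matrix as a 3x3 perspective (homography) matrix estimated with OpenCV is an assumption, as are all identifiers; at least four (pixel, plane) coordinate pairs are assumed to be available.

    import numpy as np
    import cv2

    def fit_target_matrix(first_tracking_coords, plane_coords):
        # first_tracking_coords: pixel coordinates from the first monitoring device
        # plane_coords: the corresponding plane coordinates taken from the second device
        src = np.asarray(first_tracking_coords, dtype=np.float32)
        dst = np.asarray(plane_coords, dtype=np.float32)
        matrix, _mask = cv2.findHomography(src, dst, method=cv2.RANSAC)
        return matrix  # target coordinate conversion matrix

    def to_plane_coordinate(matrix, tracking_coord):
        # Convert one pixel coordinate of the target tracking object to a plane coordinate.
        x, y = tracking_coord
        u, v, w = matrix @ np.array([x, y, 1.0])
        return u / w, v / w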
Optionally, the first obtaining module 602 includes:
the receiving unit is used for receiving a plurality of tracking coordinate sets sent by the first monitoring equipment, wherein each tracking coordinate set comprises at least one tracking coordinate, and the time stamps of the tracking coordinates in the same tracking coordinate set are the same;
a first determining unit, configured to determine, for each tracking coordinate set, tracking coordinates within a coordinate range of a target overlapping area corresponding to a preset first monitoring device;
and the first acquisition unit is used for taking the determined tracking coordinates as first tracking coordinates and acquiring the time stamp of the first tracking coordinates if the number of the determined tracking coordinates is 1.
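A minimal sketch of this filtering, assuming the coordinate range of the target overlapping area is an axis-aligned rectangle (the patent only speaks of a coordinate range, so the rectangle and all identifiers are assumptions):

    def pick_first_tracking_coord(coord_set, timestamp, overlap_range):
        # coord_set: list of (x, y) pixel coordinates sharing one timestamp
        x_min, y_min, x_max, y_max = overlap_range
        inside = [(x, y) for x, y in coord_set
                  if x_min <= x <= x_max and y_min <= y <= y_max]
        if len(inside) == 1:
            return inside[0], timestamp  # only one tracking object in the overlap area
        return None  # zero or several objects: the set is skipped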
Optionally, the first obtaining module 602 includes:
the second acquisition unit is used for acquiring a first video shot by the first monitoring equipment;
The acquisition unit is used for acquiring tracking coordinates in video images of the first video according to a preset time interval, and taking the shooting time of the video image to which the tracking coordinates belong as a time stamp of the tracking coordinates;
the second determining unit is used for determining tracking coordinates in a coordinate range according to the coordinate range of a target overlapping area corresponding to the preset first monitoring equipment aiming at the acquired tracking coordinates with the same time stamp in each group;
and the third acquisition unit is used for taking the determined tracking coordinates as first tracking coordinates and acquiring the time stamp of the first tracking coordinates if the number of the determined tracking coordinates is 1.
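A minimal sketch of this sampling alternative, assuming timestamps can be derived from the frame index and frame rate; the detector callback detect_tracking_coords is hypothetical:

    import cv2

    def sample_tracking_coords(video_path, interval_seconds, detect_tracking_coords):
        cap = cv2.VideoCapture(video_path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if the frame rate is unavailable
        step = max(int(round(fps * interval_seconds)), 1)
        samples, frame_index = [], 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if frame_index % step == 0:
                timestamp = frame_index / fps  # shooting time of this video image
                samples.append((detect_tracking_coords(frame), timestamp))
            frame_index += 1
        cap.release()
        return samples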
Optionally, the first determining module 601 includes:
a fourth obtaining unit, configured to obtain, for each monitoring device, a plane coordinate of a first tracking object in a video image of a first overlapping area captured by the monitoring device, and record a timestamp of the plane coordinate of the first tracking object;
a fifth obtaining unit, configured to obtain, in a video image of the first overlapping area captured by a monitoring device adjacent to the monitoring device, a plane coordinate of a second tracking object that is the same as a timestamp of a plane coordinate of the first tracking object;
A third determining unit, configured to determine a distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
the matching unit is used for judging that the first tracking object is successfully matched if a second tracking object with the distance to the first tracking object smaller than a preset first threshold exists; if a second tracking object with the distance smaller than the preset first threshold value does not exist, judging that the first tracking object fails to match;
a fourth determining unit, configured to determine a matching success rate corresponding to the monitoring device according to a ratio between the number of successfully matched first tracking objects and the total number of first tracking objects in the video image of the first overlapping area;
and a fifth determining unit, configured to determine a first monitoring device with a matching success rate smaller than a preset second threshold.
The embodiment of the present invention further provides an electronic device, as shown in fig. 7, including a processor 701, a communication interface 702, a memory 703 and a communication bus 704, where the processor 701, the communication interface 702, and the memory 703 communicate with each other through the communication bus 704,
A memory 703 for storing a computer program;
the processor 701 is configured to execute the program stored in the memory 703, and implement the following steps:
determining a first monitoring device with deviation of image detection, wherein a tracking object in a video image shot by the first monitoring device and a tracking object in a video image shot by monitoring devices adjacent to the first monitoring device do not meet a preset matching condition;
acquiring a plurality of first tracking coordinates in a video image of a target overlapping area shot by the first monitoring equipment and time stamps of the first tracking coordinates, wherein the first tracking coordinates are pixel coordinates of a tracking object in the video image shot by the first monitoring equipment when only one tracking object exists in the target overlapping area of the first monitoring equipment and the second monitoring equipment;
for each first tracking coordinate, acquiring a second tracking coordinate which is the same as the timestamp of the first tracking coordinate in the video image of the target overlapping area shot by the second monitoring equipment, and determining a plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and a coordinate conversion matrix of the second monitoring equipment;
Taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
determining a target coordinate conversion matrix corresponding to the first monitoring equipment according to the plurality of first tracking coordinates and the plane coordinates corresponding to each first tracking coordinate;
when the tracking coordinates of the target tracking object in the video image shot by the first monitoring equipment are obtained, determining the plane coordinates corresponding to the target tracking object according to the tracking coordinates of the target tracking object and the target coordinate conversion matrix.
Optionally, the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and the time stamp of each first tracking coordinate includes:
receiving a plurality of tracking coordinate sets sent by the first monitoring equipment, wherein each tracking coordinate set comprises at least one tracking coordinate, and the time stamps of the tracking coordinates in the same tracking coordinate set are the same;
for each tracking coordinate set, determining tracking coordinates in a coordinate range according to the preset coordinate range of a target overlapping area corresponding to the first monitoring equipment;
and if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a time stamp of the first tracking coordinates.
Optionally, the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and the time stamp of each first tracking coordinate includes:
acquiring a first video shot by the first monitoring equipment;
collecting tracking coordinates in video images of a first video according to a preset time interval, and taking shooting time of the video images to which the tracking coordinates belong as a time stamp of the tracking coordinates;
for each group of collected tracking coordinates with the same time stamp, determining the tracking coordinates within the coordinate range according to the preset coordinate range of the target overlapping region corresponding to the first monitoring equipment;
and if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a time stamp of the first tracking coordinates.
Optionally, the determining a first monitoring device with deviation of image detection includes:
for each monitoring device, acquiring plane coordinates of a first tracking object in a video image of a first overlapping area shot by the monitoring device, and recording a time stamp of the plane coordinates of the first tracking object;
Acquiring the plane coordinates of a second tracking object, which are identical to the time stamp of the plane coordinates of the first tracking object, in the video image of the first overlapping area shot by the monitoring equipment adjacent to the monitoring equipment;
determining the distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
if a second tracking object with the distance from the first tracking object being smaller than a preset first threshold exists, judging that the first tracking object is successfully matched; if a second tracking object with the distance smaller than the preset first threshold value does not exist, judging that the first tracking object fails to match;
determining the matching success rate corresponding to the monitoring equipment according to the proportion between the number of successfully matched first tracking objects and the total number of the first tracking objects in the video image of the first overlapping area;
and determining the first monitoring equipment with the matching success rate smaller than a preset second threshold value.
The communication bus mentioned above for the electronic device may be a peripheral component interconnect standard (Peripheral Component Interconnect, PCI) bus, an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include a random access memory (Random Access Memory, RAM), or may include a non-volatile memory (Non-Volatile Memory, NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), and the like; it may also be a digital signal processor (Digital Signal Processing, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field-Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment of the present invention, a computer readable storage medium is provided, in which a computer program is stored, which when executed by a processor, implements the steps of any of the tracking methods described above.
In yet another embodiment of the present invention, there is also provided a computer program product containing instructions that, when run on a computer, cause the computer to perform any of the tracking methods of the above embodiments.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), etc.
It is noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In this specification, the embodiments are described in a related manner; for identical or similar parts between the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the descriptions of the apparatus, electronic device, computer-readable storage medium, and computer program product embodiments are relatively brief, since they are substantially similar to the method embodiment; for relevant details, reference may be made to the description of the method embodiment.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention are included in the protection scope of the present invention.

Claims (10)

1. A tracking method, wherein the method is applied to a server in a multi-video monitoring system, the multi-video monitoring system further comprising a plurality of monitoring devices, wherein monitoring areas photographed by adjacent monitoring devices have overlapping areas, the method comprising:
determining a first monitoring device with deviation of image detection, wherein a tracking object in a video image shot by the first monitoring device and a tracking object in a video image shot by monitoring devices adjacent to the first monitoring device do not meet a preset matching condition;
acquiring a plurality of first tracking coordinates in a video image of a target overlapping area shot by the first monitoring equipment and time stamps of the first tracking coordinates, wherein the first tracking coordinates are pixel coordinates of a tracking object in the video image shot by the first monitoring equipment when only one tracking object exists in the target overlapping area of the first monitoring equipment and the second monitoring equipment;
For each first tracking coordinate, acquiring a second tracking coordinate which is the same as the timestamp of the first tracking coordinate in the video image of the target overlapping area shot by the second monitoring equipment, and determining a plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and a coordinate conversion matrix of the second monitoring equipment;
taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
determining a target coordinate conversion matrix corresponding to the first monitoring equipment according to the plurality of first tracking coordinates and the plane coordinates corresponding to each first tracking coordinate;
when the tracking coordinates of the target tracking object in the video image shot by the first monitoring equipment are obtained, determining the plane coordinates corresponding to the target tracking object according to the tracking coordinates of the target tracking object and the target coordinate conversion matrix.
2. The method according to claim 1, wherein the acquiring the plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and the time stamp of each first tracking coordinate includes:
Receiving a plurality of tracking coordinate sets sent by the first monitoring equipment, wherein each tracking coordinate set comprises at least one tracking coordinate, and the time stamps of the tracking coordinates in the same tracking coordinate set are the same;
for each tracking coordinate set, determining tracking coordinates in a coordinate range according to the preset coordinate range of a target overlapping area corresponding to the first monitoring equipment;
and if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a time stamp of the first tracking coordinates.
3. The method according to claim 1, wherein the acquiring the plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and the time stamp of each first tracking coordinate includes:
acquiring a first video shot by the first monitoring equipment;
collecting tracking coordinates in video images of a first video according to a preset time interval, and taking shooting time of the video images to which the tracking coordinates belong as a time stamp of the tracking coordinates;
for each group of collected tracking coordinates with the same time stamp, determining the tracking coordinates within the coordinate range according to the preset coordinate range of the target overlapping region corresponding to the first monitoring equipment;
And if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a time stamp of the first tracking coordinates.
4. The method according to claim 1, wherein the determining a first monitoring device with deviation of image detection comprises:
for each monitoring device, acquiring plane coordinates of a first tracking object in a video image of a first overlapping area shot by the monitoring device, and recording a time stamp of the plane coordinates of the first tracking object;
acquiring the plane coordinates of a second tracking object, which are identical to the time stamp of the plane coordinates of the first tracking object, in the video image of the first overlapping area shot by the monitoring equipment adjacent to the monitoring equipment;
determining the distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
if a second tracking object with the distance from the first tracking object being smaller than a preset first threshold exists, judging that the first tracking object is successfully matched; if a second tracking object with the distance smaller than the preset first threshold value does not exist, judging that the first tracking object fails to match;
Determining the matching success rate corresponding to the monitoring equipment according to the proportion between the number of successfully matched first tracking objects and the total number of the first tracking objects in the video image of the first overlapping area;
and determining the first monitoring equipment with the matching success rate smaller than a preset second threshold value.
5. A tracking apparatus, the apparatus being applied to a server in a multi-video monitoring system, the multi-video monitoring system further comprising a plurality of monitoring devices, wherein monitoring areas photographed by adjacent monitoring devices have overlapping areas, the apparatus comprising:
the first determining module is used for determining first monitoring equipment with deviation of image detection, wherein a tracking object in a video image shot by the first monitoring equipment and a tracking object in a video image shot by monitoring equipment adjacent to the first monitoring equipment do not meet a preset matching condition;
the first acquisition module is used for acquiring a plurality of first tracking coordinates in a video image of a target overlapping area shot by the first monitoring equipment and time stamps of the first tracking coordinates, wherein the first tracking coordinates are pixel coordinates of a tracking object in the video image shot by the first monitoring equipment when only one tracking object exists in the target overlapping area of the first monitoring equipment and the second monitoring equipment;
The second acquisition module is used for acquiring a second tracking coordinate which is the same as the timestamp of the first tracking coordinate in the video image of the target overlapping area shot by the second monitoring equipment aiming at each first tracking coordinate, and determining a plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and the coordinate conversion matrix of the second monitoring equipment;
the second determining module is used for taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
the third determining module is used for determining a target coordinate transformation matrix corresponding to the first monitoring equipment according to the plurality of first tracking coordinates and the plane coordinates corresponding to each first tracking coordinate;
and the fourth determining module is used for determining the plane coordinates corresponding to the target tracking object according to the tracking coordinates of the target tracking object and the target coordinate conversion matrix when the tracking coordinates of the target tracking object in the video image shot by the first monitoring equipment are acquired.
6. The apparatus of claim 5, wherein the first acquisition module comprises:
the receiving unit is used for receiving a plurality of tracking coordinate sets sent by the first monitoring equipment, wherein each tracking coordinate set comprises at least one tracking coordinate, and the time stamps of the tracking coordinates in the same tracking coordinate set are the same;
A first determining unit, configured to determine, for each tracking coordinate set, tracking coordinates within a coordinate range of a target overlapping area corresponding to a preset first monitoring device;
and the first acquisition unit is used for taking the determined tracking coordinates as first tracking coordinates and acquiring the time stamp of the first tracking coordinates if the number of the determined tracking coordinates is 1.
7. The apparatus of claim 5, wherein the first acquisition module comprises:
the second acquisition unit is used for acquiring a first video shot by the first monitoring equipment;
the acquisition unit is used for acquiring tracking coordinates in video images of the first video according to a preset time interval, and taking the shooting time of the video image to which the tracking coordinates belong as a time stamp of the tracking coordinates;
the second determining unit is used for determining tracking coordinates in a coordinate range according to the coordinate range of a target overlapping area corresponding to the preset first monitoring equipment aiming at the acquired tracking coordinates with the same time stamp in each group;
and the third acquisition unit is used for taking the determined tracking coordinates as first tracking coordinates and acquiring the time stamp of the first tracking coordinates if the number of the determined tracking coordinates is 1.
8. The apparatus of claim 5, wherein the first determining module comprises:
a fourth obtaining unit, configured to obtain, for each monitoring device, a plane coordinate of a first tracking object in a video image of a first overlapping area captured by the monitoring device, and record a timestamp of the plane coordinate of the first tracking object;
a fifth obtaining unit, configured to obtain, in a video image of the first overlapping area captured by a monitoring device adjacent to the monitoring device, a plane coordinate of a second tracking object that is the same as a timestamp of a plane coordinate of the first tracking object;
a third determining unit, configured to determine a distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
the matching unit is used for judging that the first tracking object is successfully matched if a second tracking object with the distance to the first tracking object smaller than a preset first threshold exists; if a second tracking object with the distance smaller than the preset first threshold value does not exist, judging that the first tracking object fails to match;
A fourth determining unit, configured to determine a matching success rate corresponding to the monitoring device according to a ratio between the number of successfully matched first tracking objects and the total number of first tracking objects in the video image of the first overlapping area;
and a fifth determining unit, configured to determine a first monitoring device with a matching success rate smaller than a preset second threshold.
9. An electronic device comprising a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor, wherein the processor is caused by the machine-executable instructions to implement the method steps of any one of claims 1-4.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored therein a computer program which, when executed by a processor, implements the method steps of any of claims 1-4.
CN201910510134.0A 2019-06-13 2019-06-13 Tracking method and device Active CN111369587B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910510134.0A CN111369587B (en) 2019-06-13 2019-06-13 Tracking method and device

Publications (2)

Publication Number Publication Date
CN111369587A CN111369587A (en) 2020-07-03
CN111369587B true CN111369587B (en) 2023-05-02

Family

ID=71209989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910510134.0A Active CN111369587B (en) 2019-06-13 2019-06-13 Tracking method and device

Country Status (1)

Country Link
CN (1) CN111369587B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004072628A (en) * 2002-08-08 2004-03-04 Univ Waseda Moving body tracking system using a plurality of cameras and its method
CN106373143A (en) * 2015-07-22 2017-02-01 中兴通讯股份有限公司 Adaptive method and system
WO2017096761A1 (en) * 2015-12-10 2017-06-15 杭州海康威视数字技术股份有限公司 Method, device and system for looking for target object on basis of surveillance cameras
WO2019019943A1 (en) * 2017-07-28 2019-01-31 阿里巴巴集团控股有限公司 Method for tracing track of target in cross regions, and data processing method, apparatus and system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Consistent Labeling of Tracked Objects in Multiple Cameras with Overlapping Fields of View; Sohaib Khan, et al.; IEEE Transactions on Pattern Analysis and Machine Intelligence; Vol. 25 (No. 15); pp. 1355-1360 *
Constructing multi-camera panoramic views for target tracking across overlapping regions; Deng Yingna, et al.; Journal of Xi'an University of Technology; Vol. 25 (No. 2); pp. 189-192 *
Wang Biao. Research and implementation of multi-camera target tracking algorithms. Information Science and Technology. 2019, (No. 2), full text. *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant