CN111369587A - Tracking method and device - Google Patents

Tracking method and device

Publication number
CN111369587A
Authority
CN
China
Prior art keywords
tracking
coordinate
coordinates
monitoring equipment
video image
Prior art date
Legal status
Granted
Application number
CN201910510134.0A
Other languages
Chinese (zh)
Other versions
CN111369587B (en)
Inventor
王科
沈涛
裴建军
于建志
张�浩
刘义
陈延鸿
Current Assignee
Shenzhen city public security bureau traffic police bureau
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Shenzhen city public security bureau traffic police bureau
Hangzhou Hikvision System Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen city public security bureau traffic police bureau and Hangzhou Hikvision System Technology Co Ltd
Priority to CN201910510134.0A
Publication of CN111369587A
Application granted
Publication of CN111369587B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a tracking method and a device. The method comprises the following steps: determining a first monitoring device with an image detection deviation; acquiring first tracking coordinates in a video image of a target overlapping area shot by the first monitoring device and, for each of them, a second tracking coordinate with the same timestamp in a video image of the target overlapping area shot by a second monitoring device; and taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate. A target coordinate transformation matrix of the first monitoring device is then determined from the first tracking coordinates and their corresponding plane coordinates. When the tracking coordinate of a target tracking object shot by the first monitoring device is acquired, the plane coordinate of the target tracking object is determined according to the target coordinate transformation matrix. Based on this method, the first monitoring device whose video image has deviated can be determined, its coordinate transformation matrix can be calibrated, and the accuracy of the track information of the tracked object is improved.

Description

Tracking method and device
Technical Field
The invention relates to the technical field of video monitoring, in particular to a tracking method and a tracking device.
Background
In order to better solve the urban traffic problem, traffic managers need to perform video monitoring and tracking on vehicles. Due to the long road section and the narrow monitoring view of a single monitoring device, it is difficult for the single monitoring device to monitor the whole road. In the prior art, a plurality of monitoring devices are adopted for monitoring, the monitoring devices are arranged beside a road, each monitoring device shoots different road sections, and the road sections shot by adjacent monitoring devices have overlapping areas, so that the plurality of monitoring devices can monitor the whole road.
The existing tracking method for multi-video monitoring is as follows: the server acquires the videos shot by all the monitoring devices and then acquires the pixel coordinates of a first tracking object in each of those videos. The server is preset with a coordinate conversion matrix for each monitoring device, and can convert the pixel coordinates of the first tracking object in the video shot by a monitoring device into plane coordinates according to the coordinate conversion matrix corresponding to that monitoring device. The plane coordinates may be coordinates in a Gaussian plane coordinate system. The server stores the plane coordinates of the first tracking object in the videos shot by the monitoring devices in an associated manner to obtain the track information of the first tracking object. Because the track information of the first tracked object is composed of plane coordinates in the same coordinate system, it can reflect the positional relation of the first tracked object across videos shot by different monitoring devices.
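The conversion described above can be sketched as follows, assuming (as is common for a fixed roadside camera viewing a road plane, though the patent does not name the matrix form) that the coordinate conversion matrix is a 3x3 homography applied to homogeneous pixel coordinates; the function name and layout are illustrative only.

```python
import numpy as np

def pixel_to_plane(pixel_uv, H):
    """Convert one pixel coordinate (u, v) to a plane coordinate using a
    3x3 conversion matrix H in homogeneous form: (x, y, w) = H @ (u, v, 1),
    then divide by w. H is assumed to be pre-calibrated per device."""
    u, v = pixel_uv
    x, y, w = H @ np.array([u, v, 1.0])
    return (x / w, y / w)

# Trivial check: the identity matrix maps a pixel onto itself.
H = np.eye(3)
print(pixel_to_plane((320.0, 240.0), H))  # -> (320.0, 240.0)
```

The same function covers any per-device matrix: the server would simply look up the matrix preset for the device that shot the frame.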
However, when a certain monitoring device shifts due to an external force, the pixel coordinates of the first tracked object in the video shot by that device shift as well. The coordinate conversion matrix corresponding to the device then no longer matches the device, so the plane coordinates converted from those pixel coordinates according to that matrix become inaccurate, and the accuracy of the track information of the tracked object on the road is low.
Disclosure of Invention
The embodiment of the invention aims to provide a tracking method and a tracking device, which can determine first monitoring equipment with a video image deviation, calibrate a coordinate transformation matrix of the first monitoring equipment and improve the accuracy of track information of a tracked object. The specific technical scheme is as follows:
in a first aspect, a tracking method is provided, where the method is applied to a server in a multi-video monitoring system, where the multi-video monitoring system further includes a plurality of monitoring devices, and monitoring areas shot by adjacent monitoring devices have overlapping areas, and the method includes:
determining first monitoring equipment with image detection deviation, wherein a tracking object in a video image shot by the first monitoring equipment and a tracking object in a video image shot by monitoring equipment adjacent to the first monitoring equipment do not meet a preset matching condition;
acquiring a plurality of first tracking coordinates in a video image of a target overlapping area shot by the first monitoring equipment and a timestamp of each first tracking coordinate, wherein the first tracking coordinates are pixel coordinates of a tracking object in the video image shot by the first monitoring equipment when only one tracking object exists in the target overlapping area of the first monitoring equipment and the second monitoring equipment;
acquiring a second tracking coordinate with the same timestamp as the first tracking coordinate in the video image of the target overlapping area shot by the second monitoring equipment aiming at each first tracking coordinate, and determining a plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and a coordinate conversion matrix of the second monitoring equipment;
taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
determining a target coordinate transformation matrix corresponding to the first monitoring equipment according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate;
when the tracking coordinate of the target tracking object in the video image shot by the first monitoring device is obtained, determining the plane coordinate corresponding to the target tracking object according to the tracking coordinate of the target tracking object and the target coordinate conversion matrix.
Optionally, the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and a timestamp of each first tracking coordinate includes:
receiving a plurality of tracking coordinate sets sent by the first monitoring equipment, wherein each tracking coordinate set comprises at least one tracking coordinate, and timestamps of the tracking coordinates in the same tracking coordinate set are the same;
for each tracking coordinate group, determining a tracking coordinate in a coordinate range according to the preset coordinate range of a target overlapping area corresponding to the first monitoring equipment;
and if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a timestamp of the first tracking coordinates.
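The group-filtering steps above might look like the following sketch, where the preset coordinate range of the target overlapping area is assumed to be an axis-aligned rectangle in pixel space (the patent does not fix its representation):

```python
def first_coords_from_groups(groups, x_range, y_range):
    """For each tracking coordinate group (all coordinates sharing one
    timestamp), keep the coordinates that fall inside the preset overlap
    region, and accept the group only when exactly one remains, i.e. only
    one tracking object is in the target overlapping area. Returns a list
    of (coordinate, timestamp) pairs."""
    results = []
    for timestamp, coords in groups:
        in_range = [(u, v) for (u, v) in coords
                    if x_range[0] <= u <= x_range[1]
                    and y_range[0] <= v <= y_range[1]]
        if len(in_range) == 1:
            results.append((in_range[0], timestamp))
    return results

groups = [
    (1.0, [(100, 200), (700, 50)]),   # only one coordinate inside the overlap
    (2.0, [(110, 210), (120, 220)]),  # two inside -> group discarded
]
print(first_coords_from_groups(groups, (0, 400), (100, 400)))
# -> [((100, 200), 1.0)]
```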
Optionally, the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and a timestamp of each first tracking coordinate includes:
acquiring a first video shot by the first monitoring equipment;
acquiring tracking coordinates in a video image of a first video according to a preset time interval, and taking the shooting time of the video image to which the tracking coordinates belong as a timestamp of the tracking coordinates;
aiming at the tracking coordinates with the same timestamp in each group, determining the tracking coordinates in the coordinate range according to the preset coordinate range of the target overlapping area corresponding to the first monitoring equipment;
and if the number of the determined tracking coordinates is 1, taking the determined tracking coordinates as first tracking coordinates, and acquiring a timestamp of the first tracking coordinates.
Optionally, the first monitoring device for determining the occurrence of the image detection deviation includes:
for each monitoring device, acquiring a plane coordinate of a first tracking object in a video image of a first overlapping area shot by the monitoring device, and recording a time stamp of the plane coordinate of the first tracking object;
acquiring a plane coordinate of a second tracking object with the same timestamp as the plane coordinate of the first tracking object in the video image of the first overlapping area shot by the monitoring equipment adjacent to the monitoring equipment;
determining the distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
if a second tracking object with the distance from the first tracking object smaller than a preset first threshold exists, judging that the first tracking object is successfully matched; if a second tracking object with the distance from the first tracking object smaller than the preset first threshold value does not exist, judging that the first tracking object fails to be matched;
determining the matching success rate corresponding to the monitoring equipment according to the proportion between the number of the successfully matched first tracking objects and the total number of the first tracking objects in the video image of the first overlapping area;
and determining the first monitoring equipment with the matching success rate smaller than a preset second threshold value.
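As a hedged illustration of this matching check, the sketch below flags devices whose matching success rate falls below the second threshold; the data layout (plane coordinates of each first tracking object paired with its same-timestamp candidates from the adjacent device) is an assumption:

```python
import math

def deviated_devices(devices, first_threshold, second_threshold):
    """`devices` maps a device id to entries (first_plane_coord,
    candidate_second_plane_coords), where the candidates share the first
    coordinate's timestamp and come from the adjacent device's video of
    the first overlapping area. A first tracking object matches when some
    candidate lies within the first threshold distance; a device whose
    matching success rate is below the second threshold is flagged."""
    flagged = []
    for device_id, entries in devices.items():
        matched = sum(
            1 for first_xy, candidates in entries
            if any(math.dist(first_xy, c) < first_threshold for c in candidates)
        )
        if entries and matched / len(entries) < second_threshold:
            flagged.append(device_id)
    return flagged

devices = {
    "cam1": [((0.0, 0.0), [(0.5, 0.0)]), ((10.0, 10.0), [(30.0, 30.0)])],
    "cam2": [((0.0, 0.0), [(0.1, 0.1)])],
}
print(deviated_devices(devices, first_threshold=1.0, second_threshold=0.8))
# -> ['cam1']
```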
In a second aspect, a tracking apparatus is provided, where the apparatus is applied to a server in a multi-video surveillance system, the multi-video surveillance system further includes a plurality of surveillance devices, and surveillance areas shot by adjacent surveillance devices have overlapping areas, the apparatus includes:
the first determining module is used for determining first monitoring equipment with deviation in image detection, and a tracking object in a video image shot by the first monitoring equipment and a tracking object in a video image shot by monitoring equipment adjacent to the first monitoring equipment do not meet a preset matching condition;
the first acquisition module is used for acquiring a plurality of first tracking coordinates and timestamps of the first tracking coordinates in a video image of a target overlapping area shot by the first monitoring equipment, wherein the first tracking coordinates are pixel coordinates of a tracking object in the video image shot by the first monitoring equipment when only one tracking object exists in the target overlapping area of the first monitoring equipment and the second monitoring equipment;
the second acquisition module is used for acquiring a second tracking coordinate with the same timestamp as the first tracking coordinate in the video image of the target overlapping area shot by the second monitoring equipment according to each first tracking coordinate, and determining a plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and a coordinate conversion matrix of the second monitoring equipment;
the second determining module is used for taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
the third determining module is used for determining a target coordinate transformation matrix corresponding to the first monitoring equipment according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate;
and the fourth determining module is used for determining the plane coordinate corresponding to the target tracking object according to the tracking coordinate of the target tracking object and the target coordinate conversion matrix when the tracking coordinate of the target tracking object in the video image shot by the first monitoring device is obtained.
Optionally, the first obtaining module includes:
the receiving unit is used for receiving a plurality of tracking coordinate groups sent by the first monitoring equipment, wherein each tracking coordinate group comprises at least one tracking coordinate, and timestamps of the tracking coordinates in the same tracking coordinate group are the same;
the first determining unit is used for determining tracking coordinates in a coordinate range according to the preset coordinate range of a target overlapping area corresponding to the first monitoring equipment for each tracking coordinate group;
a first acquisition unit configured to, if the number of the determined tracking coordinates is 1, take the determined tracking coordinates as first tracking coordinates, and acquire a time stamp of the first tracking coordinates.
Optionally, the first obtaining module includes:
the second acquisition unit is used for acquiring a first video shot by the first monitoring equipment;
the acquisition unit is used for acquiring tracking coordinates in a video image of a first video according to a preset time interval, and taking the shooting time of the video image to which the tracking coordinates belong as a timestamp of the tracking coordinates;
the second determining unit is used for determining the tracking coordinate in the coordinate range according to the preset coordinate range of the target overlapping area corresponding to the first monitoring device aiming at the tracking coordinate with the same acquired timestamp in each group;
and a third acquiring unit configured to, if the number of the determined tracking coordinates is 1, take the determined tracking coordinates as first tracking coordinates, and acquire a timestamp of the first tracking coordinates.
Optionally, the first determining module includes:
the fourth acquisition unit is used for acquiring the plane coordinates of the first tracking object in the video image of the first overlapping area shot by each monitoring device and recording the time stamp of the plane coordinates of the first tracking object;
a fifth acquiring unit, configured to acquire a plane coordinate of a second tracking object having the same time stamp as a plane coordinate of the first tracking object in the video image of the first overlap area captured by the monitoring device adjacent to the monitoring device;
a third determining unit configured to determine a distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
the matching unit is used for judging that the first tracking object is successfully matched if a second tracking object with the distance from the first tracking object smaller than a preset first threshold exists; if a second tracking object with the distance from the first tracking object smaller than the preset first threshold value does not exist, judging that the first tracking object fails to be matched;
the fourth determining unit is used for determining the matching success rate corresponding to the monitoring equipment according to the proportion between the number of the successfully matched first tracking objects and the total number of the first tracking objects in the video image of the first overlapping area;
and the fifth determining unit is used for determining the first monitoring equipment with the matching success rate smaller than a preset second threshold value.
In a third aspect, there is provided an electronic device comprising a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor, wherein the machine-executable instructions cause the processor to implement the method steps of the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, carries out the method steps of the first aspect.
The tracking method and the tracking device provided by the embodiment of the invention can determine the first monitoring equipment with deviation in image detection, then obtain a plurality of first tracking coordinates in the video image of the target overlapping area shot by the first monitoring equipment and the timestamp of each first tracking coordinate, wherein the first tracking coordinates are the pixel coordinates of the tracking object in the video image shot by the first monitoring equipment when only one tracking object exists in the target overlapping area of the first monitoring equipment and the second monitoring equipment. And then, for each first tracking coordinate, acquiring a second tracking coordinate with the same timestamp as the first tracking coordinate in the video image of the target overlapping area shot by the second monitoring equipment, and determining a plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and a preset coordinate conversion matrix of the second monitoring equipment. And determining a target coordinate transformation matrix corresponding to the first monitoring equipment according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate. When the tracking coordinate of the target tracking object in the video image shot by the first monitoring device is obtained, the plane coordinate corresponding to the target tracking object is determined according to the tracking coordinate of the target tracking object and the target coordinate conversion matrix. Therefore, the first monitoring equipment with the deviation of the video image can be determined, the coordinate transformation matrix of the first monitoring equipment is calibrated, and the accuracy of the track information of the tracked object is improved.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of a multi-video monitoring system according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a tracking method provided in an embodiment of the present application;
FIG. 3 is a flowchart of a method for obtaining first tracking coordinates according to an embodiment of the present disclosure;
FIG. 4 is a flowchart of a method for obtaining first tracking coordinates according to an embodiment of the present disclosure;
FIG. 5 is a flowchart of a method for determining a deviation of an image detection of a first monitoring device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a tracking device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only a part, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The embodiment of the application provides a tracking method, which is applied to a server in a video monitoring system, wherein the video monitoring system further comprises a plurality of monitoring devices. The monitoring equipment is installed beside the road, and each monitoring equipment can shoot different road sections to obtain different video images. Wherein the road sections taken by adjacent monitoring devices have an overlapping area, so that a plurality of monitoring devices can monitor the entire road. As shown in fig. 1, a monitoring device 1 takes a video image of a monitoring area 1, a monitoring device 2 takes a video image of a monitoring area 2, a monitoring device 3 takes a video image of a monitoring area 3, a monitoring device 4 takes a video image of a monitoring area 4, the monitoring area 1 and the monitoring area 2 have an overlapping area, the monitoring area 2 and the monitoring area 3 have an overlapping area, and the monitoring area 3 and the monitoring area 4 have an overlapping area. It should be noted that fig. 1 only illustrates that there is an overlapping area between the monitoring areas of two adjacent monitoring devices, and the number, the installation position, and the installation angle of the monitoring devices are not specifically limited.
The server can be installed in a monitoring center and is connected with each monitoring device through a network. In the embodiment of the application, the server can acquire the pixel coordinates of the tracking object in the video images shot by each monitoring device and, according to those pixel coordinates and the preset coordinate conversion matrix corresponding to each monitoring device, obtain the plane coordinates of the tracking object in those video images, so as to obtain the track information of the tracking object on the road. When a monitoring device with an image detection deviation exists, the server can determine this first monitoring device and calibrate the coordinate conversion matrix corresponding to it, thereby improving the accuracy of the track information of the tracked object. The tracking object may be, for example, a vehicle.
As shown in fig. 2, the method comprises the steps of:
step 201, determining a first monitoring device with image detection deviation.
The tracking object in the video image shot by the first monitoring equipment and the tracking object in the video image shot by the monitoring equipment adjacent to the first monitoring equipment do not meet the preset matching condition.
In practical application, when one of the monitoring devices is reinstalled or moved by an external force, the image detection of that device deviates; that is, the pixel coordinates of a tracking object in the video images it shoots deviate from the previous pixel coordinates. The coordinate conversion matrix corresponding to the device then no longer matches the device, so the plane coordinates converted from those pixel coordinates according to that matrix are no longer accurate.
In implementation, for each monitoring device, the server may determine whether the tracking objects in the video images captured by the monitoring device and the tracking objects in the video images captured by its adjacent monitoring device satisfy a preset matching condition; if not, the server takes the monitoring device as a first monitoring device with an image detection deviation. For example, the server may acquire the pixel coordinates of tracking objects in the video images captured by the monitoring device, regard a tracking object appearing in the video image of the first overlapping area captured by the monitoring device as a first tracking object, and acquire the plane coordinates of the first tracking object in that video image together with the timestamps of those plane coordinates. The plane coordinates of the first tracking object are obtained from its pixel coordinates and the coordinate conversion matrix corresponding to the monitoring device. Similarly, the server can obtain the plane coordinates, and their timestamps, of a second tracking object in the video image of the first overlapping area captured by the adjacent monitoring device. The server can then determine the matching success rate corresponding to the monitoring device from the plane coordinates and timestamps of the first and second tracking objects, and take any monitoring device whose matching success rate is smaller than a preset threshold as a first monitoring device with an image detection deviation. The specific process by which the server determines the first monitoring device with the image detection deviation will be described in detail later.
It should be noted that the first monitoring device with the image detection deviation may also be designated manually by a technician.
Step 202, a plurality of first tracking coordinates in a video image of a target overlapping area shot by a first monitoring device and a timestamp of each first tracking coordinate are obtained.
A first tracking coordinate is the pixel coordinate of the tracking object in the video image of the target overlapping area shot by the first monitoring device when only one tracking object exists in the target overlapping area. The plurality of first tracking coordinates are not on the same straight line (i.e., they do not satisfy a linear relationship), and a first tracking coordinate may be the pixel coordinate of a tracking object near an edge of a video image captured by the first monitoring device.
In an implementation, the server may obtain a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device and a timestamp of each first tracking coordinate. For example, the server may acquire a first video shot by the first monitoring device and determine the first tracking coordinates and their timestamps from the video images of the first video. Alternatively, after the first monitoring device captures a video image of the first video, it may identify the tracking coordinates in that video image, take the capturing time of the video image as the timestamp of those tracking coordinates, and send the tracking coordinates and their timestamps to the server; the server may then determine the first tracking coordinates and their timestamps from the tracking coordinates and timestamps sent by the first monitoring device. The specific process by which the server determines the first tracking coordinates and their timestamps will be described in detail later.
It should be noted that, as shown in fig. 1, the monitoring area shot by the first monitoring device may include two overlapping areas, and the target overlapping area may be any one of the two overlapping areas, which is not limited in the embodiment of the present application. The server needs to obtain at least 4 first tracking coordinates, because a coordinate transformation matrix of the first monitoring device needs to be calculated according to the at least 4 first tracking coordinates in a subsequent process, and a specific calculation process will be described in detail later.
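The requirement of at least 4 point pairs, no three collinear, matches what is needed to determine a 3x3 homography up to scale. The patent defers the calculation, so the sketch below uses the standard direct linear transform (DLT) as an assumed concrete choice for estimating the target coordinate transformation matrix from (pixel, plane) pairs:

```python
import numpy as np

def estimate_homography(pixel_pts, plane_pts):
    """Estimate the 3x3 conversion matrix H with H @ (u, v, 1) ~ (x, y, 1)
    from >= 4 point pairs (no three collinear) via the direct linear
    transform: stack two linear constraints per pair and take the right
    singular vector for the smallest singular value."""
    A = []
    for (u, v), (x, y) in zip(pixel_pts, plane_pts):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the arbitrary scale so H[2][2] = 1

# Four pixel/plane pairs related by a pure scaling of factor 0.5:
pixels = [(0, 0), (100, 0), (100, 100), (0, 100)]
planes = [(0, 0), (50, 0), (50, 50), (0, 50)]
H = estimate_homography(pixels, planes)
print(np.round(H, 3))
```

With real data, more than 4 pairs would be collected and the same least-squares machinery would average out detection noise.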
Step 203, for each first tracking coordinate, obtaining a second tracking coordinate with the same timestamp as the first tracking coordinate in the video image of the target overlapping area shot by the second monitoring device, and determining a plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and the coordinate transformation matrix of the second monitoring device.
In implementation, similarly, the server may further obtain a plurality of tracking coordinates in the video image of the target overlapping area captured by the second monitoring device and a timestamp of each tracking coordinate.
For each first tracking coordinate, the server may determine, according to its timestamp, a second tracking coordinate with the same timestamp from among the plurality of tracking coordinates in the video image of the target overlapping area captured by the second monitoring device. The first tracking coordinate and its corresponding second tracking coordinate are the pixel coordinates of the same tracked object, at the same time, in the video images captured by the first monitoring device and the second monitoring device respectively. For example, assume that the first tracking coordinate is (u1, v1) with timestamp 10 s, and the plurality of tracking coordinates in the video image of the target overlapping area captured by the second monitoring device are: (u2, v2) with timestamp 8 s; (u3, v3) with timestamp 9 s; and (u4, v4) with timestamp 10 s. Then the second tracking coordinate with the same timestamp as the first tracking coordinate (u1, v1) is (u4, v4).
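As a minimal illustration of the timestamp lookup described above (all coordinate and timestamp values below are hypothetical, not from the patent):

```python
# Pick, from the second device's tracking coordinates, the one whose
# timestamp equals that of a given first tracking coordinate.
first_coord, first_ts = (120, 340), 10  # (u1, v1) with timestamp 10 s

# tracking coordinates from the second device: (pixel coordinate, timestamp)
second_coords = [((200, 100), 8), ((210, 110), 9), ((220, 120), 10)]

second_match = next(
    (coord for coord, ts in second_coords if ts == first_ts), None
)
print(second_match)  # (220, 120) -- same timestamp as the first coordinate
```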
The server may be preconfigured with the coordinate transformation matrix corresponding to the second monitoring device, and may determine the plane coordinate corresponding to the second tracking coordinate according to that matrix and the second tracking coordinate. The plane coordinates are coordinates in some planar rectangular coordinate system, for example a Gaussian plane coordinate system, or a planar rectangular coordinate system established by a technician according to the actual road section. Since the first tracking coordinate and its corresponding second tracking coordinate are the pixel coordinates of the same tracked object, at the same time, in the video images captured by the first and second monitoring devices, the plane coordinates of that object derived from the two devices are the same or similar. The server may therefore use the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate.
For example, assume that the second tracking coordinate is (u4, v4), and the coordinate transformation matrix corresponding to the second monitoring device is

$$A = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & 1 \end{bmatrix}$$

Then the plane coordinate (x4, y4) corresponding to the second tracking coordinate can be calculated according to formulas (1), (2) and (3), where

$$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \quad (1)$$

$$x = \frac{x'}{w'} = \frac{a_{11}u + a_{12}v + a_{13}}{a_{31}u + a_{32}v + 1} \quad (2)$$

$$y = \frac{y'}{w'} = \frac{a_{21}u + a_{22}v + a_{23}}{a_{31}u + a_{32}v + 1} \quad (3)$$

where u is the abscissa of a pixel coordinate (i.e., a tracking coordinate), v is the ordinate of the pixel coordinate, x is the abscissa of the plane coordinate corresponding to the pixel coordinate, y is the ordinate of that plane coordinate, and A is the coordinate transformation matrix.
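Under the assumption that the coordinate transformation matrix is the standard 3×3 perspective-transformation matrix with its last element fixed to 1 (consistent with the 8 unknown parameters mentioned later), the pixel-to-plane mapping can be sketched as follows; the matrix values are hypothetical:

```python
def pixel_to_plane(u, v, A):
    """Map a pixel coordinate (u, v) to a plane coordinate (x, y) using a
    3x3 coordinate transformation matrix A: homogeneous multiply (formula
    (1)), then divide through by w' (formulas (2) and (3))."""
    xp = A[0][0] * u + A[0][1] * v + A[0][2]   # x' = a11*u + a12*v + a13
    yp = A[1][0] * u + A[1][1] * v + A[1][2]   # y' = a21*u + a22*v + a23
    wp = A[2][0] * u + A[2][1] * v + A[2][2]   # w' = a31*u + a32*v + 1
    return xp / wp, yp / wp

# Hypothetical matrix: a pure translation by (5, 3) in plane coordinates
A = [[1, 0, 5], [0, 1, 3], [0, 0, 1]]
print(pixel_to_plane(100, 200, A))  # (105.0, 203.0)
```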
Step 204, taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate.
In an implementation, for each first tracking coordinate, the server may use the plane coordinate corresponding to the second tracking coordinate determined for that first tracking coordinate in step 203 as the plane coordinate corresponding to the first tracking coordinate.
Step 205, determining a target coordinate transformation matrix corresponding to the first monitoring device according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate.
In implementation, from formulas (1), (2) and (3), the conversion relationship between a tracking coordinate and its corresponding plane coordinate can be rewritten as formulas (4) and (5). According to the plurality of first tracking coordinates, the plane coordinate corresponding to each first tracking coordinate, and formulas (4) and (5), the server may calculate the perspective transformation matrix between the first tracking coordinates and their corresponding plane coordinates, that is, the coordinate transformation matrix corresponding to the first monitoring device.

$$a_{11}u + a_{12}v + a_{13} - a_{31}ux - a_{32}vx = x \quad (4)$$

$$a_{21}u + a_{22}v + a_{23} - a_{31}uy - a_{32}vy = y \quad (5)$$
For example, assume that there are 4 first tracking coordinates: (u1, v1), (u2, v2), (u3, v3) and (u4, v4), whose corresponding plane coordinates are (x1, y1), (x2, y2), (x3, y3) and (x4, y4) respectively. Formula (4) is obtained from formulas (1) and (2), and formula (5) is obtained from formulas (1) and (3). The server substitutes each first tracking coordinate and its corresponding plane coordinate into formulas (4) and (5) to obtain the system of equations shown in formula (6), and solves that system for the value of each parameter in the coordinate transformation matrix, thereby obtaining the coordinate transformation matrix corresponding to the first monitoring device. Because the coordinate transformation matrix corresponding to the first monitoring device has 8 unknown parameters, at least 8 equations are required to determine all of them; the 4 first tracking coordinates and their corresponding plane coordinates yield exactly 8 equations, so at least 4 first tracking coordinates, each with its corresponding plane coordinate, are required.
$$\begin{cases} a_{11}u_i + a_{12}v_i + a_{13} - a_{31}u_ix_i - a_{32}v_ix_i = x_i \\ a_{21}u_i + a_{22}v_i + a_{23} - a_{31}u_iy_i - a_{32}v_iy_i = y_i \end{cases} \quad i = 1, 2, 3, 4 \quad (6)$$
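A sketch of solving the 8 unknown parameters from 4 point correspondences, assuming the standard perspective-transformation form with the last matrix element fixed to 1; the correspondences below are synthetic (a pure translation) chosen only so the result is easy to check:

```python
import numpy as np

def solve_transform(pixel_pts, plane_pts):
    """Solve the 8 unknowns a11..a32 of the coordinate transformation
    matrix from >= 4 correspondences via the linear equations
      a11*u + a12*v + a13 - a31*u*x - a32*v*x = x
      a21*u + a22*v + a23 - a31*u*y - a32*v*y = y
    (two equations per point pair)."""
    rows, rhs = [], []
    for (u, v), (x, y) in zip(pixel_pts, plane_pts):
        rows.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); rhs.append(x)
        rows.append([0, 0, 0, u, v, 1, -u * y, -v * y]); rhs.append(y)
    params, *_ = np.linalg.lstsq(np.array(rows, float),
                                 np.array(rhs, float), rcond=None)
    a11, a12, a13, a21, a22, a23, a31, a32 = params
    return np.array([[a11, a12, a13], [a21, a22, a23], [a31, a32, 1.0]])

# Synthetic correspondences generated by a translation of (+5, +3)
pix = [(0, 0), (100, 0), (0, 100), (100, 100)]
pln = [(5, 3), (105, 3), (5, 103), (105, 103)]
A = solve_transform(pix, pln)
# recovers [[1, 0, 5], [0, 1, 3], [0, 0, 1]] up to numerical error
print(np.round(A, 6))
```

In practice a library routine such as OpenCV's perspective-transform estimation could be used instead; the explicit linear system above mirrors formula (6).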
Step 206, when the tracking coordinate of the target tracking object in a video image shot by the first monitoring device is obtained, determining the plane coordinate of the target tracking object according to the tracking coordinate of the target tracking object and the target coordinate transformation matrix.
In implementation, when the tracking coordinate of the target tracking object in a video image shot by the first monitoring device is acquired, the server may determine the plane coordinate of the target tracking object according to that tracking coordinate and the target coordinate transformation matrix, and store the plane coordinate of the target tracking object so as to obtain the track information of the target tracking object.
In this way, the first monitoring device whose video images have deviated can be determined, the coordinate transformation matrix corresponding to the first monitoring device can be recalibrated, and the accuracy of the track information of the tracked object can be improved.
Optionally, the manner in which the server obtains the first tracking coordinates and the timestamp of each first tracking coordinate may be various, and the present application provides a feasible processing manner, see fig. 3, which specifically includes the following steps:
step 301, receiving a plurality of tracking coordinate sets sent by a first monitoring device.
Each tracking coordinate set comprises at least one tracking coordinate, and the time stamps of the tracking coordinates in the same tracking coordinate set are the same.
In implementation, the first monitoring device may acquire tracking coordinates in a video image of the first video captured by the first monitoring device at preset time intervals, and for each tracking coordinate, use the capturing time of the video image to which the tracking coordinate belongs as a timestamp of the tracking coordinate. The first monitoring device takes the tracking coordinate acquired each time and the timestamp of the tracking coordinate as a tracking coordinate set, and then sends the tracking coordinate set to the server. The server may receive a plurality of sets of tracking coordinates transmitted by the first monitoring device.
Step 302, determining tracking coordinates in a coordinate range according to a preset coordinate range of a target overlapping area corresponding to the first monitoring device for each tracking coordinate group.
In implementation, the server is preconfigured with the pixel coordinate range to which the target overlapping area maps in the video image shot by the first monitoring device. For each tracking coordinate set, the server can determine which tracking coordinates in that set fall within this pixel coordinate range.
Step 303, if the number of the determined tracking coordinates is 1, the determined tracking coordinates are used as first tracking coordinates, and a timestamp of the first tracking coordinates is obtained.
In implementation, for each tracking coordinate group, if the number of determined tracking coordinates from the tracking coordinate group is 1, the server takes the determined tracking coordinates as first tracking coordinates and acquires a timestamp of the first tracking coordinates. Or, if the number of the determined tracking coordinates is 1, the server may further determine whether the determined tracking coordinates are close to the edge of the video image captured by the first monitoring device, and if the determined tracking coordinates are close to the edge of the video image captured by the first monitoring device, the server may use the determined tracking coordinates as the first tracking coordinates.
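Steps 302 and 303 can be sketched as follows; the overlap coordinate range and the tracking coordinate sets are hypothetical values:

```python
# Keep a tracking coordinate set's hit only when exactly one of its
# coordinates falls in the preset pixel range of the target overlap area.
OVERLAP_U = (800, 1280)   # assumed u (column) range of the overlap
OVERLAP_V = (0, 400)      # assumed v (row) range of the overlap

def in_overlap(coord):
    u, v = coord
    return OVERLAP_U[0] <= u <= OVERLAP_U[1] and OVERLAP_V[0] <= v <= OVERLAP_V[1]

coordinate_sets = [
    ([(900, 100), (100, 50)], 8),    # one coordinate in range -> keep
    ([(850, 200), (1000, 300)], 9),  # two in range -> ambiguous, skip
    ([(120, 340)], 10),              # none in range -> skip
]

first_coords = []
for coords, ts in coordinate_sets:
    hits = [c for c in coords if in_overlap(c)]
    if len(hits) == 1:               # only one tracked object in the overlap
        first_coords.append((hits[0], ts))
print(first_coords)  # [((900, 100), 8)]
```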
Optionally, the present application further provides another processing manner for acquiring the first tracking coordinates and the time stamps of the first tracking coordinates, referring to fig. 4, which specifically includes the following steps:
step 401, a first video shot by a first monitoring device is obtained.
In implementation, the first monitoring device shoots a first video of a monitoring area corresponding to the first monitoring device, and sends the first video to the server. The server may obtain a first video captured by a first monitoring device.
Step 402, collecting tracking coordinates in the video image of the first video according to a preset time interval, and using shooting time of the video image to which the tracking coordinates belong as a timestamp of the tracking coordinates.
In implementation, the server may acquire tracking coordinates in the video image of the first video at preset time intervals, and for each tracking coordinate, use the shooting time of the video image to which the tracking coordinate belongs as a timestamp of the tracking coordinate.
Step 403, for each group of collected tracking coordinates with the same timestamp, determining the tracking coordinates within the coordinate range according to the preset coordinate range of the target overlapping area corresponding to the first monitoring device.
In implementation, a coordinate range of the target overlapping area mapped to the video image shot by the first monitoring device is preset in the server, the server can divide the tracking coordinates with the same timestamp in the collected tracking coordinates into a group, and then, for each group of the tracking coordinates with the same timestamp, the server can determine the tracking coordinates in the coordinate range.
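The grouping by timestamp described above can be sketched as follows, with hypothetical collected coordinates:

```python
from collections import defaultdict

# Group collected (coordinate, timestamp) pairs by timestamp before
# applying the overlap-range check to each group.
collected = [((900, 100), 8), ((100, 50), 8), ((850, 200), 9), ((120, 340), 10)]

groups = defaultdict(list)
for coord, ts in collected:
    groups[ts].append(coord)
print(dict(groups))
# {8: [(900, 100), (100, 50)], 9: [(850, 200)], 10: [(120, 340)]}
```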
In step 404, if the number of the determined tracking coordinates is 1, the determined tracking coordinates are used as first tracking coordinates, and a timestamp of the first tracking coordinates is obtained.
In an implementation, for each set of tracking coordinates acquired with the same timestamp, if the number of determined tracking coordinates from the set of tracking coordinates is 1, the server may regard the determined tracking coordinates as first tracking coordinates and acquire a timestamp of the first tracking coordinates. Alternatively, if the number of the determined tracking coordinates is 1, the server may further determine whether the tracking coordinates are close to the edge of the video image captured by the first monitoring apparatus, and if the tracking coordinates are close to the edge of the video image captured by the first monitoring apparatus, the server may regard the tracking coordinates as the first tracking coordinates.
Optionally, as shown in fig. 5, an embodiment of the present application further provides a processing procedure of a monitoring device for determining an image detection deviation, which may specifically include the following steps:
step 501, for each monitoring device, acquiring a plane coordinate of a first tracked object in a video image of a first overlapping area captured by the monitoring device, and recording a timestamp of the plane coordinate of the first tracked object.
In an implementation, for each monitoring device, the server may determine a first tracking object appearing in a video image of a first overlapping area captured by the monitoring device within a preset time period, and acquire a pixel coordinate of the first tracking object in the video image of the first overlapping area captured by the monitoring device and a timestamp of the pixel coordinate. Then, the server may determine the plane coordinate of the first tracked object according to the pixel coordinate of the first tracked object and the coordinate conversion matrix corresponding to the monitoring device, and record a timestamp of the plane coordinate of the first tracked object as the timestamp of the pixel coordinate of the first tracked object.
As shown in fig. 1, the monitoring area of the monitoring device may include two overlapping areas, so the first overlapping area may itself consist of two overlapping areas, denoted the 1A overlapping area and the 1B overlapping area. The server may determine a first tracking object that appears in the video image of the 1A overlapping area captured by the monitoring device within a preset time period, and then acquire the pixel coordinates of that first tracking object in the video image of the 1A overlapping area and the timestamp of those pixel coordinates. Likewise, the server may determine a first tracking object that appears in the video image of the 1B overlapping area captured by the monitoring device within the preset time period, and then acquire the pixel coordinates of that first tracking object in the video image of the 1B overlapping area and the timestamp of those pixel coordinates.
Step 502, acquiring a plane coordinate of a second tracking object, which is the same as a time stamp of the plane coordinate of the first tracking object, in the video image of the first overlapping area shot by the monitoring device adjacent to the monitoring device.
The monitoring area shot by the monitoring device adjacent to this monitoring device covers the first overlapping area. This adjacent monitoring device will hereinafter be referred to as the third monitoring device.
In an implementation, the server may acquire pixel coordinates of the tracking object in the video image of the first overlapping area captured by the third monitoring device and a timestamp of each pixel coordinate. The server may determine, according to the pixel coordinate of the tracked object in the video image of the first overlapping area captured by the third monitoring device and the coordinate conversion matrix corresponding to the third monitoring device, the plane coordinate of the tracked object in the video image of the first overlapping area captured by the third monitoring device, and record the timestamp of the plane coordinate of the tracked object in the video image of the first overlapping area captured by the third monitoring device as the timestamp of the pixel coordinate of the tracked object in the video image of the first overlapping area captured by the third monitoring device. Then, for each first tracked object, the server may determine, from the plane coordinates of the tracked object in the video image of the first overlapping area captured by the third monitoring device, the plane coordinates of a second tracked object having the same time stamp as the plane coordinates of the first tracked object.
When the first overlap area includes the 1A overlap area and the 1B overlap area, the third monitoring device adjacent to the monitoring device may include a 3A monitoring device that photographs the 1A overlap area and a 3B monitoring device that photographs the 1B overlap area, and the server may acquire pixel coordinates of the tracking object in the video image of the 1A overlap area photographed by the 3A monitoring device and a time stamp of each pixel coordinate. The server may determine, according to the pixel coordinates of the tracking object in the video image of the 1A overlapping region captured by the 3A monitoring device and the coordinate conversion matrix corresponding to the 3A monitoring device, the plane coordinates of the tracking object in the video image of the 1A overlapping region captured by the 3A monitoring device, and record the time stamp of the plane coordinates of the tracking object in the video image of the 1A overlapping region captured by the 3A monitoring device as the time stamp of the pixel coordinates of the tracking object in the video image of the 1A overlapping region captured by the 3A monitoring device. Then, for each first tracked object in the video image of the 1A overlapping area captured by the monitoring device, the server may determine, from the plane coordinates of the tracked object in the video image of the 1A overlapping area captured by the 3A monitoring device, the plane coordinates of a second tracked object having the same time stamp as the plane coordinates of the first tracked object. Similarly, for each first tracked object in the video image of the 1B overlap region captured by the monitoring device, the server may determine, from the plane coordinates of the tracked object in the video image of the 1B overlap region captured by the 3B monitoring device, the plane coordinates of a second tracked object having the same time stamp as the plane coordinates of the first tracked object.
Step 503, determining the distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object.
In an implementation, for each first tracking object, the server may determine a distance between the plane coordinates of the first tracking object and the plane coordinates of the second tracking object determined from the plane coordinates of the first tracking object (i.e., a distance between the first tracking object and the second tracking object), and if there are a plurality of plane coordinates of the second tracking object determined from the plane coordinates of the first tracking object, the server may determine distances between the plane coordinates of the first tracking object and the plane coordinates of the respective second tracking objects. For example, assuming that the plane coordinates of the first tracked object are (5,6), and the plane coordinates of the second tracked object determined from the plane coordinates (5,6) of the first tracked object are (8,10) and (11,14), respectively, the distances between the first tracked object and each of the second tracked objects are 5 and 10, respectively.
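The distance computation in this example can be reproduced as follows:

```python
import math

def plane_distance(p, q):
    """Euclidean distance between two plane coordinates."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Values from the example above: first object at (5, 6), candidate
# second objects at (8, 10) and (11, 14)
first = (5, 6)
seconds = [(8, 10), (11, 14)]
print([plane_distance(first, s) for s in seconds])  # [5.0, 10.0]
```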
When the first overlap region includes the 1A overlap region and the 1B overlap region, for each first tracked object in the video image of the 1A overlap region captured by the monitoring apparatus, the server may determine a distance between the plane coordinates of the first tracked object and the plane coordinates of the second tracked object determined from the plane coordinates of the first tracked object. Similarly, for each first tracked object in the video image of the 1B overlap region captured by the monitoring device, the server may determine a distance between the plane coordinates of the first tracked object and the plane coordinates of the second tracked object determined from the plane coordinates of the first tracked object.
Step 504, if a second tracking object with the distance from the first tracking object smaller than a preset first threshold exists, judging that the first tracking object is successfully matched; and if no second tracking object with the distance from the first tracking object smaller than a preset first threshold exists, judging that the first tracking object fails to be matched.
In implementation, for each first tracked object, if the plane coordinates of a second tracked object whose distance from the plane coordinates of the first tracked object is smaller than a preset first threshold exists in the plane coordinates of each second tracked object determined according to the plane coordinates of the first tracked object, the server determines that the first tracked object is successfully matched (that is, the first tracked object and the second tracked object are the same tracked object); and if the plane coordinate of the second tracking object, the distance from which to the plane coordinate of the first tracking object is smaller than a preset first threshold value, does not exist in the plane coordinates of the second tracking objects determined according to the plane coordinate of the first tracking object, the server judges that the first tracking object fails to be matched.
When the first overlapping area comprises a 1A overlapping area and a 1B overlapping area, for each first tracking object in a video image of the 1A overlapping area shot by the monitoring equipment, if plane coordinates of a second tracking object with the distance from the plane coordinates of the first tracking object smaller than a preset first threshold exist in the plane coordinates of the second tracking object determined according to the plane coordinates of the first tracking object, the server judges that the first tracking object is successfully matched. And if the plane coordinate of the second tracking object, the distance from which to the plane coordinate of the first tracking object is less than a preset first threshold value, does not exist in the plane coordinates of the second tracking object determined according to the plane coordinates of the first tracking object, the server judges that the first tracking object fails to be matched. Similarly, for each first tracking object in the video image of the 1B overlapping area captured by the monitoring device, the server may determine whether the first tracking object is successfully matched.
And 505, determining a matching success rate corresponding to the monitoring equipment according to a ratio between the number of the successfully matched first tracking objects and the total number of the first tracking objects in the video image of the first overlapping area.
In implementation, the server may acquire the number N1 of first tracked objects in the video image of the first overlapping area within a preset time period, determine the number M1 of successfully matched first tracked objects among them, and use M1/N1 as the matching success rate corresponding to the monitoring device. Alternatively, the server may acquire the number M2 of successfully matched first tracked objects among a preset number N2 of first tracked objects in the video image of the first overlapping area, and use M2/N2 as the matching success rate corresponding to the monitoring device.
When the first overlapping area comprises the 1A overlapping area and the 1B overlapping area, the server can determine the total number N_A of first tracking objects in the video image of the 1A overlapping area shot by the monitoring device and the number M_A of successfully matched first tracking objects among them, as well as the total number N_B of first tracking objects in the video image of the 1B overlapping area shot by the monitoring device and the number M_B of successfully matched first tracking objects among them. The server can then use (M_A + M_B)/(N_A + N_B) as the matching success rate of the monitoring device.
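With hypothetical counts for the two overlapping areas, the combined matching success rate can be sketched as:

```python
# total / matched first tracking objects in each overlap area (hypothetical)
N_A, M_A = 40, 36   # 1A overlapping area
N_B, M_B = 60, 54   # 1B overlapping area

# combine both areas into one matching success rate for the device
success_rate = (M_A + M_B) / (N_A + N_B)
print(success_rate)  # 0.9
```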
Step 506, determining the first monitoring device with the matching success rate smaller than a preset second threshold.
In implementation, the server may determine, from among the monitoring devices, the first monitoring device whose matching success rate is smaller than the preset second threshold. If there are a plurality of monitoring devices whose matching success rates are smaller than the preset second threshold, the server takes the monitoring device with the smallest matching success rate as the first monitoring device.
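Step 506 can be sketched as follows, with hypothetical device names and rates:

```python
# Among devices whose matching success rate falls below the threshold,
# pick the lowest as the first (deviating) monitoring device.
THRESHOLD = 0.8
rates = {"cam-1": 0.95, "cam-2": 0.72, "cam-3": 0.65}

candidates = {dev: r for dev, r in rates.items() if r < THRESHOLD}
first_device = min(candidates, key=candidates.get) if candidates else None
print(first_device)  # cam-3
```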
In this way, the first monitoring device with the video image deviation can be determined from the monitoring devices.
Based on the same technical concept, as shown in fig. 6, an embodiment of the present invention further provides a tracking apparatus, where the apparatus is applied to a server in a multi-video monitoring system, and the multi-video monitoring system further includes a plurality of monitoring devices, where monitoring areas shot by adjacent monitoring devices have overlapping areas, and the apparatus includes:
a first determining module 601, configured to determine a first monitoring device with an image detection deviation, where a tracked object in a video image captured by the first monitoring device and a tracked object in a video image captured by a monitoring device adjacent to the first monitoring device do not meet a preset matching condition;
a first obtaining module 602, configured to obtain a plurality of first tracking coordinates in a video image of a target overlapping area captured by the first monitoring device and a timestamp of each first tracking coordinate, where the first tracking coordinate is a pixel coordinate of a tracking object in the video image captured by the first monitoring device when only one tracking object is in the target overlapping area of the first monitoring device and the second monitoring device;
a second obtaining module 603, configured to obtain, for each first tracking coordinate, a second tracking coordinate, which is the same as a timestamp of the first tracking coordinate, in the video image of the target overlapping area captured by the second monitoring device, and determine, according to the second tracking coordinate and the coordinate transformation matrix of the second monitoring device, a plane coordinate corresponding to the second tracking coordinate;
a second determining module 604, configured to use the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
a third determining module 605, configured to determine, according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate, a target coordinate transformation matrix corresponding to the first monitoring device;
a fourth determining module 606, configured to determine, when the tracking coordinate of the target tracking object in the video image captured by the first monitoring device is obtained, a plane coordinate corresponding to the target tracking object according to the tracking coordinate of the target tracking object and the target coordinate conversion matrix.
Optionally, the first obtaining module 602 includes:
the receiving unit is used for receiving a plurality of tracking coordinate groups sent by the first monitoring equipment, wherein each tracking coordinate group comprises at least one tracking coordinate, and timestamps of the tracking coordinates in the same tracking coordinate group are the same;
the first determining unit is used for determining tracking coordinates in a coordinate range according to the preset coordinate range of a target overlapping area corresponding to the first monitoring equipment for each tracking coordinate group;
a first acquisition unit configured to, if the number of the determined tracking coordinates is 1, take the determined tracking coordinates as first tracking coordinates, and acquire a time stamp of the first tracking coordinates.
Optionally, the first obtaining module 602 includes:
the second acquisition unit is used for acquiring a first video shot by the first monitoring equipment;
the acquisition unit is used for acquiring tracking coordinates in a video image of a first video according to a preset time interval, and taking the shooting time of the video image to which the tracking coordinates belong as a timestamp of the tracking coordinates;
the second determining unit is used for determining the tracking coordinate in the coordinate range according to the preset coordinate range of the target overlapping area corresponding to the first monitoring device aiming at the tracking coordinate with the same acquired timestamp in each group;
and a third acquiring unit configured to, if the number of the determined tracking coordinates is 1, take the determined tracking coordinates as first tracking coordinates, and acquire a timestamp of the first tracking coordinates.
Optionally, the first determining module 601 includes:
the fourth acquisition unit is used for acquiring the plane coordinates of the first tracking object in the video image of the first overlapping area shot by each monitoring device and recording the time stamp of the plane coordinates of the first tracking object;
a fifth acquiring unit, configured to acquire a plane coordinate of a second tracking object having the same time stamp as a plane coordinate of the first tracking object in the video image of the first overlap area captured by the monitoring device adjacent to the monitoring device;
a third determining unit configured to determine a distance between the first tracking object and the second tracking object according to the plane coordinates of the first tracking object and the plane coordinates of the second tracking object;
the matching unit is used for judging that the first tracking object is successfully matched if a second tracking object with the distance from the first tracking object smaller than a preset first threshold exists; if a second tracking object with the distance from the first tracking object smaller than the preset first threshold value does not exist, judging that the first tracking object fails to be matched;
the fourth determining unit is used for determining the matching success rate corresponding to the monitoring equipment according to the proportion between the number of the successfully matched first tracking objects and the total number of the first tracking objects in the video image of the first overlapping area;
and the fifth determining unit is used for determining the first monitoring equipment with the matching success rate smaller than a preset second threshold value.
An embodiment of the present invention further provides an electronic device, as shown in fig. 7, including a processor 701, a communication interface 702, a memory 703 and a communication bus 704, where the processor 701, the communication interface 702 and the memory 703 communicate with each other through the communication bus 704.
a memory 703 for storing a computer program;
the processor 701 is configured to implement the following steps when executing the program stored in the memory 703:
determining a first monitoring device in which an image detection deviation occurs, wherein a tracking object in a video image captured by the first monitoring device and a tracking object in a video image captured by a monitoring device adjacent to the first monitoring device do not satisfy a preset matching condition;
acquiring a plurality of first tracking coordinates in a video image of a target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, wherein a first tracking coordinate is the pixel coordinate of a tracking object in a video image captured by the first monitoring device at a moment when only one tracking object is present in the target overlapping area between the first monitoring device and a second monitoring device;
for each first tracking coordinate, acquiring a second tracking coordinate having the same timestamp as the first tracking coordinate in the video image of the target overlapping area captured by the second monitoring device, and determining the plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and the coordinate conversion matrix of the second monitoring device;
taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
determining a target coordinate conversion matrix corresponding to the first monitoring device according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate;
and when the tracking coordinate of a target tracking object in a video image captured by the first monitoring device is obtained, determining the plane coordinate corresponding to the target tracking object according to the tracking coordinate of the target tracking object and the target coordinate conversion matrix.
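The description does not specify how the target coordinate conversion matrix is computed from the accumulated pixel/plane correspondences. A common realization, offered here only as an illustrative, non-limiting sketch, models the conversion matrix as a planar homography estimated by the direct linear transform (DLT); the function names and the use of NumPy are assumptions of this sketch, not part of the disclosure:

```python
import numpy as np

def estimate_homography(pixel_pts, plane_pts):
    """Estimate a 3x3 coordinate conversion matrix H mapping pixel
    coordinates to plane coordinates via the direct linear transform.
    Requires at least four non-degenerate correspondences."""
    A = []
    for (x, y), (u, v) in zip(pixel_pts, plane_pts):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize scale (assumes H[2, 2] is nonzero)

def pixel_to_plane(H, pt):
    """Apply H to one pixel coordinate, returning the plane coordinate."""
    x, y = pt
    w = H @ np.array([x, y, 1.0])
    return (w[0] / w[2], w[1] / w[2])
```

With four or more non-degenerate correspondences the estimate is exact for noise-free data; in practice, a least-squares fit over the many timestamped correspondences accumulated in the steps above would absorb detection noise.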
Optionally, the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, includes:
receiving a plurality of tracking coordinate groups sent by the first monitoring device, wherein each tracking coordinate group includes at least one tracking coordinate, and the tracking coordinates in the same group share the same timestamp;
for each tracking coordinate group, determining the tracking coordinates that fall within a preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and if the number of tracking coordinates so determined is 1, taking that tracking coordinate as a first tracking coordinate and acquiring its timestamp.
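The grouping-and-filtering step above can be sketched as follows; the data layout (a list of `(timestamp, coordinates)` pairs) and the function name are illustrative assumptions, not part of the disclosure:

```python
def select_first_tracking_coordinates(coordinate_groups, overlap_range):
    """From timestamped coordinate groups reported by the first monitoring
    device, keep only the timestamps at which exactly one tracking
    coordinate falls inside the preset overlap-area coordinate range."""
    (x_min, y_min), (x_max, y_max) = overlap_range
    result = []  # list of (timestamp, first tracking coordinate)
    for timestamp, coords in coordinate_groups:
        inside = [(x, y) for (x, y) in coords
                  if x_min <= x <= x_max and y_min <= y <= y_max]
        if len(inside) == 1:  # exactly one object in the overlap area
            result.append((timestamp, inside[0]))
    return result
```

Timestamps with zero or multiple objects inside the overlap area are discarded, which matches the requirement that a first tracking coordinate exists only when a single tracking object is present there.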
Optionally, the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, includes:
acquiring a first video captured by the first monitoring device;
acquiring tracking coordinates from video images of the first video at a preset time interval, and taking the shooting time of the video image to which each tracking coordinate belongs as the timestamp of that tracking coordinate;
for each group of acquired tracking coordinates sharing the same timestamp, determining the tracking coordinates that fall within the preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and if the number of tracking coordinates so determined is 1, taking that tracking coordinate as a first tracking coordinate and acquiring its timestamp.
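The interval-sampling variant above can be sketched as follows, again with hypothetical names; integer millisecond shooting times are assumed to avoid floating-point stepping, and per-frame detections are assumed to be available as a mapping from shooting time to coordinates:

```python
def sample_tracking_coordinates(detections, start_time, end_time, interval):
    """Sample tracking coordinates from a video at a preset time interval.
    `detections` maps a frame's shooting time (in ms) to the list of
    tracking coordinates detected in that frame; the shooting time itself
    serves as the timestamp of the sampled coordinates."""
    samples = []
    t = start_time
    while t <= end_time:
        coords = detections.get(t, [])
        samples.append((t, coords))  # timestamp = shooting time of the frame
        t += interval
    return samples
```

The sampled groups can then be passed through the same overlap-area filter as in the first variant to obtain the first tracking coordinates.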
Optionally, the determining a first monitoring device in which an image detection deviation occurs includes:
for each monitoring device, acquiring the plane coordinates of a first tracking object in the video image of the first overlapping area captured by that monitoring device, and recording the timestamp of the plane coordinates of the first tracking object;
acquiring the plane coordinates of a second tracking object whose timestamp is the same as that of the plane coordinates of the first tracking object, in the video image of the first overlapping area captured by the monitoring device adjacent to that monitoring device;
determining the distance between the first tracking object and the second tracking object according to their plane coordinates;
if there exists a second tracking object whose distance from the first tracking object is smaller than a preset first threshold, determining that the first tracking object is successfully matched; if no such second tracking object exists, determining that the matching of the first tracking object fails;
determining the matching success rate of the monitoring device according to the ratio of the number of successfully matched first tracking objects to the total number of first tracking objects in the video image of the first overlapping area;
and determining, as the first monitoring device, any monitoring device whose matching success rate is smaller than a preset second threshold.
The communication bus of the above electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), for example, at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In a further embodiment of the present invention, there is also provided a computer-readable storage medium in which a computer program is stored; when executed by a processor, the computer program implements the steps of any of the above-described tracking methods.
In a further embodiment of the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the tracking methods of the above embodiments.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The embodiments in this specification are described in a related manner; for the same or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, for the apparatus, electronic device, computer-readable storage medium, and computer program product embodiments, since they are substantially similar to the method embodiments, the description is relatively brief, and for relevant details reference may be made to the corresponding parts of the description of the method embodiments.
The above description is merely of preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A tracking method, applied to a server in a multi-video surveillance system, the multi-video surveillance system further including a plurality of monitoring devices, wherein surveillance areas captured by adjacent monitoring devices have overlapping areas, the method comprising:
determining a first monitoring device in which an image detection deviation occurs, wherein a tracking object in a video image captured by the first monitoring device and a tracking object in a video image captured by a monitoring device adjacent to the first monitoring device do not satisfy a preset matching condition;
acquiring a plurality of first tracking coordinates in a video image of a target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, wherein a first tracking coordinate is the pixel coordinate of a tracking object in a video image captured by the first monitoring device at a moment when only one tracking object is present in the target overlapping area between the first monitoring device and a second monitoring device;
for each first tracking coordinate, acquiring a second tracking coordinate having the same timestamp as the first tracking coordinate in the video image of the target overlapping area captured by the second monitoring device, and determining the plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and the coordinate conversion matrix of the second monitoring device;
taking the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
determining a target coordinate conversion matrix corresponding to the first monitoring device according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate;
and when the tracking coordinate of a target tracking object in a video image captured by the first monitoring device is obtained, determining the plane coordinate corresponding to the target tracking object according to the tracking coordinate of the target tracking object and the target coordinate conversion matrix.
2. The method according to claim 1, wherein the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, comprises:
receiving a plurality of tracking coordinate groups sent by the first monitoring device, wherein each tracking coordinate group includes at least one tracking coordinate, and the tracking coordinates in the same group share the same timestamp;
for each tracking coordinate group, determining the tracking coordinates that fall within a preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and if the number of tracking coordinates so determined is 1, taking that tracking coordinate as a first tracking coordinate and acquiring its timestamp.
3. The method according to claim 1, wherein the acquiring a plurality of first tracking coordinates in the video image of the target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, comprises:
acquiring a first video captured by the first monitoring device;
acquiring tracking coordinates from video images of the first video at a preset time interval, and taking the shooting time of the video image to which each tracking coordinate belongs as the timestamp of that tracking coordinate;
for each group of acquired tracking coordinates sharing the same timestamp, determining the tracking coordinates that fall within the preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and if the number of tracking coordinates so determined is 1, taking that tracking coordinate as a first tracking coordinate and acquiring its timestamp.
4. The method according to claim 1, wherein the determining a first monitoring device in which an image detection deviation occurs comprises:
for each monitoring device, acquiring the plane coordinates of a first tracking object in the video image of the first overlapping area captured by that monitoring device, and recording the timestamp of the plane coordinates of the first tracking object;
acquiring the plane coordinates of a second tracking object whose timestamp is the same as that of the plane coordinates of the first tracking object, in the video image of the first overlapping area captured by the monitoring device adjacent to that monitoring device;
determining the distance between the first tracking object and the second tracking object according to their plane coordinates;
if there exists a second tracking object whose distance from the first tracking object is smaller than a preset first threshold, determining that the first tracking object is successfully matched; if no such second tracking object exists, determining that the matching of the first tracking object fails;
determining the matching success rate of the monitoring device according to the ratio of the number of successfully matched first tracking objects to the total number of first tracking objects in the video image of the first overlapping area;
and determining, as the first monitoring device, any monitoring device whose matching success rate is smaller than a preset second threshold.
5. A tracking apparatus, applied to a server in a multi-video surveillance system, the multi-video surveillance system further comprising a plurality of monitoring devices, wherein surveillance areas captured by adjacent monitoring devices have overlapping areas, the apparatus comprising:
a first determining module, configured to determine a first monitoring device in which an image detection deviation occurs, wherein a tracking object in a video image captured by the first monitoring device and a tracking object in a video image captured by a monitoring device adjacent to the first monitoring device do not satisfy a preset matching condition;
a first acquisition module, configured to acquire a plurality of first tracking coordinates in a video image of a target overlapping area captured by the first monitoring device, and a timestamp of each first tracking coordinate, wherein a first tracking coordinate is the pixel coordinate of a tracking object in a video image captured by the first monitoring device at a moment when only one tracking object is present in the target overlapping area between the first monitoring device and a second monitoring device;
a second acquisition module, configured to acquire, for each first tracking coordinate, a second tracking coordinate having the same timestamp as the first tracking coordinate in the video image of the target overlapping area captured by the second monitoring device, and to determine the plane coordinate corresponding to the second tracking coordinate according to the second tracking coordinate and the coordinate conversion matrix of the second monitoring device;
a second determining module, configured to take the plane coordinate corresponding to the second tracking coordinate as the plane coordinate corresponding to the first tracking coordinate;
a third determining module, configured to determine a target coordinate conversion matrix corresponding to the first monitoring device according to the plurality of first tracking coordinates and the plane coordinate corresponding to each first tracking coordinate;
and a fourth determining module, configured to determine, when the tracking coordinate of a target tracking object in a video image captured by the first monitoring device is obtained, the plane coordinate corresponding to the target tracking object according to the tracking coordinate of the target tracking object and the target coordinate conversion matrix.
6. The apparatus according to claim 5, wherein the first acquisition module comprises:
a receiving unit, configured to receive a plurality of tracking coordinate groups sent by the first monitoring device, wherein each tracking coordinate group includes at least one tracking coordinate, and the tracking coordinates in the same group share the same timestamp;
a first determining unit, configured to determine, for each tracking coordinate group, the tracking coordinates that fall within a preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and a first acquisition unit, configured to take the determined tracking coordinate as a first tracking coordinate and acquire its timestamp if the number of tracking coordinates so determined is 1.
7. The apparatus according to claim 5, wherein the first acquisition module comprises:
a second acquisition unit, configured to acquire a first video captured by the first monitoring device;
an acquisition unit, configured to acquire tracking coordinates from video images of the first video at a preset time interval, and to take the shooting time of the video image to which each tracking coordinate belongs as the timestamp of that tracking coordinate;
a second determining unit, configured to determine, for each group of acquired tracking coordinates sharing the same timestamp, the tracking coordinates that fall within the preset coordinate range of the target overlapping area corresponding to the first monitoring device;
and a third acquisition unit, configured to take the determined tracking coordinate as a first tracking coordinate and acquire its timestamp if the number of tracking coordinates so determined is 1.
8. The apparatus according to claim 5, wherein the first determining module comprises:
a fourth acquisition unit, configured to acquire, for each monitoring device, the plane coordinates of a first tracking object in the video image of the first overlapping area captured by that monitoring device, and to record the timestamp of the plane coordinates of the first tracking object;
a fifth acquisition unit, configured to acquire the plane coordinates of a second tracking object whose timestamp is the same as that of the plane coordinates of the first tracking object, in the video image of the first overlapping area captured by the monitoring device adjacent to that monitoring device;
a third determining unit, configured to determine the distance between the first tracking object and the second tracking object according to their plane coordinates;
a matching unit, configured to determine that the first tracking object is successfully matched if there exists a second tracking object whose distance from the first tracking object is smaller than a preset first threshold, and to determine that the matching of the first tracking object fails if no such second tracking object exists;
a fourth determining unit, configured to determine the matching success rate of the monitoring device according to the ratio of the number of successfully matched first tracking objects to the total number of first tracking objects in the video image of the first overlapping area;
and a fifth determining unit, configured to determine, as the first monitoring device, any monitoring device whose matching success rate is smaller than a preset second threshold.
9. An electronic device, comprising a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor, wherein the machine-executable instructions cause the processor to carry out the method steps of any one of claims 1 to 4.
10. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 1 to 4.
CN201910510134.0A 2019-06-13 2019-06-13 Tracking method and device Active CN111369587B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910510134.0A CN111369587B (en) 2019-06-13 2019-06-13 Tracking method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910510134.0A CN111369587B (en) 2019-06-13 2019-06-13 Tracking method and device

Publications (2)

Publication Number Publication Date
CN111369587A true CN111369587A (en) 2020-07-03
CN111369587B CN111369587B (en) 2023-05-02

Family

ID=71209989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910510134.0A Active CN111369587B (en) 2019-06-13 2019-06-13 Tracking method and device

Country Status (1)

Country Link
CN (1) CN111369587B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004072628A (en) * 2002-08-08 2004-03-04 Univ Waseda Moving body tracking system using a plurality of cameras and its method
CN106373143A (en) * 2015-07-22 2017-02-01 中兴通讯股份有限公司 Adaptive method and system
WO2017096761A1 (en) * 2015-12-10 2017-06-15 杭州海康威视数字技术股份有限公司 Method, device and system for looking for target object on basis of surveillance cameras
WO2019019943A1 (en) * 2017-07-28 2019-01-31 阿里巴巴集团控股有限公司 Method for tracing track of target in cross regions, and data processing method, apparatus and system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SOHAIB KHAN, ET AL: "Consistent Labeling of Tracked Objects in Multiple Cameras with Overlapping Fields of View", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE *
WANG Biao: "Research and Implementation of Multi-Camera Target Tracking Algorithms" *
DENG Yingna, ET AL: "Constructing a Multi-Camera Panoramic View for Target Tracking in Overlapping Regions", Journal of Xi'an University of Technology *

Also Published As

Publication number Publication date
CN111369587B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
CN113671480A (en) Radar and video fusion traffic target tracking method, system, equipment and terminal
US11599825B2 (en) Method and apparatus for training trajectory classification model, and electronic device
KR20200064873A (en) Method for detecting a speed employing difference of distance between an object and a monitoring camera
US11200406B2 (en) Customer flow statistical method, apparatus and device
CN105809658A (en) Method and apparatus for setting region of interest
CN112053572A (en) Vehicle speed measuring method, device and system based on video and distance grid calibration
CN111784730A (en) Object tracking method and device, electronic equipment and storage medium
CN115063454A (en) Multi-target tracking matching method, device, terminal and storage medium
CN115546705A (en) Target identification method, terminal device and storage medium
CN115908545A (en) Target track generation method and device, electronic equipment and medium
CN113033266A (en) Personnel motion trajectory tracking method, device and system and electronic equipment
CN111369587B (en) Tracking method and device
CN111372040B (en) Method and device for determining coordinate conversion parameters through multi-video monitoring
CN111105465A (en) Camera device calibration method, device, system electronic equipment and storage medium
CN115683046A (en) Distance measuring method, distance measuring device, sensor and computer readable storage medium
CN111462176B (en) Target tracking method, target tracking device and terminal equipment
CN114782555A (en) Map mapping method, apparatus, and storage medium
Van Den Hengel et al. Finding camera overlap in large surveillance networks
TWI526996B (en) Abnormal trade proofing electronic toll collecting method and system
CN110689726B (en) Traffic violation punishment evidence link completion method and equipment
CN113326445A (en) Companion relationship discovery method and device
JP6443144B2 (en) Information output device, information output program, information output method, and information output system
CN112788228A (en) Snapshot triggering system, method and device based on radar
US20230267638A1 (en) Shooting score identifying method and device, and electronic device
CN111862211B (en) Positioning method, device, system, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant