CN110415278B - Master-slave tracking method for a binocular PTZ (pan-tilt-zoom) visual system assisted by a linearly moving PTZ camera - Google Patents

Master-slave tracking method for a binocular PTZ (pan-tilt-zoom) visual system assisted by a linearly moving PTZ camera

Info

Publication number
CN110415278B
CN110415278B (application CN201910697079.0A)
Authority
CN
China
Prior art keywords
ptz camera
monitoring
image
suspected target
ptz
Prior art date
Legal status
Active
Application number
CN201910697079.0A
Other languages
Chinese (zh)
Other versions
CN110415278A (en)
Inventor
崔智高
苏延召
王涛
徐斌
蔡艳平
李庆辉
Current Assignee
Rocket Force University of Engineering of PLA
Original Assignee
Rocket Force University of Engineering of PLA
Priority date
Filing date
Publication date
Application filed by Rocket Force University of Engineering of PLA filed Critical Rocket Force University of Engineering of PLA
Priority to CN201910697079.0A priority Critical patent/CN110415278B/en
Publication of CN110415278A publication Critical patent/CN110415278A/en
Application granted granted Critical
Publication of CN110415278B publication Critical patent/CN110415278B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/292 Multi-camera tracking
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The invention discloses a master-slave tracking method for a binocular PTZ visual system assisted by a linearly moving PTZ camera, comprising the following steps: first, constructing a data acquisition platform of the binocular PTZ visual system assisted by the linearly moving PTZ camera; second, the fixed main monitoring PTZ camera controls the linearly moving PTZ camera to lock the suspected target, and the two cameras continuously track the suspected target under the short focal length; third, estimating the suspected target foreground areas in the fixed main monitoring PTZ camera and the linearly moving PTZ camera respectively; fourth, obtaining the absolute depth value of the suspected target foreground area; fifth, adjusting the accuracy of the depth estimation of the suspected target; and sixth, estimating the control parameters of the fixed slave monitoring PTZ camera and realizing active tracking of the suspected target under the long focal length. On the basis of a binocular PTZ visual system, the invention introduces a linearly moving PTZ camera, estimates the depth of the suspected target foreground area from coarse to fine, and estimates the control parameters of the fixed slave monitoring PTZ camera, thereby realizing master-slave tracking of the binocular PTZ visual system.

Description

Master-slave tracking method for a binocular PTZ (pan-tilt-zoom) visual system assisted by a linearly moving PTZ camera
Technical Field
The invention belongs to the technical field of video monitoring, and particularly relates to a master-slave tracking method for a binocular PTZ visual system assisted by a linearly moving PTZ camera.
Background
A binocular PTZ visual system consists of two PTZ cameras. In such a system, one PTZ camera serves as the main monitoring camera, monitoring the panorama and tracking the suspected target under low-resolution conditions; the other PTZ camera serves as the slave monitoring camera and is controlled by the main monitoring camera, tracking the suspected target under high-resolution conditions by continuously estimating and adjusting its PTZ parameters.
The existing master-slave tracking methods for binocular PTZ visual systems mainly comprise master-slave tracking based on spherical longitude-latitude coordinates and master-slave tracking based on a ground plane constraint. The method based on spherical longitude-latitude coordinates establishes a common spherical coordinate system, converts corresponding points in the two camera coordinate systems into spherical longitude and latitude coordinates, keeps the longitude values consistent, and uses the latitude values to measure the difference in viewing angle; the common spherical coordinate system thus serves as an intermediate bridge, and master-slave tracking of the two PTZ cameras is achieved through a series of coordinate conversions. The main disadvantage of this method is that the maximum depth and minimum depth of the monitored scene must be given in advance, and once these parameters are given inaccurately, a large tracking error results; in addition, the method only roughly estimates the three control parameters of the monitoring camera, namely horizontal rotation (pan), vertical rotation (tilt) and focal length change (zoom), from the depth range of the monitored scene and does not update the depth information of the suspected target in real time, so a large tracking error also occurs when the target depth in the scene changes greatly (for example in an indoor scene).
The master-slave tracking method based on the ground plane constraint establishes a coordinate association between the two cameras using the homography matrix determined by the ground plane, thereby realizing master-slave tracking of the binocular PTZ cameras. Unlike the preceding method, it does not require the maximum depth and minimum depth of the scene to be given in advance and can effectively accommodate changes in target depth. Its main defect is that it is only suitable when most of the monitored scene is a ground plane; for monitored scenes containing slopes, steps and the like, the plane constraint is not satisfied and the method fails or produces large tracking errors.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the above defects in the prior art and to provide a master-slave tracking method for a binocular PTZ visual system assisted by a linearly moving PTZ camera.
In order to solve the technical problems, the invention adopts the technical scheme that: the master-slave tracking method of the auxiliary binocular PTZ visual system of the linearly moving PTZ camera is characterized by comprising the following steps of:
step one, constructing a data acquisition platform of the auxiliary binocular PTZ visual system of the linear moving PTZ camera: installing a first fixed monitoring PTZ camera and a second fixed monitoring PTZ camera at equal-height positions on a vertical plate, installing a sliding rail on the vertical plate and positioned at the lower side of the first fixed monitoring PTZ camera and the second fixed monitoring PTZ camera, wherein the central axis of the sliding rail in the length direction is parallel to the central connecting line of the first fixed monitoring PTZ camera and the second fixed monitoring PTZ camera, installing a linear moving PTZ camera on the sliding rail, the first fixed monitoring PTZ camera, the second fixed monitoring PTZ camera and the linear moving PTZ camera are all connected with a computer, and the X axis, the Y axis and the Z axis of the camera coordinate systems of the first fixed monitoring PTZ camera, the second fixed monitoring PTZ camera and the linear moving PTZ camera are respectively parallel to each other;
the first fixed monitoring PTZ camera, the second fixed monitoring PTZ camera, the linear moving PTZ camera, the sliding rail, the vertical plate and the computer form a data acquisition platform of the linear moving PTZ camera auxiliary binocular PTZ visual system;
one of a first fixed monitoring PTZ camera and a second fixed monitoring PTZ camera in a data acquisition platform of the auxiliary binocular PTZ visual system of the linearly moving PTZ camera is used as a fixed main monitoring PTZ camera, and the other camera is used as a fixed slave monitoring PTZ camera;
step two, the fixed main monitoring PTZ camera controls the linearly moving PTZ camera to lock the suspected target, the two cameras continuously track the suspected target under the short focal length, and the process is as follows:
step 201, establishing, through offline sampling, the correspondence between the image coordinates of the fixed main monitoring PTZ camera and the initial control parameters of the linearly moving PTZ camera, and storing it in the computer in the form of a data table;
the control parameters comprise pan rotation parameters, tilt rotation parameters and zoom focal length parameters of the camera;
step 202, selecting a suspected target on a monitoring interface of a fixed main monitoring PTZ camera, inquiring a data table by a computer, and feeding back control parameters to the linearly moving PTZ camera, so that the linearly moving PTZ camera locks the suspected target under a short focal length;
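The data table of steps 201 and 202 can be pictured as a lookup structure mapping image coordinates of the fixed main monitoring PTZ camera to initial control parameters of the linearly moving PTZ camera. The following Python sketch is only an illustration of that idea; the class name, the nearest-neighbour query and the (pan, tilt, zoom) tuple layout are assumptions, not details taken from the patent.

import numpy as np

class ControlLookupTable:
    def __init__(self, samples):
        # samples: list of ((u, v), (pan, tilt, zoom)) gathered offline
        self.coords = np.array([s[0] for s in samples], dtype=float)
        self.params = np.array([s[1] for s in samples], dtype=float)

    def query(self, u, v):
        # return the control parameters of the closest sampled image point
        d = np.linalg.norm(self.coords - np.array([u, v], dtype=float), axis=1)
        return tuple(self.params[np.argmin(d)])

# usage: pan, tilt, zoom = table.query(u_target, v_target)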
step 203, the fixed main monitoring PTZ camera continuously tracks the suspected target under the short focal length by using a mean shift algorithm and calculates, frame by frame, the distance between the centroid of the suspected target and the image boundary; if this distance is smaller than a pixel threshold, the pan rotation parameter and tilt rotation parameter of the fixed main monitoring PTZ camera are updated according to a centering formula, so that the suspected target is locked at the image center position of the fixed main monitoring PTZ camera, and tracking of the suspected target under the short focal length continues with the mean shift algorithm; here (x1, y1) are the centroid coordinates of the suspected target, (u1, v1) are the image center coordinates of the fixed main monitoring PTZ camera, f1 is the equivalent focal length of the fixed main monitoring PTZ camera, Δp1 is the absolute angle by which the pan rotation parameter of the fixed main monitoring PTZ camera must change, Δt1 is the absolute angle by which the tilt rotation parameter of the fixed main monitoring PTZ camera must change, (p1, t1) are the pan rotation parameter and tilt rotation parameter of the fixed main monitoring PTZ camera before the change, and (p'1, t'1) are the pan rotation parameter and tilt rotation parameter of the fixed main monitoring PTZ camera after the change;
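The centering correction of step 203 is given in the patent only as a formula image. The sketch below shows one plausible form, assuming the usual arctangent relation between the pixel offset of the target centroid (x1, y1) from the image center (u1, v1), the equivalent focal length f1, and the absolute pan and tilt angle changes Δp1 and Δt1; the exact formula of the patent may differ.

import math

def recenter_pan_tilt(x1, y1, u1, v1, f1, p1, t1):
    """Assumed form of the step-203 correction: offsets of the target centroid
    (x1, y1) from the image center (u1, v1), scaled by the equivalent focal
    length f1, are converted to absolute pan/tilt angle changes."""
    dp1 = math.degrees(math.atan2(x1 - u1, f1))   # assumed pan correction
    dt1 = math.degrees(math.atan2(y1 - v1, f1))   # assumed tilt correction
    return p1 + dp1, t1 + dt1                     # updated (p1', t1')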
step 204, continuously tracking the suspected target by the linearly moving PTZ camera under the short focal length, wherein the tracking method is consistent with the method for continuously tracking the suspected target by the fixed main monitoring PTZ camera under the short focal length;
step three, estimating suspected target foreground areas in the fixed main monitoring PTZ camera and the linear moving PTZ camera respectively, wherein the process is as follows:
step 301, a similarity transformation relationship is established between corresponding points of the background areas of two adjacent frame images acquired by the fixed main monitoring PTZ camera; the point coordinates of a background point in the first of the two frames are related to the point coordinates of the corresponding background point in the second frame, both expressed in homogeneous coordinates; in the similarity transformation model, cx and dx are the transformation parameters in the horizontal direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera, and cy and dy are the transformation parameters in the vertical direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera;
step 302, one of the two adjacent frame images is divided into N equal parts in the horizontal direction, giving its set of block images in the horizontal direction; each block image in the horizontal direction has size (W, h), where W is the width of the image, h is the height of each block image, and h equals the image height H divided by N;
the corresponding horizontal direction sub-image set of the image is then constructed; each sub-image in the corresponding horizontal direction sub-image set has size (W, 2h), with the remaining sub-images obtained by analogy;
the average gray level vector of each sub-image in the corresponding horizontal direction is then computed according to a formula, where i is the pixel index of each sub-image in the horizontal direction and j is the pixel index of each sub-image in the vertical direction;
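A minimal Python sketch of the horizontal block division and average gray level vectors of step 302. The pairing of adjacent blocks into sub-images of height 2h and the column-wise averaging over the vertical index j are assumptions consistent with the stated sizes; the function name is illustrative.

import numpy as np

def horizontal_subimages_and_profiles(img, n_blocks):
    """Split a grayscale image (H x W) into n_blocks horizontal strips of
    height h = H // n_blocks, form sub-images of height 2h from adjacent
    strip pairs (assumed construction), and return one average gray level
    vector per sub-image (one value per column, averaged over rows)."""
    H, W = img.shape
    h = H // n_blocks
    profiles = []
    for k in range(n_blocks - 1):            # sub-image k covers strips k and k+1
        sub = img[k * h:(k + 2) * h, :]      # size (2h, W)
        profiles.append(sub.mean(axis=0))    # length-W average gray vector
    return profiles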
step 303, the corresponding horizontal direction sub-image set of the other adjacent frame image, and the average gray level vector of each of its sub-images in the horizontal direction, are constructed by the same process used in step 302 for the first image;
step 304, for the pair of sub-images at corresponding positions in the two adjacent frame images, local extrema of the gray level are searched in a traversal manner over the pair of average gray level vectors, where k is the sub-image index, k = 1, 2, ...;
according to a matching formula, a local gray level extremum on the sub-image of one frame and a local gray level extremum on the corresponding sub-image of the other frame form a pair of corresponding abscissas, where the two abscissas are the horizontal coordinates at which the respective local gray level extrema are located, and dis(·) is the distance function between two coordinates;
step 305, step 304 is repeated multiple times, traversing all sub-image pairs of the adjacent frame images to obtain the correspondence set of corresponding abscissas of all sub-image pairs;
Hough transform is then used to remove the outliers from the correspondence set of corresponding abscissas of all sub-image pairs of the adjacent frame images, giving an inlier set, where r is the inlier index, r = 1, 2, ..., M, and M is the number of corresponding points in the inlier set;
a system of equations is built from the inlier set and solved with a least squares algorithm, giving the optimal estimated values of the transformation parameters cx and dx in the horizontal direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera in the similarity transformation model;
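A minimal sketch of the least squares estimation of step 305, assuming the horizontal part of the similarity transformation has the linear form x' = cx·x + dx applied to each inlier abscissa pair; the Hough-based outlier removal is taken as already done.

import numpy as np

def fit_horizontal_params(inliers):
    """inliers: list of (x, x_next) abscissa pairs that survived the Hough-based
    outlier removal. Solves x_next ≈ cx * x + dx in the least-squares sense
    (assumed form of the equation system built from the inlier set)."""
    x = np.array([p[0] for p in inliers], dtype=float)
    x_next = np.array([p[1] for p in inliers], dtype=float)
    A = np.stack([x, np.ones_like(x)], axis=1)      # design matrix [x, 1]
    (cx, dx), *_ = np.linalg.lstsq(A, x_next, rcond=None)
    return cx, dx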
step 306, the two adjacent frame images are uniformly divided in the vertical direction, the average gray level vector of each sub-image is calculated, and the correspondence of local gray level extrema is estimated, giving the optimal estimated values of the transformation parameters cy and dy in the vertical direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera in the similarity transformation model; the process is consistent with that used for the two images in the horizontal direction;
the similarity transformation model is thereby fully estimated;
step 307, the similarity transformation model is applied to one of the two adjacent frame images to obtain a temporary image; a pixel level difference operation is then performed between the temporary image and the other adjacent frame image, and the pixel area whose gray difference result is not 0 and which lies inside the suspected target tracking rectangular frame is the suspected target foreground area of the fixed main monitoring PTZ camera;
step 308, the suspected target foreground area in the linearly moving PTZ camera is estimated by the same method as the suspected target foreground area of the fixed main monitoring PTZ camera;
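A minimal sketch of step 307, assuming the similarity transformation (cx, dx, cy, dy) can be applied as an axis-aligned scale-plus-translation warp; cv2.warpAffine and the small tolerance used instead of a strict "not 0" test are implementation choices, not part of the patent.

import cv2
import numpy as np

def foreground_from_difference(img_a, img_b, cx, dx, cy, dy, bbox, tol=10):
    """Warp img_a with the estimated similarity parameters, difference it with
    img_b at pixel level, and keep the changed pixels inside the tracking
    rectangle bbox = (x, y, w, h) as the suspected-target foreground mask."""
    M = np.float32([[cx, 0.0, dx],
                    [0.0, cy, dy]])                      # assumed warp matrix
    warped = cv2.warpAffine(img_a, M, (img_a.shape[1], img_a.shape[0]))
    diff = cv2.absdiff(warped, img_b)
    mask = np.zeros_like(diff, dtype=np.uint8)
    x, y, w, h = bbox
    roi = diff[y:y + h, x:x + w]
    mask[y:y + h, x:x + w] = (roi > tol).astype(np.uint8) * 255
    return mask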
step four, obtaining the absolute depth value of the foreground area of the suspected target, wherein the process is as follows:
step 401, performing stereo correction on a suspected target foreground area of a fixed main monitoring PTZ camera and a linear moving PTZ camera by using a spherical stereo correction algorithm, and then estimating a depth map of the suspected target foreground area by using a dynamic programming stereo matching algorithm;
step 402, the depth average value of the depth map is calculated, thereby obtaining the absolute depth value of the suspected target foreground area;
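A stand-in sketch for step four. The patent specifies a spherical stereo correction algorithm and a dynamic programming stereo matcher; the sketch below substitutes OpenCV's StereoSGBM on already rectified views and uses the usual pinhole relation depth = f·B/disparity, so it illustrates the depth-averaging idea rather than the patent's exact algorithms.

import cv2
import numpy as np

def mean_foreground_depth(rect_left, rect_right, fg_mask, focal_px, baseline_m):
    """Compute a disparity map on rectified short-focal-length views (StereoSGBM
    replaces the patent's dynamic-programming matcher), convert it to depth with
    depth = f * B / disparity, and average over the foreground mask."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    disp = matcher.compute(rect_left, rect_right).astype(np.float32) / 16.0
    valid = (disp > 0) & (fg_mask > 0)
    if not np.any(valid):
        return None
    depth = focal_px * baseline_m / disp[valid]
    return float(depth.mean())                      # absolute depth value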
Step five, adjusting the accuracy of the depth estimation of the suspected target, wherein the process is as follows:
step 501, the accuracy of the depth estimation of the suspected target is calculated according to a formula whose quantities are a constant ε, the initial values of the zoom focal length parameters of the fixed main monitoring PTZ camera and the linearly moving PTZ camera, and the distance from the center of the initial position of the linearly moving PTZ camera to the center of the fixed main monitoring PTZ camera;
step 502, whether the accuracy of the depth estimation of the suspected target meets the requirement is judged against the accuracy threshold λΔ; if the requirement is met, step six is executed; otherwise step 503 is executed;
step 503, the position of the linearly moving PTZ camera on the sliding rail is adjusted according to a formula so that the distance from the center of the linearly moving PTZ camera to the center of the fixed main monitoring PTZ camera takes the adjusted value, where α and β are constants determined by experiments according to the monitoring scene;
step 504, updating the control parameters of the linearly moving PTZ camera, the process is as follows:
step 5041, using the absolute depth value of the suspected target foreground area at the current time and the imaging model of the fixed main monitoring PTZ camera, the three-dimensional coordinates of the suspected target in the fixed main monitoring PTZ camera are calculated;
step 5042, according to a coordinate conversion formula, the three-dimensional coordinates of the suspected target in the linearly moving PTZ camera after the linearly moving PTZ camera reaches the designated position are calculated, namely the coordinate values of the suspected target on the X axis, Y axis and Z axis of the three-dimensional coordinate system of the linearly moving PTZ camera when the linearly moving PTZ camera reaches the designated position;
step 5043, the updated pan rotation parameter and the updated tilt rotation parameter of the linearly moving PTZ camera are calculated according to a formula;
step 505, the computer feeds the updated pan rotation parameter and the updated tilt rotation parameter of the linearly moving PTZ camera back to the linearly moving PTZ camera, so that the suspected target is locked again at the image center position of the linearly moving PTZ camera, and tracking of the suspected target under the short focal length continues with the mean shift algorithm; the procedure loops back to step 203 until the accuracy of the depth estimation of the suspected target meets the requirement, at which point the final absolute depth value of the suspected target foreground area is obtained;
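The accuracy formula of step 501 and the position adjustment of step 503 appear in the patent only as formula images, so the loop of steps 501 to 505 is sketched here with those pieces left abstract as user-supplied callables; only the control flow (re-estimate, test, move the rail, repeat) reflects the text above.

def coarse_to_fine_depth(estimate_depth, accuracy_ok, next_baseline,
                         move_rail_to, baseline0):
    """Abstract sketch of steps 501-505: re-estimate the target depth while
    adjusting the rail baseline until the step-502 accuracy test passes.
    All callables are placeholders for the patent's formulas."""
    baseline = baseline0
    depth = estimate_depth(baseline)                # steps three and four at current baseline
    while not accuracy_ok(depth, baseline):         # step 502 threshold test
        baseline = next_baseline(depth, baseline)   # step 503 position adjustment
        move_rail_to(baseline)                      # reposition the camera on the rail
        depth = estimate_depth(baseline)            # steps 504-505: re-lock, re-estimate
    return depth                                    # final absolute depth value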
step six, estimating the control parameters of the fixed slave monitoring PTZ camera and realizing active tracking of the suspected target under the long focal length, wherein the process is as follows:
step 601, using the final absolute depth value of the suspected target foreground area at the current time and the imaging model of the fixed main monitoring PTZ camera, the three-dimensional coordinates of the suspected target in the fixed main monitoring PTZ camera are calculated;
step 602, the three-dimensional coordinates of the suspected target in the fixed slave monitoring PTZ camera are calculated according to a formula, where b13 is the distance between the fixed main monitoring PTZ camera and the fixed slave monitoring PTZ camera, and the results are the coordinate values of the suspected target on the X axis, Y axis and Z axis of the three-dimensional coordinate system of the fixed slave monitoring PTZ camera;
step 603, the pan rotation parameter and the tilt rotation parameter of the fixed slave monitoring PTZ camera are calculated according to a formula;
step 604, the zoom focal length parameter of the fixed slave monitoring PTZ camera is assigned its actual value under the long focal length according to the specific monitoring scene;
step 605, the computer feeds the pan rotation parameter, tilt rotation parameter and zoom focal length parameter of the fixed slave monitoring PTZ camera back to the fixed slave monitoring PTZ camera, thereby realizing active tracking of the suspected target under the long focal length.
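A minimal sketch of steps 602 and 603, assuming that with parallel camera axes the target's coordinates transfer to the fixed slave monitoring PTZ camera by a shift of the baseline b13 along the X axis, and that pan and tilt follow from arctangents of the coordinate ratios; the sign convention and the arctangent form are assumptions.

import math

def slave_pan_tilt(X1, Y1, Z1, b13):
    """Assumed form of steps 602-603: shift the target's master-camera
    coordinates along X by the baseline b13 to get its slave-camera
    coordinates, then convert them to pan and tilt angles."""
    X3, Y3, Z3 = X1 - b13, Y1, Z1              # assumed sign convention for the shift
    pan3 = math.degrees(math.atan2(X3, Z3))    # rotation about the vertical axis
    tilt3 = math.degrees(math.atan2(Y3, Z3))   # rotation about the horizontal axis
    return pan3, tilt3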
The master-slave tracking method of the auxiliary binocular PTZ visual system of the linear moving PTZ camera is characterized in that: in step 402, before the depth map is subjected to depth average value calculation, the depth map is subjected to normalization and median filtering in sequence.
The master-slave tracking method of the auxiliary binocular PTZ visual system of the linear moving PTZ camera is characterized in that: the short focal length ranges from 1× to 3× optical zoom of the PTZ camera, and the long focal length ranges from 15× to 36× optical zoom of the PTZ camera.
The master-slave tracking method of the auxiliary binocular PTZ visual system of the linear moving PTZ camera is characterized in that: the value range of the pixel threshold is 30-40 pixels.
Compared with the prior art, the invention has the following advantages:
1. according to the method, the data acquisition platform of the auxiliary binocular PTZ visual system of the linearly moving PTZ camera is constructed, the linearly moving PTZ camera is further introduced on the basis of the fixedly installed binocular PTZ visual system, the linearly moving PTZ camera is matched with the fixed main monitoring PTZ camera, the depth of the foreground area of the suspected target is estimated from coarse to fine, the absolute depth value of the foreground area of the suspected target is further acquired, the real-time requirement of master-slave tracking of the binocular PTZ visual system is met to the maximum extent, and the method is convenient to popularize and use.
2. When the absolute depth value of the suspected target foreground area is obtained, the block image sets are respectively obtained in the horizontal direction and the vertical direction of the image, the average gray vector of each sub-image in the horizontal direction and the vertical direction corresponding to the image is calculated, the local extreme value of gray is searched in a traversing mode in the average gray vector pair of the sub-image at the position corresponding to the adjacent frame image, the similar transformation model of the adjacent frame image of the PTZ camera is further estimated, the pixel level difference operation is carried out on the adjacent frame image of the PTZ camera by utilizing the similar transformation model, and therefore the depth estimation result of the suspected target foreground area is obtained, the method is reliable and stable, and the using effect is good.
3. The method has simple steps: the depth of the suspected target foreground area is estimated from coarse to fine to obtain its absolute depth value, and if the estimated depth accuracy does not meet the index requirement, the linearly moving PTZ camera is moved to a specified position and the depth and depth accuracy of the suspected target foreground area are recalculated until the requirement is met; the maximum depth and minimum depth parameters of the monitored scene do not need to be input in advance, and the depth information is updated in real time as the target depth changes, so the estimation of the control parameters of the fixed slave monitoring PTZ camera is more accurate; in addition, no ground plane assumption about the monitored scene is needed, so the method still applies to monitored scenes containing slopes, steps and the like, achieves high tracking accuracy, and is convenient to popularize and use.
In summary, the invention further introduces a linearly movable PTZ camera on the basis of a fixedly installed binocular PTZ visual system, the linearly movable PTZ camera is matched with a fixed main monitoring PTZ camera to estimate the depth of the foreground area of the suspected target from coarse to fine, so as to obtain the absolute depth value of the foreground area of the suspected target, and further estimates the control parameters of the fixed auxiliary monitoring PTZ camera on the basis of the depth information, thereby realizing the master-slave tracking of the binocular PTZ visual system and being convenient for popularization and use.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
Fig. 1 is a schematic structural diagram of a data acquisition platform of an auxiliary binocular PTZ vision system of a linearly moving PTZ camera according to the present invention.
FIG. 2 is a block diagram of a method flow of the method of the present invention.
Fig. 3 is a diagram of the tracking effect of the fixed main monitoring PTZ camera on a suspected target under a short focal length according to the present invention.
Fig. 4 is a graph of the tracking effect of the suspected target advancing in fig. 3.
Fig. 5 is a graph of the tracking effect of the suspected target advancing in fig. 4.
FIG. 6 is a graph of the tracking effect of the fixed slave surveillance PTZ camera on the suspected target under the long focal length in the invention.
Fig. 7 is a graph of the tracking effect of the suspected target advancing in fig. 6.
Fig. 8 is a graph of the tracking effect of the suspected target advancing in fig. 7.
Description of reference numerals:
1 — a first stationary surveillance PTZ camera; 2 — a second fixed monitoring PTZ camera;
3-linearly moving PTZ camera; 4, a sliding rail;
5-vertical plate.
Detailed Description
As shown in fig. 1 and 2, the master-slave tracking method of the linearly moving PTZ camera assisted binocular PTZ vision system of the present invention comprises the steps of:
step one, constructing a data acquisition platform of the auxiliary binocular PTZ visual system of the linear moving PTZ camera: a first fixed monitoring PTZ camera 1 and a second fixed monitoring PTZ camera 2 are installed at equal-height positions on a vertical plate 5, a sliding rail 4 is installed on the vertical plate 5 and positioned at the lower side of the first fixed monitoring PTZ camera 1 and the lower side of the second fixed monitoring PTZ camera 2, the central axis of the sliding rail 4 in the length direction is parallel to the central connecting line of the first fixed monitoring PTZ camera 1 and the second fixed monitoring PTZ camera 2, a linear moving PTZ camera 3 is installed on the sliding rail 4, the first fixed monitoring PTZ camera 1, the second fixed monitoring PTZ camera 2 and the linear moving PTZ camera 3 are all connected with a computer, and the X axis, the Y axis and the Z axis of the camera coordinate system of the first fixed monitoring PTZ camera 1, the second fixed monitoring PTZ camera 2 and the linear moving PTZ camera 3 are respectively parallel to each other;
the first fixed monitoring PTZ camera 1, the second fixed monitoring PTZ camera 2, the linearly moving PTZ camera 3, the sliding rail 4, the vertical plate 5 and the computer form a data acquisition platform of the linearly moving PTZ camera auxiliary binocular PTZ visual system;
in a data acquisition platform of the linear moving PTZ camera auxiliary binocular PTZ visual system, one of a first fixed monitoring PTZ camera 1 and a second fixed monitoring PTZ camera 2 is used as a fixed main monitoring PTZ camera, and the other camera is used as a fixed slave monitoring PTZ camera;
it should be noted that by constructing a data acquisition platform of the auxiliary binocular PTZ visual system of the linearly moving PTZ camera, on the basis of the fixedly installed binocular PTZ visual system, a linearly moving PTZ camera 3 is further introduced, the linearly moving PTZ camera 3 is matched with a fixed main monitoring PTZ camera, the depth of a foreground area of a suspected target is estimated from coarse to fine, the absolute depth value of the foreground area of the suspected target is further acquired, and the real-time performance of master-slave tracking of the binocular PTZ visual system is met to the maximum extent.
Step two, the fixed main monitoring PTZ camera controls the linearly moving PTZ camera to lock the suspected target, the two cameras continuously track the suspected target under the short focal length, and the process is as follows:
step 201, establishing, through offline sampling, the correspondence between the image coordinates of the fixed main monitoring PTZ camera and the initial control parameters of the linearly moving PTZ camera 3, and storing it in the computer in the form of a data table;
the control parameters comprise pan rotation parameters, tilt rotation parameters and zoom focal length parameters of the camera;
step 202, selecting a suspected target on a monitoring interface of a fixed main monitoring PTZ camera, inquiring a data table by a computer, and feeding back control parameters to the linearly moving PTZ camera 3, so that the linearly moving PTZ camera 3 locks the suspected target under a short focal length;
step 203, the fixed main monitoring PTZ camera continuously tracks the suspected target under the short focal length by using a mean shift algorithm and calculates, frame by frame, the distance between the centroid of the suspected target and the image boundary; if this distance is smaller than a pixel threshold, the pan rotation parameter and tilt rotation parameter of the fixed main monitoring PTZ camera are updated according to a centering formula, so that the suspected target is locked at the image center position of the fixed main monitoring PTZ camera, and tracking of the suspected target under the short focal length continues with the mean shift algorithm; here (x1, y1) are the centroid coordinates of the suspected target, (u1, v1) are the image center coordinates of the fixed main monitoring PTZ camera, f1 is the equivalent focal length of the fixed main monitoring PTZ camera, Δp1 is the absolute angle by which the pan rotation parameter of the fixed main monitoring PTZ camera must change, Δt1 is the absolute angle by which the tilt rotation parameter of the fixed main monitoring PTZ camera must change, (p1, t1) are the pan rotation parameter and tilt rotation parameter of the fixed main monitoring PTZ camera before the change, and (p'1, t'1) are the pan rotation parameter and tilt rotation parameter of the fixed main monitoring PTZ camera after the change;
in this embodiment, the value range of the pixel threshold is 30 pixels to 40 pixels, and in actual use, the preferred pixel threshold is 40 pixels.
Step 204, continuously tracking the suspected target by the linearly moving PTZ camera 3 under the short focal length, wherein the tracking method is consistent with the method for continuously tracking the suspected target by the fixed main monitoring PTZ camera under the short focal length;
step three, estimating suspected target foreground areas in the fixed main monitoring PTZ camera and the linear moving PTZ camera respectively, wherein the process is as follows:
step 301, a similarity transformation relationship is established between corresponding points of the background areas of two adjacent frame images acquired by the fixed main monitoring PTZ camera; the point coordinates of a background point in the first of the two frames are related to the point coordinates of the corresponding background point in the second frame, both expressed in homogeneous coordinates; in the similarity transformation model, cx and dx are the transformation parameters in the horizontal direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera, and cy and dy are the transformation parameters in the vertical direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera;
step 302, one of the two adjacent frame images is divided into N equal parts in the horizontal direction, giving its set of block images in the horizontal direction; each block image in the horizontal direction has size (W, h), where W is the width of the image, h is the height of each block image, and h equals the image height H divided by N;
the corresponding horizontal direction sub-image set of the image is then constructed; each sub-image in the corresponding horizontal direction sub-image set has size (W, 2h), with the remaining sub-images obtained by analogy;
the average gray level vector of each sub-image in the corresponding horizontal direction is then computed according to a formula, where i is the pixel index of each sub-image in the horizontal direction and j is the pixel index of each sub-image in the vertical direction;
step 303, the corresponding horizontal direction sub-image set of the other adjacent frame image, and the average gray level vector of each of its sub-images in the horizontal direction, are constructed by the same process used in step 302 for the first image;
step 304, for the pair of sub-images at corresponding positions in the two adjacent frame images, local extrema of the gray level are searched in a traversal manner over the pair of average gray level vectors, where k is the sub-image index, k = 1, 2, ...;
according to a matching formula, a local gray level extremum on the sub-image of one frame and a local gray level extremum on the corresponding sub-image of the other frame form a pair of corresponding abscissas, where the two abscissas are the horizontal coordinates at which the respective local gray level extrema are located, and dis(·) is the distance function between two coordinates;
step 305, step 304 is repeated multiple times, traversing all sub-image pairs of the adjacent frame images to obtain the correspondence set of corresponding abscissas of all sub-image pairs;
Hough transform is then used to remove the outliers from the correspondence set of corresponding abscissas of all sub-image pairs of the adjacent frame images, giving an inlier set, where r is the inlier index, r = 1, 2, ..., M, and M is the number of corresponding points in the inlier set;
a system of equations is built from the inlier set and solved with a least squares algorithm, giving the optimal estimated values of the transformation parameters cx and dx in the horizontal direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera in the similarity transformation model;
step 306, the two adjacent frame images are uniformly divided in the vertical direction, the average gray level vector of each sub-image is calculated, and the correspondence of local gray level extrema is estimated, giving the optimal estimated values of the transformation parameters cy and dy in the vertical direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera in the similarity transformation model; the process is consistent with that used for the two images in the horizontal direction;
the similarity transformation model is thereby fully estimated;
step 307, the similarity transformation model is applied to one of the two adjacent frame images to obtain a temporary image; a pixel level difference operation is then performed between the temporary image and the other adjacent frame image, and the pixel area whose gray difference result is not 0 and which lies inside the suspected target tracking rectangular frame is the suspected target foreground area of the fixed main monitoring PTZ camera;
in practical use, usually, a standard rectangular frame is adopted to lock a suspected target, the standard rectangular frame comprises a foreground area and a background area, if depth estimation is directly performed in the rectangular frame area, not only is the calculation complexity high, but also the background area can cause huge interference to the depth estimation of the suspected target, therefore, the depth estimation is performed only in the foreground area of the suspected target, and the real-time effect is good.
Step 308, estimating a suspected target foreground area in the linearly moving PTZ camera, wherein the estimation method is consistent with the estimation method of the suspected target foreground area of the fixed main monitoring PTZ camera;
it should be noted that the depth of the suspected target foreground area is estimated from coarse to fine to obtain its absolute depth value; if the estimated depth accuracy does not meet the index requirement, the linearly moving PTZ camera is moved to a specified position and the depth and depth accuracy of the suspected target foreground area are recalculated until the requirement is met; the maximum depth and minimum depth parameters of the monitored scene do not need to be input in advance, and the depth information is updated in real time as the target depth changes, so the estimation of the control parameters of the fixed slave monitoring PTZ camera is more accurate; in addition, no ground plane assumption about the monitored scene is needed, so the method still applies to monitored scenes containing slopes, steps and the like and achieves high tracking accuracy.
Step four, obtaining the absolute depth value of the foreground area of the suspected target, wherein the process is as follows:
step 401, performing stereo correction on a suspected target foreground area of a fixed main monitoring PTZ camera and a linear moving PTZ camera 3 by using a spherical stereo correction algorithm, and then estimating a depth map of the suspected target foreground area by using a dynamic programming stereo matching algorithm;
step 402, calculating a depth average value for the depth map, thereby obtaining the absolute depth value of the suspected target foreground area;
In this embodiment, before the depth map is subjected to the depth average calculation in step 402, the depth map is subjected to normalization and median filtering in sequence.
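A minimal sketch of this preprocessing, assuming cv2.normalize and cv2.medianBlur as ordinary stand-ins for the normalization and median filtering of the embodiment; the 5×5 kernel and the [0, 1] normalization range are illustrative choices.

import cv2
import numpy as np

def preprocessed_mean_depth(depth_map, fg_mask):
    """Embodiment of step 402: normalize the depth map, median-filter it,
    then average it over the suspected-target foreground mask."""
    norm = cv2.normalize(depth_map.astype(np.float32), None, 0.0, 1.0,
                         cv2.NORM_MINMAX)
    filtered = cv2.medianBlur(norm, 5)          # 5x5 median filter (assumed size)
    values = filtered[fg_mask > 0]
    return float(values.mean()) if values.size else None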
Step five, adjusting the accuracy of the depth estimation of the suspected target, wherein the process is as follows:
step 501, the accuracy of the depth estimation of the suspected target is calculated according to a formula whose quantities are a constant ε, the initial values of the zoom focal length parameters of the fixed main monitoring PTZ camera and the linearly moving PTZ camera 3, and the distance from the center of the initial position of the linearly moving PTZ camera 3 to the center of the fixed main monitoring PTZ camera;
step 502, whether the accuracy of the depth estimation of the suspected target meets the requirement is judged against the accuracy threshold λΔ; if the requirement is met, step six is executed; otherwise step 503 is executed;
step 503, the position of the linearly moving PTZ camera 3 on the slide rail 4 is adjusted according to a formula so that the distance from the center of the linearly moving PTZ camera 3 to the center of the fixed main monitoring PTZ camera takes the adjusted value, where α and β are constants determined by experiments according to the monitoring scene;
step 504, updating the control parameters of the linearly moving PTZ camera, the process is as follows:
step 5041, using the absolute depth value of the suspected target foreground area at the current time and the imaging model of the fixed main monitoring PTZ camera, the three-dimensional coordinates of the suspected target in the fixed main monitoring PTZ camera are calculated;
step 5042, according to a coordinate conversion formula, the three-dimensional coordinates of the suspected target in the linearly moving PTZ camera 3 after the linearly moving PTZ camera 3 reaches the designated position are calculated, namely the coordinate values of the suspected target on the X axis, Y axis and Z axis of the three-dimensional coordinate system of the linearly moving PTZ camera 3 when it reaches the designated position;
step 5043, the updated pan rotation parameter and the updated tilt rotation parameter of the linearly moving PTZ camera 3 are calculated according to a formula;
step 505, the computer feeds the updated pan rotation parameter and the updated tilt rotation parameter of the linearly moving PTZ camera 3 back to the linearly moving PTZ camera 3, so that the suspected target is locked again at the image center position of the linearly moving PTZ camera 3, and tracking of the suspected target under the short focal length continues with the mean shift algorithm; the procedure loops back to step 203 until the accuracy of the depth estimation of the suspected target meets the requirement, at which point the final absolute depth value of the suspected target foreground area is obtained;
It should be noted that, when the absolute depth value of the foreground region of the suspected target is obtained, a set of block images is obtained in the horizontal direction and the vertical direction of the image respectively, an average gray vector of each sub-image in the horizontal direction and the vertical direction corresponding to the image is calculated, a local extreme value of gray is searched in a traversal manner in the average gray vector pair of the sub-image at the position corresponding to the adjacent frame image, a similarity transformation model of the adjacent frame image of the PTZ camera is estimated, and then the pixel level difference operation is performed on the adjacent frame image of the PTZ camera by using the similarity transformation model, so that the depth estimation result of the foreground region of the suspected target is obtained, which is reliable and stable, and has good use effect.
step six, estimating the control parameters of the fixed slave monitoring PTZ camera and realizing active tracking of the suspected target under the long focal length, wherein the process is as follows:
step 601, using the final absolute depth value of the suspected target foreground area at the current time and the imaging model of the fixed main monitoring PTZ camera, the three-dimensional coordinates of the suspected target in the fixed main monitoring PTZ camera are calculated;
step 602, the three-dimensional coordinates of the suspected target in the fixed slave monitoring PTZ camera are calculated according to a formula, where b13 is the distance between the fixed main monitoring PTZ camera and the fixed slave monitoring PTZ camera, and the results are the coordinate values of the suspected target on the X axis, Y axis and Z axis of the three-dimensional coordinate system of the fixed slave monitoring PTZ camera;
step 603, the pan rotation parameter and the tilt rotation parameter of the fixed slave monitoring PTZ camera are calculated according to a formula;
step 604, the zoom focal length parameter of the fixed slave monitoring PTZ camera is assigned its actual value under the long focal length according to the specific monitoring scene;
step 605, the computer feeds the pan rotation parameter, tilt rotation parameter and zoom focal length parameter of the fixed slave monitoring PTZ camera back to the fixed slave monitoring PTZ camera, thereby realizing active tracking of the suspected target under the long focal length.
In this embodiment, the short focal length ranges from 1× to 3× optical zoom of the PTZ camera, and the long focal length ranges from 15× to 36× optical zoom of the PTZ camera.
When the method is used, as shown in fig. 3 to 5, the fixed main monitoring PTZ camera and the linearly moving PTZ camera 3 keep the minimum focal length, that is, the suspected target is continuously tracked during movement under the short focal length, and the linearly moving PTZ camera 3 provides the fixed main monitoring PTZ camera with auxiliary depth estimation. The final absolute depth value of the suspected target foreground area at the current time and the imaging model of the fixed main monitoring PTZ camera are used to calculate the three-dimensional coordinates of the suspected target in the fixed main monitoring PTZ camera, and from them the three-dimensional coordinates of the suspected target in the fixed slave monitoring PTZ camera. The zoom focal length parameter of the fixed slave monitoring PTZ camera is set, according to the specific monitoring scene, to its actual value under the long focal length, and the computer feeds the control parameters of the fixed slave monitoring PTZ camera back to it, thereby realizing active tracking of the suspected target under the long focal length; as shown in fig. 6 to 8, the effect of master-slave tracking is good.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and all simple modifications, changes and equivalent structural changes made to the above embodiment according to the technical spirit of the present invention still fall within the protection scope of the technical solution of the present invention.

Claims (4)

1. The master-slave tracking method of the auxiliary binocular PTZ visual system of the linearly moving PTZ camera is characterized by comprising the following steps of:
step one, constructing a data acquisition platform of the auxiliary binocular PTZ visual system of the linear moving PTZ camera: a first fixed monitoring PTZ camera (1) and a second fixed monitoring PTZ camera (2) are installed at equal-height positions on a vertical plate (5), a sliding rail (4) is installed on the vertical plate (5) and located on the lower side of the first fixed monitoring PTZ camera (1) and the lower side of the second fixed monitoring PTZ camera (2), the central axis of the sliding rail (4) in the length direction is parallel to the central connecting line of the first fixed monitoring PTZ camera (1) and the second fixed monitoring PTZ camera (2), a linear moving PTZ camera (3) is installed on the sliding rail (4), the first fixed monitoring PTZ camera (1), the second fixed monitoring PTZ camera (2) and the linear moving PTZ camera (3) are all connected with a computer, and the X axis, the Y axis and the Z axis of a camera coordinate system of the first fixed monitoring PTZ camera (1), the second fixed monitoring PTZ camera (2) and the linear moving PTZ camera (3) are respectively parallel to each other;
the first fixed monitoring PTZ camera (1), the second fixed monitoring PTZ camera (2), the linear moving PTZ camera (3), the sliding rail (4), the vertical plate (5) and the computer form a data acquisition platform of the auxiliary binocular PTZ visual system of the linear moving PTZ camera;
in a data acquisition platform of the auxiliary binocular PTZ visual system of the linearly moving PTZ camera, any one of a first fixed monitoring PTZ camera (1) and a second fixed monitoring PTZ camera (2) is used as a fixed main monitoring PTZ camera, and the other camera is used as a fixed slave monitoring PTZ camera;
step two, the fixed main monitoring PTZ camera controls the linearly moving PTZ camera to lock the suspected target, the two cameras continuously track the suspected target under the short focal length, and the process is as follows:
step 201, establishing, through offline sampling, the correspondence between the image coordinates of the fixed main monitoring PTZ camera and the initial control parameters of the linearly moving PTZ camera (3), and storing it in the computer in the form of a data table;
the control parameters comprise pan rotation parameters, tilt rotation parameters and zoom focal length parameters of the camera;
step 202, selecting a suspected target on a monitoring interface of a fixed main monitoring PTZ camera, inquiring a data table by a computer, and feeding back control parameters to the linearly moving PTZ camera (3), so that the linearly moving PTZ camera (3) locks the suspected target under a short focal length;
step 203, the fixed main monitoring PTZ camera continuously tracks the suspected target under the short focal length by using a mean shift algorithm and calculates, frame by frame, the distance between the centroid of the suspected target and the image boundary; if the distance between the centroid of the suspected target and the image boundary is smaller than a pixel threshold, the pan rotation parameter and the tilt rotation parameter of the fixed main monitoring PTZ camera are updated by the formulas
Δp1 = arctan((x1 − u1)/f1), Δt1 = arctan((y1 − v1)/f1), p′1 = p1 + Δp1, t′1 = t1 + Δt1,
so that the suspected target is locked again at the image center of the fixed main monitoring PTZ camera and continues to be tracked under the short focal length by the mean shift algorithm, wherein (x1, y1) are the centroid coordinates of the suspected target, (u1, v1) are the image center coordinates of the fixed main monitoring PTZ camera, f1 is the equivalent focal length of the fixed main monitoring PTZ camera, Δp1 is the absolute angle by which the pan rotation parameter of the fixed main monitoring PTZ camera must change, Δt1 is the absolute angle by which the tilt rotation parameter of the fixed main monitoring PTZ camera must change, (p1, t1) are the pan rotation parameter and tilt rotation parameter of the fixed main monitoring PTZ camera before the change, and (p′1, t′1) are the pan rotation parameter and tilt rotation parameter of the fixed main monitoring PTZ camera after the change;
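A minimal sketch of the re-centering update of step 203, written directly from the variable definitions above; the arctangent reading of the formula image, the threshold value and all numbers are assumptions:

import math

def recenter_ptz(x1, y1, u1, v1, f1, p1, t1, pixel_threshold, width, height):
    """Step 203 sketch: if the target centroid drifts too close to the image border,
    update pan/tilt so that the centroid is brought back to the image center."""
    dist_to_border = min(x1, y1, width - x1, height - y1)
    if dist_to_border >= pixel_threshold:
        return p1, t1  # target still comfortably inside the image, no update
    dp1 = math.degrees(math.atan((x1 - u1) / f1))  # absolute pan change
    dt1 = math.degrees(math.atan((y1 - v1) / f1))  # absolute tilt change
    return p1 + dp1, t1 + dt1

# Example: 640x480 image, centroid near the right border, 35-pixel threshold (claim 4 allows 30-40).
print(recenter_ptz(x1=620, y1=250, u1=320, v1=240, f1=800.0,
                   p1=10.0, t1=-2.0, pixel_threshold=35, width=640, height=480))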
step 204, continuously tracking the suspected target by the linearly moving PTZ camera (3) under the short focal length, wherein the tracking method is consistent with the method for continuously tracking the suspected target by the fixed main monitoring PTZ camera under the short focal length;
step three, estimating the suspected target foreground areas in the fixed main monitoring PTZ camera and the linearly moving PTZ camera respectively, the process being as follows:
step 301, acquiring adjacent frame images I_t and I_t+1 of the fixed main monitoring PTZ camera and establishing a similarity transformation relationship between corresponding points of their background areas, i.e.
(x′, y′, 1)^T = Hs · (x, y, 1)^T, with Hs = [cx 0 dx; 0 cy dy; 0 0 1],
wherein (x, y, 1)^T are the point coordinates of the background area of image I_t, (x′, y′, 1)^T are the coordinates of the corresponding points of the background area of image I_t+1, both expressed in homogeneous coordinates, Hs is the similarity transformation model, cx and dx are the transformation parameters in the horizontal direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera, and cy and dy are the transformation parameters in the vertical direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera;
step 302, dividing the image I_t into N equal parts in the horizontal direction to obtain the horizontal direction block image set {B1, B2, ..., BN} of image I_t, the size of each block image in the horizontal direction being (W, h), wherein W is the width of image I_t, h is the height of each block image with h = H/N, and H is the height of image I_t;
constructing the horizontal direction sub-image set {S1, S2, ..., S(N−1)} corresponding to image I_t, each sub-image in the horizontal direction sub-image set having a size of (W, 2h), i.e. S1 is formed by the adjacent blocks B1 and B2, S2 by B2 and B3, and so on up to S(N−1), formed by B(N−1) and BN;
computing, according to the formula
gk(i) = (1/(2h)) · Σ(j = 1 to 2h) Sk(i, j),
the average gray level vector gk of each sub-image Sk in the horizontal direction sub-image set corresponding to image I_t, wherein i is the pixel number of each sub-image in the horizontal direction and j is the pixel number of each sub-image in the vertical direction;
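For intuition only, the block and sub-image construction of step 302 might look like the sketch below; the row-major array layout, the value of N and integer division of the image height are assumptions:

import numpy as np

def horizontal_subimage_profiles(image: np.ndarray, n_blocks: int):
    """Step 302 sketch: split a grayscale image (H x W) into N horizontal strips,
    form N-1 overlapping sub-images of two adjacent strips each, and return the
    column-averaged gray level vector g_k(i) of every sub-image."""
    H, W = image.shape
    h = H // n_blocks                                  # block height h = H / N
    blocks = [image[k * h:(k + 1) * h, :] for k in range(n_blocks)]
    profiles = []
    for k in range(n_blocks - 1):
        sub = np.vstack([blocks[k], blocks[k + 1]])    # sub-image S_k of size (2h, W)
        profiles.append(sub.mean(axis=0))              # g_k(i): average over the 2h rows
    return profiles

# Example with a synthetic 480x640 image and N = 8 strips.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(480, 640)).astype(np.float32)
g = horizontal_subimage_profiles(img, n_blocks=8)
print(len(g), g[0].shape)   # 7 profiles, each of length 640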
step 303, constructing the horizontal direction sub-image set {S′1, S′2, ..., S′(N−1)} corresponding to image I_t+1 and the average gray level vector g′k of each of its sub-images in the horizontal direction, the process being consistent with the process of constructing the horizontal direction sub-image set and average gray level vectors corresponding to image I_t;
step 304, for the pair of average gray level vectors gk and g′k of the sub-images at corresponding positions in the adjacent frame images I_t and I_t+1, searching the local extreme values of the gray level by traversal, wherein k is the sub-image number and k = 1, ..., N−1;
matching the local extrema according to the distance dis(·) between their abscissae, a local extremum of the gray level on sub-image Sk of image I_t and a local extremum of the gray level on sub-image S′k of image I_t+1 form a pair of corresponding abscissae (a, a′), wherein a is the abscissa of the local extremum of the gray level on sub-image Sk of image I_t, a′ is the abscissa of the local extremum of the gray level on sub-image S′k of image I_t+1, and dis(·) is the distance function between the two coordinates;
step 305, repeating step 304 to traverse all sub-image pairs of the adjacent frame images I_t and I_t+1, obtaining the set of correspondences between the corresponding abscissae of all sub-image pairs;
then removing the outer points of this correspondence set of the adjacent frame images I_t and I_t+1 by Hough transformation, obtaining the inner point set {(ar, a′r)}, wherein r is the inner point number, r = 1, 2, ..., M, and M is the number of corresponding points in the inner point set;
building, according to the inner point set, the equation set a′r = cx·ar + dx, r = 1, 2, ..., M, and solving the equation set constructed from the inner point set by a least squares algorithm, thereby obtaining the optimal estimated values of the transformation parameters cx and dx in the horizontal direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera in the similarity transformation model;
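A rough rendition of steps 304-305: find 1-D gray level extrema, pair them by abscissa distance, reject outliers, then least-squares fit cx and dx. The nearest-abscissa matching rule and the RANSAC-style consensus used here in place of the Hough transformation named in the claim are simplifying assumptions:

import numpy as np
from scipy.signal import argrelextrema

def match_extrema(g_t: np.ndarray, g_t1: np.ndarray, max_dist: float = 40.0):
    """Step 304 sketch: pair local maxima of two column profiles by nearest abscissa."""
    a = argrelextrema(g_t, np.greater, order=5)[0]
    a1 = argrelextrema(g_t1, np.greater, order=5)[0]
    pairs = []
    for x in a:
        if len(a1) == 0:
            break
        x1 = a1[np.argmin(np.abs(a1 - x))]        # nearest extremum in the next frame
        if abs(x1 - x) < max_dist:                # distance dis(.) below a tolerance
            pairs.append((float(x), float(x1)))
    return pairs

def fit_cx_dx(pairs, n_iter: int = 200, tol: float = 2.0):
    """Step 305 sketch: inlier selection (RANSAC stand-in for the Hough step)
    followed by a least-squares fit of a' = cx * a + dx."""
    pts = np.asarray(pairs)
    if len(pts) < 2:
        return 1.0, 0.0                           # not enough correspondences; assume identity
    best_inliers = pts[:2]
    rng = np.random.default_rng(1)
    for _ in range(n_iter):
        i, j = rng.choice(len(pts), size=2, replace=False)
        if pts[j, 0] == pts[i, 0]:
            continue
        c = (pts[j, 1] - pts[i, 1]) / (pts[j, 0] - pts[i, 0])
        d = pts[i, 1] - c * pts[i, 0]
        inliers = pts[np.abs(c * pts[:, 0] + d - pts[:, 1]) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    A = np.column_stack([best_inliers[:, 0], np.ones(len(best_inliers))])
    (cx, dx), *_ = np.linalg.lstsq(A, best_inliers[:, 1], rcond=None)
    return cx, dx

# Example on two synthetic, slightly shifted profiles (as produced in the step 302 sketch).
x = np.linspace(0, 6 * np.pi, 640)
pairs = match_extrema(np.sin(x), np.sin(x - 0.15))
print(fit_cx_dx(pairs))   # cx close to 1, dx close to the pixel shift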
step 306, uniformly dividing the images I_t and I_t+1 in the vertical direction, computing the average gray level vector of each sub-image and estimating the correspondences of the local extreme values of the gray level, thereby obtaining the optimal estimated values of the transformation parameters cy and dy in the vertical direction of the adjacent frame images acquired by the fixed main monitoring PTZ camera in the similarity transformation model, the process being consistent with the process applied to the images I_t and I_t+1 in the horizontal direction;
the similarity transformation model Hs is thereby estimated;
step 307, applying the similarity transformation model Hs to the image I_t to obtain a temporary image Î_t+1, and then performing a pixel-level difference operation between the temporary image Î_t+1 and the image I_t+1; the pixel area whose gray level difference is not 0 and which lies inside the suspected target tracking rectangular frame is the suspected target foreground area of the fixed main monitoring PTZ camera;
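A compact OpenCV sketch of step 307, assuming grayscale frames; warpAffine with the 2x3 matrix [[cx, 0, dx], [0, cy, dy]] plays the role of the similarity transformation, and the "not 0" test is implemented with a small tolerance, which is an assumption:

import cv2
import numpy as np

def foreground_from_similarity(I_t, I_t1, cx, dx, cy, dy, track_rect, tol=15):
    """Step 307 sketch: warp I_t with the estimated similarity model, difference it
    against I_t1, and keep the changed pixels inside the tracking rectangle."""
    H, W = I_t.shape[:2]
    M = np.float32([[cx, 0.0, dx],
                    [0.0, cy, dy]])
    warped = cv2.warpAffine(I_t, M, (W, H))           # temporary image
    diff = cv2.absdiff(warped, I_t1)
    mask = (diff > tol).astype(np.uint8) * 255        # "not 0" with a small tolerance
    x, y, w, h = track_rect                           # suspected target tracking rectangle
    roi_mask = np.zeros_like(mask)
    roi_mask[y:y + h, x:x + w] = mask[y:y + h, x:x + w]
    return roi_mask                                   # foreground area of the suspected target

# Example call on two consecutive grayscale frames with an assumed tracking rectangle:
# fg = foreground_from_similarity(frame_t, frame_t1, 1.0, 3.2, 1.0, -1.1, (200, 150, 80, 160))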
step 308, estimating a suspected target foreground area in the linearly moving PTZ camera, wherein the estimation method is consistent with the estimation method of the suspected target foreground area of the fixed main monitoring PTZ camera;
step four, obtaining the absolute depth value of the suspected target foreground area, the process being as follows:
step 401, performing stereo correction on the suspected target foreground areas of the fixed main monitoring PTZ camera and the linearly moving PTZ camera (3) by a spherical stereo correction algorithm, and then estimating the depth map of the suspected target foreground area by a dynamic programming stereo matching algorithm;
step 402, calculating the depth average value of the depth map, thereby obtaining the absolute depth value d̄ of the suspected target foreground area;
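Purely as an illustration of steps 401-402, the sketch below substitutes OpenCV's standard semi-global matcher for the spherical stereo correction and dynamic programming matcher named in the claim (the patented algorithms are different), then averages the valid depths inside the foreground mask:

import cv2
import numpy as np

def mean_foreground_depth(rect_left, rect_right, fg_mask, focal_px, baseline_m):
    """Steps 401-402 sketch: disparity via StereoSGBM (stand-in matcher), depth = f*b/d,
    then the mean depth over the suspected target foreground area."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    disparity = matcher.compute(rect_left, rect_right).astype(np.float32) / 16.0
    valid = (disparity > 0) & (fg_mask > 0)
    if not np.any(valid):
        return None
    depth = focal_px * baseline_m / disparity[valid]   # absolute depth per foreground pixel
    return float(depth.mean())                         # absolute depth value of the foreground area

# rect_left / rect_right are the rectified 8-bit images of the fixed main and linearly moving
# cameras, fg_mask the foreground area from step three; focal_px and baseline_m assumed known.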
step five, adjusting the accuracy of the depth estimation of the suspected target, the process being as follows:
step 501, calculating the accuracy Δ of the depth estimation of the suspected target according to the formula
Δ = ε·d̄² / (f0·b0),
wherein ε is a constant, f0 is the initial value of the zoom focal length parameter of the fixed main monitoring PTZ camera and of the linearly moving PTZ camera (3), and b0 is the distance from the center of the initial position of the linearly moving PTZ camera (3) to the center of the fixed main monitoring PTZ camera;
step 502, judging according to the criterion Δ ≤ λΔ whether the accuracy Δ of the depth estimation of the suspected target meets the requirement; when Δ ≤ λΔ is satisfied, executing step six; otherwise, going to step 503, wherein λΔ is the accuracy threshold;
step 503, adjusting the position of the linearly moving PTZ camera (3) on the sliding rail (4) according to a formula determined by the constants α and β and the current absolute depth value d̄, so that the distance from the center of the linearly moving PTZ camera (3) to the center of the fixed main monitoring PTZ camera becomes a new value b′, wherein α and β are constants determined by experiments according to the monitoring scene;
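The accuracy loop of steps 501-503 can be summarised numerically as below; the accuracy expression follows the reconstruction given above, the baseline-update rule b_new = alpha * depth + beta is only one plausible reading of the formula image in the claim, and alpha, beta, epsilon and the threshold are example values:

def depth_accuracy(depth_m, f0_px, b0_m, eps=1.0):
    """Step 501 sketch: stereo depth uncertainty grows with the square of the depth
    and shrinks with focal length and baseline."""
    return eps * depth_m ** 2 / (f0_px * b0_m)

def adjust_baseline(depth_m, f0_px, b_m, lam=0.05, alpha=0.12, beta=0.10, eps=1.0):
    """Steps 502-503 sketch: keep the current baseline if the accuracy already meets the
    threshold, otherwise move the linearly moving camera to a longer, depth-dependent baseline."""
    if depth_accuracy(depth_m, f0_px, b_m, eps) <= lam:
        return b_m, True                      # accuracy requirement met -> go to step six
    b_new = alpha * depth_m + beta            # assumed form of the baseline update
    return b_new, False

# Example: target roughly 20 m away, 1200 px focal length, 0.5 m initial baseline.
print(adjust_baseline(20.0, 1200.0, 0.5))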
step 504, updating the control parameters of the linearly moving PTZ camera, the process being as follows:
step 5041, calculating the three-dimensional coordinates (X1, Y1, Z1) of the suspected target in the fixed main monitoring PTZ camera by using the absolute depth value d̄ of the suspected target foreground area at the current time and the imaging model of the fixed main monitoring PTZ camera;
step 5042, after the linearly moving PTZ camera (3) has reached the designated position, calculating the three-dimensional coordinates (X3, Y3, Z3) of the suspected target in the linearly moving PTZ camera (3) from (X1, Y1, Z1) by a translation along the X axis equal to the distance b′ between the centers of the two cameras, i.e. X3 = X1 ± b′ (the sign depending on which side of the fixed main monitoring PTZ camera the linearly moving PTZ camera (3) lies), Y3 = Y1, Z3 = Z1, wherein X3, Y3 and Z3 are the coordinate values of the suspected target on the X axis, the Y axis and the Z axis, respectively, in the three-dimensional coordinate system of the linearly moving PTZ camera (3) when the linearly moving PTZ camera (3) has reached the designated position;
step 5043, calculating the updated pan rotation parameter p3 and the updated tilt rotation parameter t3 of the linearly moving PTZ camera (3) according to the formulas p3 = arctan(X3/Z3), t3 = arctan(Y3/Z3);
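The geometric chain of steps 5041-5043 amounts to back-projecting the target with the current depth, translating along the baseline and converting the resulting direction into pan/tilt angles. The sketch below assumes a pinhole imaging model and the sign convention X3 = X1 − b, both of which are assumptions on top of the claim text:

import math

def backproject(u, v, cx, cy, f_px, depth_m):
    """Step 5041 sketch: pinhole imaging model of the fixed main monitoring camera."""
    X = (u - cx) * depth_m / f_px
    Y = (v - cy) * depth_m / f_px
    return X, Y, depth_m

def moving_camera_pan_tilt(X1, Y1, Z1, baseline_m):
    """Steps 5042-5043 sketch: translate into the linearly moving camera frame and
    compute the pan/tilt angles that point it at the target."""
    X3, Y3, Z3 = X1 - baseline_m, Y1, Z1       # assumed sign of the translation
    pan = math.degrees(math.atan2(X3, Z3))
    tilt = math.degrees(math.atan2(Y3, Z3))
    return pan, tilt

# Example: target at pixel (420, 300) in a 640x480 image, f = 800 px, depth 18 m, baseline 2.5 m.
X1, Y1, Z1 = backproject(420, 300, 320, 240, 800.0, 18.0)
print(moving_camera_pan_tilt(X1, Y1, Z1, 2.5))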
step 505, the computer feeds the updated pan rotation parameter p3 and the updated tilt rotation parameter t3 back to the linearly moving PTZ camera (3), so that the suspected target is locked again at the image center of the linearly moving PTZ camera (3) and continues to be tracked under the short focal length by the mean shift algorithm; the process is then cycled from step 203 until the accuracy Δ of the depth estimation of the suspected target meets the requirement, at which time the final absolute depth value of the suspected target foreground area is d̄*;
step six, estimating the control parameters of the fixed slave monitoring PTZ camera and realizing active tracking of the suspected target under the long focal length, the process being as follows:
step 601, calculating the three-dimensional coordinates (X1, Y1, Z1) of the suspected target in the fixed main monitoring PTZ camera by using the final absolute depth value d̄* of the suspected target foreground area at the current moment and the imaging model of the fixed main monitoring PTZ camera;
step 602, calculating the three-dimensional coordinates (Xs, Ys, Zs) of the suspected target in the fixed slave monitoring PTZ camera according to the formula Xs = X1 ± b13 (the sign depending on which side of the fixed main monitoring PTZ camera the fixed slave monitoring PTZ camera lies), Ys = Y1, Zs = Z1, wherein b13 is the distance between the fixed main monitoring PTZ camera and the fixed slave monitoring PTZ camera, and Xs, Ys and Zs are the coordinate values of the suspected target on the X axis, the Y axis and the Z axis, respectively, in the three-dimensional coordinate system of the fixed slave monitoring PTZ camera;
step 603, calculating the pan rotation parameter ps and the tilt rotation parameter ts of the fixed slave monitoring PTZ camera according to the formulas ps = arctan(Xs/Zs), ts = arctan(Ys/Zs);
step 604, assigning, according to the specific monitoring scene, the zoom focal length parameter fs of the fixed slave monitoring PTZ camera to its actual value at the long focal length;
step 605, the computer feeds the pan rotation parameter ps, the tilt rotation parameter ts and the zoom focal length parameter fs back to the fixed slave monitoring PTZ camera, thereby realizing active tracking of the suspected target under the long focal length.
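Taken together, step six reduces to the same back-projection as above followed by a translation by the main-to-slave distance b13 and an arctangent conversion; the sketch below reuses the assumed pinhole model and sign convention and adds an example long-focal-length zoom value, all of which are illustrative only:

import math

def slave_camera_parameters(X1, Y1, Z1, b13_m, zoom_long=20.0):
    """Step six sketch: pan/tilt/zoom of the fixed slave monitoring camera from the
    target coordinates in the fixed main monitoring camera."""
    Xs, Ys, Zs = X1 - b13_m, Y1, Z1            # assumed sign of the baseline translation
    pan = math.degrees(math.atan2(Xs, Zs))
    tilt = math.degrees(math.atan2(Ys, Zs))
    return pan, tilt, zoom_long                # zoom set to a long-focal-length value (claim 3: 15x-36x)

# Example with the coordinates back-projected in the previous sketch and a 4 m baseline.
print(slave_camera_parameters(2.25, 1.35, 18.0, 4.0))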
2. The master-slave tracking method of the linearly moving PTZ camera assisted binocular PTZ vision system of claim 1, wherein: in step 402, before the depth map is subjected to depth average value calculation, the depth map is subjected to normalization and median filtering in sequence.
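Claim 2's pre-processing of the depth map, normalization followed by median filtering before the average is taken, could be done with OpenCV as in this sketch; the normalization range and kernel size are assumptions:

import cv2
import numpy as np

def preprocess_depth_map(depth_map: np.ndarray) -> np.ndarray:
    """Claim 2 sketch: normalize the depth map, then median-filter it, before averaging."""
    norm = cv2.normalize(depth_map.astype(np.float32), None, 0.0, 1.0, cv2.NORM_MINMAX)
    return cv2.medianBlur(norm, 5)             # 5x5 median filter (assumed kernel size)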
3. The master-slave tracking method of the linearly moving PTZ camera assisted binocular PTZ vision system of claim 1, wherein: the short focal length ranges from 1 time to 3 times the optical zoom of the PTZ camera, and the long focal length ranges from 15 times to 36 times the optical zoom of the PTZ camera.
4. The master-slave tracking method of the linearly moving PTZ camera assisted binocular PTZ vision system of claim 1, wherein: the value range of the pixel threshold is 30-40 pixels.
CN201910697079.0A 2019-07-30 2019-07-30 Master-slave tracking method of auxiliary binocular PTZ (Pan-Tilt-zoom) visual system of linear moving PTZ (pan-Tilt-zoom) camera Active CN110415278B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910697079.0A CN110415278B (en) 2019-07-30 2019-07-30 Master-slave tracking method of auxiliary binocular PTZ (Pan-Tilt-zoom) visual system of linear moving PTZ (pan-Tilt-zoom) camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910697079.0A CN110415278B (en) 2019-07-30 2019-07-30 Master-slave tracking method of auxiliary binocular PTZ (Pan-Tilt-zoom) visual system of linear moving PTZ (pan-Tilt-zoom) camera

Publications (2)

Publication Number Publication Date
CN110415278A CN110415278A (en) 2019-11-05
CN110415278B true CN110415278B (en) 2020-04-17

Family

ID=68364259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910697079.0A Active CN110415278B (en) 2019-07-30 2019-07-30 Master-slave tracking method of auxiliary binocular PTZ (Pan-Tilt-zoom) visual system of linear moving PTZ (pan-Tilt-zoom) camera

Country Status (1)

Country Link
CN (1) CN110415278B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112911231B (en) * 2021-01-22 2023-03-07 杭州海康威视数字技术股份有限公司 Linkage method and system of monitoring cameras
CN113379848A (en) * 2021-06-09 2021-09-10 中国人民解放军陆军军事交通学院军事交通运输研究所 Target positioning method based on binocular PTZ camera
CN113538596B (en) * 2021-07-15 2022-10-11 中国人民解放军火箭军工程大学 Moving target tracking system based on trinocular vision
CN113487683B (en) * 2021-07-15 2023-02-10 中国人民解放军火箭军工程大学 Target tracking system based on trinocular vision
CN113489964B (en) * 2021-07-15 2022-11-15 中国人民解放军火箭军工程大学 Scene depth information acquisition system based on trinocular vision
CN113470118B (en) * 2021-07-15 2023-12-05 中国人民解放军火箭军工程大学 Target size estimation system based on trinocular vision
CN115184917B (en) * 2022-09-13 2023-03-10 湖南华诺星空电子技术有限公司 Regional target tracking method integrating millimeter wave radar and camera
CN115713565A (en) * 2022-12-16 2023-02-24 盐城睿算电子科技有限公司 Target positioning method for binocular servo camera

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101719986A (en) * 2009-12-30 2010-06-02 湖北莲花山计算机视觉和信息科学研究院 PTZ tracking method and system based on multi-layered full-view modeling
CN102148965A (en) * 2011-05-09 2011-08-10 上海芯启电子科技有限公司 Video monitoring system for multi-target tracking close-up shooting
CN103024350A (en) * 2012-11-13 2013-04-03 清华大学 Master-slave tracking method for binocular PTZ (Pan-Tilt-Zoom) visual system and system applying same
CN103105858A (en) * 2012-12-29 2013-05-15 上海安维尔信息科技有限公司 Method capable of amplifying and tracking goal in master-slave mode between fixed camera and pan tilt zoom camera
KR20180106075A (en) * 2017-03-17 2018-10-01 공현식 The system and method for providing real-time golf game information including point-to-point distance and slope using two PTZ camera installed in a golf cart and automatic tracking of golf ball

Also Published As

Publication number Publication date
CN110415278A (en) 2019-11-05

Similar Documents

Publication Publication Date Title
CN110415278B (en) Master-slave tracking method of auxiliary binocular PTZ (Pan-Tilt-zoom) visual system of linear moving PTZ (pan-Tilt-zoom) camera
CN108648241B (en) PTZ camera on-site calibration and focusing method
CN108986070B (en) Rock crack propagation experiment monitoring method based on high-speed video measurement
CN107194991B (en) Three-dimensional global visual monitoring system construction method based on skeleton point local dynamic update
CN107677274B (en) Unmanned plane independent landing navigation information real-time resolving method based on binocular vision
CN109559355B (en) Multi-camera global calibration device and method without public view field based on camera set
JP2008298685A (en) Measuring device and program
CN103402045A (en) Image de-spin and stabilization method based on subarea matching and affine model
CN111028281B (en) Depth information calculation method and device based on light field binocular system
CN112949478A (en) Target detection method based on holder camera
CN112132874A (en) Calibration-board-free different-source image registration method and device, electronic equipment and storage medium
Neves et al. Acquiring high-resolution face images in outdoor environments: A master-slave calibration algorithm
CN107038714A (en) Many types of visual sensing synergistic target tracking method
CN113506340A (en) Method and equipment for predicting cloud deck pose and computer readable storage medium
CN113379801B (en) High-altitude parabolic monitoring and positioning method based on machine vision
CN114812558A (en) Monocular vision unmanned aerial vehicle autonomous positioning method combined with laser ranging
CN111047636A (en) Obstacle avoidance system and method based on active infrared binocular vision
CN116778094B (en) Building deformation monitoring method and device based on optimal viewing angle shooting
CN111402315B (en) Three-dimensional distance measurement method for adaptively adjusting binocular camera baseline
CN112985259A (en) Target positioning method and system based on multi-view vision
CN117611525A (en) Visual detection method and system for abrasion of pantograph slide plate
CN108090930A (en) Barrier vision detection system and method based on binocular solid camera
CN113240749B (en) Remote binocular calibration and ranging method for recovery of unmanned aerial vehicle facing offshore ship platform
CN114935316A (en) Standard depth image generation method based on optical tracking and monocular vision
Xu et al. Research on target tracking algorithm based on parallel binocular camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant