CN111093050A - Target monitoring method and device - Google Patents

Target monitoring method and device

Info

Publication number
CN111093050A
Authority
CN
China
Prior art keywords
background image
tracking target
tracking
pan
current
Prior art date
Legal status
Granted
Application number
CN201811221309.8A
Other languages
Chinese (zh)
Other versions
CN111093050B (en)
Inventor
潘胜军 (Pan Shengjun)
Current Assignee
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Priority to CN201811221309.8A priority Critical patent/CN111093050B/en
Publication of CN111093050A publication Critical patent/CN111093050A/en
Application granted granted Critical
Publication of CN111093050B publication Critical patent/CN111093050B/en
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N 7/185 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

An embodiment of the invention provides a target monitoring method and device. The target monitoring method includes: determining a tracking target in the current monitoring picture of a pan-tilt camera and saving the current monitoring picture as a background image; controlling the pan-tilt camera to perform tracking shooting of the tracking target so that the tracking target is located at a reference position of the captured tracking image; and, during tracking shooting, determining the current position of the tracking target in the background image according to the current rotation parameters of the pan-tilt camera and marking that position in the background image. A single pan-tilt camera can thus both monitor the panorama and track-shoot the target, which saves hardware and software cost, relaxes installation-space constraints, and removes the need for calibration after installation.

Description

Target monitoring method and device
Technical Field
The invention relates to the technical field of security and protection, in particular to a target monitoring method and device.
Background
In the related art, a bullet-dome linkage scheme is generally adopted for monitoring a tracking target, so that the user can observe on the monitoring screen both the position of the tracking target in a panoramic area and its local details. In this scheme, a fixed bullet camera monitors the panoramic area; when the bullet camera determines the tracking target in its monitoring picture, it sends the position of the tracking target in that picture to a pan-tilt camera (usually a dome camera). The pan-tilt camera rotates according to that position and the relative position of the two cameras, so that the tracking target is centered in the pan-tilt camera's monitoring picture, and the lens magnification is adjusted appropriately so that the tracking target is shown clearly at the center of the picture. In this way, the user can observe the position of the tracking target in the panoramic area in the bullet camera's picture, and the local details of the tracking target in the pan-tilt camera's picture.
However, the above scheme requires at least two devices (a bullet camera and a pan-tilt camera) and has the following disadvantages: hardware and software costs are high; using the bullet camera and the pan-tilt camera together imposes stricter requirements on installation space; because different installation conditions yield different relative positions of the two cameras, the relative position must be determined by calibration after installation; and the transfer of the tracking target's position from the bullet camera to the pan-tilt camera is limited by the transmission conditions between them, so poor transmission degrades the pan-tilt camera's real-time tracking of the target.
Disclosure of Invention
In view of this, an object of the embodiments of the present invention is to provide a target monitoring method, so as to implement panoramic monitoring and tracking shooting on a target through a single pan-tilt camera.
In a first aspect, an embodiment of the present invention provides a target monitoring method, where the target monitoring method includes:
determining a tracking target in a current monitoring picture of a pan-tilt camera, and storing the current monitoring picture as a background image;
controlling the pan-tilt camera to perform tracking shooting on the tracking target, so that the tracking target is located at the reference position of the shot tracking image;
and in the process of tracking shooting, determining the current position of the tracking target in the background image according to the current rotation parameters of the pan-tilt camera, and identifying the current position in the background image.
Optionally, the determining the current position of the tracking target in the background image according to the current rotation parameter of the pan-tilt camera includes:
and determining the current position of the tracking target in the background image according to the current rotation angle of the pan-tilt camera.
Optionally, the reference position is a center of the tracking image, and the target monitoring method further includes:
when the tracking target is determined, saving the current field angle and lens resolution of the pan-tilt camera;
the determining the current position of the tracking target in the background image according to the current rotation angle of the pan-tilt camera includes:
obtaining a relative position relation between the current position of the tracking target in the background image and a reference point of the background image according to the relative relation between the rotation angle and the field angle and the lens resolution;
and determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relation.
Optionally, the rotation angle includes a horizontal rotation angle and a vertical rotation angle, the field angle includes a horizontal field angle and a vertical field angle, and the lens resolution includes a horizontal resolution and a vertical resolution;
obtaining a relative position relationship between a current position of the tracking target in the background image and a reference point of the background image according to the relative relationship between the rotation angle and the field angle and the lens resolution, including:
calculating a ratio of the tangent value of the horizontal rotation angle to the tangent value of the horizontal field angle, and taking the product of the obtained ratio and the horizontal resolution as a first relative distance between the current position and the picture center of the background image in the horizontal direction;
calculating a ratio of the tangent value of the vertical rotation angle to the tangent value of the vertical field angle, and taking the product of the obtained ratio and the vertical resolution as a second relative distance between the current position of the tracking target in the background image and the center of the picture in the vertical direction;
determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relationship, including:
obtaining the current position of the tracking target in the horizontal direction of the background image according to the position of the picture center in the horizontal direction of the background image and the first relative distance;
and obtaining the current position of the tracking target in the vertical direction of the background image according to the position of the picture center in the vertical direction of the background image and the second relative distance.
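As an illustration of the field-angle sub-steps above, the following Python sketch (function and parameter names are hypothetical, not from the patent) maps the current pan/tilt rotation angles to a pixel position in the saved background image; it assumes, consistent with the fig. 4 derivation later in the description, that half the saved field angle corresponds to half the saved resolution:

```python
import math

def position_in_background(pan_deg, tilt_deg, hfov_deg, vfov_deg, width, height):
    # The picture center of the background image serves as the reference point.
    cx, cy = width / 2.0, height / 2.0
    # Ratio of tangents (rotation angle vs. half field angle), scaled by half
    # the resolution: the first and second relative distances from the center.
    dx = math.tan(math.radians(pan_deg)) / math.tan(math.radians(hfov_deg / 2.0)) * cx
    dy = math.tan(math.radians(tilt_deg)) / math.tan(math.radians(vfov_deg / 2.0)) * cy
    return cx + dx, cy + dy

# Camera panned 10 degrees and tilted 5 degrees away from where the background
# image was saved; 60x34 degree field of view, 1920x1080 background image.
x, y = position_in_background(10.0, 5.0, 60.0, 34.0, 1920, 1080)
```

Note that the result depends only on the saved field angle and resolution, so the sketch works even though the tracking image's magnification may have changed since the background image was saved.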
Optionally, the reference position is a center of the tracking image, and the target monitoring method further includes:
when the tracking target is determined, saving the current focal length of the pan-tilt camera;
the obtaining of the current position of the tracking target in the background image according to the rotation angle includes:
obtaining the relative position relation between the current position of the tracking target in the background image and a reference point of the background image according to the focal length and the rotation angle;
and determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relationship.
Optionally, the rotation angle comprises a horizontal rotation angle and a vertical rotation angle;
the obtaining of the relative position relationship between the current position of the tracking target in the background image and the reference point of the background image according to the focal length and the rotation angle includes:
calculating the product of the tangent value of the horizontal rotation angle and the focal length, and taking the obtained product as a first relative distance between the current position of the tracking target in the background image and the picture center of the background image in the horizontal direction;
calculating the product of the tangent value of the vertical rotation angle and the focal length, and taking the obtained product as a second relative distance between the current position of the tracking target in the background image and the center of the picture in the vertical direction;
determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relationship, including:
determining the current position of the tracking target in the horizontal direction of the background image according to the position of the picture center in the horizontal direction of the background image and the first relative distance;
and determining the current position of the tracking target in the vertical direction of the background image according to the position of the picture center in the vertical direction of the background image and the second relative distance.
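A sketch of the focal-length variant described above (hypothetical names again; it assumes the saved focal length is expressed in pixel units so that focal length times tangent yields a pixel distance directly):

```python
import math

def position_from_focal_length(pan_deg, tilt_deg, focal_px, width, height):
    cx, cy = width / 2.0, height / 2.0
    # First/second relative distances are the focal length times the tangent
    # of the horizontal/vertical rotation angles.
    dx = focal_px * math.tan(math.radians(pan_deg))
    dy = focal_px * math.tan(math.radians(tilt_deg))
    return cx + dx, cy + dy

# 10-degree pan, no tilt, focal length equivalent to 1000 px.
x, y = position_from_focal_length(10.0, 0.0, 1000.0, 1920, 1080)
```

This variant agrees with the field-angle variant whenever the focal length in pixels equals half the resolution divided by the tangent of half the field angle, which is the usual pinhole-model relationship.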
Optionally, the reference position is a center of the tracking image;
the determining the current position of the tracking target in the background image according to the current rotation angle of the pan-tilt camera includes:
calculating the number of preset unit angles included in the rotation angle;
and determining the position of the tracking target in the background image when the tracking target is tracked and shot according to the number and the preset unit length corresponding to the preset unit angle.
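The coarser unit-angle variant above can be sketched as follows (hypothetical names; the preset unit angle and the pixel length assigned to one unit are assumed to be configured on the device):

```python
def position_from_unit_angles(pan_deg, tilt_deg, unit_angle_deg, unit_len_px, width, height):
    cx, cy = width / 2.0, height / 2.0
    # Count how many whole preset unit angles each rotation contains, then
    # convert using the preset unit length per unit angle.
    dx = int(pan_deg / unit_angle_deg) * unit_len_px
    dy = int(tilt_deg / unit_angle_deg) * unit_len_px
    return cx + dx, cy + dy

# 0.5-degree units, each worth 16 px: a 10-degree pan is 20 units = 320 px.
x, y = position_from_unit_angles(10.0, 0.0, 0.5, 16, 1920, 1080)
```

This trades accuracy (the tangent nonlinearity is ignored) for not needing any lens parameters at all.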
Optionally, the identifying the current location in the background image comprises:
and when the current position exceeds the boundary of the background image, marking, in the background image, the direction in which the current position lies.
Optionally, the target monitoring method further includes:
and in the tracking shooting process, when a preset user instruction is received, controlling the pan-tilt camera to recover to the state when the tracking target is determined.
In a second aspect, an embodiment of the present invention further provides a target monitoring apparatus, where the target monitoring apparatus includes:
the panoramic monitoring module is used for determining a tracking target in a current monitoring picture of the pan-tilt camera and storing the current monitoring picture as a background image;
the tracking module is used for controlling the pan-tilt camera to perform tracking shooting on the tracking target so that the tracking target is positioned at the reference position of the shot tracking image;
and the processing module is used for determining the current position of the tracking target in the background image according to the current rotation parameters of the pan-tilt camera in the tracking shooting process, and identifying the current position in the background image.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
the embodiment of the invention provides a target monitoring method and a device, wherein the target monitoring method comprises the steps of determining a tracking target in a current monitoring picture of a pan-tilt camera and storing the current monitoring picture as a background image; controlling the pan-tilt camera to perform tracking shooting on the tracking target, so that the tracking target is located at the reference position of the shot tracking image; and in the process of tracking shooting, determining the current position of the tracking target in the background image according to the current rotation parameters of the pan-tilt camera, and identifying the current position in the background image. Therefore, the target can be monitored and shot in a tracking manner through a single pan-tilt camera, so that the hardware and software cost is saved, the installation space is more free, and the installation is not required to be debugged.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a schematic block diagram of a monitoring device according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a target monitoring method according to an embodiment of the present invention;
FIG. 3 is an approximate imaging model of a pan-tilt camera;
FIG. 4 is another schematic view of the imaging model shown in FIG. 3;
fig. 5 is a functional block diagram of a target monitoring apparatus according to an embodiment of the present invention.
Reference numerals: 100 - monitoring device; 110 - machine-readable storage medium; 120 - processor; 130 - communication unit; 200 - target monitoring device; 210 - panoramic monitoring module; 220 - tracking module; 230 - processing module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Fig. 1 is a block diagram of a monitoring device 100 according to an embodiment of the present invention. The monitoring device 100 may be a pan-tilt camera, or a device with image processing capability, such as a server, communicatively connected to the pan-tilt camera.
Monitoring device 100 includes an object monitoring apparatus 200, a machine-readable storage medium 110, and a processor 120.
The machine-readable storage medium 110 and the processor 120 are electrically connected to each other, directly or indirectly, to enable data transmission and interaction. The target monitoring apparatus 200 includes at least one software functional module that can be stored in software or firmware form in the machine-readable storage medium 110 or solidified in the operating system (OS) of the monitoring device 100. The processor 120 executes the executable modules stored in the machine-readable storage medium 110, such as the software functional modules and computer programs included in the target monitoring apparatus 200.
The machine-readable storage medium 110 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 120 may be an integrated circuit chip having signal processing capability. The processor 120 may be a general-purpose processor, such as a Central Processing Unit (CPU), a Network Processor (NP), or a microprocessor; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The processor 120 may also be any conventional processor that can implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention.
It should be understood that the configuration shown in fig. 1 is merely illustrative, and that the monitoring device 100 may also have more or fewer components than shown in fig. 1, or a completely different configuration than shown in fig. 1, for example, the monitoring device 100 may also include the communication unit 130. Further, the components shown in FIG. 1 may be implemented in software, hardware, or a combination thereof.
Fig. 2 is a schematic flow chart of a target monitoring method according to an embodiment of the present invention, which can be applied to the monitoring device 100 shown in fig. 1. The steps of the flow shown in fig. 2 are described in detail below.
Step S201: and determining a tracking target in a current monitoring picture of the pan-tilt camera, and storing the current monitoring picture as a background image.
In this embodiment, the tracking target may be a person, a vehicle, or another moving target in the current monitoring picture of the pan-tilt camera, and may also be another target meeting a specific condition, such as an object with a specific color, an object with a specific shape, and the like, which is not limited in this embodiment.
Optionally, the tracking target may be determined in various ways. For example, the user may designate an object in the current monitoring picture of the pan-tilt camera as the tracking target, e.g. by selecting it with a pointer operation. Alternatively, a rule for determining the tracking target may be preset in the monitoring device 100, and when a target triggering the rule is detected in the current monitoring picture of the pan-tilt camera, that target is determined as the tracking target; such rules may include area intrusion, line crossing (tripwire), license plate recognition, face recognition, a specific color, a specific shape, and the like.
In detail, the rule may be pre-stored in the monitoring device 100, and when the monitoring device 100 is a pan-tilt camera, the pan-tilt camera may determine the tracking target in the current monitoring picture directly according to the rule; when the monitoring device 100 is a server communicating with a pan-tilt camera, the server may obtain a current monitoring picture of the pan-tilt camera and determine the tracking target in the current monitoring picture according to the rule.
When no tracking target has been determined, the pan-tilt camera in the embodiment of the present invention performs panoramic monitoring; after the tracking target is determined, the camera switches to tracking the target so as to capture its local details. The area scene shown in the monitoring picture at the moment the tracking target is determined is the target area scene that the user wants to monitor. Therefore, so that the user can still observe the position of the tracking target in the target area scene during tracking shooting, the background image showing the target area scene is saved first, and the position of the tracking target in the background image during tracking shooting is then determined and marked in the background image.
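The panoramic-then-tracking flow described above can be sketched end to end. Everything here (the fake camera, the degree-to-pixel locator) is a stand-in invented for illustration, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class FakeCamera:
    """Minimal stand-in for a pan-tilt camera, for illustration only."""
    pan: float = 0.0
    tilt: float = 0.0
    steps: int = 0

    def track_step(self, d_pan, d_tilt):
        # Pretend the PTZ rotated to keep the moving target centered.
        self.pan += d_pan
        self.tilt += d_tilt
        self.steps += 1
        return self.steps < 3  # target leaves the scene after a few steps

def monitor(camera, target_motion, locate):
    # Saving the background image (S201) and centering the target (S202)
    # are implied; here we only record S203: map each new rotation back to
    # a position in the saved background image and mark it.
    marks = []
    for d_pan, d_tilt in target_motion:
        still_tracking = camera.track_step(d_pan, d_tilt)
        marks.append(locate(camera.pan, camera.tilt))
        if not still_tracking:
            break
    return marks

cam = FakeCamera()
# Trivial locator: 32 px per degree around a 1920x1080 picture center.
marks = monitor(cam, [(2, 0), (2, 1), (1, 1)], lambda p, t: (960 + 32 * p, 540 + 32 * t))
```

The accumulated `marks` trace the target's path across the background image, which is exactly what gets drawn for the user in step S203.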
Step S202: and controlling the pan-tilt camera to perform tracking shooting on the tracking target so that the tracking target is positioned at the reference position of the shot tracking image.
During tracking shooting of the tracking target, the pan-tilt camera is generally controlled to rotate until it is aimed at the tracking target, so that the tracking target is located at the reference position of the captured tracking image. In a particular embodiment, the reference position may be the center of the tracking image.
It can be seen that rotating the pan-tilt camera moves the scene point imaged at the reference position from the reference point of the background image to the current position of the tracking target in the background image. The current position of the tracking target in the background image therefore has a definite correspondence with the rotation parameters of the pan-tilt camera.
Based on the above description, the present embodiment realizes the determination of the current position of the tracking target in the background image through step S203.
Step S203: and in the process of tracking shooting, determining the current position of the tracking target in the background image according to the current rotation parameters of the pan-tilt camera, and identifying the current position in the background image.
It should be noted that, when the reference position is the center of the tracking image, the magnification of the pan-tilt camera may be adjusted during tracking shooting to obtain more local detail. However the magnification is adjusted, an object displayed at the picture center remains at the picture center; that is, adjusting the magnification does not affect the rotation required to keep the tracking target at the picture center (e.g. the center of the tracking image), so the current rotation parameters used in step S203 are unaffected by magnification changes. Therefore, when the reference position is the center of the tracking image, the current position can still be determined from the rotation parameters through step S203 even if the magnification of the pan-tilt camera changes during tracking shooting.
Optionally, the current rotation parameters of the pan-tilt camera may be expressed in different forms: for example, as a rotation angle; as a rotation distance; or, for pan-tilt cameras driven by a motor, as the rotation parameters of the motor, such as the number of turns of the motor or the duration of its rotation.
The above steps will be described in detail with reference to fig. 3 and 4.
As shown in fig. 3, the plane ABCD is the imaging plane of the pan-tilt camera, and the picture formed on the imaging plane ABCD is the monitoring picture of the pan-tilt camera. U-V-Z is the camera coordinate system of the pan-tilt camera; it may be the rectangular space coordinate system shown, or a spherical or polar space coordinate system. The origin F of the coordinate system U-V-Z is the optical center of the pan-tilt camera, the length of the line segment OF is the focal length of the pan-tilt camera, and the Z axis is the optical axis of the pan-tilt camera. The optical axis (Z axis) is perpendicular to the imaging plane ABCD, and the intersection O of the Z axis and the imaging plane ABCD is the center of the monitoring picture.
Based on the imaging model: for any point P in the three-dimensional space, the intersection point P' of the connecting line FP of the point P and the optical center F and the imaging plane ABCD is the imaging point of the point P on the imaging plane ABCD.
Now, assuming that a point P' in the monitoring picture of the pan-tilt camera is the tracking target, the pan-tilt camera can rotate to aim at the point P. When the reference position is the center of the tracking image, the pan-tilt camera rotates until the point P' is located at the center of the captured tracking image. Referring to the model shown in fig. 3, for the point P' to lie at the center of the captured tracking image, the current rotation parameters of the pan-tilt camera must satisfy: the optical axis (Z axis) is rotated to point at P.
Taking the rotation parameter as an example of a rotation angle, the orientation of the point P' in the background image with respect to the center O of the background image can be obtained when the current rotation angle of the pan-tilt camera is determined. Then, when determining lens parameters (such as lens resolution, field angle, focal length, etc.) when the pan-tilt camera takes the background image, a relative positional relationship between a current position of the point P' in the background image and any reference point (including the point O) in the background image may be calculated according to the known orientation, and then a current position of the tracking target in the background image may be calculated according to a position of the reference point in the background image and the relative positional relationship.
In an alternative manner, a plane coordinate system may be established on the background image with the point O as an origin, and at this time, if the point O is taken as a reference point, a relative positional relationship between a current position of the point P 'in the background image and the point O (i.e., a picture center) in the background image is calculated according to a current rotation angle of the pan/tilt camera, and may be directly used to represent the position of the point P' in the background image.
According to different specific application scenes, the current position of the tracking target in the background image can be determined in different calculation modes according to the current rotation angle of the pan-tilt camera.
In one embodiment, in the panoramic monitoring state before the tracking target is determined, the magnification adopted by the pan-tilt camera is not necessarily always the same, which leads to different lens parameters and different background image sizes. In this case, the target monitoring method provided in this embodiment may further include the following step to determine the lens parameters used when the background image was captured:
and when the tracking target is determined, saving the current field angle and lens resolution of the pan-tilt camera.
In implementation, the field angle data and the lens resolution data under different magnifications may be prestored in the monitoring device 100, and the monitoring device 100 determines the field angle and the lens resolution of the pan-tilt camera when the background image is captured by determining the magnification at the time of capturing the background image.
Based on the above description, the geometric parameters of the imaging model shown in fig. 3 can be determined from the saved lens parameters. Correspondingly, step S203 may include the following sub-steps:
firstly, according to the relative relationship between the rotation angle and the field angle and the lens resolution, obtaining the relative position relationship between the current position of the tracking target in the background image and the reference point of the background image.
Optionally, the rotation angle includes a horizontal rotation angle and a vertical rotation angle, the field angle includes a horizontal field angle and a vertical field angle, and the lens resolution includes a horizontal resolution and a vertical resolution.
Accordingly, the above steps can be realized by the following sub-steps:
calculating a ratio of the tangent value of the horizontal rotation angle to the tangent value of the horizontal field angle, and taking the product of the obtained ratio and the horizontal resolution as a first relative distance between the current position of the tracking target in the background image and the picture center of the background image in the horizontal direction;
and calculating the ratio of the tangent value of the vertical rotation angle to the tangent value of the vertical field angle, and taking the product of the obtained ratio and the vertical resolution as a second relative distance between the current position of the tracking target in the background image and the center of the picture in the vertical direction.
The above steps are described in detail below with reference to fig. 4:
fig. 4 is another schematic diagram of the imaging model shown in fig. 3; letters with the same meaning as in fig. 3 are not described again. The four sides of the rectangle ABCD represent the boundary of the background image, and a two-dimensional image coordinate system is established on the background image with the center O as the origin, where the X-axis represents the horizontal direction and the Y-axis represents the vertical direction, coinciding respectively with the U-axis and the V-axis of the camera coordinate system (not labeled, to simplify fig. 4). Still taking the tracking target as a point P as an example, the line PF connecting the point P and the optical center F intersects the background image (imaging plane ABCD) at a point P'. The projections of the point P' on the X-axis and the Y-axis are the points Px and Py respectively, corresponding to the current position of the tracking target in the horizontal direction and in the vertical direction of the background image.
Based on the above description, ∠PxFO shown in fig. 4 is the horizontal rotation angle and ∠PyFO shown in fig. 4 is the vertical rotation angle, and ∠PxFO and ∠PyFO satisfy the following proportional relationships:

tan∠PxFO = OPx / OF (1)

tan∠PyFO = OPy / OF (2)
Suppose there is a point K1 located at the boundary in the horizontal direction; then the pan-tilt camera needs to rotate by one half of the horizontal field angle in the horizontal direction to bring the point K1 to the center of the captured tracking image. Assume that the horizontal field angle of the pan-tilt camera when the background image is captured is 2α, and that the horizontal resolution of the pan-tilt camera when the background image is captured is 2Xm. The absolute value of the abscissa of the point K1 in the background image is then Xm, i.e. the first relative distance between the point K1 and the point O in the horizontal direction. Referring to fig. 4, the following relationship exists:

tanα = Xm / OF (3)
Thus, combining the above calculation formulas (1) and (3), it can be determined that:

tan∠PxFO / tanα = OPx / Xm

Further, it is possible to obtain:

P'Py = OPx = Xm * tan∠PxFO / tanα

wherein the calculated P'Py is the first relative distance between the point P' and the point O in the horizontal direction.
Correspondingly, suppose there is a point K2 located at the boundary in the vertical direction; then the pan-tilt camera needs to rotate by one half of the vertical field angle in the vertical direction to bring the point K2 to the center of the captured tracking image. Assume that the vertical field angle of the pan-tilt camera when capturing the background image is 2β, and that the vertical resolution of the pan-tilt camera when capturing the background image is 2Ym. The absolute value of the ordinate of the point K2 in the background image is then Ym, i.e. the second relative distance between the point K2 and the point O in the vertical direction. In this case there is the proportional relationship:

tanβ = Ym / OF (4)

Then, combining the calculation formulas (2) and (4), it can be determined that:

tan∠PyFO / tanβ = OPy / Ym

Further, it is possible to obtain:

P'Px = OPy = Ym * tan∠PyFO / tanβ

wherein the calculated P'Px is the second relative distance between the point P' and the point O in the vertical direction.
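The derivation above can be condensed into a short numerical sketch. Python and the function name are used only for illustration, and the degree-based interface is an assumption, not part of this disclosure. With horizontal field angle 2α, vertical field angle 2β, horizontal resolution 2Xm, and vertical resolution 2Ym, the two relative distances follow directly from formulas (1) to (4):

```python
import math

def relative_distances(h_rot_deg, v_rot_deg, h_fov_deg, v_fov_deg, h_res, v_res):
    """First/second relative distances (in pixels) between the tracked
    point P' and the picture center O of the background image, given the
    pan-tilt rotation angles; h_fov_deg = 2*alpha, h_res = 2*Xm, and
    likewise for the vertical axis."""
    x_m = h_res / 2.0                      # Xm: half the horizontal resolution
    y_m = v_res / 2.0                      # Ym: half the vertical resolution
    alpha = math.radians(h_fov_deg / 2.0)  # half the horizontal field angle
    beta = math.radians(v_fov_deg / 2.0)   # half the vertical field angle
    d1 = x_m * math.tan(math.radians(h_rot_deg)) / math.tan(alpha)  # P'Py
    d2 = y_m * math.tan(math.radians(v_rot_deg)) / math.tan(beta)   # P'Px
    return d1, d2
```

As a sanity check, a horizontal rotation equal to half the field angle places the target exactly at the boundary abscissa Xm, matching the K1 construction above.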
Secondly, according to the position of the reference point in the background image and the relative position relationship, the current position of the tracking target in the background image can be determined.
Accordingly, the above steps can be realized by the following sub-steps:
obtaining the current position of the tracking target in the horizontal direction of the background image according to the position of the picture center in the horizontal direction of the background image and the first relative distance;
and obtaining the current position of the tracking target in the vertical direction of the background image according to the position of the picture center in the vertical direction of the background image and the second relative distance.
In this embodiment, the center O of the background image can be directly used as the origin of the image coordinate system. In this case, P'Py can directly represent the absolute value of the abscissa of the point P' in the background image, and P'Px can directly represent the absolute value of the ordinate of the point P' in the background image. Further, the signs of the abscissa and the ordinate may be determined according to the quadrant in which the point P' is located. For example, as shown in fig. 3, when the point P' is in the second quadrant, the coordinates of the point P' are (-P'Py, -P'Px).
Of course, one vertex of the background image may be used as the origin of the image coordinate system.
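The two origin conventions can be sketched as follows. This is a hedged illustration: the function name is hypothetical, the top-left vertex with a downward Y-axis is a common image convention rather than one fixed by this disclosure, and the signed offsets dx and dy are assumed to have already been given signs according to the quadrant of the point P', as described above.

```python
def to_pixel_coords(dx, dy, h_res, v_res):
    """Convert an offset (dx, dy) expressed relative to the picture center O
    into pixel coordinates whose origin is the top-left vertex of the
    background image (an assumed convention; the patent permits either the
    center or a vertex as origin)."""
    return h_res / 2.0 + dx, v_res / 2.0 + dy
```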
In another embodiment, it can be determined according to calculation formulas (1) and (2) that:

OPx = OF * tan∠PxFO

OPy = OF * tan∠PyFO

Thereby, the first relative distance P'Py between the point P' and the point O in the horizontal direction, and the second relative distance P'Px between the point P' and the point O in the vertical direction, respectively satisfy:

P'Py = OPx = OF * tan∠PxFO

P'Px = OPy = OF * tan∠PyFO
in this case, the target monitoring method provided in this embodiment may further include:
and when the tracking target is determined, saving the current focal length of the pan-tilt camera.
The monitoring device 100 may store focal lengths corresponding to different magnifications, and in implementation, the current focal length of the pan/tilt camera may be found and stored according to the magnification of the pan/tilt camera when determining the tracking target.
Correspondingly, step S203 may further include the following sub-steps:
firstly, obtaining the relative position relation between the current position of the tracking target in the background image and the reference point of the background image according to the focal length and the rotation angle.
Wherein, optionally, the rotation angle comprises a horizontal rotation angle and a vertical rotation angle.
Thus, the above steps can be implemented by the following sub-steps:
calculating the product of the tangent value of the horizontal rotation angle and the focal length, and taking the obtained product as a first relative distance between the current position of the tracking target in the background image and the picture center of the background image in the horizontal direction;
calculating the product of the tangent value of the vertical rotation angle and the focal length, and taking the obtained product as a second relative distance between the current position of the tracking target in the background image and the picture center of the background image in the vertical direction;
secondly, determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relationship.
Accordingly, step S203 may be implemented by following sub-steps:
determining the current position of the tracking target in the horizontal direction of the background image according to the position of the picture center in the horizontal direction of the background image and the first relative distance;
and determining the current position of the tracking target in the vertical direction of the background image according to the position of the picture center in the vertical direction of the background image and the second relative distance.
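These focal-length-based sub-steps amount to P'Py = OF * tan(horizontal rotation angle) and P'Px = OF * tan(vertical rotation angle). A minimal sketch follows, assuming (this is an assumption, not stated by the disclosure) that the saved focal length OF is expressed in pixel units so that the products are directly pixel offsets from the picture center; the function name is hypothetical.

```python
import math

def relative_distances_from_focal(h_rot_deg, v_rot_deg, focal_px):
    """P'Py = OF * tan(horizontal rotation angle) and
    P'Px = OF * tan(vertical rotation angle), with OF in pixel units
    so the results are pixel offsets from the picture center O."""
    d1 = focal_px * math.tan(math.radians(h_rot_deg))  # first relative distance
    d2 = focal_px * math.tan(math.radians(v_rot_deg))  # second relative distance
    return d1, d2
```

Compared with the field-angle variant, only one saved parameter (the focal length) is needed per magnification.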
In yet another embodiment, in the panoramic monitoring state before the tracking target is determined, the pan-tilt camera usually adopts a fixed magnification, such as the minimum magnification, to ensure that the monitoring view is maximized. On this basis, when the reference position is the center of the tracking image, the field angle and the lens resolution at the minimum magnification can be acquired in advance, the field angle equally divided into a plurality of preset unit angles, the lens resolution equally divided into a plurality of preset unit lengths, and the correspondence between the preset unit angles and the preset unit lengths established and stored.
In this case, step S203 may include the following sub-steps:
calculating the number of preset unit angles included in the rotation angle;
and determining the current position of the tracking target in the background image according to the number and the preset unit length corresponding to the preset unit angle.
If the magnification adopted by the pan-tilt camera is not necessarily the same as the fixed magnification of the panoramic monitoring state before the tracking target is determined, then for each magnification the field angle and the lens resolution at that magnification are obtained in advance, the field angle is equally divided into a plurality of preset unit angles, the lens resolution is equally divided into a plurality of preset unit lengths, and the correspondence between the preset unit angles and the preset unit lengths is established and stored. In that case, the current position of the tracking target in the background image can be determined in the manner described in the above sub-steps, provided only that the current magnification of the pan-tilt camera when the tracking target is determined is first identified.
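The unit-angle scheme can be sketched as below (function names are illustrative). Note that this equal-division lookup linearizes the tangent relationship derived earlier, trading some accuracy near the picture edges for avoiding per-frame trigonometry.

```python
def build_unit_mapping(fov_deg, res_px, n_units):
    """Equally divide the field angle into n preset unit angles and the lens
    resolution into n preset unit lengths; return (unit_angle, unit_length)."""
    return fov_deg / n_units, res_px / n_units

def position_from_rotation(rot_deg, unit_angle, unit_length):
    """Count the number of whole preset unit angles contained in the rotation
    angle and multiply by the corresponding preset unit length."""
    n = int(rot_deg / unit_angle)  # how many unit angles the rotation spans
    return n * unit_length
```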
Alternatively, in this embodiment, since there is a certain correspondence between the rotation angle of the pan-tilt camera and the parameters of its motor (number of rotations, running time, etc.), the rotation angle of the pan-tilt camera can also be expressed through the motor parameters. In this case, the position of the tracking target in the background image can be determined directly from the parameters of the motor of the pan-tilt camera.

Optionally, there may be multiple ways of identifying the current position in the background image, which is not limited in this embodiment. For example, an identification graphic of the tracking target, which may be a box, a dot, a triangle, or the like, may be superimposed on the background image at the current position determined in step S203. In a specific implementation, the zoomed tracking image may also be displayed directly as the identification graphic at the current position in the background image, identifying the current position of the tracking target in a picture-in-picture manner. The identification graphic may be highlighted in a bright color. In the above manner, the current position of the tracking target may also be identified continuously in the background image, so as to display the motion trajectory of the tracking target in the background image.
Optionally, when the determined current position of the tracking target in the background image exceeds the boundary of the background image, the manner of identifying the current position of the tracking target in the background image includes identifying the orientation of the tracking target in the background image relative to the center of the screen of the background image.
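A minimal sketch of such out-of-bounds identification follows, assuming pixel coordinates with the top-left vertex as origin; the function name and the orientation strings are illustrative, not prescribed by this disclosure.

```python
def offscreen_orientation(x, y, h_res, v_res):
    """When the computed position (x, y) falls outside the background image,
    report the orientation of the target relative to the picture, e.g.
    'upper-left'; an empty string means the target is still inside."""
    vert = "upper" if y < 0 else "lower" if y >= v_res else ""
    horiz = "left" if x < 0 else "right" if x >= h_res else ""
    return "-".join(part for part in (vert, horiz) if part)
```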
In addition, in some embodiments of the present invention, during the tracking shooting process, the user may restore the pan-tilt camera to the panoramic monitoring state through a preset user instruction. That is, while tracking shooting is performed on the tracking target, when the monitoring apparatus 100 receives such a user instruction, the pan-tilt camera is controlled to return to the state it was in when the background image was captured. Specifically, the magnification and the shooting angle used for the background image may be restored so that a tracking target can be re-determined in the same panoramic area (i.e., the panoramic area displayed in the background image); when a new tracking target is determined, it is tracked and monitored by the target monitoring method provided in this embodiment.
Through the above process, the position of the tracking target in the panoramic area can be displayed through the background image, while the local details of the tracking target are displayed through the tracking image. Compared with the existing gun-ball linkage scheme (a fixed bullet camera linked with a dome camera), the target monitoring method provided by the embodiment of the invention has at least the following beneficial effects: hardware and software costs are saved; the installation space is more free; and no debugging is needed after installation is finished.
As shown in fig. 5, an object monitoring apparatus 200 according to an embodiment of the present invention is provided, where the object monitoring apparatus 200 includes a panoramic monitoring module 210, a tracking module 220, and a processing module 230.
The panoramic monitoring module 210 is configured to determine a tracking target in a current monitoring picture of the pan/tilt camera, and store the current monitoring picture as a background image.
In the present embodiment, the description of the panoramic monitoring module 210 may specifically refer to the detailed description of step S201 shown in fig. 2, that is, step S201 may be performed by the panoramic monitoring module 210.
The tracking module 220 is configured to control the pan-tilt camera to perform tracking shooting on the tracking target, so that the tracking target is located at a reference position of the shot tracking image.
In this embodiment, the description of the tracking module 220 may specifically refer to the detailed description of step S202 shown in fig. 2, that is, step S202 may be executed by the tracking module 220.
The processing module 230 is configured to, during a tracking shooting process, determine a current position of the tracking target in the background image according to the current rotation parameter of the pan-tilt camera, and identify the current position in the background image.
In this embodiment, the description of the processing module 230 may specifically refer to the detailed description of step S203 shown in fig. 2, that is, step S203 may be executed by the processing module 230.
In summary, embodiments of the present invention provide a target monitoring method and device. The target monitoring method includes: determining a tracking target in a current monitoring picture of a pan-tilt camera and saving the current monitoring picture as a background image; controlling the pan-tilt camera to perform tracking shooting on the tracking target so that the tracking target is located at a reference position of the captured tracking image; determining the current position of the tracking target in the background image according to the rotation parameters of the pan-tilt camera during tracking shooting and identifying that position in the background image; and sending the background image, in which the current position of the tracking target is identified, together with the tracking image to a display terminal for display. In this way, the position of the tracking target in the panoramic area and the local details of the tracking target are acquired simultaneously with a single pan-tilt camera, which saves hardware and software costs, leaves the installation space more free, and requires no debugging after installation.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus and method embodiments described above are illustrative only, as the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, an electronic device, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes. It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing description is of selected embodiments of the present invention only, and is not intended to limit the present invention, which may be modified and varied by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An object monitoring method, characterized in that the object monitoring method comprises:
determining a tracking target in a current monitoring picture of a pan-tilt camera, and storing the current monitoring picture as a background image;
controlling the pan-tilt camera to perform tracking shooting on the tracking target, so that the tracking target is located at the reference position of the shot tracking image;
and in the process of tracking shooting, determining the current position of the tracking target in the background image according to the current rotation parameters of the pan-tilt camera, and identifying the current position in the background image.
2. The method for monitoring the target according to claim 1, wherein the determining the current position of the tracking target in the background image according to the current rotation parameters of the pan-tilt camera comprises:
and determining the current position of the tracking target in the background image according to the current rotation angle of the pan-tilt camera.
3. The object monitoring method according to claim 2, wherein the reference position is a center of the tracking image, the object monitoring method further comprising:
when the tracking target is determined, saving the current field angle and lens resolution of the pan-tilt camera;
the determining the current position of the tracking target in the background image according to the current rotation angle of the pan-tilt camera includes:
obtaining a relative position relation between the current position of the tracking target in the background image and a reference point of the background image according to the relative relation between the rotation angle and the field angle and the lens resolution;
and determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relationship.
4. The object monitoring method according to claim 3, wherein the rotation angle includes a horizontal rotation angle and a vertical rotation angle, the field angles include a horizontal field angle and a vertical field angle, and the lens resolution includes a horizontal resolution and a vertical resolution;
obtaining a relative position relationship between a current position of the tracking target in the background image and a reference point of the background image according to the relative relationship between the rotation angle and the field angle and the lens resolution, including:
calculating a ratio of the tangent value of the horizontal rotation angle to the tangent value of the horizontal field angle, and taking the product of the obtained ratio and the horizontal resolution as a first relative distance between the current position and the picture center of the background image in the horizontal direction;
calculating a ratio of the tangent value of the vertical rotation angle to the tangent value of the vertical field angle, and taking a product of the obtained ratio and the vertical resolution as a second relative distance between the current position of the tracking target in the background image and the center of the picture in the vertical direction;
determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relationship, including:
obtaining the current position of the tracking target in the horizontal direction of the background image according to the position of the picture center in the horizontal direction of the background image and the first relative distance;
and obtaining the current position of the tracking target in the vertical direction of the background image according to the position of the picture center in the vertical direction of the background image and the second relative distance.
5. The object monitoring method according to claim 2, wherein the reference position is a center of the tracking image, the object monitoring method further comprising:
when the tracking target is determined, saving the current focal length of the pan-tilt camera;
the obtaining of the current position of the tracking target in the background image according to the rotation angle includes:
obtaining the relative position relation between the current position of the tracking target in the background image and a reference point of the background image according to the focal length and the rotation angle;
and determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relationship.
6. The object monitoring method according to claim 5, wherein the rotation angle includes a horizontal rotation angle and a vertical rotation angle;
the obtaining of the relative position relationship between the current position of the tracking target in the background image and the reference point of the background image according to the focal length and the rotation angle includes:
calculating the product of the tangent value of the horizontal rotation angle and the focal length, and taking the obtained product as a first relative distance between the current position of the tracking target in the background image and the picture center of the background image in the horizontal direction;
calculating the product of the tangent value of the vertical rotation angle and the focal length, and taking the product as a second relative distance between the current position of the tracking target in the background image and the center of the picture in the vertical direction;
determining the current position of the tracking target in the background image according to the position of the reference point in the background image and the relative position relationship, including:
determining the current position of the tracking target in the horizontal direction of the background image according to the position of the picture center in the horizontal direction of the background image and the first relative distance;
and determining the current position of the tracking target in the vertical direction of the background image according to the position of the picture center in the vertical direction of the background image and the second relative distance.
7. The object monitoring method according to claim 2, wherein the reference position is a center of the tracking image;
the determining the current position of the tracking target in the background image according to the current rotation angle of the pan-tilt camera includes:
calculating the number of preset unit angles included in the rotation angle;
and determining the position of the tracking target in the background image when the tracking target is tracked and shot according to the number and the preset unit length corresponding to the preset unit angle.
8. The object monitoring method according to any one of claims 1 to 7, wherein the identifying the current location in the background image comprises:
and when the current position exceeds the boundary of the background image, identifying the orientation of the current position in the background image.
9. The object monitoring method according to any one of claims 1 to 7, further comprising:
and in the tracking shooting process, when a preset user instruction is received, controlling the pan-tilt camera to recover to the state when the tracking target is determined.
10. An object monitoring device, comprising:
the panoramic monitoring module is used for determining a tracking target in a current monitoring picture of the pan-tilt camera and storing the current monitoring picture as a background image;
the tracking module is used for controlling the pan-tilt camera to perform tracking shooting on the tracking target so that the tracking target is positioned at the reference position of the shot tracking image;
and the processing module is used for determining the current position of the tracking target in the background image according to the current rotation parameters of the pan-tilt camera in the tracking shooting process, and identifying the current position in the background image.
CN201811221309.8A 2018-10-19 2018-10-19 Target monitoring method and device Active CN111093050B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811221309.8A CN111093050B (en) 2018-10-19 2018-10-19 Target monitoring method and device


Publications (2)

Publication Number Publication Date
CN111093050A true CN111093050A (en) 2020-05-01
CN111093050B CN111093050B (en) 2021-03-09

Family

ID=70391359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811221309.8A Active CN111093050B (en) 2018-10-19 2018-10-19 Target monitoring method and device

Country Status (1)

Country Link
CN (1) CN111093050B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111739072A (en) * 2020-06-22 2020-10-02 浙江大华技术股份有限公司 Pixel matching method and device, storage medium and electronic device
CN111768433A (en) * 2020-06-30 2020-10-13 杭州海康威视数字技术股份有限公司 Method and device for realizing tracking of moving target and electronic equipment
CN112055158A (en) * 2020-10-16 2020-12-08 苏州科达科技股份有限公司 Target tracking method, monitoring device, storage medium and system
CN112351208A (en) * 2020-11-03 2021-02-09 中冶赛迪重庆信息技术有限公司 Automatic tracking method, system, equipment and medium for loading and unloading videos of unmanned vehicles
CN113470083A (en) * 2021-07-27 2021-10-01 浙江大华技术股份有限公司 Panoramic tracking method, panoramic monitoring and tracking device and electronic equipment
WO2022037215A1 (en) * 2020-08-21 2022-02-24 海信视像科技股份有限公司 Camera, display device and camera control method
CN114119651A (en) * 2021-11-30 2022-03-01 重庆紫光华山智安科技有限公司 Target tracking method, system, device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103034247A (en) * 2012-12-04 2013-04-10 浙江天地人科技有限公司 Controlling method and controlling device for remote monitoring system
US20130250087A1 (en) * 2012-03-23 2013-09-26 Peter A. Smith Pre-processor imaging system and method for remotely capturing iris images
CN103826103A (en) * 2014-02-27 2014-05-28 浙江宇视科技有限公司 Cruise control method for tripod head video camera
CN104184932A (en) * 2013-05-20 2014-12-03 浙江大华技术股份有限公司 Spherical camera control method and device thereof
CN104616322A (en) * 2015-02-10 2015-05-13 山东省科学院海洋仪器仪表研究所 Onboard infrared target image identifying and tracking method and device
US20170230651A1 (en) * 2010-08-26 2017-08-10 Blast Motion Inc. Intelligent motion capture element
CN108574825A (en) * 2017-03-10 2018-09-25 华为技术有限公司 A kind of method of adjustment and device of monopod video camera

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170230651A1 (en) * 2010-08-26 2017-08-10 Blast Motion Inc. Intelligent motion capture element
US20130250087A1 (en) * 2012-03-23 2013-09-26 Peter A. Smith Pre-processor imaging system and method for remotely capturing iris images
CN103034247A (en) * 2012-12-04 2013-04-10 浙江天地人科技有限公司 Controlling method and controlling device for remote monitoring system
CN104184932A (en) * 2013-05-20 2014-12-03 浙江大华技术股份有限公司 Spherical camera control method and device thereof
CN103826103A (en) * 2014-02-27 2014-05-28 浙江宇视科技有限公司 Cruise control method for tripod head video camera
CN104616322A (en) * 2015-02-10 2015-05-13 山东省科学院海洋仪器仪表研究所 Onboard infrared target image identifying and tracking method and device
CN108574825A (en) * 2017-03-10 2018-09-25 华为技术有限公司 A kind of method of adjustment and device of monopod video camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
侯红娜 (Hou Hongna): "Design and Implementation of a Pan-Tilt Camera Target Tracking Algorithm Based on Real-Time Shopping-Mall Surveillance Video", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111739072A (en) * 2020-06-22 2020-10-02 浙江大华技术股份有限公司 Pixel matching method and device, storage medium and electronic device
CN111768433A (en) * 2020-06-30 2020-10-13 杭州海康威视数字技术股份有限公司 Method and device for realizing tracking of moving target and electronic equipment
CN111768433B (en) * 2020-06-30 2024-05-24 杭州海康威视数字技术股份有限公司 Method and device for realizing tracking of moving target and electronic equipment
WO2022037215A1 (en) * 2020-08-21 2022-02-24 海信视像科技股份有限公司 Camera, display device and camera control method
CN112055158A (en) * 2020-10-16 2020-12-08 苏州科达科技股份有限公司 Target tracking method, monitoring device, storage medium and system
CN112055158B (en) * 2020-10-16 2022-02-22 苏州科达科技股份有限公司 Target tracking method, monitoring device, storage medium and system
CN112351208A (en) * 2020-11-03 2021-02-09 中冶赛迪重庆信息技术有限公司 Automatic tracking method, system, equipment and medium for loading and unloading videos of unmanned vehicles
CN113470083A (en) * 2021-07-27 2021-10-01 浙江大华技术股份有限公司 Panoramic tracking method, panoramic monitoring and tracking device and electronic equipment
CN114119651A (en) * 2021-11-30 2022-03-01 重庆紫光华山智安科技有限公司 Target tracking method, system, device and storage medium

Also Published As

Publication number Publication date
CN111093050B (en) 2021-03-09

Similar Documents

Publication Publication Date Title
CN111093050B (en) Target monitoring method and device
CN110278382B (en) Focusing method, device, electronic equipment and storage medium
CN109587477B (en) Image acquisition equipment selection method and device, electronic equipment and storage medium
CN111750820B (en) Image positioning method and system
US20180197324A1 (en) Virtual viewpoint setting apparatus, setting method, and storage medium
JP5613041B2 (en) Camera device, image processing system, and image processing method
CN108574825B (en) Method and device for adjusting pan-tilt camera
JP2011239361A (en) System and method for ar navigation and difference extraction for repeated photographing, and program thereof
CN107517360B (en) Image area shielding method and device
CN111737518A (en) Image display method and device based on three-dimensional scene model and electronic equipment
CN106408551A (en) Monitoring device controlling method and device
WO2021168804A1 (en) Image processing method, image processing apparatus and image processing system
US20120069223A1 (en) Camera device with rotary base
CN111652937A (en) Vehicle-mounted camera calibration method and device
CN105100577A (en) Imaging processing method and device
CN112866627A (en) Three-dimensional video monitoring method and related equipment
CN111279393A (en) Camera calibration method, device, equipment and storage medium
WO2022213833A1 (en) Method and apparatus for image synchronization, electronic device, and computer storage medium
CN111586383B (en) Method and device for projection and projection equipment
Lin et al. Large-area, multilayered, and high-resolution visual monitoring using a dual-camera system
CN112203066A (en) Target tracking dynamic projection method and dynamic projection equipment
CN112073633B (en) Data processing method and system
CN110741633A (en) Image processing method, electronic device, and computer-readable storage medium
CN117255247B (en) Method and device for linkage of panoramic camera and detail dome camera
KR102637344B1 (en) Device and method to perform object recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant