CN115278014B - Target tracking method, system, computer equipment and readable medium - Google Patents

Target tracking method, system, computer equipment and readable medium

Info

Publication number
CN115278014B
Authority
CN
China
Prior art keywords
target
image
tracked
tracking
shooting device
Prior art date
Legal status
Active
Application number
CN202210871722.9A
Other languages
Chinese (zh)
Other versions
CN115278014A (en)
Inventor
刘仕川
何凯
Current Assignee
Chongqing Unisinsight Technology Co Ltd
Original Assignee
Chongqing Unisinsight Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Unisinsight Technology Co Ltd
Priority to CN202210871722.9A
Publication of CN115278014A
Application granted
Publication of CN115278014B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

Abstract

The application provides a target tracking method, a target tracking system, a computer device and a readable medium. The target tracking method comprises the following steps: acquiring a first image, detecting and identifying the first image, and, when a target triggering a preset alarm event is detected, taking that target as the target to be tracked; sending the attribute information of the target to be tracked to a second image shooting device, and adjusting the second image shooting device based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device; and finally, tracking the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image shooting device until the tracking is finished. If the first image is captured by a gun camera and the second image shooting device is a dome camera, then compared with the conventional dome camera linkage control method, this method improves the response speed and tracking accuracy of the gun-ball linkage interaction and also increases the stability of the target tracking process.

Description

Target tracking method, system, computer equipment and readable medium
Technical Field
The present application relates to the field of computer vision, and in particular, to a target tracking method, a target tracking system, a computer device, and a readable medium.
Background
With the rapid development of the video surveillance field, a single gun-shaped camera (gun camera for short) or a single spherical camera (dome camera for short) can no longer meet increasingly complex monitoring scenarios. A gun camera has a large field of view and a wide monitoring range; a dome camera has a small field of view and a narrow monitoring range, but it can acquire detailed information about a monitored target in all directions by controlling its PTZ (Pan/Tilt/Zoom), that is, by panning left and right, tilting up and down, and zooming.
To cope with increasingly complex monitoring scenarios, linking a gun camera with a dome camera (gun-ball linkage for short) is becoming more and more popular. In a gun-ball linkage target tracking scheme, an intelligent algorithm runs on the gun camera to identify and track a moving target; the gun camera maps the target coordinates detected by the algorithm into the image coordinate system of the dome camera; the dome camera rotates its PTZ to the area where the target is located; and whether the two detections correspond to the same target is judged by comparing the feature similarity of the gun camera target image and the dome camera target image. After the features are successfully matched, the dome camera locks onto the target and controls its PTZ to track the target continuously according to a detection and tracking algorithm running on the dome camera.
However, the existing scheme has two drawbacks. First, for fast-moving targets such as motor vehicles or non-motor vehicles, the sequence of feature comparison, pan-tilt rotation, zooming and focusing generally takes a long time, so the target is often not tracked at all or is lost in the initial stage. Second, after the dome camera locks the tracked target, the pan-tilt is rotated according to the target's coordinates; pan-tilt jitter easily occurs, causing the tracked target image to shake, the track to diverge, and the target to be lost.
Disclosure of Invention
In view of the above drawbacks of the prior art, an object of the present application is to provide a target tracking method, system, computer device and readable medium for solving the problems of mis-tracking and target loss that frequently occur during gun-ball linkage in the prior art, as well as the jitter of the dome camera pan-tilt during tracking.
To achieve the above and other related objects, the present application provides a target tracking method, including the steps of:
acquiring a first image, wherein the first image comprises an image containing one or more targets, which is obtained by shooting a monitoring area determined in advance or in real time by using a first image shooting device;
detecting and identifying the first image, and taking a target triggering a preset alarm event as a target to be tracked when the target triggering the preset alarm event is detected in the first image;
the attribute information of the target to be tracked is sent to a second image shooting device, and the second image shooting device is adjusted based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device;
and tracking the target to be tracked according to the attribute information of the target to be tracked and the second image shooting device after adjustment is completed until the tracking is finished.
In an embodiment of the present application, the process of sending the attribute information of the target to be tracked to a second image capturing device and adjusting the second image capturing device based on the coordinates of the target to be tracked in the first image includes:
recording the coordinate position of the target to be tracked in the first image as a first coordinate, mapping the first coordinate into the image coordinate system of the second image shooting device to obtain the coordinate of the target to be tracked in the second image shooting device, and recording that coordinate as a second coordinate;
and adjusting the position of the second image shooting device by using a pan-tilt until the second coordinate is located in the preset area of the second image shooting device.
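As a minimal illustration of the coordinate mapping and pan-tilt adjustment just described, the following Python sketch assumes that the gun-ball calibration yields a 3x3 homography H between the two image planes and that the preset area is a central region of the second image; the function names and the center_fraction parameter are illustrative assumptions, not part of the claimed method.

```python
import numpy as np

def map_first_to_second(first_xy, H):
    """Map the first coordinate (a pixel in the first image) into the
    image coordinate system of the second image shooting device,
    assuming the pre-calibrated transformation matrix is a 3x3
    homography H."""
    x, y = first_xy
    p = H @ np.array([x, y, 1.0])           # homogeneous transform
    return p[0] / p[2], p[1] / p[2]         # second coordinate (pixels)

def offset_from_preset_area(second_xy, frame_size, center_fraction=0.1):
    """Pixel offset of the second coordinate from the image center, plus
    a flag telling whether it already lies inside the assumed central
    preset area; the pan-tilt is driven until the flag becomes True."""
    w, h = frame_size
    dx = second_xy[0] - w / 2.0
    dy = second_xy[1] - h / 2.0
    inside = abs(dx) <= center_fraction * w and abs(dy) <= center_fraction * h
    return dx, dy, inside
```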
In an embodiment of the present application, after adjusting the second image capturing device, the method further includes:
zooming and focusing the second image shooting device based on the size information of the target to be tracked in the first image shooting device, so that the target to be tracked appears with higher clarity in the second image shooting device than in the first image shooting device; the attribute information of the target to be tracked comprises the size information of the target to be tracked in the first image shooting device.
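The preset zoom rule itself is not spelled out here; as one plausible reading, the sketch below picks an initial magnification so that a target of the reported size would fill a chosen fraction of the second image. The desired_fill value and the 40x upper bound are assumptions for illustration only.

```python
def initial_zoom_for_target(target_w, target_h, frame_w, frame_h,
                            desired_fill=0.3, max_zoom=40.0):
    """Choose an initial zoom magnification so the target would occupy
    roughly `desired_fill` of the second image's width or height.
    This proportional rule and its limits are assumed, not the
    concrete preset rule of the embodiment."""
    fill = max(target_w / float(frame_w), target_h / float(frame_h))
    if fill <= 0.0:
        return 1.0                          # degenerate size: keep 1x
    return max(1.0, min(desired_fill / fill, max_zoom))
```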
In an embodiment of the present application, the process of tracking the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image capturing device includes:
acquiring an image shot by the second image shooting device after the adjustment is completed, and recording the image as a second image;
identifying the second image, screening out targets which belong to the same type as the target to be tracked, and marking the targets as candidate targets;
obtaining the mapped coordinates of the target to be tracked in the second image shooting device and the actual coordinates of each candidate target in the second image shooting device, and calculating the distance between the mapped coordinates of the target to be tracked and the actual coordinates of each candidate target;
sorting the candidate targets by the calculated distances, selecting candidate targets in order of increasing distance, comparing the features of each selected candidate target with those of the target to be tracked, and calculating their feature similarity;
comparing the calculated feature similarity with a preset threshold, and selecting a corresponding candidate target when the calculated feature similarity is larger than the preset threshold;
tracking the selected candidate target by using the adjusted second image shooting device;
the attribute information of the target to be tracked comprises the type of the target to be tracked and the characteristics of the target to be tracked.
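The candidate screening and matching steps above can be sketched as follows; the detection layout (a dict with 'xy', 'type' and 'feat' keys), the cosine similarity measure and the 0.8 threshold are illustrative assumptions, since the text only requires a distance sort followed by a thresholded feature comparison.

```python
import numpy as np

def pick_tracked_target(candidates, mapped_xy, query_type, query_feat,
                        sim_threshold=0.8):
    """Keep candidates of the same type as the target to be tracked,
    sort them by distance to the mapped coordinate, and compare
    features closest-first until the similarity exceeds the preset
    threshold; return the matched candidate or None."""
    same_type = [c for c in candidates if c['type'] == query_type]
    same_type.sort(key=lambda c: np.hypot(c['xy'][0] - mapped_xy[0],
                                          c['xy'][1] - mapped_xy[1]))
    q = np.asarray(query_feat, dtype=float)
    q /= (np.linalg.norm(q) + 1e-12)
    for cand in same_type:                   # closest candidates first
        f = np.asarray(cand['feat'], dtype=float)
        f /= (np.linalg.norm(f) + 1e-12)
        if float(q @ f) > sim_threshold:     # cosine similarity
            return cand                      # lock onto this candidate
    return None                              # no candidate matched
```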
In an embodiment of the present application, the process of tracking the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image capturing device includes:
acquiring the time for tracking the target to be tracked, and recording the time as target tracking time;
comparing the target tracking time with a first preset time, and ending the tracking of the target to be tracked after the target tracking time exceeds the first preset time; or alternatively,
checking whether the selected candidate target appears in the display area of the second image shooting device, and ending the tracking of the target to be tracked if the selected candidate target does not appear in the display area of the second image shooting device.
In an embodiment of the present application, the process of identifying the second image and screening out the target belonging to the same type as the target to be tracked includes:
acquiring an attribute analysis operator, wherein the attribute analysis operator is preconfigured with attribute information;
identifying the second image based on the pre-configured attribute information in the attribute analysis operator, and acquiring targets existing in the second image and the type of each target;
and screening out targets which belong to the same type as the target to be tracked from the second image based on the type of the target to be tracked.
In an embodiment of the present application, the process of detecting and identifying the first image includes:
acquiring an event analysis operator, wherein the event analysis operator is preconfigured with an event judgment rule;
analyzing one or more targets in the first image based on the event analysis operator, and determining whether at least one of the preset alarm events exists for the one or more targets in the first image; wherein the preset alarm events comprise: area intrusion events, area entry events, area exit events, tripwire crossing detection events, rapid movement events, and loitering events;
if one or more targets in the first image have at least one of preset alarm events, marking that the targets in the first image trigger the preset alarm event;
if one or more targets in the first image do not have any one of the preset alarm events, marking that the targets in the first image do not trigger the preset alarm event.
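For illustration, one of the listed alarm events (area intrusion) can be expressed as a point-in-polygon test on a target's reference point; the concrete judgment rules are whatever is preconfigured in the event analysis operator, so the check below is only an assumed example.

```python
def triggers_area_intrusion(target_xy, zone_polygon):
    """Assumed example of an area-intrusion rule: ray-casting
    point-in-polygon test on the target's reference point (e.g. the
    bottom center of its bounding box).  `zone_polygon` is a list of
    (x, y) vertices of the preconfigured alarm zone."""
    x, y = target_xy
    inside = False
    n = len(zone_polygon)
    for i in range(n):
        x1, y1 = zone_polygon[i]
        x2, y2 = zone_polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):             # edge straddles the scanline
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```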
The application also provides a target tracking system, which comprises:
the image acquisition module is used for acquiring a first image, wherein the first image comprises an image containing one or more targets, which is obtained by shooting a monitoring area determined in advance or in real time by using a first image shooting device;
the target identification module is used for detecting and identifying the first image, and taking a target triggering a preset alarm event as a target to be tracked when the target triggering the preset alarm event is detected in the first image;
The image adjustment module is used for sending the attribute information of the target to be tracked to a second image shooting device, and adjusting the second image shooting device based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device;
and the target tracking module is used for tracking the target to be tracked according to the attribute information of the target to be tracked and the second image shooting device after adjustment is completed until the tracking is finished.
The present application also provides a computer device comprising:
a processor; and
a computer readable medium storing instructions that, when executed by the processor, cause the apparatus to perform a method as described in any one of the above.
The application also provides a computer readable medium having instructions stored thereon, the instructions being loaded by a processor and performing a method as described in any of the above.
As described above, the present application provides a target tracking method, system, computer device, and readable medium, which have the following beneficial effects:
The method first acquires a first image, then detects and identifies the first image, and when a target triggering a preset alarm event is detected in the first image, takes that target as the target to be tracked; the attribute information of the target to be tracked is sent to a second image shooting device, and the second image shooting device is adjusted based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device; finally, the target to be tracked is tracked according to its attribute information and the adjusted second image shooting device until the tracking ends. The first image is an image containing one or more targets, obtained by using a first image shooting device to capture a monitoring area determined in advance or in real time. In the present application, the first image shooting device may be a gun camera and the second image shooting device may be a dome camera. Therefore, compared with the conventional dome camera linkage control method, the scheme provided by the application improves the response speed and tracking accuracy of the gun-ball linkage interaction, is particularly suitable for detecting and tracking fast-moving targets such as motor vehicles and non-motor vehicles, and alleviates the mis-tracking and target-loss problems that frequently occur during gun-ball linkage. In addition, the gun-ball linkage pan-tilt control algorithm provided by the application alleviates the jitter of the dome camera pan-tilt during tracking and increases the stability of the target tracking process.
Drawings
FIG. 1 is a schematic diagram of an exemplary system architecture to which the teachings of one or more embodiments of the present application may be applied;
FIG. 2 is a flowchart of a target tracking method according to an embodiment of the present application;
FIG. 3 is a flowchart of a target tracking method according to another embodiment of the present application;
FIG. 4 is a timing diagram of a gun camera and a dome camera according to an embodiment of the present application;
FIG. 5 is a schematic hardware architecture of a target tracking system according to an embodiment of the present application;
FIG. 6 is a schematic hardware architecture of a target tracking system according to another embodiment of the present application;
FIG. 7 is a schematic diagram of a hardware architecture of a computer device suitable for implementing one or more embodiments of the application.
Detailed Description
Other advantages and effects of the present application will become readily apparent to those skilled in the art from the disclosure in this specification, which describes the embodiments of the present application with reference to specific examples. The application may also be implemented or applied through other different embodiments, and the details in this specification may be modified or changed based on different viewpoints and applications without departing from the spirit of the present application. It should be noted that the following embodiments and the features in the embodiments may be combined with each other without conflict.
It should be noted that the illustrations provided in the following embodiments merely illustrate the basic concept of the present application in a schematic way. The drawings show only the components related to the present application and are not drawn according to the number, shape and size of the components in actual implementation; the form, number and proportion of the components in actual implementation may be changed arbitrarily, and the component layout may be more complicated.
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which the teachings of one or more embodiments of the present application may be applied. As shown in fig. 1, system architecture 100 may include a terminal device 110, a network 120, and a server 130. Terminal device 110 may include various electronic devices such as smart phones, tablet computers, notebook computers, desktop computers, and the like. The server 130 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud computing services. Network 120 may be a communication medium of various connection types capable of providing a communication link between terminal device 110 and server 130, and may be, for example, a wired communication link or a wireless communication link.
The system architecture in embodiments of the present application may have any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 130 may be a server group composed of a plurality of server devices. In addition, the technical solution provided in the embodiment of the present application may be applied to the terminal device 110, or may be applied to the server 130, or may be implemented by the terminal device 110 and the server 130 together, which is not limited in particular.
In one embodiment of the present application, the terminal device 110 or the server 130 may acquire a first image, then detect and identify the first image, and, when a target triggering a preset alarm event is detected in the first image, take that target as the target to be tracked; the attribute information of the target to be tracked is sent to a second image shooting device, and the second image shooting device is adjusted based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device; finally, the target to be tracked is tracked according to its attribute information and the adjusted second image shooting device until the tracking ends. The first image is an image containing one or more targets, obtained by using a first image shooting device to capture a monitoring area determined in advance or in real time. The first image shooting device may be a gun camera and the second image shooting device may be a dome camera. Therefore, executing the target tracking method on the terminal device 110 or the server 130 improves the response speed and tracking accuracy of the gun-ball linkage interaction, is particularly suitable for detecting and tracking fast-moving targets such as motor vehicles and non-motor vehicles, and alleviates the mis-tracking and target-loss problems that frequently occur during gun-ball linkage. In addition, the gun-ball linkage pan-tilt control algorithm provided by the application alleviates the jitter of the dome camera pan-tilt during tracking and increases the stability of the target tracking process.
The foregoing describes the contents of an exemplary system architecture to which the present solution is applied, and the following describes the target tracking method of the present application.
Fig. 2 is a schematic flow chart of a target tracking method according to an embodiment of the application. Specifically, in an exemplary embodiment, as shown in fig. 2, the present embodiment provides a target tracking method, which includes the steps of:
s210, acquiring a first image, wherein the first image comprises an image containing one or more targets, which is obtained by shooting a monitoring area determined in advance or in real time by using a first image shooting device;
s220, detecting and identifying the first image, and taking a target triggering a preset alarm event as a target to be tracked when the target triggering the preset alarm event is detected in the first image. Specifically, the process of detecting and identifying the first image in this embodiment may include: acquiring an event analysis operator, wherein the event analysis operator is preconfigured with an event judgment rule; analyzing one or more targets in the first image based on the event analysis operator, and determining whether at least one of preset alarm events exists in the one or more targets in the first image; wherein, the preset alarm event comprises: area intrusion events, area entry events, area exit events, wire-mixing detection events, rapid movement events, personnel wander events; if one or more targets in the first image have at least one of preset alarm events, marking that the targets in the first image trigger the preset alarm event; if one or more targets in the first image do not have any one of the preset alarm events, marking that the targets in the first image do not trigger the preset alarm event.
S230, sending the attribute information of the target to be tracked to a second image shooting device, and adjusting the second image shooting device based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device.
S240, tracking the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image shooting device until the tracking is finished.
In this embodiment, the first image shooting device may be a gun camera or a dome camera, and the second image shooting device may be a gun camera or a dome camera. For example, in one example, the first image shooting device may be a gun camera and the second image shooting device may be a dome camera. In another example, the first image shooting device may be a gun camera and the second image shooting device may be another gun camera. In yet another example, the first image shooting device may be a dome camera and the second image shooting device may be a gun camera. In yet another example, the first image shooting device may be a dome camera and the second image shooting device may be another dome camera. If the first image shooting device is a gun camera and the second image shooting device is a dome camera, then compared with the conventional dome camera linkage control method, the scheme provided by this embodiment improves the response speed and tracking accuracy of the gun-ball linkage interaction, is particularly suitable for detecting and tracking fast-moving targets such as motor vehicles and non-motor vehicles, and alleviates the mis-tracking and target-loss problems that frequently occur during gun-ball linkage. In addition, this embodiment provides a gun-ball linkage pan-tilt control algorithm that alleviates the jitter of the dome camera pan-tilt during tracking and increases the stability of the target tracking process. As an example, the first image shooting device in this embodiment may be a gun camera with a field-of-view range of 360 degrees, and the second image shooting device may be a dome camera that supports 360-degree rotation and whose camera core supports 40x zoom. The targets in this embodiment include, but are not limited to, pedestrians, motor vehicles, non-motor vehicles, and the like.
In an exemplary embodiment, in step S230, the process of sending the attribute information of the target to be tracked to a second image shooting device and adjusting the second image shooting device based on the coordinates of the target to be tracked in the first image includes: recording the coordinate position of the target to be tracked in the first image as a first coordinate, mapping the first coordinate into the image coordinate system of the second image shooting device to obtain the coordinate of the target to be tracked in the second image shooting device, and recording it as a second coordinate; and adjusting the position of the second image shooting device by using a pan-tilt until the second coordinate is located in the preset area of the second image shooting device. As an example, the preset area of the second image shooting device in this embodiment may be set according to the actual situation; for example, the central area of the second image shooting device may be set as the preset area.
In an exemplary embodiment, after the second image shooting device is adjusted, this embodiment may further include: zooming and focusing the second image shooting device based on the size information of the target to be tracked in the first image shooting device, so that the target to be tracked appears with higher clarity in the second image shooting device than in the first image shooting device; the attribute information of the target to be tracked comprises the size information of the target to be tracked in the first image shooting device. As an example, if the first image shooting device in this embodiment is a gun camera and the second image shooting device is a dome camera, then in order to reduce the total time consumed by the coarse tracking process, after the dome camera receives the target size information sent by the gun camera, the pan-tilt rotates the dome camera while the camera core zooms and focuses the dome camera according to a preset rule and the size of the target to be tracked in the first image. In this way, a clearer image is available when matching targets in the dome camera, which improves the success rate of target feature matching.
In an exemplary embodiment, if the attribute information of the target to be tracked includes the type of the target to be tracked and the features of the target to be tracked, then in step S240 the process of tracking the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image shooting device includes: acquiring an image shot by the second image shooting device after the adjustment is completed, and recording it as a second image; identifying the second image, screening out the targets that belong to the same type as the target to be tracked, and recording them as candidate targets; obtaining the mapped coordinates of the target to be tracked in the second image shooting device and the actual coordinates of each candidate target in the second image shooting device, and calculating the distance between the mapped coordinates of the target to be tracked and the actual coordinates of each candidate target; sorting the candidate targets by the calculated distances, selecting candidate targets in order of increasing distance, comparing the features of each selected candidate target with those of the target to be tracked, and calculating their feature similarity; comparing the calculated feature similarity with a preset threshold, and selecting the corresponding candidate target when the calculated feature similarity is larger than the preset threshold; and tracking the selected candidate target by using the adjusted second image shooting device. According to the above description, when tracking the selected candidate target, this embodiment may further include: acquiring the time for which the target to be tracked has been tracked, and recording it as the target tracking time; comparing the target tracking time with a first preset time, and ending the tracking of the target to be tracked after the target tracking time exceeds the first preset time; or checking whether the selected candidate target appears in the display area of the second image shooting device, and ending the tracking of the target to be tracked if the selected candidate target does not appear in the display area of the second image shooting device.
According to the above description, in an exemplary embodiment, the process of identifying the second image and screening the object belonging to the same type as the object to be tracked includes: acquiring an attribute analysis operator, wherein the attribute analysis operator is preconfigured with attribute information; identifying the second image based on the pre-configured attribute information in the attribute analysis operator, and acquiring targets existing in the second image and the type of each target; and screening out targets which belong to the same type as the target to be tracked from the second image based on the type of the target to be tracked.
The present application also provides an embodiment, as shown in fig. 3, which provides a target tracking method applicable to a gun camera and a dome camera, comprising the following steps:
initiating a target detection and tracking request at the gun camera side, selecting a target that triggers an alarm as the target to be tracked, collecting attribute information of the target to be tracked in the gun camera, such as its coordinates, type, size and feature vector, and sending the attribute information of the target to be tracked to the dome camera through the gun camera communication control module.
After the dome camera linkage control module at the dome camera side receives the attribute information of the target to be tracked, the first coordinate of the target to be tracked is converted into the second coordinate according to the transformation matrix; the first coordinate refers to the coordinate of the target to be tracked in the gun camera, and the second coordinate refers to the coordinate of the target to be tracked mapped into the dome camera. The pan-tilt is rotated to the second coordinate, the dome camera zooms to a preset magnification according to the size of the target to be tracked, and candidate targets are screened from the image shot by the dome camera according to the type and size of the target to be tracked. The Euclidean distance between each candidate target and the second coordinate is then calculated, and the candidates are sorted by this distance; candidate targets are selected according to the sorting result and the feature-vector similarity between each candidate target and the target to be tracked is calculated; if the similarity is larger than a threshold value, the target is considered successfully matched, otherwise the similarity of the next candidate target is calculated.
In this embodiment, the gun camera algorithm module uses a detection and tracking operator to detect and track persons, motor vehicles and non-motor vehicles in the gun camera image (first image for short), and obtains attributes such as the coordinates, size, movement speed, target type and feature vector of each target in the first image (first target for short). When the event analysis operator of the gun camera algorithm module detects that a certain target has triggered an event, the gun camera algorithm module selects that target as the target to be tracked, and its attribute information is sent to the dome camera through the gun camera linkage control module. The dome camera linkage control module receives the attribute information of the target to be tracked sent by the gun camera and, based on that information, helps the dome camera quickly find the target to be tracked and lock onto it for tracking.
Specifically, the entire tracking process of the present embodiment is divided into three stages, including:
The first stage is coarse tracking. The dome camera does not yet focus on a specific target; it only rotates the pan-tilt to the approximate image area where the target is located. The dome camera receives the target attribute information sent by the gun camera. A coordinate transformation matrix is obtained by calibrating the gun-ball coordinate systems with respect to the first target coordinate (first coordinate for short). The first coordinate is mapped into the image coordinate system of the dome camera to obtain the coordinate of the target in the dome camera image coordinate system (second coordinate for short); the PTZ module of the dome camera controls the rotation of the pan-tilt so that the second coordinate moves to the central area of the dome camera image (second image), thereby obtaining a second image of the area where the target to be tracked is located. This process requires the gun camera to send the target detection and tracking attributes to the dome camera multiple times in succession in time order. During the first-stage tracking, the dome camera also needs to zoom and focus according to the size of the target in order to reduce the overall time consumed by the coarse tracking process. After the dome camera receives the target size information sent by the gun camera, the pan-tilt rotates while the camera core changes the magnification of the dome camera according to a preset rule and the size of the target in the first image. This provides a clearer image for accurately matching the target in the next stage and improves the success rate of target feature matching.
The second stage is accurate tracking. After the first-stage coarse tracking, the dome camera has already turned to the target image area. The dome camera algorithm module also detects and tracks targets of types such as persons, motor vehicles and non-motor vehicles in the current second image in real time. If multiple targets are detected in the second image, the target with the highest degree of matching with the first image is selected according to the following rules and used as the target for accurate tracking by the dome camera. Because feature comparison is computationally complex and time-consuming, it is desirable to avoid directly polling the feature similarity between every target in the second image and the target to be tracked. The application adopts the following strategy. First, the targets are screened by type to obtain a group of candidate targets of the same type. Second, the distance between the coordinates of each target in the candidate group and the second coordinate (obtained by mapping the coordinate sent by the gun camera) is calculated, and the candidates are sorted in ascending order of distance. The features of the closer candidate targets are compared with those of the target to be tracked first. If the similarity of a feature comparison is greater than the preset threshold, the dome camera is considered to have found, in the second image, the target to be tracked that was sent from the gun camera. This reduces the time spent on feature comparison. The detection and tracking algorithm running on the gun camera detects the real-time position, speed and feature vector of the tracked target in the first image, and sends attribute results such as the target's position coordinates, size, movement speed, type and feature vector to the dome camera. The dome camera continuously acquires the target's real-time position, speed, feature vector and other information for several frames, until the target is locked and tracked.
In the third stage, the dome camera performs autonomous tracking. The gun camera no longer sends attribute information such as the coordinates and features of the first target to the dome camera; after locking the tracked target, the dome camera enters an autonomous tracking mode, and the detection and tracking operator in the dome camera algorithm module detects and tracks the target in the second image in real time to obtain attribute information such as the target id, type t, coordinates and size l(x, y, w, h), and feature vector f. The algorithm module sends this information to the dome camera linkage control module; the linkage control module uses a PID control algorithm to calculate the expected rotation angle of the pan-tilt; and the PTZ control module receives the tracking instruction from the linkage control module and rotates the pan-tilt so that the tracked target is displayed at the center of the dome camera image. This algorithm alleviates the pan-tilt jitter problem during autonomous tracking. The zoom and focus can also be adjusted according to the size of the target, so that the image of the second target is stable and clear, which makes it convenient for the attribute analysis operator to perform detailed attribute analysis of the target.
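The PID computation used to derive the expected pan-tilt rotation can be sketched as follows; the embodiment names the control law but not its gains or exact form, so the positional PID below, the gain values and the one-controller-per-axis split are assumptions.

```python
class PanTiltPID:
    """Positional PID controller sketch for the autonomous tracking
    stage: the error is the pixel offset between the tracked target's
    center and the center of the second image, and the output is the
    pan (or tilt) command sent to the PTZ control module.  Gains are
    illustrative placeholders."""

    def __init__(self, kp=0.4, ki=0.02, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.prev_err = 0.0
        self.integral = 0.0

    def step(self, err, dt):
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt if dt > 0 else 0.0
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# one controller per axis; small, smoothed commands damp the pan-tilt
# jitter described above
pan_pid, tilt_pid = PanTiltPID(), PanTiltPID()
```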
According to the above description, for convenience of notation, the subscript 1 of each variable denotes the first coordinate system in which the gun camera is located, and the subscript 2 denotes the second coordinate system in which the dome camera is located. For example, l1(x, y, w, h) represents the coordinates and size of the target to be tracked in the gun camera, and l2(x, y, w, h) represents the coordinates and size of the target to be tracked mapped into the dome camera. As shown in FIG. 4, this embodiment adopts a staged tracking strategy of coarse tracking, accurate tracking and autonomous tracking, which effectively improves the tracking response speed and tracking accuracy. The implementation is described in detail in the following sub-steps.
The first stage: the detection and tracking operator of the gun camera algorithm module identifies all targets in the panoramic image and obtains each target's id1, coordinates and size l1(x, y, w, h), and type t. When the event analysis operator detects that a certain target has triggered an alarm event, that target is the object that needs attention and should be tracked by the dome camera. Together with the detection and tracking result, this is packed into R1(id, l); the gun camera linkage control module generates a tracking request for the target using its id as the main identifier and sends the tracking request instruction to the dome camera linkage control module through the network. The gun camera sends the R1(id, l) attribute information of the target to be tracked for n consecutive image frames in time order. The dome camera linkage control module parses l1(x, y, w, h) from R1. Using the transformation matrix T obtained by pre-calibration, the coordinate is transformed into the coordinate l2(x, y, w, h) in the dome camera image coordinate system. The x and y components of l2(x, y, w, h) are used as the input of the pan-tilt control module, and the w and h components are used as the initial zoom magnification. After n rounds of coarse tracking through this process, the dome camera can quickly acquire a second image of the target area. The dome camera algorithm module then detects and tracks the second image to obtain a list of targets in the second image, which serve as the candidate targets for accurate tracking in the next stage.
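The per-frame attribute packet R1 and the way its fields are split during coarse tracking can be sketched like this; the field names follow the notation above (id, l(x, y, w, h), type t, feature vector f), while the dataclass and the helper function are illustrative rather than a defined wire format.

```python
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

@dataclass
class TrackAttr:
    """Attribute packet sent from the gun camera to the dome camera:
    R1(id, l) during the n coarse-tracking frames, extended to
    R1(id, l, t, f) at frame n+1."""
    target_id: int
    box: Tuple[float, float, float, float]       # l1(x, y, w, h)
    target_type: Optional[str] = None            # t, sent at frame n+1
    feature: Optional[Sequence[float]] = None    # f, e.g. 128-d, sent once

def split_coarse_inputs(attr: TrackAttr):
    """Coarse-tracking step on the dome camera side: the mapped x, y
    drive the pan-tilt control module, while w, h set the initial zoom
    magnification (coordinate mapping itself as in the earlier sketch)."""
    x, y, w, h = attr.box
    center = (x + w / 2.0, y + h / 2.0)          # point to map and center
    size_for_zoom = (w, h)                       # drives initial magnification
    return center, size_for_zoom
```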
The second stage: the gun camera algorithm module extracts the type t and the feature vector f of the target to be tracked in the (n+1)-th frame; because extracting the feature vector is time-consuming, this operation only needs to be performed once. The feature vector is a 128-dimensional vector that captures the detailed features of the target in the image, is invariant to scaling, color, rotation and the like, and is suitable for transmission and storage. Together with the detection and tracking result, it is packed into the attribute result R1(id, l, t, f) and sent to the dome camera. The dome camera receives the attribute result R1 and parses all the attribute information id, l, t and f. l1(x, y, w, h) is transformed by the matrix T into the coordinate l2(x, y, w, h) in the dome camera image coordinate system; l2(x, y, w, h) is used as the input of the pan-tilt control module and as the input for calculating the zoom magnification. The target type t and the feature vector f are used to screen out the exact target to be tracked. The target with the highest degree of matching is screened from the candidate targets according to the attribute information and used as the input of accurate tracking. After the (n+1)-th frame, the gun camera stops sending attribute information to the dome camera, and the dome camera enters the third stage, i.e., the autonomous tracking mode.
The third stage: through the coarse tracking and accurate tracking of the first two stages, the dome camera has locked onto the tracked target; in this stage, the gun camera stops sending the dome camera the attribute information of the target in the first image. The dome camera relies on its own algorithm module to detect, track and analyze the attributes of the locked target, obtaining the result R2(id, l, t, f). The attribute analysis of the algorithm module processes the attribute result R2 and sends the data to the dome camera linkage control module. The id and l2 are used for pan-tilt control and magnification transformation, as in the second stage. The type t and the feature vector f are used to quickly retrieve the target after it has been temporarily occluded during its movement: after the tracked target is lost, the id assigned to it by the detection and tracking algorithm changes, so feature-vector matching is used to find the target with the highest feature similarity in the latest dome camera image and continue tracking it. When the target is completely lost or the tracking duration reaches the preset duration, the dome camera returns to its preset position, ready for the next round of tracking requests.
In summary, the present application provides a target tracking method that first acquires a first image, then detects and identifies the first image and, when a target triggering a preset alarm event is detected in the first image, takes that target as the target to be tracked; the attribute information of the target to be tracked is sent to a second image shooting device, and the second image shooting device is adjusted based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device; finally, the target to be tracked is tracked according to its attribute information and the adjusted second image shooting device until the tracking ends. The first image is an image containing one or more targets, obtained by using a first image shooting device to capture a monitoring area determined in advance or in real time. The first image shooting device in this method may be a gun camera and the second image shooting device may be a dome camera. Therefore, compared with the conventional dome camera linkage control method, this method improves the response speed and tracking accuracy of the gun-ball linkage interaction, is particularly suitable for detecting and tracking fast-moving targets such as motor vehicles and non-motor vehicles, and alleviates the mis-tracking and target-loss problems that frequently occur during gun-ball linkage. In addition, the gun-ball linkage pan-tilt control algorithm alleviates the jitter of the dome camera pan-tilt during tracking and improves the stability of the target tracking process.
As shown in fig. 5, the present application further provides a target tracking system, which includes:
the image acquisition module 510 is configured to acquire a first image, where the first image includes an image including one or more targets obtained by using a first image capturing device to capture a monitoring area determined in advance or in real time;
the target recognition module 520 is configured to detect and identify the first image, and when a target triggering a preset alarm event is detected in the first image, take the target triggering the preset alarm event as the target to be tracked. Specifically, the process of detecting and identifying the first image in this embodiment may include: acquiring an event analysis operator, wherein the event analysis operator is preconfigured with an event judgment rule; analyzing one or more targets in the first image based on the event analysis operator, and determining whether at least one of the preset alarm events exists for the one or more targets in the first image, wherein the preset alarm events comprise: area intrusion events, area entry events, area exit events, tripwire crossing detection events, rapid movement events, and loitering events; if at least one of the preset alarm events exists for one or more targets in the first image, marking that the targets in the first image trigger the preset alarm event; if none of the preset alarm events exists for the one or more targets in the first image, marking that the targets in the first image do not trigger the preset alarm event.
The image adjustment module 530 is configured to send the attribute information of the target to be tracked to a second image capturing device, and adjust the second image capturing device based on the coordinates of the target to be tracked in the first image, so that the target to be tracked is located in a preset area of the second image capturing device;
and the target tracking module 540 is configured to track the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image capturing device until the tracking is finished.
In this embodiment, the first image shooting device may be a gun camera or a dome camera, and the second image shooting device may be a gun camera or a dome camera. For example, in one example, the first image shooting device may be a gun camera and the second image shooting device may be a dome camera. In another example, the first image shooting device may be a gun camera and the second image shooting device may be another gun camera. In yet another example, the first image shooting device may be a dome camera and the second image shooting device may be a gun camera. In yet another example, the first image shooting device may be a dome camera and the second image shooting device may be another dome camera. If the first image shooting device is a gun camera and the second image shooting device is a dome camera, then compared with the conventional dome camera linkage control method, the scheme provided by this embodiment improves the response speed and tracking accuracy of the gun-ball linkage interaction, is particularly suitable for detecting and tracking fast-moving targets such as motor vehicles and non-motor vehicles, and alleviates the mis-tracking and target-loss problems that frequently occur during gun-ball linkage. In addition, this embodiment provides a gun-ball linkage pan-tilt control algorithm that alleviates the jitter of the dome camera pan-tilt during tracking and increases the stability of the target tracking process. As an example, the first image shooting device in this embodiment may be a gun camera with a field-of-view range of 360 degrees, and the second image shooting device may be a dome camera that supports 360-degree rotation and whose camera core supports 40x zoom. The targets in this embodiment include, but are not limited to, pedestrians, motor vehicles, non-motor vehicles, and the like.
In an exemplary embodiment, the image adjustment module 530 sends the attribute information of the target to be tracked to a second image shooting device, and the process of adjusting the second image shooting device based on the coordinates of the target to be tracked in the first image includes: recording the coordinate position of the target to be tracked in the first image as a first coordinate, mapping the first coordinate into the image coordinate system of the second image shooting device to obtain the coordinate of the target to be tracked in the second image shooting device, and recording it as a second coordinate; and adjusting the position of the second image shooting device by using a pan-tilt until the second coordinate is located in the preset area of the second image shooting device. As an example, the preset area of the second image shooting device in this embodiment may be set according to the actual situation; for example, the central area of the second image shooting device may be set as the preset area.
In an exemplary embodiment, after the second image shooting device is adjusted, this embodiment may further include: zooming and focusing the second image shooting device based on the size information of the target to be tracked in the first image shooting device, so that the target to be tracked appears with higher clarity in the second image shooting device than in the first image shooting device; the attribute information of the target to be tracked comprises the size information of the target to be tracked in the first image shooting device. As an example, if the first image shooting device in this embodiment is a gun camera and the second image shooting device is a dome camera, then in order to reduce the total time consumed by the coarse tracking process, after the dome camera receives the target size information sent by the gun camera, the pan-tilt rotates the dome camera while the camera core zooms and focuses the dome camera according to a preset rule and the size of the target to be tracked in the first image. In this way, a clearer image is available when matching targets in the dome camera, which improves the success rate of target feature matching.
In an exemplary embodiment, if the attribute information of the target to be tracked includes the type of the target to be tracked and the features of the target to be tracked, the process by which the target tracking module 540 tracks the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image shooting device includes: acquiring an image shot by the second image shooting device after the adjustment is completed, and recording it as a second image; identifying the second image, screening out the targets that belong to the same type as the target to be tracked, and recording them as candidate targets; obtaining the mapped coordinates of the target to be tracked in the second image shooting device and the actual coordinates of each candidate target in the second image shooting device, and calculating the distance between the mapped coordinates of the target to be tracked and the actual coordinates of each candidate target; sorting the candidate targets by the calculated distances, selecting candidate targets in order of increasing distance, comparing the features of each selected candidate target with those of the target to be tracked, and calculating their feature similarity; comparing the calculated feature similarity with a preset threshold, and selecting the corresponding candidate target when the calculated feature similarity is larger than the preset threshold; and tracking the selected candidate target by using the adjusted second image shooting device. According to the above description, when tracking the selected candidate target, this embodiment may further include: acquiring the time for which the target to be tracked has been tracked, and recording it as the target tracking time; comparing the target tracking time with a first preset time, and ending the tracking of the target to be tracked after the target tracking time exceeds the first preset time; or checking whether the selected candidate target appears in the display area of the second image shooting device, and ending the tracking of the target to be tracked if the selected candidate target does not appear in the display area of the second image shooting device.
According to the above description, in an exemplary embodiment, the process of identifying the second image and screening the object belonging to the same type as the object to be tracked includes: acquiring an attribute analysis operator, wherein the attribute analysis operator is preconfigured with attribute information; identifying the second image based on the pre-configured attribute information in the attribute analysis operator, and acquiring targets existing in the second image and the type of each target; and screening out targets which belong to the same type as the target to be tracked from the second image based on the type of the target to be tracked.
The present application also provides an embodiment that provides a target tracking system applicable to a gun camera and a dome camera, configured to perform the following steps:
initiating a target detection and tracking request at the gun camera side, selecting a target that triggers an alarm as the target to be tracked, collecting attribute information of the target to be tracked in the gun camera, such as its coordinates, type, size and feature vector, and sending the attribute information of the target to be tracked to the dome camera through the gun camera communication control module.
After the dome camera linkage control module at the dome camera side receives the attribute information of the target to be tracked, the first coordinate of the target to be tracked is converted into the second coordinate according to the transformation matrix; the first coordinate refers to the coordinate of the target to be tracked in the gun camera, and the second coordinate refers to the coordinate of the target to be tracked mapped into the dome camera. The pan-tilt is rotated to the second coordinate, the dome camera zooms to a preset magnification according to the size of the target to be tracked, and candidate targets are screened from the image shot by the dome camera according to the type and size of the target to be tracked. The Euclidean distance between each candidate target and the second coordinate is then calculated, and the candidates are sorted by this distance; candidate targets are selected according to the sorting result and the feature-vector similarity between each candidate target and the target to be tracked is calculated; if the similarity is larger than a threshold value, the target is considered successfully matched, otherwise the similarity of the next candidate target is calculated.
In this embodiment, the gun camera algorithm module uses the detection and tracking operator to detect and track person, motor-vehicle and non-motor-vehicle targets in the gun camera image (referred to as the first image), and obtains attributes such as the coordinates, size, movement speed, target type and feature vector of each target (referred to as a first target) in the first image. When the trigger-event analysis operator of the gun camera algorithm module detects that a certain target has triggered an event, the gun camera algorithm module selects that target as the target to be tracked and sends its attribute information to the dome camera through the gun camera linkage control module. The dome camera linkage control module receives the attribute information of the target to be tracked sent by the gun camera and, based on this attribute information, assists the dome camera in quickly finding the target to be tracked and locking onto it for tracking.
Specifically, the entire tracking process of the present embodiment is divided into three stages, including:
In the first stage, coarse tracking is performed. The dome camera does not yet focus on a specific target; it only rotates the pan-tilt head to the approximate image area where the target is located. The dome camera receives the target attribute information sent by the gun camera, and a coordinate transformation matrix obtained by pre-calibrating the gun-camera and dome-camera coordinate systems is applied to the first target coordinate (referred to as the first coordinate). The first coordinate is mapped into the image coordinate system of the dome camera to obtain the coordinate of the target in that coordinate system (referred to as the second coordinate); the PTZ module of the dome camera then controls the rotation of the pan-tilt head so that the second coordinate moves to the central area of the dome camera image (the second image), and a second image of the area where the target to be tracked is located is obtained. This process requires the gun camera to send the target detection and tracking attributes to the dome camera several times in succession in time order. During the first-stage tracking, the dome camera also needs to zoom and focus according to the size of the target; to reduce the overall time consumed by the coarse tracking process, after the dome camera receives the target size information sent by the gun camera, the camera core changes the magnification of the dome camera according to the size of the target in the first image and a preset rule while the pan-tilt head is rotating. This provides a clearer image for accurately matching the target in the next stage and improves the success rate of target feature matching.
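The "preset rule" for the initial magnification is not specified; the sketch below assumes a simple proportional rule in which the zoom is chosen so that the target, whose size was measured in the first image, would occupy roughly a fixed fraction of the dome camera frame, clipped to the 1x–40x range of the dome camera movement mentioned later. The 0.3 target fraction, the assumption that both cameras have comparable 1x fields of view, and the linear rule itself are all illustrative.

```python
def initial_zoom(target_w, target_h, frame_w, frame_h,
                 desired_ratio=0.3, max_zoom=40.0):
    """Choose an initial dome camera magnification from the target size
    measured in the first image (illustrative 'preset rule' only).

    Assumes the gun camera and the dome camera at 1x have comparable fields
    of view, so the ratio of target size to frame size carries over."""
    current_ratio = max(target_w / frame_w, target_h / frame_h)
    if current_ratio <= 0:
        return 1.0
    zoom = desired_ratio / current_ratio   # scale target up to the desired fraction
    return min(max(zoom, 1.0), max_zoom)   # clip to the supported zoom range
```

Applying this while the pan-tilt head is still rotating, as described above, is what keeps the coarse-tracking stage short.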
In the second stage, accurate tracking is performed. After the first-stage coarse tracking, the dome camera has turned to the image area of the target. The dome camera algorithm module also detects and tracks targets of types such as persons, motor vehicles and non-motor vehicles in the current second image in real time. If multiple targets are detected in the second image, the target with the highest degree of matching with the target in the first image is selected according to the following rules and used as the target for precise tracking by the dome camera. Because feature comparison is computationally complex and time-consuming, directly polling the feature similarity between every target in the second image and the target to be tracked should be avoided. The application therefore adopts the following strategy: first, targets are screened by type to obtain a candidate target group of the same type; second, the distance between the coordinates of each target in the candidate group and the second coordinate obtained by mapping the coordinates sent by the gun camera is calculated, and the distances of all candidates are sorted in ascending order; the features of the closer candidate targets are then compared with those of the target to be tracked first, and if the similarity of a feature comparison is greater than the preset threshold, the dome camera is considered to have found, in the second image, the target to be tracked sent by the gun camera. This reduces the time spent on feature comparison. The gun camera, running the detection and tracking algorithm, detects the real-time position, speed and feature vector of the tracked target in the first image and sends attribute results such as the position coordinates, size, movement speed, type and feature vector of the target to the dome camera. The dome camera can thus continuously acquire the real-time position, speed, feature vector and other information of the target over several frames and is guided in this way until the target is locked and tracked.
In the third stage, the dome camera performs autonomous tracking. The gun camera no longer sends attribute information such as the coordinates and features of the first target to the dome camera; after locking onto the tracked target, the dome camera enters an autonomous tracking mode, in which the detection and tracking operator in the dome camera algorithm module detects and tracks the target in the second image in real time and obtains attribute information such as the target id, type t, coordinates and dimensions l(x, y, w, h), and feature vector f. The algorithm module sends this information to the dome camera linkage control module, the linkage control module calculates the expected rotation angle of the pan-tilt head using a PID control algorithm, and the PTZ control module receives the tracking instruction from the linkage control module and rotates the pan-tilt head so that the tracked target is displayed at the center of the dome camera image. This algorithm alleviates the jitter problem of the pan-tilt head during autonomous tracking. Zoom and focus can also be adjusted according to the size of the target, so that the image of the second target is stable and clear, which facilitates detailed attribute analysis of the target by the attribute analysis operator.
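The embodiment states that the linkage control module computes the expected pan-tilt rotation with a PID control algorithm but gives neither the gains nor the error definition; the sketch below assumes the error is the normalised offset of the target centre from the image centre, interprets the controller output as a pan/tilt correction command, and uses placeholder gains.

```python
class PanTiltPID:
    """Minimal PID loop turning the offset between the tracked target and
    the image centre into a pan or tilt correction command (illustrative)."""

    def __init__(self, kp=0.6, ki=0.05, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        # accumulate the integral term and estimate the derivative
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def centre_target(pid_pan, pid_tilt, cx, cy, frame_w, frame_h, dt):
    """Compute pan/tilt corrections that push the target centre (cx, cy)
    towards the centre of a frame_w x frame_h image."""
    err_x = (cx - frame_w / 2) / (frame_w / 2)   # normalised horizontal offset
    err_y = (cy - frame_h / 2) / (frame_h / 2)   # normalised vertical offset
    return pid_pan.step(err_x, dt), pid_tilt.step(err_y, dt)
```

Smoothing the commanded motion through the integral and derivative terms is one way such a controller can reduce the pan-tilt jitter mentioned above.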
According to the above description, for convenience of description, the subscript 1 of each variable denotes the first coordinate system in which the gun camera is located, and the subscript 2 denotes the second coordinate system in which the dome camera is located. For example, l1(x, y, w, h) denotes the coordinates and dimensions of the target to be tracked in the gun camera image, and l2(x, y, w, h) denotes the coordinates and dimensions of the target to be tracked mapped into the dome camera image. As shown in FIG. 4, this embodiment adopts a staged tracking strategy of coarse tracking, accurate tracking and autonomous tracking, which effectively improves the tracking response speed and tracking accuracy. The implementation is described in detail in the following sub-steps.
The first stage: the detection and tracking operator of the gun camera algorithm module identifies all targets in the panoramic image and obtains, for each target, its id1, its coordinates and dimensions l1(x, y, w, h), and its type t. When the event analysis operator detects that a certain target has triggered an alarm event, that target is an object requiring attention and needs to be tracked by the dome camera. Together with the detection and tracking result, this is packed into R1(id, l); the gun camera linkage control module generates a tracking request for the target, using the target id as the primary identifier, and sends the tracking request instruction to the dome camera linkage control module over the network. The gun camera detects and tracks the R1(id, l) attribute information of the object to be tracked over n consecutive image frames in time order. The dome camera linkage control module parses l1(x, y, w, h) from R1 and, according to the transformation matrix T obtained by pre-calibration, converts the coordinates into the coordinates l2(x, y, w, h) in the dome camera image coordinate system. The x and y components of l2(x, y, w, h) are used as input to the pan-tilt control module, while the w and h components are used to set the initial zoom magnification. After n rounds of this coarse-tracking process, the dome camera can quickly acquire a second image of the target area. The dome camera algorithm module then detects and tracks the second image to obtain the list of targets in the second image, which serves as the candidate targets for accurate tracking in the next stage.
The second stage: the gun camera algorithm module extracts the type t and the feature vector f of the target to be tracked in frame n+1; because extracting the feature vector is time-consuming, this operation only needs to be performed once. The feature vector is a 128-dimensional vector that can describe the fine details of the target in the image, is invariant to scaling, color changes, rotation and the like, and is suitable for transmission and storage. Together with the detection and tracking result, it is packed into the attribute result R1(id, l, t, f) and sent to the dome camera. The dome camera receives the attribute result R1 and parses all of the attribute information id, l, t and f. l1(x, y, w, h) is converted by the transformation matrix T into the coordinates l2(x, y, w, h) in the dome camera image coordinate system, and l2(x, y, w, h) is used as input to the pan-tilt control module and to the zoom-magnification calculation. The target type t and the feature vector f are used to screen out the exact target to be tracked: the target with the highest matching degree is selected from the candidate targets according to this attribute information and used as the input for accurate tracking. After frame n+1, the gun camera stops sending attribute information to the dome camera, and the dome camera enters the third stage, the autonomous tracking mode.
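For readability, the attribute result R1(id, l, t, f) exchanged between the two devices can be pictured as a small message structure. The layout below is an assumption for illustration; the field meanings follow the description above, while the field types and the JSON serialisation are not something the application specifies.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TrackAttribute:
    """Attribute result such as R1(id, l, t, f) sent from the gun camera to
    the dome camera linkage control module (assumed layout)."""
    id: int      # tracking id assigned by the detection and tracking operator
    l: tuple     # (x, y, w, h): coordinates and dimensions in the sender's image
    t: str       # target type, e.g. 'person', 'motor vehicle', 'non-motor vehicle'
    f: list      # 128-dimensional feature vector (sent once, in frame n+1)

    def to_message(self) -> bytes:
        # serialise for transmission over the network
        return json.dumps(asdict(self)).encode('utf-8')
```

During the first stage only id and l need to be filled, while t and f are added once, in frame n+1, as described above.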
The third stage: through the coarse tracking and accurate tracking of the first two stages, the dome camera has locked onto the tracked target, and in this stage the gun camera stops sending the dome camera attribute information about the target in the first image. The dome camera relies on its own algorithm module to detect, track and analyze the attributes of the locked target, obtaining the result R2(id, l, t, f). The attribute analysis operator of the algorithm module processes the attribute result R2 and sends it to the dome camera linkage control module. Here id and l2 serve the same purposes as in the second stage, namely pan-tilt control and magnification transformation, while the type t and the feature vector f are used to quickly re-acquire the target after it is temporarily occluded during movement. Because the detection and tracking operator of the algorithm module assigns a new id after the target is lost, feature-vector matching is used to find the target with the highest feature similarity in the latest dome camera image so that tracking can continue. When the target is completely lost or the tracking duration reaches the preset duration, the dome camera returns to the preset position, ready for the next round of tracking requests.
In another embodiment, as shown in fig. 6, a gun-ball linkage system based on a panoramic gun camera and a dome camera is provided. The hardware parameters are as follows: the gun camera has a 360-degree viewing angle, uses multiple sub-lenses to capture images of different viewing angles in the monitored scene, and obtains a complete panoramic image through image fusion and stitching. It is mainly used to collect summary information on all targets in the scene and to judge alarm events. The pan-tilt head of the dome camera supports 360-degree rotation and its camera core supports 40x zoom, so detailed information on a target in the scene can be acquired by controlling the rotation of the pan-tilt head. As shown in fig. 6, the gun-ball linkage system can be divided into two roles, master and slave.
The gun camera acts as the master in the gun-ball linkage system and is mainly responsible for selecting a target that triggers an alarm event, acquiring the attribute information of that target, packaging and sending the attribute information to the dome camera, and receiving the detailed tracking result returned by the dome camera after tracking is finished. The gun camera functional modules can be divided into a gun camera algorithm module and a gun camera linkage control module. According to the specific functions realized by each operator, the algorithm module can be subdivided into a detection and tracking operator and an event analysis operator. The detection and tracking operator can detect and track targets such as pedestrians, motor vehicles and non-motor vehicles in the image. The event analysis operator can determine, according to pre-configured rules, whether a target has triggered an event; in this embodiment it supports event detection such as area intrusion, area entry, area exit, tripwire crossing, rapid movement and person loitering. When a target triggers such an event, an alarm is generated, the algorithm module selects that target as the object to be tracked, and the attribute information R1 is sent to the gun camera linkage control module, which in turn sends the packaged attribute information to the linkage control module of the dome camera over the network.
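As one illustration of how the event analysis operator's "area intrusion" rule might be evaluated, the sketch below applies a standard ray-casting point-in-polygon test to a target's anchor point; the embodiment does not specify the geometric test, so this is only a plausible choice, and the function name is an assumption.

```python
def point_in_region(x, y, polygon):
    """Ray-casting test: return True if point (x, y) lies inside the
    configured alarm region given as a list of (x, y) polygon vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # edge crosses the horizontal line through y
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

A target whose anchor point (for example, the bottom centre of its bounding box) satisfies this test for a configured region would then trigger the area intrusion alarm and be handed over as the object to be tracked, as described above.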
The dome camera acts as the slave in the gun-ball linkage system, and its functional modules can be divided into a dome camera algorithm module, a dome camera linkage control module and a dome camera control module. According to their specific functions, the dome camera algorithm module can be divided into a detection and tracking operator and an attribute analysis operator. The detection and tracking operator has the same function as the gun camera detection and tracking operator and supports detection and tracking of different types of targets such as pedestrians, motor vehicles and non-motor vehicles. The attribute analysis operator can analyze detailed information about a target, such as the license plate number, brand and type of a motor vehicle; the sex, age, hairstyle, clothing and appearance of a pedestrian; and the type of a non-motor vehicle, whether it is carrying a person, and the appearance characteristics of its rider. The dome camera linkage control module processes the tracking request instructions sent by the gun camera and controls the tracking process at its different stages, while the dome camera control module is responsible for performing operations such as rotating, zooming and focusing the pan-tilt head.
In summary, the present application provides a target tracking system, which first acquires a first image, then detects and identifies the first image and, when a target triggering a preset alarm event is detected in the first image, takes that target as the target to be tracked; the attribute information of the target to be tracked is sent to a second image shooting device, and the second image shooting device is adjusted based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device; finally, the target to be tracked is tracked according to its attribute information and the adjusted second image shooting device until the tracking ends. The first image is an image, containing one or more targets, obtained by shooting a monitoring area determined in advance or in real time with a first image shooting device. The first image shooting device in the system may be a gun camera and the second image shooting device may be a dome camera. Therefore, compared with a traditional dome camera linkage control system, the scheme provided by this system can improve the response speed and tracking accuracy of the gun-ball linkage interaction process, is particularly suitable for detecting and tracking fast-moving targets such as motor vehicles and non-motor vehicles, and alleviates the mis-tracking and target-loss problems that frequently occur during gun-ball linkage. In addition, the system provides a gun-ball linkage pan-tilt control algorithm that can reduce the jitter of the dome camera pan-tilt head during tracking and thereby increase the stability of the target tracking process.
The embodiment of the application also provides a computer device, which can comprise: one or more processors; and one or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform the method described in fig. 1. Fig. 7 shows a schematic structural diagram of a computer device 1000. Referring to fig. 7, the computer apparatus 1000 includes: processor 1010, memory 1020, power supply 1030, display unit 1040, and input unit 1060.
The processor 1010 is a control center of the computer device 1000, connects the respective components using various interfaces and lines, and performs various functions of the computer device 1000 by running or executing software programs and/or data stored in the memory 1020, thereby performing overall monitoring of the computer device 1000. In an embodiment of the application, the processor 1010 performs the method described in FIG. 1 when it invokes a computer program stored in the memory 1020. In the alternative, processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. In some embodiments, the processor, memory, may be implemented on a single chip, and in some embodiments, they may be implemented separately on separate chips.
The memory 1020 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, various applications, and the like, and the data storage area may store data created according to the use of the computer device 1000, and the like. In addition, the memory 1020 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state memory device.
The computer device 1000 also includes a power supply 1030 (e.g., a battery) for powering the various components, which can be logically connected to the processor 1010 via a power management system so as to perform functions such as managing charge, discharge, and power consumption by the power management system.
The display unit 1040 may be used to display information input by a user or information provided to the user, various menus of the computer device 1000, and the like, and in the embodiment of the present application, is mainly used to display a display interface of each application in the computer device 1000, and objects such as text and pictures displayed in the display interface. The display unit 1040 may include a display panel 1050. The display panel 1050 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The input unit 1060 may be used to receive information such as numbers or characters input by a user. The input unit 1060 may include a touch panel 1070 and other input devices 1080. Wherein the touch panel 1070, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on the touch panel 1070 or thereabout by using any suitable object or accessory such as a finger, a stylus, etc.).
Specifically, the touch panel 1070 may detect a touch operation by a user, detect signals resulting from the touch operation, convert the signals into coordinates of contacts, send the coordinates to the processor 1010, and receive and execute commands sent from the processor 1010. In addition, the touch panel 1070 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Other input devices 1080 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, power on and off keys, etc.), a trackball, mouse, joystick, etc.
Of course, the touch panel 1070 may overlay the display panel 1050, and when a touch operation is detected on or near the touch panel 1070, the touch operation is transmitted to the processor 1010 to determine the type of touch event, and then the processor 1010 provides a corresponding visual output on the display panel 1050 according to the type of touch event. Although in fig. 7, the touch panel 1070 and the display panel 1050 implement the input and output functions of the computer apparatus 1000 as two separate components, in some embodiments, the touch panel 1070 and the display panel 1050 may be integrated to implement the input and output functions of the computer apparatus 1000.
The computer device 1000 may also include one or more sensors, such as pressure sensors, gravitational acceleration sensors, proximity light sensors, and the like. Of course, the computer device 1000 described above may also include other components such as cameras, as desired in a particular application.
Embodiments of the present application also provide a computer-readable storage medium having instructions stored therein that, when executed by one or more processors, enable the apparatus to perform the method of the present application as described in fig. 1.
It will be appreciated by those skilled in the art that fig. 7 is merely an example of a computer device and is not limiting of the device, and that the device may include more or fewer components than shown, or may combine certain components, or different components. For convenience of description, the above parts are described as being functionally divided into modules (or units) respectively. Of course, in implementing the present application, the functions of each module (or unit) may be implemented in the same piece or pieces of software or hardware.
It will be appreciated by those skilled in the art that the application can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein. The present application is described with reference to flowcharts and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It should be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing apparatus create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams. These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions executed on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
It should be understood that although the terms first, second, third, etc. may be used to describe the preset ranges, etc. in the embodiments of the present application, these preset ranges should not be limited to these terms. These terms are only used to distinguish one preset range from another. For example, a first preset range may also be referred to as a second preset range, and similarly, a second preset range may also be referred to as a first preset range without departing from the scope of embodiments of the present application.
The above embodiments merely illustrate the principles of the present application and its effects, and are not intended to limit the application. Any person skilled in the art may modify or alter the above embodiments without departing from the spirit and scope of the application. Accordingly, all equivalent modifications and variations made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed in the present application shall still be covered by the claims of the present application.

Claims (9)

1. A method of target tracking, the method comprising the steps of:
acquiring a first image, wherein the first image comprises an image containing one or more targets, which is obtained by shooting a monitoring area determined in advance or in real time by using a first image shooting device;
Detecting and identifying the first image, and taking a target triggering a preset alarm event as a target to be tracked when the target triggering the preset alarm event is detected in the first image;
the attribute information of the target to be tracked is sent to a second image shooting device, and the second image shooting device is adjusted based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device;
tracking the target to be tracked according to the attribute information of the target to be tracked and the second image shooting device after adjustment is completed until the tracking is finished; the process of tracking the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image shooting device comprises the following steps: acquiring an image shot by the second image shooting device after the adjustment is completed, and recording the image as a second image; identifying the second image, screening out targets which belong to the same type as the target to be tracked, and marking the targets as candidate targets; obtaining the mapping coordinates of the target to be tracked in the second image shooting device and the actual coordinates of each candidate target in the second image shooting device, and calculating the distance between the mapping coordinates of the target to be tracked and the actual coordinates of the candidate targets; sequencing according to the calculated distance, sequentially selecting candidate targets with small distance values, comparing features of the candidate targets with the target to be tracked, and calculating feature similarity of the candidate targets and the target to be tracked; comparing the calculated feature similarity with a preset threshold, and selecting a corresponding candidate target when the calculated feature similarity is larger than the preset threshold; tracking the selected candidate target by using the adjusted second image shooting device; the attribute information of the target to be tracked comprises the type of the target to be tracked and the characteristics of the target to be tracked.
2. The object tracking method according to claim 1, wherein the process of transmitting the attribute information of the object to be tracked to a second image capturing device and adjusting the second image capturing device based on the coordinates of the object to be tracked in the first image includes:
recording the position coordinates of the target to be tracked in the first image as a first coordinate, mapping the first coordinate into an image coordinate system of the second image shooting device, obtaining the coordinate of the target to be tracked in the second image shooting device, and recording this coordinate as a second coordinate;
and adjusting the position of the second image shooting device by using a pan-tilt head until the second coordinate is located in a preset area of the second image shooting device.
3. The object tracking method according to claim 1 or 2, characterized in that after adjusting the second image capturing device, the method further comprises:
the second image shooting device is subjected to zooming and focusing based on the size information of the target to be tracked in the first image shooting device, so that the definition of the target to be tracked in the second image shooting device is higher than that of the target to be tracked in the first image shooting device; the attribute information of the target to be tracked comprises size information of the target to be tracked in the first image shooting device.
4. The target tracking method according to claim 1, wherein the process of tracking the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image capturing device includes:
acquiring the time for tracking the target to be tracked, and recording the time as target tracking time;
comparing the target tracking time with a first preset time, and ending tracking of the target to be tracked after the target tracking time exceeds the first preset time; or
acquiring the selected candidate target in the display area of the second image shooting device, and ending tracking of the target to be tracked if the selected candidate target does not appear in the display area of the second image shooting device.
5. The object tracking method according to claim 1, wherein the process of identifying the second image and screening out the object belonging to the same type as the object to be tracked includes:
acquiring an attribute analysis operator, wherein the attribute analysis operator is preconfigured with attribute information;
identifying the second image based on the pre-configured attribute information in the attribute analysis operator, and acquiring targets existing in the second image and the type of each target;
And screening out targets which belong to the same type as the target to be tracked from the second image based on the type of the target to be tracked.
6. The object tracking method according to claim 1 or 2, wherein the process of detecting and identifying the first image includes:
acquiring an event analysis operator, wherein the event analysis operator is preconfigured with an event judgment rule;
analyzing one or more targets in the first image based on the event analysis operator, and determining whether at least one preset alarm event exists for the one or more targets in the first image; wherein the preset alarm events include: area intrusion events, area entry events, area exit events, tripwire crossing detection events, rapid movement events, and person loitering events;
if one or more targets in the first image have at least one of preset alarm events, marking that the targets in the first image trigger the preset alarm event;
if one or more targets in the first image do not have any one of the preset alarm events, marking that the targets in the first image do not trigger the preset alarm event.
7. A target tracking system, the system comprising:
the image acquisition module is used for acquiring a first image, wherein the first image comprises an image containing one or more targets, which is obtained by shooting a monitoring area determined in advance or in real time by using a first image shooting device;
the target identification module is used for detecting and identifying the first image, and taking a target triggering a preset alarm event as a target to be tracked when the target triggering the preset alarm event is detected in the first image;
the image adjustment module is used for sending the attribute information of the target to be tracked to a second image shooting device, and adjusting the second image shooting device based on the coordinates of the target to be tracked in the first image so that the target to be tracked is located in a preset area of the second image shooting device;
the target tracking module is used for tracking the target to be tracked according to the attribute information of the target to be tracked and the second image shooting device after adjustment is completed until the tracking is finished; the process of tracking the target to be tracked according to the attribute information of the target to be tracked and the adjusted second image shooting device comprises the following steps: acquiring an image shot by the second image shooting device after the adjustment is completed, and recording the image as a second image; identifying the second image, screening out targets which belong to the same type as the target to be tracked, and marking the targets as candidate targets; obtaining the mapping coordinates of the target to be tracked in the second image shooting device and the actual coordinates of each candidate target in the second image shooting device, and calculating the distance between the mapping coordinates of the target to be tracked and the actual coordinates of the candidate targets; sequencing according to the calculated distance, sequentially selecting candidate targets with small distance values, comparing features of the candidate targets with the target to be tracked, and calculating feature similarity of the candidate targets and the target to be tracked; comparing the calculated feature similarity with a preset threshold, and selecting a corresponding candidate target when the calculated feature similarity is larger than the preset threshold; tracking the selected candidate target by using the adjusted second image shooting device; the attribute information of the target to be tracked comprises the type of the target to be tracked and the characteristics of the target to be tracked.
8. A computer device, comprising:
a processor; and,
a computer readable medium storing instructions which, when executed by the processor, cause the apparatus to perform the method of any one of claims 1 to 6.
9. A computer readable medium having instructions stored thereon which, when loaded and executed by a processor, perform the method of any one of claims 1 to 6.
CN202210871722.9A 2022-07-22 2022-07-22 Target tracking method, system, computer equipment and readable medium Active CN115278014B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210871722.9A CN115278014B (en) 2022-07-22 2022-07-22 Target tracking method, system, computer equipment and readable medium

Publications (2)

Publication Number Publication Date
CN115278014A CN115278014A (en) 2022-11-01
CN115278014B true CN115278014B (en) 2023-09-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant