CN113781524B - Target tracking system and method based on two-dimensional label - Google Patents

Target tracking system and method based on two-dimensional label

Info

Publication number
CN113781524B
CN113781524B (application CN202111071116.0A)
Authority
CN
China
Prior art keywords
aircraft
tracking
target
processing
image
Prior art date
Legal status
Active
Application number
CN202111071116.0A
Other languages
Chinese (zh)
Other versions
CN113781524A (en)
Inventor
姜晓栋
张晋桥
赵新
Current Assignee
Shanghai Panchip Microelectronics Co ltd
Original Assignee
Shanghai Panchip Microelectronics Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Panchip Microelectronics Co ltd filed Critical Shanghai Panchip Microelectronics Co ltd
Priority: CN202111071116.0A
Publication of CN113781524A
Application granted
Publication of CN113781524B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 Methods for optical code recognition
    • G06K7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/1417 2D bar codes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a target tracking system and method based on a two-dimensional tag, used for tracking a moving target object provided with the two-dimensional tag. The system comprises: an aircraft provided with a camera, which captures consecutive frames of target images of the moving target object in real time during flight; and a target tracking module, which identifies each frame of the target image in turn, obtains the relative position information between the aircraft and the moving target object by processing the target images containing the two-dimensional tag, and processes this relative position information to obtain a tracking signal so as to control the aircraft to follow the moving target object in flight, such that the two-dimensional tag is located at the center of the captured target image. The beneficial effects are that the aircraft can identify and autonomously track a specific moving target object, reducing the manpower required during tracking, improving tracking efficiency and stability, and reducing system risk.

Description

Target tracking system and method based on two-dimensional label
Technical Field
The invention relates to the technical field of image processing, in particular to a target tracking system and method based on a two-dimensional label.
Background
Technology for identifying and tracking a specific target in motion is of great significance in fields such as industrial scene monitoring, intelligent traffic system management, electric power maintenance, and even military applications. The rapid development of machine vision technology has driven the development of automated tracking technology, yet existing systems of this kind suffer from technical defects such as bulky platforms, high cost, and heavy manpower requirements.
AprilTag is a visual positioning method developed in recent years based on two-dimensional-code landmarks; it can compute the precise three-dimensional position, orientation, and tag ID of a two-dimensional-code tag relative to a camera. AprilTag already plays an important role in multi-agent collaboration and indoor localization. How to combine an unmanned aerial vehicle with machine vision technology so that the UAV can identify and autonomously track a specific target is the technical problem that needs to be solved.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a target tracking system based on a two-dimensional tag, used for tracking a moving target object on which the two-dimensional tag is arranged;
the target tracking system includes:
an aircraft provided with a camera, the camera being used to capture and output consecutive frames of target images of the moving target object in real time during the flight of the aircraft;
a target tracking module connected to the camera and to the aircraft respectively, the target tracking module comprising:
an image processing sub-module, used to identify each frame of the target image in turn and to process the target images containing the two-dimensional tag to obtain the relative position information between the aircraft and the moving target object;
and a tracking control sub-module, connected to the image processing sub-module, used to process the relative position information to obtain a tracking signal so as to control the aircraft to follow the moving target object in flight, such that the two-dimensional tag is located at the center of the captured target image.
Preferably, the two-dimensional tag is an AprilTag.
Preferably, the image processing submodule includes:
a first processing unit, used to process, for each frame of the target image in turn, the gradient direction and magnitude of each pixel in the target image, and to perform cluster analysis on the gradient directions and magnitudes to obtain a plurality of line segments contained in the target image;
a second processing unit, connected to the first processing unit, used to traverse the line segments to perform quadrilateral identification and, when a quadrilateral is identified for the first time, to output an identification result indicating that the target image of the current frame contains the two-dimensional tag;
and a third processing unit, connected to the second processing unit, used to start a tracking mode according to the identification result, then to process in turn the target image of the current frame and the target images of the frames after it, and to continuously output the resulting relative position information between the aircraft and the moving target object.
Preferably, the second processing unit performs the quadrilateral identification by traversing the line segments with a recursive depth-first search of depth 4.
Preferably, the third processing unit includes:
a first processing subunit, used to process the pre-acquired focal length of the camera, the size of the two-dimensional tag, and the target image to obtain a homography matrix characterizing the position mapping of the two-dimensional tag between a tag coordinate system and an image coordinate system, wherein the tag coordinate system takes the center of the two-dimensional tag as its origin and the plane of the two-dimensional tag as its XOY plane;
and a second processing subunit, connected to the first processing subunit, used to process the pre-calibrated intrinsic (internal reference) matrix of the camera together with the homography matrix to obtain the position information of the two-dimensional tag in the image coordinate system as the relative position information between the aircraft and the moving target object.
Preferably, the first processing subunit obtains the homography matrix using a direct linear transformation algorithm.
Preferably, the second processing subunit processes the position information by adopting the following formula:
$$H = sP\begin{bmatrix} R_{00} & R_{01} & T_x \\ R_{10} & R_{11} & T_y \\ R_{20} & R_{21} & T_z \\ 0 & 0 & 1 \end{bmatrix}$$

wherein $H$ represents the homography matrix; $s$ represents a scale factor; $P$ represents the intrinsic (internal reference) matrix; $R_{ij}$ ($i = 0, 1, 2$; $j = 0, 1$) represent the rotation components of the two-dimensional tag in the image coordinate system; and $T_x$, $T_y$, $T_z$ represent the distance components of the two-dimensional tag in the image coordinate system;
the position information includes $T_x$ and $T_y$ of the distance components: $T_x$ represents a first relative distance between the aircraft and the moving target object along the x-axis of the tag coordinate system, and $T_y$ represents a second relative distance between the aircraft and the moving target object along the y-axis of the tag coordinate system; the relative position information includes the first relative distance and the second relative distance.
Preferably, the image processing sub-module further includes a position correction unit connected to the third processing unit, and the position correction unit includes:
a first correction subunit, used to acquire the Euler angles and the flying height of the aircraft in real time, and to process them to obtain a first position deviation between the aircraft and the moving target object along the x-axis of the tag coordinate system and a second position deviation between the aircraft and the moving target object along the y-axis of the tag coordinate system;
a second correction subunit, connected to the first correction subunit, used to correct the first relative distance and the second relative distance according to the first position deviation and the second position deviation to obtain corrected relative position information;
and the tracking control sub-module processes the corrected relative position information to obtain the tracking signal so as to control the aircraft to follow the moving target object in flight.
Preferably, the first correction subunit obtains the first position deviation and the second position deviation using the following formula:
L=h*tanθ
wherein h represents the flying height; when θ is the roll angle of the aircraft, L represents the first position deviation; when θ is the pitch angle of the aircraft, L represents the second position deviation.
The invention also provides a target tracking method based on the two-dimensional tag, applied to the above target tracking system, the target tracking method comprising:
step S1, the target tracking system controls a camera arranged on an aircraft to capture, in real time during the flight of the aircraft, consecutive frames of target images of a moving target object provided with a two-dimensional tag;
step S2, the target tracking system receives the target images, identifies each frame of the target image in turn, and obtains the relative position information between the aircraft and the moving target object by processing the target images containing the two-dimensional tag;
and step S3, the target tracking system processes the relative position information to obtain a tracking signal so as to control the aircraft to follow the moving target object in flight, such that the two-dimensional tag is located at the center of the captured target image.
The technical scheme has the following advantages or beneficial effects: the aircraft can identify and autonomously track a specific moving target object, reducing the manpower required during tracking, improving tracking efficiency and stability, and reducing system risk.
Drawings
FIG. 1 is a schematic diagram of a target tracking system based on two-dimensional code labels according to a preferred embodiment of the present invention;
FIG. 2 is a schematic diagram of a two-dimensional tag according to a preferred embodiment of the present invention;
FIG. 3 is a schematic diagram of the position correction principle in the preferred embodiment of the present invention;
fig. 4 is a flowchart of a target tracking method based on a two-dimensional tag according to a preferred embodiment of the present invention.
Detailed Description
The invention will now be described in detail with reference to the drawings and specific embodiments. The present invention is not limited to these embodiments; other embodiments may also fall within the scope of the present invention as long as they conform to the gist of the present invention.
In a preferred embodiment of the present invention, in view of the above problems in the prior art, a target tracking system based on a two-dimensional tag is provided for tracking a moving target object 1, the moving target object 1 being provided with a two-dimensional tag;
as shown in fig. 1, the target tracking system includes:
the aircraft 2 is provided with a camera 21, which captures and outputs consecutive frames of target images of the moving target object 1 in real time during the flight of the aircraft 2;
the target tracking module 3 is connected to the camera 21 and to the aircraft 2 respectively, the target tracking module 3 comprising:
an image processing sub-module 31, configured to identify each frame of the target image in turn and to obtain the relative position information between the aircraft and the moving target object by processing the target images containing the two-dimensional tag;
a tracking control sub-module 32, connected to the image processing sub-module 31, configured to process the relative position information to obtain a tracking signal so as to control the aircraft to follow the moving target object, such that the two-dimensional tag is located at the center of the captured target image.
Specifically, in this embodiment, the two-dimensional tag is an AprilTag, as shown in fig. 2, including but not limited to TAG36H11-0. Before target tracking begins, the generated AprilTag may be printed and attached to the surface of the moving target object 1, preferably to its upper surface, so that during target tracking the camera 21 on the aircraft 2 flying above the moving target object 1 can reliably capture target images containing the two-dimensional tag.
In the actual tracking process, since both the moving target object 1 and the aircraft 2 move quickly during target tracking, in order to maintain a stable initial tracking state and a high tracking success rate, the system starts the tracking mode with the moving target object 1 as close as possible to the center of the field of view of the aircraft 2. While the system runs, the moving target object 1 is kept at the center of the camera's field of view by adjusting the attitude of the aircraft, which is also the tracking implementation scheme. More preferably, the camera may be mounted pointing vertically downward, so that the positive Z-axis of the camera coordinate system is opposite to the positive Z-axis of the body coordinate system.
During target tracking, the aircraft 2 flies above the moving target object 1 and controls the camera 21 to capture target images in real time, which are then sent to the target tracking module 3 for processing. Preferably, the target tracking module 3 may be a local host computer, a remote server, or a control chip integrated in the aircraft 2. When the target tracking module 3 is integrated in the control chip of the aircraft 2, the computed position information is preferably sent to the controller of the aircraft 2 through a serial port; the controller is preferably pre-configured with a ring-queue buffer of 64 bytes, and after the serial port receives the position information it is moved into this ring queue. Meanwhile, to guarantee the correctness of the position information, a check byte is set for each frame of position data; it is the last byte of the frame and equals the sum of all data bytes. After a frame of position data is received, its data bytes are summed again, and the received position data are accepted as correct only after this check-byte verification passes, thereby ensuring safety.
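The check-byte scheme described above can be sketched as follows; the exact frame layout and byte widths are assumptions for illustration, not taken from the patent.

```python
# Sketch of the checksum scheme described above (frame layout is an
# assumption): each frame carries the position data bytes followed by one
# check byte equal to the low 8 bits of their sum.

def make_frame(data: bytes) -> bytes:
    """Append a check byte (sum of data bytes, modulo 256) to a frame."""
    return data + bytes([sum(data) & 0xFF])

def verify_frame(frame: bytes) -> bool:
    """Re-sum the data bytes and compare with the trailing check byte."""
    if len(frame) < 2:
        return False
    data, check = frame[:-1], frame[-1]
    return (sum(data) & 0xFF) == check

frame = make_frame(b"\x01\x02\x03")
assert verify_frame(frame)
assert not verify_frame(frame[:-1] + b"\x00")  # corrupted check byte rejected
```

A receiver would pop one frame at a time out of the 64-byte ring queue and call `verify_frame` before trusting the position data.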
After a target image is received, image recognition is performed on it first; when the two-dimensional tag is recognized, the corresponding moving target object is taken as the tracking target. The target image is then processed to obtain the relative position information between the aircraft and the moving target object, and the aircraft is controlled according to this relative position information to follow the moving target object, so that the two-dimensional tag is located at the center of the captured target image; in other words, the aircraft 2 flies directly above the moving target object, thereby realizing autonomous tracking of the moving target object.
In a preferred embodiment of the present invention, the image processing sub-module 31 includes:
a first processing unit 311, configured to process, for each frame of the target image in turn, the gradient direction and magnitude of each pixel in the target image, and to perform cluster analysis on the gradient directions and magnitudes to obtain a plurality of line segments contained in the target image;
a second processing unit 312, connected to the first processing unit 311, configured to traverse the line segments to perform quadrilateral identification and, when a quadrilateral is identified for the first time, to output an identification result indicating that the target image of the current frame contains a two-dimensional tag;
a third processing unit 313, connected to the second processing unit 312, configured to start the tracking mode according to the identification result, then to process in turn the target image of the current frame and the target images of each frame after it, and to continuously output the resulting relative position information between the aircraft and the moving target object.
Specifically, in this embodiment, before the tracking mode is started the two-dimensional tag of the moving target object may not yet be in the camera's field of view, and a search process is required: multiple frames of target images are captured and identified continuously during flight. The first one or more captured target images may not contain the two-dimensional tag; in that case the position of the aircraft is adjusted until the two-dimensional tag in a target image is identified for the first time, at which point the tracking mode can be started.
When image recognition is performed on a target image, the line segments in it are identified first, which proceeds as follows: the gradient direction and magnitude of each pixel in the target image are computed, and pixels with similar gradient directions and magnitudes are clustered into line segments. The clustering algorithm used by the first processing unit 311 is similar to the graph-based method of Felzenszwalb. Specifically, each point in the target image captured by the camera 21 represents a pixel, and an edge is added between adjacent pixels, the weight of the edge being equal to the difference of the gradient directions of the adjacent pixels. The edges are then sorted by weight to decide whether the pixels should be merged into one cluster (line segment). More specifically, a cluster of pixels is denoted by n; D(n) denotes the range of gradient directions within the cluster, a scalar indicating the direction in which the pixels change most rapidly; and M(n) denotes the range of gradient magnitudes, i.e. the difference between the maximum and minimum magnitude within the cluster. On this basis, two clusters n and m are merged into one line segment when they satisfy the following two conditions:

D(n ∪ m) ≤ min(D(n), D(m)) + K_D/|n ∪ m|
M(n ∪ m) ≤ min(M(n), M(m)) + K_M/|n ∪ m|

In the above formulas, min(D(n), D(m)) and min(M(n), M(m)) denote the smaller gradient-direction range and the smaller magnitude range of the two clusters respectively, and K_D and K_M are adjustment parameters, preferably K_D = 100 and K_M = 1200.
In the actual pixel-sorting step, a linear-time counting sort is preferably used, and the upper and lower bounds of gradient direction and magnitude can be stored while sorting. This gradient-based clustering is relatively sensitive to image noise: even a small amount of noise changes local gradient directions, which is addressed here by low-pass filtering the image. Since the AprilTag used has large-scale edge features, the useful information is not blurred away by the low-pass filter, which distinguishes this from other problem domains; in this specific design a filter with σ = 0.8 is chosen. After the clustering operation is finished, the line segments can be fitted and connected using a traditional least-squares method, and at the same time classified according to the image brightness on the two sides of each segment, which also facilitates the quadrilateral extraction of the next processing stage. This part is the slowest stage of the detection scheme; in practical development the target image resolution is preferably reduced to half the original, and experiments show that this speeds up recognition by a factor of 4.
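The merge test above can be sketched as follows; this is a minimal illustration of the two threshold conditions only, not the patent's implementation, and the cluster representation (plain lists of per-pixel gradient directions and magnitudes) is an assumption.

```python
# Minimal sketch of the cluster-merge test: two pixel clusters are joined
# only if the direction range and magnitude range of the merged cluster
# stay within the adaptive thresholds K_D/|n ∪ m| and K_M/|n ∪ m|.
K_D, K_M = 100, 1200  # adjustment parameters from the text

def span(values):
    """Range (max - min) of a cluster's gradient directions or magnitudes."""
    return max(values) - min(values)

def can_merge(dirs_n, mags_n, dirs_m, mags_m):
    merged_dirs = dirs_n + dirs_m
    merged_mags = mags_n + mags_m
    size = len(merged_dirs)  # |n ∪ m|
    return (span(merged_dirs) <= min(span(dirs_n), span(dirs_m)) + K_D / size
            and span(merged_mags) <= min(span(mags_n), span(mags_m)) + K_M / size)

# Two nearly identical clusters merge; wildly different magnitudes do not.
assert can_merge([10.0, 11.0], [300.0, 310.0], [10.5], [305.0])
assert not can_merge([10.0, 11.0], [0.0, 10.0], [10.5], [5000.0])
```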
After the line segments in the target image have been recognized, the quadrilaterals in it are recognized next. The previous stage produced directed line segments, which simplifies the task of finding segment sequences with a quadrilateral shape, i.e. rectangles. The system uses a recursive depth-first search of depth 4 as the rectangle identification scheme, where each depth level acquires one side of the quadrilateral. At the first depth level all candidates are searched, and each line segment serves as the starting segment of a rectangle. From the second to the fourth depth level, segments adjacent to the segment of the previous level are searched, starting from the first-level segment, until a closed quadrilateral is obtained; the whole search follows a counterclockwise winding order. Meanwhile, choosing a suitable threshold for judging whether segments belong to the same quadrilateral increases identification accuracy and the success rate under occlusion. Searching over all line segments during detection is a heavy workload and consumes many MCU resources, so this design preferably uses a two-dimensional lookup table to accelerate the queries. With this optimization and the counterclockwise search, the number of times each line is examined is bounded, greatly reducing the computation time spent on quadrilateral detection.
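The depth-4 search described above can be sketched as follows; the segment adjacency model and function names are assumptions for illustration, and real detectors add the threshold tests and counterclockwise-winding checks mentioned in the text.

```python
# Illustrative sketch (assumed data model, not the patent's code) of the
# depth-4 recursive search: each depth level adds one side, and a quad is
# reported when the fourth segment connects back to the first.

def find_quads(segments, adjacent):
    """segments: iterable of segment ids; adjacent: id -> iterable of ids of
    segments whose start lies near this segment's end (counterclockwise)."""
    quads = []

    def dfs(path):
        if len(path) == 4:
            # closed quadrilateral: last segment must connect to the first
            if path[0] in adjacent[path[-1]]:
                quads.append(tuple(path))
            return
        for nxt in adjacent[path[-1]]:
            if nxt not in path:
                dfs(path + [nxt])

    for s in segments:   # depth level 1: every segment may start a rectangle
        dfs([s])
    return quads

# A toy 4-segment loop a->b->c->d->a is found once from each starting segment.
adj = {"a": ["b"], "b": ["c"], "c": ["d"], "d": ["a"]}
quads = find_quads(list(adj), adj)
assert len(quads) == 4 and ("a", "b", "c", "d") in quads
```

The two-dimensional lookup table mentioned in the text would replace the `adjacent` dictionary with a spatial grid so that neighbor queries do not scan every segment.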
In a preferred embodiment of the present invention, the third processing unit 313 includes:
a first processing subunit 3131, configured to process the pre-acquired focal length of the camera, the size of the two-dimensional tag, and the target image to obtain a homography matrix characterizing the position mapping of the two-dimensional tag between a tag coordinate system and an image coordinate system, wherein the tag coordinate system takes the center of the two-dimensional tag as its origin and the plane of the two-dimensional tag as its XOY plane;
a second processing subunit 3132, connected to the first processing subunit 3131, configured to process the pre-calibrated intrinsic (internal reference) matrix of the camera together with the homography matrix to obtain the position information of the two-dimensional tag in the image coordinate system as the relative position information between the aircraft and the moving target object.
Specifically, in this embodiment, according to the imaging principle of the camera, once the focal length of the camera and the size of the two-dimensional tag have been acquired, the three-dimensional coordinates of the two-dimensional tag in the camera coordinate system can be computed from the tag coordinate system; the homography matrix is then obtained from the target image using a direct linear transformation algorithm.
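The direct linear transformation step can be sketched as follows; this is the textbook DLT under the stated assumptions (at least four tag-to-image point correspondences, e.g. the four tag corners), not code from the patent.

```python
# Minimal DLT sketch: estimate the 3x3 homography H mapping tag-plane points
# to image points from four correspondences, via the SVD null vector of the
# stacked linear constraints.
import numpy as np

def dlt_homography(src, dst):
    """src, dst: (N, 2) arrays of corresponding points, N >= 4."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    H = vt[-1].reshape(3, 3)       # null-space vector = flattened H
    return H / H[2, 2]             # fix scale so that H[2, 2] = 1

# Sanity check: recover a known homography from its own mapping of 4 corners.
H_true = np.array([[1.2, 0.1, 5.0], [-0.2, 0.9, 3.0], [1e-3, 2e-3, 1.0]])
src = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
pts = (H_true @ np.column_stack([src, np.ones(4)]).T).T
dst = pts[:, :2] / pts[:, 2:]
assert np.allclose(dlt_homography(src, dst), H_true, atol=1e-6)
```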
In a preferred embodiment of the present invention, the second processing subunit 3132 processes the position information according to the following formula:
$$H = sP\begin{bmatrix} R_{00} & R_{01} & T_x \\ R_{10} & R_{11} & T_y \\ R_{20} & R_{21} & T_z \\ 0 & 0 & 1 \end{bmatrix}$$

wherein $H$ represents the homography matrix; $s$ represents a scale factor; $P$ represents the intrinsic (internal reference) matrix; $R_{ij}$ ($i = 0, 1, 2$; $j = 0, 1$) represent the rotation components of the two-dimensional tag in the image coordinate system; and $T_x$, $T_y$, $T_z$ represent the distance components of the two-dimensional tag in the image coordinate system;
the position information includes $T_x$ and $T_y$ of the distance components: $T_x$ represents a first relative distance between the aircraft and the moving target object along the x-axis of the tag coordinate system, and $T_y$ represents a second relative distance between the aircraft and the moving target object along the y-axis of the tag coordinate system; the relative position information includes the first relative distance and the second relative distance.
Specifically, in this embodiment, $H$ is a 3×3 matrix and $P$ is a 3×4 matrix, specifically expressed as:

$$P = \begin{bmatrix} f_x & 0 & 0 & 0 \\ 0 & f_y & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}$$

where $f_x$ and $f_y$ are the focal lengths of the camera. Substituting this into the above formula converts it into the following set of equivalent equations:

$$h_{00} = s f_x R_{00}, \quad h_{01} = s f_x R_{01}, \quad h_{02} = s f_x T_x$$
$$h_{10} = s f_y R_{10}, \quad h_{11} = s f_y R_{11}, \quad h_{12} = s f_y T_y$$
$$h_{20} = s R_{20}, \quad h_{21} = s R_{21}, \quad h_{22} = s T_z$$
solving the equation set to obtain position information including T in the distance component x And T y I.e. the position information of the two-dimensional tag in the image coordinate system.
Since the above position information is referenced to the image coordinate system, the center of the target image is chosen as the origin of the image coordinate system, with the positive X axis pointing right and the positive Y axis pointing up in the image plane, so the position information that is output is the deviation from the image center point. With this design, the position information can be used directly as the relative position information between the aircraft and the moving target object; the aircraft can then follow the moving target object by adjusting its attitude according to this relative position information, so that the moving target object is always at the center of the target image, achieving the tracking effect.
Further, since the flight of the aircraft is dynamic and the camera is fixed relative to the aircraft, the camera's viewing angle changes with the attitude of the body. During these changes the camera's line of sight cannot stay perpendicular to the ground at all times, and in addition the images captured in flight exhibit distortion. The position information, which is computed entirely from image pixels, therefore needs to be corrected to remove the attitude disturbance and guarantee the correctness of the position data. To this end, the image processing sub-module 31 further includes a position correction unit 314 connected to the third processing unit 313, the position correction unit 314 comprising:
a first correction subunit 3141, configured to acquire the Euler angles and the flying height of the aircraft in real time, and to process them to obtain a first position deviation between the aircraft and the moving target object along the x-axis of the tag coordinate system and a second position deviation between the aircraft and the moving target object along the y-axis of the tag coordinate system;
a second correction subunit 3142, connected to the first correction subunit 3141, configured to correct the first relative distance and the second relative distance according to the first position deviation and the second position deviation to obtain corrected relative position information;
the tracking control sub-module 32 processes the corrected relative position information to obtain a tracking signal so as to control the aircraft to follow the moving target object in flight.
Specifically, in the present embodiment, the principle and method of position correction will be described by taking Roll angle Roll of an aircraft as an example, as shown in fig. 3, in which a straight line L1 represents a ground plane on which a moving target object 1 is on; the O point is an aircraft, the straight line L2 is the plane of the aircraft body, the straight line L3 is the line of sight of a camera carried by the aircraft, and the straight line L4 is a straight line parallel to the horizon through the O point; the dotted line is perpendicular to the horizon through the moving object 1, and is perpendicular to the straight line L4 through the point N; the rolling angle of the machine body is theta. At this time, due to the existence of the inclination angle of the body, the aircraft is not directly above the moving target object 1, but the tracking of the aircraft is in a non-deviation state in the center of the moving target object 1 in the image captured by the camera, namely, from the view of the target position data. At this time, the correction of the position data is required, and the broken line in the triangle formed by the moving object 1, the machine body and the point N is the height of the pre-acquired aircraft according to the pythagorean theorem:
tan(π/2-θ) = h/L
so the actual deviation is L = h*tanθ.
Wherein h represents the flying height; when θ represents the roll angle of the aircraft, L represents the first position deviation; when θ represents the yaw angle of the aircraft, L represents the second position deviation. The second position deviation is calculated in a similar manner, and will not be described in detail herein.
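The correction above can be sketched in code. This is a minimal illustration, not the patent's implementation: the helper names, the use of radians, and the subtractive sign convention for applying the deviations are assumptions; the patent only specifies L = h*tanθ.

```python
import math

def positional_deviation(height, theta):
    """L = h * tan(theta): horizontal offset between the aircraft's
    ground projection and the tracked target induced by a body tilt
    of theta (radians) at flying height h, per the derivation above."""
    return height * math.tan(theta)

def correct_relative_position(tx, ty, height, roll, yaw):
    """Apply the first/second position deviations to the relative
    distances (Tx, Ty) reported by the tag detector.  The sign with
    which the deviations are applied is an assumed convention."""
    first_dev = positional_deviation(height, roll)   # x-axis deviation
    second_dev = positional_deviation(height, yaw)   # y-axis deviation
    return tx - first_dev, ty - second_dev
```

With zero Euler angles the deviations vanish and the relative distances pass through unchanged, matching the no-tilt case in fig. 3.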
The invention also provides a target tracking method based on a two-dimensional label, applied to the above target tracking system. As shown in fig. 4, the method comprises the following steps:
step S1, the target tracking system controls a camera arranged on the aircraft to shoot, in real time during the flight of the aircraft, continuous frames of target images of a moving target object provided with a two-dimensional tag;
step S2, the target tracking system receives the target images, identifies each frame of target image in sequence, and processes the target images containing the two-dimensional tag to obtain relative position information between the aircraft and the moving target object;
step S3, the target tracking system processes the relative position information to obtain a tracking signal, so as to control the aircraft to track the flight of the moving target object such that the two-dimensional tag is located at the center of each shot target image.
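Steps S1 to S3 amount to a per-frame control loop that drives the tag toward the image center. The following is a minimal sketch of that loop; the proportional gain, the sign convention, and the callables `detect`, `correct` and `command` are illustrative assumptions, not part of the patent.

```python
def tracking_signal(tx, ty, gain=0.5):
    """Step S3 sketch: turn the corrected relative position (Tx, Ty)
    into a velocity command that flies the aircraft toward the point
    above the tag, returning the tag to the image center."""
    return (-gain * tx, -gain * ty)

def track_frame(detect, correct, command):
    """One iteration of S1-S3: detect the tag in the latest frame,
    correct the position for aircraft attitude, and issue the
    tracking signal.  All three arguments are hypothetical callables:
    detect() -> (Tx, Ty) or None; correct(tx, ty) -> (tx, ty);
    command(signal) sends the signal to the flight controller."""
    pos = detect()                    # S2: (Tx, Ty) from the tag image
    if pos is None:
        return False                  # no tag found in this frame
    tx, ty = correct(*pos)            # attitude-based correction
    command(tracking_signal(tx, ty))  # S3: issue tracking signal
    return True
```

A positive Tx (tag to the aircraft's +x side) yields a negative x command, i.e. the aircraft moves so as to cancel the offset.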
The foregoing description is only illustrative of the preferred embodiments of the present invention and is not to be construed as limiting the scope of the invention, and it will be appreciated by those skilled in the art that equivalent substitutions and obvious variations may be made using the description and drawings, and are intended to be included within the scope of the present invention.

Claims (8)

1. A target tracking system based on a two-dimensional label, characterized in that the target tracking system is used for tracking a moving target object, the moving target object being provided with a two-dimensional label;
the target tracking system includes:
the aircraft is provided with a camera, the camera being used for shooting and outputting, in real time during the flight of the aircraft, continuous frames of target images of the moving target object;
the target tracking module is connected with the camera and the aircraft respectively, and comprises:
the image processing sub-module is used for identifying the target image of each frame in sequence and processing the target image containing the two-dimensional tag to obtain the relative position information between the aircraft and the moving target object;
the tracking control sub-module is connected with the image processing sub-module and is used for processing and obtaining a tracking signal according to the relative position information so as to control the aircraft to track the flying of the moving target object, so that the two-dimensional tag is positioned at the center position of the target image in the shot target image;
the two-dimensional tag is an AprilTag;
the image processing submodule includes:
the first processing unit is used for processing the gradient direction and the amplitude of each pixel in the target image for each frame in sequence, and performing cluster analysis on each gradient direction and each amplitude to obtain a plurality of line segments contained in the target image;
the second processing unit is connected with the first processing unit and is used for traversing each line segment to carry out quadrilateral identification, and outputting an identification result which indicates that the target image of the current frame contains the two-dimensional label when the quadrilateral is identified for the first time;
and the third processing unit is connected with the second processing unit and is used for starting a tracking mode according to the identification result, then sequentially processing the target image of the current frame and the target images of the frames after the current frame respectively and continuously outputting the relative position information between the aircraft and the moving target object, which is obtained by processing.
2. The object tracking system of claim 1 wherein said second processing unit traverses each of said line segments for quadrilateral identification using a recursive depth-first search algorithm of depth 4.
3. The object tracking system of claim 1, wherein the third processing unit comprises:
the first processing subunit is used for processing the focal length of the camera, the size of the two-dimensional label and the target image which are acquired in advance to obtain a homography matrix representing the position mapping relation of the two-dimensional label between a label coordinate system and an image coordinate system, wherein the label coordinate system takes the center of the two-dimensional label as an origin and the plane where the two-dimensional label is positioned as an XOY plane;
the second processing subunit is connected with the first processing subunit and is used for processing the position information of the two-dimensional tag in the image coordinate system according to the internal reference matrix and the homography matrix of the camera, which are calibrated in advance, and the position information is used as the relative position information between the aircraft and the moving target object.
4. The object tracking system of claim 3 wherein said first processing subunit processes said homography matrix using a direct linear transformation algorithm.
5. The object tracking system of claim 3, wherein said second processing subunit processes said location information using the following formula:
        | R00  R01  Tx |
H = s*P*| R10  R11  Ty |
        | R20  R21  Tz |
wherein H is used to represent the homography matrix; s is used to represent a scale factor; P is used to represent the internal reference matrix; Rij (i=0,1,2; j=0,1) is used to represent the rotation components of the two-dimensional label in the image coordinate system; Tx, Ty, Tz are used to represent the distance components of the two-dimensional tag in the image coordinate system;
the location information includes Tx and Ty in the distance components, Tx representing a first relative distance between the aircraft and the moving target object in the x-axis direction in the tag coordinate system, and Ty representing a second relative distance between the aircraft and the moving target object in the y-axis direction in the tag coordinate system, the relative position information including the first relative distance and the second relative distance.
6. The object tracking system of claim 5, wherein said image processing sub-module further comprises a position correction unit coupled to said third processing unit, said position correction unit comprising:
the first correction subunit is used for acquiring the Euler angle and the flying height of the aircraft in real time, and respectively processing the Euler angle and the flying height to obtain a first position deviation of the aircraft and the moving target object in the x-axis direction in a tag coordinate system and a second position deviation of the aircraft and the moving target object in the y-axis direction in the tag coordinate system;
the second correction subunit is connected with the first correction subunit and is used for correcting the first relative distance and the second relative distance according to the first position deviation and the second position deviation to obtain corrected relative position information;
and the tracking control sub-module processes the corrected relative position information to obtain the tracking signal so as to control the aircraft to track the flying of the moving target object.
7. The target tracking system of claim 6, wherein the first correction subunit processes the first position deviation and the second position deviation using the following formula:
L = h*tanθ
wherein h represents the flying height; when θ represents the roll angle of the aircraft, L represents the first position deviation; when θ represents the yaw angle of the aircraft, L represents the second position deviation.
8. A two-dimensional tag-based object tracking method, applied to the object tracking system according to any one of claims 1 to 7, comprising:
step S1, the target tracking system controlling a camera arranged on the aircraft to shoot, in real time during the flight of the aircraft, continuous frames of target images of a moving target object provided with a two-dimensional tag;
step S2, the target tracking system receiving the target images, identifying each frame of target image in sequence, and processing the target images containing the two-dimensional tag to obtain relative position information between the aircraft and the moving target object;
step S3, the target tracking system processing the relative position information to obtain a tracking signal, so as to control the aircraft to track the flight of the moving target object such that the two-dimensional tag is located at the center position of the shot target image.
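The homography relation in claim 5 can be sketched numerically. With a pinhole intrinsic matrix P = [[f, 0, cx], [0, f, cy], [0, 0, 1]], the product P⁻¹·H equals s·[r1 r2 t], and the scale s can be recovered from the unit norm of the first rotation column. The pinhole form of P and this scale-recovery convention are assumptions for illustration; the claim does not spell them out.

```python
import math

def intrinsics_inverse(f, cx, cy):
    """Closed-form inverse of the assumed pinhole intrinsic matrix
    P = [[f, 0, cx], [0, f, cy], [0, 0, 1]]."""
    return [[1.0 / f, 0.0, -cx / f],
            [0.0, 1.0 / f, -cy / f],
            [0.0, 0.0, 1.0]]

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def relative_position(h_matrix, f, cx, cy):
    """Recover (Tx, Ty) from H = s * P * [r1 r2 t] (claim 5).
    P^-1 * H = s * [r1 r2 t]; since r1 is a unit rotation column,
    its Euclidean norm gives the scale factor s."""
    m = matmul(intrinsics_inverse(f, cx, cy), h_matrix)
    s = math.sqrt(m[0][0] ** 2 + m[1][0] ** 2 + m[2][0] ** 2)
    return m[0][2] / s, m[1][2] / s  # (Tx, Ty)
```

For a homography built with s = 2, f = 100, (cx, cy) = (320, 240) and T = (0.5, -0.3, 2.0), the function recovers Tx = 0.5 and Ty = -0.3, the two relative distances the tracking controller consumes.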
CN202111071116.0A 2021-09-13 2021-09-13 Target tracking system and method based on two-dimensional label Active CN113781524B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111071116.0A CN113781524B (en) 2021-09-13 2021-09-13 Target tracking system and method based on two-dimensional label

Publications (2)

Publication Number Publication Date
CN113781524A CN113781524A (en) 2021-12-10
CN113781524B true CN113781524B (en) 2023-12-08

Family

ID=78843283

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111071116.0A Active CN113781524B (en) 2021-09-13 2021-09-13 Target tracking system and method based on two-dimensional label

Country Status (1)

Country Link
CN (1) CN113781524B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114157813B (en) * 2022-02-07 2022-05-03 深圳市慧为智能科技股份有限公司 Electronic scale camera motion control method and device, control terminal and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101287190B1 (en) * 2012-06-04 2013-07-17 주식회사 로드코리아 Photographing position automatic tracking method of video monitoring apparatus
CN107103615A (en) * 2017-04-05 2017-08-29 合肥酷睿网络科技有限公司 A kind of monitor video target lock-on tracing system and track lock method
CN107463181A (en) * 2017-08-30 2017-12-12 南京邮电大学 A kind of quadrotor self-adoptive trace system based on AprilTag
KR20200114924A (en) * 2019-03-26 2020-10-07 주식회사 에프엠웍스 Method and apparatus of real-time tracking a position using drones, traking a position system including the apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Deformable target tracking method using saliency segmentation and target detection; Shi Xiangbin; Zhang Jian; Dai Qin; Zhang Deyuan; Zhang Liguo; Journal of Computer-Aided Design & Computer Graphics (Issue 04) *

Also Published As

Publication number Publication date
CN113781524A (en) 2021-12-10

Similar Documents

Publication Publication Date Title
CN109949361A (en) A kind of rotor wing unmanned aerial vehicle Attitude estimation method based on monocular vision positioning
CN110966991A (en) Single unmanned aerial vehicle image positioning method without control point
CN110163912B (en) Two-dimensional code pose calibration method, device and system
CN109753076A (en) A kind of unmanned plane vision tracing implementing method
CN105205785B (en) A kind of orientable oversize vehicle operation management system and its operation method
CN106529538A (en) Method and device for positioning aircraft
CN112215860A (en) Unmanned aerial vehicle positioning method based on image processing
Yu et al. A UAV-based crack inspection system for concrete bridge monitoring
CN111598952B (en) Multi-scale cooperative target design and online detection identification method and system
CN111123962A (en) Rotor unmanned aerial vehicle repositioning photographing method for power tower inspection
CN106444846A (en) Unmanned aerial vehicle and method and device for positioning and controlling mobile terminal
CN110443247A (en) A kind of unmanned aerial vehicle moving small target real-time detecting system and method
CN114004977A (en) Aerial photography data target positioning method and system based on deep learning
CN116866719B (en) Intelligent analysis processing method for high-definition video content based on image recognition
Li et al. Aruco marker detection under occlusion using convolutional neural network
CN109325913A (en) Unmanned plane image split-joint method and device
CN112947526A (en) Unmanned aerial vehicle autonomous landing method and system
CN113781524B (en) Target tracking system and method based on two-dimensional label
CN114972767A (en) Vehicle track and course angle extraction method based on high-altitude unmanned aerial vehicle video
CN112700498A (en) Wind driven generator blade tip positioning method and system based on deep learning
CN114815871A (en) Vision-based autonomous landing method for vertical take-off and landing unmanned mobile platform
CN113744315A (en) Semi-direct vision odometer based on binocular vision
CN109764864B (en) Color identification-based indoor unmanned aerial vehicle pose acquisition method and system
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN116486290B (en) Unmanned aerial vehicle monitoring and tracking method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant