CN111784768B - Unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition - Google Patents


Info

Publication number
CN111784768B
CN111784768B (application CN202010646221.1A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
color
image
signal lamp
Prior art date
Legal status
Active
Application number
CN202010646221.1A
Other languages
Chinese (zh)
Other versions
CN111784768A (en
Inventor
张志勇
丘昌镇
荣易成
王鲁平
王亮
Current Assignee
Sun Yat Sen University
Original Assignee
Sun Yat Sen University
Priority date
Filing date
Publication date
Application filed by Sun Yat Sen University
Priority to CN202010646221.1A
Publication of CN111784768A
Application granted
Publication of CN111784768B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/70 — Determining position or orientation of objects or cameras
    • G06T 7/10 — Segmentation; edge detection
    • G06T 7/136 — Segmentation; edge detection involving thresholding
    • G06T 7/187 — Segmentation; edge detection involving region growing, region merging or connected component labelling
    • G06T 7/90 — Determination of colour characteristics
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 — Image acquisition modality
    • G06T 2207/10024 — Color image

Abstract

Embodiments of the invention relate to an unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition. Four signal lamps serving as signal marks are arranged at the tips of the wings and empennages of a target unmanned aerial vehicle, which reduces the probability that the marks are occluded as the target unmanned aerial vehicle changes attitude. The four signal lamps use three colors, improving the robustness of flight attitude estimation. Acquired images of the target unmanned aerial vehicle are processed to detect image features; after the correspondence between the image features and the signal lamps is identified, the two-dimensional image coordinates of the signal lamps are obtained, and the three-dimensional coordinates and two-dimensional image coordinates of the signal lamps are input into an attitude estimation model to obtain the position and attitude of the target unmanned aerial vehicle, achieving automatic real-time attitude estimation. The method makes it easy to select hues that avoid the background, reduces the difficulty of identifying image features, and addresses the low stability of existing methods for estimating the flight attitude of an unmanned aerial vehicle.

Description

Unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition.
Background
Unmanned aerial vehicles offer flexibility, quick response, unmanned flight, low operating requirements, and other advantages. They are currently applied in aerial photography, agriculture, plant protection, miniature self-portrait photography, express transportation, disaster relief, wildlife observation, infectious disease monitoring, surveying and mapping, news reporting, electric power inspection, film and television shooting, romantic displays, and other fields, and their range of applications continues to expand.
During formation flight, it is useful to know the flight attitude of other unmanned aerial vehicles. The current approach mounts a camera on one vehicle to capture images of the others in flight and estimates their attitude by processing the images with computer vision techniques. However, because of the vehicle's structure and the large attitude changes it undergoes during flight, existing marker-placement schemes are prone to occlusion of the markers, so the markers cannot be detected and attitude estimation becomes impossible.
One existing marker scheme places several signal lamps of the same color on the target unmanned aerial vehicle, but it requires the relative positions of the lamps in the image to remain unchanged, so it cannot be used when the vehicle's attitude changes substantially.
Disclosure of Invention
The embodiment of the invention provides an unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition, which are used for solving the technical problem of low stability of the existing unmanned aerial vehicle flight attitude estimation method.
In order to achieve the above object, the embodiments of the present invention provide the following technical solutions:
the unmanned aerial vehicle attitude estimation method based on three-color four-lamp mark recognition comprises the following steps of:
s1, acquiring an image of a target unmanned aerial vehicle flight state with four signal lamp marks acquired by acquisition equipment, and processing the image in an HSV color space to obtain image characteristics with high brightness and high saturation;
s2, extracting connected domains corresponding to the four signal lamps from the image characteristics according to the tones of the three colors of the four signal lamps, identifying the one-to-one correspondence relationship between the connected domains and the four signal lamps, and establishing two-dimensional image coordinates of the four signal lamps;
s3, establishing three-dimensional coordinates of the four signal lamps based on the target unmanned aerial vehicle, inputting the three-dimensional coordinates and the two-dimensional image coordinates into an attitude estimation model, and outputting the position and the attitude of the target unmanned aerial vehicle;
and parameters of acquisition equipment are set in the attitude estimation model.
Preferably, the processing the image in the HSV color space specifically includes:
calculating the saturation and brightness of the image according to the RGB numerical value of the image, and obtaining a product image based on the saturation and the brightness;
performing threshold segmentation on the product image to obtain a binary image;
carrying out connected domain labelling on the binary image to obtain a plurality of connected domains;
and screening all the connected domains according to the area and the shape of the target region to obtain the image characteristics.
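The preprocessing steps above (saturation-brightness product, thresholding, connected-domain labelling) can be sketched in a minimal, self-contained form; the threshold value and the flood-fill labelling below are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def sv_product_mask(rgb, thresh=0.5):
    """Saturation * value product image from an RGB array (floats in [0, 1]),
    thresholded into a binary mask. The threshold 0.5 is an assumption."""
    rgb = rgb.astype(np.float64)
    v = rgb.max(axis=2)                                  # HSV value = max(R, G, B)
    c = v - rgb.min(axis=2)                              # chroma
    s = np.where(v > 0, c / np.maximum(v, 1e-12), 0.0)   # HSV saturation
    return (s * v) > thresh                              # bright AND saturated

def label_components(mask):
    """4-connected component labelling via a simple flood fill
    (a stand-in for a library routine such as scipy.ndimage.label)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]
                            and mask[y, x] and labels[y, x] == 0):
                        labels[y, x] = current
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current
```

Screening the resulting domains by area and shape then proceeds on the per-label pixel counts and bounding boxes.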
Preferably, the extracting the connected domains corresponding to the four signal lamps from the image features by using hues of three colors of the four signal lamps specifically includes:
calculating the hue mean and hue standard deviation of the connected domain where each image feature is located in the two intervals [−180°, 180°] and [0°, 360°], based on the periodicity of hue;
selecting, for each connected domain, the hue mean and hue standard deviation from whichever interval yields the smaller standard deviation as the hue statistics of that connected domain;
screening out the connected domains with the hue standard deviations smaller than a standard deviation threshold value from all the connected domains to obtain a connected domain set;
in the connected domain set, calculating the distance between the tone mean value of each connected domain and the tone of each color of the signal lamp;
and if the distance is smaller than a distance threshold value, screening out four connected domains from the connected domain set.
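The two-interval hue statistics above can be sketched as follows; representing the same hues in both [0°, 360°) and [−180°, 180°) and keeping whichever gives the smaller standard deviation handles lamps whose hue straddles the 0°/360° wrap-around (a minimal numpy sketch, not the patent's code):

```python
import numpy as np

def hue_stats(hues_deg):
    """Hue mean and standard deviation of one connected domain, computed in
    both the [0, 360) and the [-180, 180) representations; the representation
    with the smaller standard deviation is kept (exploits hue periodicity)."""
    h0 = np.asarray(hues_deg, dtype=float) % 360.0   # [0, 360) representation
    h1 = (h0 + 180.0) % 360.0 - 180.0                # [-180, 180) representation
    stats = [(h.std(), h.mean() % 360.0) for h in (h0, h1)]
    std, mean = min(stats)                           # smaller std wins
    return mean, std
```

A reddish domain with pixel hues like 350°, 355°, 5°, 10° gets a mean near 0° and a small standard deviation, instead of the meaningless mean of 180° the naive [0°, 360°) statistics would produce.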
Preferably, the four connected domains are denoted the first, second, third and fourth connected domains, and the four signal lamps the first, second, third and fourth signal lamps. Suppose the hues of the first and second signal lamps are both the first color, the hue of the third signal lamp is the second color, and the hue of the fourth signal lamp is the third color. The connected domains corresponding to the third and fourth signal lamps are then identified directly from the hue distance. If the connected domains corresponding to the first and second signal lamps are the first and second connected domains respectively, identifying the correspondence between the first and second connected domains and the first and second signal lamps specifically includes:
connecting the first signal lamp and the second signal lamp into a line, and the third signal lamp and the fourth signal lamp into a line; that is, the first connected domain and the second connected domain define a first straight line, and the third connected domain and the fourth connected domain define a second straight line;
letting v1 be the direction vector from the first connected domain to the second connected domain, v2 the direction vector from the third connected domain to the fourth connected domain, and v3 the vector from the midpoint of the first straight line to the midpoint of the second straight line;
calculating sign = (vT · v1) · (vT · v2); if sign > 0, the first connected domain lies on the side of the third connected domain, and if sign ≤ 0, the first connected domain lies on the side of the fourth connected domain;
establishing the correspondence between the first and second connected domains and the first and second signal lamps according to the positions of the third and fourth signal lamps on the target unmanned aerial vehicle;
where vT is a vector perpendicular to v3.
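The sign test above reduces to a few dot products on the connected-domain centroids; a minimal numpy sketch (the example coordinates are hypothetical):

```python
import numpy as np

def side_sign(p1, p2, p3, p4):
    """Disambiguate the two same-colour lamps from 2-D centroid coordinates
    p1..p4 of the four connected domains. Returns sign = (vT.v1)*(vT.v2):
    positive means the first domain lies on the side of the third."""
    p1, p2, p3, p4 = (np.asarray(p, dtype=float) for p in (p1, p2, p3, p4))
    v1 = p2 - p1                            # first domain -> second domain
    v2 = p4 - p3                            # third domain -> fourth domain
    v3 = (p3 + p4) / 2 - (p1 + p2) / 2      # midpoint of line 1 -> midpoint of line 2
    vT = np.array([-v3[1], v3[0]])          # a vector perpendicular to v3
    return float(np.dot(vT, v1) * np.dot(vT, v2))
```

Swapping the third and fourth domains flips v2 and hence the sign, which is exactly the ambiguity the test resolves.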
Preferably, if four connected domains whose distance is smaller than the distance threshold cannot be found in the connected domain set, the identification in step S2 is deemed to have failed.
Preferably, the distance d between the hue mean of each connected domain and the hue of each signal lamp color is calculated as:
d = min(|h − h_m|, 360° − |h − h_m|)
where h is the hue value of the signal lamp and h_m is the hue mean of the connected domain.
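The original formula is only available as an image; based on the surrounding text on hue periodicity, a standard circular-distance implementation is a plausible reconstruction (an assumption, not a verbatim transcription of the patent's formula):

```python
def hue_distance(h, h_m):
    """Circular distance in degrees between a lamp's reference hue h and a
    connected domain's mean hue h_m: the shorter of the direct difference
    and the wrap-around difference on the 360-degree hue circle."""
    diff = abs(h % 360.0 - h_m % 360.0)
    return min(diff, 360.0 - diff)
```

So hues 350° and 10° are 20° apart, not 340°, which keeps red lamps near the 0°/360° boundary matchable.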
Preferably, inputting the three-dimensional coordinates and the two-dimensional image coordinates into the attitude estimation model and outputting the position and attitude of the target drone specifically includes: inputting the three-dimensional coordinates and the two-dimensional image coordinates into the attitude estimation model, and computing the position of the target unmanned aerial vehicle from them with a Lambda Twist attitude estimation algorithm;
and mapping the three-dimensional coordinates to obtain coordinates in the image corresponding to the three-dimensional coordinates.
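The mapping from three-dimensional lamp coordinates to image coordinates is the pinhole projection the attitude estimator inverts; a minimal numpy sketch of that forward model (the intrinsics K, rotation R and translation t in the usage below are illustrative, not a real calibration):

```python
import numpy as np

def project_points(pts3d, K, R, t):
    """Pinhole projection u ~ K (R X + t), mapping the signal lamps' 3-D
    body-frame coordinates to 2-D pixel coordinates."""
    pts3d = np.asarray(pts3d, dtype=float)
    cam = pts3d @ np.asarray(R, dtype=float).T + np.asarray(t, dtype=float)
    uv = cam[:, :2] / cam[:, 2:3]                     # perspective divide
    homog = np.hstack([uv, np.ones((len(uv), 1))])
    return (homog @ np.asarray(K, dtype=float).T)[:, :2]   # apply intrinsics
```

A pose estimate is then the (R, t) that makes the projected lamp positions agree with the detected two-dimensional image coordinates.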
The invention also provides an unmanned aerial vehicle attitude estimation system based on three-color four-lamp mark recognition. Four signal lamps serving as marks are arranged at the tips of the wings and empennages of a target unmanned aerial vehicle, and the four signal lamps use three colors. The system comprises a feature detection unit, a feature identification unit, and an attitude analysis and estimation unit;
the characteristic detection unit is used for acquiring an image of a target unmanned aerial vehicle flight state with four signal lamp marks acquired by acquisition equipment, and processing the image in an HSV color space to obtain an image characteristic with high brightness and high saturation;
the feature identification unit is used for extracting connected domains corresponding to the four signal lamps from the image features according to the hues of the three colors of the four signal lamps, identifying the one-to-one correspondence relationship between the connected domains and the four signal lamps, and establishing two-dimensional image coordinates of the four signal lamps;
the attitude analysis and estimation unit is used for establishing three-dimensional coordinates of the four signal lamps based on the target unmanned aerial vehicle, inputting the three-dimensional coordinates and the two-dimensional image coordinates into an attitude estimation model, and outputting the position and the attitude of the target unmanned aerial vehicle;
and parameters of acquisition equipment are set in the attitude estimation model.
The invention also provides a computer-readable storage medium for storing computer instructions which, when run on a computer, cause the computer to perform the above-mentioned unmanned aerial vehicle attitude estimation method based on three-color four-light marker recognition.
The invention also provides an apparatus comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is used for executing the unmanned aerial vehicle attitude estimation method based on three-color four-light mark recognition according to the instructions in the program codes.
According to the technical scheme, the embodiment of the invention has the following advantages:
1. In the unmanned aerial vehicle attitude estimation method based on three-color four-lamp mark recognition, four signal lamps are arranged at the tips of the wings and empennages of the unmanned aerial vehicle as signal marks, which reduces the probability that the marks are occluded as the target unmanned aerial vehicle changes attitude and improves the stability of flight attitude estimation. An image of the target unmanned aerial vehicle in flight with the signal marks is collected and processed to obtain image features; the two-dimensional image coordinates of the signal lamps are obtained after identifying the correspondence between the image features and the lamps; the three-dimensional coordinates of the lamps are established with the target unmanned aerial vehicle as reference; and the three-dimensional and two-dimensional image coordinates are input into an attitude estimation model to obtain the position and attitude of the target unmanned aerial vehicle, realizing automatic real-time attitude estimation. The method makes it easy to select hues that avoid the background, reduces the difficulty of identifying image features, improves the robustness of systems applying it, and solves the technical problem of low stability in existing flight attitude estimation methods.
2. The unmanned aerial vehicle attitude estimation system based on three-color four-lamp mark recognition collects and processes images of the target unmanned aerial vehicle in flight with the signal marks through the feature detection unit; obtains image features and, through the feature identification unit, the two-dimensional image coordinates of the signal lamps after identifying the correspondence between the image features and the lamps; establishes, in the attitude analysis and estimation unit, the three-dimensional coordinates of the lamps with the target unmanned aerial vehicle as reference; and inputs the three-dimensional and two-dimensional image coordinates into an attitude estimation model to obtain the position and attitude of the target unmanned aerial vehicle, realizing automatic real-time attitude estimation. The system makes it easy to select hues that avoid the background, reduces the difficulty of identifying image features, improves robustness, and solves the technical problem of low stability in existing flight attitude estimation methods.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a flowchart illustrating steps of an unmanned aerial vehicle attitude estimation method based on three-color four-light marker recognition according to an embodiment of the present invention.
Fig. 2a is a four signal lamp color tone layout diagram of the unmanned aerial vehicle attitude estimation method based on three-color four-lamp identification according to the embodiment of the present invention.
Fig. 2b is an observation diagram of another angle in fig. 2a of the unmanned aerial vehicle attitude estimation method based on three-color four-light marker recognition according to the embodiment of the present invention.
Fig. 2c is an observation diagram of another angle in fig. 2a of the unmanned aerial vehicle attitude estimation method based on three-color four-light marker recognition according to the embodiment of the present invention.
Fig. 3 is a block diagram of an unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition according to an embodiment of the present invention.
Fig. 4 is another block diagram of the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition according to the embodiment of the present invention.
Fig. 5a is a real object simulation target diagram of the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition in the embodiment of the present invention.
Fig. 5b is a front view of a real object simulation target and an acquisition device of the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition in the embodiment of the present invention.
Fig. 5c is a top view of a real object simulation target and a collecting device of the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition according to the embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the embodiments described below are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Existing unmanned aerial vehicle attitude estimation mainly relies on computer vision, which generally proceeds in two steps: first, target features are extracted and identified through image processing; second, the target's attitude is estimated from the feature information. Although computer vision has made great progress, real-time attitude estimation from extracted target features still faces many problems. Placing signal marks on the target is the main way to reduce the difficulty of visual attitude estimation: a carefully designed marking pattern makes the marks easy to detect and recognize and provides higher-precision position information, thereby improving position and attitude estimation performance.
Marker-based techniques have therefore become an important approach to computer-vision target attitude estimation. For feature-point signal marks, the attitude estimation process can be subdivided into three steps: first, detect the image features corresponding to the marks; second, establish the correspondence between the image features and the signal marks on the target; third, estimate the position and attitude of the target from the 3-dimensional space coordinates and 2-dimensional image coordinates of the signal marks.
Hoff's system proposed a target position and attitude estimation scheme that uses concentric circles as signal marks on the target surface, i.e. black circles on a white background or vice versa. With this mark, the white and black regions can be separated by simple global image thresholding; small black and white structures are then removed from the segmented binary image by morphological filtering; finally, the centroids of the connected domains are obtained through connected-domain labelling, and the mark features are extracted by checking whether the centroids of a black and a white connected domain coincide, based on the concentricity of the black and white regions. To identify the correspondence between the detected image features and the individual signal marks, one mark is placed at each of the four vertices of a rectangle on the target surface, and a fifth mark is placed on an edge near one of the vertices. Because three marks in this layout are collinear, with the middle one close to one end, the three image features corresponding to the collinear marks can be extracted, and the correspondence of the remaining two feature points follows from their positions relative to those three. After the image features corresponding to the marks are determined, the William A. Hoff system estimates the attitude with the Hung-Yeh-Harwood algorithm, whose inputs are the 3-dimensional space coordinates of the signal marks at the four rectangle vertices, their 2-dimensional image coordinates, and the parameters of the unmanned aerial vehicle image acquisition equipment.
The Hung-Yeh-Harwood algorithm obtains the target's attitude by optimizing the deviation between the measured and predicted positions of the feature points. The acquisition equipment parameters comprise a pinhole camera model with a skew parameter; the influence of lens distortion is not considered, to simplify the calculation.
In the prior art, LEDs are generally used as signal marks on the target unmanned aerial vehicle because of their high brightness and specific colors. Existing schemes place several LEDs of the same color on the target as signal marks, but establishing the correspondence between the LEDs and the image features requires the relative positions of the LEDs in the image to remain unchanged, which is difficult to satisfy when the vehicle's attitude changes substantially.
Embodiments of the present application provide an unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition, addressing the technical problem that existing flight attitude estimation methods have low stability. In the present embodiment the signal lamps are LED lamps by way of example; in other embodiments the signal lamp may be another device emitting colored light, for example a diode.
The first embodiment is as follows:
fig. 1 is a flowchart illustrating steps of an unmanned aerial vehicle attitude estimation method based on three-color four-light marker recognition according to an embodiment of the present invention.
As shown in fig. 1, an embodiment of the present invention provides an unmanned aerial vehicle attitude estimation method based on three-color four-light marker recognition, where four signal lamps for marking are arranged at the tips of wings and empennages of a target unmanned aerial vehicle, the four signal lamps have three colors, and the unmanned aerial vehicle attitude estimation method based on three-color four-light marker recognition includes the following steps:
s1, acquiring an image of a target unmanned aerial vehicle flight state with four signal lamp marks acquired by acquisition equipment, and processing the image in an HSV color space to obtain image characteristics with high brightness and high saturation;
s2, extracting connected domains corresponding to the four signal lamps from image features according to the tones of the three colors of the four signal lamps, identifying the one-to-one correspondence relationship between the connected domains and the four signal lamps, and establishing two-dimensional image coordinates of the four signal lamps;
s3, establishing three-dimensional coordinates of four signal lamps based on the target unmanned aerial vehicle, inputting the three-dimensional coordinates and two-dimensional image coordinates into an attitude estimation model, and outputting the position and the attitude of the target unmanned aerial vehicle;
wherein, the attitude estimation model is provided with parameters of the acquisition equipment.
Step S1 of the embodiment of the present invention mainly acquires an image of the target unmanned aerial vehicle, carrying the four signal-lamp marks, in flight, and processes the acquired image to extract image features of high brightness and high saturation. The image is captured by the acquisition device and converted from RGB format to HSV format in the HSV color space. Because the target unmanned aerial vehicle carries signal-lamp marks, the lamps are usually brighter than the surrounding area, and higher brightness comes with higher color saturation, so the image features can be detected and recognized in the HSV color space.
It should be noted that the acquisition device may be a video camera, a still camera, or the like, and the signal lamp may be an LED lamp or a diode. In the present embodiment LED lamps are used as the signal marks, and the four LED lamps are colored red, yellow and blue. The fuselage of the target unmanned aerial vehicle is usually white, which avoids the body region being picked up when identifying highly saturated image features and prevents the body color from degrading the accuracy of attitude estimation. The target unmanned aerial vehicle has a long, narrow surface, and the signal lamps occupy little area and space. Compared with existing graphic marks, signal lamps can work around the clock, whereas a graphic cannot be perceived by a camera at night; and because the flying target's attitude changes greatly, lamps resist occlusion well and can be perceived over a wide range of viewing angles, whereas graphic marks are easily occluded. The HSV color space, created by A. R. Smith in 1978 based on the intuitive properties of color and also known as the hexagonal pyramid model, assigns pure colors the same maximal brightness as pure white; in other related color spaces a pure color is darker than pure white, so a slightly darkened white can have the same brightness as a bright pure color, which is unfavorable for detecting pure-color signal marks.
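The hue that separates the red, yellow and blue lamps follows from the standard HSV (hexagonal-pyramid) conversion mentioned above; a minimal pure-Python sketch, not the patent's implementation:

```python
def rgb_to_hue(rgb):
    """Hue in degrees [0, 360) from an (r, g, b) triple of floats in [0, 1],
    using the standard HSV hexagonal-pyramid formula."""
    r, g, b = float(rgb[0]), float(rgb[1]), float(rgb[2])
    mx, mn = max(r, g, b), min(r, g, b)
    c = mx - mn                       # chroma
    if c == 0:
        return 0.0                    # achromatic: hue undefined, report 0
    if mx == r:
        h = ((g - b) / c) % 6.0
    elif mx == g:
        h = (b - r) / c + 2.0
    else:
        h = (r - g) / c + 4.0
    return 60.0 * h
```

Pure red, yellow and blue map to 0°, 60° and 240° respectively, giving the three lamp colors well-separated hue references.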
Step S2 of the embodiment of the present invention mainly identifies the correspondence between the four signal lamps and the connected domains among the image features according to the three hues of the lamps; after the one-to-one correspondence between lamps and connected domains is obtained, the two-dimensional image coordinates of the lamps are established. The correspondence between each signal mark and the image features is identified from the hues of the lamps and their positional layout on the target unmanned aerial vehicle: four connected domains are extracted as the corresponding features from the connected domains of the image features according to the lamps' distinct hues.
It should be noted that the correspondence between the signal lamps and the image features is identified from the lamps' colors and layout. The lamps are placed as signal marks at the tips of the two wings and two empennages of the target unmanned aerial vehicle, i.e. four signal lamps in total. To reduce occlusion caused by attitude changes as much as possible, the space on the vehicle is used to the maximum so that the marks are as far apart as possible; this clearly benefits the spatial resolution of the marks and hence allows attitude estimation at greater distances. Reducing the number of required marks also makes them easier to arrange on the target, solving the problem of placing signal marks on an unmanned aerial vehicle. In this embodiment the four lamps are denoted the first, second, third and fourth signal lamps. The first and second signal lamps are placed at the tips of the two wings, and the third and fourth signal lamps at the tips of the two empennages. The line connecting the first and second signal lamps is denoted the first line segment; the line connecting the third and fourth signal lamps is denoted the second line segment; and the first and second line segments are parallel.
The hues of the first and second signal lamps are both a first color, and the hues of the third and fourth signal lamps are a second color and a third color respectively; alternatively, the hues of the third and fourth signal lamps are both the first color, and the hues of the first and second signal lamps are the second color and the third color respectively.
In step S3 of the embodiment of the present invention, three-dimensional coordinates of the four signal lamps are established with the target drone as the reference. The two-dimensional image coordinates obtained in step S2 and these three-dimensional coordinates are input to the attitude estimation model, in which the parameters of the acquisition device are set; the model then analyzes the two-dimensional image coordinates, the three-dimensional coordinates and the acquisition-device parameters to produce the estimated attitude and position of the target drone.
In the unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition, the four signal lamps arranged at the tips of the wings and empennages of the unmanned aerial vehicle serve as signal markers, which reduces the probability that a marker is occluded while the attitude of the target unmanned aerial vehicle changes and thus improves the stability of flight attitude estimation. An image of the flight state of the unmanned aerial vehicle carrying the signal markers is acquired and processed to obtain the image features; the correspondence between the connected domains in the image features and the signal lamps is recognized to obtain the two-dimensional image coordinates of the signal lamps; the three-dimensional coordinates of the signal lamps are established based on the target unmanned aerial vehicle; and the three-dimensional and two-dimensional image coordinates are input into the attitude estimation model to obtain the position and attitude of the target unmanned aerial vehicle, realizing automatic real-time attitude estimation. With only three colors it is easy to select hues that avoid the background, which reduces the difficulty of identifying the image features, improves the robustness of a system applying the method, and solves the technical problem that existing methods for estimating the flight attitude of an unmanned aerial vehicle have low stability.
In an embodiment of the present invention, processing an image in an HSV color space specifically includes:
acquiring RGB numerical values of the image, calculating the saturation and brightness of the image according to the RGB numerical values of the image, and obtaining a product image based on the saturation and the brightness;
performing threshold segmentation on the product image to obtain a binary image;
carrying out connected domain marking on the binary image to obtain a plurality of connected domains;
and screening all connected domains according to the area and the shape of the target region to obtain the image characteristics.
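The four processing steps above can be sketched in pure NumPy; the function names, the threshold value and the area bounds below are illustrative assumptions, not values fixed by the embodiment:

```python
import numpy as np

def saturation_value_product(rgb):
    """Per-pixel product of HSV saturation and value, computed from an
    RGB image (H x W x 3, values in [0, 1]).  Only bright, highly
    saturated pixels -- the signal lamps -- score high; a white fuselage
    (high V, low S) and dark background (low V) are both suppressed."""
    v = rgb.max(axis=2)                                  # value = max(R, G, B)
    c = v - rgb.min(axis=2)                              # chroma = max - min
    s = np.where(v > 0, c / np.maximum(v, 1e-12), 0.0)   # saturation
    return s * v

def detect_lamp_regions(rgb, thresh=0.25, min_area=2, max_area=10_000):
    """Threshold the product image and label 4-connected regions,
    keeping only regions whose area lies in [min_area, max_area]."""
    mask = saturation_value_product(rgb) > thresh
    labels = np.zeros(mask.shape, dtype=int)
    regions, next_label = [], 1
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                # flood-fill one connected domain with a stack
                stack, pixels = [(i, j)], []
                labels[i, j] = next_label
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = next_label
                            stack.append((ny, nx))
                if min_area <= len(pixels) <= max_area:  # area screening
                    regions.append(pixels)
                next_label += 1
    return regions
```

A shape test (circularity) would be added to the area screening in practice; it is omitted here to keep the sketch short.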
It should be noted that the image acquired by the acquisition device during the flight of the unmanned aerial vehicle is a color image in RGB format. Processing the image in the HSV color space amounts to finding the positions of the signal lamps in the image, that is, it is the feature detection stage that produces the image features.
In the present embodiment, a channel product image of the saturation channel and the luminance channel of the image is first calculated. Only high-brightness, high-saturation colored regions have high values in this product image, so low-brightness regions and low-saturation (white) regions are effectively suppressed, including the gray scale of the fuselage region adjoining the signal-lamp markers. Second, a plurality of isolated signal-marker regions are obtained by simple threshold segmentation of the product image. Third, connected domain labeling is performed on the segmented binary image to obtain a plurality of connected domains; according to prior information such as the area and shape of a signal lamp, connected domains whose areas are too small or too large, or whose shapes deviate strongly from a circle, are eliminated, and the remaining connected domains are taken as the output of the processing. In other words, the connected domains meeting the requirements of a signal lamp are screened out as the image features obtained by feature detection.
It should be noted that, since the computational cost of feature detection is proportional to the size of the image, when the size of the target area changes, the target area is scaled to a window image of fixed size, and the color space conversion and feature detection operations are performed on the scaled target area. This avoids fluctuations of the computational load caused by changes in the size of the target area and improves the accuracy of the feature detection that extracts the image features.
In this embodiment, processing an image in an HSV color space first converts an RGB image into an HSV format, which specifically includes:
V = max(R, G, B)
C = V − min(R, G, B)
S = C/V, if V ≠ 0; S = 0, if V = 0
H = 60°·((G − B)/C mod 6), if V = R; H = 60°·(2 + (B − R)/C), if V = G; H = 60°·(4 + (R − G)/C), if V = B
wherein R, G, B represent the color values of red, green and blue, respectively; V is the luminance (value), S is the saturation, H is the hue, and C is the difference between the maximum value and the minimum value among the RGB values of a pixel.
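A minimal scalar implementation of these conversion formulas might look as follows (the convention that a grey pixel, where C = 0 and the hue is undefined, gets H = 0 is my own assumption):

```python
def rgb_to_hsv(r, g, b):
    """RGB (each channel in [0, 1]) -> (H in degrees, S, V), following
    the formulas above: V = max, C = V - min, S = C/V, and H chosen
    piecewise by which channel attains the maximum."""
    v = max(r, g, b)
    c = v - min(r, g, b)
    s = c / v if v > 0 else 0.0
    if c == 0:
        h = 0.0                          # hue undefined for greys; use 0
    elif v == r:
        h = (60.0 * (g - b) / c) % 360.0  # fold into [0, 360)
    elif v == g:
        h = 60.0 * (2.0 + (b - r) / c)
    else:                                # v == b
        h = 60.0 * (4.0 + (r - g) / c)
    return h, s, v
```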
In an embodiment of the present invention, extracting connected domains corresponding to four signal lamps from image features by using hues of three colors of the four signal lamps specifically includes:
based on the periodicity of the tone, calculating the tone mean value and the tone standard deviation of a connected domain in which the image features are located in two intervals of [ -180 degrees, 180 degrees ] and [0 degrees, 360 degrees ];
selecting a tone mean value and a tone standard deviation corresponding to a small interval of the tone standard deviation of the connected domain as tone statistic of the connected domain;
screening out connected domains with the hue standard deviation smaller than a standard deviation threshold value from all the connected domains to obtain a connected domain set;
in the connected domain set, calculating the distance between the tone mean value of each connected domain and the tone of each color of the signal lamp;
and if the distance is smaller than the distance threshold value, screening out four connected domains from the connected domain set.
Note that the hue value of a color has an inherent periodicity of 360°. A plurality of connected domains are obtained in step S1; for each connected domain, the hue value h[0,360] in the interval [0°, 360°] of the HSV color space is converted to the hue h[−180,180] in the interval [−180°, 180°] by:
h[−180,180] = h[0,360], if h[0,360] ≤ 180°; h[−180,180] = h[0,360] − 360°, otherwise
in the formula, h[0,360] is the hue value in the interval [0°, 360°], and h[−180,180] is the hue value in the interval [−180°, 180°].
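The two-interval hue statistic described above — compute the mean and standard deviation in both parameterisations and keep the pair with the smaller spread — can be sketched as follows; the function name and the plain (non-vectorised) style are my own choices:

```python
import math

def hue_statistics(hues):
    """Mean and standard deviation of a list of hue values (degrees in
    [0, 360)), evaluated in both the [0, 360) and the [-180, 180)
    parameterisations; the pair with the smaller standard deviation is
    returned, which handles clusters straddling the 0/360 wrap-around
    (e.g. reds near 350 and near 10 degrees)."""
    shifted = [h - 360.0 if h > 180.0 else h for h in hues]
    stats = []
    for hs in (hues, shifted):
        m = sum(hs) / len(hs)
        sd = math.sqrt(sum((h - m) ** 2 for h in hs) / len(hs))
        stats.append((m % 360.0, sd))      # report the mean in [0, 360)
    return min(stats, key=lambda t: t[1])  # (hue mean, hue std. dev.)
```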
In the embodiment of the present invention, the distance d between the hue mean of each connected domain and the hue of each signal lamp color is calculated, accounting for the periodicity of hue, by:
d = min(|h − hm|, 360° − |h − hm|)
where h is the hue value of the signal lamp and hm is the hue mean of the connected domain; both h and hm lie in the interval [0°, 360°].
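This periodic distance is simply the shorter arc between the two hues on the 360° hue circle; a one-line sketch (assuming both inputs already lie in [0°, 360°)):

```python
def hue_distance(h, hm):
    """Circular distance between a signal-lamp hue h and a connected
    domain's hue mean hm, both in degrees in [0, 360):
    d = min(|h - hm|, 360 - |h - hm|), the shorter arc on the circle."""
    diff = abs(h - hm)
    return min(diff, 360.0 - diff)
```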
In an embodiment of the present invention, the four connected domains are respectively denoted the first connected domain, the second connected domain, the third connected domain and the fourth connected domain. If the hue of the first and second signal lamps is the first color, the hue of the third signal lamp the second color and the hue of the fourth signal lamp the third color, then the connected domains corresponding to the third and fourth signal lamps are identified directly according to the hue distance. Supposing the connected domains corresponding to the first and second signal lamps are the first and second connected domains, identifying which of them corresponds to the first signal lamp and which to the second specifically includes:
connecting the first signal lamp with the second signal lamp and the third signal lamp with the fourth signal lamp, that is, connecting the first connected domain with the second connected domain to form a first straight line, and the third connected domain with the fourth connected domain to form a second straight line;
letting v1 be the direction vector pointing from the first connected domain to the second connected domain, v2 the direction vector pointing from the third connected domain to the fourth connected domain, and v3 the vector from the midpoint of the first straight line to the midpoint of the second straight line;
calculating sign = (vT · v1) · (vT · v2); if sign > 0, the first connected domain is located on the side of the third connected domain, and if sign is not larger than 0, the first connected domain is located on the side of the fourth connected domain;
establishing the correspondences between the first and second connected domains and the first and second signal lamps according to the positions of the third and fourth signal lamps arranged on the target unmanned aerial vehicle;
where vT is the vector perpendicular to v3.
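The sign test above can be sketched on lamp centroids as follows; the helper name and the convention of returning the third-lamp-side domain first are illustrative assumptions:

```python
def assign_wing_lamps(p1, p2, p3, p4):
    """Disambiguate the two same-colour wing lamps.  p1, p2 are the
    image centroids (x, y) of the two first-colour connected domains in
    arbitrary order; p3, p4 are the centroids already matched to the
    third and fourth signal lamps.  Returns (p1, p2) reordered so the
    first element lies on the third lamp's side, per the test
    sign = (vT . v1) * (vT . v2)."""
    v1 = (p2[0] - p1[0], p2[1] - p1[1])            # first -> second domain
    v2 = (p4[0] - p3[0], p4[1] - p3[1])            # third -> fourth domain
    m1 = ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)
    m2 = ((p3[0] + p4[0]) / 2.0, (p3[1] + p4[1]) / 2.0)
    v3 = (m2[0] - m1[0], m2[1] - m1[1])            # midpoint -> midpoint
    vT = (v3[1], -v3[0])                           # perpendicular to v3
    sign = (vT[0] * v1[0] + vT[1] * v1[1]) * (vT[0] * v2[0] + vT[1] * v2[1])
    return (p1, p2) if sign > 0 else (p2, p1)
```

The product of the two dot products is unchanged if vT is negated, so either perpendicular orientation gives the same answer.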
Fig. 2a is a hue layout diagram of the four signal lamps of the unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition according to the embodiment of the present invention; fig. 2b and fig. 2c are observation diagrams of fig. 2a from two other angles.
It should be noted that, in this embodiment, the first and second signal lamps use the first color, and the third and fourth signal lamps use the second and third colors respectively. By calculating the distance between each signal lamp hue and the hue mean of each connected domain, the connected domains closest to the second and third colors are extracted as the third and fourth connected domains, so the correspondences of the third and fourth signal lamps to the image features are established directly. The first and second signal lamps, arranged on the wings of the target unmanned aerial vehicle, share the same hue, so their one-to-one correspondences with the first and second connected domains cannot be determined from the hue distance alone. As shown in fig. 2a to 2c, the target drone carries four signal lamps, the first color being color No. 1, the second color No. 2 and the third color No. 3. The line connecting the two No. 1 lamps and the line connecting the No. 2 and No. 3 lamps are parallel in three-dimensional space. The line through the midpoints of these two segments, extended, divides the four lamps into two parts: one part contains the No. 2 lamp and one of the No. 1 lamps, and the other contains the No. 3 lamp and the other No. 1 lamp. Let v1 be the direction vector between the two No. 1 lamps, v2 the direction vector between the No. 2 and No. 3 lamps (pointing from No. 2 to No. 3), and v = (a, b) the vector from the midpoint of the two No. 1 lamps to the midpoint of the No. 2 and No. 3 lamps; the vector perpendicular to it is then vT = (b, −a). The inner product of vT with the vector pointing from the No. 1 lamp on the No. 2 side to the No. 1 lamp on the No. 3 side has the same sign as vT·v2.
In the embodiment of the invention, signal lamps of only three colors are adopted as signal markers; the fewer the colors, the easier it is to choose hues that avoid the background colors in the image, which reduces the probability of misidentification and increases the robustness of the unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp marker recognition.
In the embodiment of the present invention, if four connected domains with distances smaller than the distance threshold cannot be screened out from the connected domain set, the recognition of step S2 is deemed to have failed, that is, the feature detection process has failed, and the pose estimation of the subsequent step S3 cannot be performed.
In an embodiment of the present invention, inputting the three-dimensional coordinates and the two-dimensional image coordinates into the pose estimation model and outputting the position and pose of the target drone specifically includes: inputting the three-dimensional coordinates and the two-dimensional image coordinates into the attitude estimation model, and calculating the position and attitude of the target unmanned aerial vehicle from them using the Lambda Twist attitude estimation algorithm;
and mapping the three-dimensional coordinates to obtain their corresponding coordinates in the image.
It should be noted that attitude estimation of the unmanned aerial vehicle first requires the mapping model from three-dimensional space coordinates to two-dimensional coordinates of the image acquisition process. The coordinate of a signal lamp in the three-dimensional coordinate system of the acquisition device is:
Xc = [Xc, Yc, Zc]T = RX + t
where Xc is the coordinate of the signal lamp's three-dimensional space coordinate X under the acquisition-device coordinate system, and Xc, Yc, Zc are its values on the x, y and z axes of that coordinate system. The attitude of the target drone relative to the acquisition device is represented by the rotation matrix R and the displacement t. The acquisition-device coordinate is then normalized:
xn = [xn, yn]T = [Xc/Zc, Yc/Zc]T
where xn is the normalized two-dimensional coordinate, with x-axis value xn and y-axis value yn. Because the lens of the acquisition device has distortion, the distorted two-dimensional coordinate calculation model is:
xd = [xd, yd]T = (1 + k1·r² + k2·r⁴ + k5·r⁶)·xn + [2k3·xn·yn + k4·(r² + 2xn²), k3·(r² + 2yn²) + 2k4·xn·yn]T
in the formula, xd is the distorted two-dimensional coordinate, with x-axis value xd and y-axis value yd, k1:5 are the distortion parameters used by the two-dimensional coordinate calculation model, and r² = xn² + yn². Finally, the pixel coordinate of the signal lamp in the image is obtained through the projection matrix of the acquisition device:
[x, y, 1]T = K·[xd, yd, 1]T, K = [fx, α·fx, cx; 0, fy, cy; 0, 0, 1]
wherein x = [x, y]T is the two-dimensional image coordinate of the signal lamp, K is the acquisition-device matrix, fx and fy are respectively the horizontal and vertical focal lengths, α is the skew (distortion) parameter of the acquisition device, and cx, cy are respectively the x-axis and y-axis coordinates of the principal point of the acquisition device.
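The forward mapping — rigid transform, normalisation, five-parameter distortion, intrinsics — can be sketched as below; the assignment of k1, k2, k5 to radial and k3, k4 to tangential terms follows the common Bouguet calibration convention and is an assumption about how the patent orders k1:5:

```python
import numpy as np

def project_lamp(X, R, t, k, K):
    """Project a lamp's 3-D point X (target frame) to pixel coordinates:
    camera frame -> normalisation -> 5-parameter distortion -> intrinsics.
    k = (k1, k2, k3, k4, k5), with k1, k2, k5 radial and k3, k4
    tangential (assumed ordering); K is the 3x3 intrinsic matrix."""
    Xc = R @ X + t                          # Xc = R X + t
    xn, yn = Xc[0] / Xc[2], Xc[1] / Xc[2]   # normalised coordinates
    r2 = xn * xn + yn * yn
    radial = 1.0 + k[0] * r2 + k[1] * r2**2 + k[4] * r2**3
    xd = radial * xn + 2 * k[2] * xn * yn + k[3] * (r2 + 2 * xn * xn)
    yd = radial * yn + k[2] * (r2 + 2 * yn * yn) + 2 * k[3] * xn * yn
    u = K[0, 0] * xd + K[0, 1] * yd + K[0, 2]   # fx, skew, cx
    v = K[1, 1] * yd + K[1, 2]                  # fy, cy
    return np.array([u, v])
```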
Based on this mapping model from three-dimensional space coordinates to two-dimensional coordinates, the specific attitude estimation process is as follows. The three-dimensional space coordinates and the two-dimensional image coordinates of the four signal lamps are denoted X1:4 and x1:4. Based on the two-dimensional coordinate calculation model, the rotation matrix R and the displacement t are to be estimated; the distortion parameters and the acquisition-device matrix K are calibrated in advance and therefore known. First, the normalized coordinates xn,1:4 are obtained from the two-dimensional image coordinates x1:4 along the chain x1:4 → xd,1:4 → xn,1:4. The first step is:
[xd, yd, 1]T = K⁻¹·[x, y, 1]T
which yields the distorted coordinates [xd, yd]; the step xd,1:4 → xn,1:4 inverts the distortion model by iterative calculation. Next, three of the points are used to estimate the pose with the P3P (Perspective-Three-Point) algorithm:
[Rk, tk] = P3P(X1:3, xn,1:3)
The specific P3P algorithm adopted is Lambda Twist, whose calculation yields at most 4 solutions, i.e. k = 1:4. For each solution [Rk, tk], the space coordinate of the fourth signal lamp is projected to obtain its two-dimensional normalized coordinate:
xn,k,4 = fRk,tk(X4)
where fR,t(X) denotes the mapping from a three-dimensional point X in the target coordinate system to the normalized image-plane coordinate xn, that is, the composition of Xc = [Xc, Yc, Zc]T = RX + t and xn = [xn, yn]T = [Xc/Zc, Yc/Zc]T, with R taken as Rk, t taken as tk and X taken as X4. The difference between xn,k,4 and xn,4 is then calculated as:
ek = ||xn,4 − xn,k,4||²
in the formula, xn,4 is calculated from the two-dimensional image coordinate of the fourth signal lamp using the distortion parameters k1:5. Finally, the solution [Rk, tk] with the minimum error is taken as the output of the attitude estimation model, which thereby outputs the position and attitude of the target unmanned aerial vehicle. The Lambda Twist algorithm is disclosed in Persson M, Nordberg K. Lambda Twist: An Accurate Fast Robust Perspective Three Point (P3P) Solver. ECCV 2018.
Example two:
fig. 3 is a block diagram of an unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition according to an embodiment of the present invention.
As shown in fig. 3, an embodiment of the present invention further provides an unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition, where four signal lamps for marking are disposed at the tips of wings and empennages of a target unmanned aerial vehicle, and the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition includes a feature detection unit 10, a feature recognition unit 20, and an attitude analysis estimation unit 30;
the characteristic detection unit 10 is used for acquiring an image of a target unmanned aerial vehicle flight state with four signal lamp marks acquired by acquisition equipment, and processing the image in an HSV color space to obtain an image characteristic with high brightness and high saturation;
the feature identification unit 20 is configured to extract connected domains corresponding to the four signal lamps from the image features according to the hues of the three colors of the four signal lamps, identify one-to-one correspondence between the connected domains and the four signal lamps, and establish two-dimensional image coordinates of the four signal lamps;
the attitude analysis and estimation unit 30 is used for establishing three-dimensional coordinates of four signal lamps based on the target unmanned aerial vehicle, inputting the three-dimensional coordinates and the two-dimensional image coordinates into an attitude estimation model, and outputting the position and the attitude of the target unmanned aerial vehicle;
wherein, the attitude estimation model is provided with parameters of the acquisition equipment.
Fig. 4 is another block diagram of the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition according to the embodiment of the present invention.
As shown in fig. 4, in the embodiment of the present invention, the image of the target drone is acquired by an acquisition device mounted on an observation drone and output in RGB format to a DSP chip of the acquisition device; the DSP chip, also mounted on the observation drone, completes the target attitude estimation task. The information processing must be real-time, that is, able to process the data at the frame rate of the acquisition device (30 fps). The DSP chip first converts the RGB image into an HSV image and completes the feature detection of the image features based on the saturation and luminance images; second, based on the connected domains obtained during feature detection, it calculates the average hue of each image feature from the hue image and thereby extracts and identifies the connected domain corresponding to each signal lamp on the target; finally, the three-dimensional coordinates and the two-dimensional image coordinates of the four signal lamps are input into the attitude estimation model, which outputs the estimated position and attitude of the target drone.
Fig. 5a is a real object simulation target diagram of the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition in the embodiment of the present invention, fig. 5b is a front view of a real object simulation target and an acquisition device of the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition in the embodiment of the present invention, and fig. 5c is a top view of the real object simulation target and the acquisition device of the unmanned aerial vehicle attitude estimation system based on three-color four-light marker recognition in the embodiment of the present invention.
As shown in fig. 5a to 5c, in the present embodiment a physical simulated target is used to verify the validity of the proposed unmanned aerial vehicle attitude estimation system based on three-color four-lamp marker recognition. The target uses a white foam board to simulate the white of the fuselage. The four signal lamps arranged on the board are red, purple and blue, and their layout simulates the wing-tip and empennage-tip positions of the target unmanned aerial vehicle: the two wing lamps are 30 cm apart, the two empennage lamps are 20 cm apart, and the wing and empennage lines are 10 cm apart. The target is mounted on a swivel chair for easy movement. The acquisition device is a Hikvision network camera with an image frame resolution of 1920x1080, mounted on another swivel chair. Floor tiles with a side length of 60 cm help determine the relative position between the camera and the target; the camera is about 240 cm from the target. Fig. 5a shows the fabricated simulated target and the detection and recognition of the markers. Fig. 5b shows a front view of the scene, where the measured distance is 235.16 cm, an acceptable error; the marker layout at the upper right corresponds to fig. 5a, using a pinhole acquisition-device coordinate system whose origin is the optical center and whose z axis is the optical axis. Fig. 5c shows a top view, with the marker layout corresponding to fig. 5a and the wing substantially parallel to the x axis.
It should be noted that the units in the second embodiment system correspond to steps S1 to S3 in the first embodiment method, steps S1 to S3 have been described in detail in the first embodiment method, and the units in the second embodiment are not described one by one.
In the unmanned aerial vehicle attitude estimation system based on three-color four-lamp marker recognition provided by the invention, the feature detection unit processes the acquired image of the flight state of the target unmanned aerial vehicle carrying the signal markers; the feature recognition unit obtains the image features, recognizes the correspondence between the image features and the signal lamps, and thereby obtains the two-dimensional image coordinates of the signal lamps; and the attitude analysis and estimation unit establishes the three-dimensional coordinates of the signal lamps based on the target unmanned aerial vehicle and inputs the three-dimensional and two-dimensional image coordinates into the attitude estimation model to obtain the position and attitude of the target unmanned aerial vehicle, realizing automatic real-time attitude estimation. The system makes it easy to select hues that avoid the background, reduces the difficulty of identifying the image features, improves robustness, and solves the technical problem that existing methods for estimating the flight attitude of an unmanned aerial vehicle have low stability.
Example three:
the embodiment of the invention also provides a computer-readable storage medium, wherein the computer storage medium is used for storing computer instructions, and when the computer instructions run on a computer, the computer is enabled to execute the unmanned aerial vehicle attitude estimation method based on three-color four-light mark identification.
Example four:
an embodiment of the present invention further provides an apparatus, which is characterized in that the apparatus includes a processor and a memory:
a memory for storing the program code and transmitting the program code to the processor;
and the processor is used for executing the unmanned aerial vehicle attitude estimation method based on three-color four-light mark recognition according to the instructions in the program codes.
Illustratively, a computer program may be partitioned into one or more modules/units, which are stored in a memory and executed by a processor to accomplish the present application. One or more modules/units may be a series of computer program instruction segments capable of performing certain functions, the instruction segments describing the execution of a computer program in a device.
The device may be a computing device such as a desktop computer, a notebook, a palm top computer, a cloud server, and the like. The device may include, but is not limited to, a processor, a memory. Those skilled in the art will appreciate that the device is not limited and may include more or fewer components than those shown, or some components may be combined, or different components, e.g., the device may also include input output devices, network access devices, buses, etc.
The processor may be a Central Processing Unit (CPU), other general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), an off-the-shelf Programmable gate array (FPGA) or other Programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage may be an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. The memory may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Memory Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), etc. provided on the computer device. Further, the memory may also include both internal and external storage units of the computer device. The memory is used for storing computer programs and other programs and data required by the computer device. The memory may also be used to temporarily store data that has been output or is to be output.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed system and method may be implemented in other ways. The system embodiments described above are merely illustrative: the division into units is only one logical functional division, and other divisions are possible in practice; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, systems, or units, and may be electrical, mechanical, or of another form.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention.

Claims (10)

1. An unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition, characterized in that four signal lamps for marking, in three colors, are arranged at the tips of the wings and tail of a target unmanned aerial vehicle, and the unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition comprises the following steps:
S1, acquiring an image, captured by an acquisition device, of the flight state of the target unmanned aerial vehicle bearing the four signal lamp markers, and processing the image in the HSV color space to obtain image features of high brightness and high saturation;
S2, extracting the connected domains corresponding to the four signal lamps from the image features according to the hues of the three signal lamp colors, identifying the one-to-one correspondence between the connected domains and the four signal lamps, and establishing two-dimensional image coordinates of the four signal lamps;
S3, establishing three-dimensional coordinates of the four signal lamps in the frame of the target unmanned aerial vehicle, inputting the three-dimensional coordinates and the two-dimensional image coordinates into an attitude estimation model, and outputting the position and attitude of the target unmanned aerial vehicle;
wherein parameters of the acquisition device are set in the attitude estimation model.
2. The unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition according to claim 1, wherein processing the image in the HSV color space specifically comprises:
calculating the saturation and brightness of the image from its RGB values, and forming a product image from the saturation and the brightness;
performing threshold segmentation on the product image to obtain a binary image;
performing connected-domain labeling on the binary image to obtain a plurality of connected domains;
and screening all the connected domains by the area and shape of the target region to obtain the image features.
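As an illustrative sketch of this segmentation step (plain NumPy, not the patented implementation; the image size, the `0.5` threshold, and the 4-connected BFS labelling are all assumptions made for the example):

```python
import numpy as np
from collections import deque

def product_image(rgb):
    """Per-pixel product S*V of HSV saturation and brightness (value),
    computed directly from an RGB float image in [0, 1]; lit signal
    lamps, being bright and saturated, score close to 1."""
    mx = rgb.max(axis=2)
    mn = rgb.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-12), 0.0)
    return sat * mx  # brightness V is simply max(R, G, B)

def label_components(mask):
    """4-connected component labelling of a boolean mask via BFS flood fill."""
    labels = np.zeros(mask.shape, dtype=int)
    count = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                count += 1
                labels[i, j] = count
                queue = deque([(i, j)])
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = count
                            queue.append((ny, nx))
    return labels, count

# Tiny synthetic frame: dark background with one saturated red blob.
img = np.zeros((8, 8, 3))
img[2:4, 2:4, 0] = 1.0             # pure red: S = 1, V = 1
binary = product_image(img) > 0.5  # threshold segmentation
labels, n = label_components(binary)
```

Screening by area and shape would then discard components too large, too small, or too elongated to be a lamp.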
3. The unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition according to claim 2, wherein extracting the connected domains corresponding to the four signal lamps from the image features using the hues of the three signal lamp colors specifically comprises:
calculating, based on the periodicity of hue, the hue mean and hue standard deviation of each connected domain containing image features over the two intervals [-180°, 180°] and [0°, 360°];
selecting, as the hue statistics of the connected domain, the hue mean and hue standard deviation from the interval that gives the smaller hue standard deviation;
screening out, from all the connected domains, those whose hue standard deviation is smaller than a standard-deviation threshold, to obtain a connected-domain set;
calculating, within the connected-domain set, the distance between the hue mean of each connected domain and the hue of each signal lamp color;
and, where that distance is smaller than a distance threshold, screening the four connected domains out of the connected-domain set.
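One way to read the two-interval statistic, sketched in NumPy (hues in degrees; computing the spread over both intervals and keeping the smaller one neutralises the wrap-around at 0°/360°, e.g. for a red cluster at 358° and 2°):

```python
import numpy as np

def hue_stats(hues_deg):
    """Hue mean/std over [0, 360) and over [-180, 180); keep the pair
    from whichever interval gives the smaller standard deviation."""
    h0 = np.asarray(hues_deg, dtype=float) % 360.0   # mapped into [0, 360)
    h1 = (h0 + 180.0) % 360.0 - 180.0                # mapped into [-180, 180)
    candidates = [(h0.mean() % 360.0, h0.std()),
                  (h1.mean() % 360.0, h1.std())]
    return min(candidates, key=lambda ms: ms[1])     # smaller-spread interval wins

mean, std = hue_stats([358.0, 2.0])  # red cluster straddling 0 degrees
```

On this example the naive [0°, 360°) statistics would report a mean of 180° with a spread of 178°, while the shifted interval correctly yields a mean of 0° with a spread of 2°.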
4. The method of claim 3, wherein the four connected domains are denoted a first, a second, a third and a fourth connected domain, and the four signal lamps a first, a second, a third and a fourth signal lamp; the first and second signal lamps have a first color, the third signal lamp a second color, and the fourth signal lamp a third color, so that the connected domains corresponding to the third and fourth signal lamps are identified and matched directly from the hue distance; and, the connected domains corresponding to the first and second signal lamps being the first and second connected domains, identifying the correspondence between the first and second connected domains and the first and second signal lamps specifically comprises:
connecting the first signal lamp to the second signal lamp and the third signal lamp to the fourth signal lamp, that is, connecting the first and second connected domains into a first straight line and the third and fourth connected domains into a second straight line;
letting v1 be the direction vector from the first connected domain to the second connected domain, v2 the direction vector from the third connected domain to the fourth connected domain, and v3 the vector from the midpoint of the first straight line to the midpoint of the second straight line;
calculating the value sign = (vT · v1) · (vT · v2); if sign > 0, the first connected domain lies on the side of the third connected domain; if sign ≤ 0, the first connected domain lies on the side of the fourth connected domain;
establishing the correspondence between the first and second connected domains and the first and second signal lamps according to the positions at which the third and fourth signal lamps are arranged on the target unmanned aerial vehicle;
where vT is a vector perpendicular to v3.
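A NumPy sketch of this sign test (the blob centres `c1`–`c4` are invented example coordinates; `vT` is obtained here by rotating `v3` through 90°, one convenient choice of perpendicular):

```python
import numpy as np

# Hypothetical image centroids of the four detected connected domains.
c1, c2 = np.array([0.0, 0.0]), np.array([4.0, 0.0])  # first-colour lamp pair
c3, c4 = np.array([1.0, 1.0]), np.array([3.0, 3.0])  # second/third-colour lamps

v1 = c2 - c1                        # first domain -> second domain
v2 = c4 - c3                        # third domain -> fourth domain
v3 = (c3 + c4) / 2 - (c1 + c2) / 2  # midpoint of line 1 -> midpoint of line 2
vT = np.array([-v3[1], v3[0]])      # v3 rotated 90 degrees: perpendicular to v3

sign = (vT @ v1) * (vT @ v2)
side = "third" if sign > 0 else "fourth"  # which lamp the first domain sits beside
```

The product is positive exactly when v1 and v2 point to the same side of the line through the two midpoints, which is what resolves the ambiguity between the two same-colour lamps.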
5. The method of claim 3, wherein, if four connected domains whose distance is smaller than the distance threshold cannot be screened out of the connected-domain set, the identification process of step S2 fails.
6. The unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition according to claim 3, wherein the distance d between the hue mean of a connected domain and a signal lamp color hue is calculated as:
d = min(|h − hm|, 360° − |h − hm|)
where h is the hue value of the signal lamp and hm is the hue mean of the connected domain.
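The published formula survives only as an equation image in the patent record, so any reconstruction is an assumption; the sketch below implements the shortest-arc distance on the 360° hue circle, the reading consistent with the periodicity handling of claim 3:

```python
def hue_distance(h, hm):
    """Shortest angular distance between a lamp hue h and a connected-domain
    hue mean hm, both in degrees, on the periodic 360-degree hue circle."""
    d = abs(h - hm) % 360.0
    return min(d, 360.0 - d)
```

For example, a lamp hue of 350° and a domain mean of 10° are only 20° apart on the hue circle, not 340°.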
7. The unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition according to claim 1, wherein inputting the three-dimensional coordinates and the two-dimensional image coordinates into the attitude estimation model and outputting the position and attitude of the target unmanned aerial vehicle specifically comprises: inputting the three-dimensional coordinates and the two-dimensional image coordinates into the attitude estimation model, and computing and outputting the position of the target unmanned aerial vehicle from them using the LambdaTwist attitude estimation algorithm;
and mapping the three-dimensional coordinates to obtain their corresponding coordinates in the image.
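The mapping step can be sketched with a standard pinhole model; everything here — the intrinsics `K`, the recovered pose `(R, t)`, and the body-frame lamp layout `X` — is an invented example, and the pose solve itself would come from a P3P/PnP routine (such as OpenCV's `cv2.solvePnP`) rather than this snippet:

```python
import numpy as np

# Hypothetical camera intrinsics and a hypothetical recovered pose (R, t).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                   # identity rotation, for illustration only
t = np.array([0.0, 0.0, 10.0])  # target 10 m in front of the camera

# Invented lamp coordinates in the target's body frame (wing tips, tail), metres.
X = np.array([[-2.0,  0.0, 0.0],
              [ 2.0,  0.0, 0.0],
              [ 0.0, -3.0, 0.0],
              [ 0.0, -3.0, 0.5]])

def project(K, R, t, X):
    """Pinhole reprojection: body-frame 3-D points -> pixel coordinates."""
    Xc = (R @ X.T).T + t             # transform into the camera frame
    uvw = (K @ Xc.T).T               # apply the intrinsics
    return uvw[:, :2] / uvw[:, 2:3]  # perspective divide

uv = project(K, R, t, X)  # predicted image positions of the four lamps
```

Comparing `uv` with the measured two-dimensional lamp coordinates gives a reprojection error with which the estimated pose can be checked or refined.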
8. An unmanned aerial vehicle attitude estimation system based on three-color four-lamp marker recognition, characterized in that four signal lamps for marking, in three colors, are arranged at the tips of the wings and tail of a target unmanned aerial vehicle, and the unmanned aerial vehicle attitude estimation system based on three-color four-lamp marker recognition comprises a feature detection unit, a feature identification unit and an attitude analysis and estimation unit;
the feature detection unit is configured to acquire an image, captured by an acquisition device, of the flight state of the target unmanned aerial vehicle bearing the four signal lamp markers, and to process the image in the HSV color space to obtain image features of high brightness and high saturation;
the feature identification unit is configured to extract the connected domains corresponding to the four signal lamps from the image features according to the hues of the three signal lamp colors, identify the one-to-one correspondence between the connected domains and the four signal lamps, and establish two-dimensional image coordinates of the four signal lamps;
the attitude analysis and estimation unit is configured to establish three-dimensional coordinates of the four signal lamps in the frame of the target unmanned aerial vehicle, input the three-dimensional coordinates and the two-dimensional image coordinates into an attitude estimation model, and output the position and attitude of the target unmanned aerial vehicle;
wherein parameters of the acquisition device are set in the attitude estimation model.
9. A computer-readable storage medium for storing computer instructions which, when run on a computer, cause the computer to perform the unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition of any one of claims 1-7.
10. An apparatus, comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute, according to the instructions in the program code, the unmanned aerial vehicle attitude estimation method based on three-color four-lamp marker recognition of any one of claims 1 to 7.
CN202010646221.1A 2020-07-07 2020-07-07 Unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition Active CN111784768B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010646221.1A CN111784768B (en) 2020-07-07 2020-07-07 Unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition

Publications (2)

Publication Number Publication Date
CN111784768A CN111784768A (en) 2020-10-16
CN111784768B (en) 2021-09-24

Family

ID=72758102


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116205975B (en) * 2023-02-01 2023-09-19 广东国地规划科技股份有限公司 Image control point data acquisition method and unmanned aerial vehicle mapping method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205899386U (en) * 2016-08-01 2017-01-18 中国人民武装警察部队总医院 Flight of many rotor unmanned aerial vehicle is with external safety control and system
CN109214288A (en) * 2018-08-02 2019-01-15 广州市鑫广飞信息科技有限公司 It is taken photo by plane the interframe scene matching method and device of video based on multi-rotor unmanned aerial vehicle
CN110182365A (en) * 2019-06-13 2019-08-30 中国矿业大学 A kind of coal mine underground explosion proof type quadrotor drone
CN111324145A (en) * 2020-02-28 2020-06-23 厦门理工学院 Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
US20210366124A1 (en) Graphical fiducial marker identification
US10846844B1 (en) Collaborative disparity decomposition
CN109211207B (en) Screw identification and positioning device based on machine vision
CN110163025A (en) Two dimensional code localization method and device
CN101702233B (en) Three-dimension locating method based on three-point collineation marker in video frame
WO2021098163A1 (en) Corner-based aerial target detection method
CN110807807B (en) Monocular vision target positioning pattern, method, device and equipment
CN106355592B (en) Educational toy set, circuit element thereof and wire identification method
JP2023075366A (en) Information processing apparatus, recognition support method, and computer program
JP6779688B2 (en) Image processing equipment, image processing method, computer program
CN103258346A (en) Three-dimension shooting and printing system
CN113222940B (en) Method for automatically grabbing workpiece by robot based on RGB-D image and CAD model
CN108022245B (en) Facial line primitive association model-based photovoltaic panel template automatic generation method
CN111784768B (en) Unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition
CN110942092B (en) Graphic image recognition method and recognition system
WO2021170051A1 (en) Digital photogrammetry method, electronic device, and system
CN112686872B (en) Wood counting method based on deep learning
CN111435429A (en) Gesture recognition method and system based on binocular stereo data dynamic cognition
WO2023193763A1 (en) Data processing method and apparatus, and tracking mark, electronic device and storage medium
CN114600160A (en) Method of generating three-dimensional (3D) model
CN108961357B (en) Method and device for strengthening over-explosion image of traffic signal lamp
CN115586796A (en) Vision-based unmanned aerial vehicle landing position processing method, device and equipment
CN113591548B (en) Target ring identification method and system
CN115880643A (en) Social distance monitoring method and device based on target detection algorithm
CN111415372B (en) Moving object merging method based on HSI color space and context information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant