CN111951295B - Method and device for determining flight trajectory with high precision based on polynomial fitting and electronic equipment


Publication number
CN111951295B
CN111951295B
Authority
CN
China
Prior art keywords
flight
image
determining
frames
video images
Prior art date
Legal status
Active
Application number
CN202010646894.7A
Other languages
Chinese (zh)
Other versions
CN111951295A (en)
Inventor
王勇
陈东
干哲
范梅梅
李轶博
陈骁
肖永辉
杨伟斌
王涵
王晶
韩晓广
席有猷
周青巍
Current Assignee
Pla 93114
Original Assignee
Pla 93114
Priority date
Filing date
Publication date
Application filed by Pla 93114 filed Critical Pla 93114
Priority to CN202010646894.7A priority Critical patent/CN111951295B/en
Publication of CN111951295A publication Critical patent/CN111951295A/en
Application granted granted Critical
Publication of CN111951295B publication Critical patent/CN111951295B/en


Classifications

    • G06T7/20 Image analysis - Analysis of motion
    • G06T7/70 Image analysis - Determining position or orientation of objects or cameras
    • G06T2207/10016 Image acquisition modality - Video; Image sequence
    • G06T2207/20081 Special algorithmic details - Training; Learning
    • G06T2207/20084 Special algorithmic details - Artificial neural networks [ANN]
    • Y02T10/40 Climate change mitigation technologies related to transportation - Engine management systems

Abstract

The invention discloses a method, a device and electronic equipment for determining a flight trajectory with high precision based on polynomial fitting. The method comprises the following steps: acquiring N frames of video images shot by a flight device during flight, wherein each frame of video image corresponds to a time point; determining the photographing center spatial position information corresponding to each frame of video image in the N frames; and performing curve fitting with a polynomial fitting function according to the time points corresponding to the N frames of video images and the corresponding photographing center spatial position information, to determine a flight trajectory curve of the flight device, wherein the polynomial fitting function comprises three polynomials, each taking the flight time parameter of the flight device as the independent variable and the coordinate value of the flight device in one direction of a space rectangular coordinate system as the dependent variable. The method determines the flight trajectory of the flight device with high precision while reducing both the cost of determining the trajectory and the extra weight added to the flight device.

Description

Method and device for determining flight trajectory with high precision based on polynomial fitting and electronic equipment
Technical Field
The invention relates to the technical field of navigation and positioning, in particular to a method, a device and electronic equipment for determining a flight track with high precision based on polynomial fitting.
Background
At present, flying devices such as unmanned aircraft play a very important role in many fields, and accurately acquiring the flight trajectory of a flying device is very important for applying it well.
In the related art, the position information of a flying device is generally recorded in real time by installing an inertial navigation device or a satellite positioning system on it, so as to obtain its flight trajectory. However, because an inertial navigation device or satellite positioning system is heavy, this approach is disadvantageous for the flight of the flying device; and because such equipment is expensive, it also makes determining the flight trajectory costly.
Disclosure of Invention
The present invention aims to solve, at least to some extent, one of the technical problems in the related art. Therefore, an object of the present invention is to provide a method for determining a flight trajectory with high accuracy based on polynomial fitting, which solves the technical problems in the related art of high cost and of the added weight affecting the flight of the flight device.
The second object of the invention is to provide a device for determining the flight trajectory with high precision based on polynomial fitting.
A third object of the present invention is to propose an electronic device.
A fourth object of the present invention is to propose a computer readable storage medium.
To achieve the above objective, an embodiment of the first aspect of the present invention provides a method for determining a flight trajectory with high accuracy based on polynomial fitting, including the following steps: acquiring N frames of video images shot by a flight device during flight, wherein each frame of video image corresponds to a time point and N is a positive integer greater than 1; determining the photographing center spatial position information corresponding to each frame of video image in the N frames of video images, wherein the photographing center spatial position information comprises the coordinate values of the photographing center in the three directions of a space rectangular coordinate system; and performing curve fitting with a polynomial fitting function according to the time points corresponding to the N frames of video images and the corresponding photographing center spatial position information, and determining a flight trajectory curve of the flight device, wherein the polynomial fitting function comprises three polynomials, each taking the flight time parameter of the flight device as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable.
To achieve the above objective, an embodiment of the second aspect of the present invention provides an apparatus for determining a flight trajectory with high accuracy based on polynomial fitting, including: a first acquisition module, configured to acquire N frames of video images shot by the flight device during flight, wherein each frame of video image corresponds to a time point and N is a positive integer greater than 1; a first determining module, configured to determine the photographing center spatial position information corresponding to each frame of video image in the N frames of video images, wherein the photographing center spatial position information comprises the coordinate values of the photographing center in the three directions of a space rectangular coordinate system; and a second determining module, configured to perform curve fitting with a polynomial fitting function according to the time points corresponding to the N frames of video images and the corresponding photographing center spatial position information, and determine a flight trajectory curve of the flight device, wherein the polynomial fitting function comprises three polynomials, each taking the flight time parameter of the flight device as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable.
To achieve the above object, an embodiment of a third aspect of the present invention provides an electronic device, including a memory, and a processor; the processor executes a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to implement the method for determining the flight trajectory based on polynomial fitting with high precision according to the embodiment of the first aspect of the invention.
To achieve the above object, an embodiment of a fourth aspect of the present invention provides a computer readable storage medium storing a computer program, which when executed by a processor, implements a method for determining a flight trajectory with high accuracy based on polynomial fitting according to the embodiment of the first aspect of the present invention.
The technical scheme of the embodiment of the invention has the following beneficial effects:
Based on the video images shot by the flying device during flight, the method uses polynomial fitting to determine the flight trajectory of the flying device with high precision. Since only a camera needs to be added, and a camera is cheap and light, both the cost required for determining the flight trajectory and the extra weight added to the flying device are reduced.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flow chart of a method for determining a flight trajectory with high accuracy based on polynomial fitting in accordance with one embodiment of the invention;
FIG. 2 is a histogram equalization schematic according to one embodiment of the invention;
FIG. 3 is a schematic diagram of an image convolution operation principle according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of collinear conditions according to one embodiment of the invention;
FIG. 5 is a schematic diagram of a direct linear transformation principle according to one embodiment of the present invention;
FIG. 6 is an exemplary diagram of a template matching classification method according to an embodiment of the invention;
FIG. 7 is a flow chart of a method for determining a flight trajectory with high accuracy based on polynomial fitting in accordance with another embodiment of the invention;
FIG. 8 is a schematic structural diagram of an apparatus for determining a flight trajectory with high accuracy based on polynomial fitting according to one embodiment of the invention; and
fig. 9 is a schematic structural view of an electronic device according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention.
It can be appreciated that, in the related art, the position information of a flying device is generally recorded in real time by installing an inertial navigation device or a satellite positioning system on it, so as to obtain its flight trajectory. However, because an inertial navigation device or satellite positioning system is heavy, this approach is disadvantageous for the flight of the flying device; and because such equipment is expensive, it also makes determining the flight trajectory costly.
According to the method, after N frames of video images corresponding to N time points, taken by a flying device during flight, are acquired, the photographing center spatial position information corresponding to each frame can be determined, and curve fitting can then be performed with a polynomial fitting function according to the time points and the photographing center spatial position information corresponding to the frames, so as to determine the flight trajectory curve of the flying device. The photographing center spatial position information comprises the coordinate values of the photographing center in the three directions of a space rectangular coordinate system; the polynomial fitting function comprises three polynomials, each taking the flight time parameter of the flying device as the independent variable and the coordinate value of the flying device in one direction of the space rectangular coordinate system as the dependent variable. The method thus determines the flight trajectory of the flying device with high precision by polynomial fitting based on the video images taken during flight; and because only a camera needs to be added, which is cheap and light, the cost required for determining the flight trajectory is reduced and the extra weight added to the flying device is reduced.
First, several coordinate systems to which the present application relates will be briefly described.
The image plane coordinate system is a plane rectangular coordinate system describing the position of an image point in the image plane, with its origin usually at the center point of the image.
The image space coordinate system is a space rectangular coordinate system describing the spatial position of an image point in image space; its origin can be set as required.
The object space coordinate system is a coordinate system, specified by the measurer, for the space of the object, such as the ground or other reference objects; its origin can be set as required.
Methods, apparatuses, electronic devices, and computer-readable storage media for determining a flight trajectory based on polynomial fitting with high accuracy according to embodiments of the present invention are described below with reference to the accompanying drawings.
First, a method for determining a flight trajectory based on polynomial fitting with high accuracy provided in the present application will be described with reference to fig. 1. FIG. 1 is a flow chart of a method for determining a flight trajectory with high accuracy based on polynomial fitting, according to one embodiment of the invention.
As shown in fig. 1, the method for determining a flight trajectory with high accuracy based on polynomial fitting according to the embodiment of the invention may include the following steps:
step 101, acquiring N frames of video images shot by a flight device in the flight process, wherein each frame of video image corresponds to a time point, and N is a positive integer greater than 1.
Specifically, the method for determining the flight trajectory based on polynomial fitting with high precision provided by the application can be executed by the device for determining the flight trajectory based on polynomial fitting with high precision, which is hereinafter referred to as a flight trajectory determining device, wherein the flight trajectory determining device can be configured in the electronic equipment to determine the flight trajectory of the flight device with high precision through lower cost and additional weight increase. The electronic device may be any hardware device capable of performing data processing, such as a mobile phone, a computer, and the like. It will be appreciated that the flight trajectory determination device may be configured in the controller of the flight device or in the ground command center of the flight device, as the application is not limited in this respect.
Specifically, a camera can be configured in the flying device so as to shoot video images corresponding to different time points respectively in the flying process of the flying device. In an exemplary embodiment, the camera may be disposed in front of the flying device, and the present application does not limit the location of the camera in the flying device.
In an exemplary embodiment, the camera may capture video during the flight of the flying device and transmit it to the flight trajectory determining device, and the flight trajectory determining device may then perform frame extraction on the video captured during flight to obtain the N frames of video images.
That is, step 101 may specifically include:
step 101a, obtaining a video image shot by a flight device in the flight process.
Step 101b, performing frame extraction on the video image to obtain the N frames of video images.
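By way of illustration, the frame extraction of steps 101a and 101b may be sketched in Python with OpenCV as follows; the function name, file-path handling and sampling stride here are assumptions of the sketch, not part of the embodiment:

    import cv2

    def extract_frames(video_path, stride=1):
        # Split a flight video into frames; each kept frame is paired with
        # its time point, derived from the frame index and the video FPS.
        cap = cv2.VideoCapture(video_path)
        fps = cap.get(cv2.CAP_PROP_FPS)
        frames, time_points = [], []
        index = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if index % stride == 0:
                frames.append(frame)
                time_points.append(index / fps)  # time point in seconds
            index += 1
        cap.release()
        return frames, time_points  # the N frames and their time points

Each returned frame and time-point pair corresponds to one of the N video images used in the subsequent steps.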
It is noted that in practical application, the size of N may be set according to needs, for example, in order to improve accuracy of a flight track of a flight device, curve fitting may be performed by using spatial position information of a shooting center corresponding to more video images, so as to improve accuracy of a determined flight track curve of the flight device, and at this time, the value of N may be set to be larger.
Step 102, preprocessing the N frames of video images by using an image enhancement technology and/or an image denoising technology.
It will be appreciated that the N frames of video images may be pre-processed prior to subsequent processing with the N frames of video images to improve the radiation quality of the N frames of video images. Of course, the captured N frames of video images may be directly used to determine the subsequent flight trajectory without preprocessing, which is not limited in this application.
The process of preprocessing an N-frame video image using image enhancement techniques will be described first.
In an exemplary embodiment, the image enhancement techniques may include an image gray scale transformation technique, a histogram equalization technique, an image sharpening technique, a white balance processing technique, and the like. The present application describes an image gradation conversion technique, a histogram equalization technique, and an image sharpening technique as examples.
Image gray level conversion technology:
the gray level transformation can increase the dynamic range of the image, expand the contrast, make the image clear and the characteristic obvious, and is one of the important means for enhancing the image. The gray scale of the pixel is corrected mainly by utilizing the point operation of the image, the gray scale value of the corresponding output pixel is determined by the gray scale value of the input pixel, and the gray scale value can be regarded as the conversion operation from pixel to pixel without changing the spatial relationship in the image.
The change in pixel gray level is performed according to a transfer function g1(x',y') = T[f'(x',y')] between the gray value f'(x',y') of the input image and the gray value g1(x',y') of the output image. The transfer function has various forms; in the embodiment of the application, the transformation can be performed by a linear transformation method, as shown in formula (1):
g1(x',y') = T[f'(x',y')] = A'*f'(x',y') + B' (1)
wherein the parameter A' is the slope of the linear function, B' is the intercept of the linear function on the y axis, f'(x',y') represents the gray value of the input image, and g1(x',y') represents the gray value of the output image.
In this embodiment of the present application, for each frame of N frames of video images, the gray value of each pixel in the frame of video image may be substituted into equation (1) to obtain the gray value of each pixel after the frame of video image is processed, and the same operation is performed on each frame of video image, so that gray conversion of N frames of video images may be implemented.
The N frames of video images are preprocessed by utilizing an image gray level conversion technology, so that the dynamic range of each frame of video image is increased, the contrast is expanded, each frame of video image is clear and has obvious characteristics, the radiation quality of each frame of video image is improved, and a foundation is laid for accurately determining the space position information of the shooting center corresponding to each frame of video image.
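A minimal Python sketch of the linear transformation of formula (1) follows; the values chosen here for A' and B' are illustrative only:

    import numpy as np

    def linear_gray_transform(image, a_slope=1.5, b_intercept=-40.0):
        # g1(x',y') = A' * f'(x',y') + B'  (formula (1)), clipped to 8-bit range
        g = a_slope * image.astype(np.float32) + b_intercept
        return np.clip(g, 0, 255).astype(np.uint8)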
Histogram equalization techniques:
histogram equalization is the process of converting one image into another with equalized histogram by gray level conversion, i.e. with the same number of pixels at each gray level.
The image histogram represents the distribution of pixel gray values in the image. In general, to make an image clear, increase contrast and highlight image details, the distribution of image gray levels needs to be made substantially uniform from dark to bright, as shown in fig. 2. The histogram equalization technique converts an image with an uneven histogram distribution (for example, an image with most pixel gray levels concentrated in one segment, as shown in the upper part of fig. 2) into a new image with a uniform gray distribution, expanding the dynamic range of the gray histogram. The transformation function for histogram equalization is not arbitrary: it is the integral of the input image histogram, i.e. the cumulative distribution function.
Let the gray-scale transformation s' = f'(r') be a non-decreasing, continuous and differentiable function with finite slope that converts the input image Ii'(x',y') into the output image Io'(x',y'), and let the histogram of the input image be Hi'(r') and the histogram of the output image be Ho'(s'). According to the meaning of the histogram, the corresponding small area elements before and after the gray-scale transformation are equal, i.e. there is the relationship shown in formula (2) between Ho'(s') and Hi'(r'):
Ho'(s')ds' = Hi'(r')dr' (2)
According to this analysis, the mapping relation s'_k of the final histogram equalization process can be obtained, in the form shown in formula (3):
s'_k = Σ_{j=0}^{k} n'_j / n', k = 0, 1, …, L-1 (3)
wherein n' is the total number of pixels in the image, n'_j is the number of pixels at gray level j, and L is the total number of possible gray levels in the image.
In the embodiment of the application, the histogram equalization can be performed on the N frames of video images by using the formula (3), so as to obtain the processed N frames of video images. In the processed N frames of video images, the gray level distribution of each frame of video image is approximately uniform from dark to bright, the processed N frames of video images are clearer, the gray level contrast of the images is increased, the details are enhanced, the radiation quality of each frame of video image is improved, and a foundation is laid for accurately determining the space position information of the shooting center corresponding to each frame of video image.
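For an 8-bit gray image, the mapping of formula (3) can be sketched as follows (a simplified illustration, not the patent's implementation):

    import numpy as np

    def histogram_equalize(image, levels=256):
        # n'_j: number of pixels at each gray level j
        hist = np.bincount(image.ravel(), minlength=levels)
        # s'_k: cumulative histogram normalized by the total pixel count n'
        cdf = np.cumsum(hist) / image.size
        mapping = np.round(cdf * (levels - 1)).astype(np.uint8)
        return mapping[image]  # look up the new gray level of every pixel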
Image sharpening techniques:
The purpose of image sharpening is to make the edges, contours and details of the image clear. The root cause of the blurring of a smoothed image is that the image has been subjected to an averaging or integration operation, so the image can be sharpened by performing the inverse operation (e.g. a differential operation). Therefore, the application makes the processed N frames of video images clearer by performing a differential operation on each of the N frames of video images.
In an exemplary embodiment, high pass filtering and spatial differentiation may be employed for image sharpening.
It can be understood that, for image sharpening by high-pass filtering, the edges and line details of an image correspond to the high-frequency components of the image spectrum; high-pass filtering therefore passes the high-frequency components smoothly while appropriately suppressing the middle- and low-frequency components, so that the details of the image become clear and sharpening is achieved.
In an exemplary embodiment, image sharpening may be implemented based on the Laplacian operator. Specifically, the differential operator used may be the Laplacian operator, a non-directional two-dimensional second-order differential operator, as shown in formula (4):
∇²f' = ∂²f'/∂x'² + ∂²f'/∂y'² (4)
For example, a 3×3 Laplacian convolution template may be:
0   1   0
1  -4   1
0   1   0 (5)
In the embodiment of the present application, the sharpened N frames of video images may be obtained by performing the Laplacian operation on the N frames of video images respectively according to formula (6):
h'(x',y') = f'(x',y') - ∇²f'(x',y') (6)
wherein f'(x',y') is a video image before the sharpening process, and h'(x',y') is the video image after the sharpening process.
The image edge of each frame of video image is clearer in the N frames of video images after sharpening, so that the radiation quality of each frame of video image is improved, and a foundation is laid for accurately determining the space position information of the shooting center corresponding to each frame of video image.
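Formulas (4) to (6) can be sketched with the standard 3×3 template as follows; SciPy's convolution is used here for brevity:

    import numpy as np
    from scipy.ndimage import convolve

    LAPLACE_TEMPLATE = np.array([[0,  1, 0],
                                 [1, -4, 1],
                                 [0,  1, 0]], dtype=np.float32)  # formula (5)

    def laplacian_sharpen(image):
        # h' = f' - laplacian(f')  (formula (6))
        f = image.astype(np.float32)
        lap = convolve(f, LAPLACE_TEMPLATE, mode='nearest')
        return np.clip(f - lap, 0, 255).astype(np.uint8)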
The following describes a process of preprocessing N frames of video images using an image denoising technique.
In an exemplary embodiment, denoising may be performed on N frames of video images by median filtering techniques, gaussian filtering techniques, bilateral filtering techniques, and the like.
Median filtering technique:
The median filtering technique is a nonlinear smoothing technique that sets the gray value of each pixel to the median of the gray values of all pixels within a certain neighborhood window around that point. Median filtering is a nonlinear signal processing technique, based on order statistics, that can effectively suppress noise: the value of a point in the image is replaced with the median of the values in its neighborhood, so that surrounding pixel values approach the true value and isolated noise points are eliminated.
In specific implementation, each pixel in each frame of video image can be scanned by using a two-dimensional sliding template with a certain structure, the pixels covered by the template in the frame of video image are ordered according to the size of pixel values, a monotonically ascending or descending two-dimensional data sequence is generated, and therefore the median value in the two-dimensional data sequence is used as the value of the pixel point corresponding to the central pixel point of the template in the frame of video image.
Wherein the two-dimensional median filtering can be expressed as shown in formula (7):
g2(x',y')=med{f'(x'-k',y'-l'),(k',l'∈W)} (7)
where f ' (x ', y ') is the original video image, and g2 (x ', y ') is the processed video image. W is the two-dimensional sliding template, and k 'and l' are the row number and the column number of the pixels in the two-dimensional sliding template, respectively. The two-dimensional sliding template can be 3*3 or 5*5. In addition, the shape of the two-dimensional sliding template may be linear, circular, cross-shaped, circular ring-shaped, or the like, which is not limited in this application.
The N frames of video images are preprocessed by using a median filtering technology, so that the transition of pixel gray values after the processing of each frame of video image is obviously smoothed, the radiation quality of each frame of video image is improved, and a foundation is laid for the follow-up accurate determination of the spatial position information of the shooting center corresponding to each frame of video image.
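A deliberately simple sketch of the sliding-template median filter of formula (7) follows; the window size is illustrative, and a production system would use an optimized library routine:

    import numpy as np

    def median_filter(image, size=3):
        # W is the size x size two-dimensional sliding template of formula (7)
        pad = size // 2
        padded = np.pad(image, pad, mode='edge')
        out = np.empty_like(image)
        for x in range(image.shape[0]):
            for y in range(image.shape[1]):
                window = padded[x:x + size, y:y + size]
                out[x, y] = np.median(window)  # middle of the sorted sequence
        return out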
Gaussian filtering technique:
gaussian filtering is a linear smoothing filtering, is suitable for eliminating Gaussian noise, and is widely applied to a noise reduction process of image processing. The gaussian filtering is a process of performing weighted average on the whole image, and the value of each pixel point is obtained by performing weighted average on the pixel point and other pixel values in the neighborhood.
Specifically, when the image processing is performed by using gaussian filtering, as shown in fig. 3, a template (or referred to as convolution or mask) (B1 in fig. 3) may be used to scan each pixel in the image to be processed (A1 in fig. 3), and the weighted average gray value of the pixels in the neighborhood determined by the template is used to replace the value of the pixel corresponding to the pixel point in the center of the template (the pixel point where the five-pointed star in B1) in the image to be processed.
In particular, taking the preprocessing of one of the N frames of video images as an example, smoothing filtering may be performed on the video image to be processed first. According to human visual characteristics, the filtering function may be chosen as the Gaussian function G(x',y') shown in formula (8):
G(x',y') = exp[-(x'² + y'²)/(2σ²)] / (2πσ²) (8)
wherein G(x',y') is a circularly symmetric function whose smoothing effect is controlled by σ.
Then, as shown in fig. 3, the Gaussian template G(x',y') (i.e. B1 in fig. 3) may be convolved with the video image f'(x',y') to be processed (i.e. A1 in fig. 3) in the manner shown in formula (9), so as to obtain the processed, smoothed video image g3(x',y'):
g3(x',y')=f'(x',y')*G(x',y') (9)
Through the mode, the image filtering based on the Gaussian operator can be realized, the pixel gray value transition of the processed video image is smooth, the pixel continuous part is not interrupted, the radiation quality of each frame of video image is improved, and a foundation is laid for the follow-up accurate determination of the spatial position information of the shooting center corresponding to each frame of video image.
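The Gaussian kernel of formula (8) and the convolution of formula (9) can be sketched as follows; the kernel size and σ are illustrative:

    import numpy as np
    from scipy.ndimage import convolve

    def gaussian_kernel(size=5, sigma=1.0):
        # Sample the circularly symmetric G(x', y') of formula (8) on a grid
        half = size // 2
        x, y = np.mgrid[-half:half + 1, -half:half + 1]
        g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
        return g / g.sum()  # normalize so overall brightness is preserved

    def gaussian_smooth(image, size=5, sigma=1.0):
        # g3 = f' * G  (the convolution of formula (9))
        return convolve(image.astype(np.float32), gaussian_kernel(size, sigma))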
Bilateral filtering technology:
the bilateral filtering is a filter capable of protecting edges and removing noise, and the filter is composed of two functions, so that the effect of protecting edges and removing noise can be achieved.
One of the two functions of the bilateral filter determines the filter coefficients from the geometric (spatial) distance, and the other determines them from the pixel intensity difference. The advantage of the bilateral filter is that it preserves edges: Wiener filtering or Gaussian filtering visibly blurs edges and protects high-frequency details poorly, whereas the bilateral filter adds one more Gaussian variance on top of Gaussian filtering. Because it combines a Gaussian filter function based on spatial distribution with one based on pixel differences, pixels far from an edge have little influence on the pixel values near the edge, which guarantees the preservation of pixel values near edges while still reducing noise.
Specifically, the edge preserving property of bilateral filtering can be realized by combining a space domain function and a value domain kernel function in the convolution process.
The N frames of video images are preprocessed by bilateral filtering, so that the pixel gray values of the processed video images of each frame are transitionally flattened, the edge characteristics are well reserved, the radiation quality of the video images of each frame is improved, and a foundation is laid for accurately determining the space position information of the shooting center corresponding to the video images of each frame.
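In OpenCV the two kernels described above map onto the sigmaSpace (space-domain) and sigmaColor (value-domain) parameters of cv2.bilateralFilter; assuming `frame` holds one video frame, and with illustrative parameter values:

    import cv2

    # d: neighborhood diameter; sigmaColor: value-domain kernel width;
    # sigmaSpace: space-domain kernel width
    denoised = cv2.bilateralFilter(frame, d=9, sigmaColor=75, sigmaSpace=75)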
In the embodiment of the present application, when each frame of video image is preprocessed, only image enhancement processing may be performed on each frame of video image, or only image denoising processing may be performed on each frame of video image, or image enhancement processing and image denoising processing may also be performed on each frame of video image at the same time, and in addition, any image enhancement processing technique may be selected as required to implement image enhancement, or any image denoising processing technique may be selected as required to implement image denoising.
Step 103, determining the space position information of the shooting center corresponding to each frame of video image in the N frames of video images.
It should be noted that, the N frames of video images in this and subsequent steps are preprocessed N frames of video images.
Wherein the photographing center spatial position information is used for representing the spatial position of the photographing beam at the photographing moment, and may comprise the three-dimensional coordinate values (X_s, Y_s, Z_s) of the photographing center in the space rectangular coordinate system, i.e. the coordinate values corresponding to the three directions respectively.
Specifically, the spatial position information of the photographing center corresponding to each frame of video image in the N frames of video may be determined in various manners.
For example, the direct linear transformation model may be used to determine the spatial location information of the photographing center corresponding to each of the N frames of video images. Accordingly, before step 103, the method may further include: and establishing a direct linear transformation model according to the characteristics of the central projection of the area array video image.
It can be understood that the area array video image has the characteristic of central projection, and in the embodiment of the application, a direct linear transformation model can be established based on the characteristic of central projection of the area array video image, so that the photographic central space position information corresponding to each frame of video image in the N frames of video images is determined by using the direct linear transformation model. For knowledge of the center projection, reference may be made to descriptions in the related art, which are not repeated herein.
It should be noted that, for a frame of video image, the spatial position information of the shooting center of the video image is the spatial position information of a camera configured in the flying device when shooting the video image, that is, the spatial position information of the flying device in a space rectangular coordinate system at the moment corresponding to the frame of video image recorded by the ephemeris of the flying device.
It is understood that the direct linear transformation model in this application is built based on the collinearity condition. The principles of a series of problems such as single-image space resection, two-image space forward intersection and bundle block adjustment are all based on the collinearity condition, whose expression form and usage differ according to the specific problem being handled.
The principle of the collinearity condition and the derivation of the collinearity condition equations are described first. In this application, (x, y) are the image plane coordinates of an image point, (x_0, y_0) are the coordinates of the principal point (the center point of the image), (x_0, y_0, f) are the interior orientation elements of the image, (X_S, Y_S, Z_S) are the object space coordinates of the photographing center, (X, Y, Z) are the image space auxiliary coordinates of the image point, (X_A, Y_A, Z_A) are the object space coordinates of the object point, (a_i, b_i, c_i) (i = 1, 2, 3) are the 9 direction cosines formed from the 3 exterior orientation angle elements of the image, and (Δx, Δy) are the systematic error corrections, which contain ds and dβ.
As shown in fig. 4, S is the photographing center, whose coordinates in the specified object space coordinate system are (X_S, Y_S, Z_S); A is any object space point, whose object space coordinates are (X_A, Y_A, Z_A); a is the image of A on the photo, with image space coordinates (x, y, -f) and image space auxiliary coordinates (X, Y, Z). When the three points S, A and a lie on one straight line, the image space auxiliary coordinates (X, Y, Z) of the image point a and the object space coordinates (X_A, Y_A, Z_A) of the object point A directly satisfy the relationship of formula (10):
X/(X_A - X_S) = Y/(Y_A - Y_S) = Z/(Z_A - Z_S) = 1/λ (10)
wherein λ is a proportionality (scale) factor. The image space coordinates and the image space auxiliary coordinates are related as shown in formula (11):
[x, y, -f]^T = R^T [X, Y, Z]^T (11)
wherein R is the rotation matrix formed by the direction cosines (a_i, b_i, c_i). Formula (11) is developed as formula (12):
x = a_1*X + b_1*Y + c_1*Z
y = a_2*X + b_2*Y + c_2*Z
-f = a_3*X + b_3*Y + c_3*Z (12)
Substituting formula (12) into formula (10) and taking the coordinates (x_0, y_0) of the principal point into account, formulas (13) and (14) can be obtained:
x - x_0 = -f * [a_1(X_A - X_S) + b_1(Y_A - Y_S) + c_1(Z_A - Z_S)] / [a_3(X_A - X_S) + b_3(Y_A - Y_S) + c_3(Z_A - Z_S)] (13)
y - y_0 = -f * [a_2(X_A - X_S) + b_2(Y_A - Y_S) + c_2(Z_A - Z_S)] / [a_3(X_A - X_S) + b_3(Y_A - Y_S) + c_3(Z_A - Z_S)] (14)
Formulas (13) and (14) are the collinearity condition equations.
It will be appreciated that the direct linear transformation solution is an algorithm that establishes a direct linear relationship between the comparator coordinates of image points and the object space coordinates of the corresponding object points. Comparator coordinates here are the direct readings of the coordinate measuring instrument, i.e. raw readings that do not need to be converted into coordinates with the principal point of the image as origin.
The direct linear transformation solution is particularly suitable for the photogrammetric processing of images shot by non-measuring cameras, because it requires no initial approximations of the interior and exterior orientation elements. Close-range photogrammetry often uses various types of non-measuring cameras, such as ordinary cameras and high-speed cameras, so the algorithm has become an important component of close-range photogrammetry.
The direct linear transformation solution is in principle derived from collinear conditional equations.
According to the collinearity condition equations (13) and (14), as shown in fig. 5, when a frame of image taken by a non-measuring camera is placed in a certain comparator coordinate system, formulas (13) and (14) evolve into formulas (15) and (16):
x - x_0 + Δx = -f_x * [a_1(X_A - X_S) + b_1(Y_A - Y_S) + c_1(Z_A - Z_S)] / [a_3(X_A - X_S) + b_3(Y_A - Y_S) + c_3(Z_A - Z_S)] (15)
y - y_0 + Δy = -f_y * [a_2(X_A - X_S) + b_2(Y_A - Y_S) + c_2(Z_A - Z_S)] / [a_3(X_A - X_S) + b_3(Y_A - Y_S) + c_3(Z_A - Z_S)] (16)
The systematic error corrections (Δx, Δy) in formulas (15) and (16) are assumed for the moment to contain only the linear error corrections caused by the non-perpendicularity error dβ of the comparator coordinate system and the scale non-uniformity error ds. The coordinate system o-xy is a non-rectangular coordinate system whose two axes deviate from perpendicularity by dβ; the two coordinate systems in fig. 5 are a rectangular coordinate system with the principal point o as origin and the non-rectangular system o-xy. The coordinates of the principal point o of the image are (x_0, y_0). The coordinates of an image point p' measured in the non-rectangular system o-xy are affected by dβ and ds and therefore contain linear errors, while the point p corresponding to p' is the ideal position, whose coordinates (x, y) are error-free.
Assume the x direction has no scale error (its scale normalization coefficient is 1) and the scale normalization coefficient of the y direction is 1 + ds. In this case, if the principal distance of the photo in the x direction is f_x, the principal distance f_y of the photo in the y direction is:
f_y = f_x(1 + ds) (17)
the scale non-uniform error ds can be considered to be caused by factors such as non-uniform unit lengths of the x axis and the y axis of the used coordinate system, uneven deformation of photographic materials, and the like; while the non-orthogonality error dβ may be considered to be caused by non-perpendicularity of the x-axis and y-axis of the coordinate apparatus used.
Thus, the linear error corrections Δx and Δy are:
Δx = (1 + ds)(y - y_0)sin dβ ≈ (y - y_0)sin dβ (18)
Δy = [(1 + ds)cos dβ - 1](y - y_0) ≈ (y - y_0)ds (19)
In this case, the collinearity condition equations containing only the linear error corrections can be rewritten by merging the interior orientation elements, the direction cosines and the corrections dβ and ds into eleven coefficients l_1, l_2, …, l_11, as shown in equation (20); in particular:
l_4 = -(l_1*X_s + l_2*Y_s + l_3*Z_s)
l_8 = -(l_5*X_s + l_6*Y_s + l_7*Z_s) (20)
wherein r_1 = -(a_1*X_S + b_1*Y_S + c_1*Z_S), r_2 = -(a_2*X_S + b_2*Y_S + c_2*Z_S), r_3 = -(a_3*X_S + b_3*Y_S + c_3*Z_S).
To sum up, the basic relation of the direct linear transformation solution can be derived:
x + Δx + (l_1*X + l_2*Y + l_3*Z + l_4)/(l_9*X + l_10*Y + l_11*Z + 1) = 0
y + Δy + (l_5*X + l_6*Y + l_7*Z + l_8)/(l_9*X + l_10*Y + l_11*Z + 1) = 0 (21)
wherein formula (21) is the formula of the direct linear transformation model, and l_1, l_2, …, l_11 are the equation coefficients of the direct linear transformation model.
According to the expressions of l_1, l_2, …, l_11 in formulas (20) and (21), the direction cosines (a_3, b_3, c_3, a_2, …) of the image can be solved, as shown in equation (22); in particular, with r_3² = 1/(l_9² + l_10² + l_11²),
a_3 = l_9*r_3, b_3 = l_10*r_3, c_3 = l_11*r_3 (22)
and the remaining direction cosines are obtained in the same way. The exterior orientation elements of the image can then be obtained from the direction cosines, as shown in equation (23).
In summary, for one frame of image, after the coefficients l_1, l_2, …, l_11 are solved, the 11 independent parameters of the corresponding image can be solved according to the above relations. The 11 parameters comprise the 3 interior orientation elements (x_0, y_0, f_x), the 6 exterior orientation elements, the non-orthogonality angle dβ and the scale non-uniformity coefficient ds. The principal distance f_y of the image in the y direction is not an independent parameter, since by formula (17) it is determined by f_x and ds; it therefore does not need to be solved independently and can be obtained from the other parameters.
It is understood that the direct linear transformation solution can also be regarded as a photogrammetric analytical processing method based on the collinearity condition equations. It is called the direct linear transformation solution because it establishes a direct and linear relationship between the comparator coordinates (x, y) and the object space coordinates (X, Y, Z).
The direct linear transformation can be regarded as a modified "space resection - space intersection" solution, whose "resection" is used to solve for the coefficients l_1, l_2, …, l_11 and whose "intersection" is used to solve for the object space coordinates (X, Y, Z).
In the embodiment of the application, the direct linear transformation model can be applied to any frame of video image to determine the spatial position information of the shooting center corresponding to the any frame of video image.
In this embodiment of the present application, after each equation of the direct linear transformation model is established, the direct linear transformation model may be used to sequentially determine the spatial position information of the photographing center corresponding to each frame of video image in the N frames of video images until the spatial position information of the photographing center corresponding to each frame of video image is determined.
Specifically, for each frame of video image, the feature points in the frame can be extracted first, and their image space coordinates in the image plane coordinate system obtained; the equation coefficients of the direct linear transformation model are then solved, and the 6 exterior orientation elements corresponding to the frame are solved from these coefficients, so that (X_S, Y_S, Z_S) is obtained as the photographing center spatial position information.
A specific procedure for determining the spatial position information of the photographing center corresponding to each of the N frames of video images using the direct linear transformation model will be described below.
Specifically, step 103 may be implemented by the following steps 103a-103d.
Step 103a, extracting feature points in each frame of video image for each frame of video image in the N frames of video images.
The feature points extracted from each frame of video image are points that share the same features with the video image corresponding to the adjacent time point.
It can be understood that, because the scene captured by the camera arranged on the flying device changes continuously during flight, and the degree of scene change grows with the time interval, video images corresponding to adjacent time points may share many feature points with the same features, while video images far apart in time share few; that is, the larger the time interval between two time points, the fewer feature points with the same features in the corresponding video images.
In the exemplary embodiment, feature points in each frame of video image may be extracted by a template matching classification method, a geometric classifier, an artificial neural network classifier, a support vector machine classifier, and the like.
Classifying a sample according to the template it most resembles is known as the template matching classification method.
The template matching classification compares an unknown image, i.e., an image to be identified, with a standard image to see if they are identical or to calculate their degree of similarity. The template matching classifier takes each sample of the training sample set as a standard template, compares the image to be identified with each template, finds out the standard template which is the most similar and closest, and takes the nearest category in the standard template as the category of the identification result. In the classifying process, any image to be identified is compared with the existing templates in similarity, or the characteristic of each image to be identified is compared with the average value of the characteristic values of various templates to find out the most similar template.
As shown in fig. 6, let the template be T1(m, n) with size M1×M1, and the image to be compared be S1(m, n) with size N1×N1, where N1 ≥ M1. The template T1 is overlaid on the image S1 and translated; the area covered by the template is called the sub-image S1^{i',j'}, where (i', j') are the coordinates in S1 of the pixel at the upper left corner of the template, called the reference point. It can be seen that 1 ≤ i', j' ≤ N1 - M1 + 1.
Now T1 and S1^{i',j'} can be compared; if they are identical, their difference is zero. In an exemplary embodiment, the degree of similarity D(i',j') may be described using the following formula (24):
D(i',j') = Σ_{m=1}^{M1} Σ_{n=1}^{M1} [S1^{i',j'}(m,n) - T1(m,n)]² (24)
Thus, the correlation coefficient R(i',j') of the following formula (25) can be used as the similarity measure:
R(i',j') = Σ_{m=1}^{M1} Σ_{n=1}^{M1} [S1^{i',j'}(m,n)·T1(m,n)] / {[Σ_{m=1}^{M1} Σ_{n=1}^{M1} (S1^{i',j'}(m,n))²]^{1/2} · [Σ_{m=1}^{M1} Σ_{n=1}^{M1} (T1(m,n))²]^{1/2}} (25)
the characteristic of each image to be compared can be compared with the average value of the characteristic values of various templates by using the formula (24) or (25) so as to find out the most similar template and realize matching.
In this embodiment of the present application, each frame of video image may be compared with the video image corresponding to the adjacent time point thereof in a similar manner, and then, according to the similarity and the magnitude of the preset similarity threshold, the point with the similarity greater than the preset threshold is extracted as the feature point of each frame of video image.
The size of the similarity threshold can be set according to requirements.
It can be understood that the smaller the similarity threshold is set, the more feature points of each frame of video image are extracted, and the larger the similarity threshold is set, the fewer feature points of each frame of video image are extracted, and therefore, the required number of feature points can be obtained by setting the size of the similarity threshold.
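The correlation measure of formula (25) and the thresholding described above can be sketched as follows (a brute-force illustration; `threshold` stands for the preset similarity threshold):

    import numpy as np

    def correlation_map(image, template):
        # Slide the template T1 over the image S1 and evaluate the correlation
        # coefficient R(i', j') of formula (25) at every reference point.
        m = template.shape[0]
        t = template.astype(np.float64)
        t_norm = np.sqrt((t ** 2).sum())
        rows = image.shape[0] - m + 1
        cols = image.shape[1] - m + 1
        r = np.zeros((rows, cols))
        for i in range(rows):
            for j in range(cols):
                s = image[i:i + m, j:j + m].astype(np.float64)  # S1^{i',j'}
                r[i, j] = (s * t).sum() / (np.sqrt((s ** 2).sum()) * t_norm + 1e-12)
        return r

    # feature points: reference points whose similarity exceeds the threshold
    # candidates = np.argwhere(correlation_map(img, tpl) > threshold)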
Step 103b, obtaining the image space coordinates of the feature points in the image plane coordinate system.
And step 103c, taking the characteristic points as control points, and determining the object space coordinates of the control points in the object space coordinate system according to the image space coordinates of the characteristic points in the image plane coordinate system.
And step 103d, determining the shooting center space position information corresponding to the video image by using a direct linear transformation model according to the image space coordinates of the feature points in the image plane coordinate system and the object space coordinates of the control points in the object space coordinate system.
Specifically, after the feature points of each frame of video image are extracted, the image space coordinates of each feature point can be determined according to its position in the corresponding video image. For a frame of video image, after the image space coordinates of each feature point in the image plane coordinate system are obtained, the feature points can be used as control points, their object space coordinates in the object space coordinate system determined, and the image space coordinates and object space coordinates of the several control points substituted into formulas (20) and (21), so that l_1, l_2, …, l_11 can be calculated. Then, according to the values of l_1, l_2, …, l_11 and formulas (22) and (23), the 11 parameters, including the exterior orientation elements and the interior orientation elements, can be calculated, and (X_S, Y_S, Z_S) is further taken as the photographing center spatial position information.
It should be noted that, in the traditional space resection solution, if the exterior orientation elements and the interior orientation elements are to be solved simultaneously, the control points must not all be arranged in the same plane; otherwise the solution is unstable. Similarly, in the present application, when the photographing center spatial position information is resolved with the direct linear transformation model, since the exterior and interior orientation elements are resolved together, the control points likewise cannot be arranged in one plane of any orientation.
In the embodiment of the application, when the photographing center spatial position information is calculated using the direct linear transformation model, at least six well-distributed control points are required, and the control points cannot all be arranged in one plane (a plane of any orientation), so as to avoid uncertainty in the calculation result. In an exemplary embodiment, the control points may be arranged uniformly so that they surround the object to be measured, and the larger the imaged extent of the control points on the image, the better.
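As an illustration of this resection step, the following sketch builds the linear system of formula (21) from at least six non-coplanar control points, ignoring the corrections Δx and Δy for simplicity, and recovers the photographing center from the relations given above for l_4 and l_8; the function and variable names are assumptions of the sketch:

    import numpy as np

    def dlt_photo_center(img_pts, obj_pts):
        # img_pts: (n, 2) image plane coordinates (x, y) of the control points
        # obj_pts: (n, 3) object space coordinates (X, Y, Z) of the same points
        rows, rhs = [], []
        for (x, y), (X, Y, Z) in zip(img_pts, obj_pts):
            # from x + (l1 X + l2 Y + l3 Z + l4)/(l9 X + l10 Y + l11 Z + 1) = 0
            rows.append([X, Y, Z, 1, 0, 0, 0, 0, x * X, x * Y, x * Z])
            rhs.append(-x)
            rows.append([0, 0, 0, 0, X, Y, Z, 1, y * X, y * Y, y * Z])
            rhs.append(-y)
        l, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
        # l4 = -(l1 Xs + l2 Ys + l3 Zs), l8 = -(l5 Xs + l6 Ys + l7 Zs), and the
        # denominator of formula (21) vanishes at the center:
        # l9 Xs + l10 Ys + l11 Zs = -1
        m = np.array([l[0:3], l[4:7], l[8:11]])
        b = np.array([-l[3], -l[7], -1.0])
        return np.linalg.solve(m, b)  # (X_S, Y_S, Z_S)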
And 104, performing curve fitting by using a polynomial fitting function according to the corresponding time points of the N frames of video images and the corresponding photographic center space position information, and determining a flight track curve of the flight device.
In specific implementation, after the photographing center spatial position information corresponding to each of the N frames of video images has been determined, i.e. after the N pieces of photographing center spatial position information have been determined, curve fitting can be performed using them to determine the flight trajectory curve of the flight device. Because each of the N frames of video images corresponds to a time point, curve fitting can be performed according to the time points corresponding to the N frames and the corresponding photographing center spatial position information, to determine a flight trajectory curve function taking the flight time parameter t of the flight device as the independent variable and the spatial position parameters of the flight device as the dependent variables.
In the specific implementation, curve fitting can be performed by using a polynomial fitting function according to the corresponding time points of the N frames of video images and the corresponding space position information of the photographing center, so as to determine the flight track curve of the flight device.
It can be understood that each of the N frames of video images taken during the flight of the flight device corresponds to a time point, and the photographing center spatial position information corresponding to each frame comprises the three-dimensional coordinate values (X_s, Y_s, Z_s) of the photographing center in the space rectangular coordinate system, i.e. the coordinate values in the three directions, where X_s, Y_s and Z_s respectively represent the coordinate values of the flight device in the three directions. In this embodiment of the application, when curve fitting is performed by the polynomial fitting method, the polynomial fitting function may comprise three polynomials, each taking the flight time parameter t of the flight device as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable.
In an exemplary embodiment, according to the time points corresponding to the N frames of video images and the spatial position information of the shooting centers corresponding to the N frames of video images, each coefficient of a polynomial is solved by a general polynomial fitting method, so that a functional formula of a flight trajectory curve of the flight device is determined.
Taking a cubic polynomial as an example, the fitting function of general polynomial fitting may take the form shown in formulas (26)-(28):
x1" = p_x1 + p_x2*t + p_x3*t^2 + p_x4*t^3 (26)
y1" = p_y1 + p_y2*t + p_y3*t^2 + p_y4*t^3 (27)
z1" = p_z1 + p_z2*t + p_z3*t^2 + p_z4*t^3 (28)
wherein p_x1, p_x2, p_x3, p_x4, p_y1, p_y2, p_y3, p_y4, p_z1, p_z2, p_z3 and p_z4 are the polynomial coefficients, t is the flight time parameter of the flight device, and x1", y1" and z1" are the coordinate values of the flight device in the three directions of the space rectangular coordinate system.
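Assuming arrays t, xs, ys and zs hold the N time points and the corresponding photographing center coordinates, the fit of formulas (26)-(28) can be sketched with NumPy:

    import numpy as np

    def fit_trajectory_cubic(t, xs, ys, zs):
        # one cubic polynomial per coordinate axis, formulas (26)-(28);
        # np.polyfit returns the highest-order coefficient first
        px = np.polyfit(t, xs, 3)
        py = np.polyfit(t, ys, 3)
        pz = np.polyfit(t, zs, 3)
        def trajectory(tq):  # (x1", y1", z1") at flight time tq
            return (np.polyval(px, tq), np.polyval(py, tq), np.polyval(pz, tq))
        return trajectory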
In an exemplary embodiment, according to the spatial position information of the shooting centers corresponding to the N frames of video images, each coefficient of the polynomial is solved by using the Chebyshev polynomial fitting method, so that the functional formula of the flight trajectory curve of the flight device is determined.
Taking a sixth-order polynomial as an example, the fitting function of the Chebyshev polynomial fitting may take the form shown in formulas (29)-(31).
x2'' = p_{x1} + p_{x2} t + p_{x3} t^2 + p_{x4} t^3 + p_{x5} t^4 + p_{x6} t^5 + p_{x7} t^6 (29)

y2'' = p_{y1} + p_{y2} t + p_{y3} t^2 + p_{y4} t^3 + p_{y5} t^4 + p_{y6} t^5 + p_{y7} t^6 (30)

z2'' = p_{z1} + p_{z2} t + p_{z3} t^2 + p_{z4} t^3 + p_{z5} t^4 + p_{z6} t^5 + p_{z7} t^6 (31)
Wherein p_{x1}, p_{x2}, p_{x3}, ..., p_{z5}, p_{z6}, p_{z7} are the coefficients of the Chebyshev polynomials, t is the time parameter of the flight of the flight device, and x2'', y2'' and z2'' are the coordinate values of the flight device in the three directions of the space rectangular coordinate system.
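A minimal sketch of the Chebyshev variant, assuming numpy's `numpy.polynomial.chebyshev` module is used: the fit is performed in the Chebyshev basis and the coefficients are then converted to the power basis so they can be read as the p coefficients of formulas (29)-(31). The function name is illustrative.

```python
import numpy as np
from numpy.polynomial import chebyshev as cheb

def fit_axis_chebyshev(t, coord, degree=6):
    """Fit one axis with a sixth-order Chebyshev polynomial, formulas (29)-(31)."""
    # Least-squares fit in the Chebyshev basis; for best numerical
    # conditioning, t can first be scaled to the interval [-1, 1].
    c = cheb.chebfit(t, coord, degree)
    # Convert the Chebyshev-basis coefficients to an ordinary power
    # series, giving p1 ... p7 in ascending powers of t.
    return cheb.cheb2poly(c)
```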
In addition, the flight trajectory curve of the flight device can also be determined in other ways. For example, curve fitting can be performed by using a global optimization method according to the time points corresponding to the N frames of video images and the corresponding photographing-center spatial position information, so as to determine the flight trajectory curve of the flight device.
In an exemplary embodiment, automatic best-fit function matching can be performed by the Levenberg-Marquardt method together with a general global optimization method to obtain the form of the best fitting function; curve fitting is then performed with the best fitting function and its coefficients are solved to determine the flight trajectory curve of the flight device.
Best-fit function matching by the Levenberg-Marquardt method combined with a general global optimization method can yield a range of fitting function forms; the embodiment of the application is illustrated by taking a polynomial-like form as an example. The fitting function may include three polynomials, where each polynomial takes the flight time parameter t of the flight device as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable. At least one term of at least one of the polynomials may be an exponential function of the natural constant e, such as e^t.
In an exemplary embodiment, the fitting function obtained by best-fit function matching with the Levenberg-Marquardt method and a general global optimization method may take the form of formulas (32)-(34).
x3'' = p_{x1} + p_{x2} t^2 + p_{x3} t^{0.5} + p_{x4} e^{-t} (32)

y3'' = p_{y1} + p_{y2} t + p_{y3} t^2 + p_{y4} t^{0.5} + p_{y5} e^{t} (33)

z3'' = p_{z1} + p_{z2} t + p_{z3} t^{1.5} + p_{z4} t^2 + p_{z5} t^{2.5} (34)
Wherein p_{x1}, p_{x2}, p_{x3}, ..., p_{z3}, p_{z4}, p_{z5} are the coefficients of the polynomials, t is the time parameter of the flight of the flight device, and x3'', y3'' and z3'' are the coordinate values of the flight device in the three directions of the space rectangular coordinate system.
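As a sketch of how such a mixed-basis fit might be solved, `scipy.optimize.curve_fit` can be used; its default method for unconstrained problems is the Levenberg-Marquardt algorithm ('lm'). The model function below encodes formula (32) for the x axis, with formulas (33) and (34) handled analogously; the starting values and function names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def x_model(t, p1, p2, p3, p4):
    # Formula (32): x3'' = p_x1 + p_x2*t^2 + p_x3*t^0.5 + p_x4*e^(-t).
    # Requires t >= 0 because of the t^0.5 term.
    return p1 + p2 * t**2 + p3 * np.sqrt(t) + p4 * np.exp(-t)

def fit_x_axis(t, xs):
    # method='lm' selects the Levenberg-Marquardt optimizer.
    coeffs, _cov = curve_fit(x_model, t, xs, p0=np.ones(4), method='lm')
    return coeffs  # p_x1 ... p_x4
```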
For the specific process of performing curve fitting according to the photographing-center spatial position information corresponding to the N frames of video images, reference may be made to the description in the related art, which is not detailed in this application.
It can be understood that, in the embodiment of the present application, after the flight trajectory curve of the flight device is determined, the landing point position information of the flight device may also be determined according to the flight trajectory curve. That is, after step 104, the method may further include:
Step 105, obtaining the landing time of the flying device.

Step 106, determining the landing point position information of the flying device according to the landing time and the flight trajectory curve.
Specifically, during the flight of the flying device, the flight speed and the flight distance of the flying device can be obtained in real time, so that the landing time of the flying device can be estimated from the flight speed and the flight distance.

After the landing time of the flying device is estimated, the landing time can be substituted into the curve function of the flight trajectory curve to determine the landing point position information of the flying device.
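A minimal sketch of steps 105 and 106, assuming a roughly constant speed over the remaining flight segment and polynomial coefficients stored in ascending order as in the fitting sketches above; the helper names are hypothetical.

```python
def estimate_landing_time(t_now, remaining_distance, speed):
    """Step 105: estimate the landing time from real-time speed and distance."""
    return t_now + remaining_distance / speed

def landing_point(px, py, pz, t_land):
    """Step 106: substitute the landing time into the fitted trajectory curve."""
    evaluate = lambda p, t: sum(c * t**k for k, c in enumerate(p))
    return evaluate(px, t_land), evaluate(py, t_land), evaluate(pz, t_land)
```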
According to the method for determining a flight trajectory with high precision based on polynomial fitting provided in this embodiment of the application, N frames of video images taken by the flight device during flight and corresponding to N time points are first acquired. The N frames of video images are then preprocessed by image enhancement and/or image denoising techniques, feature points are extracted from each frame of the preprocessed video images, and the photographing-center spatial position information corresponding to each frame is determined by a direct linear transformation model. Next, curve fitting is performed with a polynomial fitting function according to the time points and the photographing-center spatial position information respectively corresponding to the N frames of video images, so as to determine the flight trajectory curve of the flight device. Finally, the landing time of the flight device is obtained, and the landing point position information of the flight device is determined according to the landing time and the flight trajectory curve. In this way, based on the video images taken by the flight device during flight, the flight trajectory of the flight device is determined with high precision by polynomial fitting, and the landing point position information of the flight device is further determined.
The method for determining the flight trajectory based on polynomial fitting with high accuracy provided in the present application will be described with reference to fig. 7. Fig. 7 is a flow chart of a method for determining a flight trajectory with high accuracy based on polynomial fitting according to another embodiment of the invention.
As shown in fig. 7, the method for determining a flight trajectory based on polynomial fitting according to the embodiment of the present invention may further include the following steps:
step 201, acquiring N frames of video images shot by a flight device in the flight process, wherein each frame of video image corresponds to a time point, and N is a positive integer greater than 1.
Specifically, a camera can be configured in the flying device so as to shoot video images corresponding to different time points respectively in the flying process of the flying device. In an exemplary embodiment, the camera may be disposed in front of the flying device, and the present application does not limit the location of the camera in the flying device.
In an exemplary embodiment, the camera may capture video during the flight of the flying device and transmit it to the flight trajectory determining device, and the flight trajectory determining device may then perform de-framing processing on the video captured by the flying device during flight to obtain the N frames of video images.
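A minimal sketch of the de-framing step, assuming the onboard video is available as a file and using OpenCV as the (illustrative, not patent-prescribed) video library; each extracted frame is paired with its time point derived from the frame rate.

```python
import cv2

def deframe(video_path):
    """Split a video into its frames, pairing each frame with a time point."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS)
    frames, time_points = [], []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:  # end of stream
            break
        frames.append(frame)
        time_points.append(index / fps)  # seconds from the first frame
        index += 1
    capture.release()
    return frames, time_points
```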
Step 202, determining the spatial position information of the shooting center corresponding to each frame of video image in the N frames of video images, wherein the spatial position information of the shooting center comprises coordinate values respectively corresponding to the shooting center in three directions of a spatial rectangular coordinate system.
In the embodiment of the application, a direct linear transformation model can be established based on the characteristics of central projection of the area array video images, so that the spatial position information of the shooting center corresponding to each frame of video image in N frames of video images is determined by using the direct linear transformation model.
Wherein, the photographing-center spatial position information is used to represent the spatial position of the photographing light beam at the moment of photography, and may include the three-dimensional coordinate values (X_s, Y_s, Z_s).
Specifically, for each frame of video image in the N frames of video images, the feature points of the frame can be extracted, and the image space coordinates of the feature points in the frame are then determined according to the positions of the feature points in the corresponding video image. For a frame of video image, after the image space coordinates of each feature point in the image plane coordinate system are obtained, the feature points can be used as control points, and the object space coordinates of the control points in the object space coordinate system are determined according to the image space coordinates of the feature points in the image plane coordinate system. The image space coordinates of a plurality of feature points in the image plane coordinate system and their object space coordinates in the object space coordinate system are substituted into the equations of the direct linear transformation model shown in formulas (20) and (21), so that l_1, l_2, ..., l_11 can be calculated. Then, according to l_1, l_2, ..., l_11 and formulas (22) and (23), the exterior orientation elements and interior orientation elements can be calculated, and (X_S, Y_S, Z_S) among the exterior orientation elements is taken as the photographing-center spatial position information.
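Since formulas (20)-(23) are not reproduced in this passage, the sketch below assumes the common textbook form of the direct linear transformation: it solves the eleven coefficients l_1 ... l_11 by linear least squares from at least six non-coplanar control points and then recovers the photographing center (X_S, Y_S, Z_S) from them. Names and conventions are illustrative.

```python
import numpy as np

def solve_dlt(image_xy, object_xyz):
    """image_xy: (n, 2) image-plane coordinates; object_xyz: (n, 3); n >= 6."""
    rows, rhs = [], []
    for (x, y), (X, Y, Z) in zip(image_xy, object_xyz):
        # x = (l1*X + l2*Y + l3*Z + l4) / (l9*X + l10*Y + l11*Z + 1), and
        # likewise for y, linearized by multiplying through by the denominator.
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z])
        rhs.append(x)
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z])
        rhs.append(y)
    l, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return l  # l1 ... l11

def photographing_center(l):
    # With P = [[l1..l4], [l5..l8], [l9, l10, l11, 1]], the projection
    # center C satisfies A @ C = -b, where A is the left 3x3 block of P
    # and b is its last column.
    A = np.array([l[0:3], l[4:7], l[8:11]])
    b = np.array([l[3], l[7], 1.0])
    return -np.linalg.solve(A, b)  # (Xs, Ys, Zs)
```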
Step 203, performing curve fitting by using a polynomial fitting function according to the time points respectively corresponding to the N frames of video images and the respectively corresponding photographing-center spatial position information, and determining the flight trajectory curve of the flight device.
Specifically, after the photographing-center spatial position information corresponding to each of the N frames of video images is determined, that is, after the N pieces of photographing-center spatial position information are determined, curve fitting can be performed with them to determine the flight trajectory curve of the flight device.
In a specific implementation, the time points corresponding to the N frames of video images and the corresponding photographing-center spatial position information can be used to perform curve fitting with a polynomial function, so as to form a flight trajectory curve function that takes the time parameter t as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable.
It can be understood that the N frames of video images taken during the flight of the flight device each correspond to a time point, and the photographing-center spatial position information corresponding to each of the N frames of video images includes the three-dimensional coordinate values (X_s, Y_s, Z_s), i.e. the coordinate values corresponding to the three directions, where X_s, Y_s and Z_s respectively represent the coordinate values of the flight device in the three directions. In this embodiment of the application, when curve fitting is performed by the polynomial fitting method, the polynomial fitting function may include three polynomials, each of which takes the flight time parameter t of the flight device as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable.
In particular, the following multiple ways may be used to determine the flight trajectory curve of the flying device by performing curve fitting using a polynomial function.
For example, according to the corresponding time points of the N frames of video images and the corresponding spatial position information of the photographing center, each coefficient of a polynomial is solved by a general polynomial fitting method, so that a functional formula of a flight track curve of the flight device is determined.
Taking a cubic polynomial as an example, a fitting function of a general polynomial fitting may be in the form shown in formulas (26) - (28).
Alternatively, in an exemplary embodiment, according to the spatial position information of the shooting centers corresponding to the N frames of video images, each coefficient of the polynomial may be solved by using the Chebyshev polynomial fitting method, so as to determine the functional formula of the flight trajectory curve of the flight device.
Taking a sixth-order polynomial as an example, the fitting function of the Chebyshev polynomial fitting may take the form shown in formulas (29)-(31).
It should be noted that, for details not disclosed in the method for determining a flight trajectory with high accuracy based on polynomial fitting according to this embodiment of the present invention, reference is made to the details disclosed in the foregoing embodiments of the present invention, which are not described in detail here.
According to the method for determining a flight trajectory with high precision based on polynomial fitting provided in this embodiment of the application, N frames of video images taken by the flight device during flight and corresponding to N time points are first acquired, and the photographing-center spatial position information corresponding to each frame is determined, where the photographing-center spatial position information includes the coordinate values of the photographing center in the three directions of the space rectangular coordinate system. Curve fitting is then performed with a polynomial fitting function according to the time points and the photographing-center spatial position information respectively corresponding to the frames, so as to determine the flight trajectory curve of the flight device, where the polynomial fitting function includes three polynomials, each of which takes the flight time parameter of the flight device as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable. In this way, the flight trajectory of the flight device is determined with high precision by polynomial fitting based on the video images taken during flight; and since only a camera needs to be added, which is cheap and light, the cost of determining the flight trajectory is reduced and the extra weight added to the flight device is kept small.
Fig. 8 is a schematic structural diagram of an apparatus for determining a flight trajectory with high accuracy based on polynomial fitting according to an embodiment of the present invention.
As shown in fig. 8, the device 100 for determining a flight trajectory with high accuracy based on polynomial fitting according to an embodiment of the present invention includes a first acquisition module 11, a first determination module 12, and a second determination module 13.
The first acquiring module 11 is configured to acquire N frames of video images captured by the flying device during a flight process, where each frame of video image corresponds to a time point, and N is a positive integer greater than 1;
a first determining module 12, configured to determine photographic center spatial position information corresponding to each frame of video image in the N frames of video images, where the photographic center spatial position information includes coordinate values corresponding to the photographic center in three directions of a space rectangular coordinate system;
The second determining module 13 is configured to perform curve fitting by using a polynomial fitting function according to the time points corresponding to the N frames of video images and the corresponding photographing-center spatial position information, and determine the flight trajectory curve of the flight device, where the polynomial fitting function includes three polynomials, each of which takes the flight time parameter of the flight device as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable.
Specifically, the device for determining a flight trajectory with high precision based on polynomial fitting provided in the present application, referred to as the flight trajectory determining device for short, can execute the method for determining a flight trajectory with high precision based on polynomial fitting provided in the present application. The flight trajectory determining device can be configured in an electronic apparatus to determine the flight trajectory of the flight device with high precision at low cost and with little additional weight. The electronic apparatus may be any hardware device capable of data processing, such as a mobile phone or a computer. It will be appreciated that the flight trajectory determining device may be configured in the controller of the flight device or in the ground command center of the flight device, and the application is not limited in this respect.
In one embodiment of the present invention, the second determining module 13 is specifically configured to:
and performing curve fitting by using a general polynomial fitting method or a chebyshev polynomial fitting method according to the corresponding time points of the N frames of video images and the corresponding photographic center space position information.
In an embodiment of the present invention, the flight trajectory determining device may further include:
The second acquisition module is used for acquiring the landing time of the flying device;
and the third determining module is used for determining the landing point position information of the flying device according to the landing time and the flight trajectory curve.
In one embodiment of the present invention, the first obtaining module 11 is specifically configured to:
acquiring a video image shot by a flight device in the flight process;
and carrying out de-framing processing on the video images to obtain the N frames of video images.
It should be noted that, for details not disclosed in the device for determining a flight trajectory with high accuracy based on polynomial fitting according to this embodiment of the present invention, reference is made to the details disclosed in the method for determining a flight trajectory with high accuracy based on polynomial fitting according to the foregoing embodiments of the present invention, which are not described again here.
According to the device for determining a flight trajectory with high precision based on polynomial fitting provided in this embodiment of the application, N frames of video images taken by the flight device during flight and corresponding to N time points are first acquired, and the photographing-center spatial position information corresponding to each frame is determined, where the photographing-center spatial position information includes the coordinate values of the photographing center in the three directions of the space rectangular coordinate system. Curve fitting is then performed with a polynomial fitting function according to the time points and the photographing-center spatial position information respectively corresponding to the frames, so as to determine the flight trajectory curve of the flight device, where the polynomial fitting function includes three polynomials, each of which takes the flight time parameter of the flight device as the independent variable and the coordinate value of the flight device in one direction of the space rectangular coordinate system as the dependent variable. In this way, the flight trajectory of the flight device is determined with high precision by polynomial fitting based on the video images taken during flight; and since only a camera needs to be added, which is cheap and light, the cost of determining the flight trajectory is reduced and the extra weight added to the flight device is kept small.
In order to implement the above embodiment, the present invention further proposes an electronic device 200, as shown in fig. 9, where the electronic device 200 includes a memory 21 and a processor 22. Wherein the processor 22 runs a program corresponding to the executable program code by reading the executable program code stored in the memory 21 for implementing the above-described method of determining a flight trajectory with high accuracy based on polynomial fitting.
According to the electronic device provided by the embodiment of the invention, the processor executes the computer program stored in the memory, so that the flight trajectory of the flight device can be determined with high precision by polynomial fitting based on the video images taken by the flight device during flight. Since only a camera needs to be added, which is cheap and light, the cost required to determine the flight trajectory of the flight device is reduced and the extra weight added to the flight device is kept small.
In order to achieve the above-mentioned embodiments, the present invention also proposes a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above-mentioned method of determining a flight trajectory with high accuracy based on polynomial fitting.
By storing a computer program that is executed by a processor, the computer-readable storage medium of the embodiment of the invention makes it possible to determine the flight trajectory of the flying device with high precision by polynomial fitting based on the video images taken by the flying device during flight. Since only a camera needs to be added, which is cheap and light, the cost required to determine the flight trajectory of the flying device is reduced and the extra weight added to the flying device is kept small.
In the description of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial", "circumferential", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the device or element being referred to must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the present invention.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the present invention, unless expressly stated or limited otherwise, a first feature "up" or "down" a second feature may be the first and second features in direct contact, or the first and second features in indirect contact via an intervening medium. Moreover, a first feature being "above," "over" and "on" a second feature may be a first feature being directly above or obliquely above the second feature, or simply indicating that the first feature is level higher than the second feature. The first feature being "under", "below" and "beneath" the second feature may be the first feature being directly under or obliquely below the second feature, or simply indicating that the first feature is less level than the second feature.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.

Claims (8)

1. A method for determining a flight trajectory with high accuracy based on polynomial fitting, comprising:
Acquiring N frames of video images shot by a flight device in the flight process, wherein each frame of video image corresponds to a time point, and N is a positive integer greater than 1;
determining the space position information of a shooting center corresponding to each frame of video image in the N frames of video images, wherein the space position information of the shooting center comprises coordinate values (XS, YS, ZS) respectively corresponding to the shooting center in three directions of a space rectangular coordinate system;
and performing curve fitting by using a global optimization method according to the time points respectively corresponding to the N frames of video images and the respectively corresponding spatial position information of the shooting center, and determining a flight trajectory curve of the flight device, which comprises the following steps:
performing automatic best-fit function matching through the Marquardt method and a general global optimization method to obtain the form of the best fitting function, performing curve fitting by using the best fitting function, and solving the coefficients of the fitting function to determine the flight trajectory curve of the flight device;
the fitting function obtained by performing best-fit function matching through the Marquardt method and a general global optimization method has the following form:
x3'' = p_{x1} + p_{x2} t^2 + p_{x3} t^{0.5} + p_{x4} e^{-t}

y3'' = p_{y1} + p_{y2} t + p_{y3} t^2 + p_{y4} t^{0.5} + p_{y5} e^{t}

z3'' = p_{z1} + p_{z2} t + p_{z3} t^{1.5} + p_{z4} t^2 + p_{z5} t^{2.5}
wherein p_{x1}, p_{x2}, p_{x3}, ..., p_{z3}, p_{z4}, p_{z5} are respectively the coefficients of the polynomials, t is the time parameter of the flight of the flight device, and x3'', y3'' and z3'' are respectively the coordinate values of the flight device corresponding to the three directions of the space rectangular coordinate system;
The determining of the spatial position information of the shooting center comprises: extracting feature points in each frame of video image; for the feature points in the frame of video image and in the video images corresponding to adjacent time points, obtaining the image space coordinates of the feature points in an image plane coordinate system; taking the feature points as control points, and determining the object space coordinates of the control points in an object space coordinate system according to the image space coordinates of the feature points in the image plane coordinate system; solving the equation coefficients of a direct linear transformation model; solving the 6 exterior orientation elements corresponding to the frame of video image through the equation coefficients; and taking (XS, YS, ZS) among the 6 exterior orientation elements as the spatial position information of the shooting center.
2. The method of claim 1, wherein after determining the flight trajectory curve of the flying device, the method further comprises:
acquiring the landing time of the flying device;
and determining the landing point position information of the flying device according to the landing time and the flight trajectory curve.
3. The method according to any one of claims 1-2, wherein the acquiring N frames of video images taken by the flying device during the flight comprises:
acquiring a video image shot by the flying device in the flying process;
and carrying out de-framing processing on the video images to obtain the N frames of video images.
4. A device for determining a flight trajectory with high accuracy based on polynomial fitting, comprising:
the first acquisition module is used for acquiring N frames of video images shot by the flight device in the flight process, wherein each frame of video image corresponds to a time point, and N is a positive integer greater than 1;
a first determining module, configured to determine photographic center spatial position information corresponding to each frame of video image in the N frames of video images, where the photographic center spatial position information includes coordinate values (XS, YS, ZS) corresponding to a photographic center in three directions of a spatial rectangular coordinate system;
the second determining module is configured to perform curve fitting by using a global optimization method according to the time points corresponding to the N frames of video images and the corresponding spatial position information of the photographing center, and determine a flight trajectory curve of the flight device, where the method specifically includes:
performing automatic best-fit function matching through the Marquardt method and a general global optimization method to obtain the form of the best fitting function, performing curve fitting by using the best fitting function, and solving the coefficients of the fitting function to determine the flight trajectory curve of the flight device;
the fitting function obtained by performing best-fit function matching through the Marquardt method and a general global optimization method has the following form:
x3'' = p_{x1} + p_{x2} t^2 + p_{x3} t^{0.5} + p_{x4} e^{-t}

y3'' = p_{y1} + p_{y2} t + p_{y3} t^2 + p_{y4} t^{0.5} + p_{y5} e^{t}

z3'' = p_{z1} + p_{z2} t + p_{z3} t^{1.5} + p_{z4} t^2 + p_{z5} t^{2.5}
wherein p_{x1}, p_{x2}, p_{x3}, ..., p_{z3}, p_{z4}, p_{z5} are respectively the coefficients of the polynomials, t is the time parameter of the flight of the flight device, and x3'', y3'' and z3'' are respectively the coordinate values of the flight device corresponding to the three directions of the space rectangular coordinate system;
The determining of the photographic center spatial position information comprises: extracting feature points in each frame of video image; for the feature points in the frame of video image and in the video images corresponding to adjacent time points, obtaining the image space coordinates of the feature points in an image plane coordinate system; taking the feature points as control points, and determining the object space coordinates of the control points in an object space coordinate system according to the image space coordinates of the feature points in the image plane coordinate system; solving the equation coefficients of a direct linear transformation model; and solving the 6 exterior orientation elements corresponding to the frame of video image through the equation coefficients, with (XS, YS, ZS) among the 6 exterior orientation elements taken as the photographic center spatial position information, wherein more than six control points are laid out when the photographic center spatial position information is solved, and the control points cannot all be arranged in one plane.
5. The apparatus as recited in claim 4, further comprising:
the second acquisition module is used for acquiring the landing time of the flying device;
and the third determining module is used for determining the landing point position information of the flying device according to the landing time and the flight trajectory curve.
6. The apparatus according to any one of claims 4-5, wherein the first acquisition module is specifically configured to:
acquiring a video image shot by the flying device in the flying process;
and carrying out de-framing processing on the video images to obtain the N frames of video images.
7. An electronic device, comprising a memory and a processor;
wherein the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory for implementing the method of determining a flight trajectory based on polynomial fitting with high accuracy as claimed in any one of claims 1-3.
8. A computer readable storage medium storing a computer program, which when executed by a processor implements a method of determining a flight trajectory with high accuracy based on polynomial fitting according to any one of claims 1-3.
CN202010646894.7A 2020-07-07 2020-07-07 Method and device for determining flight trajectory with high precision based on polynomial fitting and electronic equipment Active CN111951295B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010646894.7A CN111951295B (en) 2020-07-07 2020-07-07 Method and device for determining flight trajectory with high precision based on polynomial fitting and electronic equipment

Publications (2)

Publication Number Publication Date
CN111951295A CN111951295A (en) 2020-11-17
CN111951295B true CN111951295B (en) 2024-02-27

Family

ID=73340275

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010646894.7A Active CN111951295B (en) 2020-07-07 2020-07-07 Method and device for determining flight trajectory with high precision based on polynomial fitting and electronic equipment

Country Status (1)

Country Link
CN (1) CN111951295B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113128342B (en) * 2021-03-19 2023-04-07 中国人民解放军战略支援部队信息工程大学 Flight path data preprocessing method and aerial target identification method
CN113885574B (en) * 2021-10-28 2023-07-25 中国人民解放军96901部队24分队 Multi-unmanned aerial vehicle collaborative formation control system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102306284A (en) * 2011-08-12 2012-01-04 上海交通大学 Digital reconstruction method of traffic accident scene based on monitoring videos
CN106846926A (en) * 2017-04-13 2017-06-13 电子科技大学 A kind of no-fly zone unmanned plane method for early warning
CN107818685A (en) * 2017-10-25 2018-03-20 司法部司法鉴定科学技术研究所 A kind of method that state of motion of vehicle is obtained based on Vehicular video
CN110632941A (en) * 2019-09-25 2019-12-31 北京理工大学 Trajectory generation method for target tracking of unmanned aerial vehicle in complex environment
CN111199075A (en) * 2019-12-30 2020-05-26 四川函钛科技有限公司 Flight track self-adaptive smoothing method based on time sequence QAR parameter
CN111241630A (en) * 2020-01-10 2020-06-05 中国人民解放军国防科技大学 Trajectory design method for RCS characteristics of coupled aircraft

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105346706B (en) * 2015-11-13 2018-09-04 深圳市道通智能航空技术有限公司 Flight instruments, flight control system and method

Also Published As

Publication number Publication date
CN111951295A (en) 2020-11-17

Similar Documents

Publication Publication Date Title
US9454796B2 (en) Aligning ground based images and aerial imagery
CN110717942B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN111414798A (en) Head posture detection method and system based on RGB-D image
CN110458877B (en) Navigation method based on bionic vision for fusing infrared and visible light information
CN111951295B (en) Method and device for determining flight trajectory with high precision based on polynomial fitting and electronic equipment
EP2686827A1 (en) 3d streets
CN107274441B (en) Wave band calibration method and system for hyperspectral image
CN111524194B (en) Positioning method and terminal for mutually fusing laser radar and binocular vision
CN111951178A (en) Image processing method and device for remarkably improving image quality and electronic equipment
CN112200848A (en) Depth camera vision enhancement method and system under low-illumination weak-contrast complex environment
Kurmi et al. Pose error reduction for focus enhancement in thermal synthetic aperture visualization
CN114998773A (en) Characteristic mismatching elimination method and system suitable for aerial image of unmanned aerial vehicle system
CN112927251A (en) Morphology-based scene dense depth map acquisition method, system and device
CN111930139B (en) Method and device for determining flight trajectory with high precision based on global optimization method and electronic equipment
CN111951331B (en) Flight device accurate positioning method and device based on video image and electronic equipment
CN117058183A (en) Image processing method and device based on double cameras, electronic equipment and storage medium
CN110602377B (en) Video image stabilizing method and device
EP2879090B1 (en) Aligning ground based images and aerial imagery
CN114565653B (en) Heterologous remote sensing image matching method with rotation change and scale difference
CN113589263B (en) Method and system for jointly calibrating multiple homologous sensors
CN112950723B (en) Robot camera calibration method based on edge scale self-adaptive defocus fuzzy estimation
WO2018084069A1 (en) Image compositing system, image compositing method, and image compositing program recording medium
CN112766338B (en) Method, system and computer readable storage medium for calculating distance image
CN111951327A (en) Accurate estimation method and device for landing point position of flight device and electronic equipment
CN108596950B (en) Rigid body target tracking method based on active drift correction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant