CN117826879A - Method and device for monitoring speed of aircraft, storage medium and aircraft - Google Patents


Info

Publication number
CN117826879A
Authority
CN
China
Prior art keywords
current
speed
aircraft
video image
coordinate value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211199668.4A
Other languages
Chinese (zh)
Inventor
赖东东
谭明朗
谢亮
付伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Insta360 Innovation Technology Co Ltd
Original Assignee
Insta360 Innovation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Insta360 Innovation Technology Co Ltd filed Critical Insta360 Innovation Technology Co Ltd
Priority to CN202211199668.4A priority Critical patent/CN117826879A/en
Priority to PCT/CN2023/121150 priority patent/WO2024067498A1/en
Publication of CN117826879A publication Critical patent/CN117826879A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D13/00Control of linear speed; Control of angular speed; Control of acceleration or deceleration, e.g. of a prime mover
    • G05D13/62Control of linear speed; Control of angular speed; Control of acceleration or deceleration, e.g. of a prime mover characterised by the use of electric means, e.g. use of a tachometric dynamo, use of a transducer converting an electric value into a displacement

Abstract

The application discloses a speed monitoring method and device of an aircraft, a storage medium and the aircraft. The method includes: acquiring a current video image and an adjacent video image thereof shot while the aircraft flies, together with current attitude data and the ground relative height of the aircraft at a first shooting moment of the current video image; acquiring a rotation angle between the current video image and the adjacent video image; and determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video image. The horizontal speed and the vertical speed of the aircraft during flight can thus be monitored more accurately and in real time.

Description

Method and device for monitoring speed of aircraft, storage medium and aircraft
Technical Field
The application relates to the technical field of aircrafts, in particular to a speed monitoring method and device of an aircraft, a storage medium and the aircraft.
Background
At present, aircraft are widely applied to cruising, monitoring, rescue, aerial photography and the like, providing great convenience in people's lives.
The flying speed of an aircraft is controlled by a flight controller, which senses information such as the flying height and flying speed of the aircraft and adjusts the flight attitude according to a preset flight plan, so as to implement that plan. However, the flight speed perceived by the flight controller is relatively coarse.
Disclosure of Invention
The embodiment of the application provides a speed monitoring method and device for an aircraft, a storage medium and the aircraft, and the flying speed of the aircraft can be monitored more accurately.
In a first aspect, an embodiment of the present application provides a method for monitoring a speed of an aircraft, the method including:
acquiring a current video image and an adjacent video image thereof shot when the aircraft flies, and current attitude data and ground relative height of the aircraft at a first shooting moment of the current video image;
acquiring a rotation angle between a current video image and an adjacent video image;
and determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video image.
In a second aspect, embodiments of the present application further provide a speed monitoring device of an aircraft, including:
the data acquisition module is used for acquiring a current video image shot by the aircraft during flight and an adjacent video image thereof, and current attitude data and the ground relative height of the aircraft at a first shooting moment of the current video image;
the rotation angle measuring and calculating module is used for acquiring the rotation angle between the current video image and the adjacent video image;
the speed monitoring module is used for determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video image.
In a third aspect, embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when run on a computer, causes the computer to perform a method of speed monitoring of an aircraft as provided in any of the embodiments of the present application.
In a fourth aspect, embodiments of the present application further provide an aircraft, including a body, a processor, an optical flow sensor, and a distance sensor, the optical flow sensor and the distance sensor being disposed at a bottom of the body, the processor being configured to perform a method of speed monitoring of the aircraft as provided in any of the embodiments of the present application.
According to the technical scheme, when the aircraft flies, the current attitude data of the aircraft at the first shooting moment, the ground relative height and the current video image shot at the first shooting moment are obtained. A rotation angle between the current video image and the adjacent video image is acquired, and this rotation angle can represent the relative rotation between the two images. Then, the current attitude data, the ground relative height and the rotation angle, which influence the flight attitude of the aircraft, are taken as eliminating factors to reduce their influence on speed monitoring, and on this basis the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment are determined according to the current video image and the adjacent video image, so that more accurate values of both speeds can be obtained. In turn, the flight attitude of the aircraft can conveniently be controlled according to the current horizontal speed and the current vertical speed monitored in real time, so as to better implement the flight plan.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is an application scenario schematic diagram of a speed monitoring method of an aircraft according to an embodiment of the present application.
Fig. 2 is a flow chart of a method for monitoring the speed of an aircraft according to an embodiment of the present application.
Fig. 3 is a schematic diagram of determining a current vertical speed based on the pinhole imaging principle of a camera according to an embodiment of the present application.
Fig. 4 is a schematic diagram of comparison of deviation of distance values caused by change of flight attitude according to an embodiment of the present application.
Fig. 5 is a schematic diagram of determining a ground relative projection coordinate according to an embodiment of the present application.
Fig. 6 is a schematic diagram of determining a projection area according to an embodiment of the present application.
Fig. 7 is a schematic diagram of processing a first feature point and a second feature point based on the near-large, far-small imaging principle of a camera according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of an aircraft speed monitoring device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by a person skilled in the art without any inventive effort, are intended to be within the scope of the present application based on the embodiments herein.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In order to better understand the speed monitoring method of the aircraft provided by the embodiment of the application, an application scenario is provided for explanation. Referring to fig. 1, fig. 1 is a schematic application scenario diagram of a speed monitoring method for an aircraft according to an embodiment of the present application. In fig. 1, an unmanned aerial vehicle is taken as an example of the aircraft. The unmanned aerial vehicle is provided with a distance sensor, which is used for detecting the distance value from the unmanned aerial vehicle to the ground, that is, the flying height of the unmanned aerial vehicle. It is also provided with an optical flow sensor, which continuously shoots video images of the area below the unmanned aerial vehicle within its shooting range. The unmanned aerial vehicle is further provided with an inertial measurement unit, which is used for detecting attitude data of the unmanned aerial vehicle in flight. In the embodiment of the application, the horizontal speed and the vertical speed of the unmanned aerial vehicle in flight are calculated according to the flying height detected by the distance sensor, the video images shot by the optical flow sensor and the attitude data detected by the inertial measurement unit.
It may be appreciated that the method for monitoring the speed of the aircraft provided in the embodiments of the present application may be performed by the speed monitoring device of the aircraft provided in the embodiments of the present application, or by an aircraft integrated with that speed monitoring device. The speed monitoring device of the aircraft can be realized in hardware or software, and the aircraft includes, but is not limited to, an unmanned aerial vehicle, a balloon, an airplane, a glider, a helicopter and the like.
Referring to fig. 2, fig. 2 is a flow chart of a speed monitoring method of an aircraft according to an embodiment of the present application. The specific flow of the method for monitoring the speed of the aircraft provided by the embodiment of the application can be as follows:
101. the method comprises the steps of acquiring a current video image and adjacent video images thereof shot when the aircraft flies, and current attitude data and ground relative height of the aircraft at a first shooting moment of the current video image.
When the aircraft flies, shooting is continuously carried out within the shooting range according to the preset frequency, so that continuous video images are obtained. The preset frequency may be a shooting frequency set by default for the aircraft system, or may be a shooting frequency set by user definition, or may also be a shooting frequency predicted according to user operation habits.
The preset frequency may also be adjusted, for example, according to the flight conditions of the aircraft, with the flying speed serving as the basis for adjustment. For example, there may be a plurality of preset frequencies: when the aircraft flies at a higher speed, a higher preset frequency is set; when it flies at a lower speed, a lower preset frequency is set. In this way, successive video images shot by the aircraft remain reasonably correlated, and the flying speed of the aircraft can be calculated from them. Naturally, the flying height may also be used as the basis for adjustment, with the flying height inversely proportional to the preset frequency. It can be appreciated that there are various ways of adjusting the preset frequency according to the flight condition of the aircraft; the point is that in this way more effective video images can be obtained for calculating the flight speed of the aircraft, improving the efficiency of speed monitoring.
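The speed-to-frequency adjustment described above can be sketched as a simple tier lookup. This is a minimal illustration, not the patent's implementation; the function name and the specific speed thresholds and frequencies are hypothetical, chosen only to show the idea that faster flight maps to a higher capture frequency.

```python
def select_capture_frequency(speed_mps,
                             tiers=((2.0, 10.0), (6.0, 20.0), (float("inf"), 30.0))):
    """Pick a capture frequency (Hz) for the optical flow sensor from the
    current flight speed (m/s): faster flight -> higher frequency, so that
    the displacement between adjacent frames stays small enough to match.
    `tiers` is a sequence of (max_speed, frequency) pairs; the values here
    are illustrative placeholders."""
    for max_speed, freq_hz in tiers:
        if speed_mps <= max_speed:
            return freq_hz
    return tiers[-1][1]
```

An analogous lookup with height tiers (height inversely proportional to frequency) would cover the altitude-based adjustment the text also mentions.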
The current video image shot at the first shooting moment refers to a frame of video image shot in real time at the current moment. The adjacent video image of the current video image refers to a certain frame in the historical video images shot before the first shooting moment. It will be appreciated that the adjacent video image is the n-th frame before the current video image; in this case, every time the aircraft shoots n frames of video images, the flight speed of the aircraft is calculated once, where n is a positive integer. The value of n can also be adjusted according to the flight condition of the aircraft, for example according to how quickly the flight speed or flight attitude changes. When adjusted according to the flight speed, a smaller n can be selected when the flight speed changes faster, and a larger n when it changes more slowly.
The attitude data refers to data related to the flight attitude of the aircraft, such as acceleration and angular velocity. Acceleration may be detected by devices such as accelerometers, and angular velocity by devices such as gyroscopes. The acceleration and angular velocity of the aircraft may also be detected by an inertial measurement unit (IMU) that integrates both detection functions. After the acceleration and the angular velocity are obtained, the real-time attitude of the aircraft can be calculated through algorithms such as extended Kalman filtering and complementary filtering. It will be appreciated that the attitude data may include raw data such as acceleration and angular velocity, as well as the resolved real-time attitude.
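As a concrete illustration of the complementary filtering mentioned above, the sketch below shows one update step fusing a gyro rate with an accelerometer-derived angle for a single axis (pitch). This is a generic textbook form under small-motion assumptions, not the patent's algorithm; the function name and the blend factor `alpha` are hypothetical.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_z, accel_x, dt, alpha=0.98):
    """One update of a single-axis complementary filter for pitch (radians).
    The gyro rate (rad/s) is integrated for short-term accuracy, while the
    angle implied by the gravity vector in the accelerometer reading
    corrects long-term drift. `alpha` weights gyro vs accelerometer."""
    pitch_gyro = pitch_prev + gyro_rate * dt       # short-term: integrate angular velocity
    pitch_accel = math.atan2(accel_x, accel_z)     # long-term: direction of gravity
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
```

A full attitude solution (or an extended Kalman filter, as the text also mentions) would run such a fusion over all three axes.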
In this embodiment, there are various ways of detecting the flying height of the aircraft, for example, detecting the flying height of the aircraft by a distance sensor provided on the aircraft, or detecting the flying height of the aircraft by an ultrasonic ranging device provided on the aircraft, or the like. It will be appreciated that any manner of detecting the altitude of an aircraft may be used in embodiments of the present application. Taking the manner in which the flying height is detected based on a distance sensor as an example, the distance sensor detects a distance value between the aircraft and the ground to treat the distance value as the flying height of the aircraft. The flying height detected in real time can be used as the ground relative height of the aircraft at the first shooting moment.
102. And acquiring the rotation angle between the current video image and the adjacent video image.
In the present embodiment, the shooting time at which the current video image is shot is referred to as a first shooting time, and the shooting time at which the adjacent video image is shot is referred to as a second shooting time. The angle of flight of the aircraft at the first shooting moment is referred to as the current angle of flight, and the angle of flight of the aircraft at the second shooting moment is referred to as the adjacent angle of flight, wherein the angle of flight is relative to the horizontal direction (i.e. the direction parallel to the ground). The angle difference between the current flight angle and the adjacent flight angle is calculated to be used as the rotation angle between the current video image and the adjacent video image.
For example, the angle of flight may be determined by the angular velocity in the attitude data. Specifically, a first angular velocity of the aircraft at a first shooting moment and a second angular velocity of the aircraft at a second shooting moment are obtained, and then the first angular velocity and the second angular velocity are integrated to obtain an angle difference.
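The integration of angular velocity between the two shooting moments can be sketched as follows. This is a minimal trapezoidal-rule illustration under the assumption of timestamped gyro samples for a single rotation axis; the function name and sample format are hypothetical, not from the patent.

```python
def rotation_angle(gyro_samples, t_prev, t_curr):
    """Integrate timestamped angular-velocity samples (t, omega_rad_s)
    between the second shooting moment t_prev and the first shooting
    moment t_curr, giving the rotation angle (rad) of the aircraft
    between the adjacent video image and the current video image."""
    samples = [(t, w) for t, w in gyro_samples if t_prev <= t <= t_curr]
    angle = 0.0
    for (t0, w0), (t1, w1) in zip(samples, samples[1:]):
        angle += 0.5 * (w0 + w1) * (t1 - t0)  # trapezoidal rule
    return angle
```

For example, a constant 0.5 rad/s over a 0.1 s inter-frame interval integrates to a 0.05 rad rotation between the two images.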
103. And determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video image.
The optical flow vector between the current video image and the adjacent video image is calculated, and the optical flow vector is scale-restored in the horizontal direction (from the scale of the video image to the scale of the aircraft's motion), so that the scale-restored optical flow vector serves as the horizontal speed of the aircraft at the first shooting moment.
In this embodiment, the horizontal speed of the aircraft at the first shooting moment is compensated by introducing data such as the rotation angle, the current attitude data, and the ground relative height, so as to obtain the current horizontal speed of the aircraft at the first shooting moment. The influence of the rotation of the aircraft on the calculation of the optical flow vector is thereby taken into account, so that the calculated current horizontal speed is more accurate.
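The scale restoration and rotation compensation just described can be sketched under the pinhole model for a single axis. Assumptions (not from the patent): a known focal length in pixels, a small rotation angle so the rotation-induced flow is approximately focal length times angle, and the hypothetical function name below.

```python
def horizontal_speed(flow_px, rot_angle_rad, height_m, focal_px, dt):
    """Scale-restore a one-axis optical flow displacement (pixels between
    frames) to a horizontal speed (m/s) under the pinhole model, after
    removing the flow component induced by the aircraft's own rotation
    (small-angle approximation: rotation flow ~ focal_px * angle)."""
    flow_compensated = flow_px - focal_px * rot_angle_rad
    # pixel displacement maps to ground displacement via height / focal length
    return flow_compensated * height_m / (focal_px * dt)
```

For instance, a 10 px inter-frame flow with no rotation, at 5 m height, 500 px focal length and a 0.02 s frame interval, scale-restores to 5 m/s; a nonzero rotation angle first reduces the flow before the same scaling.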
The flight speed of the aircraft comprises the current horizontal speed in the horizontal direction and the current vertical speed in the vertical direction (i.e. the direction of gravity). The current vertical speed of the aircraft is related to how quickly its flying height changes: it is greater when the height changes faster and smaller when the height changes more slowly. Based on this, when the current video image and the adjacent video image are processed, the current vertical speed of the aircraft at the first shooting moment is calculated based on the pinhole imaging principle of the camera.
Further, referring to fig. 3, fig. 3 is a schematic diagram of determining the current vertical speed based on the pinhole imaging principle of the camera according to an embodiment of the present application. Fig. 3 (a) shows the adjacent video image, in which two elements are represented by a triangle and a pentagon respectively, with the distance value between them denoted d1. When the adjacent video image was captured, the distance from the ground detected by the distance sensor is denoted D1. Fig. 3 (b) shows the current video image; after capturing the adjacent video image, the flying height of the aircraft has risen, and the distance from the ground detected by the distance sensor is denoted D2, where D2 > D1. The farther the aircraft is from the ground, the more elements it can capture; correspondingly, for the same element, as the flying height increases, the element appears smaller in the aircraft's shooting view. As shown in fig. 3 (b), the current video image still contains the triangle and the pentagon, which are smaller than the same elements shown in fig. 3 (a), and the distance between them has also decreased; this distance value is denoted d2, where d2 < d1.
Based on the d1, d2, D1, D2 shown in fig. 3, there is a proportional relation between these four values: d1/d2 = D2/D1. As described above, the speed of change of the flight altitude reflects the current vertical speed, and likewise the degree of change of the distance between elements across the current video image and the adjacent video image also reflects the current vertical speed. Therefore, the embodiment of the application compensates the current video image and the adjacent video image based on the rotation angle and the current attitude data, and then determines the current vertical speed in combination with the ground relative height, so that the real-time current vertical speed can be accurately calculated.
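The ratio d1/d2 = D2/D1 directly yields a vertical speed estimate: the new height D2 follows from the old height and the shrinkage of the image distance, and dividing the height change by the frame interval gives the speed. A minimal sketch, with a hypothetical function name and no rotation/attitude compensation:

```python
def vertical_speed(d1_px, d2_px, height_prev_m, dt):
    """Estimate vertical speed from the pinhole-imaging ratio
    d1/d2 = D2/D1: as the aircraft climbs, the image distance between two
    ground elements shrinks in the same proportion as the camera-to-ground
    distance grows."""
    height_curr_m = height_prev_m * (d1_px / d2_px)  # D2 = D1 * d1 / d2
    return (height_curr_m - height_prev_m) / dt      # positive when climbing
```

With the fig. 3 quantities, d1 = 100 px shrinking to d2 = 80 px at a previous height D1 = 4 m implies D2 = 5 m; over a 0.5 s interval that is a 2 m/s climb.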
In this embodiment, the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment are also combined to comprehensively analyze the flying speed condition of the aircraft.
As described above, after the inclination angle correction processing is performed on the flying height of the aircraft, the corrected flying height may be used as the ground relative height of the aircraft at the first shooting moment. That is, as an embodiment, acquiring the ground relative height of the aircraft at the first shooting moment of the current video image includes:
acquiring the flying height of the aircraft at the first shooting moment of the current video image;
and performing inclination angle correction processing on the flying height according to the current attitude data to obtain the ground relative height.
When an aircraft flies, a change in its flight attitude causes a deviation in the distance value detected by the distance sensor. Referring to fig. 4, fig. 4 is a schematic diagram comparing the deviation of distance values caused by a change of flight attitude according to an embodiment of the present application. The broken line L shows the central axis of the aircraft and the solid line M indicates the ground. Fig. 4 (a) is a schematic diagram of distance detection when the aircraft flies inclined with respect to the horizontal direction: although the distance sensor is meant to measure in the vertical direction, when the aircraft is inclined it measures along the inclined direction, and this distance value is denoted A. Fig. 4 (b) shows the actual distance of the aircraft from the ground in the vertical direction when the aircraft is tilted, denoted B. It can be seen that the distance value A is larger than the distance value B when the aircraft is tilted; for this case, the embodiment of the present application provides a solution that performs inclination angle correction processing on the flying height.
Among them, various methods for performing inclination correction processing on the flying height are available. For example, according to the first angular velocity mentioned above, an angular vector corresponding to the first angular velocity is determined, the flying height is projected according to the inclination angle of the angular vector relative to the vertical direction, and the height value obtained by projection is taken as the ground relative height. For another example, according to the above-mentioned real-time attitude, an inclination angle of the aircraft relative to the vertical direction is determined, and then the flying height is projected in the vertical direction according to the inclination angle, and the height value obtained by projection is taken as the ground relative height. It will be appreciated that the manner in which the inclination correction is performed on the flying height is illustrated by way of two examples only and is not limiting of the present application, and that further implementations exist and are not listed here.
The flying height and the ground relative height mentioned in this embodiment are explained with reference to fig. 4: the distance value A in fig. 4 corresponds to the flying height, and the distance value B to the ground relative height. After the inclination angle correction processing is performed on the flying height to obtain the ground relative height, the ground relative height accurately represents the real distance between the aircraft and the ground.
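The projection described above reduces, for a single tilt angle, to B = A · cos(tilt): the slanted measurement projected onto the vertical. A minimal sketch with a hypothetical function name; a full implementation would derive the tilt from the resolved real-time attitude:

```python
import math

def ground_relative_height(measured_distance, tilt_rad):
    """Project the distance measured along the tilted sensor axis (A in
    fig. 4) onto the vertical direction, giving the ground relative
    height (B in fig. 4): B = A * cos(tilt)."""
    return measured_distance * math.cos(tilt_rad)
```

For example, a 10 m reading taken at a 60-degree tilt corresponds to a true vertical height of 5 m, matching the observation that A exceeds B whenever the aircraft is tilted.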
The relative ground height is used as one of parameters for calculating the current horizontal speed, so that the influence of the flight attitude of the aircraft when optical flow vector calculation and scale reduction are carried out can be eliminated, and the more accurate current horizontal speed is obtained. Accordingly, the ground relative height is used as one of parameters for calculating the current vertical speed, and the change condition of the flying speed of the aircraft can be accurately reflected, so that the more accurate current vertical speed can be obtained.
In particular, the present application is not limited by the order of execution of the steps described, and certain steps may be performed in other orders or concurrently without conflict.
According to the speed monitoring method for the aircraft provided above, the current video image shot by the aircraft at the first shooting moment and its adjacent video image are obtained, together with the current attitude data and the ground relative height of the aircraft at the first shooting moment. The rotation angle between the current video image and its adjacent image is also acquired. The current video image and the adjacent video image are then analyzed according to the rotation angle, the current attitude data and the ground relative height, so as to obtain the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment. When the current horizontal speed is calculated, the optical flow vector between the current video image and the adjacent video image is compensated based on the rotation angle and the current attitude data, eliminating the error that the flight attitude introduces into the optical flow vector; on this basis, the optical flow vector is scale-restored with the ground relative height, which represents the real height of the aircraft, so that an accurate current horizontal speed is obtained. When the current vertical speed is calculated, the change of the flying height of the aircraft is analyzed based on the relation between the current video image and the adjacent video image, and the current vertical speed is calculated in combination with the ground relative height representing the real height of the aircraft, so that the accuracy of the current vertical speed is improved.
In some embodiments, determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment based on the rotation angle, the current attitude data, the ground relative height, the current video image, and the adjacent video image comprises:
extracting a first characteristic point in a current video image and a second characteristic point matched with the first characteristic point in an adjacent video image;
respectively performing coordinate correction processing on the first feature point and the second feature point according to the rotation angle and the current attitude data, to obtain a first coordinate value of the first feature point and a second coordinate value of the second feature point;
and determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the first coordinate value, the second coordinate value and the ground relative height.
Referring to fig. 5, fig. 5 is a schematic diagram of extracting a first feature point and a second feature point according to an embodiment of the present application. For the same element, it is represented by the first feature point in the current video image, as shown in fig. 5 (a), and by the second feature point in the adjacent video image, as shown in fig. 5 (b).
By identifying the same element in the current video image and the adjacent video image, the feature point of that element in the current video image is referred to as the first feature point, and the feature point of that element in the adjacent video image as the second feature point. It is understood that when there are multiple identical elements in the two images, there are correspondingly multiple first and second feature points. If the first feature point and the second feature point corresponding to the same element form a feature pair, there may be a plurality of feature pairs. In the coordinate correction processing of the first feature point and the second feature point, one or more of the plurality of feature pairs may be selected, and the number of selected feature pairs may be set according to actual needs, which is not limited herein.
The rotation of the aircraft between the first shooting moment and the second shooting moment can be determined from the rotation angle. Through the rotation angle and the current attitude data, coordinate alignment processing (i.e. coordinate correction processing) can be performed on the first feature point in the current video image and the second feature point in the adjacent video image, so that both feature points lie in the same coordinate system; the coordinate difference between them can then conveniently be analyzed to obtain the current horizontal speed and the current vertical speed. This avoids speed monitoring errors caused by inconsistent coordinate spaces and helps improve the accuracy of speed monitoring.
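The coordinate alignment step can be sketched as an in-plane rotation of the second feature point about the image's principal point by the measured rotation angle, bringing it into the current image's coordinate system. This is an illustrative simplification assuming the rotation is purely about the optical axis; the patent's full correction also uses the attitude data, and the function name is hypothetical.

```python
import math

def align_point(x, y, rot_angle_rad, cx, cy):
    """Rotate a feature point (x, y) from the adjacent video image about
    the principal point (cx, cy) by the rotation angle between the two
    frames, so it can be compared with the matching first feature point
    in the current video image's coordinate system."""
    c, s = math.cos(rot_angle_rad), math.sin(rot_angle_rad)
    dx, dy = x - cx, y - cy
    return (cx + c * dx - s * dy, cy + s * dx + c * dy)
```

After alignment, the remaining coordinate difference between a feature pair reflects translation rather than rotation, which is what the speed calculation needs.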
In an exemplary embodiment, when determining the current horizontal speed and the current vertical speed of the aircraft at the first capturing moment, the compensated optical flow vector can be obtained in the horizontal dimension and the feature point after coordinate alignment can be obtained in the vertical dimension by performing coordinate correction processing on the adjacent video image and the current video image. On the basis, the compensated optical flow vector is subjected to scale reduction based on the ground relative height after the inclination angle correction processing so as to obtain the accurate current horizontal speed. And obtaining more accurate current vertical speed based on the relative height of the ground after the inclination angle correction and the distance value between the feature points after the coordinate alignment.
In some embodiments, extracting a first feature point in a current video image and a second feature point in an adjacent video image that matches the first feature point includes:
according to the current attitude data, determining the ground relative projection coordinates of a shooting optical axis of the aircraft in a current video image;
and extracting a first characteristic point from the current video image according to the ground relative projection coordinates, and extracting a second characteristic point matched with the first characteristic point from the adjacent video image.
When the aircraft flies obliquely relative to the ground, the video image it captures is oblique relative to the ground. In this embodiment, the photographing optical axis is the central axis of the optical flow sensor; it is perpendicular to the plane of the current video image, and its intersection point with the ground is called the ground intersection point. The projection pixel point of the ground intersection point in the current video image is then calculated according to the current attitude data, and the pixel position of this projection pixel point is called the ground relative projection coordinate. After the ground relative projection coordinate is determined, the first feature point is extracted in its vicinity. Specifically, referring to fig. 6, fig. 6 is a schematic diagram of determining the ground relative projection coordinate according to an embodiment of the present application. Fig. 6 (a) shows the current video image captured by the aircraft and the intersection point of its imaging optical axis with the ground, and fig. 6 (b) shows the ground relative projection coordinate of the imaging optical axis in the current video image.
For example, when extracting the first feature point in the vicinity of the ground relative projection coordinate, a region may first be delineated around the ground relative projection coordinate, and the first feature point then extracted from that region. The size of the region is not limited here and may be set according to actual requirements. After the first feature point is extracted from the current video image, a second feature point matching it may be extracted from the adjacent video image.
In this embodiment, the ground relative projection coordinate of the photographing optical axis in the current video image is calculated according to the current attitude data, and the first feature point is extracted based on that coordinate, so the first feature point obtained in this way can accurately represent the horizontal speed of the aircraft.
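The projection step above can be sketched in code. The following is a minimal Python sketch assuming a pinhole camera model with intrinsic matrix `K` and a camera-to-world rotation `R_wc` taken from the current attitude data; the patent does not fix a specific camera model, so `ground_projection_pixel` and its inputs are illustrative:

```python
import numpy as np

def ground_projection_pixel(R_wc, K):
    """Pixel where the world 'down' (nadir) direction projects in the image.

    R_wc: 3x3 camera-to-world rotation from the current attitude data.
    K:    3x3 pinhole intrinsic matrix (an assumed camera model).
    """
    down_world = np.array([0.0, 0.0, -1.0])   # gravity direction in the world frame
    down_cam = R_wc.T @ down_world            # same direction in the camera frame
    if down_cam[2] <= 0:
        raise ValueError("ground is not in front of the camera")
    uvw = K @ down_cam                        # project the ray through the intrinsics
    return uvw[:2] / uvw[2]                   # ground relative projection coordinate
```

The first feature points would then be extracted inside a window around the returned pixel, with the window size set according to actual requirements.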
Based on this, another embodiment is provided herein: on the basis of determining the ground relative projection coordinate of the shooting optical axis of the aircraft in the current video image, the ground relative projection coordinate of the shooting optical axis in the adjacent video image is also determined, according to the attitude data of the aircraft at the second shooting moment. A region is then determined from the two ground relative projection coordinates, and the first feature point is extracted from the current video image and the second feature point from the adjacent video image according to that region, thereby obtaining more accurate feature point data. The region may include both ground relative projection coordinates, or may lie between them, as set according to actual requirements, which is not limited herein.
In some embodiments, performing coordinate correction processing on the first feature point and the second feature point according to the rotation angle and the current attitude data, to obtain a first coordinate value of the first feature point and a second coordinate value of the second feature point, includes:
determining a third coordinate value of the first feature point under the image coordinate system and a fourth coordinate value of the second feature point under the image coordinate system;
converting the third coordinate value into a fifth coordinate value under the three-dimensional spherical coordinate system, and converting the fourth coordinate value into a sixth coordinate value under the three-dimensional spherical coordinate system;
converting the fifth coordinate value into a first coordinate value under the world coordinate system according to the current attitude data;
and converting the sixth coordinate value into a second coordinate value under the world coordinate system according to the current attitude data and the rotation angle.
In this embodiment, the first feature point is represented in the current video image by a third coordinate value in the image coordinate system, and the second feature point is represented in the adjacent video image by a fourth coordinate value in the image coordinate system, wherein the image coordinate system is a two-dimensional coordinate system.
To improve the accuracy of the calculation of the current horizontal speed and the current vertical speed, in this embodiment the third coordinate value is converted into the first coordinate value in the world coordinate system, and the fourth coordinate value into the second coordinate value in the world coordinate system, through a series of coordinate system conversions.
Illustratively, the series of coordinate conversions is: image coordinate system → three-dimensional spherical coordinate system → world coordinate system. The image coordinate system indicates the camera imaging plane, the three-dimensional spherical coordinate system indicates three-dimensional space, and the world coordinate system indicates the real environment space. The conversion passes through the following stages:
Image coordinate system → three-dimensional spherical coordinate system stage, with the expressions as follows:
converting the third coordinate value into a fifth coordinate value under a three-dimensional spherical coordinate system:
converting the fourth coordinate value into a sixth coordinate value under a three-dimensional spherical coordinate system:
where m indexes the feature points, m = 1, 2, 3, ..., n, and n is a positive integer; p_m^t denotes the third coordinate value, P_m^t the fifth coordinate value, p_m^(t-Δt) the fourth coordinate value, and P_m^(t-Δt) the sixth coordinate value.
Three-dimensional spherical coordinate system → world coordinate system stage, with the expressions as follows:
converting the fifth coordinate value into a first coordinate value in the world coordinate system:
where P_m^w denotes the first coordinate value, R_w_i denotes the real-time attitude of the aircraft at time t, and R_i_c denotes the calibrated rotation between the optical flow sensor and the inertial measurement unit.
Converting the sixth coordinate value into a second coordinate value in the world coordinate system:
where the converted value denotes the second coordinate value, and R_Δt denotes the rotation angle between the current video image and the adjacent video image.
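The two conversion stages can be sketched as follows, under an assumed pinhole back-projection (the actual image → sphere mapping of the optical flow sensor depends on its lens model); `R_wi`, `R_ic`, and `R_dt` stand for the real-time attitude, the sensor–IMU extrinsic rotation, and the inter-frame rotation angle, and all names are illustrative:

```python
import numpy as np

def pixel_to_sphere(p, K):
    """Image coordinate -> unit vector on the 3D sphere.

    A pinhole back-projection with intrinsics K is assumed here; the optical
    flow sensor's actual mapping depends on its lens model.
    """
    ray = np.linalg.inv(K) @ np.array([p[0], p[1], 1.0])
    return ray / np.linalg.norm(ray)          # fifth / sixth coordinate value

def sphere_to_world(v, R_wi, R_ic, R_dt=None):
    """Sphere coordinate -> world coordinate.

    R_wi: real-time attitude of the aircraft at time t.
    R_ic: calibrated rotation between the optical flow sensor and the IMU.
    R_dt: rotation between the two frames; supplied only when converting a
          second feature point from the adjacent image.
    """
    if R_dt is not None:
        v = R_dt @ v                          # align the adjacent frame to the current frame
    return R_wi @ R_ic @ v                    # first / second coordinate value
```

Applying `R_dt` only to the second feature points mirrors the embodiment, where the rotation angle enters the conversion of the sixth coordinate value but not the fifth.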
In some embodiments, there are a plurality of first feature points, each corresponding to a second feature point; determining a current vertical speed of the aircraft at a first shooting moment according to the first coordinate value, the second coordinate value and the ground relative height, wherein the method comprises the following steps:
determining a first distance between the first coordinate values of at least one pair of first feature points on the normalized plane, and determining a second distance between the second coordinate values of the second feature points corresponding to the at least one pair of first feature points on the normalized plane;
and determining the current vertical speed of the aircraft at the first shooting moment according to the first distance, the second distance and the ground relative height.
The normalized plane indicates the plane located at z = 1 in front of the camera along the imaging optical axis of the aircraft. In this embodiment, the current vertical speed is determined based on the near-large, far-small imaging principle of the camera, using at least one pair of first feature points in the current video image and the corresponding pair of second feature points in the adjacent video image, where each pair of first feature points and pair of second feature points corresponds to the same pair of elements observed in the two images.
Referring to fig. 7, fig. 7 is a schematic diagram of processing the first feature points and the second feature points based on the near-large, far-small imaging principle of the camera according to an embodiment of the present application. In fig. 7, a pair of feature points on the ground is denoted by X0 and Y0. When the aircraft is at position point 1, the coordinates of X0 and Y0 on the normalized plane are denoted by X1 and Y1, respectively, and the distance between X1 and Y1 is denoted by d1. When the aircraft is at position point 2, the coordinates of X0 and Y0 on the normalized plane are denoted by X2 and Y2, respectively, and the distance between X2 and Y2 is denoted by d2. According to the triangle similarity principle: d1/d2 = H2/H1;
where H1 represents the ground relative altitude of the aircraft at position point 1 and H2 represents the ground relative altitude of the aircraft at position point 2.
For the same pair of feature points X0Y0, there is the following expression:
X0Y0=H1*d1=H2*d2;
In this embodiment, at least one pair of first feature points is selected together with the corresponding second feature points, and the at least one pair of first feature points is normalized to obtain normalized coordinates. After the ground relative height of the aircraft at the first shooting moment is determined, the current vertical speed of the aircraft at the first shooting moment can be determined from the ground relative height, the first distance between the at least one pair of first feature points determined from the current video image, and the second distance between the at least one pair of second feature points determined from the adjacent video image. See in particular the examples below.
In the coordinate conversion, the three-dimensional spherical coordinate system → normalized plane stage is expressed as follows:
Converting the first coordinate value into a normalized coordinate value on the normalized plane, and converting the second coordinate value into a normalized coordinate value on the normalized plane. Illustratively, the first feature points and the second feature points each include a plurality of points indicating a plurality of elements. Taking one pair of feature points in the image as an example, if there are two first feature points in the current video image, denoted p_1^t and p_2^t, then when the first coordinate values are converted into normalized coordinate values on the normalized plane, P_1^t corresponds to p_1^t and P_2^t corresponds to p_2^t. Correspondingly, the second feature points in the adjacent video image are denoted p_1^(t-Δt) and p_2^(t-Δt), and when the second coordinate values are converted into normalized coordinate values on the normalized plane, P_1^(t-Δt) corresponds to p_1^(t-Δt) and P_2^(t-Δt) corresponds to p_2^(t-Δt). Δt denotes the shooting interval, i.e., the time difference between the first shooting moment and the second shooting moment.
The first distance between P_1^t and P_2^t is denoted d_t, and the second distance between P_1^(t-Δt) and P_2^(t-Δt) is denoted d_(t-Δt). The ratio between d_t and d_(t-Δt) is as follows:
d_t / d_(t-Δt) = sqrt((x_1^t - x_2^t)^2 + (y_1^t - y_2^t)^2) / sqrt((x_1^(t-Δt) - x_2^(t-Δt))^2 + (y_1^(t-Δt) - y_2^(t-Δt))^2)
where x_1^t and y_1^t denote the coordinate values of P_1^t on the x-axis and y-axis, x_2^t and y_2^t denote the coordinate values of P_2^t on the x-axis and y-axis, x_1^(t-Δt) and y_1^(t-Δt) denote the coordinate values of P_1^(t-Δt) on the x-axis and y-axis, and x_2^(t-Δt) and y_2^(t-Δt) denote the coordinate values of P_2^(t-Δt) on the x-axis and y-axis.
Denoting by S the ratio of d_t to d_(t-Δt), the expression is as follows: S = d_t / d_(t-Δt);
Denoting by H_t the ground relative height of the aircraft at the first shooting moment, and by H_(t-Δt) the ground relative height of the aircraft at the second shooting moment, the relationship among H_t, H_(t-Δt), and S is as follows: H_(t-Δt) = S * H_t;
Since V_z = (H_t - H_(t-Δt)) / Δt, the current vertical velocity V_z calculated from S and H_t is expressed by the following formula:
V z =H t *(1-S)/Δt;
As described above, in this embodiment, the ratio of the ground relative height of the aircraft at the first shooting moment to that at the second shooting moment is determined from the ratio of the first distance to the second distance, so that the current vertical speed is obtained by differencing the ground relative heights at the two shooting moments. The current vertical speed thus obtained is not affected by fluctuations of the terrain, which ensures its monitoring accuracy. In addition, the current vertical speed obtained in this way is smoother and closer to the real flying speed, so the monitoring accuracy is greatly improved.
It can be understood that the current vertical velocity can be calculated by the above method from one pair of first feature points and the corresponding pair of second feature points. When there are multiple pairs of first feature points and corresponding second feature points, multiple current vertical velocities can be calculated in combination with the ground relative height. A final vertical speed can then be obtained by averaging, taking the median of, or otherwise combining the multiple current vertical velocities, and this final value is used as the current vertical speed of the aircraft at the first shooting moment.
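The vertical-speed computation for one feature pair can be condensed into a short Python sketch; the function name and argument layout are illustrative:

```python
from math import hypot

def vertical_speed(p1_t, p2_t, p1_prev, p2_prev, H_t, dt):
    """Current vertical speed from one feature pair on the normalized plane.

    p1_t/p2_t:       the pair in the current image (normalized plane, z = 1)
    p1_prev/p2_prev: the same pair in the adjacent image
    H_t:             tilt-corrected ground relative height at time t
    dt:              time difference between the two shooting moments
    """
    d_t = hypot(p1_t[0] - p2_t[0], p1_t[1] - p2_t[1])                  # first distance
    d_prev = hypot(p1_prev[0] - p2_prev[0], p1_prev[1] - p2_prev[1])   # second distance
    S = d_t / d_prev               # H_(t-dt) = S * H_t by similar triangles
    return H_t * (1.0 - S) / dt    # V_z = (H_t - H_(t-dt)) / dt
```

With multiple pairs, this function would be evaluated per pair and the results averaged or median-filtered, as described above.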
In some embodiments, determining the current horizontal velocity of the aircraft at the first capture moment based on the first coordinate value, the second coordinate value, and the ground relative altitude comprises:
determining an optical flow vector from the first coordinate value and the second coordinate value;
the current horizontal velocity of the aircraft at the first shooting moment is determined according to the optical flow vector and the relative ground height.
When calculating the current horizontal velocity, an optical flow vector is first determined from the first coordinate value and the second coordinate value, with the expression as follows:
f = (P^t - P^(t-Δt)) / Δt
where f represents the optical flow vector, P^(t-Δt) represents the second coordinate value, P^t represents the first coordinate value, Δt represents the shooting interval, and t represents the first shooting moment.
When the optical flow vector is subjected to scale restoration, the current horizontal speed is obtained based on the ground relative height, with the expression as follows:
V L =f*H t
alternatively, expressed by the following expression:
where V_L represents the current horizontal velocity and H_t represents the ground relative height.
Illustratively, the current horizontal velocity V_L also comprises a component horizontal velocity V_L-x on the x-axis and a component horizontal velocity V_L-y on the y-axis. The expressions for V_L-x and V_L-y are as follows:
it will be appreciated that the velocity V may also be based on the component horizontal velocity L-x Sum component horizontal velocity V L-y Calculating to obtain the current horizontal velocity V L . In this embodiment, the current horizontal velocity may be calculated by using one feature point from each of the current video image and the adjacent video image, or a plurality of feature points may be used, and when a plurality of feature points are used, the horizontal velocity may be calculated by each pair of feature points, and finally, an average value is calculated for all the calculated horizontal velocities to be used as the current horizontal velocity.
In some embodiments, acquiring the rotation angle between the current video image and the neighboring video image includes:
acquiring a plurality of angular velocities of the aircraft from a second shooting moment to a first shooting moment, wherein the second shooting moment is the moment when the aircraft shoots adjacent video images;
the rotation angle between the current video image and the adjacent video image is determined from the plurality of angular velocities.
In this embodiment, the rotation angle is obtained by acquiring a plurality of continuous angular velocities of the aircraft between the first shooting moment and the second shooting moment and integrating them, where the plurality of angular velocities includes a first angular velocity and a second angular velocity. By integrating the angular velocities over the continuous time interval, the rotation angle approaches the true value, which improves its accuracy.
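The integration can be sketched as follows, using a per-sample small-rotation integrator as one plausible realization; the patent does not specify the integration scheme, so this is an assumption:

```python
import numpy as np

def rotation_between_frames(gyro_samples, dt_sample):
    """Integrate angular-velocity samples between the two shooting moments
    into a rotation matrix (one small rotation per sample)."""
    R = np.eye(3)
    for w in gyro_samples:                        # w: (wx, wy, wz) in rad/s, body frame
        wx, wy, wz = (c * dt_sample for c in w)   # small rotation over one sample
        W = np.array([[0.0, -wz,  wy],
                      [ wz, 0.0, -wx],
                      [-wy,  wx, 0.0]])           # skew-symmetric matrix of the step
        R = R @ (np.eye(3) + W + 0.5 * W @ W)     # 2nd-order matrix exponential
    return R
```

The shorter the sample interval, the closer the composed rotation approaches the true rotation angle, matching the reasoning above.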
In some embodiments, determining the current horizontal velocity and the current vertical velocity of the aircraft at the first capture moment based on the rotation angle, the current attitude data, the ground relative altitude, the current video image, and the adjacent video image further comprises:
determining the current horizontal speed and the current vertical speed as a current speed pair;
acquiring a historical speed pair and historical attitude data corresponding to the aircraft at a second shooting moment, wherein the historical speed pair comprises a historical horizontal speed and a historical vertical speed corresponding to the second shooting moment;
carrying out speed prediction according to the historical attitude data, the current attitude data and the historical speed pair to obtain a predicted speed pair;
and carrying out correction processing on the current speed pair according to the predicted speed pair to obtain a corrected speed pair.
In this embodiment, a way is provided of predicting the speed pair corresponding to the current video image from the adjacent video image, its corresponding historical speed pair, and the current attitude data.
Each set of attitude data corresponds to one speed pair. Based on the historical speed pair, the speed pair corresponding to the current video image is predicted by integrating the angular velocity and the acceleration in the historical attitude data and the current attitude data; the predicted result is called a predicted speed pair. Specifically, starting from the historical speed pair at the second shooting moment, the angular velocity and acceleration between the second shooting moment and the first shooting moment are integrated to obtain the predicted speed pair at the first shooting moment.
Further, after the predicted speed pair is obtained, a correction process is further performed on the current speed pair based on the predicted speed pair. The correction processing mode can obtain the average value of the predicted speed pair and the current speed pair, and the average value is used as the current speed pair after correction processing.
Of course, the predicted speed pair and the current speed pair may also be weighted-averaged to obtain the corrected current speed pair, with a weight set for each of the predicted speed pair and the current speed pair. The weights may be user-defined, or determined from the time difference between the two frames of video images and the noise of the inertial measurement unit, with the weights inversely proportional to the time difference and the noise.
For example, fusion processing may also be performed on the predicted speed pair and the current speed pair through a Kalman filtering algorithm, with the fused value used as the speed pair after the correction processing.
As described above, by correcting the current speed pair, on the one hand, calculation errors in the historical speed pairs are less likely to accumulate into the current speed pair; on the other hand, the noise of the current speed pair is reduced and the difference between the current speed pair and the historical speed pair is narrowed, so the speed monitoring precision is greatly improved.
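A minimal sketch of the weighted-average correction; the specific weighting function (inversely related to the frame time difference and the IMU noise) is an assumption consistent with, but not dictated by, the description above:

```python
def correct_speed_pair(current, predicted, frame_dt, imu_noise):
    """Weighted-average correction of the current speed pair.

    current / predicted: (horizontal, vertical) speed pairs.
    frame_dt:  time difference between the two video frames.
    imu_noise: a scalar noise figure for the inertial measurement unit.
    The weighting formula here is illustrative, not the patent's.
    """
    w_pred = 1.0 / (1.0 + frame_dt * imu_noise)   # smaller dt / noise -> trust prediction more
    w_meas = 1.0 - w_pred
    return (w_pred * predicted[0] + w_meas * current[0],
            w_pred * predicted[1] + w_meas * current[1])
```

With `frame_dt = imu_noise = 1`, this degenerates to the plain average mentioned first; a Kalman filter would replace the fixed weights with covariance-driven gains.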
In some embodiments, the correcting the current speed pair according to the predicted speed pair to obtain a corrected speed pair includes:
performing low-frequency filtering processing on the current speed pair to obtain a current candidate speed pair;
performing high-frequency filtering processing on the predicted speed pair to obtain a predicted candidate speed pair;
and carrying out correction processing on the current candidate speed pair according to the predicted candidate speed pair to obtain a corrected speed pair.
In this embodiment, performing low-frequency filtering processing on the current speed pair reduces its noise and thereby optimizes the current speed pair. The predicted speed pair is likewise subjected to high-frequency filtering processing to obtain a smooth predicted candidate speed pair, which reduces the hysteresis of the current speed pair.
The correction processing of the current candidate speed pair according to the predicted candidate speed pair may be an average or a weighted average; reference may be made to the description above, and details are not repeated here.
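The filtering step can be sketched with first-order filters for one speed component (one such filter each for the horizontal and vertical speeds); `alpha` and the final 0.5/0.5 blend are illustrative choices, not the patent's parameters:

```python
class ComplementarySpeedFilter:
    """Low-pass the measured speed, high-pass the predicted speed, then
    average the two candidates, mirroring the three steps above."""

    def __init__(self, alpha=0.8):
        self.alpha = alpha
        self.lp = 0.0          # low-pass state: current candidate speed
        self.hp = 0.0          # high-pass state: predicted candidate speed
        self.prev_pred = 0.0   # previous prediction (states start at zero)

    def update(self, measured, predicted):
        # first-order low-pass on the measured (current) speed: less noise
        self.lp = self.alpha * self.lp + (1.0 - self.alpha) * measured
        # first-order high-pass on the predicted speed: keeps fast changes
        self.hp = self.alpha * (self.hp + predicted - self.prev_pred)
        self.prev_pred = predicted
        return 0.5 * (self.lp + self.hp)   # corrected speed (plain average)
```

For steady flight the low-pass branch converges to the measurement while the high-pass branch decays toward zero, so the correction is dominated by the smoothed measurement.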
In some embodiments, determining the current horizontal velocity and the current vertical velocity of the aircraft at the first capture moment based on the rotation angle, the current attitude data, the ground relative altitude, the current video image, and the adjacent video image further comprises:
Acquiring a visual flying speed of the aircraft at a first shooting moment, wherein the visual flying speed is detected by a visual inertial system of the aircraft;
determining speed difference information between the current horizontal speed, the current vertical speed and the visual flying speed;
and determining a target control strategy of the visual inertial system according to the speed difference information, and executing the target control strategy.
In this embodiment, determining the speed difference information between the current horizontal speed and the visual flying speed means determining whether the two speed values are the same; similarly, it is determined whether the speed values of the current vertical speed and the visual flying speed are the same.
Further, the visual flying speed may include a flight speed in the horizontal direction and a flight speed in the vertical direction. The flight speed in the horizontal direction is compared with the current horizontal speed, and the flight speed in the vertical direction is compared with the current vertical speed, to obtain the speed difference information.
Illustratively, control strategies for the visual inertial system (also referred to as VIO) include, but are not limited to: an initialization strategy, a restart strategy, a speed replacement strategy, and the like. In specific use, one control strategy is selected from the plurality of control strategies according to the speed difference information to serve as the target control strategy.
For example, if the visual flying speed is a specific value, but the speed difference information indicates that the difference between the visual flying speed and the current horizontal speed and the current vertical speed is large, the restarting strategy is used as the target control strategy, so as to control the visual inertial system to restart.
For another example, if the visual flying speed is an invalid value, the initialization strategy is used as a target control strategy, and then the visual inertial system is controlled to be initialized. The current horizontal speed and the current vertical speed are used as the initial visual flying speed of the visual inertial system during initialization.
For another example, if the visual flight speed is a specific value, but the speed difference information indicates that the aircraft is in fixed-point flight, the speed replacement strategy is used as the target control strategy, and then the current horizontal speed and the current vertical speed are used as the speed input of the flight controller. Wherein fixed point flight includes hovering.
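The strategy selection in the examples above can be sketched as a simple dispatch; the threshold, the fixed-point test, and the return layout are illustrative assumptions:

```python
def select_vio_strategy(v_horizontal, v_vertical, vio_speed, diff_threshold=1.0):
    """Pick a target control strategy for the visual inertial system (VIO).

    vio_speed: (horizontal, vertical) visual flying speed, or None when the
    VIO reports an invalid value.
    """
    if vio_speed is None:
        # invalid value -> initialize, seeding the VIO with the current speeds
        return "initialize", (v_horizontal, v_vertical)
    if (abs(vio_speed[0] - v_horizontal) > diff_threshold
            or abs(vio_speed[1] - v_vertical) > diff_threshold):
        return "restart", None                       # large discrepancy -> restart VIO
    if abs(v_horizontal) < 1e-2 and abs(v_vertical) < 1e-2:
        # fixed-point flight (e.g. hovering) -> feed our speeds to the flight controller
        return "speed_replacement", (v_horizontal, v_vertical)
    return "none", None
```

The branch order follows the three examples: invalid VIO output first, large discrepancy second, fixed-point flight last.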
From the above, the speed monitoring method of the aircraft provided by the embodiments of the present application can eliminate the influence of changes in the flight attitude of the aircraft on the monitoring of the current horizontal speed and the current vertical speed, and thus monitor the current horizontal speed and the current vertical speed of the aircraft in real time more accurately. Because the current horizontal speed and the current vertical speed are calculated in a differential manner, relatively smooth values of both are obtained and speed monitoring errors are reduced. On this basis, after the current horizontal speed and the current vertical speed are obtained, both are corrected, so that, while error accumulation is avoided, their noise is reduced and the difference between current and historical values is narrowed, yielding corrected current horizontal and vertical speeds of higher accuracy. In addition, the current horizontal speed and the current vertical speed serve as reference conditions for flight control of the aircraft, enabling the aircraft to carry out its flight plan efficiently. Furthermore, using the current horizontal speed and the current vertical speed to control the visual inertial system can well avoid situations in which the visual inertial system fails in detection or the aircraft risks crashing.
In one embodiment, an aircraft speed monitoring device 200 is also provided. Referring to fig. 8, fig. 8 is a schematic structural diagram of an aircraft speed monitoring device 200 according to an embodiment of the present disclosure. Wherein the speed monitoring device 200 of the aircraft is applied to the aircraft, the speed monitoring device 200 of the aircraft comprises:
the data acquisition module 201 is configured to acquire a current video image captured during flight of the aircraft and an adjacent video image thereof, and current attitude data and a ground relative height of the aircraft at a first capturing moment of the current video image;
the rotation angle measuring module 202 is configured to obtain a rotation angle between a current video image and an adjacent video image;
the speed monitoring module 203 is configured to determine a current horizontal speed and a current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video image.
In some embodiments, the speed monitoring module 203 is further to:
extracting a first characteristic point in a current video image and a second characteristic point matched with the first characteristic point in an adjacent video image;
respectively carrying out coordinate correction processing on the first feature point and the second feature point according to the rotation angle and the current attitude data to obtain a first coordinate value of the first feature point and a second coordinate value of the second feature point;
And determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the first coordinate value, the second coordinate value and the ground relative height.
In some embodiments, the speed monitoring module 203 is further to:
determining a first distance between the first coordinate values of at least one pair of first feature points on the normalized plane, and determining a second distance between the second coordinate values of the second feature points corresponding to the at least one pair of first feature points on the normalized plane;
and determining the current vertical speed of the aircraft at the first shooting moment according to the first distance, the second distance and the ground relative height.
In some embodiments, the speed monitoring module 203 is further to:
determining an optical flow vector from the first coordinate value and the second coordinate value;
the current horizontal speed of the aircraft at the first shooting moment is determined according to the optical flow vector and the ground relative height.
In some embodiments, the speed monitoring module 203 is further to:
determining a third coordinate value of the first feature point under the image coordinate system and a fourth coordinate value of the second feature point under the image coordinate system;
converting the third coordinate value into a fifth coordinate value under the three-dimensional spherical coordinate system, and converting the fourth coordinate value into a sixth coordinate value under the three-dimensional spherical coordinate system;
converting the fifth coordinate value into a first coordinate value under the world coordinate system according to the current attitude data;
and converting the sixth coordinate value into a second coordinate value under the world coordinate system according to the current attitude data and the rotation angle.
In some embodiments, the speed monitoring module 203 is further to:
according to the current attitude data, determining the ground relative projection coordinates of a shooting optical axis of the aircraft in a current video image;
and extracting a first characteristic point from the current video image according to the ground relative projection coordinates, and extracting a second characteristic point matched with the first characteristic point from the adjacent video image.
In some embodiments, the rotation angle measurement module 202 is further configured to:
acquiring a plurality of angular velocities of the aircraft from a second shooting moment to a first shooting moment, wherein the second shooting moment is the moment when the aircraft shoots adjacent video images;
the rotation angle between the current video image and the adjacent video image is determined from the plurality of angular velocities.
In some embodiments, the speed monitoring device 200 of the aircraft further comprises an altitude correction module for:
acquiring the flight height of an aircraft at the first shooting moment of a current video image;
and performing inclination angle correction processing on the flying height according to the current attitude data to obtain the ground relative height.
In some embodiments, the speed monitoring device 200 of the aircraft further comprises a speed correction module for:
determining the current horizontal speed and the current vertical speed as a current speed pair;
acquiring a historical speed pair and historical attitude data corresponding to the aircraft at a second shooting moment, wherein the historical speed pair comprises a historical horizontal speed and a historical vertical speed corresponding to the second shooting moment;
carrying out speed prediction according to the historical attitude data, the current attitude data and the historical speed pair to obtain a predicted speed pair;
and carrying out correction processing on the current speed pair according to the predicted speed pair to obtain a corrected speed pair.
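One plausible reading of the prediction step is to propagate the historical speed pair through the attitude change between the two shooting moments. The 2×2-rotation treatment of the (horizontal, vertical) pair below is purely illustrative; the patent does not specify the prediction model.

```python
import numpy as np

def predict_speed_pair(hist_speed_pair, r_hist, r_cur):
    """Predict the current (horizontal, vertical) speed pair from the
    historical pair by applying the relative attitude rotation between
    the second and first shooting moments (an assumed model)."""
    v = np.array(hist_speed_pair, dtype=float)
    r_rel = r_cur @ r_hist.T  # attitude change between the two moments
    return r_rel @ v

# With no attitude change, the prediction is simply the historical pair.
print(predict_speed_pair((1.5, -0.3), np.eye(2), np.eye(2)))
```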
In some embodiments, the speed correction module is further configured to:
performing low-frequency filtering processing on the current speed pair to obtain a current candidate speed pair;
performing high-frequency filtering processing on the predicted speed pair to obtain a predicted candidate speed pair;
and carrying out correction processing on the current candidate speed pair according to the predicted candidate speed pair to obtain a corrected speed pair.
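Low-frequency filtering of the measured pair combined with high-frequency filtering of the predicted pair is the structure of a complementary filter. The one-step blend below is a minimal sketch of that idea; the blending factor alpha is an assumed tuning parameter not given by the text.

```python
class ComplementaryFilter:
    """Fuse the measured (current) speed pair with the predicted pair:
    low-pass the noisy measurement, high-pass the smooth but drift-prone
    prediction, and sum the two parts (classic complementary filter)."""

    def __init__(self, alpha=0.8):
        self.alpha = alpha  # assumed weighting toward the prediction

    def correct(self, current_pair, predicted_pair):
        # per-component blend of measurement and prediction
        return tuple(
            (1.0 - self.alpha) * cur + self.alpha * pred
            for cur, pred in zip(current_pair, predicted_pair)
        )

f = ComplementaryFilter(alpha=0.8)
print(f.correct((2.0, 0.4), (1.0, 0.0)))  # roughly (1.2, 0.08)
```

The blend suppresses high-frequency optical-flow noise without letting the prediction drift accumulate, matching the stated goal of reducing noise while avoiding error accumulation.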
In some embodiments, the speed monitoring device 200 of the aircraft further comprises a visual inertial system control module for:
acquiring a visual flying speed of the aircraft at a first shooting moment, wherein the visual flying speed is detected by a visual inertial system of the aircraft;
determining speed difference information between the current horizontal speed, the current vertical speed and the visual flying speed;
and determining a target control strategy of the visual inertia system according to the speed difference information, and executing the target control strategy.
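A minimal sketch of the cross-check between the optical-flow speeds and the visual inertial system's reported speed follows. The Euclidean comparison, the 1 m/s threshold, and the strategy names are illustrative assumptions; the patent says only that the target control strategy follows from the speed difference information.

```python
def vio_control_strategy(cur_h, cur_v, vio_speed, threshold=1.0):
    """Choose a control strategy for the visual inertial system by
    comparing its reported (horizontal, vertical) speed against the
    optical-flow-derived speeds (assumed comparison and threshold)."""
    diff = ((cur_h - vio_speed[0]) ** 2 + (cur_v - vio_speed[1]) ** 2) ** 0.5
    if diff > threshold:
        # large disagreement: treat the VIO output as suspect, e.g. to
        # avoid a failed detection or a crash
        return "reset_vio"
    return "keep_vio"

print(vio_control_strategy(1.0, 0.1, (1.1, 0.1)))  # small difference
print(vio_control_strategy(1.0, 0.1, (4.0, 2.0)))  # large difference
```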
It should be noted that the speed monitoring device 200 of the aircraft provided in the embodiments of the present application is based on the same concept as the speed monitoring method of the aircraft in the above embodiments. Any method provided in the speed monitoring method embodiments may be implemented by the speed monitoring device 200; the detailed implementation process is described in the method embodiments and is not repeated herein.
As can be seen from the above, the speed monitoring device 200 for an aircraft provided in the embodiments of the present application can eliminate the influence of changes in the flight attitude of the aircraft when monitoring the current horizontal speed and the current vertical speed, and can therefore monitor the current horizontal speed and the current vertical speed of the aircraft in real time more accurately. Because the current horizontal speed and the current vertical speed are calculated by differential calculation, relatively smooth values are obtained and speed monitoring errors are reduced. On this basis, after the current horizontal speed and the current vertical speed are obtained, they are corrected simultaneously, which reduces their noise while avoiding error accumulation and reduces the difference between them, so that corrected speeds of higher accuracy are obtained. In addition, the current horizontal speed and the current vertical speed serve as reference conditions for flight control of the aircraft, so that the aircraft can efficiently implement a flight plan. Furthermore, using the current horizontal speed and the current vertical speed to control the visual inertial system helps avoid situations in which the visual inertial system fails to detect or the aircraft risks a crash.
Embodiments of the present application also provide an aircraft including, but not limited to, a drone, a balloon, an airplane, a glider, a helicopter, and the like. The aircraft includes a body, an optical flow sensor, a distance sensor, and a processor with one or more processing cores. The optical flow sensor and the distance sensor are arranged at the bottom of the body and are each in communication connection with the processor. It will be appreciated by those skilled in the art that the aircraft structure shown in the figures does not limit the aircraft, which may include more or fewer components than shown, combine certain components, or arrange the components differently.
The processor is the control center of the aircraft: it connects the various parts of the whole aircraft through various interfaces and lines, executes the functions of the aircraft and processes data, thereby monitoring the aircraft as a whole.
In an embodiment of the present application, a processor in an aircraft is configured to implement the following functions:
acquiring a current video image and an adjacent video image thereof shot when the aircraft flies, and current attitude data and ground relative height of the aircraft at a first shooting moment of the current video image;
acquiring a rotation angle between a current video image and an adjacent video image;
and determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video images.
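The horizontal-speed part of this determination follows the standard optical-flow ranging relation: a normalized-plane flow displacement scaled by the ground relative height and frame interval gives a metric speed. The sketch below illustrates that relation; the variable names and per-frame flow units are assumptions, not quantities defined by the patent.

```python
def horizontal_speed(flow_x, flow_y, ground_height, dt):
    """Scale a normalized-image-plane optical-flow displacement
    (per frame) by the ground relative height and the frame interval
    to obtain metric horizontal speed components: v = h * flow / dt."""
    return (ground_height * flow_x / dt, ground_height * flow_y / dt)

# 0.002 normalized units of flow at 10 m height over 33 ms
# corresponds to roughly 0.6 m/s.
vx, vy = horizontal_speed(0.002, 0.0, 10.0, 0.033)
print(vx, vy)
```

This is why the ground relative height appears alongside the feature-point coordinates in the speed determination: without the height, optical flow only yields speed up to scale.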
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
From the above, the aircraft provided by the embodiments can eliminate the influence of changes in its flight attitude when monitoring the current horizontal speed and the current vertical speed, and can therefore monitor the current horizontal speed and the current vertical speed of the aircraft in real time more accurately. Because the current horizontal speed and the current vertical speed are calculated by differential calculation, relatively smooth values are obtained and speed monitoring errors are reduced. On this basis, after the current horizontal speed and the current vertical speed are obtained, they are corrected simultaneously, which reduces their noise while avoiding error accumulation and reduces the difference between them, so that corrected speeds of higher accuracy are obtained. In addition, the current horizontal speed and the current vertical speed serve as reference conditions for flight control of the aircraft, so that the aircraft can efficiently implement a flight plan. Furthermore, using the current horizontal speed and the current vertical speed to control the visual inertial system helps avoid situations in which the visual inertial system fails to detect or the aircraft risks a crash.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the embodiments of the present application provide a computer-readable storage medium on which a program is stored; when executed, the program performs the following steps:
acquiring a current video image and an adjacent video image thereof shot when the aircraft flies, and current attitude data and ground relative height of the aircraft at a first shooting moment of the current video image;
acquiring a rotation angle between a current video image and an adjacent video image;
and determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video images.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
The storage medium may be a ROM/RAM, a magnetic disk, an optical disk, or the like. Because the computer program stored in the storage medium can execute the steps of any method for monitoring the speed of an aircraft provided in the embodiments of the present application, it can achieve the beneficial effects of any such method; these are detailed in the previous embodiments and are not repeated herein.
The foregoing has described in detail the method, apparatus, storage medium and aircraft for speed monitoring provided by the embodiments of the present application. Specific examples have been used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may vary the specific implementation and application scope according to the ideas of the present application. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (14)

1. A method of speed monitoring of an aircraft, the method comprising:
Acquiring a current video image and an adjacent video image thereof shot when an aircraft flies, and current attitude data and ground relative height of the aircraft at a first shooting moment of the current video image;
acquiring a rotation angle between the current video image and the adjacent video image;
and determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video image.
2. The method of claim 1, wherein determining the current horizontal velocity and the current vertical velocity of the aircraft at the first capture time based on the rotation angle, the current attitude data, the ground relative altitude, the current video image, and the neighboring video image comprises:
extracting a first characteristic point in the current video image and a second characteristic point matched with the first characteristic point in the adjacent video image;
performing coordinate correction processing on the first characteristic point and the second characteristic point according to the rotation angle and the current attitude data to obtain a first coordinate value of the first characteristic point and a second coordinate value of the second characteristic point;
And determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the first coordinate value, the second coordinate value and the ground relative height.
3. The method of claim 2, wherein there are a plurality of first feature points, each of the first feature points corresponding to a second feature point; the determining the current vertical speed of the aircraft at the first shooting moment according to the first coordinate value, the second coordinate value and the ground relative height comprises the following steps:
determining a first distance of the first coordinate values of at least one pair of the first feature points on a normalization plane, and determining a second distance of the second coordinate values of the second feature points corresponding to the at least one pair of the first feature points on the normalization plane;
and determining the current vertical speed of the aircraft at the first shooting moment according to the first distance, the second distance and the ground relative height.
4. The method of claim 2, wherein said determining the current horizontal velocity of the aircraft at the first shooting time based on the first coordinate value, the second coordinate value, and the ground relative altitude comprises:
Determining an optical flow vector from the first coordinate value and the second coordinate value;
the current horizontal velocity of the aircraft at the first shooting moment is determined according to the optical flow vector and the ground relative height.
5. The method according to claim 2, wherein the performing coordinate correction processing on the first feature point and the second feature point according to the rotation angle and the current attitude data, respectively, obtains a first coordinate value of the first feature point and a second coordinate value of the second feature point, includes:
determining a third coordinate value of the first feature point under an image coordinate system and a fourth coordinate value of the second feature point under the image coordinate system;
converting the third coordinate value into a fifth coordinate value under a three-dimensional spherical coordinate system, and converting the fourth coordinate value into a sixth coordinate value under the three-dimensional spherical coordinate system;
converting the fifth coordinate value into the first coordinate value under a world coordinate system according to the current attitude data;
and converting the sixth coordinate value into the second coordinate value under the world coordinate system according to the current attitude data and the rotation angle.
6. The method of claim 2, wherein the extracting a first feature point in the current video image and a second feature point in the neighboring video image that matches the first feature point comprises:
according to the current attitude data, determining the ground relative projection coordinates of the shooting optical axis of the aircraft in the current video image;
and extracting the first characteristic point from the current video image and extracting the second characteristic point matched with the first characteristic point from the adjacent video image according to the ground relative projection coordinates.
7. The method of claim 1, wherein the acquiring the rotation angle between the current video image and the adjacent video image comprises:
acquiring a plurality of angular velocities of the aircraft from a second shooting moment to the first shooting moment, wherein the second shooting moment is the moment when the aircraft shoots the adjacent video images;
the rotation angle between the current video image and the adjacent video image is determined from the plurality of angular velocities.
8. The method of speed monitoring of an aircraft according to any one of claims 1 to 7, wherein the acquiring the ground relative altitude of the aircraft at the first instant of capturing the current video image comprises:
acquiring the flying height of the aircraft at the first shooting moment of the current video image;
and performing inclination angle correction processing on the flying height according to the current attitude data to obtain the ground relative height.
9. The method according to any one of claims 1 to 7, wherein after the determining the current horizontal velocity and the current vertical velocity of the aircraft at the first shooting time according to the rotation angle, the current attitude data, the ground relative altitude, the current video image, and the adjacent video image, the method further comprises:
determining said current horizontal velocity and said current vertical velocity as a current velocity pair;
acquiring a historical speed pair and historical posture data corresponding to the aircraft at a second shooting moment, wherein the historical speed pair comprises a historical horizontal speed and a historical vertical speed of the aircraft when shooting the adjacent video images;
carrying out speed prediction according to the historical attitude data, the current attitude data and the historical speed pair to obtain a predicted speed pair;
and carrying out correction processing on the current speed pair according to the predicted speed pair to obtain a corrected speed pair.
10. The method of claim 9, wherein the correcting the current speed pair according to the predicted speed pair to obtain a corrected speed pair comprises:
performing low-frequency filtering processing on the current speed pair to obtain a current candidate speed pair;
performing high-frequency filtering processing on the predicted speed pair to obtain a predicted candidate speed pair;
and carrying out correction processing on the current candidate speed pair according to the predicted candidate speed pair to obtain a speed pair after correction processing.
11. The method according to any one of claims 1 to 7, wherein after the determining the current horizontal velocity and the current vertical velocity of the aircraft at the first shooting time according to the rotation angle, the current attitude data, the ground relative altitude, the current video image, and the adjacent video image, the method further comprises:
Acquiring the visual flying speed of the aircraft at the first shooting moment, wherein the visual flying speed is detected by a visual inertial system of the aircraft;
determining speed difference information between the current horizontal speed, the current vertical speed, and the visual flight speed;
and determining a target control strategy of the visual inertia system according to the speed difference information, and executing the target control strategy.
12. A speed monitoring device for an aircraft, comprising:
the data acquisition module is used for acquiring a current video image and an adjacent video image thereof shot by the aircraft during flight, and current attitude data and ground relative height of the aircraft at a first shooting moment of the current video image;
the rotation angle measuring and calculating module is used for acquiring the rotation angle between the current video image and the adjacent video image;
and the speed monitoring module is used for determining the current horizontal speed and the current vertical speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the ground relative height, the current video image and the adjacent video image.
13. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when run on a computer, causes the computer to carry out the method of speed monitoring of an aircraft according to any one of claims 1 to 11.
14. An aircraft, characterized in that it comprises a body, a processor, an optical flow sensor and a distance sensor, the optical flow sensor and the distance sensor being provided at the bottom of the body, the processor being configured to perform the method of speed monitoring of an aircraft according to any one of claims 1 to 11.
CN202211199668.4A 2022-09-29 2022-09-29 Method and device for monitoring speed of aircraft, storage medium and aircraft Pending CN117826879A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211199668.4A CN117826879A (en) 2022-09-29 2022-09-29 Method and device for monitoring speed of aircraft, storage medium and aircraft
PCT/CN2023/121150 WO2024067498A1 (en) 2022-09-29 2023-09-25 Aircraft speed monitoring method and apparatus, storage medium, and aircraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211199668.4A CN117826879A (en) 2022-09-29 2022-09-29 Method and device for monitoring speed of aircraft, storage medium and aircraft

Publications (1)

Publication Number Publication Date
CN117826879A true CN117826879A (en) 2024-04-05

Family

ID=90476298

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211199668.4A Pending CN117826879A (en) 2022-09-29 2022-09-29 Method and device for monitoring speed of aircraft, storage medium and aircraft

Country Status (2)

Country Link
CN (1) CN117826879A (en)
WO (1) WO2024067498A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007124014A2 (en) * 2006-04-19 2007-11-01 Swope John M System for position and velocity sense and control of an aircraft
CN103913588B (en) * 2014-04-10 2016-06-22 深圳市大疆创新科技有限公司 The measuring method of the flight parameter of unmanned vehicle and device
WO2015154286A1 (en) * 2014-04-10 2015-10-15 深圳市大疆创新科技有限公司 Method and device for measuring flight parameters of unmanned aircraft
JP2017081246A (en) * 2015-10-23 2017-05-18 ソニー株式会社 Flight control device, flight control method, multicopter, and program
CN205539050U (en) * 2016-03-15 2016-08-31 深圳市高巨创新科技开发有限公司 Real -time speed sensor of unmanned vehicles
CN106199039B (en) * 2016-07-06 2019-04-26 深圳市高巨创新科技开发有限公司 A kind of unmanned plane speed monitoring method and system

Also Published As

Publication number Publication date
WO2024067498A1 (en) 2024-04-04

Similar Documents

Publication Publication Date Title
US10303185B2 (en) Multi-camera system and method of use
WO2018023492A1 (en) Mount control method and system
US20210133996A1 (en) Techniques for motion-based automatic image capture
Gandhi et al. Detection of obstacles in the flight path of an aircraft
CN106873619B (en) Processing method of flight path of unmanned aerial vehicle
WO2017045116A1 (en) System and method for supporting smooth target following
WO2020107372A1 (en) Control method and apparatus for photographing device, and device and storage medium
CN110622091A (en) Cloud deck control method, device and system, computer storage medium and unmanned aerial vehicle
US20220086362A1 (en) Focusing method and apparatus, aerial camera and unmanned aerial vehicle
US11272105B2 (en) Image stabilization control method, photographing device and mobile platform
KR101769602B1 (en) Apparatus and method of position revision for hovering using optical flow and imu and ultrasonic sensor
WO2022021027A1 (en) Target tracking method and apparatus, unmanned aerial vehicle, system, and readable storage medium
WO2021081707A1 (en) Data processing method and apparatus, movable platform and computer-readable storage medium
US20210097696A1 (en) Motion estimation methods and mobile devices
CN114035598A (en) Visual swing angle detection and swing reduction method of multi-rotor-wing hanging system
WO2020024182A1 (en) Parameter processing method and apparatus, camera device and aircraft
WO2021081958A1 (en) Terrain detection method, movable platform, control device, system, and storage medium
WO2020019175A1 (en) Image processing method and apparatus, and photographing device and unmanned aerial vehicle
WO2021013143A1 (en) Apparatus, photgraphic apparatus, movable body, method, and program
WO2021217450A1 (en) Target tracking method and device, and storage medium
WO2019205103A1 (en) Pan-tilt orientation correction method, pan-tilt orientation correction apparatus, pan-tilt, pan-tilt system, and unmanned aerial vehicle
JP6405344B2 (en) Mobile object, obstacle detection method for mobile object, and obstacle detection program for mobile object
WO2021056411A1 (en) Air route adjustment method, ground end device, unmanned aerial vehicle, system, and storage medium
CN117826879A (en) Method and device for monitoring speed of aircraft, storage medium and aircraft
Qin et al. Visual-based tracking and control algorithm design for quadcopter UAV

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination