WO2024067473A1 - Aircraft speed monitoring method and apparatus, storage medium, and aircraft - Google Patents

Aircraft speed monitoring method and apparatus, storage medium, and aircraft

Info

Publication number
WO2024067473A1
WO2024067473A1 PCT/CN2023/121073 CN2023121073W WO2024067473A1 WO 2024067473 A1 WO2024067473 A1 WO 2024067473A1 CN 2023121073 W CN2023121073 W CN 2023121073W WO 2024067473 A1 WO2024067473 A1 WO 2024067473A1
Authority
WO
WIPO (PCT)
Prior art keywords
speed
aircraft
current
video image
ground
Prior art date
Application number
PCT/CN2023/121073
Other languages
English (en)
French (fr)
Inventor
赖东东
谭明朗
谢亮
付伟
Original Assignee
影石创新科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 影石创新科技股份有限公司
Publication of WO2024067473A1 publication Critical patent/WO2024067473A1/zh


Classifications

    • G05D1/249
    • G05D1/46

Definitions

  • the present application relates to the field of aircraft technology, and in particular to an aircraft speed monitoring method, device, storage medium and aircraft.
  • aircraft are widely used in cruising, monitoring, rescue, aerial photography and other aspects, providing many conveniences for people's lives.
  • the flight speed of the aircraft is controlled by the flight controller, and the flight controller senses the aircraft's flight altitude, flight speed and other information to adjust the aircraft's flight attitude according to a pre-set flight plan, thereby implementing the flight plan.
  • the flight speed sensed by the flight controller is only a relatively coarse estimate.
  • the embodiments of the present application provide a method, device, storage medium and aircraft for monitoring the speed of an aircraft, which can more accurately monitor the flight speed of the aircraft.
  • an embodiment of the present application provides a method for monitoring the speed of an aircraft, the method comprising:
  • a second relative height of the aircraft relative to the ground at a second shooting moment of an adjacent video image is obtained, and a current vertical speed of the aircraft at the first shooting moment is determined according to the first relative height and the second relative height.
  • an embodiment of the present application further provides a speed monitoring device for an aircraft, comprising:
  • a data acquisition module used to acquire a current video image and its adjacent video images taken by the aircraft during flight, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image;
  • a rotation angle measurement module, used to obtain the rotation angle between the current video image and the adjacent video image;
  • a horizontal speed monitoring module used to determine the current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first relative height to the ground, the current video image and the adjacent video images;
  • the vertical speed monitoring module is used to obtain the second relative ground height of the aircraft at the second shooting moment of the adjacent video image, and determine the current vertical speed of the aircraft at the first shooting moment according to the first relative ground height and the second relative ground height.
  • an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored.
  • when the computer program is run on a computer, the computer executes the aircraft speed monitoring method provided in any embodiment of the present application.
  • an embodiment of the present application further provides an aircraft, comprising a main body, a processor, an optical flow sensor, and a distance sensor, wherein the optical flow sensor and the distance sensor are arranged at the bottom of the main body, and the processor is configured to execute a speed monitoring method for an aircraft as provided in any embodiment of the present application.
  • the technical solution provided by the embodiments of the present application obtains the current attitude data, the first relative height to the ground, and the current video image captured at the first shooting moment while the aircraft is flying, and obtains the rotation angle between the current video image and its adjacent video image, where the rotation angle can characterize the relative rotation between the current video image and the adjacent video image.
  • the current attitude data, the first relative height of the ground, and the rotation angle that affect the flight attitude of the aircraft are used as elimination factors to reduce their influence on the horizontal speed monitoring.
  • the current horizontal speed of the aircraft at the first shooting moment is determined according to the current video image and the adjacent video image, which can improve the accuracy of the current horizontal speed monitoring.
  • the second relative height of the ground of the aircraft at the second shooting moment of the adjacent video image is also obtained, wherein the second relative height of the ground and the first relative height of the ground can accurately characterize the height of the aircraft relative to the ground in the vertical direction, and then the current vertical speed of the aircraft at the first shooting moment is determined by the first relative height of the ground and the second relative height of the ground, and the current vertical speed obtained is more accurate. Furthermore, by providing a more accurate speed monitoring method, the horizontal speed and vertical speed of the aircraft are accurately monitored, so as to facilitate the control of the flight attitude of the aircraft and better implement the flight plan.
  • FIG1 is a schematic diagram of an application scenario of a method for monitoring the speed of an aircraft provided in an embodiment of the present application.
  • FIG2 is a schematic flow chart of a method for monitoring the speed of an aircraft according to an embodiment of the present application.
  • FIG3 is a comparative schematic diagram of distance value deviation caused by flight attitude change provided in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of extracting first feature points and second feature points provided in an embodiment of the present application.
  • FIG5 is a schematic diagram of determining ground-relative projection coordinates provided in an embodiment of the present application.
  • FIG6 is a schematic diagram of the structure of a speed monitoring device for an aircraft provided in an embodiment of the present application.
  • FIG. 1 is a schematic diagram of an application scenario of the speed monitoring method of an aircraft provided in an embodiment of the present application.
  • a drone is taken as an example of an aircraft.
  • a distance sensor is provided on the drone, and the distance sensor is used to detect the distance value from the drone to the ground, that is, the flight altitude of the drone.
  • an optical flow sensor is also provided on the drone; the optical flow sensor is used to continuously capture video images of the area below the drone within its shooting range.
  • An inertial measurement unit is also provided on the drone, and the inertial measurement unit is used to detect the attitude data of the drone during flight.
  • the horizontal speed of the drone during flight is calculated according to the flight altitude detected by the distance sensor, the video image detected by the optical flow sensor, and the attitude data detected by the inertial measurement unit.
  • the flight speed of the drone also includes a vertical speed, and the vertical speed of the drone during flight is calculated according to the flight altitude detected by the distance sensor and the attitude data detected by the inertial measurement unit.
  • the subject may be the speed monitoring device for an aircraft provided in an embodiment of the present application, or an aircraft in which the speed monitoring device for an aircraft is integrated.
  • the speed monitoring device for an aircraft may be implemented in hardware or software, and the aircraft includes but is not limited to drones, balloons, airplanes, gliders, helicopters, etc.
  • FIG. 2 is a schematic diagram of the process flow of the aircraft speed monitoring method provided in the embodiment of the present application.
  • the specific process flow of the aircraft speed monitoring method provided in the embodiment of the present application can be as follows:
  • the preset frequency can be a shooting frequency set by the aircraft system by default, a shooting frequency set by the user, or a shooting frequency predicted according to the user's operating habits.
  • the preset frequency can also be adjusted according to the flight conditions of the aircraft.
  • the flight speed is used as the basis for adjusting the preset frequency.
  • when the aircraft flies at a higher flight speed, a higher preset frequency is set; when the aircraft flies at a lower flight speed, a lower preset frequency is set.
  • the video content shot by the aircraft can be reasonably connected, which is convenient for calculating the flight speed of the aircraft based on the captured video images.
  • the flight altitude can also be used as the basis for adjusting the preset frequency, and the flight altitude is inversely proportional to the preset frequency.
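As a rough illustration of how such an adaptive shooting frequency could behave, the following sketch raises the frequency with flight speed and lowers it with flight altitude. All names (`preset_frequency`, `base_freq`, `speed_gain`, `ref_altitude`) and the specific gains are illustrative assumptions, not values from this application:

```python
def preset_frequency(flight_speed, flight_altitude,
                     base_freq=30.0, speed_gain=2.0, ref_altitude=5.0):
    """Illustrative adaptive shooting frequency (Hz).

    Frequency rises with flight speed (m/s) and falls as flight
    altitude (m) grows beyond ref_altitude, reflecting that the same
    pixel motion covers more ground at higher altitude.
    """
    speed_term = 1.0 + speed_gain * flight_speed
    # Inversely proportional to altitude once above the reference altitude.
    altitude_term = ref_altitude / max(flight_altitude, ref_altitude)
    return base_freq * speed_term * altitude_term
```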
  • the current video image captured at the first shooting moment refers to a frame of video image captured in real time at the current moment.
  • the adjacent video image refers to a frame of video image in the historical video images captured before the first shooting moment. It can be understood that the adjacent video image is the frame captured n frames before the current video image.
  • the flight speed of the aircraft is calculated every time the aircraft captures n frames of video images, where n is a positive integer.
  • the value of n can also be adjusted according to the flight situation of the aircraft, for example, according to the speed of change of the flight speed, the flight attitude, etc. When adjusting according to the speed of change of the flight speed, when the flight speed changes faster, a smaller value of n can be selected, and when the flight speed changes slower, a larger value of n can be selected.
  • attitude data refers to data related to the flight attitude of the aircraft, such as acceleration and angular velocity.
  • the acceleration can be detected by devices such as accelerometers, and the angular velocity can be detected by devices such as gyroscopes.
  • the acceleration and angular velocity of the aircraft can also be detected by an inertial measurement unit (also known as IMU) that integrates angular velocity and acceleration detection functions.
  • the real-time attitude of the aircraft can be solved by algorithms such as extended Kalman filtering and complementary filtering. It can be understood that the attitude data can include data such as acceleration and angular velocity, as well as the solved real-time attitude.
  • there are many ways to detect the flight altitude of the aircraft, such as detecting it with a distance sensor provided on the aircraft, or with an ultrasonic distance measuring device provided on the aircraft. It can be understood that any method that can detect the flight altitude of the aircraft can be applied to the embodiments of the present application.
  • the method of detecting the flight altitude based on the distance sensor is taken as an example.
  • the distance sensor detects the distance value between the aircraft and the ground, and regards the distance value as the flight altitude of the aircraft.
  • the flight altitude detected in real time can be used as the first relative height of the aircraft to the ground at the first shooting moment.
  • the shooting moment of the current video image is referred to as the first shooting moment
  • the shooting moment of the adjacent video image is referred to as the second shooting moment
  • the flight angle of the aircraft at the first shooting moment is referred to as the current flight angle
  • the flight angle of the aircraft at the second shooting moment is referred to as the adjacent flight angle, wherein the flight angle is relative to the horizontal direction (i.e., the direction parallel to the ground).
  • the angle difference between the current flight angle and the adjacent flight angle is calculated, and the angle difference is used as the rotation angle between the current video image and the adjacent video image.
  • the flight angle can be determined by the angular velocity in the attitude data. Specifically, by obtaining a first angular velocity of the aircraft at a first shooting moment and a second angular velocity of the aircraft at a second shooting moment, the first angular velocity and the second angular velocity are integrated to obtain an angle difference.
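The angle difference described above can be sketched as a numerical integration of gyroscope samples between the two shooting moments. The trapezoidal scheme and the uniform sampling interval are assumptions of this sketch; the application does not specify an integration method:

```python
def rotation_angle(gyro_samples, dt):
    """Estimate the rotation angle (rad) between the adjacent and the
    current video image by integrating angular-velocity samples (rad/s)
    recorded between the second and the first shooting moments.

    Uses trapezoidal integration over uniformly spaced samples.
    """
    angle = 0.0
    for w0, w1 in zip(gyro_samples, gyro_samples[1:]):
        angle += 0.5 * (w0 + w1) * dt  # area of one trapezoid
    return angle
```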
  • the optical flow vector between the current video image and the adjacent video image is calculated, and the optical flow vector is scaled down in the horizontal direction (from the video image perspective to the aircraft perspective), so that the scaled optical flow vector is used as the horizontal speed of the aircraft at the first shooting moment.
  • the horizontal speed of the aircraft at the first shooting moment is compensated by introducing the rotation angle, the current posture data, and the first relative height of the ground, so as to obtain the horizontal speed of the aircraft at the first shooting moment.
  • in this way, the influence of the flight attitude at the first shooting moment on the current horizontal speed is taken into account, making the calculated current horizontal speed more accurate.
  • the vertical speed of the aircraft in the vertical direction is also calculated so as to determine the comprehensive flight speed of the aircraft in combination with the horizontal speed.
  • the flight altitude detected by the aircraft at the second shooting moment may be referred to as the second relative ground altitude.
  • for example, by performing a differential operation on the first relative ground height and the second relative ground height, the current vertical speed of the aircraft at the first shooting moment can be obtained.
  • alternatively, by selecting the target vertical speed at the second shooting moment from the historical vertical speeds and combining it with the second relative ground height, the current vertical speed of the aircraft at the first shooting moment can also be estimated. Since there are many ways to calculate the current vertical speed, they are not listed here.
  • the flight height after the coordinate correction processing is used as the first ground relative height or the second ground relative height of the aircraft. That is, as an embodiment, obtaining the first ground relative height of the aircraft at the first shooting moment of the current video image includes:
  • the flight altitude is subjected to inclination correction processing according to the current attitude data to obtain a first ground relative altitude.
  • Figure 3 is a comparative schematic diagram of the distance value deviation caused by the change of flight attitude provided by the embodiment of the present application.
  • the dotted line L shows the central axis of the aircraft
  • the solid line M indicates the ground.
  • Figure 3 (a) shows a schematic diagram of the distance value detection when the aircraft is tilted relative to the horizontal direction. Since the distance sensor detects the distance value in its vertical direction, when the aircraft is tilted, the distance sensor detects the distance value in the tilted direction, and A represents the distance value.
  • Figure 3 (b) shows the actual distance value of the aircraft relative to the ground in the vertical direction when the aircraft is tilted, and B represents this distance value. It can be seen that when the aircraft is tilted, the detected distance value A is greater than the actual distance value B. To address this, the embodiment of the present application provides a coordinate correction processing scheme for the flight altitude.
  • the first angular velocity is determined, and the angular vector corresponding to the first angular velocity is determined.
  • the flight altitude is projected according to the inclination of the angular vector relative to the vertical direction, and the height value obtained by the projection is used as the first ground relative altitude.
  • the inclination of the aircraft relative to the vertical direction is determined, and then the flight altitude is projected in the vertical direction according to the inclination, and the height value obtained by the projection is used as the first ground relative altitude.
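The projection described above amounts to a cosine projection of the measured distance onto the vertical direction. A minimal sketch, assuming the inclination angle has already been derived from the angle vector:

```python
import math

def tilt_corrected_height(measured_distance, tilt_angle):
    """Project the distance measured along the tilted sensor axis
    (distance A in Fig. 3) onto the vertical direction to recover the
    relative height to the ground (distance B in Fig. 3).

    tilt_angle is the aircraft's inclination from the vertical, in
    radians; a pure cosine projection is assumed here.
    """
    return measured_distance * math.cos(tilt_angle)
```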
  • the flight altitude and the first relative height to the ground mentioned in this embodiment are further explained with reference to FIG3 .
  • the distance value A in FIG3 is used as the flight altitude
  • the distance value B is used as the first relative height to the ground.
  • the current vertical speed can be calculated as Vz = (Ht − Ht−Δt) / Δt, where Vz represents the current vertical speed, Ht represents the first relative height to the ground, Ht−Δt represents the second relative height to the ground, Δt represents the shooting interval between the two video images, and t represents the first shooting moment.
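Under this relation, the vertical-speed computation is a single finite difference. A minimal sketch:

```python
def vertical_speed(h_t, h_t_minus_dt, dt):
    """Finite-difference vertical speed Vz = (Ht - H(t-dt)) / dt, using
    the tilt-corrected first and second relative heights to the ground
    and the shooting interval dt between the two video images."""
    return (h_t - h_t_minus_dt) / dt
```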
  • the second relative height to the ground of the aircraft at the second shooting moment is obtained in the same way, based on the inclination correction method provided in this embodiment.
  • the current vertical speed is then calculated from the first relative height to the ground and the second relative height to the ground, and it can reflect the true speed of the aircraft in the vertical direction.
  • the present application is not limited by the execution order of the various steps described. If no conflict occurs, some steps can be performed in other orders or simultaneously.
  • the speed monitoring method of an aircraft determines the rotation parameters between the current video image and its adjacent images, as well as the current posture data of the aircraft at the first shooting moment, the first relative height to the ground, and the second relative height to the ground of the aircraft at the second shooting moment.
  • the first shooting moment is the moment when the current video image is shot
  • the second shooting moment is the moment when the adjacent image is shot.
  • the optical flow vector between the current video image and the adjacent video image is calculated in combination with the rotation parameters and the current posture data.
  • the first ground relative height, which can represent the true height of the aircraft, is used to restore the scale of the optical flow vector, so as to obtain an accurate current horizontal speed.
  • the current vertical speed is calculated according to the first relative height of the ground and the second relative height of the ground, and the current vertical speed can accurately represent the real flight speed of the aircraft in the vertical direction.
  • determining the current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video images includes:
  • the current horizontal speed of the aircraft at the first shooting moment is determined according to the optical flow vector and the first relative height to the ground.
  • FIG. 4 is a schematic diagram of extracting the first feature point and the second feature point provided in an embodiment of the present application.
  • the same element in the current video image is represented by the first feature point, as shown in FIG. 4(a).
  • the same element in the adjacent video image is represented by the second feature point, as shown in FIG. 4(b).
  • the feature points constituting the same element in the current video image are called the first feature points
  • the feature points constituting the same element in the adjacent video image are called the second feature points. It can be understood that when there are many identical elements in the adjacent video image and the current video image, there are multiple first feature points and second feature points corresponding to the same elements. If the first feature point and the second feature point corresponding to the same element form a feature pair, there may be multiple feature pairs. When the first feature points and the second feature points are subjected to coordinate correction processing, one or more feature pairs may be selected from the multiple feature pairs for processing, and the number of selected pairs may be set according to actual needs, which is not limited here.
  • the optical flow vector is first determined according to the first coordinate value and the second coordinate value.
  • the expression is as follows: V_L = f · H_t / Δt, where f represents the optical flow vector, Δt represents the shooting interval between the two video images, t represents the first shooting moment, V_L represents the current horizontal speed, and H_t represents the first relative height to the ground.
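The scaling from optical flow to metric horizontal speed can be sketched per axis as follows; the tuple-based interface and variable names are illustrative assumptions:

```python
def horizontal_speed(flow_vector, height, dt):
    """Scale the rotation-compensated optical-flow vector back to a
    metric horizontal speed, V_L = f * Ht / dt.

    flow_vector holds the per-axis flow (fx, fy) after coordinate
    correction; height is the first relative height to the ground and
    dt the shooting interval. Returns (V_Lx, V_Ly)."""
    fx, fy = flow_vector
    return fx * height / dt, fy * height / dt
```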
  • coordinate correction processing is performed on the first feature point and the second feature point according to the rotation angle and the current posture data to obtain a first coordinate value of the first feature point and a second coordinate value of the second feature point, including:
  • the fifth coordinate value is converted into the first coordinate value in the world coordinate system
  • the sixth coordinate value is converted into a second coordinate value in the world coordinate system.
  • the first feature point in the current video image is represented by a third coordinate value in the image coordinate system
  • the second feature point in the adjacent video image is represented by a fourth coordinate value in the image coordinate system, wherein the image coordinate system is a two-dimensional coordinate system.
  • the third coordinate value is converted into a first coordinate value in the world coordinate system
  • the fourth coordinate value is converted into a second coordinate value in the world coordinate system.
  • a series of coordinate transformations include: image coordinate system ⁇ 3D spherical coordinate system ⁇ world coordinate system.
  • the image coordinate system indicates the camera imaging plane
  • the 3D spherical coordinate system indicates the 3D space
  • the world coordinate system indicates the real environment space.
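The chain image coordinate system → 3D spherical coordinate system → world coordinate system can be sketched as below. A pinhole camera model with intrinsics (fx, fy, cx, cy) is assumed, since the application names the chain but not the camera model; `R_w_i` and `R_i_c` follow the notation used in the text:

```python
def mat_vec(R, v):
    """3x3 matrix times 3-vector (row-major nested lists)."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def image_to_world(pixel, fx, fy, cx, cy, R_w_i, R_i_c):
    """Sketch of the transform chain image -> 3D spherical -> world.

    R_w_i is the real-time attitude (IMU-to-world rotation) and R_i_c
    the calibrated camera-to-IMU rotational extrinsic; the pinhole
    intrinsics are an assumption of this sketch.
    """
    u, v = pixel
    # Image plane -> ray in the camera frame, normalized onto the unit sphere.
    ray = [(u - cx) / fx, (v - cy) / fy, 1.0]
    norm = sum(c * c for c in ray) ** 0.5
    sphere = [c / norm for c in ray]          # 3D spherical coordinates
    # Camera frame -> IMU frame -> world frame.
    return mat_vec(R_w_i, mat_vec(R_i_c, sphere))
```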
  • in the above conversion, the subscript m = 1, 2, 3, ..., n indexes the first feature points, where n is a positive integer; the corresponding symbols denote the third coordinate value (in the image coordinate system) and the fifth coordinate value (in the 3D spherical coordinate system).
  • R w_i represents the real-time attitude of the aircraft at time t
  • R i_c represents the calibrated rotational extrinsic parameter between the optical flow sensor and the inertial measurement unit.
  • R ⁇ t represents the rotation angle between the current video image and the adjacent video image.
  • the current horizontal velocity V L also includes a component horizontal velocity V Lx on the x-axis and a component horizontal velocity V Ly on the y-axis.
  • the component horizontal velocities V_Lx and V_Ly are obtained by applying the same scaling to the x-axis and y-axis components of the optical flow vector.
  • the current horizontal velocity V_L can then be calculated from the component horizontal velocities as V_L = √(V_Lx² + V_Ly²).
  • extracting a first feature point in a current video image and a second feature point in an adjacent video image that matches the first feature point includes:
  • a first feature point is extracted from the current video image, and a second feature point matching the first feature point is extracted from an adjacent video image.
  • the camera optical axis is the central axis of the optical flow sensor in this embodiment.
  • the camera optical axis of the aircraft is perpendicular to the plane where the current video image is located.
  • the intersection of the camera optical axis and the ground is called the ground intersection point.
  • the projection pixel point of the ground intersection point in the current video image is calculated based on the current posture data.
  • the pixel position of the projected pixel point is called the ground relative projection coordinate.
  • the first feature point is extracted near the ground relative projection coordinate.
  • Figure 5 is a schematic diagram of determining the ground relative projection coordinate provided in an embodiment of the present application.
  • Figure 5 (a) shows the current video image taken by the aircraft, and the intersection of the aircraft's camera optical axis and the ground
  • Figure 5 (b) shows the ground relative projection coordinates of the aircraft's camera optical axis in the current video image.
  • an area can be first selected according to the ground relative projection coordinates, and then the first feature point can be extracted from the area.
  • the size of the area is not limited here and can be set according to actual needs.
  • a second feature point matching the first feature point can be extracted from the adjacent image.
  • the ground-relative projection coordinates of the camera optical axis in the current video image are calculated according to the current posture data, and the first feature point is extracted based on the ground-relative projection coordinates.
  • the first feature point obtained in this way can accurately characterize the horizontal speed of the aircraft.
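The step of extracting feature points near the ground-relative projection coordinate can be sketched as clamping a region of interest around that coordinate; the region size and any corner detector run inside it (e.g. OpenCV's `goodFeaturesToTrack`) are illustrative assumptions:

```python
def roi_around(projection, image_shape, half_size=32):
    """Clamp a square region of interest around the ground-relative
    projection coordinate; the first feature points are then extracted
    inside this region with a corner detector of choice.

    half_size is an illustrative default, not a value from this
    application. image_shape is (height, width); returns (x0, y0, x1, y1).
    """
    px, py = projection
    h, w = image_shape
    x0 = max(int(px) - half_size, 0)
    y0 = max(int(py) - half_size, 0)
    x1 = min(int(px) + half_size, w)
    y1 = min(int(py) + half_size, h)
    return x0, y0, x1, y1
```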
  • the ground relative projection coordinates of the aircraft's camera optical axis in the current video image are determined according to the aircraft's posture data at the second shooting moment. Then, an area is determined based on these two ground relative projection coordinates, so as to extract the first feature point from the current video image and the second feature point from the adjacent video image according to the area, so as to obtain more accurate feature point data.
  • the area includes two ground relative projection coordinates, or the area is located between the two ground relative projection coordinates, which can be set according to actual needs and is not limited here.
  • obtaining a rotation angle between a current video image and an adjacent video image includes:
  • a rotation angle between a current video image and an adjacent video image is determined according to a plurality of angular velocities.
  • a plurality of continuous angular velocities of the aircraft between the first shooting moment and the second shooting moment are obtained, and then the plurality of continuous angular velocities are integrated to obtain the rotation angle.
  • the plurality of angular velocities include the first angular velocity and the second angular velocity.
  • after obtaining the second relative height of the aircraft to the ground at the second shooting moment of the adjacent video image, and determining the current vertical speed of the aircraft at the first shooting moment according to the first relative height to the ground and the second relative height to the ground, the method further includes:
  • the current speed pair is corrected according to the predicted speed pair to obtain a corrected speed pair.
  • the speed pair corresponding to the current video image is predicted based on the adjacent video image and its corresponding historical speed pair, as well as the current posture data.
  • One posture data corresponds to one speed pair, and based on the historical speed pair, the angular velocity and acceleration in the historical posture data and the current posture data are integrated to predict the speed pair corresponding to the current video image, wherein the predicted speed pair is called a predicted speed pair. Specifically, based on the historical speed pair at the second shooting moment, the angular velocity and acceleration between the second shooting moment and the first shooting moment are integrated to obtain the predicted speed pair at the first shooting moment.
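The prediction step described above can be sketched as integrating acceleration samples on top of the historical speed pair. The rectangular integration and the `(a_h, a_v)` tuple layout are assumptions of this sketch:

```python
def predict_speed_pair(hist_horizontal, hist_vertical, accel_samples, dt):
    """Predict the speed pair at the first shooting moment by integrating
    the (horizontal, vertical) acceleration samples recorded between the
    second and the first shooting moments on top of the historical
    speed pair. accel_samples is a list of (a_h, a_v) tuples."""
    vh, vv = hist_horizontal, hist_vertical
    for a_h, a_v in accel_samples:
        vh += a_h * dt  # rectangular integration per sample
        vv += a_v * dt
    return vh, vv
```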
  • the current speed pair is corrected based on the predicted speed pair.
  • the correction processing method can obtain an average value of the predicted speed pair and the current speed pair, and use the average value as the corrected current speed pair.
  • the predicted speed pair and the current speed pair can be weighted averaged to obtain the corrected current speed pair.
  • the weights can be set for the predicted speed pair and the current speed pair, respectively.
  • the user can set the weights by themselves, or the weights can be determined based on the time difference between the two frames of video images and the noise of the inertial measurement unit, wherein the weights are inversely proportional to the time difference and the noise.
  • the predicted speed pair and the current speed pair may be fused by a Kalman filter algorithm, and the fused value may be used as the corrected speed pair.
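Of the correction options above, the weighted average can be sketched as follows; the default weights are illustrative, since the text leaves them user-set or derived from the inter-frame time difference and IMU noise:

```python
def fuse_speed_pairs(predicted, current, w_pred=0.4, w_cur=0.6):
    """Correct the current speed pair with the predicted speed pair via
    a weighted average. The default weights are illustrative; per the
    text, larger time differences or IMU noise would lower the weight
    given to the prediction."""
    total = w_pred + w_cur
    return tuple((w_pred * p + w_cur * c) / total
                 for p, c in zip(predicted, current))
```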
  • the current speed pair is corrected according to the predicted speed pair to obtain the corrected speed pair, including:
  • the current candidate speed pair is corrected according to the predicted candidate speed pair to obtain a corrected speed pair.
  • by filtering the current speed pair to obtain the current candidate speed pair, the noise of the current speed pair can be reduced, thereby optimizing it. Similarly, by filtering out high-frequency components of the predicted speed pair to obtain a smooth predicted candidate speed pair, the hysteresis of the current speed pair can be reduced.
  • the correction process of the current candidate speed pair according to the predicted candidate speed pair may be performed by calculating an average value or a weighted average value.
  • the method further includes:
  • the target control strategy of the visual-inertial system is determined according to the speed difference information and executed.
  • the visual flight speed may include a flight speed in a horizontal direction and a flight speed in a vertical direction.
  • the flight speed in the horizontal direction is compared with the current horizontal speed
  • the flight speed in the vertical direction is compared with the current vertical speed to obtain speed difference information.
  • control strategies of the visual inertial system include but are not limited to: initialization strategy, restart strategy, speed replacement strategy, etc.
  • one of the multiple control strategies can be selected as the target control strategy according to the speed difference information.
  • the restart strategy is used as the target control strategy to control the visual inertial system to restart.
  • the initialization strategy is used as the target control strategy to control the initialization of the visual inertial system.
  • the current horizontal speed and the current vertical speed are used as the initial visual flight speed of the visual inertial system.
  • the speed replacement strategy is used as the target control strategy, and the current horizontal speed and the current vertical speed are used as the speed input of the flight controller.
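The strategy selection from the speed difference information can be sketched as simple thresholding. The threshold values and the exact mapping to strategies are illustrative assumptions:

```python
def select_control_strategy(speed_diff, replace_th=0.5, restart_th=2.0):
    """Map the difference between the visual-inertial system's flight
    speed and the monitored speed to a control strategy.

    Thresholds (m/s) and the mapping are illustrative; the text only
    states that one of several strategies (speed replacement,
    initialization, restart, ...) is selected from the difference."""
    if speed_diff >= restart_th:
        return "restart"
    if speed_diff >= replace_th:
        return "speed_replacement"
    return "keep"
```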
  • fixed-point flight includes hovering.
  • the speed monitoring method for an aircraft proposed in the embodiments of the present application can eliminate the influence of changes in the aircraft's flight attitude on the monitoring of the current horizontal speed and the current vertical speed, so as to monitor the aircraft's current horizontal and vertical speeds more accurately and in real time. Moreover, after the current horizontal speed and the current vertical speed are obtained, correcting both simultaneously not only avoids error accumulation but also reduces their noise and narrows the difference from historical data, yielding corrected current horizontal and vertical speeds of higher accuracy.
  • the current horizontal speed and the current vertical speed are also used as reference conditions for the flight control of the aircraft, so that the aircraft can efficiently implement the flight plan.
  • the visual inertial system is also controlled by means of the current horizontal speed and the current vertical speed, which effectively avoids situations where the visual inertial system produces invalid detections or risks crashing.
  • a speed monitoring device 200 for an aircraft is also provided. Please refer to FIG. 6 , which is a schematic diagram of the structure of the speed monitoring device 200 for an aircraft provided in an embodiment of the present application.
  • the speed monitoring device 200 for an aircraft is applied to an aircraft, and the speed monitoring device 200 for an aircraft includes:
  • the data acquisition module 201 is used to acquire the current video image and its adjacent video images taken when the aircraft is flying, as well as the current attitude data and the first ground relative height of the aircraft at the first shooting moment of the current video image;
  • a rotation angle calculation module 202 is used to obtain the rotation angle between the current video image and the adjacent video image
  • a horizontal speed monitoring module 203 is used to determine the current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first relative height to the ground, the current video image and the adjacent video images;
  • the vertical speed monitoring module 204 is used to obtain a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determine the current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.
  • the horizontal speed monitoring module 203 is further configured to:
  • the current horizontal speed of the aircraft at the first shooting moment is determined according to the optical flow vector and the first relative height to the ground.
  • the horizontal speed monitoring module 203 is further configured to:
  • the fifth coordinate value is converted into the first coordinate value in the world coordinate system
  • the sixth coordinate value is converted into a second coordinate value in the world coordinate system.
  • the horizontal speed monitoring module 203 is further configured to:
  • a first feature point is extracted from the current video image according to the ground relative projection coordinates, and a second feature point matching the first feature point is extracted from an adjacent video image.
  • the rotation angle calculation module 202 is further used to:
  • a rotation angle between a current video image and an adjacent video image is determined according to a plurality of angular velocities.
  • the aircraft speed monitoring device 200 further includes an altitude correction module for:
  • the flight altitude is subjected to inclination correction processing according to the current attitude data to obtain a first ground relative altitude.
  • the speed monitoring device 200 further includes a speed correction module for:
  • the current speed pair is corrected according to the predicted speed pair to obtain a corrected speed pair.
  • the speed correction module is further configured to:
  • the current candidate speed pair is corrected according to the predicted candidate speed pair to obtain a corrected speed pair.
  • the aircraft speed monitoring device 200 further includes a visual inertial system control module for:
  • the target control strategy of the visual-inertial system is determined according to the speed difference information and executed.
  • the aircraft speed monitoring device 200 provided in the embodiment of the present application belongs to the same concept as the aircraft speed monitoring method in the above embodiment. Any method provided in the aircraft speed monitoring method embodiment can be implemented through the aircraft speed monitoring device 200. The specific implementation process is detailed in the aircraft speed monitoring method embodiment, which will not be repeated here.
  • the aircraft speed monitoring device 200 proposed in the embodiment of the present application can eliminate the influence of the change of the aircraft's flight attitude on the monitoring of the current horizontal speed and the current vertical speed, so as to more accurately monitor the aircraft's current horizontal speed and current vertical speed in real time. And, after obtaining the current horizontal speed and the current vertical speed, by simultaneously correcting both, on the basis of avoiding error accumulation, the noise of both is also reduced, and the difference with historical data is narrowed, so as to obtain the current horizontal speed and the current vertical speed after the correction processing with higher accuracy.
  • the current horizontal speed and the current vertical speed are also used as reference conditions for the flight control of the aircraft.
  • the visual inertial system is controlled by the current horizontal speed and the current vertical speed, which effectively avoids situations where the visual inertial system fails to detect or risks crashing.
  • the embodiment of the present application also provides an aircraft, which includes but is not limited to a drone, a balloon, an airplane, a glider, a helicopter, etc.
  • the aircraft includes a body, an optical flow sensor and a distance sensor, and a processor with one or more processing cores.
  • the optical flow sensor and the distance sensor are arranged at the bottom of the body, and both are respectively connected to the processor for communication.
  • the aircraft structure shown in the figure does not constitute a limitation of the aircraft, and may include more or fewer components than shown in the figure, or combine certain components, or arrange the components differently.
  • the processor is the control center of the aircraft. It uses various interfaces and lines to connect various parts of the entire aircraft, execute various functions of the aircraft and process data, thereby monitoring the aircraft as a whole.
  • the processor in the aircraft is configured to implement the following functions:
  • a second relative height of the aircraft relative to the ground at a second shooting moment of an adjacent video image is obtained, and a current vertical speed of the aircraft at the first shooting moment is determined according to the first relative height and the second relative height.
  • the aircraft provided by this embodiment can eliminate the influence of the change of the aircraft's flight attitude on the monitoring of the current horizontal speed and the current vertical speed, so as to more accurately monitor the current horizontal speed and the current vertical speed of the aircraft in real time. And, after obtaining the current horizontal speed and the current vertical speed, by simultaneously correcting the two, on the basis of avoiding error accumulation, the noise of the two is also reduced, and the difference with the historical data is narrowed, so as to obtain a more accurate The current horizontal speed and the current vertical speed after correction processing.
  • the current horizontal speed and the current vertical speed are used as reference conditions for the flight control of the aircraft, so that the aircraft can efficiently implement the flight plan.
  • the visual inertial system is also controlled by the current horizontal speed and the current vertical speed, which effectively avoids situations where the visual inertial system produces invalid detections or risks crashing.
  • the embodiment of the present application provides a computer-readable storage medium.
  • a person skilled in the art can understand that all or part of the steps in the above-mentioned embodiment method can be completed by instructing related hardware through a program.
  • the program can be stored in a computer-readable storage medium. When the program is executed, it includes the following steps:
  • a second relative height of the aircraft relative to the ground at a second shooting moment of an adjacent video image is obtained, and a current vertical speed of the aircraft at the first shooting moment is determined according to the first relative height and the second relative height.
  • the above-mentioned storage medium may be ROM/RAM, a magnetic disk, an optical disk, etc. Since the computer program stored in the storage medium can execute the steps in any of the aircraft speed monitoring methods provided in the embodiments of the present application, the beneficial effects that can be achieved by any of the aircraft speed monitoring methods provided in the embodiments of the present application can be achieved, as detailed in the previous embodiments, which will not be repeated here.

Abstract

A speed monitoring method and device for an aircraft, a storage medium, and an aircraft, capable of monitoring the horizontal speed and vertical speed of an aircraft in flight more accurately and in real time. The method comprises: acquiring a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image (step 101); acquiring a rotation angle between the current video image and the adjacent video image (step 102); determining a current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image (step 103); and acquiring a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determining a current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height (step 104).

Description

Speed Monitoring Method and Device for an Aircraft, Storage Medium, and Aircraft

Technical Field

The present application relates to the technical field of aircraft, and in particular to a speed monitoring method and device for an aircraft, a storage medium, and an aircraft.

Background

At present, aircraft are widely used in cruising, monitoring, rescue, aerial photography, and other applications, providing many conveniences for people's lives.

The flight speed of an aircraft is controlled by a flight controller. The flight controller senses information such as the aircraft's flight altitude and flight speed and adjusts the aircraft's flight attitude according to a preset flight plan, thereby carrying out that plan. However, the flight speed sensed by the flight controller is relatively coarse.
Summary

Embodiments of the present application provide a speed monitoring method and device for an aircraft, a storage medium, and an aircraft, capable of monitoring the flight speed of the aircraft more accurately.

In a first aspect, an embodiment of the present application provides a speed monitoring method for an aircraft, the method comprising:

acquiring a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image;

acquiring a rotation angle between the current video image and the adjacent video image;

determining a current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image;

acquiring a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determining a current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.

In a second aspect, an embodiment of the present application further provides a speed monitoring device for an aircraft, comprising:

a data acquisition module, configured to acquire a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image;

a rotation angle calculation module, configured to acquire a rotation angle between the current video image and the adjacent video image;

a horizontal speed monitoring module, configured to determine a current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image;

a vertical speed monitoring module, configured to acquire a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determine a current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.

In a third aspect, an embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when run on a computer, causes the computer to execute the speed monitoring method for an aircraft provided in any embodiment of the present application.

In a fourth aspect, an embodiment of the present application further provides an aircraft comprising a body, a processor, an optical flow sensor, and a distance sensor, the optical flow sensor and the distance sensor being arranged at the bottom of the body, the processor being configured to execute the speed monitoring method for an aircraft provided in any embodiment of the present application.

In the technical solutions provided by the embodiments of the present application, the current attitude data and the first ground relative height of the aircraft at the first shooting moment are acquired together with the current video image captured at that moment, as is the rotation angle between the current video image and its adjacent video image, where the rotation angle characterizes the relative rotation between the two images. The current attitude data, the first ground relative height, and the rotation angle, which reflect changes of the aircraft's flight attitude, are then used as elimination factors to reduce their influence on horizontal speed monitoring; on this basis, determining the current horizontal speed of the aircraft at the first shooting moment from the current video image and the adjacent video image improves the accuracy of horizontal speed monitoring. In addition, the second ground relative height of the aircraft at the second shooting moment of the adjacent video image is acquired; the second ground relative height and the first ground relative height accurately characterize the height of the aircraft above the ground in the vertical direction, so the current vertical speed determined at the first shooting moment from these two heights is more accurate. Furthermore, by providing a more accurate speed monitoring method, the horizontal and vertical speeds of the aircraft can be monitored precisely, which facilitates controlling the aircraft's flight attitude and better implementing the flight plan.
Brief Description of the Drawings

To explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.

FIG. 1 is a schematic diagram of an application scenario of the speed monitoring method for an aircraft provided in an embodiment of the present application.

FIG. 2 is a schematic flowchart of the speed monitoring method for an aircraft provided in an embodiment of the present application.

FIG. 3 is a comparative schematic diagram of distance-value deviation caused by a change of flight attitude, provided in an embodiment of the present application.

FIG. 4 is a schematic diagram of extracting first feature points and second feature points, provided in an embodiment of the present application.

FIG. 5 is a schematic diagram of determining ground relative projection coordinates, provided in an embodiment of the present application.

FIG. 6 is a schematic structural diagram of the speed monitoring device for an aircraft provided in an embodiment of the present application.
Detailed Description

The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. Based on the embodiments in the present application, all other embodiments obtained by those skilled in the art without creative effort fall within the scope of protection of the present application.

Reference herein to an "embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present application. The appearance of this phrase in various places in the specification does not necessarily refer to the same embodiment, nor to independent or alternative embodiments that are mutually exclusive with other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein can be combined with other embodiments.

To better understand the speed monitoring method for an aircraft provided by the embodiments of the present application, an application scenario is first described. Referring to FIG. 1, FIG. 1 is a schematic diagram of an application scenario of the speed monitoring method for an aircraft provided in an embodiment of the present application. In FIG. 1, a drone is taken as an example of the aircraft. The drone is provided with a distance sensor for detecting the distance value from the drone to the ground, that is, the flight altitude of the drone, and with an optical flow sensor for continuously capturing video images beneath the drone within its shooting range. The drone is also provided with an inertial measurement unit for detecting attitude data of the drone in flight. In the embodiments of the present application, the horizontal speed of the drone in flight is calculated from the flight altitude detected by the distance sensor, the video images captured by the optical flow sensor, and the attitude data detected by the inertial measurement unit. The flight speed of the drone also includes a vertical speed, which is calculated from the flight altitude detected by the distance sensor and the attitude data detected by the inertial measurement unit.

It can be understood that the execution subject of the speed monitoring method for an aircraft provided by the embodiments of the present application may be the speed monitoring device for an aircraft provided by the embodiments of the present application, or an aircraft integrating the speed monitoring device. The speed monitoring device may be implemented in hardware or software, and the aircraft includes but is not limited to a drone, a balloon, an airplane, a glider, a helicopter, and the like.

Referring to FIG. 2, FIG. 2 is a schematic flowchart of the speed monitoring method for an aircraft provided in an embodiment of the present application. The specific flow may be as follows:

101. Acquire a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image.

While flying, the aircraft shoots continuously within its shooting range at a preset frequency to obtain consecutive video images. The preset frequency may be a shooting frequency set by default in the aircraft system, a shooting frequency customized by the user, or a shooting frequency predicted from the user's operating habits.

Exemplarily, the preset frequency may also be adjusted according to the flight situation of the aircraft, with the flight speed used as the basis for adjustment. For example, there may be multiple preset frequencies: when the aircraft flies at a higher speed, a higher preset frequency is set; when it flies at a lower speed, a lower one. In this way, the captured video content connects reasonably, making it convenient to calculate the aircraft's flight speed from the captured video images. Of course, the flight altitude may also be used as the basis for adjustment, the preset frequency being inversely proportional to the flight altitude. It can be understood that there are many ways to adjust the preset frequency according to the flight situation, which are not enumerated here; it suffices to note that in this way more valid video images are obtained for calculating the flight speed, improving the efficiency of flight speed monitoring.

The current video image captured at the first shooting moment refers to a frame of video captured in real time at the current moment, while the adjacent video image refers to a certain frame among the historical video images captured before the first shooting moment. It can be understood that the adjacent video image is the n-th frame preceding the current video image; in this case, the aircraft calculates its flight speed once every n captured frames, where n is a positive integer. The value of n may also be adjusted according to the flight situation, for example, according to how quickly the flight speed changes or according to the flight attitude. When adjusting according to the rate of change of flight speed, a smaller n may be chosen when the flight speed changes quickly, and a larger n when it changes slowly.

Attitude data refers to data related to the flight attitude of the aircraft, such as acceleration and angular velocity. Acceleration can be detected by devices such as an accelerometer, and angular velocity by devices such as a gyroscope. The aircraft's acceleration and angular velocity can also be detected by an inertial measurement unit (IMU) integrating both detection functions. Once the acceleration and angular velocity are obtained, the real-time attitude of the aircraft can be solved by algorithms such as extended Kalman filtering or complementary filtering. It can be understood that the attitude data may include the acceleration and angular velocity data as well as the solved real-time attitude.

In this embodiment, there are many ways to detect the flight altitude of the aircraft, for example, by a distance sensor arranged on the aircraft or by an ultrasonic ranging device arranged on the aircraft. It can be understood that any method capable of detecting the flight altitude can be applied in the embodiments of the present application. Taking detection based on a distance sensor as an example, the distance sensor detects the distance value between the aircraft and the ground and treats that distance value as the flight altitude. The flight altitude detected in real time can serve as the first ground relative height of the aircraft at the first shooting moment.
102. Acquire a rotation angle between the current video image and the adjacent video image.

In this embodiment, the moment at which the current video image is captured is called the first shooting moment, and the moment at which the adjacent video image is captured is called the second shooting moment. The flight angle of the aircraft at the first shooting moment is called the current flight angle, and that at the second shooting moment the adjacent flight angle, where the flight angle is measured relative to the horizontal direction (i.e., the direction parallel to the ground). The angle difference between the current flight angle and the adjacent flight angle is calculated and taken as the rotation angle between the current video image and the adjacent video image.

Exemplarily, the flight angle can be determined from the angular velocity in the attitude data. Specifically, a first angular velocity of the aircraft at the first shooting moment and a second angular velocity at the second shooting moment are acquired, and the first and second angular velocities are integrated to obtain the angle difference.

103. Determine the current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image.

Here, the optical flow vector between the current video image and the adjacent video image is calculated, and the optical flow vector is scale-restored in the horizontal direction (restored from the video-image perspective to the aircraft perspective), the scale-restored optical flow vector being taken as the horizontal speed of the aircraft at the first shooting moment.

In this embodiment, data such as the rotation angle, the current attitude data, and the first ground relative height are introduced to compensate the horizontal speed of the aircraft at the first shooting moment, yielding the current horizontal speed. The influence of aircraft rotation on optical flow estimation is thereby taken into account, making the calculated current horizontal speed more accurate.

104. Acquire a second ground relative height of the aircraft at the second shooting moment of the adjacent video image, and determine the current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.

In this embodiment, the vertical speed of the aircraft in the vertical direction (i.e., the direction of gravity) is also calculated, so that the aircraft's overall flight speed can be determined in combination with the horizontal speed.

Exemplarily, the flight altitude detected by the aircraft at the second shooting moment may be called the second ground relative height. There are many ways to determine the current vertical speed. For example, by differencing the first ground relative height and the second ground relative height, the current vertical speed of the aircraft at the first shooting moment is obtained. As another example, a target vertical speed at the second shooting moment is obtained from the historical vertical speeds, and the current vertical speed of the aircraft at the second shooting moment is obtained by dividing the second ground relative height by the target vertical speed. Since there are many ways to calculate the current vertical speed, they are not enumerated here.
As above, the flight altitude of the aircraft may also be subjected to coordinate correction processing, and the corrected flight altitude taken as the above-mentioned first ground relative height or second ground relative height. That is, as an embodiment, acquiring the first ground relative height of the aircraft at the first shooting moment of the current video image includes:

acquiring the flight altitude of the aircraft at the first shooting moment of the current video image;

performing tilt correction processing on the flight altitude according to the current attitude data to obtain the first ground relative height.

When the aircraft is flying, the distance value detected by the distance sensor deviates due to changes of the aircraft's flight attitude. Referring to FIG. 3, FIG. 3 is a comparative schematic diagram of distance-value deviation caused by a change of flight attitude. The dashed line L shows the central axis of the aircraft and the solid line M indicates the ground. FIG. 3(a) illustrates distance detection while the aircraft flies tilted relative to the horizontal direction: since the distance sensor measures along its own perpendicular direction, when the aircraft is tilted the sensor measures along the tilted direction, the resulting distance value being denoted A. FIG. 3(b) shows the actual distance of the tilted aircraft from the ground in the vertical direction, denoted B. It can be seen that distance value A is greater than distance value B when the aircraft is tilted; for this situation, the embodiments of the present application provide a scheme of coordinate correction processing for the flight altitude.

There are many ways to perform tilt correction on the flight altitude. For example, from the first angular velocity mentioned above, the angle vector corresponding to the first angular velocity is determined, and the flight altitude is projected according to the inclination of that angle vector relative to the vertical direction, the projected height value being taken as the first ground relative height. As another example, the inclination of the aircraft relative to the vertical direction is determined from the real-time attitude mentioned above, and the flight altitude is projected onto the vertical direction according to that inclination, the projected height value being taken as the first ground relative height. It can be understood that these two examples merely illustrate tilt correction of the flight altitude and do not limit the present application; more implementations exist and are not enumerated here.

Continuing with FIG. 3 to explain the flight altitude and the first ground relative height mentioned in this embodiment: distance value A in FIG. 3 serves as the flight altitude and distance value B as the first ground relative height. After tilt correction of the flight altitude yields the first ground relative height, the first ground relative height accurately characterizes the true distance of the aircraft from the ground.
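The tilt correction described above amounts to projecting the slant range measured along the sensor axis onto the vertical. A minimal sketch, assuming a downward body-fixed range sensor and flat terrain (the exact projection model is not fixed by the text, so this is illustrative):

```python
import math

def tilt_corrected_height(measured_range: float, roll: float, pitch: float) -> float:
    """Project a slant range (measured along the tilted sensor axis) onto the
    vertical. roll/pitch are in radians; assumes a downward body-fixed sensor
    and flat terrain -- an illustrative model, not the patent's exact one."""
    # The vertical component of the tilted body z-axis is cos(roll)*cos(pitch),
    # so the true ground-relative height is the slant range scaled by it.
    return measured_range * math.cos(roll) * math.cos(pitch)

# A level aircraft reads the true height; a 30-degree roll reads long,
# matching distance value A > distance value B in FIG. 3.
h_level = tilt_corrected_height(10.0, 0.0, 0.0)
h_rolled = tilt_corrected_height(10.0, math.radians(30.0), 0.0)
```
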
As an embodiment, the current vertical speed can be obtained by differencing the first ground relative height and the second ground relative height, expressed as:

Vz = (Ht − Ht−Δt) / Δt

where Vz denotes the current vertical speed, Ht the first ground relative height, Ht−Δt the second ground relative height, Δt the shooting interval between the two frames, and t the first shooting moment.

In a further scheme, based on the first ground relative height provided in this embodiment and on the second ground relative height of the aircraft at the second shooting moment obtained by the tilt-correction processing provided in the embodiments of the present application: using the first ground relative height as one of the parameters for calculating the current horizontal speed eliminates the influence of the aircraft's flight attitude on the optical flow calculation and its scale restoration, yielding a more precise current horizontal speed. The current vertical speed is calculated from the first and second ground relative heights, and it reflects the true speed of the aircraft in the vertical direction.
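The height-differencing formula above translates directly into code; a small sketch (variable names are illustrative):

```python
def vertical_speed(h_t: float, h_prev: float, dt: float) -> float:
    """Vz = (Ht - H(t-dt)) / dt: difference two tilt-corrected ground-relative
    heights over the inter-frame interval dt (seconds)."""
    return (h_t - h_prev) / dt

# Climbing from 11 m to 12 m over a 0.5 s frame interval -> 2 m/s upward.
vz = vertical_speed(12.0, 11.0, 0.5)
```
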
In specific implementation, the present application is not limited by the described execution order of the steps; some steps may be performed in other orders or simultaneously where no conflict arises.

In the speed monitoring method for an aircraft provided by the embodiments of the present application, the rotation parameter between the current video image and its adjacent image is determined, along with the current attitude data and first ground relative height of the aircraft at the first shooting moment and the second ground relative height at the second shooting moment, where the first shooting moment is when the current video image is captured and the second shooting moment is when the adjacent image is captured. Then, to improve the accuracy of optical flow calculation, the optical flow vector between the current video image and the adjacent video image is calculated in combination with the rotation parameter and the current attitude data, and the optical flow vector is scale-restored with the first ground relative height, which characterizes the true height of the aircraft, yielding a precise current horizontal speed. The current vertical speed calculated from the first and second ground relative heights accurately characterizes the true flight speed of the aircraft in the vertical direction.
In some embodiments, determining the current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image includes:

extracting a first feature point in the current video image and a second feature point matching the first feature point in the adjacent video image;

performing coordinate correction processing on the first feature point and the second feature point respectively according to the rotation angle and the current attitude data, to obtain a first coordinate value of the first feature point and a second coordinate value of the second feature point;

determining an optical flow vector according to the first coordinate value and the second coordinate value;

determining the current horizontal speed of the aircraft at the first shooting moment according to the optical flow vector and the first ground relative height.

Referring to FIG. 4, FIG. 4 is a schematic diagram of extracting the first feature point and the second feature point. For the same element, it is represented by the first feature point in the current video image, as shown in FIG. 4(a), and by the second feature point in the adjacent video image, as shown in FIG. 4(b).

The same element is identified in the current video image and the adjacent video image; the feature points constituting that element in the current video image are called first feature points, and those in the adjacent video image second feature points. It can be understood that when the two images share many common elements, there are multiple first and second feature points corresponding to those elements. If a first feature point and the second feature point corresponding to the same element form a feature pair, there may be multiple feature pairs. When performing coordinate correction on the feature points, one or more of the feature pairs may be selected, the number of selected pairs being set according to actual needs and not limited here.

When calculating the current horizontal speed, the optical flow vector is first determined from the first coordinate value and the second coordinate value, expressed as:

f = (P2 − P1) / Δt

where f denotes the optical flow vector, P2 the second coordinate value, P1 the first coordinate value, Δt the shooting interval, and t the first shooting moment.

When the optical flow vector is scale-restored, the current horizontal speed is obtained based on the first ground relative height, expressed as:

VL = f · Ht

or, in combined form:

VL = (P2 − P1) · Ht / Δt

where VL denotes the current horizontal speed and Ht the first ground relative height.
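The two formulas above — a flow vector from the matched coordinate pair, then scale restoration by the ground-relative height — can be sketched as follows (assuming the coordinate values are the horizontal components of the corrected, world-frame feature directions):

```python
import math

def horizontal_speed(p1, p2, h_t, dt):
    """Optical-flow horizontal speed: f = (P2 - P1)/dt, then VL = f * Ht.
    p1/p2 are (x, y) coordinate values of the matched feature in the current
    and adjacent frames; h_t is the first ground relative height (m)."""
    fx = (p2[0] - p1[0]) / dt
    fy = (p2[1] - p1[1]) / dt
    vx, vy = fx * h_t, fy * h_t          # scale restoration by height
    return vx, vy, math.hypot(vx, vy)

# A 0.01 apparent shift over 0.1 s at 20 m height -> 2 m/s along x.
vx, vy, v = horizontal_speed((0.0, 0.0), (0.01, 0.0), 20.0, 0.1)
```
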
In some embodiments, performing coordinate correction processing on the first feature point and the second feature point respectively according to the rotation angle and the current attitude data, to obtain a first coordinate value of the first feature point and a second coordinate value of the second feature point, includes:

determining a third coordinate value of the first feature point in an image coordinate system and a fourth coordinate value of the second feature point in the image coordinate system;

converting the third coordinate value into a fifth coordinate value in a three-dimensional spherical coordinate system, and converting the fourth coordinate value into a sixth coordinate value in the three-dimensional spherical coordinate system;

converting the fifth coordinate value into the first coordinate value in a world coordinate system according to the current attitude data;

converting the sixth coordinate value into the second coordinate value in the world coordinate system according to the current attitude data and the rotation angle.

In this embodiment, the first feature point is represented in the current video image by the third coordinate value in the image coordinate system, and the second feature point in the adjacent video image by the fourth coordinate value in the image coordinate system, the image coordinate system being two-dimensional.

To improve the calculation accuracy of the current horizontal speed, in this embodiment the third coordinate value is converted into the first coordinate value in the world coordinate system, and the fourth coordinate value into the second coordinate value in the same world coordinate system, through a series of coordinate system conversions.

Exemplarily, the series of coordinate conversions is: image coordinate system → three-dimensional spherical coordinate system → world coordinate system, where the image coordinate system indicates the camera imaging plane, the three-dimensional spherical coordinate system indicates three-dimensional space, and the world coordinate system indicates the real environment space. The conversion goes through the following stages.

Image coordinate system → three-dimensional spherical coordinate system stage: the third coordinate value is converted (back-projected through the camera model) into the fifth coordinate value on the unit sphere, and the fourth coordinate value into the sixth coordinate value, where m denotes the m-th first feature point, m = 1, 2, 3, ..., n, with n a positive integer.

Three-dimensional spherical coordinate system → world coordinate system stage:

the fifth coordinate value Ps1 is converted into the first coordinate value in the world coordinate system:

Pw1 = Rw_i · Ri_c · Ps1

where Pw1 denotes the first coordinate value, Rw_i the real-time attitude of the aircraft at moment t, and Ri_c the calibrated rotation extrinsic between the optical flow sensor and the inertial measurement unit;

the sixth coordinate value Ps2 is converted into the second coordinate value in the world coordinate system:

Pw2 = Rw_i · Ri_c · RΔt · Ps2

where Pw2 denotes the second coordinate value and RΔt the rotation angle between the current video image and the adjacent video image.

Exemplarily, the current horizontal speed VL further includes a component horizontal speed VL-x on the x-axis and a component horizontal speed VL-y on the y-axis:

VL-x = fx · Ht, VL-y = fy · Ht

where fx and fy denote the x- and y-components of the optical flow vector f.

It can be understood that the current horizontal speed VL can also be calculated from the component horizontal speeds VL-x and VL-y.
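The image → sphere → world chain can be sketched with rotation matrices. The pinhole back-projection, the concrete matrices, and the composition order below are assumptions for illustration (the text does not fix a camera model); the current frame uses the attitude and extrinsic rotations, and the adjacent frame additionally applies the inter-frame rotation:

```python
import numpy as np

def to_unit_sphere(u, v, fx, fy, cx, cy):
    """Back-project a pixel to a unit direction vector using a hypothetical
    pinhole model (intrinsics fx, fy, cx, cy are illustrative)."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def rot_z(a):
    """Rotation about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R_w_i = rot_z(0.10)   # real-time attitude at the first shooting moment (illustrative)
R_i_c = np.eye(3)     # sensor-to-IMU extrinsic rotation (assumed identity here)
R_dt = rot_z(0.05)    # rotation between the two shooting moments

p_s = to_unit_sphere(330.0, 250.0, 400.0, 400.0, 320.0, 240.0)
p1_w = R_w_i @ R_i_c @ p_s          # fifth coordinate -> first coordinate (world)
p2_w = R_w_i @ R_i_c @ R_dt @ p_s   # sixth coordinate -> second coordinate (world)
```
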
In some embodiments, extracting a first feature point in the current video image and a second feature point matching the first feature point in the adjacent video image includes:

determining, according to the current attitude data, ground relative projection coordinates of the aircraft's camera optical axis in the current video image;

extracting the first feature point in the current video image according to the ground relative projection coordinates, and extracting the second feature point matching the first feature point in the adjacent video image.

When the aircraft flies tilted relative to the ground, the video image it captures is tilted relative to the ground. The camera optical axis is the central axis of the optical flow sensor in this embodiment; the aircraft's camera optical axis is perpendicular to the plane of the current video image, and the intersection of the camera optical axis with the ground is called the ground intersection point. The projected pixel of the ground intersection point in the current video image is then solved from the current attitude data, and the pixel position of that projected pixel is called the ground relative projection coordinates. After the ground relative projection coordinates are determined, the first feature point is extracted in their vicinity. Specifically, referring to FIG. 5, FIG. 5 is a schematic diagram of determining the ground relative projection coordinates: FIG. 5(a) shows the current video image captured by the aircraft and the intersection of the aircraft's camera optical axis with the ground, and FIG. 5(b) shows the ground relative projection coordinates of the camera optical axis in the current video image.

Exemplarily, when extracting the first feature point near the ground relative projection coordinates, a region may first be selected around the ground relative projection coordinates, and the first feature point extracted from that region. The size of the region is not limited here and may be set according to actual needs. After the first feature point is extracted from the current video image, the second feature point matching it can be extracted from the adjacent image.

In this embodiment, the ground relative projection coordinates of the camera optical axis in the current video image are solved from the current attitude data, and the first feature point is extracted based on them; a first feature point obtained in this way accurately characterizes the horizontal speed of the aircraft.

On this basis, another implementation is provided: in addition to determining the ground relative projection coordinates of the camera optical axis in the current video image, the ground relative projection coordinates of the camera optical axis in the adjacent video image are determined from the attitude data of the aircraft at the second shooting moment. A region is then determined from these two sets of ground relative projection coordinates, and according to that region the first feature point is extracted from the current video image and the second feature point from the adjacent video image, yielding more precise feature point data. The region may contain the two ground relative projection coordinates, or lie between them, as set according to actual needs and not limited here.
In some embodiments, acquiring the rotation angle between the current video image and the adjacent video image includes:

acquiring a plurality of angular velocities of the aircraft from the second shooting moment to the first shooting moment;

determining the rotation angle between the current video image and the adjacent video image according to the plurality of angular velocities.

In this embodiment, the plurality of consecutive angular velocities of the aircraft between the first and second shooting moments are acquired and integrated to obtain the rotation angle, the plurality of angular velocities including the first angular velocity and the second angular velocity. By integrating the multiple angular velocities over the continuous time interval, the rotation angle approaches the true value arbitrarily closely, improving its precision.
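Integrating the consecutive angular velocities between the two shooting moments can be sketched with the trapezoid rule (sample rate and values are illustrative):

```python
def integrate_rotation(omegas, dt):
    """Approximate the rotation angle between two frames by trapezoidal
    integration of gyro samples omegas (rad/s) spaced dt seconds apart."""
    angle = 0.0
    for w0, w1 in zip(omegas, omegas[1:]):
        angle += 0.5 * (w0 + w1) * dt
    return angle

# A constant 0.2 rad/s over two 0.01 s steps -> 0.004 rad between frames.
theta = integrate_rotation([0.2, 0.2, 0.2], 0.01)
```
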
In some embodiments, after acquiring the second ground relative height of the aircraft at the second shooting moment of the adjacent video image and determining the current vertical speed of the aircraft at the first shooting moment according to the first and second ground relative heights, the method further includes:

determining the current horizontal speed and the current vertical speed as one current speed pair;

acquiring a historical speed pair and historical attitude data corresponding to the aircraft at the second shooting moment, the historical speed pair including a historical horizontal speed and a historical vertical speed corresponding to the second shooting moment;

performing speed prediction according to the historical attitude data, the current attitude data, and the historical speed pair, to obtain a predicted speed pair;

performing correction processing on the current speed pair according to the predicted speed pair, to obtain a corrected speed pair.

In this embodiment, a method is provided for predicting the speed pair corresponding to the current video image from the adjacent video image and its corresponding historical speed pair together with the current attitude data.

One set of attitude data corresponds to one speed pair. Based on the historical speed pair, the angular velocities and accelerations in the historical and current attitude data are integrated to predict the speed pair corresponding to the current video image, the predicted result being called the predicted speed pair. Specifically, based on the historical speed pair at the second shooting moment, the angular velocities and accelerations between the second and first shooting moments are integrated to obtain the predicted speed pair at the first shooting moment.

Further, after the predicted speed pair is obtained, the current speed pair is corrected based on it. The correction may take the average of the predicted speed pair and the current speed pair, the average being taken as the corrected current speed pair.

Of course, a weighted average of the predicted and current speed pairs may also be taken to obtain the corrected current speed pair, with weights set respectively for the two. The weights may be customized by the user, or determined from the time difference between the two video frames and the noise of the inertial measurement unit, the weight being inversely proportional to both the time difference and the noise.

Exemplarily, the predicted speed pair and the current speed pair may also be fused by a Kalman filtering algorithm, the fused value being taken as the corrected speed pair.

As above, by correcting the current speed pair, on the one hand the calculation errors of historical speed pairs are less likely to accumulate into the current speed pair, and on the other hand the noise of the current speed pair is reduced and the difference between the current and historical speed pairs is narrowed, greatly improving the precision of speed monitoring.
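The predict-then-correct step above can be sketched as IMU propagation of the previous speed pair followed by a weighted average with the measured pair. The weight here is an assumed constant; as the text notes, it could instead shrink with the inter-frame time gap and IMU noise:

```python
def predict_speed_pair(prev_pair, accel, dt):
    """Propagate the historical (horizontal, vertical) speed pair with the
    accelerations integrated over the inter-frame interval dt."""
    return tuple(v + a * dt for v, a in zip(prev_pair, accel))

def correct_speed_pair(measured, predicted, w_meas=0.7):
    """Weighted average of the measured and predicted speed pairs
    (w_meas is an illustrative tuning constant)."""
    return tuple(w_meas * m + (1.0 - w_meas) * p for m, p in zip(measured, predicted))

predicted = predict_speed_pair((1.0, 0.0), (0.5, 0.0), 0.2)   # -> (1.1, 0.0)
corrected = correct_speed_pair((1.3, 0.0), predicted)          # -> (1.24, 0.0)
```
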
In some embodiments, performing correction processing on the current speed pair according to the predicted speed pair, to obtain a corrected speed pair, includes:

performing low-frequency filtering on the current speed pair to obtain a current candidate speed pair;

performing high-frequency filtering on the predicted speed pair to obtain a predicted candidate speed pair;

performing correction processing on the current candidate speed pair according to the predicted candidate speed pair, to obtain the corrected speed pair.

In this embodiment, low-frequency filtering of the current speed pair reduces its noise, optimizing the current speed pair. High-frequency filtering of the predicted speed pair yields a smooth predicted candidate speed pair and also reduces the lag of the current speed pair.

The correction of the current candidate speed pair according to the predicted candidate speed pair may be done by taking an average or a weighted average; refer to the content mentioned above, which is not repeated here.
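One common reading of this low-pass/high-pass split is a complementary filter: low-pass the noisy measured speed, high-pass the smooth but drifting prediction, and sum the two. A sketch, applied per speed component (alpha is an assumed tuning constant, and this is one plausible realization rather than the exact filter design of the text):

```python
class ComplementaryFilter:
    """Combine a low-passed measurement with a high-passed prediction."""
    def __init__(self, alpha: float = 0.8):
        self.alpha = alpha
        self.low = 0.0        # low-passed measured speed
        self.high = 0.0       # high-passed predicted speed
        self.prev_pred = 0.0

    def update(self, measured: float, predicted: float) -> float:
        a = self.alpha
        self.low = a * self.low + (1.0 - a) * measured              # denoise
        self.high = a * (self.high + predicted - self.prev_pred)    # cut lag/drift
        self.prev_pred = predicted
        return self.low + self.high

flt = ComplementaryFilter()
out = 0.0
for _ in range(300):          # steady 2 m/s input: output converges to 2 m/s
    out = flt.update(2.0, 2.0)
```
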
In some embodiments, after acquiring the second ground relative height of the aircraft at the second shooting moment of the adjacent video image and determining the vertical speed of the aircraft at the first shooting moment according to the first and second ground relative heights, the method further includes:

acquiring a visual flight speed of the aircraft at the first shooting moment, the visual flight speed being detected by a visual-inertial system of the aircraft;

determining speed difference information between the current horizontal speed, the current vertical speed, and the visual flight speed;

determining a target control strategy of the visual-inertial system according to the speed difference information, and executing the target control strategy.

In this embodiment, determining the speed difference information between the current horizontal speed and the visual flight speed means determining whether their speed values are the same; likewise, it is determined whether the speed values of the current vertical speed and the visual flight speed are the same.

Further, the visual flight speed may include a flight speed in the horizontal direction and a flight speed in the vertical direction. The horizontal flight speed is compared with the current horizontal speed, and the vertical flight speed with the current vertical speed, to obtain the speed difference information.

Exemplarily, the control strategies of the visual-inertial system (also called VIO) include but are not limited to an initialization strategy, a restart strategy, and a speed replacement strategy. In specific use, one of the multiple control strategies may be selected as the target control strategy according to the speed difference information.

For example, if the visual flight speed is a concrete value but the speed difference information indicates a large difference between the visual flight speed and the current horizontal and vertical speeds, the restart strategy is taken as the target control strategy, and the visual-inertial system is controlled to restart.

As another example, if the visual flight speed is an invalid value, the initialization strategy is taken as the target control strategy, and the visual-inertial system is controlled to initialize, with the current horizontal speed and current vertical speed taken as its initial visual flight speed at initialization.

As yet another example, if the visual flight speed is a concrete value but the speed difference information indicates that the aircraft is in fixed-point flight, the speed replacement strategy is taken as the target control strategy, and the current horizontal speed and current vertical speed are taken as the speed input of the flight controller. Fixed-point flight includes hovering.
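The strategy selection described in the three examples above can be sketched as a simple decision function. The threshold values and strategy labels are illustrative assumptions:

```python
def choose_vio_strategy(visual_speed, v_h, v_v, diff_tol=0.5, hover_tol=0.05):
    """Pick a visual-inertial-system control strategy from the speed
    difference information. visual_speed is (horizontal, vertical), or None
    when the VIO output is invalid; tolerances are illustrative (m/s)."""
    if visual_speed is None:
        return "initialize"                    # seed VIO with the current speeds
    if (abs(visual_speed[0] - v_h) > diff_tol
            or abs(visual_speed[1] - v_v) > diff_tol):
        return "restart"                       # VIO disagrees too much
    if abs(v_h) < hover_tol and abs(v_v) < hover_tol:
        return "replace_speed"                 # fixed-point flight / hovering
    return "keep"

strategy = choose_vio_strategy((0.0, 0.0), 0.01, 0.0)
```
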
As can be seen from the above, the speed monitoring method for an aircraft proposed in the embodiments of the present application can eliminate the influence of changes of the aircraft's flight attitude on the monitoring of the current horizontal speed and the current vertical speed, thereby monitoring the aircraft's real-time current horizontal and vertical speeds more accurately. Moreover, after the current horizontal and vertical speeds are obtained, correcting both simultaneously not only avoids error accumulation but also reduces their noise and narrows the difference from historical data, yielding corrected current horizontal and vertical speeds of higher precision. In addition, the current horizontal and vertical speeds serve as reference conditions for the flight control of the aircraft, enabling the aircraft to implement its flight plan efficiently. Furthermore, the visual-inertial system is controlled by means of the current horizontal and vertical speeds, which effectively avoids situations where the visual-inertial system produces invalid detections or risks a crash.
An embodiment further provides a speed monitoring device 200 for an aircraft. Referring to FIG. 6, FIG. 6 is a schematic structural diagram of the speed monitoring device 200 for an aircraft provided in an embodiment of the present application. The speed monitoring device 200 is applied to an aircraft and includes:

a data acquisition module 201, configured to acquire a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image;

a rotation angle calculation module 202, configured to acquire a rotation angle between the current video image and the adjacent video image;

a horizontal speed monitoring module 203, configured to determine a current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image;

a vertical speed monitoring module 204, configured to acquire a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determine a current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.

In some embodiments, the horizontal speed monitoring module 203 is further configured to:

extract a first feature point in the current video image and a second feature point matching the first feature point in the adjacent video image;

perform coordinate correction processing on the first feature point and the second feature point respectively according to the rotation angle and the current attitude data, to obtain a first coordinate value of the first feature point and a second coordinate value of the second feature point;

determine an optical flow vector according to the first coordinate value and the second coordinate value;

determine the current horizontal speed of the aircraft at the first shooting moment according to the optical flow vector and the first ground relative height.

In some embodiments, the horizontal speed monitoring module 203 is further configured to:

determine a third coordinate value of the first feature point in an image coordinate system and a fourth coordinate value of the second feature point in the image coordinate system;

convert the third coordinate value into a fifth coordinate value in a three-dimensional spherical coordinate system, and convert the fourth coordinate value into a sixth coordinate value in the three-dimensional spherical coordinate system;

convert the fifth coordinate value into the first coordinate value in a world coordinate system according to the current attitude data;

convert the sixth coordinate value into the second coordinate value in the world coordinate system according to the current attitude data and the rotation angle.

In some embodiments, the horizontal speed monitoring module 203 is further configured to:

determine, according to the current attitude data, ground relative projection coordinates of the aircraft's camera optical axis in the current video image;

extract the first feature point in the current video image according to the ground relative projection coordinates, and extract the second feature point matching the first feature point in the adjacent video image.

In some embodiments, the rotation angle calculation module 202 is further configured to:

acquire a plurality of angular velocities of the aircraft from the second shooting moment to the first shooting moment;

determine the rotation angle between the current video image and the adjacent video image according to the plurality of angular velocities.

In some embodiments, the speed monitoring device 200 for an aircraft further includes an altitude correction module configured to:

acquire the flight altitude of the aircraft at the first shooting moment of the current video image;

perform tilt correction processing on the flight altitude according to the current attitude data to obtain the first ground relative height.

In some embodiments, the speed monitoring device 200 further includes a speed correction module configured to:

determine the current horizontal speed and the current vertical speed as one current speed pair;

acquire a historical speed pair and historical attitude data corresponding to the aircraft at the second shooting moment, the historical speed pair including a historical horizontal speed and a historical vertical speed corresponding to the second shooting moment;

perform speed prediction according to the historical attitude data, the current attitude data, and the historical speed pair, to obtain a predicted speed pair;

perform correction processing on the current speed pair according to the predicted speed pair, to obtain a corrected speed pair.

In some embodiments, the speed correction module is further configured to:

perform low-frequency filtering on the current speed pair to obtain a current candidate speed pair;

perform high-frequency filtering on the predicted speed pair to obtain a predicted candidate speed pair;

perform correction processing on the current candidate speed pair according to the predicted candidate speed pair, to obtain the corrected speed pair.

In some embodiments, the speed monitoring device 200 for an aircraft further includes a visual-inertial system control module configured to:

acquire a visual flight speed of the aircraft at the first shooting moment, the visual flight speed being detected by a visual-inertial system of the aircraft;

determine speed difference information between the current horizontal speed, the current vertical speed, and the visual flight speed;

determine a target control strategy of the visual-inertial system according to the speed difference information, and execute the target control strategy.

It should be noted that the speed monitoring device 200 for an aircraft provided in the embodiments of the present application belongs to the same concept as the speed monitoring method for an aircraft in the above embodiments; any method provided in the method embodiments can be implemented through the speed monitoring device 200, and the specific implementation process is detailed in the method embodiments and not repeated here.

As can be seen from the above, the speed monitoring device 200 for an aircraft proposed in the embodiments of the present application can eliminate the influence of changes of the aircraft's flight attitude on the monitoring of the current horizontal speed and the current vertical speed, thereby monitoring the aircraft's real-time current horizontal and vertical speeds more accurately. Moreover, after the current horizontal and vertical speeds are obtained, correcting both simultaneously not only avoids error accumulation but also reduces their noise and narrows the difference from historical data, yielding corrected speeds of higher precision. In addition, the current horizontal and vertical speeds serve as reference conditions for the flight control of the aircraft, enabling the aircraft to implement its flight plan efficiently. Furthermore, the visual-inertial system is controlled by means of the current horizontal and vertical speeds, which effectively avoids situations where the visual-inertial system produces invalid detections or risks a crash.
An embodiment of the present application further provides an aircraft, including but not limited to a drone, a balloon, an airplane, a glider, a helicopter, and the like. The aircraft includes a body, an optical flow sensor and a distance sensor, and a processor with one or more processing cores. The optical flow sensor and the distance sensor are arranged at the bottom of the body and are each communicatively connected to the processor. Those skilled in the art will understand that the aircraft structure shown in the figure does not constitute a limitation of the aircraft, which may include more or fewer components than shown, combine certain components, or arrange the components differently.

The processor is the control center of the aircraft; it connects the various parts of the entire aircraft through various interfaces and lines, executes the various functions of the aircraft and processes data, thereby monitoring the aircraft as a whole.

In the embodiments of the present application, the processor in the aircraft is configured to implement the following functions:

acquiring a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image;

acquiring a rotation angle between the current video image and the adjacent video image;

determining a current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image;

acquiring a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determining a current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.

For the specific implementation of the above operations, refer to the preceding embodiments, which are not repeated here.

In the above embodiments, the description of each embodiment has its own emphasis; for parts not detailed in one embodiment, refer to the related descriptions of other embodiments.

As can be seen from the above, the aircraft provided by this embodiment can eliminate the influence of changes of the aircraft's flight attitude on the monitoring of the current horizontal speed and the current vertical speed, thereby monitoring the aircraft's real-time current horizontal and vertical speeds more accurately. Moreover, after the current horizontal and vertical speeds are obtained, correcting both simultaneously not only avoids error accumulation but also reduces their noise and narrows the difference from historical data, yielding corrected speeds of higher precision. In addition, the current horizontal and vertical speeds serve as reference conditions for the flight control of the aircraft, enabling the aircraft to implement its flight plan efficiently. Furthermore, the visual-inertial system is controlled by means of the current horizontal and vertical speeds, which effectively avoids situations where the visual-inertial system produces invalid detections or risks a crash.
Those of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments can be completed by instructions, or by instructions controlling related hardware; the instructions can be stored in a computer-readable storage medium and loaded and executed by a processor.

To this end, an embodiment of the present application provides a computer-readable storage medium. Those of ordinary skill in the art can understand that all or part of the steps in the methods of the above embodiments can be completed by a program instructing related hardware; the program can be stored in a computer-readable storage medium and, when executed, includes the following steps:

acquiring a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image;

acquiring a rotation angle between the current video image and the adjacent video image;

determining a current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image;

acquiring a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determining a current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.

For the specific implementation of the above operations, refer to the preceding embodiments, which are not repeated here.

The above storage medium may be a ROM/RAM, a magnetic disk, an optical disk, or the like. Since the computer program stored in the storage medium can execute the steps of any speed monitoring method for an aircraft provided in the embodiments of the present application, the beneficial effects achievable by any such method can be achieved; see the preceding embodiments for details, which are not repeated here.

The speed monitoring method and device for an aircraft, storage medium, and aircraft provided by the embodiments of the present application have been introduced in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, for those skilled in the art, there will be changes in the specific implementations and application scope in accordance with the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (12)

  1. A speed monitoring method for an aircraft, characterized in that the method comprises:
    acquiring a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image;
    acquiring a rotation angle between the current video image and the adjacent video image;
    determining a current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image;
    acquiring a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determining a current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.
  2. The speed monitoring method for an aircraft according to claim 1, characterized in that determining the current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image comprises:
    extracting a first feature point in the current video image and a second feature point matching the first feature point in the adjacent video image;
    performing coordinate correction processing on the first feature point and the second feature point respectively according to the rotation angle and the current attitude data, to obtain a first coordinate value of the first feature point and a second coordinate value of the second feature point;
    determining an optical flow vector according to the first coordinate value and the second coordinate value;
    determining the current horizontal speed of the aircraft at the first shooting moment according to the optical flow vector and the first ground relative height.
  3. The speed monitoring method for an aircraft according to claim 2, characterized in that performing coordinate correction processing on the first feature point and the second feature point respectively according to the rotation angle and the current attitude data, to obtain the first coordinate value of the first feature point and the second coordinate value of the second feature point, comprises:
    determining a third coordinate value of the first feature point in an image coordinate system and a fourth coordinate value of the second feature point in the image coordinate system;
    converting the third coordinate value into a fifth coordinate value in a three-dimensional spherical coordinate system, and converting the fourth coordinate value into a sixth coordinate value in the three-dimensional spherical coordinate system;
    converting the fifth coordinate value into the first coordinate value in a world coordinate system according to the current attitude data;
    converting the sixth coordinate value into the second coordinate value in the world coordinate system according to the current attitude data and the rotation angle.
  4. The speed monitoring method for an aircraft according to claim 2, characterized in that extracting the first feature point in the current video image and the second feature point matching the first feature point in the adjacent video image comprises:
    determining, according to the current attitude data, ground relative projection coordinates of the camera optical axis of the aircraft in the current video image;
    extracting the first feature point in the current video image according to the ground relative projection coordinates, and extracting the second feature point matching the first feature point in the adjacent video image.
  5. The speed monitoring method for an aircraft according to claim 1, characterized in that acquiring the rotation angle between the current video image and the adjacent video image comprises:
    acquiring a plurality of angular velocities of the aircraft from the second shooting moment to the first shooting moment;
    determining the rotation angle between the current video image and the adjacent video image according to the plurality of angular velocities.
  6. The speed monitoring method for an aircraft according to any one of claims 1 to 5, characterized in that acquiring the first ground relative height of the aircraft at the first shooting moment of the current video image comprises:
    acquiring a flight altitude of the aircraft at the first shooting moment of the current video image;
    performing tilt correction processing on the flight altitude according to the current attitude data to obtain the first ground relative height.
  7. The speed monitoring method for an aircraft according to any one of claims 1 to 5, characterized in that, after acquiring the second ground relative height of the aircraft at the second shooting moment of the adjacent video image and determining the current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height, the method further comprises:
    determining the current horizontal speed and the current vertical speed as one current speed pair;
    acquiring a historical speed pair and historical attitude data corresponding to the aircraft at the second shooting moment, the historical speed pair comprising a historical horizontal speed and a historical vertical speed corresponding to the second shooting moment;
    performing speed prediction according to the historical attitude data, the current attitude data, and the historical speed pair, to obtain a predicted speed pair;
    performing correction processing on the current speed pair according to the predicted speed pair, to obtain a corrected speed pair.
  8. The speed monitoring method for an aircraft according to claim 7, characterized in that performing correction processing on the current speed pair according to the predicted speed pair, to obtain the corrected speed pair, comprises:
    performing low-frequency filtering on the current speed pair to obtain a current candidate speed pair;
    performing high-frequency filtering on the predicted speed pair to obtain a predicted candidate speed pair;
    performing correction processing on the current candidate speed pair according to the predicted candidate speed pair, to obtain the corrected speed pair.
  9. The speed monitoring method for an aircraft according to any one of claims 1 to 5, characterized in that, after acquiring the second ground relative height of the aircraft at the second shooting moment of the adjacent video image and determining the vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height, the method further comprises:
    acquiring a visual flight speed of the aircraft at the first shooting moment, the visual flight speed being detected by a visual-inertial system of the aircraft;
    determining speed difference information between the current horizontal speed, the current vertical speed, and the visual flight speed;
    determining a target control strategy of the visual-inertial system according to the speed difference information, and executing the target control strategy.
  10. A speed monitoring device for an aircraft, characterized by comprising:
    a data acquisition module, configured to acquire a current video image captured while the aircraft is flying and an adjacent video image thereof, as well as current attitude data and a first ground relative height of the aircraft at a first shooting moment of the current video image;
    a rotation angle calculation module, configured to acquire a rotation angle between the current video image and the adjacent video image;
    a horizontal speed monitoring module, configured to determine a current horizontal speed of the aircraft at the first shooting moment according to the rotation angle, the current attitude data, the first ground relative height, the current video image, and the adjacent video image;
    a vertical speed monitoring module, configured to acquire a second ground relative height of the aircraft at a second shooting moment of the adjacent video image, and determine a current vertical speed of the aircraft at the first shooting moment according to the first ground relative height and the second ground relative height.
  11. A computer-readable storage medium having a computer program stored thereon, characterized in that, when the computer program is run on a computer, the computer is caused to execute the speed monitoring method for an aircraft according to any one of claims 1 to 9.
  12. An aircraft, characterized in that the aircraft comprises a body, a processor, an optical flow sensor, and a distance sensor, the optical flow sensor and the distance sensor being arranged at the bottom of the body, the processor being configured to execute the speed monitoring method for an aircraft according to any one of claims 1 to 9.
PCT/CN2023/121073 2022-09-29 2023-09-25 飞行器的速度监测方法、装置、存储介质及飞行器 WO2024067473A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211200475.6 2022-09-29
CN202211200475.6A CN117850438A (zh) 2022-09-29 2022-09-29 飞行器的速度监测方法、装置、存储介质及飞行器

Publications (1)

Publication Number Publication Date
WO2024067473A1 true WO2024067473A1 (zh) 2024-04-04

Family

ID=90476229

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/121073 WO2024067473A1 (zh) 2022-09-29 2023-09-25 飞行器的速度监测方法、装置、存储介质及飞行器

Country Status (2)

Country Link
CN (1) CN117850438A (zh)
WO (1) WO2024067473A1 (zh)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104880187A (zh) * 2015-06-09 2015-09-02 北京航空航天大学 一种基于双摄像机的飞行器光流检测装置的运动估计方法
CN105807083A (zh) * 2016-03-15 2016-07-27 深圳市高巨创新科技开发有限公司 一种无人飞行器实时测速方法及系统
CN106199039A (zh) * 2016-07-06 2016-12-07 深圳市高巨创新科技开发有限公司 一种无人机速度监测方法及系统
CN108204812A (zh) * 2016-12-16 2018-06-26 中国航天科工飞航技术研究院 一种无人机速度估计方法
CN109782014A (zh) * 2019-03-11 2019-05-21 南京理工大学泰州科技学院 一种无人机速度确定方法及装置
JP2019114008A (ja) * 2017-12-22 2019-07-11 カシオ計算機株式会社 飛行装置、飛行装置の制御方法及びプログラム
CN112254721A (zh) * 2020-11-06 2021-01-22 南京大学 一种基于光流相机的姿态定位方法
CN113607968A (zh) * 2021-08-05 2021-11-05 深圳慧源创新科技有限公司 一种飞行器速度监测方法、飞行器


Also Published As

Publication number Publication date
CN117850438A (zh) 2024-04-09

Similar Documents

Publication Publication Date Title
US10942529B2 (en) Aircraft information acquisition method, apparatus and device
WO2019126958A1 (zh) 偏航姿态控制方法、无人机、计算机可读存储介质
WO2019242553A1 (zh) 控制拍摄装置的拍摄角度的方法、控制装置及可穿戴设备
WO2020024185A1 (en) Techniques for motion-based automatic image capture
CN109562844A (zh) 自动着陆表面地形评估以及相关的系统和方法
WO2022021027A1 (zh) 目标跟踪方法、装置、无人机、系统及可读存储介质
CN106973221B (zh) 基于美学评价的无人机摄像方法和系统
WO2017181513A1 (zh) 无人机的飞行控制方法和装置
WO2021081774A1 (zh) 一种参数优化方法、装置及控制设备、飞行器
WO2018098792A1 (en) Methods and associated systems for managing 3d flight paths
WO2020073245A1 (zh) 手势识别方法、vr视角控制方法以及vr系统
CN110720113A (zh) 一种参数处理方法、装置及摄像设备、飞行器
WO2024067473A1 (zh) 飞行器的速度监测方法、装置、存储介质及飞行器
WO2019061466A1 (zh) 一种飞行控制方法、遥控装置、遥控系统
US10577101B2 (en) Water surface detection method and apparatus, unmanned aerial vehicle landing method and apparatus and unmanned aerial vehicle
WO2024067498A1 (zh) 飞行器的速度监测方法、装置、存储介质及飞行器
WO2020019175A1 (zh) 图像处理方法和设备、摄像装置以及无人机
WO2021217450A1 (zh) 目标跟踪方法、设备及存储介质
WO2021013143A1 (zh) 装置、摄像装置、移动体、方法以及程序
WO2021056411A1 (zh) 航线调整方法、地面端设备、无人机、系统和存储介质
US20210256732A1 (en) Image processing method and unmanned aerial vehicle
CN112947546B (zh) 一种无人飞行器仿地飞行方法
CN113206951B (zh) 一种基于扑翼飞行系统的实时电子稳像方法
WO2021035746A1 (zh) 图像处理方法、装置和可移动平台
CN109754412B (zh) 目标跟踪方法、目标跟踪装置及计算机可读存储介质