CN112840286A - Flight assistance method and device, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium

Flight assistance method and device, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium

Info

Publication number
CN112840286A
Authority
CN
China
Prior art keywords
image
paddle
display
attitude angle
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080005564.8A
Other languages
Chinese (zh)
Inventor
翁松伟
梁季光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112840286A

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

A flight assistance method, a flight assistance apparatus (200), an unmanned aerial vehicle (100), a remote controller (300), a display (400), and an unmanned aerial vehicle system (1000). The flight assistance method comprises: (011) acquiring flight parameters of a functional component (20); (012) generating a virtual image according to the flight parameters; and (013) adding the virtual image to a display image to indicate the flight state corresponding to the functional component (20).

Description

Flight assistance method and device, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium
Technical Field
The present application relates to the field of unmanned aerial vehicle technology, and in particular, to a flight assistance method for an unmanned aerial vehicle, a flight assistance device for an unmanned aerial vehicle, a remote controller for an unmanned aerial vehicle, a display for an unmanned aerial vehicle, an unmanned aerial vehicle system, and a computer-readable storage medium.
Background
With the development of unmanned aerial vehicle technology, flight control of unmanned aerial vehicles is expected to serve more and more application scenarios. In some cases, a structural component (e.g., a blade) of an unmanned aerial vehicle (e.g., a traversing machine, i.e., a racing drone) is deliberately kept in the shot so that it appears on the First Person View (FPV) screen and helps the operator judge the flight state (e.g., the flight attitude) of the unmanned aerial vehicle; however, the structural component (e.g., the blade) then also appears in the recorded video and degrades its quality. In other cases, the structural component (e.g., the blade) is deliberately kept out of the shot for a better visual impression, but this makes it harder for the operator to judge the flight state (e.g., the flight attitude) of the unmanned aerial vehicle.
Disclosure of Invention
Embodiments of the present application provide a flight assist method for an unmanned aerial vehicle, a flight assist apparatus for an unmanned aerial vehicle, a remote controller for an unmanned aerial vehicle, a display for an unmanned aerial vehicle, an unmanned aerial vehicle system, and a computer-readable storage medium.
The embodiment of the application provides a flight assisting method for an unmanned aerial vehicle, and the flight assisting method comprises the following steps: acquiring flight parameters of the functional component; generating a virtual image according to the flight parameters; and adding the virtual image to a display image to indicate the flight state corresponding to the functional component.
Embodiments of the present application also provide a flight assist device for an unmanned aerial vehicle, the unmanned aerial vehicle comprising a functional component, the flight assist device comprising a processor configured to: acquiring flight parameters of the functional component; generating a virtual image according to the flight parameters; and adding the virtual image to a display image to indicate the flight state corresponding to the functional component.
The embodiment of this application still provides an unmanned vehicles, unmanned vehicles includes: fuselage, functional component and flight auxiliary device, the functional component sets up on the fuselage, flight auxiliary device includes the treater, the treater is used for: acquiring flight parameters of the functional component; generating a virtual image according to the flight parameters; and adding the virtual image to a display image to indicate the flight state corresponding to the functional component.
The embodiment of the present application further provides a remote controller for an unmanned aerial vehicle, the unmanned aerial vehicle includes the functional unit, the remote controller includes the flight assisting device, the flight assisting device includes the processor, the processor is used for: acquiring flight parameters of the functional component; generating a virtual image according to the flight parameters; and adding the virtual image to a display image to indicate the flight state corresponding to the functional component.
The embodiment of the application further provides a display for an unmanned aerial vehicle, the display is used for receiving a display image added with a virtual image to display a corresponding display picture, the unmanned aerial vehicle comprises a functional component, the display image added with the virtual image is obtained by acquiring flight parameters of the functional component, generating the virtual image according to the flight parameters and adding the virtual image to the display image, and the display image added with the virtual image can indicate a flight state corresponding to the functional component.
The embodiment of the application also provides an unmanned aerial vehicle system, which comprises an unmanned aerial vehicle, a remote controller, a flight auxiliary device and a display, wherein the unmanned aerial vehicle comprises a functional component, and the flight auxiliary device is arranged on the unmanned aerial vehicle and/or the remote controller; the flight assistance apparatus includes a processor configured to: acquiring flight parameters of the functional component; generating a virtual image according to the flight parameters; adding the virtual image to a display image to indicate the flight state corresponding to the functional component; the display is used for receiving the display image added with the virtual image so as to display a corresponding display picture.
Embodiments of the present application also provide a computer-readable storage medium containing computer-executable instructions. The computer-executable instructions, when executed by one or more processors, cause the processors to perform the flight assistance methods of the above embodiments.
According to the flight assistance method, flight assistance device, remote controller, display, unmanned aerial vehicle system, and computer-readable storage medium for the unmanned aerial vehicle of the embodiments of the present application, the flight state is indicated by a virtual image added to the display image. Because the virtual image is generated from the flight parameters of a functional component of the unmanned aerial vehicle, it can accurately indicate the flight state corresponding to that component, for example the attitude angle corresponding to a functional component such as a gyroscope. The operator can therefore be assisted in judging the flight state of the unmanned aerial vehicle without the blade being captured in the shot. And because the virtual image is added to the display image afterwards, neither the virtual image nor the blade exists in the video as actually shot, so the quality of the recorded video is better.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow diagram of a flight assistance method according to some embodiments of the present application.
FIG. 2 is a schematic structural diagram of an unmanned aerial vehicle system according to certain embodiments of the present application.
FIG. 3 is a flow chart illustrating a flight assistance method according to some embodiments of the present application.
Fig. 4-7 are schematic views of a flight assistance method according to some embodiments of the present application.
Fig. 8 and 9 are schematic flow diagrams of flight assistance methods according to certain embodiments of the present application.
FIG. 10 is a schematic view of a flight assistance method according to some embodiments of the present application.
Fig. 11-13 are flow diagrams illustrating flight assistance methods according to certain embodiments of the present application.
Fig. 14-17 are schematic structural views of unmanned aerial vehicle systems according to certain embodiments of the present application.
FIG. 18 is a schematic diagram of a connection between a processor and a computer-readable storage medium according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In the description of the present application, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features concerned. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly: a connection may, for example, be a fixed connection, a removable connection, or an integral connection; it may be a mechanical connection or an electrical connection, or the elements may communicate with each other; and it may be a direct connection, an indirect connection through intervening media, or an internal communication between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
At present, the unmanned aerial vehicle 100 is widely used in aerial photography, agriculture, plant protection, micro selfie shooting, express transport, disaster relief, wildlife observation, infectious disease monitoring, surveying and mapping, news reporting, power-line inspection, film and television shooting, creating romantic scenes, and other scenarios, and it is mainly controlled through the remote controller 300 to perform various flight tasks.
The unmanned aerial vehicle 100 comprises a fuselage 10 and a shooting device 30 mounted on the fuselage 10. The shooting device 30 acquires images of the environment and serves as an extension of the operator's eyes: from the captured picture, the user operating the unmanned aerial vehicle 100 can understand the environment the vehicle is currently in and operate it through the remote controller 300 to perform various flight tasks.
Among unmanned aerial vehicles 100, there is a type mainly used for racing entertainment, the traversing machine (racing drone). During flight, the traversing machine captures video and displays it on an FPV screen, so that the flyer (the player operating the traversing machine) gets a first-person operating experience through the FPV screen while flying it. In the following, the unmanned aerial vehicle 100 is described as a traversing machine by way of example; the principle is essentially the same for other types of unmanned aerial vehicle 100 and is not repeated here.
On a traversing machine, the shooting device 30 usually keeps the blades in frame to help the flyer judge the current flight attitude of the machine. This, however, puts spinning blades on both sides of the FPV picture: the blades block part of the captured scene, which affects the flyer's control experience, and they also remain in the recorded video, which impairs the viewer's watching experience.
To this end, referring to fig. 1 and 2, the present application provides a flight assistance method for an unmanned aerial vehicle 100, where the unmanned aerial vehicle 100 includes a functional component 20, and the flight assistance method includes:
011: acquiring flight parameters of the functional component 20;
012: generating a virtual image according to the flight parameters; and
013: adding the virtual image to a display image to indicate the flight state corresponding to the functional component 20.
The embodiment of the present application further provides a flight assistance device 200 for the unmanned aerial vehicle 100, where the flight assistance device 200 includes a processor 210, and the processor 210 is configured to: acquiring flight parameters of the functional component 20; generating a virtual image according to the flight parameters; and adding a virtual image to the display image to indicate the flight status corresponding to the functional component 20. That is, step 011 through step 013 can be implemented by processor 210.
The present embodiment also provides an unmanned aerial vehicle system 1000, where the unmanned aerial vehicle system 1000 includes an unmanned aerial vehicle 100, a flight assistance device 200, a remote controller 300, and a display 400, and the flight assistance device 200 may be disposed on the unmanned aerial vehicle 100 and/or on the remote controller 300.
Specifically, the flight assistance device 200 may be provided on the fuselage 10 of the unmanned aerial vehicle 100 (i.e., the unmanned aerial vehicle 100 includes the fuselage 10, the functional component 20, the shooting device 30, and the flight assistance device 200, with the functional component 20 provided on the fuselage 10). In this case, the processor 210 can directly acquire the flight parameters of the functional component 20 and the video data captured by the shooting device 30 on the fuselage 10 (the video data includes one or more frames of display images), and generate a virtual image from the flight parameters to be added to the display images to indicate the flight state of the unmanned aerial vehicle 100. Alternatively, the flight assistance device 200 may be provided on the remote controller 300 (i.e., the unmanned aerial vehicle 100 includes the fuselage 10, the functional component 20, and the shooting device 30, while the remote controller 300 includes the flight assistance device 200); through wireless communication between the remote controller 300 and the unmanned aerial vehicle 100, the processor 210 acquires the flight parameters of the functional component 20 and the video data captured by the shooting device 30, and generates a virtual image from the flight parameters to be added to the display image to indicate the flight state of the unmanned aerial vehicle 100. Alternatively again, the flight assistance device 200 may be provided on both the fuselage 10 and the remote controller 300; that is, the flight assistance device 200 includes at least two processors 210, one on the fuselage 10 and one on the remote controller 300. The processor 210 on the fuselage 10 directly acquires the flight parameters of the functional component 20 and the video data captured by the shooting device 30 and generates the virtual image from the flight parameters; the processor 210 on the remote controller 300 then receives the virtual image and video data transmitted by the unmanned aerial vehicle 100 over the wireless link and adds the virtual image to the display image, and the display image with the virtual image added can indicate the flight state corresponding to the functional component 20. In other words, steps 011 and 012 can be implemented by the processor 210 on the fuselage 10 and step 013 by the processor 210 on the remote controller 300; it is also possible for step 011 alone to be implemented by the processor 210 on the fuselage 10 and steps 012 and 013 by the processor 210 on the remote controller 300. In the embodiment shown in fig. 2, the flight assistance device 200 is disposed on the fuselage 10 of the unmanned aerial vehicle 100, which is not limiting.
The functional component 20 may be any device provided on the unmanned aerial vehicle 100; for example, it may include the sensors of the unmanned aerial vehicle 100 (such as a gyroscope, an accelerometer, a barometer, or a magnetic compass), a motor, a power supply, and so on. Accordingly, acquiring the flight parameters of the functional component 20 by the processor 210 may include acquiring the attitude angle of the gyroscope, the acceleration of the accelerometer, the air pressure of the barometer, the magnetic field data of the magnetic compass, the rotation speed of the motor, the discharge current of the power supply, the remaining charge of the power supply, and the like.
The processor 210 generates a corresponding virtual image from the flight parameters. For example, the processor 210 may generate paddle shadows from the attitude angle of the gyroscope and the acceleration of the accelerometer to indicate the attitude angle of the unmanned aerial vehicle 100 (e.g., one paddle shadow or one arm light is added to each side of the FPV picture, and the roll angle is indicated by the inclination of the line joining the two). As another example, the processor 210 may generate a graduated indicator-tube image from the barometer reading to indicate the air pressure of the environment in which the unmanned aerial vehicle 100 is currently located. As a further example, the processor 210 may generate, from the magnetic field data of the magnetic compass, a heading-indicator compass image that displays the azimuth in the form of a disk, to indicate the flight heading of the unmanned aerial vehicle 100. As a further example, the processor 210 may generate a rotatable paddle shadow whose rotation speed is determined by the rotation speed of the motor, to indicate the speed of the blade driven by each motor. As a further example, the processor 210 may generate a battery image from the remaining charge of the power supply to indicate the remaining charge, and may also generate a numeric indicator of the remaining usage time of the unmanned aerial vehicle 100 from the remaining charge and the discharge current of the power supply. In the present application, the functional component 20 is exemplified by the gyroscope and the motor; the principle is essentially the same for functional components 20 such as the power supply and the magnetic compass, and is not repeated here.
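By way of illustration only (this sketch and every name in it are assumptions of this write-up, not part of the patent disclosure), the mapping from flight parameters to overlay elements described above could be organized as follows:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FlightParams:
    roll_deg: float         # gyroscope roll attitude angle
    pressure_hpa: float     # barometer reading
    heading_deg: float      # heading derived from magnetic compass data
    motor_rpm: List[float]  # rotation speed, one entry per motor
    battery_pct: float      # remaining charge of the power supply
    current_a: float        # discharge current of the power supply

def build_overlays(p: FlightParams) -> List[Tuple[str, dict]]:
    """Map each flight parameter to a virtual overlay element."""
    return [
        ("paddle_shadows", {"tilt_deg": p.roll_deg}),       # roll indicator
        ("pressure_tube",  {"hpa": p.pressure_hpa}),        # graduated tube
        ("compass_disk",   {"heading_deg": p.heading_deg}), # heading disk
        ("paddle_spin",    {"rpm": p.motor_rpm}),           # per-motor speed
        ("battery_icon",   {"percent": p.battery_pct}),     # remaining charge
    ]
```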
The flight parameters may be those at the current time, or those within a predetermined period before the current time. A virtual image generated from the flight parameters at the current time can indicate the flight state of the functional component 20 at the current time. A virtual image generated from the flight parameters within the predetermined period can indicate the flight state of the functional component 20 over that period: for example, a rotatable paddle shadow may be generated from the motor's average rotation speed over the period; or the discharge current of the power supply may be derived from the change in remaining charge over the period, and a numeric indicator of the remaining usage time of the unmanned aerial vehicle 100 then generated from the discharge current and the remaining charge. The predetermined period may be 1 second, 2 seconds, 30 seconds, 1 minute, 5 minutes, and so on.
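A minimal sketch of the windowed remaining-time estimate described above (the function name and the averaging scheme are assumptions; the patent does not prescribe a formula):

```python
def remaining_time_min(charge_now_mah: float,
                       charge_then_mah: float,
                       window_min: float) -> float:
    """Estimate remaining flight time from charge consumed in a window.

    The average discharge current is derived from the charge drop over
    the predetermined period, then projected onto the remaining charge.
    """
    consumed = charge_then_mah - charge_now_mah      # mAh used in the window
    if consumed <= 0:
        return float("inf")                          # no net discharge
    avg_current_ma = consumed / (window_min / 60.0)  # mAh per hour equals mA
    return charge_now_mah / avg_current_ma * 60.0    # minutes remaining

# e.g. 1200 mAh left now, 1500 mAh one minute ago -> 4.0 minutes of flight
print(remaining_time_min(1200.0, 1500.0, 1.0))
```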
After generating the corresponding virtual image from the flight parameters, the processor 210 adds it to the display image acquired by the shooting device 30, and the display image with the virtual image added is displayed as the FPV picture. This picture can indicate the flight state corresponding to the functional component 20 in real time; for example, a paddle shadow simulating the actual blade indicates the attitude angle of the unmanned aerial vehicle 100, so that the flyer can judge the current flight attitude of the traversing machine, which assists the flyer in controlling it.
In the flight assistance method and flight assistance device 200 of the present application, the flight state is indicated by a virtual image added to the display image. Because the virtual image is generated from the flight parameters of the functional component 20 of the unmanned aerial vehicle, it can accurately indicate the flight state corresponding to that component; for example, a paddle shadow indicates the attitude angle of the unmanned aerial vehicle 100, so the flyer can judge the flight state without the actual blade being filmed. And because the virtual image is added to the display image afterwards for display as the FPV picture, neither the virtual image nor the blade exists in the display image as actually shot, so the quality of the recorded video is better.
Referring to fig. 2 and 3, in some embodiments the functional component 20 includes a gyroscope 21, the virtual image includes a two-dimensional first paddle shadow and a two-dimensional second paddle shadow located on two opposite sides of the display image, the attitude angle of the gyroscope 21 includes the attitude angle of the roll axis, and step 012 includes:
0121: generating the first paddle shadow and the second paddle shadow according to the attitude angle of the roll axis, and determining the inclination angle of the line connecting the first paddle shadow and the second paddle shadow, the inclination angle being the angle of that line relative to the horizontal.
In some embodiments, the processor 210 is further configured to generate a first paddle shadow and a second paddle shadow according to the attitude angle of the roll axis, and determine an inclination angle of a connecting line of the first paddle shadow and the second paddle shadow, where the inclination angle is an angle of the connecting line of the first paddle shadow and the second paddle shadow relative to the horizontal line. That is, step 0121 may be implemented by processor 210.
Specifically, in the FPV picture of the traversing machine, the blades would appear on the two sides of the picture. As in the FPV picture displayed by the display 400 in fig. 4, take the horizontal direction of the FPV picture as the X direction and the vertical direction as the Y direction; as the attitude angle of the unmanned aerial vehicle 100 changes, the Y-direction positions of the blades on the two sides of the FPV picture change accordingly.
The gyroscope 21 may be provided on the fuselage 10 of the unmanned aerial vehicle 100 to detect the attitude angle of the fuselage 10; or on the pan/tilt head 500, which is mounted on the fuselage 10, to detect the attitude angle of the pan/tilt head 500; or on both at once, i.e., at least two gyroscopes 21 are provided, one on the fuselage 10 and one on the pan/tilt head 500, to detect the attitude angle of the fuselage 10 and the attitude angle of the pan/tilt head 500 respectively. In the embodiment shown in fig. 2, the gyroscope 21 is disposed on the fuselage 10 of the unmanned aerial vehicle 100, which is not limiting.
The operation modes of the pan/tilt head 500 include a stability-augmentation mode and a following mode; the pan/tilt head 500 can stay in the stability-augmentation mode, stay in the following mode, or switch between the two. The stability-augmentation mode means that the pan/tilt head 500 always keeps a preset reference direction (e.g., the horizontal direction) stable: it applies negative-feedback adjustment against the roll, yaw, and pitch motion of the unmanned aerial vehicle 100 to cancel the shake those motions would otherwise introduce, keeping the load carried on the pan/tilt head 500 (e.g., the shooting device 30) stable. Taking pitch as an example, when the unmanned aerial vehicle 100 pitches up in the stability-augmentation mode, the camera does not pitch up with it but keeps the original shooting angle (generally horizontal), because the pan/tilt head 500 applies negative-feedback adjustment to keep the shooting device 30 horizontal at all times. Negative-feedback adjustment here means that when the unmanned aerial vehicle 100 noses up by 15 degrees, the pan/tilt head 500 drives the shooting device 30 to pitch down by 15 degrees relative to the fuselage so that the shooting device stays horizontal, thereby stabilizing it; the negative-feedback principle for yaw and roll is essentially the same as for pitch. The following mode means that the pan/tilt head 500 moves together with the unmanned aerial vehicle 100 so that the relative angle between the shooting device 30 and the unmanned aerial vehicle 100 stays constant. Again taking pitch as an example, in the following mode, when the user pitches the unmanned aerial vehicle 100 up by 20 degrees, the pan/tilt head 500 lets the shooting device 30 pitch up by 20 degrees as well, so that the relative angle between the shooting device 30 and the unmanned aerial vehicle 100 remains substantially constant.
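The two behaviours can be summarized in a small sketch (all names are assumptions of this write-up; angles are in degrees and the sign convention is chosen for illustration):

```python
def gimbal_pitch_cmd(vehicle_pitch_deg: float, mode: str,
                     hold_deg: float = 0.0, offset_deg: float = 0.0) -> float:
    """Camera pitch command relative to the fuselage.

    "stabilize": negative feedback cancels the vehicle's motion so the
    camera holds hold_deg in the world frame (vehicle +15 deg -> -15 deg).
    "follow": the camera keeps a fixed angle relative to the vehicle,
    so it pitches together with it.
    """
    if mode == "stabilize":
        return hold_deg - vehicle_pitch_deg
    if mode == "follow":
        return offset_deg
    raise ValueError(f"unknown mode: {mode}")

print(gimbal_pitch_cmd(15.0, "stabilize"))  # -15.0: pitch down to stay level
print(gimbal_pitch_cmd(20.0, "follow"))     # 0.0: camera moves with vehicle
```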
Referring to fig. 4, in this embodiment the inclination angle of the line connecting the first paddle shadow S1 and the second paddle shadow S2 is determined by the attitude angle of the roll axis; the first paddle shadow S1 sits on the left side of the FPV picture and the second paddle shadow S2 on the right. Referring to fig. 5, when the unmanned aerial vehicle 100 rolls counterclockwise, the first paddle shadow S1 hugs the left edge of the FPV picture and moves opposite to the Y direction, while the second paddle shadow S2 hugs the right edge and moves along the Y direction; conversely, referring to fig. 6, when the unmanned aerial vehicle 100 rolls clockwise, the first paddle shadow S1 moves along the Y direction against the left edge and the second paddle shadow S2 moves opposite to the Y direction against the right edge. In one embodiment, the line connecting the first paddle shadow S1 and the second paddle shadow S2 passes through the center of the FPV picture, and its inclination angle is the angle of that line relative to a horizontal line through the center of the FPV picture. The roll attitude angle of the unmanned aerial vehicle 100 can thus be read from the inclination of the line connecting the first paddle shadow S1 and the second paddle shadow S2.
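Assuming the line through the screen centre is tilted by exactly the roll angle (a simple reading of the embodiment; the exact mapping and the sign/axis conventions are not fixed by the text), the edge positions of the two paddle shadows could be computed as:

```python
import math

def paddle_shadow_y(roll_deg: float, width: int, height: int):
    """Y coordinates of the two edge-hugging paddle shadows.

    The line joining the shadows passes through the screen centre and is
    tilted by the roll attitude angle; evaluating it at the left and
    right edges gives each shadow's vertical position.
    """
    cy = height / 2
    dy = (width / 2) * math.tan(math.radians(roll_deg))
    y_left = cy - dy    # first paddle shadow S1, left edge
    y_right = cy + dy   # second paddle shadow S2, right edge
    return y_left, y_right

# 10 degrees of roll on a 1280x720 frame
print(paddle_shadow_y(10.0, 1280, 720))
```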
Referring to fig. 7, when a third paddle shadow S3 and a fourth paddle shadow S4 represent the roll attitude angle of the pan/tilt head 500: if the pan/tilt head 500 is currently in the stability-augmentation mode, the inclination angle of the line connecting S3 and S4 stays constant (e.g., 0 degrees); in the following mode, S3 and S4 move together with the first paddle shadow S1 and second paddle shadow S2 of the unmanned aerial vehicle 100, so the inclination angles of the two lines are substantially the same.
Referring to fig. 7, in one example the line connecting the first paddle shadow S1 and the second paddle shadow S2 of the unmanned aerial vehicle 100 passes through the center of the upper half of the FPV picture, its inclination angle being measured relative to a horizontal line through that center; the line connecting the third paddle shadow S3 and the fourth paddle shadow S4 of the pan/tilt head 500 passes through the center of the lower half of the FPV picture, its inclination angle being measured relative to a horizontal line through that center. In this way, the roll attitude angle of the unmanned aerial vehicle 100 and the roll attitude angle of the pan/tilt head 500 are displayed simultaneously.
Referring to fig. 2, 5 and 8, in some embodiments the functional component 20 includes a motor 22, the virtual image includes a two-dimensional first paddle shadow S1 and a two-dimensional second paddle shadow S2 located on two opposite sides of the display image, and step 012 further includes:
0122: generating the rotatable first paddle shadow S1 and second paddle shadow S2 according to the rotation speed of the motor 22, and determining the rotation speeds of the first paddle shadow S1 and the second paddle shadow S2.
In certain embodiments, the processor 210 is further configured to generate the rotatable first paddle shadow S1 and second paddle shadow S2 based on the rotation speed of the motor 22, and determine the rotation speeds of the first paddle shadow S1 and the second paddle shadow S2. That is, step 0122 can be implemented by the processor 210.
Specifically, in the embodiment of the present application the actual blade is not captured by the shooting device 30, so the flyer cannot tell whether the blades are rotating normally or at what speed; when a motor 22 is damaged, the blade may slow down or even stop rotating. The processor 210 therefore generates the rotatable first paddle shadow S1 and second paddle shadow S2 from the rotation speed of the motor 22 to indicate that speed. When S1 and S2 stop rotating or their speed suddenly becomes extremely slow, this indicates that a motor 22 may have a problem; the traversing machine is then no longer fit to fly and should be landed as soon as possible for maintenance. In one embodiment, the rotation speeds of the first paddle shadow S1 and the second paddle shadow S2 are determined by the lowest speed among the corresponding one or more motors 22. For example, on a four-rotor traversing machine the blades form two groups of two, each blade driven by one motor 22; the first paddle shadow S1 follows the lowest motor speed in the first group and the second paddle shadow S2 the lowest in the second, so the flyer can promptly spot a motor 22 that may be damaged (e.g., whose speed is at or below a predetermined speed). Of course, more paddle shadows may be used, e.g., two first paddle shadows S1 and two second paddle shadows S2 with one motor 22 each, so that the possibly damaged motor 22 can be identified precisely from the rotation speeds of the paddle shadows.
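A sketch of the per-group speed logic (the group assignment, threshold value, and all names are assumptions of this write-up):

```python
PREDETERMINED_RPM = 1000.0  # assumed damage threshold

def paddle_shadow_rpm(group1_rpm, group2_rpm):
    """Drive each paddle shadow with the lowest speed of its motor group,
    so a failing motor immediately slows the corresponding shadow."""
    s1 = min(group1_rpm)   # first paddle shadow S1
    s2 = min(group2_rpm)   # second paddle shadow S2
    warnings = [rpm <= PREDETERMINED_RPM for rpm in (s1, s2)]
    return s1, s2, warnings

# four-rotor traversing machine, two motors per group; one motor failing
print(paddle_shadow_rpm([9800.0, 9750.0], [9900.0, 450.0]))
```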
In the method shown in fig. 8, step 012 includes steps 0121 and 0122; that is, from the display image with the virtual image added, the flyer can read both the roll attitude angle of the traversing machine and the rotation speed (or damage condition) of its motors. It should be understood that step 012 of the flight assistance method may instead include only step 0121, only step 0122, other possible steps, or any combination of these, which is not limited here.
Referring to fig. 2, 9 and 10, in some embodiments, the virtual image includes a three-dimensional aircraft image S5, the attitude angle of the gyroscope 21 includes an attitude angle of a roll axis, an attitude angle of a pitch axis and an attitude angle of a yaw axis, and the step 012 further includes:
0123: the aircraft image S5 is generated from the attitude angle of the gyroscope 21, and the roll attitude angle, pitch attitude angle, and yaw attitude angle of the aircraft image S5 are determined from the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis, respectively.
In some embodiments, the processor 210 is further configured to generate the vehicle image S5 according to the attitude angle of the gyroscope 21, and determine a roll attitude angle, a pitch attitude angle, and a yaw attitude angle of the vehicle image S5 according to the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis, respectively. That is, step 0123 can be implemented by processor 210.
Specifically, to indicate accurately the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis of the unmanned aerial vehicle 100, a three-dimensional aircraft image S5 may be generated from the attitude angle of the gyroscope 21, with the roll attitude angle, pitch attitude angle, and yaw attitude angle of the three-dimensional aircraft image S5 each kept consistent with the corresponding attitude angle of the gyroscope 21. In this way, the aircraft image S5 changes in synchrony with the attitude of the unmanned aerial vehicle 100, and from the attitude changes of the three axes (roll, pitch, and yaw) of the virtual three-dimensional aircraft image S5 the flyer can accurately judge the three-axis attitude changes of the unmanned aerial vehicle 100, which assists the flyer in controlling it better.
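For illustration, keeping the three attitude angles of the aircraft image S5 consistent with the gyroscope amounts to orienting the 3D model with a rotation built from roll, pitch, and yaw; a yaw-pitch-roll (Z-Y-X) convention is assumed here, since the text does not specify one:

```python
import math

def euler_to_matrix(roll, pitch, yaw):
    """Rotation matrix in the Z-Y-X (yaw, then pitch, then roll) order,
    used to orient the three-dimensional aircraft image so it mirrors
    the gyroscope's attitude angles. Angles are in radians."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

# apply the same matrix to every vertex of the aircraft model each frame
print(euler_to_matrix(math.radians(10), math.radians(5), math.radians(30)))
```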
Referring to fig. 2 and 11, in some embodiments, a flight assistance method includes:
014: parameters of the virtual image are adjusted so that the content of the display image is not obscured by the virtual image.
In some embodiments, the processor 210 is further configured to adjust parameters of the virtual image such that the content of the display image is not obscured by the virtual image. That is, step 014 may be implemented by processor 210.
Specifically, since the virtual image is added to the display picture, the processor 210 can adjust parameters of the virtual image so that, while the flight state of the unmanned aerial vehicle 100 is still indicated, the influence of the virtual image on the display picture is minimized (e.g., it occludes as little of the picture as possible). The parameters may be the color, size, and/or transparency of the virtual image, and the processor 210 may adjust any one of these parameters or any combination of them. In this embodiment, the color, size, and transparency of the virtual image are all adjustable by the processor 210.
The processor 210 may adjust the color of the virtual image according to the color of the display image acquired by the shooting device 30, the size of the virtual image according to the size of the display image, and the transparency of the virtual image according to the color of the display image. For example, when most of the display image (more than 70% of the pixels) is blue, the virtual image is displayed in a color far from blue (e.g., red or green) so that the user can spot it quickly. Meanwhile, the processor 210 scales the virtual image to the display image (e.g., to 1/20 of its size) so that the virtual image does not occlude too large an area of the display image, and adjusts the transparency so that the flyer can still observe the region of the display image behind the virtual image. It will be appreciated that when the colors of the virtual image and the display image are similar (e.g., 70% or more of the pixels match in color), the transparency must be set higher for the two to remain well distinguishable, whereas when the colors are far apart (e.g., 30% or fewer of the pixels match), a lower transparency already distinguishes them well; so when the virtual image and the display image are displayed in different colors, the transparency of the virtual image can be set low (e.g., 20%) and the flyer can still spot it quickly. In this way, the processor 210 adjusts the parameters of the virtual image so that the adjusted virtual image can be noticed quickly by the flyer without obscuring the displayed content of the display image.
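The color/size/transparency policy above could be sketched as follows, using the 70%/30% thresholds and the 1/20 size ratio from the description (the concrete alpha values and the complement-color rule are assumptions of this write-up):

```python
def overlay_style(dominant_rgb, same_color_share, frame_w, frame_h):
    """Derive overlay color, size, and transparency from the frame.

    dominant_rgb: dominant color of the display image;
    same_color_share: fraction of pixels matching the overlay's color.
    """
    color = tuple(255 - c for c in dominant_rgb)  # contrasting color
    size = (frame_w // 20, frame_h // 20)         # 1/20 of the display image
    if same_color_share >= 0.7:
        alpha = 0.8    # similar colors: raise transparency to distinguish
    elif same_color_share <= 0.3:
        alpha = 0.2    # distinct colors: mostly opaque, easy to spot
    else:
        alpha = 0.5
    return color, size, alpha

# mostly-blue frame at 1280x720
print(overlay_style((0, 0, 200), 0.1, 1280, 720))
```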
In other embodiments, the processor 210 is further configured to receive an adjustment instruction and adjust the parameters of the virtual image accordingly. The adjustment instruction may be input by the flyer on the remote controller 300, or on a terminal (e.g., a mobile phone or tablet computer) connected to the remote controller 300. The remote controller 300 may be provided with a mounting bracket on which the terminal sits, giving a wired connection to the remote controller 300; or the terminal may connect to the remote controller 300 through a data line; or the terminal may connect wirelessly (e.g., via Bluetooth, or via a cellular mobile network or wireless local area network). In this way, the virtual image can be configured to best suit each flyer's personal preferences.
Referring to fig. 2 and 12, in some embodiments, step 013 includes:
0131: acquiring video data, wherein the video data comprises one or more frames of display images and timestamps corresponding to the display images of each frame; and
0132: adding, to the display image corresponding to the timestamp, a virtual image generated from the flight parameters of the functional component 20 acquired at the same time as that timestamp, to indicate the flight state of the functional component 20 at the time of the timestamp.
In some embodiments, the processor 210 is further configured to obtain video data, where the video data includes one or more frames of display images and a timestamp corresponding to each frame of display image; and adding a virtual image generated from the flight parameters of the functional component 20 acquired at the same time as the time stamp to the display image corresponding to the time stamp to indicate the flight state of the functional component 20 at the time of the time stamp. That is, step 0131 and step 0132 may be implemented by processor 210.
Specifically, when adding the virtual image to the display image, the flight state indicated by the virtual image must match the display image currently shown. The processor 210 therefore acquires the video data captured by the shooting device 30; the video data includes one or more frames of display images, each frame carrying a timestamp (hereinafter the first timestamp) that records when the display image was captured. Since the delay between capturing the video data and displaying the display image is kept extremely small, the capture time can essentially be regarded as the display time. Likewise, each flight parameter carries a timestamp (hereinafter the second timestamp). A display image and the flight parameters whose second timestamp equals its first timestamp correspond one to one, so the virtual image generated from the flight parameters acquired at the second timestamp matching the first timestamp can indicate the flight state of the unmanned aerial vehicle 100 at the first timestamp. When this virtual image is added to the display image of that first timestamp, the flyer can determine, by watching the display picture with the virtual image added, the flight state of the unmanned aerial vehicle 100 at the first timestamp.
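A sketch of the timestamp pairing (nearest-match selection is an assumption of this write-up; the text only requires first and second timestamps at the same time to correspond):

```python
import bisect

def match_params(frame_ts: float, param_ts, params):
    """Pair a display image with the flight parameters whose (second)
    timestamp is nearest to the frame's (first) timestamp.

    param_ts must be sorted ascending; params has the same length.
    """
    i = bisect.bisect_left(param_ts, frame_ts)
    if i == 0:
        return params[0]
    if i == len(param_ts):
        return params[-1]
    before, after = param_ts[i - 1], param_ts[i]
    return params[i] if after - frame_ts < frame_ts - before else params[i - 1]

# frame captured at t=10.02 s matches the sample taken at t=10.0 s
print(match_params(10.02, [9.9, 10.0, 10.1], ["p0", "p1", "p2"]))
```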
It should be understood that the flow diagrams of the flight assistance methods illustrated in FIGS. 11-12 are presented by way of example only, and not by way of limitation. The flight assistance method of the present application may further include any combination of steps 011, 0121, 0122, 0123, 0131, 0132, 014, or other possible steps, and is not limited thereto.
Referring to fig. 2 and 13, in some embodiments, the flight assistance method further includes:
015: the display image to which the virtual image has been added is transmitted to display a corresponding display screen on the display 400.
In some embodiments, the processor 210 is further configured to send the display image with the virtual image added thereto to display a corresponding display screen on the display 400. That is, step 015 may be implemented by processor 210.
Specifically, the display image is generally displayed by a display 400. The display 400 may be built into the remote controller 300 itself, or be the display 400 of a terminal connected to the remote controller 300 or to the unmanned aerial vehicle 100. The terminal may be, for example, a mobile phone, a tablet computer, or a smart wearable device (such as FPV glasses); it may be mounted on the remote controller 300, connected to it through a data line, or connected to the remote controller 300 or the unmanned aerial vehicle 100 through a cellular mobile network or wireless local area network.
After the virtual image is added to the display image, the processor 210 sends the display image to which the virtual image has been added, so that the corresponding display picture (e.g., the FPV picture) is shown on the display 400. The unmanned aerial vehicle 100 may include an image transmission module 40, the device in the unmanned aerial vehicle 100 that transmits images wirelessly. For example, referring to fig. 14, when the display 400 is built into the remote controller 300 and the flight assistance device 200 is disposed on the remote controller 300, the image transmission module 40 of the unmanned aerial vehicle 100 directly acquires the video data (display images) and transmits it to the remote controller 300; the processor 210 acquires the flight parameters of the unmanned aerial vehicle 100 through wireless communication, generates the virtual image from the flight parameters, and adds it to the display image; the display 400 of the remote controller 300 then shows the corresponding display picture according to the display image with the virtual image added (that is, the processor 210 acquires the display image from the image transmission module 40 and sends the display image with the virtual image added to the display 400, so that the corresponding display picture is shown on the display 400). Referring to fig. 15, when the display 400 is built into the remote controller 300 and the flight assistance device 200 is disposed on the unmanned aerial vehicle 100, the processor 210 acquires the video data (display images) and the flight parameters, generates the virtual image from the flight parameters, and adds it to the display image; the display image with the virtual image added is sent to the image transmission module 40 and transmitted to the remote controller 300, and the display 400 of the remote controller 300 shows the corresponding display picture according to it.
For another example, referring to fig. 16, when the display 400 belongs to a terminal connected to the remote controller 300 and the flight assistance device 200 is disposed on the remote controller 300: the image transmission module 40 of the unmanned aerial vehicle 100 directly acquires the video data (display images) and transmits it to the remote controller 300 (i.e., the processor 210 acquires the display image from the image transmission module 40); the processor 210 acquires the flight parameters of the functional component 20 of the unmanned aerial vehicle 100 through wireless communication, generates the virtual image from the flight parameters, and adds it to the display image; the remote controller 300 then sends the display image with the virtual image added to the terminal (e.g., the FPV glasses in fig. 16). That is, the display image with the virtual image added is obtained by acquiring the flight parameters of the functional component 20, generating the virtual image from those parameters, and adding it to the display image, and the result can indicate the flight state corresponding to the functional component 20. The display 400 of the FPV glasses shows the corresponding display picture according to the display image (i.e., the display 400 receives the display image with the virtual image added and displays the corresponding picture). Referring again to fig. 2, when the display 400 belongs to a terminal connected to the remote controller 300 and the flight assistance device 200 is disposed on the unmanned aerial vehicle 100: the processor 210 acquires the video data (display images) and flight parameters, generates the virtual image from the flight parameters, adds it to the display image, and sends the result to the image transmission module 40; the image transmission module 40 transmits the display image with the virtual image added to the remote controller 300, which forwards it to the terminal; the display 400 of the terminal then shows the corresponding display picture according to the display image (i.e., the display 400 receives the display image with the virtual image added and displays the corresponding picture). It will be appreciated that in another embodiment the image transmission module 40 may also transmit the display image with the virtual image added directly to the terminal, to show the corresponding display picture on the display 400 of the terminal.
For another example, referring to fig. 17, when the display 400 belongs to a terminal connected to the unmanned aerial vehicle 100 and the flight assistance device 200 is disposed on the remote controller 300: after acquiring the video data (display images) and the flight parameters, the processor 210 generates the virtual image from the flight parameters and adds it to the display image; the remote controller 300 sends the display image with the virtual image added to the image transmission module 40, which transmits it to the terminal; the display 400 of the terminal then shows the corresponding display picture according to the display image.
The display 400 for the unmanned aerial vehicle 100 according to the embodiments of the present application receives a display image with a virtual image added and shows the corresponding display picture. The unmanned aerial vehicle 100 includes the functional component 20; the display image with the virtual image added is obtained by acquiring the flight parameters of the functional component 20, generating a virtual image from those parameters, and adding it to the display image, and it can indicate the flight state corresponding to the functional component 20.
Referring to fig. 18, embodiments of the present application provide a computer-readable storage medium 500 containing computer-executable instructions 502; when the computer-executable instructions 502 are executed by one or more processors 210, the processors 210 perform the flight assistance method of any of the embodiments described above.
For example, referring to fig. 1 and 2 in conjunction, when the computer-executable instructions 502 are executed by the processor 210, the processor 210 performs the following steps:
011: acquiring flight parameters of the functional component 20;
012: generating a virtual image according to the flight parameters; and
013: adding the virtual image to the display image to indicate the flight state corresponding to the functional component 20.
For another example, referring to fig. 2 and 3, when the computer-executable instructions 502 are executed by the processor 210, the processor 210 performs the following step:
0121: generating the first paddle shadow and the second paddle shadow according to the attitude angle of the roll axis, and determining the inclination angle of the line connecting the first paddle shadow and the second paddle shadow.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes additional implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in the reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art to which the present application pertains.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium could even be paper or another suitable medium upon which the program is printed, since the program can be electronically captured (for instance, via optical scanning of the paper or other medium), then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (67)

1. A flight assistance method for an unmanned aerial vehicle, the unmanned aerial vehicle comprising a functional component, the flight assistance method comprising:
acquiring flight parameters of the functional component;
generating a virtual image according to the flight parameters; and
adding the virtual image to a display image to indicate the flight state corresponding to the functional component.
2. The flight assistance method according to claim 1, wherein the functional component includes a gyroscope, the flight parameter includes an attitude angle of the gyroscope, and the acquiring the flight parameter of the functional component includes:
acquiring the attitude angle of the gyroscope.
3. The flight assistance method according to claim 2, wherein the gyroscope is provided on a fuselage of the unmanned aerial vehicle and/or on a gimbal on the unmanned aerial vehicle.
4. The flight assistance method according to claim 2, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow being located on opposite sides of the display image, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, and the generating the virtual image according to the flight parameters comprises:
generating the first paddle shadow and the second paddle shadow according to the attitude angle of the roll axis, and determining the inclination angle of the line connecting the first paddle shadow and the second paddle shadow, wherein the inclination angle is the angle of the line connecting the first paddle shadow and the second paddle shadow relative to the horizontal line.
5. The flight assistance method of claim 1, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the obtaining the flight parameter of the functional component comprises:
acquiring the rotational speed of the motor.
6. The flight assistance method of claim 5, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow being located on opposite sides of the display image, and wherein generating the virtual image according to the flight parameters comprises:
generating the rotatable first paddle shadow and the rotatable second paddle shadow according to the rotational speed of the motor, and determining the rotational speed of the first paddle shadow and the second paddle shadow.
7. The flight assistance method of claim 2, wherein the virtual image comprises a three-dimensional image of an aircraft, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, an attitude angle of a pitch axis, and an attitude angle of a yaw axis, and the generating a virtual image according to the flight parameters comprises:
generating the aircraft image according to the attitude angle of the gyroscope, and determining the roll attitude angle, the pitch attitude angle, and the yaw attitude angle of the aircraft image according to the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis, respectively.
8. The flight assistance method of claim 1, further comprising:
adjusting parameters of the virtual image such that content of the display image is not obscured by the virtual image.
9. The flight assistance method according to claim 8, wherein the parameters include color, size, and/or transparency.
10. The flight assistance method of claim 1, wherein the adding the virtual image to a display image to indicate the flight status corresponding to the functional component comprises:
acquiring video data, wherein the video data comprises one or more frames of display images and a timestamp corresponding to each frame of display image; and
adding the virtual image, generated according to the flight parameters of the functional component acquired at the time indicated by the timestamp, to the display image corresponding to the timestamp, so as to indicate the flight state of the functional component at that timestamp.
11. The flight assistance method of claim 1, further comprising:
sending the display image to which the virtual image has been added, so as to display a corresponding display screen on a display.
12. A flight assistance apparatus for an unmanned aerial vehicle, the unmanned aerial vehicle comprising functional components, the flight assistance apparatus comprising a processor configured to:
acquiring flight parameters of the functional component;
generating a virtual image according to the flight parameters; and
adding the virtual image to a display image to indicate the flight state corresponding to the functional component.
13. The flight assistance apparatus of claim 12, wherein the functional component comprises a gyroscope, the flight parameter comprises an attitude angle of the gyroscope, and the processor is further configured to acquire the attitude angle of the gyroscope.
14. The flight assistance apparatus of claim 13, wherein the gyroscope is provided on a fuselage of the unmanned aerial vehicle and/or on a gimbal on the unmanned aerial vehicle.
15. The flight assistance apparatus of claim 13, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow are located on opposite sides of the display image, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, and the processor is further configured to generate the first paddle shadow and the second paddle shadow according to the attitude angle of the roll axis and determine an inclination angle of the line connecting the first paddle shadow and the second paddle shadow, the inclination angle being the angle of the line connecting the first paddle shadow and the second paddle shadow relative to the horizontal line.
16. The flight assistance apparatus of claim 12, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the processor is further configured to acquire the rotational speed of the motor.
17. The flight assistance apparatus of claim 16, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow being located on opposite sides of the display image, and the processor is further configured to generate the rotatable first paddle shadow and the rotatable second paddle shadow according to the rotational speed of the motor and determine the rotational speed of the first paddle shadow and the second paddle shadow.
18. The flight assistance apparatus of claim 13, wherein the virtual image comprises a three-dimensional image of an aircraft, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, an attitude angle of a pitch axis, and an attitude angle of a yaw axis, and the processor is further configured to generate the image of the aircraft based on the attitude angle of the gyroscope, and determine the roll attitude angle, the pitch attitude angle, and the yaw attitude angle of the image of the aircraft based on the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis, respectively.
19. The flight assistance apparatus of claim 12, wherein the processor is further configured to receive an adjustment instruction and adjust parameters of the virtual image according to the adjustment instruction, such that the content of the display image is not obscured by the virtual image.
20. The flight assistance apparatus of claim 19, wherein the parameters include color, size, and/or transparency.
21. The flight assistance apparatus of claim 12, wherein the unmanned aerial vehicle further comprises a camera device for capturing video data, and the processor is further configured to:
acquire the video data, wherein the video data comprises one or more frames of display images and a timestamp corresponding to each frame of display image; and
add the virtual image, generated according to the flight parameters of the functional component acquired at the time indicated by the timestamp, to the display image corresponding to the timestamp, so as to indicate the flight state of the functional component at that timestamp.
22. The flight assistance apparatus of claim 12, wherein the processor is further configured to send the display image to which the virtual image has been added, so as to display a corresponding display screen on a display.
23. An unmanned aerial vehicle, comprising:
a body;
a functional part provided on the body; and
a flight assistance device disposed on the fuselage, the flight assistance device including a processor for:
acquiring flight parameters of the functional component;
generating a virtual image according to the flight parameters; and
adding the virtual image to a display image to indicate the flight state corresponding to the functional component.
24. The UAV of claim 23, wherein the functional component comprises a gyroscope, the flight parameter comprises an attitude angle of the gyroscope, and the processor is further configured to acquire the attitude angle of the gyroscope.
25. The UAV according to claim 24, wherein the gyroscope is provided on a fuselage of the UAV and/or on a gimbal of the UAV.
26. The UAV of claim 24, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow are located on opposite sides of the display image, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, and the processor is further configured to generate the first paddle shadow and the second paddle shadow according to the attitude angle of the roll axis and determine an inclination angle of the line connecting the first paddle shadow and the second paddle shadow, the inclination angle being the angle of the line connecting the first paddle shadow and the second paddle shadow relative to the horizontal line.
27. The UAV of claim 23, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the processor is further configured to acquire the rotational speed of the motor.
28. The UAV of claim 27, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow located on opposite sides of the display image, and the processor is further configured to generate the rotatable first paddle shadow and the rotatable second paddle shadow based on the rotational speed of the motor and determine the rotational speeds of the first paddle shadow and the second paddle shadow.
29. The UAV of claim 24, wherein the virtual image comprises a three-dimensional image of an aircraft, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, an attitude angle of a pitch axis, and an attitude angle of a yaw axis, and the processor is further configured to generate the image of the aircraft based on the attitude angle of the gyroscope and determine the roll attitude angle, the pitch attitude angle, and the yaw attitude angle of the image of the aircraft based on the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis, respectively.
30. The UAV of claim 23, wherein the processor is further configured to receive an adjustment instruction and to adjust parameters of the virtual image in accordance with the adjustment instruction, such that the content of the display image is not obscured by the virtual image.
31. The UAV of claim 30, wherein the parameters include color, size, and/or transparency.
32. The UAV of claim 23, further comprising a camera configured to capture video data, wherein the processor is further configured to:
acquire the video data, wherein the video data comprises one or more frames of display images and a timestamp corresponding to each frame of display image; and
add the virtual image, generated according to the flight parameters of the functional component acquired at the time indicated by the timestamp, to the display image corresponding to the timestamp, so as to indicate the flight state of the functional component at that timestamp.
33. The UAV of claim 23, further comprising an image transmission module, wherein the processor is configured to send the display image to which the virtual image has been added to the image transmission module, and the image transmission module is configured to transmit that display image to a display so as to display a corresponding display screen on the display.
34. The UAV of claim 23, wherein the UAV comprises a crossing machine.
35. A remote control for a UAV, the UAV comprising a functional component, the remote control comprising a flight assistance device, the flight assistance device comprising a processor configured to:
acquiring flight parameters of the functional component;
generating a virtual image according to the flight parameters; and
adding the virtual image to a display image to indicate the flight state corresponding to the functional component.
36. The remote control of claim 35, wherein the functional component comprises a gyroscope, wherein the flight parameter comprises an attitude angle of the gyroscope, and wherein the processor is further configured to obtain the attitude angle of the gyroscope.
37. The remote control of claim 36, wherein the gyroscope is disposed on a fuselage of the UAV and/or on a gimbal of the UAV.
38. The remote control of claim 36, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow are located on opposite sides of the display image, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, and the processor is further configured to generate the first paddle shadow and the second paddle shadow according to the attitude angle of the roll axis and determine an inclination angle of the line connecting the first paddle shadow and the second paddle shadow, the inclination angle being the angle of the line connecting the first paddle shadow and the second paddle shadow with respect to the horizontal line.
39. The remote control of claim 35, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the processor is further configured to obtain the rotational speed of the motor.
40. The remote control of claim 39, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow located on opposite sides of the display image, and the processor is further configured to generate the rotatable first paddle shadow and the rotatable second paddle shadow based on the rotational speed of the motor and determine the rotational speeds of the first paddle shadow and the second paddle shadow.
41. The remote control of claim 36, wherein the virtual image comprises a three-dimensional image of an aircraft, wherein the attitude angle of the gyroscope comprises an attitude angle of a roll axis, an attitude angle of a pitch axis, and an attitude angle of a yaw axis, and wherein the processor is further configured to generate the image of the aircraft based on the attitude angle of the gyroscope and determine the roll attitude angle, the pitch attitude angle, and the yaw attitude angle of the image of the aircraft based on the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis, respectively.
42. The remote control of claim 35, wherein the processor is further configured to receive an adjustment instruction and adjust parameters of the virtual image according to the adjustment instruction, such that the content of the display image is not obscured by the virtual image.
43. The remote control of claim 42, wherein the parameters include color, size, and/or transparency.
44. The remote control of claim 35, wherein the UAV further comprises a camera for capturing video data, and wherein the processor is further configured to:
acquire the video data, wherein the video data comprises one or more frames of display images and a timestamp corresponding to each frame of display image; and
add the virtual image, generated according to the flight parameters of the functional component acquired at the time indicated by the timestamp, to the display image corresponding to the timestamp, so as to indicate the flight state of the functional component at that timestamp.
45. The remote control of claim 35, wherein the UAV further comprises an image transmission module, and wherein the processor is configured to acquire the display image from the image transmission module and send the display image to which the virtual image has been added to a display, so as to display a corresponding display screen on the display.
46. A display for an unmanned aerial vehicle, the display being configured to receive a display image to which a virtual image has been added to display a corresponding display screen, the unmanned aerial vehicle including a functional component, the display image to which the virtual image has been added being obtained by acquiring a flight parameter of the functional component, generating a virtual image according to the flight parameter, and adding the virtual image to the display image, the display image to which the virtual image has been added being capable of indicating a flight status corresponding to the functional component.
47. The display of claim 46, wherein the functional component comprises a gyroscope, wherein the flight parameter comprises an attitude angle of the gyroscope, and wherein the obtaining the flight parameter of the functional component comprises: acquiring the attitude angle of the gyroscope.
48. The display of claim 47, wherein the gyroscope is disposed on a fuselage of the UAV and/or on a gimbal of the UAV.
49. The display of claim 47, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow being located on opposite sides of the display image, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, and the generating the virtual image according to the flight parameter comprises:
generating the first paddle shadow and the second paddle shadow according to the attitude angle of the roll axis, and determining the inclination angle of the line connecting the first paddle shadow and the second paddle shadow, wherein the inclination angle is the angle of the line connecting the first paddle shadow and the second paddle shadow relative to the horizontal line.
50. The display of claim 46, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the obtaining the flight parameter of the functional component comprises: acquiring the rotational speed of the motor.
51. The display of claim 50, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow being located on opposite sides of the display image, and wherein generating the virtual image according to the flight parameter comprises:
generating the rotatable first paddle shadow and the rotatable second paddle shadow according to the rotational speed of the motor, and determining the rotational speed of the first paddle shadow and the second paddle shadow.
52. The display of claim 47, wherein the virtual image comprises a three-dimensional image of an aircraft, wherein the attitude angle of the gyroscope comprises an attitude angle of a roll axis, an attitude angle of a pitch axis, and an attitude angle of a yaw axis, and wherein generating the virtual image based on the flight parameters comprises:
generating the aircraft image according to the attitude angle of the gyroscope, and determining the roll attitude angle, the pitch attitude angle, and the yaw attitude angle of the aircraft image according to the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis, respectively.
53. The display of claim 46, wherein parameters of the virtual image are adjustable such that the content of the display image is not obscured by the virtual image.
54. The display of claim 53, wherein the parameters include color, size, and/or transparency.
55. The display of claim 46, wherein the adding the virtual image to the display image comprises:
acquiring video data, wherein the video data comprises one or more frames of display images and a timestamp corresponding to each frame of display image; and
adding the virtual image, generated according to the flight parameters of the functional component acquired at the time indicated by the timestamp, to the display image corresponding to the timestamp, so as to indicate the flight state of the functional component at that timestamp.
56. An unmanned aerial vehicle system, characterized in that the unmanned aerial vehicle system comprises an unmanned aerial vehicle, a remote controller, a flight assistance device, and a display, wherein the unmanned aerial vehicle comprises a functional component, and the flight assistance device is disposed on the unmanned aerial vehicle and/or the remote controller; the flight assistance device comprises a processor configured to: acquire flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to a display image to indicate the flight state corresponding to the functional component; and the display is configured to receive the display image to which the virtual image has been added, so as to display a corresponding display screen.
57. The UAV system of claim 56, wherein the functional component comprises a gyroscope, the flight parameter comprises an attitude angle of the gyroscope, and the processor is further configured to acquire the attitude angle of the gyroscope.
58. The UAV system according to claim 57, wherein the gyroscope is provided on a fuselage of the unmanned aerial vehicle and/or on a gimbal on the unmanned aerial vehicle.
59. The UAV system of claim 57, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow, the first paddle shadow and the second paddle shadow are located on opposite sides of the display image, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, and the processor is further configured to generate the first paddle shadow and the second paddle shadow according to the attitude angle of the roll axis and determine an inclination angle of the line connecting the first paddle shadow and the second paddle shadow, the inclination angle being the angle of the line connecting the first paddle shadow and the second paddle shadow relative to the horizontal line.
60. The UAV system of claim 56, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the processor is further configured to acquire the rotational speed of the motor.
61. The UAV system of claim 60, wherein the virtual image comprises a two-dimensional first paddle shadow and a two-dimensional second paddle shadow located on opposite sides of the display image, and the processor is further configured to generate the rotatable first paddle shadow and the rotatable second paddle shadow according to the rotational speed of the motor and determine the rotational speeds of the first paddle shadow and the second paddle shadow.
62. The UAV system of claim 57, wherein the virtual image comprises a three-dimensional image of an aircraft, the attitude angle of the gyroscope comprises an attitude angle of a roll axis, an attitude angle of a pitch axis, and an attitude angle of a yaw axis, and the processor is further configured to generate the image of the aircraft based on the attitude angle of the gyroscope and to determine the roll attitude angle, the pitch attitude angle, and the yaw attitude angle of the image of the aircraft based on the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis, respectively.
63. The UAV system of claim 56, wherein the processor is further configured to receive adjustment instructions, and adjust parameters of the virtual image according to the adjustment instructions such that content of the display image is not obscured by the virtual image.
64. The UAV system of claim 63, wherein the parameters include color, size, and/or transparency.
65. The UAV system of claim 56, wherein the UAV further comprises a camera configured to capture video data, and wherein the processor is further configured to:
acquire the video data, wherein the video data comprises one or more frames of display images and a timestamp corresponding to each frame of display image; and
add the virtual image, generated according to the flight parameters of the functional component acquired at the time indicated by the timestamp, to the display image corresponding to the timestamp, so as to indicate the flight state of the functional component at that timestamp.
66. The UAV system of claim 56, wherein the UAV further comprises an image transmission module; when the flight assistance device is disposed on the UAV, the processor is configured to send the display image to which the virtual image has been added to the image transmission module, and the image transmission module is configured to transmit that display image to the display so as to display a corresponding display screen on the display;
when the flight assistance device is disposed on the remote controller, the processor is configured to acquire the display image from the image transmission module and to send the display image to which the virtual image has been added to the display, so as to display a corresponding display screen on the display.
67. A computer-readable storage medium containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform a flight assistance method as claimed in any one of claims 1 to 11.
CN202080005564.8A 2020-05-22 2020-05-22 Flight assistance method and device, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium Pending CN112840286A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/091885 WO2021232424A1 (en) 2020-05-22 2020-05-22 Flight assisting method and apparatus, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium

Publications (1)

Publication Number Publication Date
CN112840286A true CN112840286A (en) 2021-05-25

Family

ID=75926589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080005564.8A Pending CN112840286A (en) 2020-05-22 2020-05-22 Flight assistance method and device, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium

Country Status (2)

Country Link
CN (1) CN112840286A (en)
WO (1) WO2021232424A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115938190A (en) * 2022-12-01 2023-04-07 南京芯传汇电子科技有限公司 Portable control terminal of unmanned aerial vehicle control training simulator

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108253966B (en) * 2016-12-28 2021-08-06 昊翔电能运动科技(昆山)有限公司 Three-dimensional simulation display method for flight of unmanned aerial vehicle
CN110187700B (en) * 2019-06-10 2021-01-08 北京科技大学 Bionic flapping wing flying robot remote control system and method based on virtual reality

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493699A (en) * 2009-03-04 2009-07-29 北京航空航天大学 Aerial unmanned plane ultra-viewing distance remote control method
US20180253981A1 (en) * 2012-05-09 2018-09-06 Singularity University Transportation using network of unmanned aerial vehicles
CN103809600A (en) * 2014-03-04 2014-05-21 北京航空航天大学 Human-machine interaction control system of unmanned airship
CN107077113A (en) * 2014-10-27 2017-08-18 深圳市大疆创新科技有限公司 Unmanned vehicle flight display
CN104898697A (en) * 2015-05-18 2015-09-09 国家电网公司 Three-dimensional dynamic model of unmanned plane and control method
CN105045277A (en) * 2015-07-08 2015-11-11 西安电子科技大学 Multiple-UAV operation information display system
US20180048828A1 (en) * 2016-08-11 2018-02-15 Parrot Drones Method for capturing image(s), related computer program and electronic system for capturing a video
CN107734289A (en) * 2016-08-11 2018-02-23 鹦鹉无人机股份有限公司 Method, related computer program and the electronic system for capturing video of capture images
US20200159252A1 (en) * 2018-11-21 2020-05-21 Eagle View Technologies, Inc. Navigating unmanned aircraft using pitch

Also Published As

Publication number Publication date
WO2021232424A1 (en) 2021-11-25

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination