WO2021232424A1 - Flight assistance method and device, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium - Google Patents


Info

Publication number
WO2021232424A1
WO2021232424A1 (PCT/CN2020/091885)
Authority
WO
WIPO (PCT)
Prior art keywords
shadow
propeller
image
attitude angle
display
Prior art date
Application number
PCT/CN2020/091885
Other languages
English (en)
French (fr)
Inventor
翁松伟
梁季光
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/091885
Priority to CN202080005564.8A (CN112840286A)
Publication of WO2021232424A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08 - Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 - Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • This application relates to the technical field of unmanned aerial vehicles, and in particular to a flight assistance method for unmanned aerial vehicles, a flight assistance device for unmanned aerial vehicles, an unmanned aerial vehicle, a remote controller for unmanned aerial vehicles, a display for unmanned aerial vehicles, an unmanned aerial vehicle system, and a computer-readable storage medium.
  • For an unmanned aerial vehicle such as a traversing machine, the first-person view (FPV) image normally shows structural components such as the blades, which helps the operator judge the flight state of the unmanned aerial vehicle (for example, the flight attitude). When the structural components (such as blades) of the UAV are deliberately kept out of frame for a better picture, it becomes more difficult for the operator to judge the flight status (such as the flight attitude) of the UAV.
  • In view of this, the embodiments of the present application provide a flight assistance method for an unmanned aerial vehicle, a flight assistance device for an unmanned aerial vehicle, an unmanned aerial vehicle, a remote controller for an unmanned aerial vehicle, a display for an unmanned aerial vehicle, an unmanned aerial vehicle system, and a computer-readable storage medium.
  • An embodiment of the present application provides a flight assistance method for an unmanned aerial vehicle that includes a functional component. The flight assistance method includes: acquiring flight parameters of the functional component; generating a virtual image according to the flight parameters; and adding the virtual image to the display image to indicate the flight status corresponding to the functional component.
  • An embodiment of the present application also provides a flight assistance device for an unmanned aerial vehicle that includes a functional component. The flight assistance device includes a processor configured to: obtain flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to the display image to indicate the flight status corresponding to the functional component.
  • An embodiment of the present application also provides an unmanned aerial vehicle comprising a fuselage, a functional component, and a flight assistance device, the functional component and the flight assistance device both being arranged on the fuselage. The flight assistance device includes a processor configured to: acquire flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to the display image to indicate the flight status corresponding to the functional component.
  • An embodiment of the present application also provides a remote controller for an unmanned aerial vehicle that includes a functional component. The remote controller includes a flight assistance device with a processor configured to: obtain the flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to the display image to indicate the flight status corresponding to the functional component.
  • An embodiment of the present application also provides a display for an unmanned aerial vehicle. The display is used to receive a display image to which a virtual image has been added and to display the corresponding display screen. The unmanned aerial vehicle includes a functional component. The display image with the virtual image is obtained by acquiring the flight parameters of the functional component, generating a virtual image according to the flight parameters, and adding the virtual image to the display image; the display image to which the virtual image has been added can indicate the flight status corresponding to the functional component.
  • the embodiment of the present application also provides an unmanned aerial vehicle system.
  • the unmanned aerial vehicle system includes an unmanned aerial vehicle, a remote control, a flight assistance device, and a display.
  • The flight assistance device includes a processor configured to: obtain the flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to the display image to indicate the flight status corresponding to the functional component. The display is used to receive the display image to which the virtual image has been added, so as to display the corresponding display screen.
  • An embodiment of the present application also provides a computer-readable storage medium containing computer-executable instructions. When the computer-executable instructions are executed by one or more processors, the processors are caused to execute the flight assistance method of the foregoing embodiments.
  • In the flight assistance method for the unmanned aerial vehicle, the flight assistance device for the unmanned aerial vehicle, the unmanned aerial vehicle, the remote controller for the unmanned aerial vehicle, the display for the unmanned aerial vehicle, the unmanned aerial vehicle system, and the computer-readable storage medium according to the embodiments of the present application, the flight status is indicated by the virtual image added to the display image. Since the virtual image is generated according to the flight parameters of the functional components of the UAV, it can well indicate the flight status corresponding to the functional components, for example the attitude angle corresponding to the functional component (such as the gyroscope). The blades therefore do not need to be photographed, yet the operator can still be assisted in judging the flight status of the unmanned aerial vehicle. And since the virtual image is added to the display image afterwards, neither virtual images nor blades appear in the actually captured video, so the quality of the captured video is better.
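  • As an illustrative sketch only (the names FlightParameters, generate_virtual_image, and add_virtual_image are hypothetical, not from the application), the three steps of the method (acquiring flight parameters, generating a virtual image, and adding it to the display image) might be outlined in Python as:

```python
from dataclasses import dataclass

@dataclass
class FlightParameters:
    """Hypothetical container for readings from the functional components."""
    attitude_roll_deg: float  # gyroscope roll-axis attitude angle
    motor_rpm: float          # motor rotation speed
    remaining_mah: float      # remaining power of the power supply

def generate_virtual_image(params: FlightParameters) -> dict:
    """Step 012: build an overlay description from the flight parameters."""
    return {
        "shadow_tilt_deg": params.attitude_roll_deg,  # shadow line follows the roll angle
        "shadow_rpm": params.motor_rpm,               # shadow rotates with the motor
        "battery_level_mah": params.remaining_mah,
    }

def add_virtual_image(display_frame: dict, overlay: dict) -> dict:
    """Step 013: composite the overlay onto a copy of the display image,
    leaving the originally captured frame untouched."""
    frame = dict(display_frame)
    frame["overlay"] = overlay
    return frame

# Step 011: acquire flight parameters (values are made up for the sketch).
params = FlightParameters(attitude_roll_deg=-12.0, motor_rpm=8500.0, remaining_mah=900.0)
frame = add_virtual_image({"pixels": "..."}, generate_virtual_image(params))
```

Because the overlay is applied to a copy of the frame, the recorded video itself stays free of virtual images, which is the point made above.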
  • FIG. 1 is a schematic flowchart of a flight assistance method according to some embodiments of the present application.
  • Fig. 2 is a schematic structural diagram of an unmanned aerial vehicle system according to some embodiments of the present application.
  • Fig. 3 is a schematic flowchart of a flight assistance method according to some embodiments of the present application.
  • FIGS. 4 to 7 are schematic diagrams of scenes of flight assistance methods according to some embodiments of the present application.
  • FIGS. 8 and 9 are schematic flowcharts of flight assistance methods in some embodiments of the present application.
  • FIG. 10 is a schematic diagram of a scene of a flight assistance method according to some embodiments of the present application.
  • FIGS. 11 to 13 are schematic flowcharts of flight assistance methods in some embodiments of the present application.
  • FIGS. 14 to 17 are schematic diagrams of the structure of the unmanned aerial vehicle system according to some embodiments of the present application.
  • FIG. 18 is a schematic diagram of the connection between a processor and a computer-readable storage medium in some embodiments of the present application.
  • The terms "first" and "second" are used for description purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, features defined with "first" and "second" may explicitly or implicitly include one or more of those features. In the description of the present application, "multiple" means two or more, unless otherwise specifically defined.
  • The term "connection" should be understood in a broad sense unless otherwise clearly specified and limited: it can be a fixed connection, a detachable connection, or an integral connection; it can be a mechanical connection, an electrical connection, or mutual communication; it can be a direct connection or an indirect connection through an intermediate medium; and it can be the internal communication of two components or an interaction relation between two components. The specific meaning of "connection" should be understood according to the specific circumstances.
  • The UAV 100 is widely used in aerial photography, agriculture, plant protection, micro selfies, express transportation, disaster rescue, wildlife observation, monitoring of infectious diseases, surveying and mapping, news reporting, power inspection, disaster relief, film and television shooting, creating romantic scenes, and so on.
  • In these application scenarios, the unmanned aerial vehicle 100 is mainly controlled through the remote controller 300 to perform various flight tasks.
  • the unmanned aerial vehicle 100 includes a fuselage 10 and a photographing device 30 mounted on the fuselage 10.
  • The photographing device 30 is used to obtain environmental images, from which the current environment of the unmanned aerial vehicle 100 can be understood, so that the remote controller 300 can be used to control the unmanned aerial vehicle 100 to perform various flight tasks.
  • There is a type of unmanned aerial vehicle 100 (i.e., a traversing machine) that is mainly used for racing and entertainment. Players realize a first-person operation experience through the FPV screen and control the traversing machine to fly. The following description takes the UAV 100 being a traversing machine as an example; when the UAV 100 is of another type, the principle is basically the same and will not be repeated here.
  • For a traversing machine, the shooting device 30 will generally photograph the blades as well, to assist the pilot in judging the current flight attitude. However, this causes rotating blades to appear on both sides of the FPV screen: the blades block part of the scene in the shot video, affecting the pilot's control experience to a certain extent, and they also appear in the recorded video, which affects the viewer's viewing experience.
  • An embodiment of the present application provides a flight assistance method for an unmanned aerial vehicle 100.
  • The unmanned aerial vehicle 100 includes a functional component 20, and the flight assistance method includes: 011, acquiring flight parameters of the functional component 20; 012, generating a virtual image according to the flight parameters; and 013, adding the virtual image to the display image to indicate the flight status corresponding to the functional component 20.
  • the embodiment of the present application also provides a flight assistance device 200 for the unmanned aerial vehicle 100.
  • the flight assistance device 200 includes a processor 210.
  • The processor 210 is configured to: obtain flight parameters of the functional component 20; generate a virtual image according to the flight parameters; and add the virtual image to the display image to indicate the flight status corresponding to the functional component 20.
  • step 011 to step 013 can be implemented by the processor 210.
  • the embodiment of the present application also provides an unmanned aerial vehicle system 1000.
  • the unmanned aerial vehicle system 1000 includes an unmanned aerial vehicle 100, a flight assistance device 200, a remote controller 300, and a display 400.
  • The flight assistance device 200 can be installed on the unmanned aerial vehicle 100 and/or the remote controller 300.
  • Specifically, the flight assistance device 200 may be arranged on the fuselage 10 of the unmanned aerial vehicle 100 (that is, the unmanned aerial vehicle 100 includes the fuselage 10, the functional component 20, the photographing device 30, and the flight assistance device 200, with the functional component 20 arranged on the fuselage 10). In this case, the processor 210 can directly obtain the flight parameters of the functional component 20 and the video data captured by the photographing device 30 on the fuselage 10 (the video data includes one or more frames of display images), and generate a virtual image according to the flight parameters to be added to the display image to indicate the flight status of the UAV 100. Alternatively, the flight assistance device 200 may be set on the remote controller 300 (that is, the UAV 100 includes the fuselage 10, the functional component 20, and the photographing device 30, while the remote controller 300 includes the flight assistance device 200); through wireless communication between the remote controller 300 and the UAV 100, the processor 210 can obtain the flight parameters of the functional component 20 and the video data of the photographing device 30, and generate a virtual image according to the flight parameters to add to the display image to indicate the flight status of the UAV 100. Alternatively, the flight assistance device 200 is installed on both the fuselage 10 of the UAV 100 and the remote controller 300: the flight assistance device 200 then includes at least two processors 210, arranged respectively on the fuselage 10 of the unmanned aerial vehicle 100 and on the remote controller 300. The processor 210 on the fuselage 10 can directly obtain the flight parameters of the functional component 20 and the video data captured by the shooting device 30 and generate the virtual image according to the flight parameters; the processor 210 on the remote controller 300 can then receive, through wireless communication, the virtual image and video data sent by the unmanned aerial vehicle 100 and add the virtual image to the display image.
  • The display image to which the virtual image has been added can indicate the flight status corresponding to the functional component 20. That is, step 011 and step 012 can be implemented by the processor 210 on the fuselage 10 of the UAV 100, and step 013 by the processor 210 on the remote controller 300; or step 011 can be implemented by the processor 210 on the fuselage 10 of the UAV 100, and step 012 and step 013 by the processor 210 on the remote controller 300.
  • In the following, the flight assistance device 200 being set on the fuselage 10 of the unmanned aerial vehicle 100 is taken as an example, which is not a limitation.
  • the functional component 20 may be any device provided on the UAV 100.
  • the functional component 20 may include sensors (such as gyroscopes, accelerometers, barometers, magnetic compasses, etc.), motors, power supplies, etc. of the UAV 100.
  • Obtaining the flight parameters of the functional component 20 by the processor 210 may include: obtaining the attitude angle from the gyroscope, the acceleration from the accelerometer, the air pressure from the barometer, the magnetic field data from the magnetic compass, the rotation speed of the motor, the discharge current of the power supply, and the remaining power of the power supply.
  • The processor 210 then generates a corresponding virtual image according to the flight parameters. For example, the processor 210 may generate a propeller shadow image according to the attitude angle of the gyroscope and the acceleration of the accelerometer to indicate the attitude angle of the UAV 100 (for example, on both sides of the FPV screen). As another example, the processor 210 can generate a graduated indicator-tube image according to the air pressure of the barometer to indicate the atmospheric pressure of the environment where the UAV 100 is currently located; or a heading compass image showing the azimuth in the form of a disc according to the magnetic field data of the magnetic compass, to indicate the flight heading of the UAV 100; or a rotatable propeller shadow image according to the rotation speed of the motor, where the rotation speed of the shadow is determined by the rotation speed of the motor, to indicate the rotation speed of the blade corresponding to each motor; or a battery image according to the remaining power of the power supply to indicate the remaining power. The processor 210 may also generate an indicating digital image according to the remaining power and the discharge current of the power supply to indicate the remaining use time of the UAV 100.
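  • The remaining-use-time indication described above follows from a simple capacity calculation; a minimal sketch (the function name and units are assumptions, not from the application):

```python
def remaining_use_time_minutes(remaining_mah: float, discharge_current_ma: float) -> float:
    """Estimate the remaining use time from the remaining power (mAh) and the
    discharge current (mA): time (h) = capacity / current, converted to minutes."""
    if discharge_current_ma <= 0:
        raise ValueError("discharge current must be positive")
    return remaining_mah / discharge_current_ma * 60.0

# 1500 mAh remaining at a 15 A (15000 mA) draw leaves about 6 minutes.
print(remaining_use_time_minutes(1500.0, 15000.0))  # 6.0
```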
  • The flight parameters can be the flight parameters at the current moment, or the flight parameters within a predetermined time period before the current moment. When the virtual image is generated based on the flight parameters at the current moment, it indicates the flight status of the functional component 20 at the current moment; when it is generated based on the flight parameters within a predetermined time period before the current moment, it indicates the flight status of the functional component 20 within that predetermined time period. For example, a rotatable propeller shadow generated according to the average rotation speed of the motor within the predetermined time period can indicate the rotation speed of the motor within that period; similarly, from the remaining power and the discharge current of the power supply within the predetermined time period, an indicating digital image can be generated to indicate the remaining use time of the UAV 100, and so on.
  • the predetermined time period can be 1 second, 2 seconds, 30 seconds, 1 minute, 5 minutes, and so on.
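  • Averaging a flight parameter over such a predetermined time period can be sketched as follows (a fixed-size sample window standing in for the period; the class name and sampling scheme are assumptions):

```python
from collections import deque

class WindowedAverage:
    """Average a flight parameter (e.g. motor rotation speed) over the most
    recent samples, approximating a predetermined time period at a fixed
    sampling rate."""
    def __init__(self, window_samples: int):
        self.samples = deque(maxlen=window_samples)  # old samples fall out automatically

    def update(self, value: float) -> float:
        self.samples.append(value)
        return sum(self.samples) / len(self.samples)

# Three 1-second samples of motor speed; the shadow spins at the average.
avg = WindowedAverage(window_samples=3)
for rpm in (9000.0, 8000.0, 7000.0):
    shadow_rpm = avg.update(rpm)
print(shadow_rpm)  # 8000.0
```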
  • After generating the corresponding virtual image according to the flight parameters, the processor 210 adds the virtual image to the display image acquired by the photographing device 30, so that the display image with the virtual image is displayed as the FPV screen; the display image with the added virtual image can indicate the flight status corresponding to the functional component 20 in real time.
  • For example, the attitude angle of the UAV 100 can be indicated by simulating the shadow of an actual blade, so that the pilot can determine the current flight attitude of the traversing machine, assisting the pilot in controlling it.
  • In this way, the flight status is indicated by the virtual image added to the display image. Since the virtual image is generated according to the flight parameters of the functional component 20 of the unmanned aerial vehicle, it can well indicate the flight status corresponding to the functional component 20; for example, the attitude angle of the UAV 100 is indicated by the propeller shadow, so the blades do not need to be photographed, yet the pilot can still be assisted in judging the flight status of the UAV 100. And because the virtual image is added to the display image afterwards to be displayed as the FPV screen, neither the virtual image nor the blades are present in the actually captured display image, and the quality of the captured video is better.
  • In some embodiments, the functional component 20 includes a gyroscope 21, the virtual image includes a two-dimensional first propeller shadow and a second propeller shadow located on opposite sides of the display image, and the attitude angle of the gyroscope 21 includes the attitude angle of the roll axis.
  • Step 012 includes:
  • 0121: Generate the first propeller shadow and the second propeller shadow according to the attitude angle of the roll axis, and determine the inclination angle of the line connecting the first propeller shadow and the second propeller shadow, the inclination angle being the angle of that line relative to the horizontal.
  • The processor 210 is further configured to generate the first propeller shadow and the second propeller shadow according to the attitude angle of the roll axis, and to determine the inclination angle of the line connecting the first propeller shadow and the second propeller shadow, the inclination angle being the angle of that connecting line relative to the horizontal line.
  • step 0121 can be implemented by the processor 210.
  • the blades are located on both sides of the FPV screen.
  • The horizontal direction of the FPV screen is the X direction, and the vertical direction of the FPV screen is the Y direction.
  • When the attitude angle of the UAV 100 changes, the positions of the blades in the Y direction on both sides of the FPV screen change accordingly.
  • The gyroscope 21 may be arranged on the fuselage 10 of the unmanned aerial vehicle 100 to detect the attitude angle of the fuselage 10; or the gyroscope 21 may be arranged on the gimbal 500, which is itself arranged on the fuselage 10, to detect the attitude angle of the gimbal 500; or gyroscopes 21 are installed both on the fuselage 10 of the UAV 100 and on the gimbal 500, that is, there are at least two gyroscopes 21, arranged respectively on the fuselage 10 of the UAV 100 and on the gimbal 500, to detect the attitude angle of the fuselage 10 and the attitude angle of the gimbal 500 respectively. In the embodiment shown in FIG. 2, the gyroscope 21 being set on the fuselage 10 of the UAV 100 is taken as an example, which is not a limitation.
  • The working modes of the gimbal 500 include a stabilization mode and a follow mode. The gimbal 500 can maintain the stabilization mode, maintain the follow mode, or switch between the two.
  • The stabilization mode means that the gimbal 500 always maintains the stability of a preset reference direction (for example, the horizontal direction). The gimbal 500 performs negative-feedback adjustments to offset the roll, yaw, and pitch motions of the UAV 100, so that possible shaking is cancelled and the load (such as the photographing device 30) carried on the gimbal 500 is kept stable. Take pitch as an example: in the stabilization mode, when the UAV 100 pitches, the camera does not pitch along with it but keeps its original shooting angle (generally horizontal), because while the UAV 100 is pitching, the gimbal 500 performs a negative-feedback adjustment that keeps the photographing device 30 mounted on it in the horizontal direction. Negative feedback here means that when the UAV 100 pitches up by 15 degrees, the gimbal 500 pitches the camera down by 15 degrees to keep the camera level, realizing camera stabilization; the principle of negative-feedback adjustment for yaw and roll is basically the same as that for pitch.
  • The follow mode means that the gimbal 500 moves following the unmanned aerial vehicle 100, so as to keep the relative angle between the photographing device 30 and the unmanned aerial vehicle 100 unchanged. Take pitch as an example: in the follow mode, if the user controls the UAV 100 to pitch up 20 degrees, the gimbal 500 pitches the camera 30 up 20 degrees so that the relative angle between the camera 30 and the UAV 100 remains basically unchanged.
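  • The difference between the two modes can be reduced to the pitch command the gimbal applies relative to the fuselage; a minimal sketch (the function name and mode strings are assumptions for illustration):

```python
def gimbal_pitch_command(uav_pitch_deg: float, mode: str) -> float:
    """Pitch command applied by the gimbal to the camera, relative to the fuselage.

    Stabilization mode: negative feedback cancels the fuselage pitch so the
    camera stays level (UAV pitches up 15 degrees, camera is pitched down 15).
    Follow mode: no compensation, so the camera keeps a fixed angle relative
    to the fuselage and follows its motion."""
    if mode == "stabilize":
        return -uav_pitch_deg
    if mode == "follow":
        return 0.0
    raise ValueError(f"unknown gimbal mode: {mode}")

print(gimbal_pitch_command(15.0, "stabilize"))  # -15.0
print(gimbal_pitch_command(20.0, "follow"))     # 0.0
```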
  • In one example, the inclination angle of the line connecting the generated first propeller shadow S1 and second propeller shadow S2 is determined by the attitude angle of the roll axis; the first propeller shadow S1 is located on the left side of the FPV screen and the second propeller shadow S2 on the right side. Referring to Figure 5, when the UAV 100 rolls counterclockwise, the first propeller shadow S1 stays attached to the left edge of the FPV screen and moves opposite to the Y direction, while the second propeller shadow S2 stays attached to the right edge and moves in the Y direction; conversely, referring to Figure 6, when the UAV 100 rolls clockwise, the first propeller shadow S1 stays attached to the left edge of the FPV screen and moves in the Y direction, while the second propeller shadow S2 stays attached to the right edge and moves opposite to the Y direction. The line connecting the first propeller shadow S1 and the second propeller shadow S2 may pass through the center of the FPV screen, and its inclination angle is the angle of that line relative to the horizontal line passing through the center of the FPV screen. According to the inclination angle of the line connecting the first propeller shadow S1 and the second propeller shadow S2, the roll attitude angle of the UAV 100 can be indicated.
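  • The mapping from the roll-axis attitude angle to the edge positions of the two shadows can be sketched as follows (the tangent mapping and the sign convention are assumptions for illustration; the application does not specify the exact mapping):

```python
import math

def shadow_edge_offsets(roll_deg: float, screen_width_px: int) -> tuple:
    """Vertical offsets (in pixels) of the first propeller shadow S1 (left
    edge) and the second propeller shadow S2 (right edge), for a connecting
    line through the screen centre tilted by the roll attitude angle."""
    half_span = screen_width_px / 2.0
    dy = half_span * math.tan(math.radians(roll_deg))
    return (-dy, dy)  # (S1 offset, S2 offset): the two shadows move oppositely

s1, s2 = shadow_edge_offsets(roll_deg=45.0, screen_width_px=2)
# s1 is close to -1.0 and s2 close to +1.0: a 45-degree inclination angle.
```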
  • In another example, a third propeller shadow S3 and a fourth propeller shadow S4 corresponding to the gimbal 500 are also displayed. In the stabilization mode, the inclination angle of the line connecting the third propeller shadow S3 and the fourth propeller shadow S4 always remains the same (for example, 0 degrees), while in the follow mode it follows the motion of the unmanned aerial vehicle 100 in the same way as the first propeller shadow S1 and the second propeller shadow S2, the two inclination angles being basically the same. The line connecting the first propeller shadow S1 and the second propeller shadow S2 corresponding to the UAV 100 can pass through the center of the upper half of the FPV screen, its inclination angle being the angle of that line relative to the horizontal line through the center of the upper half; the line connecting the third propeller shadow S3 and the fourth propeller shadow S4 corresponding to the gimbal 500 can pass through the center of the lower half of the FPV screen, its inclination angle being the angle of that line relative to the horizontal line through the center of the lower half. In this way, the roll attitude angle of the UAV 100 and the roll attitude angle of the gimbal 500 can both be indicated.
  • In some embodiments, the functional component 20 includes a motor 22, the virtual image includes a two-dimensional first propeller shadow S1 and a second propeller shadow S2, and the two propeller shadows are located on opposite sides of the display image. Step 012 also includes:
  • 0122 Generate the rotatable first propeller shadow S1 and the second propeller shadow S2 according to the rotation speed of the motor 22, and determine the rotation speeds of the first propeller shadow S1 and the second propeller shadow S2.
  • the processor 210 is further configured to generate a rotatable first blade shadow S1 and a second blade shadow S2 according to the rotation speed of the motor 22, and determine the rotation speeds of the first blade shadow S1 and the second blade shadow S2.
  • step 0122 may be implemented by the processor 210.
  • Since the photographing device 30 does not photograph the actual blades, the pilot cannot know whether the blades are rotating normally or how fast they rotate. When a motor 22 is damaged, the blade's rotation slows down or even stops. Therefore, the processor 210 generates a rotatable first propeller shadow S1 and a second propeller shadow S2 according to the rotation speed of the motor 22 to indicate that speed. When the first propeller shadow S1 or the second propeller shadow S2 does not rotate, or its rotation suddenly becomes extremely slow, there may be a problem with the corresponding motor 22; the traversing machine is then no longer suitable for flying and needs to land as soon as possible for maintenance.
  • The rotation speeds of the first propeller shadow S1 and the second propeller shadow S2 can be determined according to the lowest speed of the corresponding one or more motors 22. For example, the traversing machine is a four-wing traversing machine whose blades are divided into two groups, each group including two blades, with each blade corresponding to a motor 22. The rotation speed of the first propeller shadow S1 is determined according to the lowest speed of the motors 22 of the first group, and that of the second propeller shadow S2 according to the lowest speed of the motors 22 of the second group, so that the pilot can find in time a motor 22 that may be damaged (for example, one whose rotation speed is less than or equal to a predetermined rotation speed).
  • In other embodiments, the number of propeller shadows can be larger; for example, there are two first propeller shadows S1 and two second propeller shadows S2, with each shadow corresponding to one motor 22, so that a possibly damaged motor 22 can be accurately identified according to the rotation speed of its shadow.
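  • Determining each shadow's rotation speed from the lowest speed in its motor group, so that a failing motor shows up immediately, can be sketched as follows (the damage threshold stands in for the "predetermined rotation speed" and is a hypothetical value):

```python
def shadow_speed_rpm(group_motor_rpms: list) -> float:
    """Shadow rotation speed for a group of motors: the lowest speed in the
    group, so that a slow or stopped motor is visible in its shadow."""
    return min(group_motor_rpms)

def maybe_damaged(motor_rpm: float, threshold_rpm: float = 1000.0) -> bool:
    # threshold_rpm is a hypothetical predetermined rotation speed.
    return motor_rpm <= threshold_rpm

first_group = [9200.0, 8900.0]   # two blades per group on a four-wing machine
second_group = [9100.0, 300.0]   # one motor abnormally slow
print(shadow_speed_rpm(first_group))                  # 8900.0
print(maybe_damaged(shadow_speed_rpm(second_group)))  # True
```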
  • In this way, step 012 includes step 0121 and step 0122; that is to say, through the display image with the added virtual image, the pilot can understand not only the roll attitude angle of the traversing machine but also the speed, or possible damage, of its motors. Of course, step 012 of the flight assistance method may also include step 0121 alone or step 0122 alone, or other feasible steps alone or in combination, which is not limited here.
  • In some embodiments, the virtual image includes a three-dimensional aircraft image S5, and the attitude angle of the gyroscope 21 includes the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis. Step 012 also includes:
  • 0123: Generate the aircraft image S5 according to the attitude angle of the gyroscope 21, and determine the roll attitude angle, pitch attitude angle, and yaw attitude angle of the aircraft image S5 according to the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis.
  • The processor 210 is further configured to generate the aircraft image S5 according to the attitude angle of the gyroscope 21, and to determine the roll attitude angle, pitch attitude angle, and yaw attitude angle of the aircraft image S5 according to the attitude angle of the roll axis, the attitude angle of the pitch axis, and the attitude angle of the yaw axis.
  • step 0123 can be implemented by the processor 210.
  • a three-dimensional aircraft image S5 may be generated according to the attitude angle of the gyroscope 21.
  • the roll attitude angle of the aircraft image S5 is consistent with the roll axis attitude angle of the gyroscope 21;
  • the pitch attitude angle of the three-dimensional aircraft image S5 is consistent with the pitch axis attitude angle of the gyroscope 21;
  • the three-dimensional aircraft image S5 The yaw attitude angle is consistent with the yaw axis attitude angle of the gyroscope 21.
  • the aircraft image S5 changes in synchronization with the attitude of the unmanned aerial vehicle 100, and the pilot can accurately determine the attitude changes based on the three-axis (ie, roll, pitch, and yaw axis) of the virtual three-dimensional aircraft image S5.
  • the three-axis attitude of the human aircraft 100 changes, thereby assisting the pilot to better control the unmanned aircraft 100.
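The axis-by-axis synchronization described above can be sketched as a direct one-to-one mapping (a minimal illustration; the class name and field names are assumptions, not part of the original):

```python
from dataclasses import dataclass

@dataclass
class AircraftImage3D:
    """Attitude of the virtual 3D aircraft model, in degrees (assumed unit)."""
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0

    def sync_to_gyro(self, gyro_roll, gyro_pitch, gyro_yaw):
        # Keep each axis of the virtual model identical to the matching
        # gyroscope axis so the model mirrors the UAV's attitude.
        self.roll, self.pitch, self.yaw = gyro_roll, gyro_pitch, gyro_yaw
```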
  • the flight assistance method includes:
  • 014 Adjust the parameters of the virtual image so that the content of the displayed image is not blocked by the virtual image.
  • the processor 210 is further configured to adjust the parameters of the virtual image, so that the content of the displayed image is not blocked by the virtual image.
  • step 014 can be implemented by the processor 210.
Specifically, since a virtual image is added to the display screen, in order for the virtual image to indicate the flight status of the UAV 100 while affecting the display screen as little as possible (for example, occluding it as little as possible), the processor 210 may adjust the parameters of the virtual image. The parameters may be the color, size and/or transparency of the virtual image: the processor 210 may adjust its color, size or transparency; or its color and size; or its color and transparency; or its size and transparency; or its color, size and transparency. In this embodiment, the processor 210 adjusting color, size and transparency is taken as an example.

The processor 210 may adjust the color of the virtual image according to the colors of the display image itself acquired by the photographing device 30, adjust the size of the virtual image according to the size of the display image, and adjust the transparency of the virtual image according to the colors of the display image. For example, if most of the display image (more than 70% of the pixels) is blue, the virtual image is displayed in a color that differs strongly from blue (such as red or green) so that the user can spot it quickly. Meanwhile, the processor 210 adjusts the size of the virtual image according to the size of the display image (for example, to 1/20 of the size of the display image) so that the virtual image does not occlude too large an area, and adjusts the transparency of the virtual image according to the colors of the display image so that even the area of the display image occluded by the virtual image can still be observed by the pilot. It can be understood that when the virtual image and the display image are close in color (for example, 70% or more of the pixels share the same color), the transparency must be set high to distinguish them well, whereas when their colors are far apart (for example, 30% or fewer of the pixels share the same color), a low transparency already distinguishes them well. Therefore, when the virtual image and the display image are displayed in different colors, the transparency of the virtual image can be set low (such as 20%), so that the virtual image can be observed quickly by the pilot while barely obstructing the display image. In this way, by adjusting the parameters of the virtual image, the processor 210 ensures that the adjusted virtual image can be observed quickly by the pilot without obstructing the content of the display image.
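The adjustment rules above (a dominant color beyond 70% triggers a contrasting overlay color, the overlay is sized to about 1/20 of the frame, and transparency rises with color similarity) can be sketched as follows (a hedged illustration; the function name, the 0.5 middle-ground transparency and the exact return shape are assumptions):

```python
def adjust_overlay(dominant_fraction, same_color_fraction, frame_area):
    """Pick virtual-image (overlay) parameters from the current frame.

    dominant_fraction:   share of pixels sharing the frame's dominant color
    same_color_fraction: share of pixels whose color matches the overlay's
    frame_area:          display-image area in pixels

    Returns (use_contrast_color, overlay_area, transparency).
    """
    # One color covering most of the frame (>70%): pick a contrasting color.
    use_contrast_color = dominant_fraction > 0.70
    # Keep the overlay small so it does not occlude too much of the frame.
    overlay_area = frame_area / 20
    if same_color_fraction >= 0.70:    # colors close: high transparency
        transparency = 0.80
    elif same_color_fraction <= 0.30:  # colors far apart: low transparency
        transparency = 0.20
    else:                              # assumed middle ground
        transparency = 0.50
    return use_contrast_color, overlay_area, transparency
```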
In other embodiments, the processor 210 is further configured to receive an adjustment instruction and adjust the parameters of the virtual image according to it. The adjustment instruction may be obtained by receiving the pilot's input through the remote controller 300, or through a terminal (such as a mobile phone or tablet computer) connected to the remote controller 300. The remote controller 300 may be provided with a mounting frame on which the terminal can be placed directly to realize a wired connection with the remote controller 300; or the terminal may realize a wired connection with the remote controller 300 through a data cable; or the terminal may connect wirelessly to the remote controller 300 (for example via Bluetooth, or by the terminal and the remote controller 300 both joining a cellular mobile network or a wireless local area network). In this way, the virtual image best suited to the pilot can be set according to the pilot's personal preference.
Referring to FIGS. 2 and 12, in some embodiments step 013 includes:

0131: Acquire video data, the video data including one or more frames of display images and a timestamp corresponding to each frame of the display images; and

0132: Add the virtual image generated from the flight parameters of the functional component 20 acquired at the same moment as the timestamp onto the display image corresponding to that timestamp, so as to indicate the flight status of the functional component 20 at that timestamp.

In some embodiments, the processor 210 is further configured to acquire the video data, which includes one or more frames of display images and a timestamp for each frame, and to add the virtual image generated from the flight parameters of the functional component 20 acquired at the same moment as the timestamp onto the display image corresponding to that timestamp, so as to indicate the flight status of the functional component 20 at that timestamp. That is to say, steps 0131 and 0132 can be implemented by the processor 210.

Specifically, when the processor 210 adds the virtual image to the display image, in order to make the flight status indicated by the virtual image match the current display image, the video data captured by the photographing device 30 needs to be acquired first. The video data includes one or more frames of display images, and each frame corresponds to a timestamp (hereinafter the first timestamp) that indicates the moment the frame was acquired. At present, the delay from acquiring the video data captured by the photographing device 30 to displaying the image has been made extremely small, so the capture time of the video data can essentially be regarded as its display time. Likewise, the flight parameters also carry a timestamp (hereinafter the second timestamp). A display image with a given first timestamp and the flight parameters whose second timestamp coincides with that first timestamp correspond one to one, and the virtual image generated from the flight parameters acquired at that second timestamp can indicate the flight status of the UAV 100 at the first timestamp. Therefore, after the virtual image is added to the display image of the first timestamp, the pilot can determine the flight status of the UAV 100 at the first timestamp by observing the display screen with the added virtual image.
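The first-timestamp/second-timestamp pairing described above can be sketched as a simple lookup (a minimal illustration; the function name and data shapes are assumptions, and a production system would likely tolerate small clock offsets rather than require exact equality):

```python
def match_params_to_frames(frames, params):
    """Pair each display-image frame with the flight parameters sampled
    at the same instant.

    frames: list of (timestamp, image) tuples from the video data
    params: dict mapping timestamp -> flight-parameter record
    Returns a list of (timestamp, image, flight_params) triples; frames
    without an exactly matching parameter record are skipped.
    """
    return [(t, img, params[t]) for t, img in frames if t in params]
```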
It should be understood that the flowcharts of the flight assistance method shown in FIGS. 11-12 are shown only as examples and are not limiting. The flight assistance method of the present application may further include steps 011, 0121, 0122, 0123, 0131, 0132, 014, or any combination of other feasible steps, which is not limited here.
  • the flight assistance method further includes:
  • 015 Send the display image to which the virtual image has been added to display the corresponding display screen on the display 400.
  • the processor 210 is further configured to send a display image to which a virtual image has been added, so as to display a corresponding display screen on the display 400.
  • step 015 may be implemented by the processor 210.
Specifically, the display image is generally shown on the display 400, which may be the display 400 built into the remote controller 300 itself, or the display 400 of a terminal connected to the remote controller 300 or to the UAV 100. The terminal may be a mobile phone, a tablet computer, or a smart wearable device (such as FPV glasses); it may be mounted on the remote controller 300 or connected to it through a data cable, or it may connect wirelessly to the remote controller 300 or the UAV 100 through a cellular mobile network or a wireless local area network.
After adding the virtual image to the display image, the processor 210 sends the display image with the added virtual image so that the corresponding display screen (such as an FPV picture) is shown on the display 400. The unmanned aerial vehicle 100 may include an image transmission module 40, which is the device in the unmanned aerial vehicle 100 for realizing wireless transmission of images. For example, referring to FIG. 14, when the display 400 is built into the remote controller 300 and the flight assistance device 200 is provided on the remote controller 300, the image transmission module 40 of the UAV 100 sends the video data (display images) to the remote controller 300 as soon as it is acquired; the processor 210 then obtains the flight parameters of the UAV 100 through wireless communication, generates the virtual image from the flight parameters and adds it to the display image, and the display 400 of the remote controller 300 shows the corresponding display screen based on the display image with the added virtual image (that is, the processor 210 obtains the display image from the image transmission module 40 and sends the display image with the added virtual image to the display 400 to show the corresponding display screen there). Referring to FIG. 15, when the display 400 is built into the remote controller 300 and the flight assistance device 200 is provided on the UAV 100, the processor 210 obtains the video data (display images) and the flight parameters, generates the virtual image from the flight parameters and adds it to the display image, then sends the display image with the added virtual image to the image transmission module 40, which transmits it to the remote controller 300; the display 400 of the remote controller 300 shows the corresponding display screen based on that display image.
As another example, referring to FIG. 16, when the display 400 is the display of a terminal connected to the remote controller 300 and the flight assistance device 200 is provided on the remote controller 300, the image transmission module 40 of the UAV 100 sends the video data (display images) to the remote controller 300 as soon as it is acquired (that is, the processor 210 obtains the display image from the image transmission module 40); the processor 210 then obtains the flight parameters of the functional components 20 of the UAV 100 through wireless communication, generates the virtual image from the flight parameters and adds it to the display image, and the remote controller 300 sends the display image with the added virtual image to the terminal (such as the FPV glasses in FIG. 16). That is to say, the display image with the added virtual image is obtained by acquiring the flight parameters of the functional component 20, generating a virtual image from those flight parameters, and adding the virtual image to the display image; the display image with the added virtual image can indicate the flight status corresponding to the functional component. The display 400 of the FPV glasses shows the corresponding display screen according to that display image (that is, the display 400 receives the display image with the added virtual image in order to show the corresponding display screen). Referring again to FIG. 2, when the display 400 is the display of a terminal connected to the remote controller 300 and the flight assistance device 200 is provided on the UAV 100, the processor 210 obtains the video data (display images) and the flight parameters, generates the virtual image from the flight parameters and adds it to the display image, then sends the display image with the added virtual image to the image transmission module 40, which transmits it to the remote controller 300; upon receiving it, the remote controller 300 forwards the display image with the added virtual image to the terminal, and the terminal's display 400 shows the corresponding display screen according to that display image (that is, the display 400 receives the display image with the added virtual image in order to show the corresponding display screen). It can be understood that, in another embodiment, the image transmission module 40 may also transmit the display image with the added virtual image directly to the terminal, so that the corresponding display screen is shown on the terminal's display 400.
As another example, referring to FIG. 17, when the display 400 is the display of a terminal connected to the unmanned aerial vehicle 100 and the flight assistance device 200 is provided on the remote controller 300, the processor 210 obtains the video data (display images) and the flight parameters, generates the virtual image from the flight parameters and adds it to the display image, then sends the display image with the added virtual image through the remote controller 300 to the image transmission module 40, which transmits the display image to the terminal; the terminal's display 400 shows the corresponding display screen according to that display image.
The display 400 for the unmanned aerial vehicle 100 of the embodiments of the present application is used to receive the display image with the added virtual image in order to show the corresponding display screen. The unmanned aerial vehicle 100 includes the functional component 20, and the display image with the added virtual image is obtained by acquiring the flight parameters of the functional component 20, generating a virtual image according to the flight parameters, and adding the virtual image to the display image; the display image with the added virtual image can indicate the flight status corresponding to the functional component 20.
Referring to FIG. 18, an embodiment of the present application provides a computer-readable storage medium 500 containing computer-executable instructions 502. When the computer-executable instructions 502 are executed by one or more processors 210, the processors 210 are caused to perform the flight assistance method of any of the foregoing embodiments.

For example, referring to FIG. 1 and FIG. 2, when the computer-readable instructions 502 are executed by the processor 210, the processor 210 is caused to perform the following steps:

011: Acquire the flight parameters of the functional component 20;

012: Generate a virtual image according to the flight parameters; and

013: Add the virtual image to a display image to indicate the flight status corresponding to the functional component 20.

As another example, referring to FIG. 2 and FIG. 3, when the computer-readable instructions 502 are executed by the processor 210, the processor 210 is caused to perform the following step:

0121: Generate the first propeller shadow and the second propeller shadow according to the attitude angle of the roll axis, and determine the inclination angle of the line connecting the first propeller shadow and the second propeller shadow.
  • a "computer-readable medium” can be any device that can contain, store, communicate, propagate, or transmit a program for use by an instruction execution system, device, or device or in combination with these instruction execution systems, devices, or devices.
More specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CDROM). The computer-readable medium may even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
The aforementioned storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.

Abstract

A flight assistance method, a flight assistance device (200), an unmanned aerial vehicle (100), a remote controller (300), a display (400) and an unmanned aerial vehicle system (1000). The flight assistance method includes: (011) acquiring flight parameters of a functional component (20); (012) generating a virtual image according to the flight parameters; and (013) adding the virtual image to a display image to indicate the flight status corresponding to the functional component (20).

Description

Flight assistance method and device, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system and storage medium — Technical Field
The present application relates to the technical field of unmanned aerial vehicles, and in particular to a flight assistance method for an unmanned aerial vehicle, a flight assistance device for an unmanned aerial vehicle, an unmanned aerial vehicle, a remote controller for an unmanned aerial vehicle, a display for an unmanned aerial vehicle, an unmanned aerial vehicle system, and a computer-readable storage medium.
Background
With the development of unmanned aerial vehicle technology, more application scenarios place demands on UAV flight control. In some cases, an unmanned aerial vehicle (for example, an FPV racing drone) is generally filmed together with its structural components (for example, the propeller blades) so that they appear in the First Person View (FPV) picture, to help the operator judge the flight status (for example, the flight attitude) of the UAV; however, this leaves the structural components (for example, the blades) in the recorded video, degrading the quality of the captured video. In other cases, the structural components of the UAV (for example, the blades) are deliberately kept out of the recording for a cleaner picture, but this makes it harder for the operator to judge the flight status (for example, the flight attitude) of the UAV.
Summary of the Invention
Embodiments of the present application provide a flight assistance method for an unmanned aerial vehicle, a flight assistance device for an unmanned aerial vehicle, an unmanned aerial vehicle, a remote controller for an unmanned aerial vehicle, a display for an unmanned aerial vehicle, an unmanned aerial vehicle system, and a computer-readable storage medium.
An embodiment of the present application provides a flight assistance method for an unmanned aerial vehicle. The flight assistance method includes: acquiring flight parameters of the functional component; generating a virtual image according to the flight parameters; and adding the virtual image to a display image to indicate the flight status corresponding to the functional component.
An embodiment of the present application further provides a flight assistance device for an unmanned aerial vehicle. The unmanned aerial vehicle includes a functional component, and the flight assistance device includes a processor configured to: acquire flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to a display image to indicate the flight status corresponding to the functional component.
An embodiment of the present application further provides an unmanned aerial vehicle including a fuselage, a functional component and a flight assistance device, the functional component being provided on the fuselage and the flight assistance device being provided on the fuselage. The flight assistance device includes a processor configured to: acquire flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to a display image to indicate the flight status corresponding to the functional component.
An embodiment of the present application further provides a remote controller for an unmanned aerial vehicle. The unmanned aerial vehicle includes a functional component, and the remote controller includes a flight assistance device with a processor configured to: acquire flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to a display image to indicate the flight status corresponding to the functional component.
An embodiment of the present application further provides a display for an unmanned aerial vehicle. The display is configured to receive a display image with an added virtual image in order to show the corresponding display screen. The unmanned aerial vehicle includes a functional component; the display image with the added virtual image is obtained by acquiring flight parameters of the functional component, generating a virtual image according to the flight parameters, and adding the virtual image to a display image, and the display image with the added virtual image can indicate the flight status corresponding to the functional component.
An embodiment of the present application further provides an unmanned aerial vehicle system including an unmanned aerial vehicle, a remote controller, a flight assistance device and a display. The unmanned aerial vehicle includes a functional component, and the flight assistance device is provided on the unmanned aerial vehicle and/or the remote controller. The flight assistance device includes a processor configured to: acquire flight parameters of the functional component; generate a virtual image according to the flight parameters; and add the virtual image to a display image to indicate the flight status corresponding to the functional component. The display is configured to receive the display image with the added virtual image in order to show the corresponding display screen.
An embodiment of the present application further provides a computer-readable storage medium containing computer-executable instructions. When the computer-executable instructions are executed by one or more processors, the processors are caused to perform the flight assistance method of the above embodiments.
In the flight assistance method for an unmanned aerial vehicle, the flight assistance device for an unmanned aerial vehicle, the unmanned aerial vehicle, the remote controller for an unmanned aerial vehicle, the display for an unmanned aerial vehicle, the unmanned aerial vehicle system and the computer-readable storage medium of the embodiments of the present application, the flight status is indicated by a virtual image added to the display image. Since the virtual image is generated according to the flight parameters of the functional components of the UAV, it can well indicate the flight status corresponding to a functional component, for example the attitude angle corresponding to a functional component such as a gyroscope, so that the blades need not be filmed and the operator is still assisted in judging the flight status of the UAV. Moreover, since the virtual image is added to the display image afterwards, neither the virtual image nor the blades exist in the actually captured video, so the quality of the captured video is better.
Additional aspects and advantages of the embodiments of the present application will be given in part in the following description, become apparent in part from the following description, or be learned through practice of the embodiments of the present application.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present application will become apparent and easy to understand from the following description of the embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flowchart of a flight assistance method according to some embodiments of the present application.
FIG. 2 is a schematic structural diagram of an unmanned aerial vehicle system according to some embodiments of the present application.
FIG. 3 is a schematic flowchart of a flight assistance method according to some embodiments of the present application.
FIGS. 4 to 7 are schematic scene diagrams of a flight assistance method according to some embodiments of the present application.
FIGS. 8 and 9 are schematic flowcharts of a flight assistance method according to some embodiments of the present application.
FIG. 10 is a schematic scene diagram of a flight assistance method according to some embodiments of the present application.
FIGS. 11 to 13 are schematic flowcharts of a flight assistance method according to some embodiments of the present application.
FIGS. 14 to 17 are schematic structural diagrams of an unmanned aerial vehicle system according to some embodiments of the present application.
FIG. 18 is a schematic diagram of the connection between a processor and a computer-readable storage medium according to some embodiments of the present application.
Detailed Description of the Embodiments
Embodiments of the present application are described in detail below, examples of which are shown in the accompanying drawings, where identical or similar reference numerals throughout denote identical or similar elements or elements with identical or similar functions. The embodiments described below with reference to the drawings are exemplary, intended only to explain the present application, and should not be construed as limiting it.
In the description of the present application, it should be understood that the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly specifying the number of the indicated technical features. Thus, features qualified by "first" or "second" may explicitly or implicitly include one or more of said features. In the description of the present application, "a plurality of" means two or more, unless otherwise expressly and specifically defined.
In the description of the present application, it should be noted that, unless otherwise expressly specified and defined, the terms "mounted", "connected" and "coupled" should be understood broadly: for example, a connection may be fixed, detachable or integral; mechanical, electrical or communicative; direct or indirect through an intermediate medium; or the internal communication of two elements or the interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present application can be understood according to the specific circumstances.
At present, unmanned aerial vehicles 100 are widely used in aerial photography, agriculture, plant protection, miniature selfies, express transportation, disaster rescue, wildlife observation, epidemic monitoring, surveying and mapping, news reporting, power-line inspection, disaster relief, film and television shooting, creating romance, and many other scenarios, mainly by controlling the unmanned aerial vehicle 100 through a remote controller 300 to perform various flight tasks.
The unmanned aerial vehicle 100 includes a fuselage 10 and a photographing device 30 mounted on the fuselage 10. The photographing device 30 acquires images of the environment and acts as an extension of the human eye, so that the user operating the UAV 100 can understand its current environment from the captured picture and use the remote controller 300 to make the UAV 100 perform various flight tasks.
Among unmanned aerial vehicles 100, one class is used mainly for racing entertainment (i.e. FPV racing drones). An FPV racing drone captures video while flying and shows it as an FPV picture, so that the pilot (i.e. the drone racer operating the machine) gets a first-person operating experience through the FPV picture while flying the drone. The following description takes the UAV 100 being an FPV racing drone as an example; the principle is basically the same for other types of UAV 100 and is not repeated here.
In an FPV racing drone, the photographing device 30 generally films the blades as well, to help the pilot judge the drone's current flight attitude. However, this leaves spinning blades at both sides of the FPV picture: the blades not only occlude part of the scene in the captured video, affecting the pilot's operating experience to some extent, but also remain in the captured video, affecting the viewer's experience.
To this end, referring to FIG. 1 and FIG. 2, an embodiment of the present application provides a flight assistance method for an unmanned aerial vehicle 100 that includes a functional component 20. The flight assistance method includes:
011: Acquire the flight parameters of the functional component 20;
012: Generate a virtual image according to the flight parameters; and
013: Add the virtual image to a display image to indicate the flight status corresponding to the functional component 20.
An embodiment of the present application further provides a flight assistance device 200 for the unmanned aerial vehicle 100. The flight assistance device 200 includes a processor 210 configured to: acquire the flight parameters of the functional component 20; generate a virtual image according to the flight parameters; and add the virtual image to a display image to indicate the flight status corresponding to the functional component 20. That is to say, steps 011 to 013 can be implemented by the processor 210.
An embodiment of the present application further provides an unmanned aerial vehicle system 1000 that includes the unmanned aerial vehicle 100, the flight assistance device 200, a remote controller 300, and a display 400. The flight assistance device 200 may be provided on the unmanned aerial vehicle 100 and/or on the remote controller 300.
Specifically, the flight assistance device 200 may be provided on the fuselage 10 of the UAV 100 (i.e. the UAV 100 includes the fuselage 10, the functional component 20, the photographing device 30 and the flight assistance device 200, the functional component 20 being provided on the fuselage 10); the processor 210 can then directly acquire the flight parameters of the functional component 20 and the video data captured by the photographing device 30 provided on the fuselage 10 (the video data including one or more frames of display images), and generate a virtual image from the flight parameters to add to the display image to indicate the flight status of the UAV 100. Alternatively, the flight assistance device 200 may be provided on the remote controller 300 (i.e. the UAV 100 includes the fuselage 10, the functional component 20 and the photographing device 30, while the remote controller 300 includes the flight assistance device 200); through wireless communication between the remote controller 300 and the UAV 100, the processor 210 can acquire the flight parameters of the functional component 20 and the video data captured by the photographing device 30, and generate a virtual image from the flight parameters to add to the display image to indicate the flight status of the UAV 100. Alternatively, the flight assistance device 200 may be provided on both the fuselage 10 of the UAV 100 and the remote controller 300; that is, the flight assistance device 200 includes at least two processors 210, provided respectively on the fuselage 10 and on the remote controller 300. The processor 210 on the fuselage 10 can directly acquire the flight parameters of the functional component 20 and the video data captured by the photographing device 30 and generate a virtual image from the flight parameters; the processor 210 on the remote controller 300 can then receive, through wireless communication, the virtual image and the video data sent by the UAV 100 and add the virtual image to the display image, the display image with the added virtual image being able to indicate the flight status corresponding to the functional component 20. In other words, steps 011 and 012 may be implemented by the processor 210 on the fuselage 10 of the UAV 100 and step 013 by the processor 210 on the remote controller 300; it should be understood that step 011 may instead be implemented by the processor 210 on the fuselage 10 while steps 012 and 013 are implemented by the processor 210 on the remote controller 300. In the embodiment shown in FIG. 2, the flight assistance device 200 being provided on the fuselage 10 of the UAV 100 is taken as an example, without limitation.
The functional component 20 may be any device provided on the UAV 100. For example, the functional component 20 may include the UAV's sensors (such as a gyroscope, accelerometer, barometer or magnetic compass), motors, power supply, and so on. Accordingly, the processor 210 acquiring the flight parameters of the functional component 20 may include acquiring the attitude angle of the gyroscope, the acceleration of the accelerometer, the air pressure of the barometer, the magnetic-field data of the magnetic compass, the rotation speed of the motors, the discharge current of the power supply, the remaining battery level, and so on.
The processor 210 generates a corresponding virtual image according to the flight parameters. For example, the processor 210 may generate a propeller-shadow image according to the attitude angle of the gyroscope and the acceleration of the accelerometer to indicate the attitude angle of the UAV 100 (for example, adding one propeller shadow or arm light on each side of the FPV picture and indicating the roll angle by the inclination of the two propeller shadows or arm lights). As further examples, the processor 210 may generate a graduated indicator-tube image according to the barometer's pressure to indicate the air pressure of the UAV's current environment; generate a heading-indicator compass image that shows the bearing as a dial according to the magnetic compass's magnetic-field data, to indicate the flight heading of the UAV 100; generate rotatable propeller-shadow images according to the motor speeds, the rotation speed of each shadow being determined by the corresponding motor speed, to indicate the blade speed of each motor; generate a battery image according to the remaining battery level to indicate the remaining charge; or generate a numeric indicator image according to the remaining battery level and the discharge current of the power supply to indicate the remaining usage time of the UAV 100, and so on. The present application takes the functional component 20 including a gyroscope and motors as an example; the principle is basically the same when the functional component 20 is a power supply, magnetic compass, etc., and is not repeated here.
The flight parameters may be those of the current moment or those of a predetermined period before the current moment. When the virtual image is generated from the current moment's flight parameters, it can indicate the flight status of the functional component 20 at the current moment; when generated from the flight parameters of a predetermined period before the current moment, it can indicate the flight status of the functional component 20 during that period. For example, a rotatable propeller shadow generated from the average motor speed over a predetermined period can indicate the motor speed during that period; or the discharge current of the power supply may be derived from the remaining charge over the predetermined period, and from that discharge current and the remaining charge a numeric indicator image can be generated to indicate the remaining usage time of the UAV 100, and so on. The predetermined period may be 1 second, 2 seconds, 30 seconds, 1 minute, 5 minutes, etc.
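As a worked example of the remaining-usage-time idea above, dividing the remaining battery capacity by the average discharge current gives the remaining time (a hedged sketch; the function name and the units, mAh and mA yielding minutes, are assumptions, not specified in the original):

```python
def remaining_flight_time_min(remaining_mah, discharge_ma):
    """Estimate remaining usage time in minutes from the battery's remaining
    capacity (mAh) and the average discharge current (mA) over the period."""
    if discharge_ma <= 0:
        raise ValueError("discharge current must be positive")
    # hours = mAh / mA; multiply by 60 to get minutes
    return remaining_mah / discharge_ma * 60.0
```

For instance, 500 mAh remaining at a 2000 mA average draw gives about 15 minutes.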
After generating the corresponding virtual image according to the flight parameters, the processor 210 adds the virtual image to the display image acquired by the photographing device 30, so that the display image with the added virtual image is shown as the FPV picture. The display image with the added virtual image can indicate the flight status corresponding to the functional component 20 in real time, for example indicating the attitude angle of the UAV 100 through propeller shadows that imitate the actual blades, so that the pilot can determine the current flight attitude of the racing drone and is assisted in operating it.
In the flight assistance method and flight assistance device 200 of the present application, the flight status is indicated by a virtual image added to the display image. Since the virtual image is generated from the flight parameters of the functional component 20 of the UAV, it can well indicate the flight status corresponding to the functional component 20, for example indicating the attitude angle of the UAV 100 through propeller shadows, so that the blades need not be filmed and the pilot is still assisted in judging the flight status of the UAV 100. Moreover, since the virtual image is added to the display image afterwards for display as the FPV picture, neither the virtual image nor the blades exist in the actually captured display images, so the quality of the captured video is better.
Referring to FIGS. 2 and 3, in some embodiments the functional component 20 includes a gyroscope 21, the virtual image includes a two-dimensional first propeller shadow and second propeller shadow located on opposite sides of the display image, and the attitude angle of the gyroscope 21 includes the attitude angle of the roll axis. Step 012 includes:
0121: Generate the first propeller shadow and the second propeller shadow according to the attitude angle of the roll axis, and determine the inclination angle of the line connecting the first propeller shadow and the second propeller shadow, the inclination angle being the angle of that line relative to the horizontal.
In some embodiments, the processor 210 is further configured to generate the first propeller shadow and the second propeller shadow according to the attitude angle of the roll axis and to determine the inclination angle of the line connecting them, the inclination angle being the angle of that line relative to the horizontal. That is to say, step 0121 can be implemented by the processor 210.
Specifically, in the FPV picture of the racing drone, the blades would be located at the two sides of the picture. In the FPV picture shown on the display 400 in FIG. 4, taking the horizontal direction of the FPV picture as the X direction and its vertical direction as the Y direction, the positions of the blades along Y at the two sides of the picture change as the attitude angle of the UAV 100 changes.
The gyroscope 21 may be provided on the fuselage 10 of the UAV 100 to detect the attitude angle of the fuselage 10; or on a gimbal 500 that is provided on the fuselage 10 of the UAV 100, to detect the attitude angle of the gimbal 500; or on both, i.e. there are at least two gyroscopes 21, provided respectively on the fuselage 10 and on the gimbal 500 to detect their respective attitude angles. In the embodiment shown in FIG. 2, the gyroscope 21 being provided on the fuselage 10 of the UAV 100 is taken as an example, without limitation.
The working modes of the gimbal 500 include a stabilization mode and a follow mode. The gimbal 500 can stay in stabilization mode, stay in follow mode, or switch between the two. Stabilization mode means the gimbal 500 always maintains stability relative to a preset reference direction (for example the horizontal): the gimbal 500 applies negative-feedback adjustment to the roll, yaw and pitch motions of the UAV 100 to cancel any shaking they may introduce, keeping the payload carried on the gimbal 500 (such as the photographing device 30) stable. Taking pitch as an example, in stabilization mode, when the UAV 100 pitches, the camera does not pitch with it but keeps its original shooting angle (generally horizontal), because the gimbal 500 applies negative feedback to keep the photographing device 30 it carries in the horizontal direction. Negative feedback here means that when the UAV 100 pitches up 15 degrees, the gimbal 500 pitches the photographing device down 15 degrees to keep it level, thereby stabilizing it; the negative-feedback principle for yaw and roll is basically the same as for pitch. Follow mode means the gimbal 500 moves with the UAV 100 so that the relative angle between the photographing device 30 and the UAV 100 remains unchanged. Taking pitch as an example, in follow mode, when the user pitches the UAV 100 up 20 degrees, the gimbal 500 pitches the photographing device 30 up 20 degrees so that the relative angle between the photographing device 30 and the UAV 100 remains basically unchanged.
Referring to FIG. 4, in this embodiment the inclination angle of the line connecting the generated first propeller shadow S1 and second propeller shadow S2 is determined by the attitude angle of the roll axis, with S1 at the left side of the FPV picture and S2 at the right side. Referring to FIG. 5, when the UAV 100 rolls counter-clockwise, S1 moves against the Y direction along the left edge of the FPV picture while S2 moves along the Y direction along the right edge; conversely, referring to FIG. 6, when the UAV 100 rolls clockwise, S1 moves along the Y direction along the left edge while S2 moves against the Y direction along the right edge. In one embodiment, the line connecting S1 and S2 may pass through the center of the FPV picture, and its inclination angle is then the angle of that line relative to the horizontal line through the center of the FPV picture. The roll attitude angle of the UAV 100 can thus be indicated by the inclination angle of the line connecting S1 and S2.
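The geometry described above, the two shadows hugging the left and right edges while the line joining them tilts by the roll angle, can be sketched as follows (a hedged illustration; the coordinate convention with the origin at the screen centre, positive roll meaning counter-clockwise, and the clamping rule are assumptions, not part of the original):

```python
import math

def shadow_positions(roll_deg, screen_w, screen_h):
    """Place the two 2D propeller shadows S1 (left) and S2 (right) so that
    the line joining them tilts by the roll attitude angle.

    Coordinates: origin at the screen centre, x to the right, y upward.
    Positive roll_deg is taken to mean a counter-clockwise roll, which
    moves S1 down the left edge and S2 up the right edge.
    """
    half_w = screen_w / 2.0
    # Vertical offset that makes the S1-S2 line's slope equal tan(roll).
    dy = half_w * math.tan(math.radians(roll_deg))
    # Clamp so the shadows stay inside the visible frame.
    dy = max(-screen_h / 2.0, min(screen_h / 2.0, dy))
    s1 = (-half_w, -dy)  # left edge
    s2 = (half_w, dy)    # right edge
    return s1, s2
```

Since both shadows are offset symmetrically, the tilt of the S1-S2 line recovers the roll angle exactly (until the clamp engages at large angles).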
Referring to FIG. 7, when a third propeller shadow S3 and a fourth propeller shadow S4 represent the roll attitude angle of the gimbal 500: if the gimbal 500 is currently in stabilization mode, the inclination angle of the line connecting S3 and S4 always remains unchanged (for example 0 degrees), whereas in follow mode the inclination angle of the line connecting S3 and S4 moves with the first propeller shadow S1 and second propeller shadow S2 of the UAV 100, the two inclination angles being basically the same.
Continuing with FIG. 7, in one example the line connecting the UAV's S1 and S2 may pass through the center of the upper half of the FPV picture, its inclination angle being the angle of that line relative to the horizontal line through that center; the line connecting the gimbal's S3 and S4 may pass through the center of the lower half of the FPV picture, its inclination angle being the angle of that line relative to the horizontal line through that center. The roll attitude angle of the UAV 100 and that of the gimbal 500 are thus displayed simultaneously.
Referring to FIGS. 2, 5 and 8, in some embodiments the functional component 20 includes motors 22, the virtual image includes the two-dimensional first propeller shadow S1 and second propeller shadow S2 located on opposite sides of the display image, and step 012 further includes:
0122: Generate the rotatable first propeller shadow S1 and second propeller shadow S2 according to the rotation speeds of the motors 22, and determine the rotation speeds of S1 and S2.
In some embodiments, the processor 210 is further configured to generate the rotatable first propeller shadow S1 and second propeller shadow S2 according to the rotation speeds of the motors 22 and to determine their rotation speeds. That is to say, step 0122 can be implemented by the processor 210.
Specifically, in the embodiments of the present application, since the photographing device 30 does not film the actual blades, the pilot cannot know whether the blades are rotating normally or how fast they are rotating. When a motor 22 is damaged, its blade slows down or even stops rotating; therefore, the processor 210 generates the rotatable first propeller shadow S1 and second propeller shadow S2 according to the motor speeds to indicate them. When S1 and S2 stop rotating or suddenly become extremely slow, a motor 22 may have a problem, and the racing drone is no longer fit to fly and needs to land as soon as possible for inspection. In one embodiment, the rotation speeds of S1 and S2 may be determined by the lowest speed among the corresponding one or more motors 22: for example, a four-rotor racing drone is divided into two groups, each group including two blades and each blade corresponding to one motor 22; S1 is determined by the lowest speed among the motors 22 of the first group and S2 by the lowest speed among the motors 22 of the second group, so that the pilot promptly notices a motor 22 that may be damaged (for example, one whose rotation speed is less than or equal to a predetermined speed). Of course, there may be more propeller shadows, for example two first propeller shadows S1 and two second propeller shadows S2, each shadow corresponding to one motor 22, so that the possibly damaged motor 22 can be accurately identified from the rotation speed of its shadow.
In the method shown in FIG. 8, step 012 includes steps 0121 and 0122; that is, through the display image with the added virtual image, the pilot can learn both the roll attitude angle of the racing drone and the speed or damage condition of its motors. It should be understood that step 012 of the flight assistance method may instead include step 0121 alone, step 0122 alone, other feasible steps alone, or a combination of these steps, which is not limited here.
Referring to FIGS. 2, 9 and 10, in some embodiments the virtual image includes a three-dimensional aircraft image S5, and the attitude angle of the gyroscope 21 includes the attitude angles of the roll axis, the pitch axis and the yaw axis. Step 012 further includes:
0123: Generate the aircraft image S5 according to the attitude angle of the gyroscope 21, and determine the roll, pitch and yaw attitude angles of the aircraft image S5 according to the attitude angles of the roll axis, the pitch axis and the yaw axis, respectively.
In some embodiments, the processor 210 is further configured to generate the aircraft image S5 according to the attitude angle of the gyroscope 21 and to determine its roll, pitch and yaw attitude angles according to the attitude angles of the roll axis, the pitch axis and the yaw axis, respectively. That is to say, step 0123 can be implemented by the processor 210.
Specifically, in order to accurately indicate the roll-, pitch- and yaw-axis attitude angles of the UAV 100, a three-dimensional aircraft image S5 may be generated from the attitude angle of the gyroscope 21; the roll, pitch and yaw attitude angles of this three-dimensional aircraft image S5 are kept consistent with the roll-, pitch- and yaw-axis attitude angles of the gyroscope 21, respectively. In this way, the aircraft image S5 changes in synchronization with the attitude of the UAV 100, and from the three-axis (i.e. roll, pitch and yaw) attitude changes of the virtual three-dimensional aircraft image S5 the pilot can accurately determine the three-axis attitude changes of the UAV 100, and is thereby assisted in better controlling the UAV 100.
Referring to FIGS. 2 and 11, in some embodiments the flight assistance method includes:
014: Adjust the parameters of the virtual image so that the content of the display image is not occluded by the virtual image.
In some embodiments, the processor 210 is further configured to adjust the parameters of the virtual image so that the content of the display image is not occluded by it. That is to say, step 014 can be implemented by the processor 210.
Specifically, since a virtual image is added to the display screen, in order for the virtual image to indicate the flight status of the UAV 100 while affecting the display screen as little as possible (for example, occluding it as little as possible), the processor 210 may adjust the parameters of the virtual image. The parameters may be its color, size and/or transparency: the processor 210 may adjust its color, size or transparency; or its color and size; or its color and transparency; or its size and transparency; or its color, size and transparency. In this embodiment, the processor 210 adjusting color, size and transparency is taken as an example.
The processor 210 may adjust the color of the virtual image according to the colors of the display image itself acquired by the photographing device 30, its size according to the size of the display image, and its transparency according to the colors of the display image. For example, if most of the display image (more than 70% of the pixels) is blue, the virtual image is displayed in a color differing strongly from blue (such as red or green) so that the user can spot it quickly; meanwhile, the processor 210 adjusts the size of the virtual image according to the size of the display image (for example, to 1/20 of the size of the display image) so that it does not occlude too large an area, and adjusts its transparency according to the colors of the display image so that even the area of the display image occluded by the virtual image can still be observed by the pilot. It can be understood that when the virtual image and the display image are close in color (for example, 70% or more of the pixels share the same color), the transparency must be set high to distinguish them well, whereas when their colors are far apart (for example, 30% or fewer of the pixels share the same color), a low transparency already distinguishes them well; therefore, when the virtual image and the display image are displayed in different colors, the transparency of the virtual image can be set low (such as 20%), so that the virtual image is observed quickly by the pilot while barely occluding the display image. In this way, by adjusting the parameters of the virtual image, the processor 210 ensures that the adjusted virtual image can be observed quickly by the pilot without occluding the content of the display image.
In other embodiments, the processor 210 is further configured to receive an adjustment instruction and adjust the parameters of the virtual image according to it. The adjustment instruction may be obtained by receiving the pilot's input through the remote controller 300, or through a terminal (such as a mobile phone or tablet computer) connected to the remote controller 300; the remote controller 300 may be provided with a mounting frame on which the terminal can be placed directly for a wired connection with the remote controller 300, or the terminal may connect to the remote controller 300 through a data cable, or wirelessly (for example via Bluetooth, or by the terminal and the remote controller 300 both joining a cellular mobile network or a wireless local area network). In this way, the virtual image best suited to the pilot can be set according to the pilot's personal preference.
Referring to FIGS. 2 and 12, in some embodiments step 013 includes:
0131: Acquire video data, the video data including one or more frames of display images and a timestamp corresponding to each frame; and
0132: Add the virtual image generated from the flight parameters of the functional component 20 acquired at the same moment as the timestamp onto the display image corresponding to that timestamp, to indicate the flight status of the functional component 20 at that timestamp.
In some embodiments, the processor 210 is further configured to acquire the video data, including one or more frames of display images and a timestamp for each frame, and to add the virtual image generated from the flight parameters of the functional component 20 acquired at the same moment as the timestamp onto the display image corresponding to that timestamp, to indicate the flight status of the functional component 20 at that timestamp. That is to say, steps 0131 and 0132 can be implemented by the processor 210.
Specifically, when the processor 210 adds the virtual image to the display image, in order to make the flight status indicated by the virtual image match the current display image, the video data captured by the photographing device 30 must first be acquired. The video data includes one or more frames of display images, each corresponding to a timestamp (hereinafter the first timestamp) that indicates the moment the frame was acquired. At present, the delay from acquiring the video data captured by the photographing device 30 to displaying the image has been made extremely small, so the capture time of the video data can essentially be regarded as its display time. Likewise, the flight parameters also carry a timestamp (hereinafter the second timestamp). A display image with a given first timestamp corresponds one to one with the flight parameters whose second timestamp coincides with that first timestamp, and the virtual image generated from the flight parameters acquired at that second timestamp can indicate the flight status of the UAV 100 at the first timestamp. Therefore, after the virtual image is added to the display image of the first timestamp, the pilot can determine the flight status of the UAV 100 at the first timestamp by observing the display screen with the added virtual image.
It should be understood that the flowcharts of the flight assistance method shown in FIGS. 11-12 are shown only as examples and are not limiting. The flight assistance method of the present application may further include steps 011, 0121, 0122, 0123, 0131, 0132, 014, or any combination of other feasible steps, which is not limited here.
Referring to FIGS. 2 and 13, in some embodiments the flight assistance method further includes:
015: Send the display image with the added virtual image, so that the corresponding display screen is shown on the display 400.
In some embodiments, the processor 210 is further configured to send the display image with the added virtual image so that the corresponding display screen is shown on the display 400. That is to say, step 015 can be implemented by the processor 210.
Specifically, the display image is generally shown on the display 400, which may be the display 400 built into the remote controller 300 itself, or the display 400 of a terminal connected to the remote controller 300 or the UAV 100; the terminal may be a mobile phone, a tablet computer or a smart wearable device (such as FPV glasses), mounted on the remote controller 300 or wired to it through a data cable, or wirelessly connected to the remote controller 300 or the UAV 100 through a cellular mobile network or a wireless local area network.
After adding the virtual image to the display image, the processor 210 sends the display image with the added virtual image so that the corresponding display screen (such as an FPV picture) is shown on the display 400. The UAV 100 may include an image transmission module 40, the device in the UAV 100 for realizing wireless transmission of images. For example, referring to FIG. 14, when the display 400 is built into the remote controller 300 and the flight assistance device 200 is provided on the remote controller 300, the image transmission module 40 of the UAV 100 sends the video data (display images) to the remote controller 300 as soon as it is acquired; the processor 210 then obtains the flight parameters of the UAV 100 through wireless communication, generates the virtual image from them and adds it to the display image, and the display 400 of the remote controller 300 shows the corresponding display screen based on the display image with the added virtual image (that is, the processor 210 obtains the display image from the image transmission module 40 and sends the display image with the added virtual image to the display 400 to show the corresponding display screen there). Referring to FIG. 15, when the display 400 is built into the remote controller 300 and the flight assistance device 200 is provided on the UAV 100, the processor 210 obtains the video data (display images) and flight parameters, generates the virtual image from the flight parameters and adds it to the display image, then sends the display image with the added virtual image to the image transmission module 40, which transmits it to the remote controller 300; the display 400 of the remote controller 300 shows the corresponding display screen based on that display image.
As another example, referring to FIG. 16, when the display 400 is the display of a terminal connected to the remote controller 300 and the flight assistance device 200 is provided on the remote controller 300, the image transmission module 40 of the UAV 100 sends the video data (display images) to the remote controller 300 as soon as it is acquired (that is, the processor 210 obtains the display image from the image transmission module 40); the processor 210 then obtains the flight parameters of the functional component 20 of the UAV 100 through wireless communication, generates the virtual image from the flight parameters and adds it to the display image, and the remote controller 300 sends the display image with the added virtual image to the terminal (such as the FPV glasses in FIG. 16). That is to say, the display image with the added virtual image is obtained by acquiring the flight parameters of the functional component 20, generating a virtual image from those flight parameters, and adding the virtual image to the display image; the display image with the added virtual image can indicate the flight status corresponding to the functional component. The display 400 of the FPV glasses shows the corresponding display screen according to that display image (that is, the display 400 receives the display image with the added virtual image in order to show the corresponding display screen). Referring again to FIG. 2, when the display 400 is the display of a terminal connected to the remote controller 300 and the flight assistance device 200 is provided on the UAV 100, the processor 210 obtains the video data (display images) and flight parameters, generates the virtual image from the flight parameters and adds it to the display image, then sends the display image with the added virtual image to the image transmission module 40, which transmits it to the remote controller 300; upon receiving it, the remote controller 300 forwards the display image with the added virtual image to the terminal, whose display 400 shows the corresponding display screen according to that display image (that is, the display 400 receives the display image with the added virtual image in order to show the corresponding display screen). It can be understood that, in another embodiment, the image transmission module 40 may also transmit the display image with the added virtual image directly to the terminal, so that the corresponding display screen is shown on the terminal's display 400.
As another example, referring to FIG. 17, when the display 400 is the display of a terminal connected to the UAV 100 and the flight assistance device 200 is provided on the remote controller 300, the processor 210 obtains the video data (display images) and flight parameters, generates the virtual image from the flight parameters and adds it to the display image, then sends the display image with the added virtual image through the remote controller 300 to the image transmission module 40, which transmits the display image to the terminal; the terminal's display 400 shows the corresponding display screen according to that display image.
The display 400 for the unmanned aerial vehicle 100 of the embodiments of the present application is used to receive the display image with the added virtual image in order to show the corresponding display screen. The unmanned aerial vehicle 100 includes the functional component 20; the display image with the added virtual image is obtained by acquiring the flight parameters of the functional component 20, generating a virtual image according to the flight parameters, and adding the virtual image to the display image, and the display image with the added virtual image can indicate the flight status corresponding to the functional component 20.
Referring to FIG. 18, an embodiment of the present application provides a computer-readable storage medium 500 containing computer-executable instructions 502. When the computer-executable instructions 502 are executed by one or more processors 210, the processors 210 are caused to perform the flight assistance method of any of the above embodiments.
For example, in conjunction with FIG. 1 and FIG. 2, when the computer-readable instructions 502 are executed by the processor 210, the processor 210 is caused to perform the following steps:
011: Acquire the flight parameters of the functional component 20;
012: Generate a virtual image according to the flight parameters; and
013: Add the virtual image to a display image to indicate the flight status corresponding to the functional component 20.
As another example, in conjunction with FIG. 2 and FIG. 3, when the computer-readable instructions 502 are executed by the processor 210, the processor 210 is caused to perform the following step:
0121: Generate the first propeller shadow and the second propeller shadow according to the attitude angle of the roll axis, and determine the inclination angle of the line connecting the first propeller shadow and the second propeller shadow.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "illustrative embodiment", "example", "specific example" or "some examples" means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, illustrative uses of these terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example, may be considered an ordered list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus or device (such as a computer-based system, a system including a processor, or another system that can fetch instructions from an instruction execution system, apparatus or device and execute them). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate or transport the program for use by, or in connection with, an instruction execution system, apparatus or device. More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CDROM). The computer-readable medium may even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting or otherwise suitably processing it if necessary, and then stored in a computer memory.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like. Although the embodiments of the present application have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limiting the present application; those of ordinary skill in the art may make changes, modifications, substitutions and variations to the above embodiments within the scope of the present application.

Claims (67)

  1. A flight assistance method for an unmanned aerial vehicle, wherein the unmanned aerial vehicle includes a functional component, the flight assistance method comprising:
    acquiring flight parameters of the functional component;
    generating a virtual image according to the flight parameters; and
    adding the virtual image to a display image to indicate a flight status corresponding to the functional component.
  2. The flight assistance method according to claim 1, wherein the functional component includes a gyroscope, the flight parameters include an attitude angle of the gyroscope, and acquiring the flight parameters of the functional component comprises:
    acquiring the attitude angle of the gyroscope.
  3. The flight assistance method according to claim 2, wherein the gyroscope is provided on a fuselage of the unmanned aerial vehicle and/or on a gimbal on the unmanned aerial vehicle.
  4. The flight assistance method according to claim 2, wherein the virtual image includes a two-dimensional first propeller shadow and second propeller shadow located on opposite sides of the display image, the attitude angle of the gyroscope includes an attitude angle of a roll axis, and generating the virtual image according to the flight parameters comprises:
    generating the first propeller shadow and the second propeller shadow according to the attitude angle of the roll axis, and determining an inclination angle of a line connecting the first propeller shadow and the second propeller shadow, the inclination angle being the angle of the line connecting the first propeller shadow and the second propeller shadow relative to the horizontal.
  5. The flight assistance method according to claim 1, wherein the functional component includes a motor, the flight parameters include a rotation speed of the motor, and acquiring the flight parameters of the functional component comprises:
    acquiring the rotation speed of the motor.
  6. The flight assistance method according to claim 5, wherein the virtual image includes a two-dimensional first propeller shadow and second propeller shadow located on opposite sides of the display image, and generating the virtual image according to the flight parameters comprises:
    generating the rotatable first propeller shadow and second propeller shadow according to the rotation speed of the motor, and determining the rotation speeds of the first propeller shadow and the second propeller shadow.
  7. The flight assistance method according to claim 2, wherein the virtual image includes a three-dimensional aircraft image, the attitude angle of the gyroscope includes attitude angles of a roll axis, a pitch axis and a yaw axis, and generating the virtual image according to the flight parameters comprises:
    generating the aircraft image according to the attitude angle of the gyroscope, and determining a roll attitude angle, a pitch attitude angle and a yaw attitude angle of the aircraft image according to the attitude angles of the roll axis, the pitch axis and the yaw axis, respectively.
  8. The flight assistance method according to claim 1, further comprising:
    adjusting parameters of the virtual image so that the content of the display image is not occluded by the virtual image.
  9. The flight assistance method according to claim 8, wherein the parameters include color, size and/or transparency.
  10. The flight assistance method according to claim 1, wherein adding the virtual image to the display image to indicate the flight status corresponding to the functional component comprises:
    acquiring video data, the video data including one or more frames of display images and a timestamp corresponding to each frame of the display images; and
    adding the virtual image, generated from the flight parameters of the functional component acquired at the same moment as the timestamp, to the display image corresponding to the timestamp, to indicate the flight status of the functional component at the timestamp.
  11. The flight assistance method according to claim 1, further comprising:
    sending the display image with the added virtual image, so that a corresponding display screen is shown on a display.
  12. A flight assistance device for an unmanned aerial vehicle, wherein the unmanned aerial vehicle includes a functional component, the flight assistance device comprising a processor configured to:
    acquire flight parameters of the functional component;
    generate a virtual image according to the flight parameters; and
    add the virtual image to a display image to indicate a flight status corresponding to the functional component.
  13. The flight assistance device according to claim 12, wherein the functional component includes a gyroscope, the flight parameters include an attitude angle of the gyroscope, and the processor is further configured to acquire the attitude angle of the gyroscope.
  14. The flight assistance device according to claim 13, wherein the gyroscope is provided on a fuselage of the unmanned aerial vehicle and/or on a gimbal on the unmanned aerial vehicle.
  15. The flight assistance device according to claim 13, wherein the virtual image includes a two-dimensional first propeller shadow and second propeller shadow located on opposite sides of the display image, the attitude angle of the gyroscope includes an attitude angle of a roll axis, and the processor is further configured to generate the first propeller shadow and the second propeller shadow according to the attitude angle of the roll axis and determine an inclination angle of a line connecting the first propeller shadow and the second propeller shadow, the inclination angle being the angle of that line relative to the horizontal.
  16. The flight assistance device according to claim 12, wherein the functional component includes a motor, the flight parameters include a rotation speed of the motor, and the processor is further configured to acquire the rotation speed of the motor.
  17. The flight assistance device according to claim 16, wherein the virtual image includes a two-dimensional first propeller shadow and second propeller shadow located on opposite sides of the display image, and the processor is further configured to generate the rotatable first propeller shadow and second propeller shadow according to the rotation speed of the motor and determine the rotation speeds of the first propeller shadow and the second propeller shadow.
  18. The flight assistance device according to claim 13, wherein the virtual image includes a three-dimensional aircraft image, the attitude angle of the gyroscope includes attitude angles of a roll axis, a pitch axis and a yaw axis, and the processor is further configured to generate the aircraft image according to the attitude angle of the gyroscope and determine a roll attitude angle, a pitch attitude angle and a yaw attitude angle of the aircraft image according to the attitude angles of the roll axis, the pitch axis and the yaw axis, respectively.
  19. The flight assistance device according to claim 12, wherein the processor is further configured to receive an adjustment instruction and adjust parameters of the virtual image according to the adjustment instruction, so that the content of the display image is not occluded by the virtual image.
  20. The flight assistance device according to claim 19, wherein the parameters include color, size and/or transparency.
  21. The flight assistance device according to claim 12, wherein the unmanned aerial vehicle further includes a photographing device configured to capture video data, and the processor is further configured to:
    acquire the video data, the video data including one or more frames of display images and a timestamp corresponding to each frame of the display images; and
    add the virtual image, generated from the flight parameters of the functional component acquired at the same moment as the timestamp, to the display image corresponding to the timestamp, to indicate the flight status of the functional component at the timestamp.
  22. The flight assistance device according to claim 12, wherein
    the processor is configured to send the display image with the added virtual image, so that a corresponding display screen is shown on a display.
  23. An unmanned aerial vehicle, comprising:
    a fuselage;
    a functional component disposed on the fuselage; and
    a flight assistance apparatus disposed on the fuselage, the flight assistance apparatus comprising a processor configured to:
    acquire a flight parameter of the functional component;
    generate a virtual image according to the flight parameter; and
    add the virtual image to a display image to indicate a flight state corresponding to the functional component.
  24. The unmanned aerial vehicle according to claim 23, wherein the functional component comprises a gyroscope, the flight parameter comprises an attitude angle of the gyroscope, and the processor is further configured to acquire the attitude angle of the gyroscope.
  25. The unmanned aerial vehicle according to claim 24, wherein the gyroscope is disposed on the fuselage of the unmanned aerial vehicle and/or on a gimbal of the unmanned aerial vehicle.
  26. The unmanned aerial vehicle according to claim 24, wherein the virtual image comprises a two-dimensional first propeller shadow and second propeller shadow, the first propeller shadow and the second propeller shadow are located on opposite sides of the display image, the attitude angle of the gyroscope comprises a roll-axis attitude angle, and the processor is further configured to generate the first propeller shadow and the second propeller shadow according to the roll-axis attitude angle and determine a tilt angle of a line connecting the first propeller shadow and the second propeller shadow, the tilt angle being the angle of the connecting line relative to the horizontal.
  27. The unmanned aerial vehicle according to claim 23, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the processor is further configured to acquire the rotational speed of the motor.
  28. The unmanned aerial vehicle according to claim 27, wherein the virtual image comprises a two-dimensional first propeller shadow and second propeller shadow, the first propeller shadow and the second propeller shadow are located on opposite sides of the display image, and the processor is further configured to generate the rotatable first propeller shadow and second propeller shadow according to the rotational speed of the motor and determine rotational speeds of the first propeller shadow and the second propeller shadow.
  29. The unmanned aerial vehicle according to claim 24, wherein the virtual image comprises a three-dimensional aircraft image, the attitude angle of the gyroscope comprises a roll-axis attitude angle, a pitch-axis attitude angle, and a yaw-axis attitude angle, and the processor is further configured to generate the aircraft image according to the attitude angle of the gyroscope and determine a roll attitude angle, a pitch attitude angle, and a yaw attitude angle of the aircraft image according to the roll-axis attitude angle, the pitch-axis attitude angle, and the yaw-axis attitude angle, respectively.
  30. The unmanned aerial vehicle according to claim 23, wherein the processor is further configured to receive an adjustment instruction and adjust a parameter of the virtual image according to the adjustment instruction, so that content of the display image is not obscured by the virtual image.
  31. The unmanned aerial vehicle according to claim 30, wherein the parameter comprises color, size, and/or transparency.
  32. The unmanned aerial vehicle according to claim 23, further comprising a photographing device configured to capture video data, wherein the processor is further configured to:
    acquire the video data, the video data comprising one or more frames of display images and a timestamp corresponding to each frame of display image; and
    add, to the display image corresponding to the timestamp, the virtual image generated from the flight parameter of the functional component acquired at the same instant as the timestamp, to indicate the flight state of the functional component at the timestamp.
  33. The unmanned aerial vehicle according to claim 23, further comprising an image transmission module, wherein the processor is configured to send the display image to which the virtual image has been added to the image transmission module, and the image transmission module is configured to transmit the display image to which the virtual image has been added to a display, so that a corresponding picture is shown on the display.
  34. The unmanned aerial vehicle according to claim 23, wherein the unmanned aerial vehicle comprises an FPV racing drone.
  35. A remote controller for an unmanned aerial vehicle, wherein the unmanned aerial vehicle comprises a functional component, the remote controller comprises a flight assistance apparatus, and the flight assistance apparatus comprises a processor configured to:
    acquire a flight parameter of the functional component;
    generate a virtual image according to the flight parameter; and
    add the virtual image to a display image to indicate a flight state corresponding to the functional component.
  36. The remote controller according to claim 35, wherein the functional component comprises a gyroscope, the flight parameter comprises an attitude angle of the gyroscope, and the processor is further configured to acquire the attitude angle of the gyroscope.
  37. The remote controller according to claim 36, wherein the gyroscope is disposed on a fuselage of the unmanned aerial vehicle and/or on a gimbal of the unmanned aerial vehicle.
  38. The remote controller according to claim 36, wherein the virtual image comprises a two-dimensional first propeller shadow and second propeller shadow, the first propeller shadow and the second propeller shadow are located on opposite sides of the display image, the attitude angle of the gyroscope comprises a roll-axis attitude angle, and the processor is further configured to generate the first propeller shadow and the second propeller shadow according to the roll-axis attitude angle and determine a tilt angle of a line connecting the first propeller shadow and the second propeller shadow, the tilt angle being the angle of the connecting line relative to the horizontal.
  39. The remote controller according to claim 35, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the processor is further configured to acquire the rotational speed of the motor.
  40. The remote controller according to claim 39, wherein the virtual image comprises a two-dimensional first propeller shadow and second propeller shadow, the first propeller shadow and the second propeller shadow are located on opposite sides of the display image, and the processor is further configured to generate the rotatable first propeller shadow and second propeller shadow according to the rotational speed of the motor and determine rotational speeds of the first propeller shadow and the second propeller shadow.
  41. The remote controller according to claim 36, wherein the virtual image comprises a three-dimensional aircraft image, the attitude angle of the gyroscope comprises a roll-axis attitude angle, a pitch-axis attitude angle, and a yaw-axis attitude angle, and the processor is further configured to generate the aircraft image according to the attitude angle of the gyroscope and determine a roll attitude angle, a pitch attitude angle, and a yaw attitude angle of the aircraft image according to the roll-axis attitude angle, the pitch-axis attitude angle, and the yaw-axis attitude angle, respectively.
  42. The remote controller according to claim 35, wherein the processor is further configured to receive an adjustment instruction and adjust a parameter of the virtual image according to the adjustment instruction, so that content of the display image is not obscured by the virtual image.
  43. The remote controller according to claim 42, wherein the parameter comprises color, size, and/or transparency.
  44. The remote controller according to claim 35, wherein the unmanned aerial vehicle further comprises a photographing device configured to capture video data, and the processor is further configured to:
    acquire the video data, the video data comprising one or more frames of display images and a timestamp corresponding to each frame of display image; and
    add, to the display image corresponding to the timestamp, the virtual image generated from the flight parameter of the functional component acquired at the same instant as the timestamp, to indicate the flight state of the functional component at the timestamp.
  45. The remote controller according to claim 35, wherein the unmanned aerial vehicle further comprises an image transmission module, and the processor is configured to acquire the display image from the image transmission module and send the display image to which the virtual image has been added to a display, so that a corresponding picture is shown on the display.
  46. A display for an unmanned aerial vehicle, wherein the display is configured to receive a display image to which a virtual image has been added, so as to show a corresponding picture; the unmanned aerial vehicle comprises a functional component; the display image to which the virtual image has been added is obtained by acquiring a flight parameter of the functional component, generating the virtual image according to the flight parameter, and adding the virtual image to the display image; and the display image to which the virtual image has been added is capable of indicating a flight state corresponding to the functional component.
  47. The display according to claim 46, wherein the functional component comprises a gyroscope, the flight parameter comprises an attitude angle of the gyroscope, and the acquiring a flight parameter of the functional component comprises: acquiring the attitude angle of the gyroscope.
  48. The display according to claim 47, wherein the gyroscope is disposed on a fuselage of the unmanned aerial vehicle and/or on a gimbal of the unmanned aerial vehicle.
  49. The display according to claim 47, wherein the virtual image comprises a two-dimensional first propeller shadow and second propeller shadow, the first propeller shadow and the second propeller shadow are located on opposite sides of the display image, the attitude angle of the gyroscope comprises a roll-axis attitude angle, and the generating a virtual image according to the flight parameter comprises:
    generating the first propeller shadow and the second propeller shadow according to the roll-axis attitude angle, and determining a tilt angle of a line connecting the first propeller shadow and the second propeller shadow, the tilt angle being the angle of the connecting line relative to the horizontal.
  50. The display according to claim 46, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the acquiring a flight parameter of the functional component comprises: acquiring the rotational speed of the motor.
  51. The display according to claim 50, wherein the virtual image comprises a two-dimensional first propeller shadow and second propeller shadow, the first propeller shadow and the second propeller shadow are located on opposite sides of the display image, and the generating a virtual image according to the flight parameter comprises:
    generating the rotatable first propeller shadow and second propeller shadow according to the rotational speed of the motor, and determining rotational speeds of the first propeller shadow and the second propeller shadow.
  52. The display according to claim 47, wherein the virtual image comprises a three-dimensional aircraft image, the attitude angle of the gyroscope comprises a roll-axis attitude angle, a pitch-axis attitude angle, and a yaw-axis attitude angle, and the generating a virtual image according to the flight parameter comprises:
    generating the aircraft image according to the attitude angle of the gyroscope, and determining a roll attitude angle, a pitch attitude angle, and a yaw attitude angle of the aircraft image according to the roll-axis attitude angle, the pitch-axis attitude angle, and the yaw-axis attitude angle, respectively.
  53. The display according to claim 46, wherein a parameter of the virtual image is adjustable so that content of the display image is not obscured by the virtual image.
  54. The display according to claim 53, wherein the parameter comprises color, size, and/or transparency.
  55. The display according to claim 46, wherein the adding the virtual image to a display image comprises:
    acquiring video data, the video data comprising one or more frames of display images and a timestamp corresponding to each frame of display image; and
    adding, to the display image corresponding to the timestamp, the virtual image generated from the flight parameter of the functional component acquired at the same instant as the timestamp, to indicate the flight state of the functional component at the timestamp.
  56. An unmanned aerial vehicle system, comprising an unmanned aerial vehicle, a remote controller, a flight assistance apparatus, and a display, wherein the unmanned aerial vehicle comprises a functional component, and the flight assistance apparatus is disposed on the unmanned aerial vehicle and/or the remote controller; the flight assistance apparatus comprises a processor configured to: acquire a flight parameter of the functional component; generate a virtual image according to the flight parameter; and add the virtual image to a display image to indicate a flight state corresponding to the functional component; and the display is configured to receive the display image to which the virtual image has been added, so as to show a corresponding picture.
  57. The unmanned aerial vehicle system according to claim 56, wherein the functional component comprises a gyroscope, the flight parameter comprises an attitude angle of the gyroscope, and the processor is further configured to acquire the attitude angle of the gyroscope.
  58. The unmanned aerial vehicle system according to claim 57, wherein the gyroscope is disposed on a fuselage of the unmanned aerial vehicle and/or on a gimbal of the unmanned aerial vehicle.
  59. The unmanned aerial vehicle system according to claim 57, wherein the virtual image comprises a two-dimensional first propeller shadow and second propeller shadow, the first propeller shadow and the second propeller shadow are located on opposite sides of the display image, the attitude angle of the gyroscope comprises a roll-axis attitude angle, and the processor is further configured to generate the first propeller shadow and the second propeller shadow according to the roll-axis attitude angle and determine a tilt angle of a line connecting the first propeller shadow and the second propeller shadow, the tilt angle being the angle of the connecting line relative to the horizontal.
  60. The unmanned aerial vehicle system according to claim 56, wherein the functional component comprises a motor, the flight parameter comprises a rotational speed of the motor, and the processor is further configured to acquire the rotational speed of the motor.
  61. The unmanned aerial vehicle system according to claim 60, wherein the virtual image comprises a two-dimensional first propeller shadow and second propeller shadow, the first propeller shadow and the second propeller shadow are located on opposite sides of the display image, and the processor is further configured to generate the rotatable first propeller shadow and second propeller shadow according to the rotational speed of the motor and determine rotational speeds of the first propeller shadow and the second propeller shadow.
  62. The unmanned aerial vehicle system according to claim 57, wherein the virtual image comprises a three-dimensional aircraft image, the attitude angle of the gyroscope comprises a roll-axis attitude angle, a pitch-axis attitude angle, and a yaw-axis attitude angle, and the processor is further configured to generate the aircraft image according to the attitude angle of the gyroscope and determine a roll attitude angle, a pitch attitude angle, and a yaw attitude angle of the aircraft image according to the roll-axis attitude angle, the pitch-axis attitude angle, and the yaw-axis attitude angle, respectively.
  63. The unmanned aerial vehicle system according to claim 56, wherein the processor is further configured to receive an adjustment instruction and adjust a parameter of the virtual image according to the adjustment instruction, so that content of the display image is not obscured by the virtual image.
  64. The unmanned aerial vehicle system according to claim 63, wherein the parameter comprises color, size, and/or transparency.
  65. The unmanned aerial vehicle system according to claim 56, wherein the unmanned aerial vehicle further comprises a photographing device configured to capture video data, and the processor is further configured to:
    acquire the video data, the video data comprising one or more frames of display images and a timestamp corresponding to each frame of display image; and
    add, to the display image corresponding to the timestamp, the virtual image generated from the flight parameter of the functional component acquired at the same instant as the timestamp, to indicate the flight state of the functional component at the timestamp.
  66. The unmanned aerial vehicle system according to claim 56, wherein the unmanned aerial vehicle further comprises an image transmission module; when the flight assistance apparatus is disposed on the unmanned aerial vehicle, the processor is configured to send the display image to which the virtual image has been added to the image transmission module, and the image transmission module is configured to transmit the display image to which the virtual image has been added to a display, so that a corresponding picture is shown on the display; and
    when the flight assistance apparatus is disposed on the remote controller, the processor is configured to acquire the display image from the image transmission module and send the display image to which the virtual image has been added to a display, so that a corresponding picture is shown on the display.
  67. A computer-readable storage medium containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the flight assistance method according to any one of claims 1 to 11.
PCT/CN2020/091885 2020-05-22 2020-05-22 Flight assistance method and apparatus, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium WO2021232424A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/091885 WO2021232424A1 (zh) 2020-05-22 2020-05-22 Flight assistance method and apparatus, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium
CN202080005564.8A CN112840286A (zh) 2020-05-22 2020-05-22 Flight assistance method and apparatus, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/091885 WO2021232424A1 (zh) 2020-05-22 2020-05-22 Flight assistance method and apparatus, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium

Publications (1)

Publication Number Publication Date
WO2021232424A1 true WO2021232424A1 (zh) 2021-11-25

Family

ID=75926589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/091885 WO2021232424A1 (zh) Flight assistance method and apparatus, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium

Country Status (2)

Country Link
CN (1) CN112840286A (zh)
WO (1) WO2021232424A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115938190A (zh) * 2022-12-01 2023-04-07 南京芯传汇电子科技有限公司 Portable control terminal for an unmanned aerial vehicle control training simulator

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493699A (zh) * 2009-03-04 2009-07-29 北京航空航天大学 Beyond-visual-range remote control method for an aerial unmanned aerial vehicle
US20140032034A1 * 2012-05-09 2014-01-30 Singularity University Transportation using network of unmanned aerial vehicles
CN103809600A (zh) * 2014-03-04 2014-05-21 北京航空航天大学 Human-machine interaction control system for an unmanned airship
CN104898697A (zh) * 2015-05-18 2015-09-09 国家电网公司 Three-dimensional dynamic model of an unmanned aerial vehicle and control method
CN105045277A (zh) * 2015-07-08 2015-11-11 西安电子科技大学 Multi-UAV operation information display system
CN107077113A (zh) * 2014-10-27 2017-08-18 深圳市大疆创新科技有限公司 Unmanned aerial vehicle flight display
CN107734289A (zh) * 2016-08-11 2018-02-23 鹦鹉无人机股份有限公司 Method for capturing an image, associated computer program, and electronic system for capturing video
CN108253966A (zh) * 2016-12-28 2018-07-06 昊翔电能运动科技(昆山)有限公司 Three-dimensional simulation display method for unmanned aerial vehicle flight
CN110187700A (zh) * 2019-06-10 2019-08-30 北京科技大学 Virtual-reality-based remote control system and method for a bionic flapping-wing flying robot

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200159252A1 (en) * 2018-11-21 2020-05-21 Eagle View Technologies, Inc. Navigating unmanned aircraft using pitch


Also Published As

Publication number Publication date
CN112840286A (zh) 2021-05-25

Similar Documents

Publication Publication Date Title
US10936894B2 (en) Systems and methods for processing image data based on region-of-interest (ROI) of a user
WO2019242553A1 Method for controlling the shooting angle of a photographing device, control device, and wearable device
WO2020143677A1 Flight control method and flight control system
WO2021078270A1 Detachable gimbal camera, aircraft, system, and gimbal replacement method therefor
WO2018072155A1 Wearable device for controlling an unmanned aerial vehicle, and unmanned aerial vehicle system
JP6899875B2 Information processing device, video display system, control method of information processing device, and program
US11272105B2 Image stabilization control method, photographing device and mobile platform
JP2016180866A Aerial photography device
JP2017163265A Piloting support system, information processing device, and program
WO2019075758A1 Imaging control method, imaging device, and unmanned aerial vehicle
WO2020172800A1 Inspection control method for a movable platform, and movable platform
KR20170044451A System and method for controlling a remote camera using a head-mounted display
WO2019230604A1 Inspection system
WO2021232424A1 Flight assistance method and apparatus, unmanned aerial vehicle, remote controller, display, unmanned aerial vehicle system, and storage medium
US11467572B2 Moving object operation system, operation signal transmission system, moving object operation method, program, and recording medium
WO2021251441A1 Method, system, and program
WO2020244648A1 Aircraft control method and apparatus, and aircraft
WO2021168821A1 Control method and device for a movable platform
WO2018020853A1 Moving object piloting system, piloting signal transmission system, moving object piloting method, program, and recording medium
WO2020209167A1 Information processing device, information processing method, and program
WO2020168519A1 Method for adjusting photographing parameters, photographing device, and movable platform
WO2022061934A1 Image processing method, apparatus, system, platform, and computer-readable storage medium
WO2022253018A1 Video display method and display system from the unmanned aerial vehicle's perspective
WO2019178827A1 Communication control method and system for an unmanned aerial vehicle, and unmanned aerial vehicle
US11610343B2 Video display control apparatus, method, and non-transitory computer readable medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20936520

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20936520

Country of ref document: EP

Kind code of ref document: A1