WO2022141122A1 - Control method for unmanned aerial vehicle, unmanned aerial vehicle and storage medium - Google Patents

Control method for unmanned aerial vehicle, unmanned aerial vehicle and storage medium

Info

Publication number
WO2022141122A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
amount
target image
image area
virtual
Prior art date
Application number
PCT/CN2020/141085
Other languages
English (en)
French (fr)
Inventor
张立天
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/141085 priority Critical patent/WO2022141122A1/zh
Priority to CN202080079886.7A priority patent/CN114761898A/zh
Publication of WO2022141122A1 publication Critical patent/WO2022141122A1/zh
Priority to US18/343,369 priority patent/US20230359198A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • B64U10/14Flying platforms with four distinct rotor axes, e.g. quadcopters
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/22Command input arrangements
    • G05D1/221Remote-control arrangements
    • G05D1/222Remote-control arrangements operated by humans
    • G05D1/224Output arrangements on the remote controller, e.g. displays, haptics or speakers
    • G05D1/2244Optic
    • G05D1/2247Optic providing the operator with simple or augmented images from one or more cameras
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60Intended control result
    • G05D1/656Interaction with payloads or external entities
    • G05D1/689Pointing payloads towards fixed or moving targets
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00Specific applications of the controlled vehicles
    • G05D2105/30Specific applications of the controlled vehicles for social or care-giving applications
    • G05D2105/345Specific applications of the controlled vehicles for social or care-giving applications for photography
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00Types of controlled vehicles
    • G05D2109/20Aircraft, e.g. drones
    • G05D2109/25Rotorcrafts
    • G05D2109/254Flying platforms, e.g. multicopters

Definitions

  • the present application relates to the technical field of unmanned aerial vehicles, and in particular, to a control method of an unmanned aerial vehicle, an unmanned aerial vehicle and a storage medium.
  • the prior art provides a solution based on a panoramic camera.
  • the panoramic camera is installed on the drone, and the panoramic video is obtained through the panoramic camera during the flight of the drone.
  • after the drone finishes flying, the user edits the panoramic video with video post-production software and crops out the video effect the user wants.
  • the present application provides a control method of an unmanned aerial vehicle, an unmanned aerial vehicle and a storage medium.
  • the present application provides a control method for an unmanned aerial vehicle.
  • the unmanned aerial vehicle is provided with a camera device, the camera device is used to obtain a panoramic image, and the unmanned aerial vehicle is communicatively connected to a control device, and the method includes:
  • acquiring a control stick amount sent by the control device; determining a target image area in the panoramic image acquired by the camera device according to the control stick amount; and sending the target image area to the control device, so that the control device displays the target image area.
  • the present application provides an unmanned aerial vehicle.
  • the unmanned aerial vehicle is provided with a camera device, and the camera device is used to acquire panoramic images.
  • the unmanned aerial vehicle is connected to the control device in communication, and the unmanned aerial vehicle further includes : memory and processor;
  • the memory is used to store computer programs
  • the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
  • acquiring the control stick amount sent by the control device; determining a target image area in the panoramic image acquired by the camera device according to the control stick amount; and sending the target image area to the control device, so that the control device displays the target image area.
  • the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the control method of the unmanned aerial vehicle described above.
  • the embodiments of the present application provide a control method for an unmanned aerial vehicle, an unmanned aerial vehicle, and a storage medium.
  • the unmanned aerial vehicle is provided with a camera device for acquiring panoramic images, the unmanned aerial vehicle is communicatively connected to a control device, and the control stick amount sent by the control device is acquired; a target image area is determined in the panoramic image acquired by the camera device according to the control stick amount; and the target image area is sent to the control device so that the control device displays it.
  • in this way, the drone can promptly determine the target image area in the panoramic image according to the control stick amount sent by the control device and return the target image area to the control device, so that the user can view the picture corresponding to the control stick amount in real time.
  • once the control device receives the target image area it can display it promptly, and the user can watch the picture corresponding to the stick amount in real time, such as various thrilling pictures, thereby meeting the user's needs and improving the user experience.
  • FIG. 1 is a schematic flowchart of an embodiment of a control method for an unmanned aerial vehicle of the present application
  • Fig. 2 is the FOV schematic diagram of the fisheye camera device arranged above the unmanned aerial vehicle in one embodiment of the control method of the unmanned aerial vehicle of the present application;
  • Fig. 3 is the FOV schematic diagram of the fisheye camera device arranged below the drone in the embodiment of Fig. 2;
  • Fig. 4 is the FOV schematic diagram of upper and lower two fisheye camera devices splicing in the embodiment of Fig. 2;
  • FIG. 5 is a schematic structural diagram of an unmanned aerial vehicle arm and a camera device deployed in an embodiment of a control method for an unmanned aerial vehicle of the present application;
  • FIG. 6 is a schematic structural diagram of the folding of the arms and the camera device of the drone in the embodiment of FIG. 5;
  • FIG. 7 is a schematic flowchart of another embodiment of the control method of the UAV of the present application.
  • FIG. 8 is a schematic diagram of an embodiment of determining a target image area in the control method of the UAV of the present application.
  • FIG. 9 is a schematic flowchart of another embodiment of the control method of the UAV of the present application.
  • FIG. 10 is a schematic structural diagram of an embodiment of a remote controller joystick in the control method of the unmanned aerial vehicle of the present application
  • FIG. 11 is a schematic diagram of an embodiment of determining the yaw offset angle in the control method of the UAV of the present application.
  • FIG. 12 is a schematic diagram of an embodiment of a virtual camera coordinate system in the control method of the UAV of the present application.
  • FIG. 13 is a schematic structural diagram of an embodiment of the UAV of the present application.
  • Description of main components and reference numerals: 100. Unmanned aerial vehicle; 1. Memory; 2. Processor; 3. Camera device; 10. First arm; 20. Second arm; 30. Rotating shaft.
  • FPV flying of drones is becoming more and more popular, but the operational difficulty of FPV flying is very high.
  • the prior art provides a solution based on a panoramic camera: during the flight of the UAV, a panoramic video is obtained through an installed panoramic camera, and after the UAV finishes flying the user edits the panoramic video and crops out the desired video effect.
  • this post-production method cannot allow users to watch the thrilling pictures in real time during the flight of the drone, and it is still difficult to meet the needs of users.
  • the embodiments of the present application provide a control method for an unmanned aerial vehicle, an unmanned aerial vehicle, and a storage medium.
  • the unmanned aerial vehicle is provided with a camera device for acquiring panoramic images, the unmanned aerial vehicle is communicatively connected to a control device, and the control stick amount sent by the control device is acquired; a target image area is determined in the panoramic image acquired by the camera device according to the control stick amount; and the target image area is sent to the control device so that the control device displays it.
  • because the drone is communicatively connected to the control device, the control device sends the control stick amount, and the drone determines the target image area in the panoramic image obtained by the camera device according to the acquired control stick amount and sends the target image area to the control device so that the control device displays it.
  • in this way, the drone can promptly determine the target image area in the panoramic image according to the control stick amount sent by the control device and return the target image area to the control device, providing technical support for letting the user view the picture corresponding to the stick amount in real time.
  • once the control device receives the target image area it can display it promptly, and the user can watch the picture corresponding to the stick amount in real time, such as various thrilling pictures, thereby meeting the user's needs and improving the user experience.
  • FIG. 1 is a schematic flowchart of an embodiment of a control method for an unmanned aerial vehicle of the present application.
  • the drone is provided with a camera device, and the camera device is used to acquire panoramic images.
  • the number of the camera devices is one or more; when the number of camera devices is multiple, the panoramic image may be formed by splicing images captured by the multiple camera devices.
  • in practical applications, how many camera devices are needed depends on the field of view (FOV) of the selected camera devices and on the required stitching quality: the smaller the FOV of each camera device, the more camera devices are needed to achieve 360° full coverage.
  • the cameras are two fisheye cameras, which are respectively set above and below the drone.
  • each fisheye camera covers more than half of the FOV of the panoramic image, and the FOVs of the two fisheye cameras partially overlap each other.
  • as shown in Figures 2 and 3, the FOV of each of the two fisheye cameras is represented by a solid-line box in the figure; as shown in Figure 4, the FOVs of the two fisheye cameras are represented by the upper and lower dashed-line boxes respectively, the combined FOV of the two fisheye cameras is represented by the solid-line box, and stitching the images captured by these two fisheye cameras yields a panoramic image covering the combined FOV.
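  • As a rough illustration of the camera-count trade-off noted above, the following sketch (not part of the disclosure; it only counts a horizontal ring of cameras and assumes each camera contributes its full horizontal FOV minus a fixed stitching overlap) estimates how many camera devices are needed:

```python
import math

def cameras_for_full_circle(camera_hfov_deg: float, overlap_deg: float = 10.0) -> int:
    """Estimate how many cameras are needed to cover a 360° horizontal ring.

    Assumes each camera contributes (camera_hfov_deg - overlap_deg) of unique
    coverage; the overlap is what the stitcher uses to blend adjacent views.
    """
    effective = camera_hfov_deg - overlap_deg
    if effective <= 0:
        raise ValueError("camera FOV must exceed the required stitching overlap")
    return math.ceil(360.0 / effective)

# The smaller the per-camera FOV, the more camera devices are required:
print(cameras_for_full_circle(190.0))  # wide fisheye lenses -> 2 cameras
print(cameras_for_full_circle(90.0))   # conventional lenses -> 5 cameras
```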
  • as shown in Figures 5 and 6, in an embodiment the drone 100 includes a first arm 10 and a second arm 20, the second arm 20 and the first arm 10 are connected by a rotating shaft 30, and both ends of the rotating shaft 30 are provided with the camera devices 3, the camera devices 3 being fisheye camera devices. Because the second arm 20 is connected with the first arm 10 through the rotating shaft 30 and the camera devices 3 are disposed at the two ends of the rotating shaft, the relative positions of the camera devices 3 do not change when the second arm 20 and the first arm 10 of the drone 100 rotate relative to each other; therefore, there is no need to re-calibrate their relative positions when the camera devices 3 at the two ends of the rotating shaft 30 perform panoramic shooting. This ensures both the stitching speed of panoramic images and the accuracy of panoramic image stitching.
  • the method of this embodiment includes: step S101, step S102 and step S103.
  • Step S101: Acquire the control stick amount sent by the control device.
  • Step S102: Determine a target image area in the panoramic image acquired by the camera device according to the control stick amount.
  • Step S103: Send the target image area to the control device, so that the control device displays the target image area.
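  • The three steps can be read as a single onboard loop. The sketch below only illustrates that flow under assumed interfaces; `link`, `camera` and `determine_target_image_area` are hypothetical placeholders, not names from the disclosure:

```python
def onboard_control_loop(link, camera, determine_target_image_area):
    """Hypothetical onboard loop for steps S101-S103.

    `link` and `camera` stand in for the drone's communication and imaging
    interfaces; `determine_target_image_area` is the step-S102 routine
    (for instance, one of the sketches later in this document).
    """
    while True:
        stick_amount = link.receive_stick_amount()                           # S101: stick amount from the control device
        panorama = camera.latest_panorama()                                  # panoramic image from the camera device(s)
        target_area = determine_target_image_area(panorama, stick_amount)    # S102: pick the target image area
        link.send_to_control_device(target_area)                             # S103: the control device then displays it
```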
  • the drone is communicatively connected to the control device.
  • the control device may refer to a device that can send, to the drone, control instructions to which the drone can respond.
  • the control device includes but is not limited to: a remote controller, user equipment, a terminal device, and the like.
  • the control device may also be a combination of two or more control devices, such as a remote controller and user equipment, or a remote controller and a terminal device.
  • the joystick amount may refer to a control instruction for determining a target image area in the panoramic image acquired by the camera.
  • the amount of the control stick may be issued by the user by pushing the joystick by hand, or by the user by touching the joystick on the touch screen, or by directly inputting an instruction, and so on.
  • there are many ways to determine the target image area in the panoramic image acquired by the camera device according to the control stick amount. For example, the panoramic image may be divided into multiple image areas in advance and a correspondence between preset stick amounts and preset image areas may be established, so that the target image area can be determined from the stick amount sent by the control device and this correspondence; as another example, a virtual attitude angle may be mapped from the stick amount and the target image area determined according to the virtual attitude angle; and so on.
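  • A minimal sketch of the first of those implementations, assuming a pre-defined lookup from preset stick amounts to preset image areas (the keys and pixel rectangles below are invented purely for illustration):

```python
# Hypothetical correspondence between preset stick amounts and preset image
# areas of the panorama, expressed as (x0, y0, x1, y1) pixel rectangles.
PRESET_REGIONS = {
    "stick_up":    (1600,    0, 2400,  600),
    "stick_down":  (1600, 1400, 2400, 2000),
    "stick_left":  ( 800,  700, 1600, 1300),
    "stick_right": (2400,  700, 3200, 1300),
    "neutral":     (1600,  700, 2400, 1300),
}

def target_region_from_lookup(stick_amount: str):
    """Return the preset target image area mapped to the received stick amount."""
    return PRESET_REGIONS.get(stick_amount, PRESET_REGIONS["neutral"])
```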
  • the drone in the embodiment of the present application is provided with a camera device for acquiring panoramic images and is communicatively connected to a control device; the drone acquires the control stick amount sent by the control device, determines a target image area in the panoramic image acquired by the camera device according to the control stick amount, and sends the target image area to the control device so that the control device displays it.
  • because the drone is communicatively connected to the control device, the control device sends the control stick amount, and the drone determines the target image area in the panoramic image obtained by the camera device according to the acquired control stick amount and sends the target image area to the control device so that the control device displays it.
  • in this way, the drone can promptly determine the target image area in the panoramic image according to the control stick amount sent by the control device and return the target image area to the control device, providing technical support for letting the user view the picture corresponding to the stick amount in real time, in particular while the drone is flying.
  • once the control device receives the target image area it can display it promptly, and the user can watch the picture corresponding to the stick amount in real time, in particular during the flight of the drone, such as various thrilling pictures, thereby meeting the user's needs and improving the user experience.
  • the method of the embodiment of the present application may be applied to an application scenario consisting of a drone, a remote controller, and a head-mounted display device. That is, the control device includes a remote controller and a head-mounted display device.
  • the acquiring the control rod amount sent by the control device may include: acquiring the control rod amount sent by the remote control.
  • the sending of the target image area to the control apparatus, so that the control apparatus displays the target image area, may include: sending the target image area to the head-mounted display device, so that the head-mounted display device displays the target image area.
  • a head-mounted display device uses a set of optical systems (mainly precision optical lenses) to magnify the image on an ultra-fine display screen and project it onto the retina, presenting a large-screen image to the viewer's eyes, much as looking at an object through a magnifying glass presents an enlarged virtual image of the object.
  • by sending optical signals to the user's eyes, a head-mounted display device can achieve effects such as virtual reality (VR), augmented reality (AR), and mixed reality (MR).
  • with a conventional display device the user must look at the device, whereas with a head-mounted display device this is unnecessary; in addition, since a head-mounted display device usually takes the shape of a hat or glasses, it is easy to carry and can be used at any time, and because it uses a small display screen it consumes little power, producing a significant energy-saving effect especially when a large virtual display is formed.
  • in the application scenario composed of the drone, the remote controller and the head-mounted display device described above, the user sends the control stick amount through the remote controller; after the drone receives the control stick amount, it determines the target image area in the panoramic image obtained by the camera device and then sends the target image area to the head-mounted display device.
  • after the head-mounted display device receives the target image area, it displays the target image area, and the user can view the target image area immersively.
  • the method of the embodiment of the present application may be applied in an application scenario composed of a drone, a remote controller, and a terminal device. That is, the control apparatus includes a remote controller and a terminal device.
  • the acquiring the control stick quantity sent by the control apparatus may include: acquiring the control stick quantity sent by the remote controller.
  • the sending the target image area to the control apparatus, so that the control apparatus displays the target image area may include: sending the target image area to the terminal device , so that the terminal device displays the target image area.
  • the terminal device includes, but is not limited to, a smart phone, a ground control station, a personal computer, a palmtop computer, and the like.
  • for example, an application can be installed on the user's mobile phone; the user sends the stick amount through the remote controller, and after the drone receives the stick amount it determines the target image area in the panoramic image obtained by the camera device and then sends the target image area to the user's mobile phone.
  • after the user's mobile phone receives the target image area, it displays the target image area on the phone screen, and the user can view it.
  • the drone can send the target image area to the remote controller through a private communication link, and the remote controller can forward it to the mobile phone.
  • the remote controller can forward the target image area to the mobile phone through a connecting line.
  • the drone can send the target image area directly to the mobile phone through a standard communication link, such as WIFI, 4G, etc.
  • the method of the embodiment of the present application may be applied in an application scenario consisting of a drone and a control device having a control area and a display area. That is, the control device is provided with a control area and a display area.
  • in this case, the acquiring of the control stick amount sent by the control device may include: acquiring the control stick amount sent by the control device, where the control stick amount is generated based on a user operation in the control area.
  • the sending the target image area to the control apparatus, so that the control apparatus displays the target image area may include: sending the target image area to the control apparatus , so that the display area of the control device displays the target image area.
  • the control device has a control area and a display area, the control area can be operated by the user to generate and send out the control rod quantity, and the display area is used for display.
  • the user generates and sends the control rod amount through the control area of the control device.
  • after the drone receives the control stick amount, it determines the target image area in the panoramic image obtained by the camera device and then sends the target image area to the control device; after the control device receives the target image area, it displays the target image area in the display area, and the user can view it.
  • the details of step S102 will be described in detail below.
  • step S102, the determining a target image area in the panoramic image obtained by the camera according to the control rod amount may include sub-step S1021 and sub-step S1022, as shown in FIG. 7 .
  • Sub-step S1021 Determine the virtual attitude angle mapped by the control stick quantity according to the control stick quantity.
  • Sub-step S1022 Determine the target image area according to the virtual attitude angle.
  • the virtual attitude angle may refer to an imaginary and virtual attitude angle; the virtual attitude angle may include at least one of a pitch angle, a yaw angle, and a roll angle; the virtual attitude angle is used to determine the target image area in the panoramic image.
  • in this embodiment the target image area is not determined directly from the stick amount; instead, the stick amount is mapped to a virtual attitude angle, and the target image area is then determined according to the virtual attitude angle.
  • in this way, the manner of determining the target image area in the panoramic image according to the stick amount is more intuitive, flexible, diverse and convenient, and can therefore better satisfy the user's various needs.
  • the determining the target image area according to the virtual attitude angle may include: determining the target image area according to a preset angle of view and the virtual attitude angle.
  • in optical engineering the field of view angle is also called the field of view; its size determines the field of view of the optical instrument, and the larger the field of view angle, the larger the field of view.
  • the preset field of view angle can be used to determine the range of the target image area in the panoramic image;
  • the virtual attitude angle can be used to determine the center of the target image area.
  • as shown in FIG. 8, the solid-line box in the figure represents the panoramic image, the virtual attitude angle determines the center A of the target image area, and the preset field of view angle determines the range of the target image area in the panoramic image; the box formed by A1, A2, A3 and A4 represents the range of the target image area.
  • since the range of the target image area can be determined according to the preset field of view angle and the center of the target image area can be determined according to the virtual attitude angle, the target image area can be determined quickly and accurately in this way.
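  • As a concrete illustration of this step, the sketch below is an assumption only: it supposes the panorama is stored in an equirectangular projection (which the disclosure does not specify) and ignores any roll component of the virtual attitude angle:

```python
def target_area_from_attitude(pano_w: int, pano_h: int, yaw_deg: float, pitch_deg: float,
                              hfov_deg: float = 90.0, vfov_deg: float = 60.0):
    """Locate the target image area in an equirectangular panorama.

    The virtual attitude angle (yaw, pitch) fixes the centre A of the area;
    the preset field of view fixes its extent (the box A1-A2-A3-A4).
    Yaw is wrapped to [0, 360) across the image width; pitch in [-90, 90]
    runs from the top of the image (+90) to the bottom (-90).
    """
    # Centre pixel from the virtual attitude angle.
    cx = (yaw_deg % 360.0) / 360.0 * pano_w
    cy = (90.0 - max(-90.0, min(90.0, pitch_deg))) / 180.0 * pano_h
    # Extent from the preset field of view.
    w = hfov_deg / 360.0 * pano_w
    h = vfov_deg / 180.0 * pano_h
    x0, x1 = cx - w / 2.0, cx + w / 2.0        # may wrap around the 360° seam
    y0 = max(0.0, cy - h / 2.0)
    y1 = min(float(pano_h), cy + h / 2.0)
    return (x0, y0, x1, y1)
```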
  • in an embodiment, the virtual attitude angle mapped from the control stick amount is related to the UAV flight control amount mapped from the control stick amount. Since the stick amount can map not only the virtual attitude angle but also the UAV flight control amount, and the virtual attitude angle mapped from the stick amount is related to the UAV flight control amount, the target image area determined according to the virtual attitude angle is related to the flight process of the UAV; in this way the user can promptly view the target image area of the UAV during flight, simulating the immersive flight experience of FPV.
  • in an embodiment, sub-step S1021, determining the virtual attitude angle mapped from the stick amount according to the stick amount, may include sub-step S10211 and sub-step S10212, as shown in FIG. 9.
  • Sub-step S10211 Determine the drone flight control quantity mapped by the control stick quantity according to the control stick quantity.
  • Sub-step S10212 Determine the virtual attitude angle mapped by the control stick quantity according to the control stick quantity and the UAV flight control quantity.
  • in this embodiment the virtual attitude angle mapped from the control stick amount is related both to the control stick amount itself and to the UAV flight control amount mapped from the control stick amount.
  • in this way, the target image area determined according to the virtual attitude angle is related both to the stick amount itself and to the flight process of the drone; that is, during the flight of the drone the user can further control the target image area through the stick amount, so that the user can promptly view the target image area that the user further wishes to see during the flight.
  • in an embodiment, sub-step S10211, determining the UAV flight control amount mapped from the control stick amount according to the control stick amount, may further include: determining the UAV flight control amount according to the control stick amount and a preset virtual aircraft control model, the preset virtual aircraft control model being provided with a correspondence between the control stick amount and the UAV flight control amount.
  • in this embodiment a preset virtual aircraft control model is configured in advance and set with a correspondence between the control stick amount and the drone flight control amount; according to the received control stick amount and the correspondence set in the preset virtual aircraft control model, the UAV flight control amount can be determined.
  • in this way, the user can experience the flight feel of a preset aircraft control model other than the current drone. For example, in an embodiment the preset virtual aircraft control model includes a preset virtual first-person view (FPV) aircraft control model, so that the user can experience the flight experience of an FPV drone; as another example, the preset virtual aircraft control model includes a preset virtual aerial-photography aircraft control model, so that the user can experience the flight experience of an aerial-photography drone.
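  • One way to picture such a model is as a fixed mapping from normalized stick amounts to flight control amounts. The sketch below is an illustrative assumption only; the gains are invented and merely show how an FPV-style model and an aerial-photography-style model could expose different correspondences:

```python
from dataclasses import dataclass

@dataclass
class VirtualAircraftControlModel:
    """Hypothetical preset model: control stick amount -> flight control amount."""
    max_climb_mps: float      # vertical speed at full first-stick deflection
    max_forward_mps: float    # forward speed at full second-stick deflection
    max_yaw_rate_dps: float   # yaw rate at full third-stick deflection

    def flight_control(self, stick1: float, stick2: float, stick3: float):
        """Map normalized stick amounts in [-1, 1] to body-frame commands."""
        vz = stick1 * self.max_climb_mps       # first flight control amount (up/down)
        vx = stick2 * self.max_forward_mps     # second flight control amount (forward/back)
        yaw_rate = stick3 * self.max_yaw_rate_dps
        return vx, vz, yaw_rate

# Two illustrative presets with different flight feels.
FPV_MODEL = VirtualAircraftControlModel(max_climb_mps=8.0, max_forward_mps=27.0, max_yaw_rate_dps=200.0)
AERIAL_PHOTO_MODEL = VirtualAircraftControlModel(max_climb_mps=4.0, max_forward_mps=12.0, max_yaw_rate_dps=60.0)
```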
  • in an embodiment, the control stick amount includes a first control stick amount and a second control stick amount, and sub-step S10211, determining the UAV flight control amount mapped from the control stick amount according to the control stick amount, may further include:
  • (A) determining, according to the first control stick amount, a first flight control amount for the UAV to fly upward or downward in the body coordinate system;
  • (B) determining, according to the second control stick amount, a second flight control amount for the UAV to fly forward or backward in the body coordinate system.
  • in this embodiment, the first control stick amount can control the drone to fly up and down, and the second control stick amount can control the drone to fly forward and backward; in this way, the drone can be controlled to fly up and down and forward and backward through the first control stick amount and the second control stick amount.
  • referring to FIG. 10 and taking the remote controller in the figure as an example, the left button in the figure is the left stick and the right button is the right stick; the drone flight controls currently corresponding to the four stick inputs of this remote controller can be as follows:
  • (1) when the left stick is pushed up or down it acts as the throttle stick, which is used to control the ascent and descent of the drone: push the throttle stick up and the drone flies upward; pull the throttle stick down and the drone flies downward; when the throttle stick is in the neutral position the altitude of the drone remains unchanged.
  • (2) when the left stick is pushed left or right it acts as the yaw stick, which is used to control the heading of the drone: push the yaw stick to the left and the drone rotates left (i.e., counterclockwise); push the yaw stick to the right and the drone rotates right (i.e., clockwise); when the yaw stick is in the neutral position the rotational angular velocity is zero and the drone does not rotate.
  • (3) when the right stick is pushed up or down it acts as the pitch stick, which is used to control forward and backward flight: push the pitch stick up and the drone flies forward; pull the pitch stick down and the drone flies backward; when the pitch stick is in the neutral position the fore-aft direction of the drone remains level.
  • (4) when the right stick is pushed left or right it acts as the roll stick, which is used to control left and right flight: push the roll stick to the left and the drone flies to the left (i.e., translates left); push the roll stick to the right and the drone flies to the right (translates right); when the roll stick is in the neutral position the left-right direction of the drone remains level.
  • for the remote controller of this embodiment, the stick amount issued by the throttle stick may be defined as the first control stick amount, and the stick amount issued by the pitch stick may be defined as the second control stick amount.
  • both the first flight control amount and the second flight control amount are speed control amounts.
  • in an embodiment, the control stick amount further includes a third control stick amount, and sub-step S10212, determining the virtual attitude angle mapped from the control stick amount according to the control stick amount and the UAV flight control amount, may further include:
  • (A) determining the yaw angle in the virtual attitude angle according to the third control stick amount;
  • (B) determining the pitch angle in the virtual attitude angle according to the first flight control amount and the second flight control amount.
  • the pitch angle in the virtual attitude angle is thus related to the first flight control amount and the second flight control amount, while the yaw angle in the virtual attitude angle is determined by the separate third control stick amount; in this way, a virtual attitude angle that better meets the user's needs can be obtained, and in turn a target image area that better meets the user's needs.
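  • The disclosure does not give an explicit formula relating the pitch angle to the two flight control amounts. One plausible reading, shown here purely as an assumption, is to point the pitch of the virtual attitude along the commanded velocity direction in the vertical plane, with the yaw taken directly from the third control stick amount:

```python
import math

def virtual_attitude(stick3: float, vx: float, vz: float, max_yaw_deg: float = 180.0):
    """Sketch of sub-step S10212 under assumed mappings.

    stick3 is the normalized third control stick amount in [-1, 1];
    vx is the second flight control amount (forward speed) and vz the first
    flight control amount (climb speed), both in the body coordinate system.
    """
    yaw_deg = stick3 * max_yaw_deg                            # yaw from the third stick amount
    pitch_deg = math.degrees(math.atan2(vz, abs(vx)))         # pitch follows the commanded climb direction
    return yaw_deg, pitch_deg
```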
  • in an embodiment, after determining the yaw angle in the virtual attitude angle according to the third control stick amount in (A), the method may further include:
  • (A1) predicting the movement trajectory of the UAV to obtain a predicted trajectory of the UAV;
  • (A2) determining a yaw offset angle according to the predicted trajectory;
  • (A3) adjusting the yaw angle in the virtual attitude angle according to the yaw offset angle.
  • in this embodiment the movement trajectory of the UAV can be predicted by existing methods to obtain the predicted trajectory of the UAV, the yaw offset angle can be derived accordingly, and the yaw angle in the virtual attitude angle can then be adjusted according to the yaw offset angle; in this way the yaw angle in the resulting virtual attitude angle can be kept as consistent as possible with the yaw angle the user expects, improving the user experience.
  • (A2), determining the yaw offset angle according to the predicted trajectory, may further include:
  • (A21) obtaining a preset look-ahead time;
  • (A22) determining a target trajectory point on the predicted trajectory according to the preset look-ahead time;
  • (A23) determining the yaw offset angle according to the target trajectory point.
  • as shown in FIG. 11, in an embodiment, in the body coordinate system the drone can fly forward and backward along the X-axis direction and up and down along the Z-axis direction, and can also rotate about the Z-axis at an angular velocity W; from the velocities Vx, Vy, Vz and W, the future trajectory of the drone can be predicted to obtain the predicted trajectory (the curve indicated by the solid arrow in the figure).
  • let the preset look-ahead time t be T; the target trajectory point of the UAV on the predicted trajectory is then point O, the yaw offset angle can be determined according to the target trajectory point O, and the yaw angle in the virtual attitude angle can be adjusted according to the yaw offset angle.
  • in this way, while the drone is turning, the virtual attitude angle (the attitude angle indicated by the cone in the figure) can be directed toward the future trajectory of the UAV.
  • by analogy with driving, where the driver's eyes tend to look toward the area beyond the turn while turning, the embodiment of the present application predicts the movement trajectory of the UAV to obtain a predicted trajectory, determines the yaw offset angle according to the predicted trajectory, and adjusts the yaw angle in the virtual attitude angle according to the yaw offset angle, which makes the target image area determined according to the virtual attitude angle better match the user's habits.
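  • A minimal sketch of that prediction step, assuming a constant forward speed Vx and a constant yaw rate W over the look-ahead window (this unicycle-style motion model is an assumption, not something taken from the disclosure):

```python
import math

def yaw_offset_from_prediction(vx: float, w_dps: float, lookahead_s: float, steps: int = 50) -> float:
    """Predict the horizontal trajectory and return the yaw offset angle in degrees.

    Integrates forward motion at speed vx while turning at yaw rate w_dps for
    lookahead_s seconds; the target trajectory point O is the end of that
    predicted trajectory, and the yaw offset is the bearing of O as seen from
    the current body frame.
    """
    x = y = heading = 0.0
    dt = lookahead_s / steps
    w = math.radians(w_dps)
    for _ in range(steps):
        x += vx * math.cos(heading) * dt
        y += vx * math.sin(heading) * dt
        heading += w * dt
    return math.degrees(math.atan2(y, x))   # bearing of point O from the nose direction

# Example: flying forward at 10 m/s while yawing at 45°/s with a 1 s look-ahead
# gives an offset of roughly half the turn, so the view leads into the turn.
print(round(yaw_offset_from_prediction(10.0, 45.0, 1.0), 1))
```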
  • in an embodiment, sub-step S1021, determining the virtual attitude angle mapped from the stick amount according to the stick amount, may further include: determining the virtual attitude angle in a virtual camera coordinate system according to the stick amount.
  • for example, referring to FIG. 12, in an embodiment the origin, X-axis, Y-axis and Z-axis of the virtual camera coordinate system are defined in advance, together with a correspondence between preset stick amounts and attitude angles in the virtual camera coordinate system, so that the virtual attitude angle can be determined in the virtual camera coordinate system according to the received control stick amount.
  • the virtual attitude angle can be decoupled from the flight control quantity of the UAV, so that the user can determine the virtual attitude angle of the control stick quantity in the virtual camera coordinate system according to his own wishes, and then determine the target image area.
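  • A sketch of that decoupled mode, assuming the stick amounts are simply integrated as angular rates in the virtual camera coordinate system (the rate gains are illustrative assumptions):

```python
class VirtualCamera:
    """Hypothetical virtual camera whose attitude is driven only by stick amounts,
    independently of the UAV flight control amounts."""

    def __init__(self, yaw_rate_dps: float = 90.0, pitch_rate_dps: float = 60.0):
        self.yaw_deg = 0.0
        self.pitch_deg = 0.0
        self.yaw_rate_dps = yaw_rate_dps
        self.pitch_rate_dps = pitch_rate_dps

    def update(self, stick_yaw: float, stick_pitch: float, dt: float):
        """Integrate normalized stick amounts in [-1, 1] into the virtual attitude angle."""
        self.yaw_deg = (self.yaw_deg + stick_yaw * self.yaw_rate_dps * dt) % 360.0
        self.pitch_deg = max(-90.0, min(90.0, self.pitch_deg + stick_pitch * self.pitch_rate_dps * dt))
        return self.yaw_deg, self.pitch_deg
```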
  • FIG. 13 is a schematic structural diagram of an embodiment of the UAV of the present application. It should be noted that the UAV of this embodiment can perform the steps of the above-described control method of the UAV; for a detailed description of the relevant content, please refer to the relevant content of the control method above, which is not repeated here.
  • the drone 100 is provided with a camera device 3, and the camera device 3 is used to obtain panoramic images, the drone 100 is connected to the control device in communication, and the drone 100 further includes: a memory 1 and a processor 2; The processor 2 is connected to the memory 1 and the camera device 3 through a bus.
  • the processor 2 may be a microcontroller unit, a central processing unit or a digital signal processor, and so on.
  • the memory 1 may be a Flash chip, a read-only memory, a magnetic disk, an optical disk, a U disk, a mobile hard disk, and the like.
  • the memory 1 is used to store a computer program;
  • the processor 2 is used to execute the computer program and, when executing the computer program, implement the following steps:
  • acquiring the control stick amount sent by the control device; determining a target image area in the panoramic image acquired by the camera device according to the control stick amount; and sending the target image area to the control device, so that the control device displays the target image area.
  • wherein the control device includes a remote controller and a head-mounted display device, and when executing the computer program the processor implements the following steps: acquiring the stick amount sent by the remote controller; sending the target image area to the head-mounted display device, so that the head-mounted display device displays the target image area.
  • wherein the control device includes a remote controller and a terminal device, and when executing the computer program the processor implements the following steps: acquiring the stick amount sent by the remote controller; sending the target image area to the terminal device, so that the terminal device displays the target image area.
  • wherein the control device is provided with a control area and a display area, and when executing the computer program the processor implements the following steps: acquiring the control stick amount sent by the control device, the control stick amount being generated based on a user operation in the control area; sending the target image area to the control device, so that the display area of the control device displays the target image area.
  • wherein, when executing the computer program, the processor implements the following steps: determining the virtual attitude angle mapped from the stick amount according to the stick amount; determining the target image area according to the virtual attitude angle.
  • the processor when executing the computer program, implements the following steps: determining the target image area according to a preset field of view angle and the virtual attitude angle.
  • wherein the virtual attitude angle mapped from the control stick amount is related to the UAV flight control amount mapped from the control stick amount.
  • the processor when executing the computer program, implements the following steps: determining the drone flight control quantity mapped by the control rod quantity according to the control rod quantity; according to the control rod quantity and the unmanned aerial vehicle The aircraft flight control quantity is determined, and the virtual attitude angle mapped by the control stick quantity is determined.
  • the processor when executing the computer program, implements the following steps: determining the UAV flight control amount according to the control stick amount and a preset virtual aircraft control model, and the preset aircraft control model setting There is a corresponding relationship between the control stick quantity and the drone flight control quantity.
  • the preset virtual aircraft control model includes a preset virtual first-person view FPV aircraft control model.
  • wherein the control stick amount includes a first control stick amount and a second control stick amount, and when executing the computer program the processor implements the following steps: determining, according to the first control stick amount, a first flight control amount for the drone to fly upward or downward in the body coordinate system; determining, according to the second control stick amount, a second flight control amount for the drone to fly forward or backward in the body coordinate system.
  • wherein the control stick amount further includes a third control stick amount, and when executing the computer program the processor implements the following steps: determining the yaw angle in the virtual attitude angle according to the third control stick amount; determining the pitch angle in the virtual attitude angle according to the first flight control amount and the second flight control amount.
  • wherein the first flight control amount and the second flight control amount are speed control amounts.
  • the processor when executing the computer program, implements the following steps: predicting the movement trajectory of the UAV to obtain the predicted trajectory of the UAV; determining a yaw offset according to the predicted trajectory The yaw angle in the virtual attitude angle is adjusted according to the yaw offset angle.
  • the processor when executing the computer program, implements the following steps: obtaining a preset look-ahead time; determining a target trajectory point on the predicted trajectory according to the preset look-ahead time; determining an offset point according to the target trajectory point Navigation offset angle.
  • the number of the camera devices is one or more.
  • wherein the drone includes a first arm and a second arm, the second arm and the first arm are connected by a rotating shaft, both ends of the rotating shaft are provided with the camera devices, and the camera devices are fisheye camera devices.
  • the present application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the control method of the unmanned aerial vehicle described in any one of the above embodiments.
  • for a detailed description of the relevant content, please refer to the relevant sections above, which are not repeated here.
  • the computer-readable storage medium may be an internal storage unit of the above-mentioned drone, such as a hard disk or a memory.
  • the computer-readable storage medium can also be an external storage device, such as an equipped plug-in hard disk, smart memory card, secure digital card, flash memory card, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Studio Devices (AREA)

Abstract

A control method for an unmanned aerial vehicle (100), an unmanned aerial vehicle (100) and a storage medium. The unmanned aerial vehicle (100) is provided with a camera device (3) for acquiring a panoramic image, and the unmanned aerial vehicle (100) is communicatively connected to a control device. The method comprises: acquiring a control stick amount sent by the control device (S101); determining a target image area in the panoramic image acquired by the camera device (3) according to the control stick amount (S102); and sending the target image area to the control device, so that the control device displays the target image area (S103).

Description

无人机的控制方法、无人机及存储介质 技术领域
本申请涉及无人机技术领域,尤其涉及一种无人机的控制方法、无人机及存储介质。
背景技术
近年来无人机的第一人称视角(FPV,First Person View)飞行越来越流行,其沉浸式的飞行体验吸引不少人的眼球。但是,FPV飞行的操作难度非常高,需要用户操作无人机执行花式飞行动作才能拍摄出惊险刺激的画面。现有技术提供了一种基于全景相机的方案,将全景相机安装在无人机上,在无人机飞行的过程中通过全景相机获取全景视频,在无人机结束飞行后再由用户通过视频后期软件对全景视频进行编辑,裁切出用户想要的视频效果。
然而,这种后期制作方法无法让用户在无人机飞行过程中实时观看到惊险刺激的画面,仍然难以满足用户的需求。
发明内容
基于此,本申请提供一种无人机的控制方法、无人机及存储介质。
第一方面,本申请提供一种无人机的控制方法,无人机设置有摄像装置,所述摄像装置用于获取全景图像,所述无人机与控制装置通信连接,所述方法包括:
获取所述控制装置发送的控制杆量;
根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域;
将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。
第二方面,本申请提供了一种无人机,无人机上设置有摄像装置,所述摄 像装置用于获取全景图像,所述无人机与控制装置通信连接,所述无人机还包括:存储器和处理器;
所述存储器用于存储计算机程序;
所述处理器用于执行所述计算机程序并在执行所述计算机程序时,实现如下步骤:
获取所述控制装置发送的控制杆量;
根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域;
将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。
第三方面,本申请提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如上所述的无人机的控制方法。
本申请实施例提供了一种无人机的控制方法、无人机及存储介质,无人机设置有用于获取全景图像的摄像装置,无人机与控制装置通信连接,获取控制装置发送的控制杆量;根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域;将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。通过这种方式,能够使无人机根据控制装置发送的控制杆量及时在全景图像中确定目标图像区域,并将目标图像区域返回给控制装置,为让用户实时观看到控制杆量对应的画面提供技术支持,当控制装置收到后能够及时显示目标图像区域,用户能够实时观看到控制杆量对应的画面,例如各种惊险刺激的画面,从而能够满足用户的需求,提升用户体验。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本申请。
附图说明
为了更清楚地说明本申请实施例技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以 根据这些附图获得其他的附图。
图1是本申请无人机的控制方法一实施例的流程示意图;
图2是本申请无人机的控制方法一实施例中设置在无人机上方的鱼眼摄像装置的FOV示意图;
图3是图2的实施例中设置在无人机下方的鱼眼摄像装置的FOV示意图;
图4是图2的实施例中上下两个鱼眼摄像装置拼接的FOV示意图;
图5是本申请无人机的控制方法一实施例中无人机的机臂及摄像装置展开的结构示意图;
图6是图5的实施例中无人机的机臂及摄像装置折叠的结构示意图;
图7是本申请无人机的控制方法另一实施例的流程示意图;
图8是本申请无人机的控制方法中确定目标图像区域一实施例的示意图;
图9是本申请无人机的控制方法又一实施例的流程示意图;
图10是本申请无人机的控制方法中遥控器摇杆一实施例的结构示意图;
图11是本申请无人机的控制方法中确定偏航偏移角一实施例的示意图;
图12是本申请无人机的控制方法中虚拟相机坐标系一实施例的示意图;
图13是本申请无人机一实施例的结构示意图。
主要元件及符号说明:
100、无人机;1、存储器;2、处理器;3、摄像装置;10、第一机臂;20、第二机臂;30、转轴。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
附图中所示的流程图仅是示例说明,不是必须包括所有的内容和操作/步骤,也不是必须按所描述的顺序执行。例如,有的操作/步骤还可以分解、组合或部分合并,因此实际执行的顺序有可能根据实际情况改变。
无人机的FPV飞行越来越流行,但是FPV飞行的操作难度非常高。现有技术提供了一种基于全景相机的方案,在无人机飞行的过程中通过安装的全景相机获取全景视频,在无人机结束飞行后再由用户对全景视频进行编辑,裁切出用户想要的视频效果。然而,这种后期制作方法无法让用户在无人机飞行过程中实时观看到惊险刺激的画面,仍然难以满足用户的需求。
本申请实施例提供了一种无人机的控制方法、无人机及存储介质,无人机设置有用于获取全景图像的摄像装置,无人机与控制装置通信连接,获取控制装置发送的控制杆量;根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域;将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。由于无人机与控制装置通信连接,控制装置发送控制杆量,无人机根据获取到的控制杆量在摄像装置获取的全景图像中确定目标图像区域,并将目标图像区域发送给控制装置,以使得控制装置对目标图像区域进行显示,通过这种方式,能够使无人机根据控制装置发送的控制杆量及时在全景图像中确定目标图像区域,并将目标图像区域返回给控制装置,为让用户实时观看到控制杆量对应的画面提供技术支持,当控制装置收到后能够及时显示目标图像区域,用户能够实时观看到控制杆量对应的画面,例如各种惊险刺激的画面,从而能够满足用户的需求,提升用户体验。
下面结合附图,对本申请的一些实施方式作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。
参见图1,图1是本申请无人机的控制方法一实施例的流程示意图。
本实施例中,无人机设置有摄像装置,所述摄像装置用于获取全景图像。所述摄像装置的数量为一个或多个;摄像装置的数量为多个时,全景图像可以是这多个摄像装置拍摄的图像拼接而成。在实际应用过程中,需要多少个摄像装置取决于所选择的摄像装置的视场角(FOV,Field of view)情况以及所需要的拼接质量,每个摄像装置的FOV越小,则所需要的摄像装置的数目越多,来实现360°全覆盖。
例如:摄像装置为两个鱼眼摄像装置,分别设置在无人机的上面和下面,每个鱼眼摄像装置覆盖超过1/2的全景图像的FOV,两个鱼眼摄像装置的FOV之间有部分相互重叠。如图2和图3所示,两个鱼眼摄像装置的FOV用图中 的实线方框表示;如图4所示,两个鱼眼摄像装置的FOV分别用图中的上下两个虚线方框表示,两个鱼眼摄像装置组合在一起的FOV用图中的实线方框表示,将这两个鱼眼摄像装置拍摄到的图像进行拼接可得到图中组合在一起的FOV覆盖的全景图像。
如图5至和图6所示,在一实施例中,无人机100包括第一机臂10和第二机臂20,第二机臂20与第一机臂10通过转轴30连接,转轴30的两端设置有所述摄像装置3,所述摄像装置3为鱼眼摄像装置。由于第二机臂20与第一机臂10通过转轴30连接,摄像装置3设置于转轴的两端。因此,无人机100的第二机臂20与第一机臂10相对转动时,转轴30两端设置的摄像装置3不发生相对位置的变化,通过这种方式,在利用转轴30两端的摄像装置3进行全景拍摄时无需重新标定相对位置。既能够保证全景图像的拼接速度,也能够保证全景图像拼接的准确性。
本实施例的所述方法包括:步骤S101、步骤S102以及步骤S103。
步骤S101:获取所述控制装置发送的控制杆量。
步骤S102:根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域。
步骤S103:将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。
本实施例中,无人机与控制装置通信连接。控制装置可以是指能够向无人机发送能够被响应的控制指令的装置,控制装置包括但不限于:遥控器、用户设备、终端设备,等等,控制装置还可以是两个以上控制装置的组合,例如遥控器和用户设备、遥控器和终端设备,等等。
控制杆量可以是指用于在摄像装置获取的全景图像中确定目标图像区域的控制指令。控制杆量可以是用户通过手推摇杆的方式发出,也可以是用户通过对触摸屏上的摇杆进行触摸发出,或者直接通过输入指令的方式发出,等等。根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域的实现方式很多,例如:预先将全景图像划分为多个图像区域,预先设定预设控制杆量与全景图像的预设图像区域之间的对应关系,根据控制装置发送的控制杆量和对应关系即可确定目标图像区域;又如:可以根据控制杆量映射虚拟姿态 角,根据虚拟姿态角确定所述目标图像区域;等等。
将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示,便于用户及时观看到控制杆量对应的目标图像区域,例如各种惊险刺激的画面,从而能够满足用户的需求,提升用户体验。
本申请实施例无人机设置有用于获取全景图像的摄像装置,无人机与控制装置通信连接,获取控制装置发送的控制杆量;根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域;将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。由于无人机与控制装置通信连接,控制装置发送控制杆量,无人机根据获取到的控制杆量在摄像装置获取的全景图像中确定目标图像区域,并将目标图像区域发送给控制装置,以使得控制装置对目标图像区域进行显示,通过这种方式,能够使无人机根据控制装置发送的控制杆量及时在全景图像中确定目标图像区域,并将目标图像区域返回给控制装置,为让用户实时观看到控制杆量对应的画面(特别是用户在无人机飞行过程中实时观看到控制杆量对应的画面)提供技术支持,当控制装置收到后能够及时显示目标图像区域,用户能够实时观看到控制杆量对应的画面(特别是用户能够在无人机飞行过程中实时观看到控制杆量对应的画面),例如各种惊险刺激的画面,从而能够满足用户的需求,提升用户体验。
下面详细说明本申请实施例的方法在目前比较常见的多种具体应用场景中的应用。
在一实施例中,本申请实施例的方法可以应用在无人机、遥控器和头戴式显示设备组成的应用场景中。即所述控制装置包括遥控器和头戴式显示设备,步骤S101中,所述获取所述控制装置发送的控制杆量,可以包括:获取所述遥控器发送的控制杆量。此时步骤S103,所述将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示,可以包括:将所述目标图像区域发送给所述头戴式显示设备,以使得所述头戴式显示设备对所述目标图像区域进行显示。
头戴式显示设备,是通过一组光学系统(主要是精密光学透镜)放大超微显示屏上的图像,将影像投射于视网膜上,进而呈现于观看者眼中大屏幕图像,形象点说就是拿放大镜看物体呈现出放大的虚拟物体图像。通过头戴式显示设 备,向用户的眼睛发送光学信号,可以实现虚拟现实(VR,Virtual Reality)、增强现实(AR,Augmented Reality)、混合现实(MR,Mixed Reality)等不同效果。对于常规显示设备,用户必须看向设备,而对于该头戴式显示设备,则用户不需要看向设备;另外由于头戴式显示设备通常是帽子或眼镜的形状,因此携带方便并且可以随时使用;由于使用小显示屏,因此非常省电;特别地当形成大的虚拟显示器时,产生显著的节能效果。
上述无人机、遥控器和头戴式显示设备组成的应用场景中,用户通过遥控器发送控制杆量,无人机接收到控制杆量后,在摄像装置获取的全景图像中确定目标图像区域,然后将目标图像区域发送给头戴式显示设备,头戴式显示设备接收到目标图像区域后,对目标图像区域进行显示,用户即可沉浸式的观看到目标图像区域。
在另一实施例中,本申请实施例的方法可以应用在无人机、遥控器和终端设备组成的应用场景中。即所述控制装置包括遥控器和终端设备,步骤S101中,所述获取所述控制装置发送的控制杆量,可以包括:获取所述遥控器发送的控制杆量。此时步骤S103,所述将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示,可以包括:将所述目标图像区域发送给所述终端设备,以使得所述终端设备对所述目标图像区域进行显示。
本实施例中,终端设备包括但不限于:智能手机、地面控制站、个人电脑、掌上电脑,等等。例如,用户的手机可以安装应用程序,用户通过遥控器发送控制杆量,无人机接收到控制杆量后,在摄像装置获取的全景图像中确定目标图像区域,然后将目标图像区域发送给用户的手机,用户的手机接收到目标图像区域后,在手机屏幕上对目标图像区域进行显示,用户即可观看到目标图像区域。可选的,无人机可以把目标图像区域通过私有通信链路发送给遥控器,由遥控器转发至手机,例如遥控器通过连接线将目标图像区域转发至手机。或者,无人机可以把目标图像区域通过标准通信链路,例如WIFI,4G等,将目标图像区域直接发送至手机。
在又一实施例中,本申请实施例的方法可以应用在无人机和具有控制区和显示区的控制装置组成的应用场景中。即所述控制装置设置有控制区和显示 区,步骤S101中,所述获取所述控制装置发送的控制杆量,可以包括:获取所述控制装置发送的控制杆量,所述控制杆量是基于用户在所述控制区的操作生成的。此时步骤S103,所述将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示,可以包括:将所述目标图像区域发送给所述控制装置,以使得所述控制装置的显示区对所述目标图像区域进行显示。
本实施例中,控制装置具有控制区和显示区,控制区能够供用户操作生成并发出控制杆量,显示区供显示用。用户通过控制装置的控制区生成并发出控制杆量,无人机接收到控制杆量后,在摄像装置获取的全景图像中确定目标图像区域,然后将目标图像区域发送给控制装置,控制装置接收到目标图像区域后,在显示区上对目标图像区域进行显示,用户即可观看到目标图像区域。
下面详细说明步骤S102的细节内容。
在一实施例中,步骤S102,所述根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域,可以包括:子步骤S1021和子步骤S1022,如图7所示。
子步骤S1021:根据所述控制杆量确定所述控制杆量映射的虚拟姿态角。
子步骤S1022:根据所述虚拟姿态角确定所述目标图像区域。
虚拟姿态角可以是指假想的、虚拟的姿态角;虚拟姿态角可以包括俯仰角、偏航角、翻滚角中的至少一个;虚拟姿态角用于在全景图像中确定目标图像区域。本实施例并不是根据控制杆量直接确定目标图像区域,而是根据控制杆量映射虚拟姿态角,再根据虚拟姿态角确定目标图像区域,通过这种方式,能够使根据控制杆量在全景图像中确定目标图像区域的方式更加直观、灵活、多样、便利,从而更加能够满足用户的多种需求。
在一实施例中,子步骤S1022,所述根据所述虚拟姿态角确定所述目标图像区域,可以包括:根据预设视场角和所述虚拟姿态角确定所述目标图像区域。
视场角在光学工程中又称视场,视场角的大小决定了光学仪器的视野范围,视场角越大,视野就越大。在本实施例中,预设视场角可以用于在全景图像中确定目标图像区域的范围;虚拟姿态角可以用于确定目标图像区域的中心。如图8所示,图中实线方框代表全景图像,虚拟姿态角确定目标图像区域 的中心A,预设视场角在全景图像中确定目标图像区域的范围,图中用A1、A2、A3以及A4组成的方框表示目标图像区域的范围。
由于根据预设视场角能够确定目标图像区域的范围,根据虚拟姿态角能够确定所述目标图像区域的中心,通过这种方式,能够快速、准确确定目标图像区域。
在一实施例中,所述控制杆量映射的虚拟姿态角与所述控制杆量映射的无人机飞行控制量相关。由于控制杆量除了可以映射虚拟姿态角,还可以映射无人机飞行控制量,控制杆量映射的虚拟姿态角与无人机飞行控制量相关,通过这种方式,能够使根据虚拟姿态角确定的目标图像区域与无人机的飞行过程相关,从而能够使用户及时观看到无人机在飞行过程中的目标图像区域,模拟FPV的沉浸式飞行体验。
在一实施例中,子步骤S1021,所述根据所述控制杆量确定所述控制杆量映射的虚拟姿态角,可以包括:子步骤S10211和子步骤S10212,如图9所示。
子步骤S10211:根据所述控制杆量确定所述控制杆量映射的无人机飞行控制量。
子步骤S10212:根据所述控制杆量和所述无人机飞行控制量,确定所述控制杆量映射的虚拟姿态角。
在本实施例中控制杆量映射的虚拟姿态角与控制杆量本身和控制杆量映射的无人机飞行控制量都相关,通过这种方式,能够使根据虚拟姿态角确定的目标图像区域与控制杆量本身和无人机的飞行过程相关,即用户在无人机的飞行过程还能通过控制杆量进一步控制目标图像区域,从而能够使用户及时观看到无人机在飞行过程中用户进一步希望看到的目标图像区域。
在一实施例中,子步骤S10211,所述根据所述控制杆量确定所述控制杆量映射的无人机飞行控制量,还可以包括:根据所述控制杆量和预设虚拟飞机控制模型,确定所述无人机飞行控制量,所述预设飞机控制模型设置有所述控制杆量与所述无人机飞行控制量之间的对应关系。
本实施例中,预先设置有预设飞机控制模型,该预设飞机控制模型设置有控制杆量与无人机飞行控制量之间的对应关系,根据接收到的控制杆量和预设虚拟飞机控制模型设置的对应关系,即可确定无人机飞行控制量。通过这种方 式,能够使用户体验到除当前无人机外的其他预设设置的预设飞机控制模型的飞行体验,例如,在一实施例中,所述预设虚拟飞机控制模型包括预设虚拟第一人称视角FPV飞机控制模型,通过这种方式,用户能够体验FPV无人机的飞行体验。又如:所述预设虚拟飞机控制模型包括预设虚拟航拍飞机控制模型,通过这种方式,用户能够体验航拍无人机的飞行体验。
在一实施例中,所述控制杆量包括第一控制杆量和第二控制杆量,子步骤S10211,所述根据所述控制杆量确定所述控制杆量映射的无人机飞行控制量,还可以包括:
(A)根据所述第一控制杆量确定所述无人机在机体坐标系中向上或向下飞行的第一飞行控制量。
(B)根据所述第二控制杆量确定所述无人机在机体坐标系中向前或向后飞行的第二飞行控制量。
本实施例中,第一控制杆量能够控制无人机上下飞行,第二控制杆量能够控制无人机前后飞行。通过上述方式,能够通过第一控制杆量和第二控制杆量控制无人机上下飞行和前后飞行。
参见图10,以图中的遥控器为例,图中左边按键为左摇杆,右边按键为右摇杆,目前该遥控器中四个杆量对应的无人机飞行控制可以分别是:
(1)左摇杆上下打杆时为油门(Throttle)摇杆,油门摇杆用于控制无人机升降;往上推油门摇杆,无人机往上飞;往下拉油门摇杆,无人机往下飞;油门摇杆中位时无人机的高度保持不变。
(2)左摇杆左右打杆时为偏航(Yaw)摇杆,偏航摇杆用于控制无人机航向;往左打偏航摇杆,无人机左旋转(即逆时针旋转);往右打偏航摇杆,无人机右旋转(即顺时针旋转);偏航摇杆中位时旋转角速度为零,无人机不旋转。
(3)右摇杆上下打杆时为俯仰(Pitch)摇杆,俯仰摇杆用于控制无人机前后飞行;往上推俯仰摇杆,无人机向前飞行;往下拉俯仰摇杆,无人机向后飞行;俯仰摇杆中位时无人机的前后方向保持水平。
(4)右摇杆左右打杆时为翻滚(Roll)摇杆,翻滚摇杆用于控制无人机左右飞行;往左打翻滚摇杆,无人机向左飞行(即左平移);往右打俯仰摇杆, 无人机向右飞行(右平移);俯仰摇杆中位时无人机的左右方向保持水平。
对于本实施例的遥控器,可以定义油门(Throttle)摇杆发出的杆量为第一控制杆量,定义俯仰(Pitch)摇杆发出的杆量为第二控制杆量。
其中,所述第一飞行控制量和所述第二飞行控制量为速度控制量。
在一实施例中,所述控制杆量还包括第三控制杆量,子步骤S10212,所述根据所述控制杆量和所述无人机飞行控制量,确定所述控制杆量映射的虚拟姿态角,还可以包括:
(A)根据所述第三控制杆量确定所述虚拟姿态角中的偏航角。
(B)根据所述第一飞行控制量和所述第二飞行控制量确定所述虚拟姿态角中的俯仰角。
虚拟姿态角中的俯仰角与所述第一飞行控制量和所述第二飞行控制量相关,虚拟姿态角中的偏航角通过另外的第三控制杆量来确定,通过这种方式,能够得到更能满足用户需求的虚拟姿态角,进而得到更能满足用户需求的目标图像区域。
在一实施例中,上述(A)所述根据所述第三控制杆量确定所述虚拟姿态角中的偏航角之后,还可以包括:
(A1)对所述无人机的移动轨迹进行预测以得到所述无人机的预测轨迹。
(A2)根据所述预测轨迹确定偏航偏移角。
(A3)根据所述偏航偏移角调整所述虚拟姿态角中的偏航角。
本实施例可以通过现有的方法对无人机的移动轨迹进行预测得到无人机的预测轨迹,据此得到偏航偏移角,根据偏航偏移角调整虚拟姿态角中的偏航角,通过这种方式能够使得到的虚拟姿态角中的偏航角与用户希望的偏航角尽可能保持一致,以提升用户体验。
其中,(A2)所述根据所述预测轨迹确定偏航偏移角,还可以包括:
(A21)获取预设前瞻时间。
(A22)根据所述预设前瞻时间在所述预测轨迹上确定目标轨迹点。
(A23)根据所述目标轨迹点确定偏航偏移角。
如图11所示,在一实施例中在机体坐标系下,无人机沿X轴方向可以向前飞行和向后飞行,无人机沿Z轴方向可以向上飞行和向下飞行,无人机还可 以绕Z轴以角速度W旋转,根据速度Vx、Vy、Vz、W,可以预测出无人机未来的轨迹,得到预测轨迹(图中实线箭头所指的曲线为预测轨迹),预设前瞻时间t为T,无人机在所述预测轨迹上的目标轨迹点为O点,可以根据目标轨迹点O确定偏航偏移角,并根据偏航偏移角调整虚拟姿态角中的偏航角。
通过这种方式,在无人机转弯的过程中可以使得虚拟姿态角(图中椎体所指示的姿态角)朝向无人机未来的轨迹。类比开车场景中用户在转弯时往往眼睛看向转弯后的区域,本申请实施例通过对无人机的移动轨迹进行预测以得到预测轨迹,根据预测轨迹确定偏航偏移角,根据偏航偏移角调整虚拟姿态角中的偏航角,能够使得根据虚拟姿态角确定的目标图像区域更符合用户习惯。
在一实施例中,子步骤S1021,所述根据所述控制杆量确定所述控制杆量映射的虚拟姿态角,还可以包括:根据所述控制杆量在虚拟相机坐标系中确定所述虚拟姿态角。
例如,参见图12,在一实施例中,预先定义好虚拟相机坐标系的原点、X轴、Y轴以及Z轴,定义好预设控制杆量与虚拟相机坐标系中姿态角之间的对应关系,根据接收到的控制杆量即可在虚拟相机坐标系中确定所述虚拟姿态角。通过这种方式,能够使虚拟姿态角与无人机的飞行控制量解耦,使用户能够根据自己的意愿确定控制杆量在虚拟相机坐标系中的虚拟姿态角,进而确定目标图像区域。
参见图13,图13是本申请无人机一实施例的结构示意图,需要说明的是,本实施例的无人机能够执行上述无人机的控制方法中的步骤,相关内容的详细说明,请参见上述无人机的控制方法的相关内容,在此不再赘叙。
无人机100上设置有摄像装置3,所述摄像装置3用于获取全景图像,所述无人机100与控制装置通信连接,所述无人机100还包括:存储器1和处理器2;处理器2与存储器1、摄像装置3通过总线连接。
其中,处理器2可以是微控制单元、中央处理单元或数字信号处理器,等等。
其中,存储器1可以是Flash芯片、只读存储器、磁盘、光盘、U盘或者移动硬盘等等。
所述存储器1用于存储计算机程序;所述处理器2用于执行所述计算机程 序并在执行所述计算机程序时,实现如下步骤:
获取所述控制装置发送的控制杆量;根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域;将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。
其中,所述控制装置包括遥控器和头戴式显示设备,所述处理器在执行所述计算机程序时,实现如下步骤:获取所述遥控器发送的控制杆量;将所述目标图像区域发送给所述头戴式显示设备,以使得所述头戴式显示设备对所述目标图像区域进行显示。
其中,所述控制装置包括遥控器和终端设备,所述处理器在执行所述计算机程序时,实现如下步骤:获取所述遥控器发送的控制杆量;将所述目标图像区域发送给所述终端设备,以使得所述终端设备对所述目标图像区域进行显示。
其中,所述控制装置设置有控制区和显示区,所述处理器在执行所述计算机程序时,实现如下步骤:获取所述控制装置发送的控制杆量,所述控制杆量是基于用户在所述控制区的操作生成的;将所述目标图像区域发送给所述控制装置,以使得所述控制装置的显示区对所述目标图像区域进行显示。
其中,所述处理器在执行所述计算机程序时,实现如下步骤:根据所述控制杆量确定所述控制杆量映射的虚拟姿态角;根据所述虚拟姿态角确定所述目标图像区域。
其中,所述处理器在执行所述计算机程序时,实现如下步骤:根据预设视场角和所述虚拟姿态角确定所述目标图像区域。
其中,所述控制杆量映射的虚拟姿态角与所述控制杆量映射的无人机飞行控制量相关。
其中,所述处理器在执行所述计算机程序时,实现如下步骤:根据所述控制杆量确定所述控制杆量映射的无人机飞行控制量;根据所述控制杆量和所述无人机飞行控制量,确定所述控制杆量映射的虚拟姿态角。
其中,所述处理器在执行所述计算机程序时,实现如下步骤:根据所述控制杆量和预设虚拟飞机控制模型,确定所述无人机飞行控制量,所述预设飞机控制模型设置有所述控制杆量与所述无人机飞行控制量之间的对应关系。
其中,所述预设虚拟飞机控制模型包括预设虚拟第一人称视角FPV飞机控制模型。
其中,所述控制杆量包括第一控制杆量和第二控制杆量,所述处理器在执行所述计算机程序时,实现如下步骤:根据所述第一控制杆量确定所述无人机在机体坐标系中向上或向下飞行的第一飞行控制量;根据所述第二控制杆量确定所述无人机在机体坐标系中向前或向后飞行的第二飞行控制量。
其中,所述控制杆量还包括第三控制杆量,所述处理器在执行所述计算机程序时,实现如下步骤:根据所述第三控制杆量确定所述虚拟姿态角中的偏航角;根据所述第一飞行控制量和所述第二飞行控制量确定所述虚拟姿态角中的俯仰角。
其中,所述第一飞行控制量和所述第二飞行控制量为速度控制量。
其中,所述处理器在执行所述计算机程序时,实现如下步骤:对所述无人机的移动轨迹进行预测以得到所述无人机的预测轨迹;根据所述预测轨迹确定偏航偏移角;根据所述偏航偏移角调整所述虚拟姿态角中的偏航角。
其中,所述处理器在执行所述计算机程序时,实现如下步骤:获取预设前瞻时间;根据所述预设前瞻时间在所述预测轨迹上确定目标轨迹点;根据所述目标轨迹点确定偏航偏移角。
其中,所述处理器在执行所述计算机程序时,实现如下步骤:根据所述控制杆量在虚拟相机坐标系中确定所述虚拟姿态角。
其中,所述摄像装置的数量为一个或多个。
其中,所述无人机包括第一机臂和第二机臂,所述第二机臂与所述第一机臂通过转轴连接,所述转轴的两端设置有所述摄像装置,所述摄像装置为鱼眼摄像装置。
本申请还提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如上任一项所述的无人机的控制方法。相关内容的详细说明请参见上述相关内容部分,在此不再赘叙。
其中,该计算机可读存储介质可以是上述无人机的内部存储单元,例如硬盘或内存。该计算机可读存储介质也可以是外部存储设备,例如配备的插接式 硬盘、智能存储卡、安全数字卡、闪存卡,等等。
应当理解,在本申请说明书中所使用的术语仅仅是出于描述特定实施例的目的而并不意在限制本申请。
还应当理解,在本申请说明书和所附权利要求书中使用的术语“和/或”是指相关联列出的项中的一个或多个的任何组合以及所有可能组合,并且包括这些组合。
以上所述,仅为本申请的具体实施例,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到各种等效的修改或替换,这些修改或替换都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (37)

  1. 一种无人机的控制方法,其特征在于,无人机设置有摄像装置,所述摄像装置用于获取全景图像,所述无人机与控制装置通信连接,所述方法包括:
    获取所述控制装置发送的控制杆量;
    根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域;
    将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。
  2. 根据权利要求1所述的方法,其特征在于,所述控制装置包括遥控器和头戴式显示设备,所述获取所述控制装置发送的控制杆量,包括:
    获取所述遥控器发送的控制杆量;
    所述将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示,包括:
    将所述目标图像区域发送给所述头戴式显示设备,以使得所述头戴式显示设备对所述目标图像区域进行显示。
  3. 根据权利要求1所述的方法,其特征在于,所述控制装置包括遥控器和终端设备,所述获取所述控制装置发送的控制杆量,包括:
    获取所述遥控器发送的控制杆量;
    所述将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示,包括:
    将所述目标图像区域发送给所述终端设备,以使得所述终端设备对所述目标图像区域进行显示。
  4. 根据权利要求1所述的方法,其特征在于,所述控制装置设置有控制区和显示区,所述获取所述控制装置发送的控制杆量,包括:
    获取所述控制装置发送的控制杆量,所述控制杆量是基于用户在所述控制区的操作生成的;
    所述将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示,包括:
    将所述目标图像区域发送给所述控制装置,以使得所述控制装置的显示区 对所述目标图像区域进行显示。
  5. 根据权利要求1所述的方法,其特征在于,所述根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域,包括:
    根据所述控制杆量确定所述控制杆量映射的虚拟姿态角;
    根据所述虚拟姿态角确定所述目标图像区域。
  6. 根据权利要求5所述的方法,其特征在于,所述根据所述虚拟姿态角确定所述目标图像区域,包括:
    根据预设视场角和所述虚拟姿态角确定所述目标图像区域。
  7. 根据权利要求5所述的方法,其特征在于,所述控制杆量映射的虚拟姿态角与所述控制杆量映射的无人机飞行控制量相关。
  8. 根据权利要求5所述的方法,其特征在于,所述根据所述控制杆量确定所述控制杆量映射的虚拟姿态角,包括:
    根据所述控制杆量确定所述控制杆量映射的无人机飞行控制量;
    根据所述控制杆量和所述无人机飞行控制量,确定所述控制杆量映射的虚拟姿态角。
  9. 根据权利要求8所述的方法,其特征在于,所述根据所述控制杆量确定所述控制杆量映射的无人机飞行控制量,包括:
    根据所述控制杆量和预设虚拟飞机控制模型,确定所述无人机飞行控制量,所述预设飞机控制模型设置有所述控制杆量与所述无人机飞行控制量之间的对应关系。
  10. 根据权利要求9所述的方法,其特征在于,所述预设虚拟飞机控制模型包括预设虚拟第一人称视角FPV飞机控制模型。
  11. 根据权利要求8所述的方法,其特征在于,所述控制杆量包括第一控制杆量和第二控制杆量,所述根据所述控制杆量确定所述控制杆量映射的无人机飞行控制量,包括:
    根据所述第一控制杆量确定所述无人机在机体坐标系中向上或向下飞行的第一飞行控制量;
    根据所述第二控制杆量确定所述无人机在机体坐标系中向前或向后飞行的第二飞行控制量。
  12. 根据权利要求11所述的方法,其特征在于,所述控制杆量还包括第三控制杆量,所述根据所述控制杆量和所述无人机飞行控制量,确定所述控制杆量映射的虚拟姿态角,包括:
    根据所述第三控制杆量确定所述虚拟姿态角中的偏航角;
    根据所述第一飞行控制量和所述第二飞行控制量确定所述虚拟姿态角中的俯仰角。
  13. 根据权利要求12所述的方法,其特征在于,所述第一飞行控制量和所述第二飞行控制量为速度控制量。
  14. 根据权利要求12所述的方法,其特征在于,所述根据所述第三控制杆量确定所述虚拟姿态角中的偏航角之后,还包括:
    对所述无人机的移动轨迹进行预测以得到所述无人机的预测轨迹;
    根据所述预测轨迹确定偏航偏移角;
    根据所述偏航偏移角调整所述虚拟姿态角中的偏航角。
  15. 根据权利要求14所述的方法,其特征在于,所述根据所述预测轨迹确定偏航偏移角,包括:
    获取预设前瞻时间;
    根据所述预设前瞻时间在所述预测轨迹上确定目标轨迹点;
    根据所述目标轨迹点确定偏航偏移角。
  16. 根据权利要求5所述的方法,其特征在于,所述根据所述控制杆量确定所述控制杆量映射的虚拟姿态角,包括:
    根据所述控制杆量在虚拟相机坐标系中确定所述虚拟姿态角。
  17. 根据权利要求1所述的方法,其特征在于,所述摄像装置的数量为一个或多个。
  18. 根据权利要求17所述的方法,其特征在于,所述无人机包括第一机臂和第二机臂,所述第二机臂与所述第一机臂通过转轴连接,所述转轴的两端设置有所述摄像装置,所述摄像装置为鱼眼摄像装置。
  19. 一种无人机,其特征在于,无人机上设置有摄像装置,所述摄像装置用于获取全景图像,所述无人机与控制装置通信连接,所述无人机还包括:存储器和处理器;
    所述存储器用于存储计算机程序;
    所述处理器用于执行所述计算机程序并在执行所述计算机程序时,实现如下步骤:
    获取所述控制装置发送的控制杆量;
    根据所述控制杆量在所述摄像装置获取的全景图像中确定目标图像区域;
    将所述目标图像区域发送给所述控制装置,以使得所述控制装置对所述目标图像区域进行显示。
  20. 根据权利要求19所述的无人机,其特征在于,所述控制装置包括遥控器和头戴式显示设备,所述处理器在执行所述计算机程序时,实现如下步骤:
    获取所述遥控器发送的控制杆量;
    将所述目标图像区域发送给所述头戴式显示设备,以使得所述头戴式显示设备对所述目标图像区域进行显示。
  21. 根据权利要求19所述的无人机,其特征在于,所述控制装置包括遥控器和终端设备,所述处理器在执行所述计算机程序时,实现如下步骤:
    获取所述遥控器发送的控制杆量;
    将所述目标图像区域发送给所述终端设备,以使得所述终端设备对所述目标图像区域进行显示。
  22. 根据权利要求19所述的无人机,其特征在于,所述控制装置设置有控制区和显示区,所述处理器在执行所述计算机程序时,实现如下步骤:
    获取所述控制装置发送的控制杆量,所述控制杆量是基于用户在所述控制区的操作生成的;
    将所述目标图像区域发送给所述控制装置,以使得所述控制装置的显示区对所述目标图像区域进行显示。
  23. 根据权利要求19所述的无人机,其特征在于,所述处理器在执行所述计算机程序时,实现如下步骤:
    根据所述控制杆量确定所述控制杆量映射的虚拟姿态角;
    根据所述虚拟姿态角确定所述目标图像区域。
  24. 根据权利要求23所述的无人机,其特征在于,所述处理器在执行所述计算机程序时,实现如下步骤:
    根据预设视场角和所述虚拟姿态角确定所述目标图像区域。
  25. 根据权利要求23所述的无人机,其特征在于,所述控制杆量映射的虚拟姿态角与所述控制杆量映射的无人机飞行控制量相关。
  26. 根据权利要求23所述的无人机,其特征在于,所述处理器在执行所述计算机程序时,实现如下步骤:
    根据所述控制杆量确定所述控制杆量映射的无人机飞行控制量;
    根据所述控制杆量和所述无人机飞行控制量,确定所述控制杆量映射的虚拟姿态角。
  27. 根据权利要求26所述的无人机,其特征在于,所述处理器在执行所述计算机程序时,实现如下步骤:
    根据所述控制杆量和预设虚拟飞机控制模型,确定所述无人机飞行控制量,所述预设飞机控制模型设置有所述控制杆量与所述无人机飞行控制量之间的对应关系。
  28. 根据权利要求27所述的无人机,其特征在于,所述预设虚拟飞机控制模型包括预设虚拟第一人称视角FPV飞机控制模型。
  29. 根据权利要求26所述的无人机,其特征在于,所述控制杆量包括第一控制杆量和第二控制杆量,所述处理器在执行所述计算机程序时,实现如下步骤:
    根据所述第一控制杆量确定所述无人机在机体坐标系中向上或向下飞行的第一飞行控制量;
    根据所述第二控制杆量确定所述无人机在机体坐标系中向前或向后飞行的第二飞行控制量。
  30. 根据权利要求29所述的无人机,其特征在于,所述控制杆量还包括第三控制杆量,所述处理器在执行所述计算机程序时,实现如下步骤:
    根据所述第三控制杆量确定所述虚拟姿态角中的偏航角;
    根据所述第一飞行控制量和所述第二飞行控制量确定所述虚拟姿态角中的俯仰角。
  31. 根据权利要求30所述的无人机,其特征在于,所述第一飞行控制量和所述第二飞行控制量为速度控制量。
  32. 根据权利要求30所述的无人机,其特征在于,所述处理器在执行所述计算机程序时,实现如下步骤:
    对所述无人机的移动轨迹进行预测以得到所述无人机的预测轨迹;
    根据所述预测轨迹确定偏航偏移角;
    根据所述偏航偏移角调整所述虚拟姿态角中的偏航角。
  33. 根据权利要求32所述的无人机,其特征在于,所述处理器在执行所述计算机程序时,实现如下步骤:
    获取预设前瞻时间;
    根据所述预设前瞻时间在所述预测轨迹上确定目标轨迹点;
    根据所述目标轨迹点确定偏航偏移角。
  34. 根据权利要求23所述的无人机,其特征在于,所述处理器在执行所述计算机程序时,实现如下步骤:
    根据所述控制杆量在虚拟相机坐标系中确定所述虚拟姿态角。
  35. 根据权利要求19所述的无人机,其特征在于,所述摄像装置的数量为一个或多个。
  36. 根据权利要求35所述的无人机,其特征在于,所述无人机包括第一机臂和第二机臂,所述第二机臂与所述第一机臂通过转轴连接,所述转轴的两端设置有所述摄像装置,所述摄像装置为鱼眼摄像装置。
  37. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如权利要求1-18任一项所述的无人机的控制方法。
PCT/CN2020/141085 2020-12-29 2020-12-29 无人机的控制方法、无人机及存储介质 WO2022141122A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2020/141085 WO2022141122A1 (zh) 2020-12-29 2020-12-29 无人机的控制方法、无人机及存储介质
CN202080079886.7A CN114761898A (zh) 2020-12-29 2020-12-29 无人机的控制方法、无人机及存储介质
US18/343,369 US20230359198A1 (en) 2020-12-29 2023-06-28 Unmanned aerial vehicle, control method thereof, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/141085 WO2022141122A1 (zh) 2020-12-29 2020-12-29 无人机的控制方法、无人机及存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/343,369 Continuation US20230359198A1 (en) 2020-12-29 2023-06-28 Unmanned aerial vehicle, control method thereof, and storage medium

Publications (1)

Publication Number Publication Date
WO2022141122A1 true WO2022141122A1 (zh) 2022-07-07

Family

ID=82259901

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/141085 WO2022141122A1 (zh) 2020-12-29 2020-12-29 无人机的控制方法、无人机及存储介质

Country Status (3)

Country Link
US (1) US20230359198A1 (zh)
CN (1) CN114761898A (zh)
WO (1) WO2022141122A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115480509A (zh) * 2022-09-14 2022-12-16 哈尔滨工业大学(深圳) 一种基于蜂窝网络的无人机控制系统及控制方法

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105554480A (zh) * 2016-03-01 2016-05-04 深圳市大疆创新科技有限公司 无人机拍摄图像的控制方法、装置、用户设备及无人机
CN106339691A (zh) * 2016-09-07 2017-01-18 四川天辰智创科技有限公司 对物体进行标注的方法及装置
CN106371460A (zh) * 2016-09-07 2017-02-01 四川天辰智创科技有限公司 寻找目标的方法及装置
CN106485736A (zh) * 2016-10-27 2017-03-08 深圳市道通智能航空技术有限公司 一种无人机全景视觉跟踪方法、无人机以及控制终端
CN107000839A (zh) * 2016-12-01 2017-08-01 深圳市大疆创新科技有限公司 无人机的控制方法、装置、设备和无人机的控制系统
US20180113462A1 (en) * 2016-10-22 2018-04-26 Gopro, Inc. Position-based soft stop for a 3-axis gimbal
CN109074087A (zh) * 2017-12-25 2018-12-21 深圳市大疆创新科技有限公司 偏航姿态控制方法、无人机、计算机可读存储介质
CN110262565A (zh) * 2019-05-28 2019-09-20 深圳市吉影科技有限公司 应用于水下六推无人机的目标跟踪运动控制方法及装置
CN110771137A (zh) * 2018-05-28 2020-02-07 深圳市大疆创新科技有限公司 延时拍摄控制方法和设备


Also Published As

Publication number Publication date
CN114761898A (zh) 2022-07-15
US20230359198A1 (en) 2023-11-09

Similar Documents

Publication Publication Date Title
US20220222905A1 (en) Display control apparatus, display control method, and program
WO2018214078A1 (zh) 拍摄控制方法及装置
US10567497B2 (en) Reticle control and network based operation of an unmanned aerial vehicle
WO2018120132A1 (zh) 控制方法、装置、设备及无人飞行器
US11189055B2 (en) Information processing apparatus and method and program
JP7042644B2 (ja) 情報処理装置、画像生成方法およびコンピュータプログラム
WO2019242553A1 (zh) 控制拍摄装置的拍摄角度的方法、控制装置及可穿戴设备
US10297085B2 (en) Augmented reality creations with interactive behavior and modality assignments
JP2001195601A (ja) 複合現実感提示装置及び複合現実感提示方法並びに記憶媒体
JP2020521992A (ja) モジュラー型mr装置のための撮像方法
JP2017163265A (ja) 操縦支援システム、情報処理装置およびプログラム
WO2020000402A1 (zh) 可移动平台的操控方法、装置及可移动平台
CN111650967A (zh) 一种用于影视拍摄的无人机及云台操控系统
WO2019019398A1 (zh) 遥控器和无人飞行器系统
US20230359198A1 (en) Unmanned aerial vehicle, control method thereof, and storage medium
US10803652B2 (en) Image generating apparatus, image generating method, and program for displaying fixation point objects in a virtual space
Xia et al. A 6-DOF telexistence drone controlled by a head mounted display
JP7435599B2 (ja) 情報処理装置、情報処理方法、及びプログラム
WO2023201574A1 (zh) 无人机的控制方法、图像显示方法、无人机及控制终端
JP2002271694A (ja) 画像処理方法、画像処理装置、スタジオ装置、記憶媒体及びプログラム
WO2022253018A1 (zh) 无人机视角的视频显示方法及显示系统
WO2022056683A1 (zh) 视场确定方法、视场确定装置、视场确定系统和介质
WO2022061934A1 (zh) 图像处理方法、装置、系统、平台及计算机可读存储介质
WO2024092586A1 (zh) 无人机的控制方法、装置及存储介质
WO2021134375A1 (zh) 视频处理方法、装置、控制终端、系统及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20967470

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20967470

Country of ref document: EP

Kind code of ref document: A1