WO2019183789A1 - Control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle

Control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle

Info

Publication number
WO2019183789A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
target pattern
drone
axis component
axis
Prior art date
Application number
PCT/CN2018/080605
Other languages
English (en)
French (fr)
Inventor
周游
陆正茂
唐克坦
Original Assignee
深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2018/080605
Priority to CN201880002830.4A
Publication of WO2019183789A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D 1/0094 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2201/00 UAVs characterised by their flight controls

Definitions

  • The invention relates to the technical field of drones, and in particular to a control method and apparatus for a drone, and a drone.
  • In the prior art, a remote-control joystick can be used to achieve precise control: the operator finely adjusts the heading of the drone or the angle of the gimbal to make the drone follow a shooting target.
  • A smartphone can also be used for precise control: the operator operates a virtual joystick on the smartphone's touch screen, or maps the attitude of the aircraft to gestures performed with the smartphone.
  • However, the above drone control methods require the operator to have rich experience and skill; the control of the drone becomes complicated and the user interaction experience is poor.
  • The invention provides a control method and apparatus for a drone, and a drone, which reduce the control difficulty of the drone and the learning cost of the operator, and improve the user interaction experience.
  • an embodiment of the present invention provides a method for controlling a drone, including:
  • acquiring an image including a target pattern, and obtaining pose information of the target pattern in a three-dimensional coordinate system according to the image; and
  • controlling the drone to track the target pattern according to the pose information.
  • an embodiment of the present invention provides a control device for a drone, including:
  • the acquiring module is configured to acquire an image including the target pattern, and obtain the pose information of the target pattern in the three-dimensional coordinate system according to the image.
  • the control module is configured to control the drone tracking target pattern according to the pose information.
  • an embodiment of the present invention provides a control device for a drone, including: a memory and a processor;
  • the memory is configured to store program code
  • the processor invokes the program code, and when the program code is executed, the processor is configured to perform the following operations:
  • acquiring an image including a target pattern, and obtaining pose information of the target pattern in a three-dimensional coordinate system according to the image; and
  • controlling the drone to track the target pattern according to the pose information.
  • an embodiment of the present invention provides a drone, including a control device for a drone provided by an embodiment of the present invention.
  • The invention provides a control method and apparatus for a drone, and a drone. By acquiring an image containing a target pattern, the pose information of the target pattern in a three-dimensional coordinate system can be obtained from the image, and the drone is then controlled to track the target pattern according to the pose information. Using a specific pattern, the follow effect of the drone can be realized easily, and the operator avoids controlling the drone through complicated control operations with a professional control device such as a remote-control joystick, thereby reducing the control difficulty of the drone and the learning cost of the operator, and improving the user interaction experience.
  • FIG. 1 is a schematic architectural diagram of an unmanned flight system in accordance with an embodiment of the present invention
  • FIG. 2 is a flowchart of a method for controlling a drone according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a target pattern and an image according to an embodiment of the present invention.
  • FIG. 4 is another schematic diagram of a target pattern and an image according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of controlling a height adjustment of a drone according to an embodiment of the present invention.
  • FIG. 6 is another schematic diagram of controlling the height adjustment of a drone according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of controlling a drone by two target patterns according to an embodiment of the present invention.
  • FIG. 8 is another schematic diagram of controlling a drone by two target patterns according to an embodiment of the present invention.
  • FIG. 9 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 10 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 11 is a schematic structural diagram of a control device for a drone according to an embodiment of the present invention.
  • FIG. 12 is a schematic structural diagram of a control device for a drone according to another embodiment of the present invention.
  • Embodiments of the present invention provide a method, an apparatus, and a drone for controlling a drone. It should be noted that the control method of the UAV provided by the embodiments of the present invention is applicable not only to the UAV but also to other devices with a camera, for example an unmanned vehicle. The following description takes an unmanned aerial vehicle as an example.
  • FIG. 1 is a schematic architectural diagram of an unmanned flight system in accordance with an embodiment of the present invention. This embodiment is described by taking a rotorcraft unmanned aerial vehicle as an example.
  • the unmanned aerial vehicle system 100 can include an unmanned aerial vehicle 110.
  • Unmanned aerial vehicle 110 may include power system 150, flight control system 160, and a rack.
  • the unmanned flight system 100 may further include a pan/tilt head 120.
  • the unmanned flight system 100 may also include a display device 130.
  • the UAV 110 can be in wireless communication with the display device 130.
  • the rack can include a fuselage and a tripod (also known as a landing gear).
  • the fuselage may include a center frame and one or more arms coupled to the center frame, the one or more arms extending radially from the center frame.
  • the stand is coupled to the fuselage for supporting when the UAV 110 is landing.
  • The power system 150 may include one or more electronic speed controllers (commonly referred to as ESCs) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, wherein the motor 152 is coupled between the electronic speed controller 151 and the propeller 153, and the motor 152 and the propeller 153 are disposed on the arm of the unmanned aerial vehicle 110. The electronic speed controller 151 is configured to receive the driving signal generated by the flight control system 160 and to provide a driving current to the motor 152 according to the driving signal, thereby controlling the rotational speed of the motor 152. The motor 152 drives the propeller to rotate, powering the flight of the unmanned aerial vehicle 110 and enabling the unmanned aerial vehicle 110 to achieve motion in one or more degrees of freedom.
  • the UAV 110 can be rotated about one or more axes of rotation.
  • the above-described rotation axes may include a roll axis, a yaw axis, and a pitch axis.
  • the motor 152 can be a DC motor or an AC motor.
  • the motor 152 may be a brushless motor or a brushed motor.
  • Flight control system 160 may include flight controller 161 and sensing system 162.
  • the sensing system 162 is used to measure the attitude information of the unmanned aerial vehicle, that is, the position information and state information of the UAV 110 in space, for example, three-dimensional position, three-dimensional angle, three-dimensional speed, three-dimensional acceleration, and three-dimensional angular velocity.
  • Sensing system 162 can include, for example, at least one of a gyroscope, an ultrasonic sensor, an electronic compass, an Inertial Measurement Unit (IMU), a vision sensor, a global navigation satellite system, and a barometer.
  • the global navigation satellite system can be a Global Positioning System (GPS).
  • The flight controller 161 is used to control the flight of the unmanned aerial vehicle 110; for example, the flight of the unmanned aerial vehicle 110 can be controlled based on the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the unmanned aerial vehicle 110 in accordance with pre-programmed instructions, or may control the unmanned aerial vehicle 110 based on the captured picture.
  • the pan/tilt 120 can include a motor 122.
  • the pan/tilt is used to carry the photographing device 123.
  • the flight controller 161 can control the motion of the platform 120 via the motor 122.
  • the platform 120 may further include a controller for controlling the motion of the platform 120 by controlling the motor 122.
  • the platform 120 can be independent of the UAV 110 or a portion of the UAV 110.
  • the motor 122 can be a DC motor or an AC motor.
  • the motor 122 may be a brushless motor or a brushed motor.
  • the pan/tilt can be located at the top of the UAV or at the bottom of the UAV.
  • The photographing device 123 may be, for example, a device for capturing an image, such as a camera or a video camera. The photographing device 123 may communicate with the flight controller and perform photographing under the control of the flight controller, and the flight controller may also control the UAV 110 according to the image taken by the photographing device 123.
  • the imaging device 123 of the present embodiment includes at least a photosensitive element, such as a Complementary Metal Oxide Semiconductor (CMOS) sensor or a Charge-coupled Device (CCD) sensor. It will be appreciated that the camera 123 can also be directly attached to the UAV 110 so that the platform 120 can be omitted.
  • Display device 130 is located at the ground end of unmanned aerial vehicle system 100, can communicate with unmanned aerial vehicle 110 wirelessly, and can be used to display attitude information for unmanned aerial vehicle 110. In addition, an image taken by the photographing device can also be displayed on the display device 130. It should be understood that display device 130 may be a device that is independent of UAV 110.
  • the image coordinate system is a two-dimensional plane, also called an image plane, which can be understood as the surface of the sensor in the camera. Each sensor has a certain size and a certain resolution, which determines the conversion relationship between millimeters and pixels.
  • the coordinates of a point in the image coordinate system may be expressed as (u, v) in units of pixels, or as (x, y) in units of millimeters.
  • the image coordinate system can be divided into an image pixel coordinate system and an image physical coordinate system.
  • the unit of the image pixel coordinate system may be a pixel, and the two coordinate axes may be referred to as a U axis and a V axis, respectively.
  • the unit of the image physical coordinate system may be millimeters, and the two coordinate axes may be referred to as an X axis and a Y axis, respectively.
  • the camera coordinate system is a three-dimensional coordinate system.
  • The origin of the camera coordinate system is the optical center of the camera (lens); the X-axis (also called the U-axis) and the Y-axis (also called the V-axis) of the camera coordinate system are parallel to the X-axis (U-axis) and the Y-axis (V-axis) of the image coordinate system, respectively, and the Z-axis is the optical axis of the camera.
  • the geodetic coordinate system is a three-dimensional coordinate system, which can also be called a navigation coordinate system, a local horizontal coordinate system, or a North-East-Down Coordinate System (NED), which is commonly used for navigation calculations.
  • the X-axis points to the north (North), the Y-axis points to the east (East), and the Z-axis points toward the center of the earth (Down).
  • the X and Y axes are tangent to the Earth's surface.
  • Body coordinate system (BodyFrame)
  • the body coordinate system (also known as the Body coordinate system or the Body system) is a three-dimensional coordinate system and is a coordinate system fixed to the body of the drone.
  • the origin of the body coordinate system is at the center of gravity of the aircraft.
  • the X-axis of the body coordinate system points forward along the longitudinal axis of the body, or to the forward direction of the aircraft nose.
  • the Y axis points to the right along the horizontal axis of the body, or from the origin to the right side of the aircraft.
  • the direction of the Z axis is determined by the right hand rule according to the X axis and the Y axis.
  • The body horizontal coordinate system is a three-dimensional coordinate system.
  • The origin of the body horizontal coordinate system is at the center of gravity of the aircraft.
  • The X-axis of the body horizontal coordinate system points forward along the longitudinal axis of the body, i.e., toward the forward direction of the nose of the aircraft.
  • the positive direction of the Z axis is toward the center of the earth.
  • the direction of the Y-axis is determined by the right-hand rule according to the X-axis and the Z-axis.
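The frame conventions above can be made concrete with a small numerical sketch. The rotation matrices and the roll/pitch-removal step below are only an illustration of how a body-frame vector relates to the body horizontal frame under the Z-down convention defined above; all function names are hypothetical, not from the patent.

```python
import math

def rx_t(roll):
    """Transpose of the NED->body roll rotation: undoes the roll angle."""
    c, s = math.cos(roll), math.sin(roll)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry_t(pitch):
    """Transpose of the NED->body pitch rotation: undoes the pitch angle."""
    c, s = math.cos(pitch), math.sin(pitch)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def body_to_body_horizontal(v_body, roll, pitch):
    """Undo roll, then pitch; yaw is kept, so the result is expressed in the
    body horizontal frame (X toward the nose, Z toward the earth's center)."""
    return mat_vec(ry_t(pitch), mat_vec(rx_t(roll), v_body))
```

For a level aircraft the transform is the identity; with a 30-degree nose-up pitch, the body X-axis acquires an upward component (negative Z, since Z points down) in the body horizontal frame.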
  • FIG. 2 is a flowchart of a method for controlling a drone according to Embodiment 1 of the present invention.
  • the execution body may be a control device of the drone, and the control device of the drone may be disposed in the drone.
  • the control method of the UAV provided in this embodiment may include:
  • the surrounding environment of the drone can be photographed by the image acquiring device on the drone to obtain an image including the target pattern.
  • This embodiment does not limit the type of the image acquisition device.
  • it may be the photographing device 123 shown in Fig. 1, or an image sensor on the drone.
  • the target pattern can be used to control the drone so that the drone can follow the target pattern in the pattern following mode.
  • This embodiment does not limit the shape and size of the target pattern.
  • The target pattern may be a preset pattern, for example a regular pattern that is easily recognized, such as a circle or a square, or a specific pattern such as a two-dimensional code.
  • the target pattern can be a human hand.
  • the present invention does not limit the number of target patterns. In a specific application scenario, the number of target patterns is different, and different control of the drone can be realized.
  • FIG. 3 is a schematic diagram of a target pattern and an image according to Embodiment 1 of the present invention. As shown in FIG. 3, after the drone enters the pattern following mode, the image 11 is acquired.
  • the target pattern 12 is included in the image 11.
  • the target pattern 12 is specifically a circular point.
  • the circular point can be fixed on the person's hand, and the circular point is moved by the movement of the hand.
  • the three-dimensional coordinate system may include at least one of a camera coordinate system, a geodetic coordinate system, and a body horizontal coordinate system.
  • the camera coordinate system, the geodetic coordinate system, and the horizontal coordinate system of the body have a certain conversion relationship.
  • the conversion relationship is not limited, and an existing conversion relationship may be adopted. That is to say, if the pose information of the target pattern in any of the above three-dimensional coordinate systems is obtained, the pose information of the target pattern in the other three-dimensional coordinate system can be obtained according to the conversion relationship.
  • the method may further include:
  • the pose information of the target pattern in the camera coordinate system is converted into the pose information of the target pattern in the geodetic coordinate system.
  • Specifically, according to the position and attitude of the drone itself in the geodetic coordinate system and the position and attitude relationship between the image acquiring device and the drone, the pose information of the target pattern in the camera coordinate system can be converted into the geodetic coordinate system, so that the pose information of the target pattern in the geodetic coordinate system is obtained.
  • the Z-axis component of the target pattern in the geodetic coordinate system is taken as an example for description.
  • the Z-axis component of the target pattern in the geodetic coordinate system indicates the height of the target pattern above the ground.
  • the drone can acquire its own height to the ground in the geodetic coordinate system. According to the pose information of the target pattern in the camera coordinate system, the position and attitude relationship between the image acquiring device and the drone, and the height of the drone itself in the earth coordinate system, the height of the target pattern can be obtained. That is, the Z-axis component of the target pattern in the geodetic coordinate system.
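The height computation described above can be sketched minimally as follows, assuming, purely for illustration, a level drone with a forward-looking camera so that the camera axes map directly onto NED axes; the extrinsics and function names are hypothetical simplifications, not the patent's actual conversion.

```python
def camera_to_ned_offset(p_cam):
    """Map a camera-frame point (X right, Y down, Z along the optical axis)
    to a NED offset from the drone, assuming a level drone with a
    forward-looking camera: forward -> N, right -> E, down -> D."""
    x, y, z = p_cam
    return [z, x, y]

def target_height_agl(drone_height_agl, p_cam):
    """Height of the target above ground: the drone's own height minus the
    target's down-axis offset (NED Z is positive downward)."""
    return drone_height_agl - camera_to_ned_offset(p_cam)[2]
```

A drone at 10 m that sees the target 2 m below itself (camera-frame Y = 2.0) infers a target height of 8 m.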
  • The pose information of the target pattern in the three-dimensional coordinate system includes translation information of the target pattern in the three-dimensional coordinate system (which may be denoted by T) and/or rotation (attitude) information of the target pattern in the three-dimensional coordinate system (which may be denoted by R).
  • the position information of the target pattern in the three-dimensional coordinate system can be represented by coordinate values in three-dimensional coordinates.
  • the attitude information of the target pattern in the three-dimensional coordinate system indicates the degree of tilt (Tilt) and/or the degree of torsion of the target pattern in the three-dimensional coordinate system.
  • FIG. 4 is another schematic diagram of a target pattern and an image according to Embodiment 1 of the present invention.
  • the target pattern is a circular point.
  • the optical axis of the image acquisition device of the drone may be perpendicular to the plane of the image 11.
  • the position information of the target pattern 12 in the camera coordinate system can be expressed as (x1, y1, z1), and the attitude information can be expressed as R1.
  • the position information of the target pattern 22 in the camera coordinate system can be expressed as (x2, y2, z2), and the attitude information can be expressed as R2.
  • obtaining the pose information of the target pattern in the three-dimensional coordinate system according to the image may include:
  • the pixel coordinate value of the center point of the target pattern in the image coordinate system is obtained from the image.
  • the pose information of the target pattern in the three-dimensional coordinate system is obtained according to the pixel coordinate value.
  • the image is directly processed, the target pattern in the image is recognized, and the pixel coordinate value of the center point of the target pattern in the image coordinate system is obtained.
  • the pose information of the target pattern in the three-dimensional coordinate system is obtained according to the pixel coordinate value. Since the image is directly processed, and the resolution of the original image is generally high, the accuracy of identifying the target pattern and the accuracy of acquiring the pose information of the target pattern are improved.
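The center-point pixel coordinate can be related to a camera-frame position through standard pinhole back-projection. This is a generic sketch rather than the patent's specific algorithm; the intrinsics fx, fy, cx, cy are assumed to be known from calibration.

```python
def pixel_to_camera(u, v, z, fx, fy, cx, cy):
    """Pinhole back-projection: recover the camera-frame point of pixel
    (u, v) whose depth z along the optical axis is known.
    fx, fy: focal lengths in pixels; cx, cy: principal point."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)
```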
  • the method of processing an image to recognize a target pattern in this embodiment is not limited.
  • For example, a detection algorithm such as a template matching algorithm or a Convolutional Neural Network (CNN) algorithm can be employed.
  • the method for obtaining the pose information of the target pattern in the three-dimensional coordinate system according to the pixel coordinate value is not limited, and an existing algorithm may be used.
  • the position information T and the attitude information R of the target pattern in the camera coordinate system are obtained by a Perspective-n-Point (PNP) algorithm.
  • the PNP solution algorithm is an algorithm that solves camera external parameters by minimizing reprojection errors by using multiple pairs of 3D and 2D matching points in the case of known or unknown camera internal parameters.
  • the PNP solving algorithm is a commonly used algorithm in pose tracking.
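A PnP solver searches for the pose (R, t) that minimizes the reprojection error mentioned above. The sketch below shows only the quantity being minimized, with a simple pinhole model and hypothetical helper names; a real solver (such as those in common computer-vision libraries) would iterate over candidate poses to drive this error down.

```python
def project(p, r, t, fx, fy, cx, cy):
    """Project a 3-D point p through pose (r, t) with a pinhole camera:
    first X_cam = r @ p + t, then perspective division."""
    x, y, z = [sum(r[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]
    return (fx * x / z + cx, fy * y / z + cy)

def reprojection_error(pts3d, pts2d, r, t, fx, fy, cx, cy):
    """Mean squared pixel error between observed 2-D points and the
    projections of the matching 3-D points; a PnP solver chooses (r, t)
    to make this small."""
    total = 0.0
    for p3, (u, v) in zip(pts3d, pts2d):
        pu, pv = project(p3, r, t, fx, fy, cx, cy)
        total += (pu - u) ** 2 + (pv - v) ** 2
    return total / len(pts3d)
```

With perfect correspondences and the true pose, the error is exactly zero.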
  • obtaining the pose information of the target pattern in the three-dimensional coordinate system according to the image may include:
  • the image is downsampled to obtain a low resolution picture.
  • the region of interest is determined in the low resolution picture, the region of interest including the target pattern.
  • a to-be-processed area corresponding to the region of interest is determined in the high-resolution picture corresponding to the low-resolution picture, and the resolution of the high-resolution picture is higher than the resolution of the low-resolution picture.
  • the pixel coordinate value of the center point of the target pattern in the image coordinate system is obtained according to the area to be processed.
  • the pose information of the target pattern in the three-dimensional coordinate system is obtained according to the pixel coordinate value.
  • the image is subjected to down sampling processing to obtain a low resolution picture.
  • the resolution of a low resolution picture is less than the resolution of the image.
  • the specific value of the resolution of the low resolution picture is not limited in this embodiment, and may be, for example, 640*480.
  • the target pattern is identified in the low resolution picture and the Region of Interest (ROI) is determined. Among them, the target pattern can be located in the center of the ROI. Then, a to-be-processed area corresponding to the region of interest is determined in the high-resolution picture corresponding to the low-resolution picture. Among them, the resolution of the high resolution picture is higher than the resolution of the low resolution picture.
  • The high-resolution picture may be the original image itself, or a picture obtained by downsampling the original image (to a resolution still higher than that of the low-resolution picture). According to the to-be-processed area, the pixel coordinate value of the center point of the target pattern in the image coordinate system can be obtained, and then the pose information of the target pattern in the three-dimensional coordinate system can be obtained.
  • the ROI including the target pattern can be determined in the low resolution picture, and the amount of data calculation is reduced in the process of identifying the target pattern, and the operation speed is improved.
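The coarse-to-fine step above amounts to scaling an ROI found in the downsampled picture back into full-resolution pixel coordinates. A minimal sketch follows; the 640*480 low resolution mentioned above and the scale factor here are illustrative.

```python
def roi_low_to_high(roi_low, scale):
    """Map a region of interest found in the downsampled picture back into
    the corresponding to-be-processed area of the high-resolution picture.
    roi_low is (u, v, width, height) in low-resolution pixels; scale is
    high_res_width / low_res_width (e.g. 1920 / 640 = 3)."""
    u, v, w, h = roi_low
    return (u * scale, v * scale, w * scale, h * scale)
```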
  • the method may further include:
  • adjusting the original image so that its resolution is a standard resolution, which improves the data processing effect.
  • the standard resolution refers to the resolution that is usually set according to the image display size.
  • the specific value of the standard resolution is not limited in this embodiment, and is set as needed.
  • the drone tracking target pattern can be controlled according to the pose information, thereby implementing control of the drone.
  • The target pattern can control the flying height of the drone, or control the moving distance or moving speed when the drone approaches or moves away from the target pattern, or control the rotation angle or angular speed of the drone when flying around the target pattern, and so on.
  • The control method for the UAV provided by this embodiment acquires an image containing the target pattern, obtains the pose information of the target pattern in the three-dimensional coordinate system from the image, and then controls the drone to track the target pattern according to the pose information. Using a specific pattern, the follow effect of the drone can be realized easily, and the operator avoids controlling the drone through complicated control operations with a professional control device such as a remote-control joystick, which reduces the control difficulty of the drone and the learning cost of the operator, and enhances the user interaction experience.
  • the method for controlling the UAV provided by the embodiment may further include:
  • determining, among a plurality of preset exposure parameters, a first exposure parameter to be used for shooting.
  • this step can be applied to the initialization phase after the drone enters the pattern following mode.
  • the exposure can be polled based on a preset number of exposure parameters. That is to say, shooting with different exposure parameters results in different images containing the target pattern.
  • The first exposure parameter is the exposure parameter corresponding to the picture in which the target pattern can be identified and whose exposure time is the shortest.
  • an image including the target pattern can be acquired according to the first exposure parameter.
  • the present embodiment does not limit a plurality of preset exposure parameters.
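The exposure polling above can be sketched as picking, among the preset exposure parameters, the shortest exposure whose frame still yields an identifiable target pattern. The detector is stubbed out here as a plain callable; the function name is hypothetical.

```python
def pick_first_exposure(exposure_times, detects_target):
    """Poll the preset exposure times from shortest to longest and return
    the first (i.e. shortest) one whose frame yields an identifiable target
    pattern; returns None if none works. `detects_target` stands in for
    shooting a frame at that exposure and running the pattern detector."""
    for t in sorted(exposure_times):
        if detects_target(t):
            return t
    return None
```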
  • the three-dimensional coordinate system can include a camera coordinate system.
  • the pose information of the target pattern in the three-dimensional coordinate system may include a U-axis component, a V-axis component, and a Z-axis component of the center point of the target pattern in the camera coordinate system.
  • the U-axis component may also be referred to as an X-axis component
  • the V-axis component may also be referred to as a Y-axis component.
  • controlling the drone tracking target pattern according to the pose information in the foregoing S103 may include:
  • the moving speed of the drone in the Z-axis direction of the body coordinate system is controlled according to the V-axis component of the center point of the target pattern in the camera coordinate system.
  • the desired height of the drone can be determined, thereby controlling the moving speed of the drone in the Z-axis direction of the body coordinate system and adjusting the height of the drone.
  • the desired height of the drone can be at the same height as the target pattern.
  • the drone may be located above the target pattern, and the distance between the desired height of the drone and the target pattern is a first preset value.
  • the drone may be located below the target pattern, and the distance between the desired height of the drone and the target pattern is a second preset value.
  • the specific values of the first preset value and the second preset value are not limited in this embodiment.
  • The moving speed of the UAV in the Z-axis direction of the body coordinate system is controlled according to the desired height of the UAV; an existing implementation manner may be adopted, which is not limited in this embodiment. It varies depending on the drone motion model: for example, the drone can move at a constant speed, with acceleration, and so on.
  • FIG. 5 is a schematic diagram of controlling the height adjustment of a drone according to an embodiment of the present invention.
  • the left side of FIG. 5 shows the image 31 including the target pattern 32 and the coordinate axes (u, v, z) of the camera coordinate system.
  • The drone 33 and the coordinate axes (X, Y, Z) of the body coordinate system are shown on the right side of FIG. 5.
  • the U-axis component, the V-axis component, and the Z-axis component of the center point of the target pattern 32 in the camera coordinate system are u1, v1, and z1, respectively.
  • If the desired height of the drone 33 is the same as the height of the target pattern 32, the desired height of the drone 33 may be the height A (v1).
  • If the drone 33 is to be located above the target pattern 32 by a distance d1, the desired height of the drone 33 may be the height B (v1+d1).
  • In this way, the moving speed of the drone in the Z-axis direction of the body coordinate system can be controlled according to the V-axis component of the center point of the target pattern in the camera coordinate system.
  • The V-axis component of the target pattern in the camera coordinate system thus controls the moving speed of the drone in the Z-axis direction of the body coordinate system, so that the operator avoids controlling the flying height of the drone through complicated control operations with a professional control device such as a remote-control joystick, which reduces the control difficulty of the drone and the learning cost of the operator, and improves the user interaction experience.
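The mapping from the V-axis pixel error to a Z-axis speed can be sketched as a simple proportional (P) controller. The gain and speed limit below are illustrative values, not taken from the patent.

```python
def z_velocity_command(v_pixel, v_desired, gain=0.01, v_max=1.0):
    """Proportional command: the error between the target center's V-axis
    pixel coordinate and the desired coordinate sets the climb/descend
    speed (m/s), clipped to +/- v_max. Gain and limit are illustrative."""
    cmd = gain * (v_pixel - v_desired)
    return max(-v_max, min(v_max, cmd))
```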
  • controlling the drone tracking target pattern according to the pose information in the foregoing S103 may include:
  • the moving speed of the drone in the X-axis direction of the body coordinate system is controlled according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
  • The desired position of the drone moving forward and backward can be determined, so that the moving speed of the drone in the X-axis direction of the body coordinate system can be controlled, thereby adjusting the distance between the drone and the target pattern.
  • the desired distance between the drone and the target pattern may be a third preset value. The specific value of the third preset value is not limited in this embodiment.
  • The moving speed of the UAV in the X-axis direction of the body coordinate system is controlled according to the desired position of the UAV moving forward and backward; an existing implementation manner may be adopted, which is not limited in this embodiment. It varies depending on the drone motion model: for example, the drone can move at a constant speed, with acceleration, and so on.
  • the pose information of the target pattern in the camera coordinate system can be seen in FIG.
  • The Z-axis component of the center point of the target pattern 32 in the camera coordinate system is z1. If the desired distance between the drone 33 and the target pattern 32 is P (P>0), the desired moving position of the drone 33 may be z1+P, so that the moving speed of the drone 33 in the X-axis direction of the body coordinate system can be controlled.
• the Z-axis component of the target pattern in the camera coordinate system controls the movement speed of the drone in the X-axis direction of the body coordinate system, thereby avoiding the need for the operator to control the distance of the drone through complicated operations on a professional control device such as a remote-control joystick, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
  • controlling the UAV tracking target pattern according to the pose information may include:
  • the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the U-axis component of the center point of the target pattern in the camera coordinate system.
• the desired position of the left and right movement of the drone can be determined, so that the moving speed of the drone in the Y-axis direction of the body coordinate system can be controlled and the left-right position of the drone can be adjusted.
  • the drone can face the target pattern, that is, by controlling the left and right movement of the drone so that the target pattern is located on the X-axis of the body coordinate system.
• the drone may be located on the left side of the target pattern, and the distance between the drone and the target pattern is a fourth preset value.
• the drone may be located on the right side of the target pattern, and the distance between the drone and the target pattern is a fifth preset value.
  • the specific values of the fourth preset value and the fifth preset value are not limited in this embodiment.
• the moving speed of the UAV in the Y-axis direction of the body coordinate system is controlled according to the desired position of the UAV moving left and right; an existing implementation manner may be adopted, which this embodiment does not limit. It varies depending on the drone motion model; for example, the drone can move at a constant speed, with acceleration, and the like.
  • the pose information of the target pattern in the camera coordinate system can be seen in FIG.
• the U-axis component of the center point of the target pattern 32 in the camera coordinate system is u1. If the drone 33 is facing the target pattern 32, the desired moving position of the drone 33 is u1. If the drone 33 is located on the left side of the target pattern 32, and the desired distance between the drone 33 and the target pattern 32 is Q (Q>0), the desired moving position of the drone 33 is u1-Q. If the drone 33 is located on the right side of the target pattern 32, and the desired distance between the drone 33 and the target pattern 32 is Q, the desired moving position of the drone 33 is u1+Q. Thereby, the moving speed of the drone 33 in the Y-axis direction of the body coordinate system can be controlled according to the U-axis component of the center point of the target pattern in the camera coordinate system.
• the U-axis component of the target pattern in the camera coordinate system controls the movement speed of the drone in the Y-axis direction of the body coordinate system, thereby avoiding the need for the operator to control the left and right movement of the drone through complicated operations on a professional control device such as a remote-control joystick, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
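The three camera-frame mappings described above (Z-axis component to forward/backward speed, U-axis component to left/right speed, V-axis component to up/down speed) can be sketched as a simple proportional controller. The function name, gains, speed limit and the proportional law itself are illustrative assumptions; the embodiments deliberately leave the motion model open (constant speed, acceleration, and the like).

```python
def camera_frame_velocity_cmd(u, v, z, desired_distance,
                              lateral_offset=0.0, vertical_offset=0.0,
                              gain=0.5, max_speed=2.0):
    """Map the center point (u, v, z) of the target pattern in the camera
    coordinate system to body-frame velocity commands:
      Z-axis component -> speed along the body X axis (forward/backward),
      U-axis component -> speed along the body Y axis (left/right),
      V-axis component -> speed along the body Z axis (up/down)."""
    def clamp(s):
        return max(-max_speed, min(max_speed, s))

    vx = clamp(gain * (z - desired_distance))   # close the range error
    vy = clamp(gain * (u - lateral_offset))     # move toward the desired left/right position
    vz = clamp(gain * (v - vertical_offset))    # move toward the desired height in the image
    return vx, vy, vz
```

With `desired_distance` set to the third preset value and `lateral_offset` set to 0, -Q or +Q, this reproduces the desired moving positions u1, u1-Q and u1+Q discussed above.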
  • the three-dimensional coordinate system can include a geodetic coordinate system.
  • the pose information of the target pattern in the three-dimensional coordinate system may include an X-axis component, a Y-axis component, and a Z-axis component of the center point of the target pattern in the geodetic coordinate system, and attitude information of the target pattern in the geodetic coordinate system.
  • controlling the UAV tracking target pattern according to the pose information may include:
  • the moving speed of the drone in the Z-axis direction of the body coordinate system is controlled according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
  • the Z-axis component of the center point of the target pattern in the geodetic coordinate system indicates the height of the target pattern to the ground.
  • the desired height of the drone can be determined, so that the moving speed of the drone in the Z-axis direction of the body coordinate system can be controlled, and the height of the drone can be adjusted.
  • the desired height of the drone can be at the same height as the target pattern.
  • the drone may be located above the target pattern, and the distance between the desired height of the drone and the target pattern is a first preset value.
  • the drone may be located below the target pattern, and the distance between the desired height of the drone and the target pattern is a second preset value.
• for the first preset value and the second preset value, reference may be made to the above description, and details are not described herein again.
• the moving speed of the drone in the Z-axis direction of the body coordinate system is controlled according to the desired height of the drone, and an existing implementation manner can be adopted; reference may be made to the above description, and details are not described herein again.
  • FIG. 6 is another schematic diagram of controlling the height adjustment of the drone according to an embodiment of the present invention.
  • the left side of FIG. 6 shows the target pattern 42 and the coordinate axes (x, y, z) of the geodetic coordinate system.
  • the current height of the drone 43 and the coordinate axes (X, Y, Z) of the body coordinate system are shown on the right side in Fig. 6.
  • the X-axis component, the Y-axis component, and the Z-axis component of the center point of the target pattern 42 in the geodetic coordinate system are x3, y3, and z3, respectively.
• the desired height of the drone 43 may be the height C (z3).
• the desired height of the drone 43 may be the height D (z3+d3).
  • the moving speed v of the drone 43 in the Z-axis direction in the body coordinate system can be controlled according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
• the Z-axis component of the target pattern in the geodetic coordinate system controls the movement speed of the drone in the Z-axis direction of the body coordinate system, thereby avoiding the need for the operator to control the flying height of the drone through complicated operations on a professional control device such as a remote-control joystick, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
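The height-control cases above (same height as the pattern, a first preset value above it, or a second preset value below it) can be sketched as follows. The function name, mode strings, gain and speed limit are assumptions for illustration, not part of the described embodiments.

```python
def vertical_speed_cmd(target_z, current_z, mode="same",
                       offset=1.0, gain=0.8, max_speed=1.5):
    """Pick a desired height relative to the target pattern's Z-axis
    component in the geodetic coordinate system, then command a
    climb/descend speed along the body Z axis.
    mode: "same"  -> same height as the pattern,
          "above" -> offset above it (e.g. the first preset value),
          "below" -> offset below it (e.g. the second preset value)."""
    if mode == "same":
        desired = target_z
    elif mode == "above":
        desired = target_z + offset
    elif mode == "below":
        desired = target_z - offset
    else:
        raise ValueError("unknown mode: %s" % mode)
    error = desired - current_z
    return max(-max_speed, min(max_speed, gain * error))
```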
  • controlling the UAV tracking target pattern according to the pose information may include:
  • the distance between the drone and the target pattern is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the geodetic coordinate system.
  • the moving speed of the drone in the X-axis direction of the body coordinate system is controlled according to the distance.
• the desired position between the drone and the target pattern can be determined, thereby controlling the moving speed of the drone in the X-axis direction of the body coordinate system to adjust the distance between the drone and the target pattern.
  • the distance between the drone and the target pattern is determined by the X-axis component and the Y-axis component of the center point of the target pattern in the geodetic coordinate system, and the moving speed of the drone in the X-axis direction of the body coordinate system is controlled according to the distance,
• the operator thus avoids using a professional control device such as a remote-control joystick to control the distance of the drone through complicated control operations, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
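The step above, determining the horizontal distance from the pattern's X-axis and Y-axis components in the geodetic coordinate system and turning it into a forward speed, can be sketched as below. The function name, gain, limit and proportional law are assumptions for illustration.

```python
import math

def forward_speed_from_geodetic(target_x, target_y, drone_x, drone_y,
                                desired_distance=3.0, gain=0.6, max_speed=2.0):
    """Horizontal distance between the drone and the target pattern,
    computed from their X-axis and Y-axis components in the geodetic
    coordinate system, turned into a forward/backward speed command
    along the body X axis."""
    distance = math.hypot(target_x - drone_x, target_y - drone_y)
    error = distance - desired_distance
    return max(-max_speed, min(max_speed, gain * error))
```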
  • controlling the UAV tracking target pattern according to the pose information may include:
• the heading offset angle is determined according to the attitude information of the target pattern in the geodetic coordinate system, and the heading offset angle is the angle between the heading of the drone in the body horizontal coordinate system and the local horizontal coordinate system.
  • the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the heading offset angle.
• the posture information of the target pattern in the geodetic coordinate system can be decomposed into tilt information (Tilt) and torsion information (Torsion), thereby determining the heading offset angle and the desired distance of the left and right movement of the drone, and controlling the moving speed of the drone in the Y-axis direction of the body coordinate system to adjust the left and right movement of the drone.
• the heading offset angle is determined by the attitude information of the target pattern in the geodetic coordinate system, and the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the heading offset angle, thereby avoiding the need for the operator to control the left and right movement of the drone through complicated operations on a professional control device such as a remote-control joystick, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
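Assuming the heading offset angle has already been extracted from the tilt/torsion decomposition described above, the final step of mapping it to a lateral speed can be sketched as below. The wrap-around convention, gain and limit are illustrative assumptions.

```python
import math

def lateral_speed_from_heading_offset(pattern_heading, drone_heading,
                                      gain=1.0, max_speed=1.0):
    """Heading offset angle between the pattern's heading and the drone's
    heading (both in radians), wrapped into [-pi, pi), then turned into a
    left/right speed command along the body Y axis."""
    offset = (pattern_heading - drone_heading + math.pi) % (2 * math.pi) - math.pi
    return max(-max_speed, min(max_speed, gain * offset))
```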
  • the three-dimensional coordinate system can include a body horizontal coordinate system.
  • the pose information of the target pattern in the three-dimensional coordinate system may include an X-axis component and a Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • controlling the UAV tracking target pattern according to the pose information may include:
  • the angle between the target pattern and the yaw direction of the drone in the body coordinate system is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • the angular velocity of the drone on the yaw axis in the body coordinate system is controlled according to the angle.
  • the drone can be controlled to rotate around the target pattern to implement the drone tracking target pattern.
• the angle between the target pattern and the yaw direction of the drone in the body coordinate system, and the desired yaw angular velocity of the drone, can be determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
• the angle between the target pattern and the yaw direction of the drone in the body coordinate system is determined by the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body, and the angular velocity of the drone on the yaw axis in the body coordinate system is controlled according to this angle, which avoids the operator using professional control equipment such as a remote-control joystick to control the movement of the drone through complicated control operations, reduces the control difficulty of the drone and the learning cost for the operator, and enhances the user interaction experience.
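The angle determination and yaw-rate control above can be sketched as follows. The axis convention (body X forward, Y to the side), gain and rate limit are assumptions for illustration; only the use of atan2 on the two components follows directly from the text.

```python
import math

def yaw_rate_cmd(x, y, gain=0.8, max_rate=math.radians(45)):
    """Angle between the target pattern and the drone's yaw direction,
    computed from the pattern center point's X-axis and Y-axis components
    in the body horizontal coordinate system, followed by a proportional
    yaw angular velocity command (radians per second)."""
    angle = math.atan2(y, x)
    return max(-max_rate, min(max_rate, gain * angle))
```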
  • controlling the UAV tracking target pattern according to the pose information may include:
  • the angle between the target pattern and the yaw direction of the drone in the body coordinate system in the planned route is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
• the tangential linear velocity of the drone's circling motion around the target pattern is determined according to the angle.
• the angular velocity of the drone on the yaw axis in the body coordinate system is controlled according to the tangential linear velocity of the circling motion.
  • the drone can be controlled to rotate around the target pattern to implement the drone tracking target pattern.
  • the angle between the target pattern and the yaw direction of the drone in the body coordinate system may be determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
• the tangential linear velocity of the drone when circling around the target pattern can be determined.
• the yaw angular velocity can be controlled according to this tangential linear velocity and the circling radius.
• the angular velocity of the UAV on the yaw axis of the body coordinate system is controlled according to the tangential linear velocity of the circling motion, thereby avoiding the operator using professional control equipment such as a remote-control joystick to control the movement of the drone through complicated control operations, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
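The circling steps above (angle error to tangential linear velocity, then yaw angular velocity from the tangential velocity and the circling radius, omega = v / r) can be sketched as below. The function name, gain and velocity limit are illustrative assumptions.

```python
def orbit_cmds(angle_error, radius, gain=1.2, max_tangential=3.0):
    """Tangential linear velocity of the circling motion from the angle
    error (radians), then the yaw angular velocity that keeps the drone
    turning around the target: omega = v / r."""
    v_tan = max(-max_tangential, min(max_tangential, gain * angle_error))
    omega = v_tan / radius
    return v_tan, omega
```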
  • the number of target patterns can be two.
  • the three-dimensional coordinate system may include a camera coordinate system, and the pose information of the target pattern in the three-dimensional coordinate system may include an X-axis component, a Y-axis component, and a Z-axis component of the center points of the two target patterns in the camera coordinate system, respectively.
  • controlling the UAV tracking target pattern according to the pose information may include:
  • the yaw angle of the pan/tilt on the UAV or the UAV in the body coordinate system is controlled according to the X-axis component and the Y-axis component of the center point of the two target patterns in the camera coordinate system.
  • FIG. 7 is a schematic diagram of controlling a drone by two target patterns according to an embodiment of the present invention.
  • the image 51 including two target patterns is shown on the left side in FIG.
  • the two target patterns are the target pattern 52 and the target pattern 53, respectively.
  • the middle portion in FIG. 7 is a top view of the current scene, showing the positional relationship between the target pattern 52, the target pattern 53, and the drone 54.
  • the position of the center point of the target pattern 52 in the camera coordinate system is marked as P1
• x1 is the X-axis component of the target pattern 52 in the camera coordinate system
  • y1 is the Y-axis component of the target pattern 52 in the camera coordinate system
  • z1 is the Z-axis component of the target pattern 52 in the camera coordinate system.
• x2 is the X-axis component of the target pattern 53 in the camera coordinate system
  • y2 is the Y-axis component of the target pattern 53 in the camera coordinate system
  • z2 is the Z-axis component of the target pattern 53 in the camera coordinate system.
• the UAV 54 or the PTZ 55 on the UAV 54 can be controlled to rotate in the horizontal direction such that the UAV 54 or the PTZ 55 on the UAV 54 is perpendicular to the line connecting P1 and P2.
  • the right side of FIG. 7 is a top view of the scene after the drone is rotated, showing the positional relationship between the target pattern 52, the target pattern 53, and the drone 54.
• the angle θ by which the drone or the pan/tilt on the drone needs to rotate can be determined by the following formula.
• the angle θ is mapped to the heading change amount (yaw angle) of the drone in the geodetic coordinate system, or the heading change amount of the pan/tilt on the drone in the geodetic coordinate system.
• the X-axis components and the Y-axis components of the two target patterns in the camera coordinate system respectively control the yaw angle of the UAV or of the pan/tilt on the UAV in the body coordinate system, thereby avoiding the need for the operator to control the rotation of the drone through complicated operations on a professional control device such as a remote-control joystick, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
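The formula referenced above is not reproduced in this text. One plausible reconstruction, assuming image coordinates with x to the right and y downward, is that the heading correction equals the angle of the segment P1-P2 relative to the horizontal image axis:

```python
import math

def yaw_to_face_two_markers(x1, y1, x2, y2):
    """Assumed reconstruction: angle of the segment P1-P2 relative to the
    horizontal image axis; rotating the drone (or its pan/tilt) by this
    angle would make the camera perpendicular to the line connecting the
    two target patterns."""
    return math.atan2(y2 - y1, x2 - x1)
```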
  • controlling the UAV tracking target pattern according to the pose information may include:
  • the pitch angle of the pan/tilt on the drone in the body coordinate system is controlled according to the X-axis component and the Z-axis component of the center point of the two target patterns in the camera coordinate system, respectively.
  • FIG. 8 is another schematic diagram of controlling a drone by two target patterns according to an embodiment of the present invention.
  • the image on the left side of FIG. 8 shows an image 61 including two target patterns.
  • the two target patterns are the target pattern 62 and the target pattern 63, respectively.
  • the middle portion in Fig. 8 is a top view of the current scene, showing the positional relationship between the target pattern 62, the target pattern 63, and the drone 64.
• x1 is the X-axis component of the target pattern 62 in the camera coordinate system
  • y1 is the Y-axis component of the target pattern 62 in the camera coordinate system
  • z1 is the Z-axis component of the target pattern 62 in the camera coordinate system.
• x2 is the X-axis component of the target pattern 63 in the camera coordinate system
  • y2 is the Y-axis component of the target pattern 63 in the camera coordinate system
  • z2 is the Z-axis component of the target pattern 63 in the camera coordinate system.
  • the pan/tilt head 65 on the drone 64 can be controlled to rotate in the vertical direction such that the pan/tilt head 65 on the drone 64 is perpendicular to the line connecting Q1 and Q2.
• the right side of Fig. 8 is a view of the scene after the pan/tilt on the drone is rotated, showing the positional relationship between the target pattern 62, the target pattern 63, the drone 64 and the pan/tilt 65.
• the angle α by which the pan/tilt on the drone needs to rotate can be determined by the following formula.
• the angle α is mapped to the pitch angle of the pan/tilt on the drone in the body coordinate system.
• controlling the yaw angle of the UAV or of the pan/tilt on the UAV in the body coordinate system through the two target patterns, and controlling the pitch angle of the pan/tilt on the UAV in the body coordinate system, can be combined with each other and executed at the same time.
• the X-axis components and the Z-axis components of the two target patterns in the camera coordinate system respectively control the pitch angle of the pan/tilt on the UAV in the body coordinate system, thereby avoiding the need for the operator to control the rotation of the drone through complicated operations on a professional control device such as a remote-control joystick, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
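As with the yaw case, the pitch formula is not reproduced in this text. A plausible reconstruction from the X-axis and Z-axis components is the angle of the segment Q1-Q2 in the camera X-Z plane:

```python
import math

def pitch_to_face_two_markers(x1, z1, x2, z2):
    """Assumed reconstruction: angle of the segment Q1-Q2 in the X-Z plane
    of the camera coordinate system; rotating the pan/tilt vertically by
    this angle would make it perpendicular to the line connecting the two
    target patterns."""
    return math.atan2(z2 - z1, x2 - x1)
```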
  • the embodiment provides a method for controlling a drone, comprising: acquiring an image including a target pattern, obtaining pose information of the target pattern in the three-dimensional coordinate system according to the image, and controlling the drone tracking target pattern according to the pose information.
• the control method of the drone provided by this embodiment can realize target following by the drone simply by using a specific pattern, and avoids the operator controlling the drone through complicated control operations using a professional control device such as a remote-control joystick. It reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
  • FIG. 9 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • This embodiment provides a specific implementation manner of the control method of the unmanned aerial vehicle based on the above embodiments.
  • the control method of the UAV provided in this embodiment may include:
  • the ROI includes a target pattern.
  • the resolution of the high resolution picture is higher than the resolution of the low resolution picture.
  • the pixel coordinate value of the center point of the target pattern in the image coordinate system is obtained according to the image to be processed, and the pose information of the target pattern in the three-dimensional coordinate system is obtained according to the pixel coordinate value.
  • the pose information of the target pattern in the camera coordinate system may include a U-axis component, a V-axis component, and a Z-axis component of the center point of the target pattern in the camera coordinate system.
• this embodiment provides a control method for the drone, which can realize target following by the drone simply by using a specific pattern, and avoids the operator using a professional control device such as a remote-control joystick to control the drone through complicated control operations, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
• the control method of the UAV provided in this embodiment may include S201 to S207.
• the control method of the drone provided in this embodiment may further include:
  • the pose information of the target pattern in the horizontal coordinate system of the body may include an X-axis component and a Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • the present embodiment does not limit the execution order between S205 to S207 and S209 to S211.
• this embodiment provides a control method for the drone, which can realize target following by the drone simply by using a specific pattern, and avoids the operator using a professional control device such as a remote-control joystick to control the drone through complicated control operations, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
  • FIG. 10 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • This embodiment provides another specific implementation manner of the control method of the unmanned aerial vehicle based on the above embodiments.
  • the control method of the UAV provided in this embodiment may include:
  • the ROI includes a target pattern.
  • the resolution of the high resolution picture is higher than the resolution of the low resolution picture.
  • the pixel coordinate value of the center point of the target pattern in the image coordinate system is obtained according to the image to be processed, and the pose information of the target pattern in the three-dimensional coordinate system is obtained according to the pixel coordinate value.
  • the pose information of the target pattern in the geodetic coordinate system may include an X-axis component, a Y-axis component, and a Z-axis component of the center point of the target pattern in the geodetic coordinate system, and attitude information of the target pattern in the geodetic coordinate system.
  • the distance between the drone and the target pattern is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the geodetic coordinate system, and the UAV is controlled in the X-axis direction of the body coordinate system according to the distance. Moving speed.
• the heading offset angle is determined according to the attitude information of the target pattern in the geodetic coordinate system, and the heading offset angle is the angle between the heading of the drone in the body horizontal coordinate system and the local horizontal coordinate system.
  • the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the heading offset angle.
• this embodiment provides a control method for the drone, which can realize target following by the drone simply by using a specific pattern, and avoids the operator using a professional control device such as a remote-control joystick to control the drone through complicated control operations, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
• the control method of the UAV may include S301 to S308.
• the control method of the drone provided in this embodiment may further include:
  • the pose information of the target pattern in the horizontal coordinate system of the body may include an X-axis component and a Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • S311 Determine an angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • the embodiment does not limit the execution order between S305 to S308 and S310 to S312.
• this embodiment provides a control method for the drone, which can realize target following by the drone simply by using a specific pattern, and avoids the operator using a professional control device such as a remote-control joystick to control the drone through complicated control operations, which reduces the control difficulty of the drone and the learning cost for the operator, and improves the user interaction experience.
  • FIG. 11 is a schematic structural diagram of a control device for a drone according to an embodiment of the present invention.
  • the control device of the UAV provided by the embodiment is used to execute the control method of the UAV provided by the embodiment of the method of the present invention.
  • the control device of the drone provided in this embodiment may include:
  • the obtaining module 71 is configured to acquire an image including a target pattern, and obtain pose information of the target pattern in the three-dimensional coordinate system according to the image.
  • the control module 72 is configured to control the drone tracking target pattern according to the pose information.
  • the three-dimensional coordinate system includes a camera coordinate system
  • the pose information includes a U-axis component, a V-axis component, and a Z-axis component of a center point of the target pattern in the camera coordinate system.
  • control module 72 is specifically configured to:
  • the moving speed of the drone in the Z-axis direction of the body coordinate system is controlled according to the V-axis component of the center point of the target pattern in the camera coordinate system.
  • control module 72 is specifically configured to:
  • the moving speed of the drone in the X-axis direction of the body coordinate system is controlled according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
  • control module 72 is specifically configured to:
  • the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the U-axis component of the center point of the target pattern in the camera coordinate system.
  • the three-dimensional coordinate system includes a geodetic coordinate system
  • the pose information includes an X-axis component, a Y-axis component, and a Z-axis component of the center point of the target pattern in the geodetic coordinate system, and posture information of the target pattern in the geodetic coordinate system.
  • control module 72 is specifically configured to:
  • the moving speed of the drone in the Z-axis direction of the body coordinate system is controlled according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
  • control module 72 is specifically configured to:
  • the distance between the drone and the target pattern is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the geodetic coordinate system.
  • the moving speed of the drone in the X-axis direction of the body coordinate system is controlled according to the distance.
  • control module 72 is specifically configured to:
• the heading offset angle is determined according to the attitude information of the target pattern in the geodetic coordinate system, and the heading offset angle is the angle between the heading of the drone in the body horizontal coordinate system and the local horizontal coordinate system.
  • the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the heading offset angle.
  • the three-dimensional coordinate system further includes a horizontal coordinate system of the body, and the pose information includes an X-axis component and a Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • control module 72 is specifically configured to:
  • the angle between the target pattern and the yaw direction of the drone in the body coordinate system is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • the angular velocity of the drone on the yaw axis in the body coordinate system is controlled according to the angle.
  • control module 72 is specifically configured to:
  • the angle between the target pattern and the yaw direction of the drone in the body coordinate system in the planned route is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • the circular tangential linear velocity of the drone when rotating around the target pattern is determined according to the angle.
  • the angular velocity of the drone on the yaw axis in the body coordinate system is controlled according to the tangential linear velocity of the ring.
• the number of target patterns is two
  • the three-dimensional coordinate system is a camera coordinate system
  • the pose information includes an X-axis component, a Y-axis component, and a Z-axis component of the center points of the two target patterns in the camera coordinate system, respectively.
  • control module 72 is specifically configured to:
  • the yaw angle of the pan/tilt on the UAV or the UAV in the body coordinate system is controlled according to the X-axis component and the Y-axis component of the center point of the two target patterns in the camera coordinate system.
• the yaw angle θ is determined from the following quantities:
  • the two target patterns include a first target pattern and a second target pattern
  • x1 is an X-axis component of the first target pattern in a camera coordinate system
  • y1 is a Y-axis component of the first target pattern in a camera coordinate system
• x2 is the X-axis component of the second target pattern in the camera coordinate system
  • y2 is the Y-axis component of the second target pattern in the camera coordinate system.
  • control module 72 is specifically configured to:
  • the pitch angle of rotation, in the body coordinate system, of the gimbal (pan/tilt) on the drone is controlled according to the X-axis components and the Z-axis components of the center points of the two target patterns in the camera coordinate system.
  • the pitch angle α is α = arctan((z1 - z2) / (x1 - x2)), where
  • the two target patterns include a first target pattern and a second target pattern
  • x1 is an X-axis component of the first target pattern in a camera coordinate system
  • z1 is a Z-axis component of the first target pattern in a camera coordinate system
  • x2 is the X-axis component of the second target pattern in the camera coordinate system
  • z2 is the Z-axis component of the second target pattern in the camera coordinate system.
  • control module 72 is further configured to:
  • a first exposure parameter to be used for shooting is determined among a plurality of preset exposure parameters.
  • the obtaining module 71 is specifically configured to:
  • the image is downsampled to obtain a low resolution picture.
  • the region of interest is determined in the low resolution picture, the region of interest including the target pattern.
  • a to-be-processed area corresponding to the region of interest is determined in the high-resolution picture corresponding to the low-resolution picture, and the resolution of the high-resolution picture is higher than the resolution of the low-resolution picture.
  • the pixel coordinate value of the center point of the target pattern in the image coordinate system is obtained according to the area to be processed.
  • the pose information of the target pattern in the three-dimensional coordinate system is obtained according to the pixel coordinate value.
  • the control device of the UAV provided in this embodiment is used to perform the control method of the UAV provided by the method embodiment of the present invention, and the principle is similar, and details are not described herein again.
  • FIG. 12 is a schematic structural diagram of a control device for a drone according to another embodiment of the present invention.
  • the control device of the UAV provided by the embodiment is used to execute the control method of the UAV provided by the embodiment of the method of the present invention.
  • the control device of the drone provided in this embodiment may include: a memory 81 and a processor 82.
  • the memory 81 is configured to store program code.
  • the processor 82 calls the program code to perform the following operations when the program code is executed:
  • the pose information of the target pattern in the three-dimensional coordinate system is obtained from the image.
  • the drone tracking target pattern is controlled according to the pose information.
  • the three-dimensional coordinate system includes a camera coordinate system
  • the pose information includes a U-axis component, a V-axis component, and a Z-axis component of a center point of the target pattern in the camera coordinate system.
  • the processor 82 is specifically configured to:
  • the moving speed of the drone in the Z-axis direction of the body coordinate system is controlled according to the V-axis component of the center point of the target pattern in the camera coordinate system.
  • the processor 82 is specifically configured to:
  • the moving speed of the drone in the X-axis direction of the body coordinate system is controlled according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
  • the processor 82 is specifically configured to:
  • the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the U-axis component of the center point of the target pattern in the camera coordinate system.
  • the three-dimensional coordinate system includes a geodetic coordinate system
  • the pose information includes an X-axis component, a Y-axis component, and a Z-axis component of the center point of the target pattern in the geodetic coordinate system, and posture information of the target pattern in the geodetic coordinate system.
  • the processor 82 is specifically configured to:
  • the moving speed of the drone in the Z-axis direction of the body coordinate system is controlled according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
  • the processor 82 is specifically configured to:
  • the distance between the drone and the target pattern is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the geodetic coordinate system.
  • the moving speed of the drone in the X-axis direction of the body coordinate system is controlled according to the distance.
  • the processor 82 is specifically configured to:
  • the heading offset angle is determined according to the attitude information of the target pattern in the geodetic coordinate system; the heading offset angle is the angle between the headings of the drone in the body horizontal coordinate system and in the local horizontal coordinate system.
  • the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the heading offset angle.
  • the three-dimensional coordinate system further includes a horizontal coordinate system of the body, and the pose information includes an X-axis component and a Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • the processor 82 is specifically configured to:
  • the angle between the target pattern and the yaw direction of the drone in the body coordinate system is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the horizontal coordinate system of the body.
  • the angular velocity of the drone on the yaw axis in the body coordinate system is controlled according to the angle.
  • the processor 82 is specifically configured to:
  • the angle between the target pattern and the yaw direction of the drone in the body coordinate system on the planned route is determined according to the X-axis component and the Y-axis component of the center point of the target pattern in the body horizontal coordinate system.
  • the tangential linear velocity of the drone on the circular path around the target pattern is determined according to the angle.
  • the angular velocity of the drone on the yaw axis in the body coordinate system is controlled according to the tangential linear velocity.
  • there are two target patterns
  • the three-dimensional coordinate system is a camera coordinate system
  • the pose information includes an X-axis component, a Y-axis component, and a Z-axis component of the center points of the two target patterns in the camera coordinate system, respectively.
  • the processor 82 is specifically configured to:
  • the yaw angle of rotation, in the body coordinate system, of the UAV or of the gimbal (pan/tilt) on the UAV is controlled according to the X-axis components and the Y-axis components of the center points of the two target patterns in the camera coordinate system.
  • the yaw angle θ is θ = arctan((y1 - y2) / (x1 - x2)), where
  • the two target patterns include a first target pattern and a second target pattern
  • x1 is an X-axis component of the first target pattern in a camera coordinate system
  • y1 is a Y-axis component of the first target pattern in a camera coordinate system
  • x2 is the X-axis component of the second target pattern in the camera coordinate system
  • y2 is the Y-axis component of the second target pattern in the camera coordinate system.
  • the processor 82 is specifically configured to:
  • the pitch angle of rotation, in the body coordinate system, of the gimbal (pan/tilt) on the drone is controlled according to the X-axis components and the Z-axis components of the center points of the two target patterns in the camera coordinate system.
  • the pitch angle α is α = arctan((z1 - z2) / (x1 - x2)), where
  • the two target patterns include a first target pattern and a second target pattern
  • x1 is an X-axis component of the first target pattern in a camera coordinate system
  • z1 is a Z-axis component of the first target pattern in a camera coordinate system
  • x2 is the X-axis component of the second target pattern in the camera coordinate system
  • z2 is the Z-axis component of the second target pattern in the camera coordinate system.
  • processor 82 is further configured to:
  • a first exposure parameter to be used for shooting is determined among a plurality of preset exposure parameters.
  • the processor 82 is specifically configured to:
  • the image is downsampled to obtain a low resolution picture.
  • the region of interest is determined in the low resolution picture, the region of interest including the target pattern.
  • a to-be-processed area corresponding to the region of interest is determined in the high-resolution picture corresponding to the low-resolution picture, and the resolution of the high-resolution picture is higher than the resolution of the low-resolution picture.
  • the pixel coordinate value of the center point of the target pattern in the image coordinate system is obtained according to the area to be processed.
  • the pose information of the target pattern in the three-dimensional coordinate system is obtained according to the pixel coordinate value.
  • the control device of the UAV provided in this embodiment is used to perform the control method of the UAV provided by the method embodiment of the present invention, and the principle is similar, and details are not described herein again.
  • An embodiment of the present invention further provides a drone, including the control device of the drone provided by the embodiment shown in FIG. 11 or FIG. 12.
  • the control device of the UAV provided by the embodiment is used to perform the control method of the UAV provided by the embodiment of the method of the present invention, and the principle is similar, and details are not described herein again.
  • the aforementioned program can be stored in a computer readable storage medium.
  • the program when executed, performs the steps including the foregoing method embodiments; and the foregoing storage medium includes various media that can store program codes, such as a ROM, a RAM, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A control method for a drone, a control apparatus, and a drone. The control method includes: acquiring an image containing a target pattern (S101); obtaining, from the image, pose information of the target pattern in a three-dimensional coordinate system (S102); and controlling the drone to track the target pattern according to the pose information (S103). A follow effect can be achieved with a specific pattern, so the operator no longer needs to control the drone through complex operations on professional control equipment such as remote-controller sticks. This lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.

Description

Control method and apparatus for a drone, and drone
Technical Field
The present invention relates to the field of drone technology, and in particular to a control method and apparatus for a drone, and to a drone.
Background Art
With advances in technology and falling costs, more and more users are using drones for aerial photography.
At present, precise control of a drone can be achieved with the sticks of a remote controller. By manipulating the sticks, the operator can finely adjust the heading of the drone or the angle of the gimbal, and control the drone to follow and shoot a target. Precise control can also be achieved with a smartphone: the operator operates a virtual stick on the smartphone's touch screen, or the attitude of the smartphone is mapped onto the attitude of the aircraft.
However, these control methods require the operator to have rich experience and skill. For ordinary users, and especially for small aerial-photography aircraft, such control methods are too complicated; controlling the drone becomes difficult and the user interaction experience is poor.
Summary of the Invention
The present invention provides a control method and apparatus for a drone, and a drone, which lower the difficulty of controlling the drone and the operator's learning cost, and improve the user interaction experience.
In a first aspect, an embodiment of the present invention provides a control method for a drone, including:
acquiring an image containing a target pattern;
obtaining, from the image, pose information of the target pattern in a three-dimensional coordinate system; and
controlling the drone to track the target pattern according to the pose information.
In a second aspect, an embodiment of the present invention provides a control apparatus for a drone, including:
an acquisition module, configured to acquire an image containing a target pattern and to obtain, from the image, pose information of the target pattern in a three-dimensional coordinate system; and
a control module, configured to control the drone to track the target pattern according to the pose information.
In a third aspect, an embodiment of the present invention provides a control apparatus for a drone, including a memory and a processor;
the memory is configured to store program code; and
the processor calls the program code and, when the program code is executed, performs the following operations:
acquiring an image containing a target pattern;
obtaining, from the image, pose information of the target pattern in a three-dimensional coordinate system; and
controlling the drone to track the target pattern according to the pose information.
In a fourth aspect, an embodiment of the present invention provides a drone, including the control apparatus for a drone provided by the embodiments of the present invention.
The present invention provides a control method and apparatus for a drone, and a drone. By acquiring an image containing a target pattern, pose information of the target pattern in a three-dimensional coordinate system can be obtained from the image, and the drone can then be controlled to track the target pattern according to the pose information. A follow effect can be achieved simply with a specific pattern, so the operator no longer needs to control the drone through complex operations on professional control equipment such as remote-controller sticks. This lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic architectural diagram of an unmanned flight system according to an embodiment of the present invention;
FIG. 2 is a flowchart of a control method for a drone according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a target pattern and an image according to an embodiment of the present invention;
FIG. 4 is another schematic diagram of a target pattern and an image according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of controlling a drone to adjust its height according to an embodiment of the present invention;
FIG. 6 is another schematic diagram of controlling a drone to adjust its height according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of controlling a drone with two target patterns according to an embodiment of the present invention;
FIG. 8 is another schematic diagram of controlling a drone with two target patterns according to an embodiment of the present invention;
FIG. 9 is a flowchart of a control method for a drone according to another embodiment of the present invention;
FIG. 10 is a flowchart of a control method for a drone according to yet another embodiment of the present invention;
FIG. 11 is a schematic structural diagram of a control apparatus for a drone according to an embodiment of the present invention;
FIG. 12 is a schematic structural diagram of a control apparatus for a drone according to another embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The embodiments of the present invention provide a control method and apparatus for a drone, and a drone. It should be noted that the control method provided by the embodiments of the present invention is applicable not only to drones but also to other devices equipped with a camera, for example, driverless cars. The following description of the present invention uses a drone as an example.
FIG. 1 is a schematic architectural diagram of an unmanned flight system according to an embodiment of the present invention. This embodiment is described using a rotary-wing unmanned aerial vehicle as an example.
The unmanned flight system 100 may include an unmanned aerial vehicle 110. The unmanned aerial vehicle 110 may include a power system 150, a flight control system 160, and an airframe. Optionally, the unmanned flight system 100 may further include a gimbal 120. Optionally, the unmanned flight system 100 may further include a display device 130. The unmanned aerial vehicle 110 can communicate wirelessly with the display device 130.
The airframe may include a fuselage and landing gear (also called legs). The fuselage may include a center frame and one or more arms connected to the center frame, the one or more arms extending radially from the center frame. The landing gear is connected to the fuselage and supports the unmanned aerial vehicle 110 when it lands.
The power system 150 may include one or more electronic speed controllers (ESCs) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, where a motor 152 is connected between an ESC 151 and a propeller 153, and the motors 152 and the propellers 153 are arranged on the arms of the unmanned aerial vehicle 110. An ESC 151 receives a drive signal generated by the flight control system 160 and supplies a drive current to a motor 152 according to the drive signal to control the rotation speed of the motor 152. The motors 152 drive the propellers to rotate, thereby powering the flight of the unmanned aerial vehicle 110 and enabling it to move with one or more degrees of freedom. In some embodiments, the unmanned aerial vehicle 110 can rotate about one or more rotation axes. For example, the rotation axes may include the roll axis, the yaw axis, and the pitch axis. It should be understood that a motor 152 may be a DC motor or an AC motor, and may be brushless or brushed.
The flight control system 160 may include a flight controller 161 and a sensing system 162. The sensing system 162 measures the attitude information of the unmanned aerial vehicle, i.e., the position and state information of the unmanned aerial vehicle 110 in space, for example, its three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, three-dimensional angular velocity, and so on. The sensing system 162 may include, for example, at least one of a gyroscope, an ultrasonic sensor, an electronic compass, an inertial measurement unit (IMU), a vision sensor, a global navigation satellite system, and a barometer. For example, the global navigation satellite system may be the Global Positioning System (GPS). The flight controller 161 controls the flight of the unmanned aerial vehicle 110, for example, according to the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 can control the unmanned aerial vehicle 110 according to pre-programmed instructions, or according to captured images.
The gimbal 120 may include motors 122. The gimbal carries the imaging device 123. The flight controller 161 can control the movement of the gimbal 120 via the motors 122. Optionally, in another embodiment, the gimbal 120 may further include a controller for controlling the movement of the gimbal 120 by controlling the motors 122. It should be understood that the gimbal 120 may be independent of the unmanned aerial vehicle 110 or may be part of it. It should also be understood that the motors 122 may be DC or AC motors, and may be brushless or brushed, and that the gimbal may be located at the top or at the bottom of the unmanned aerial vehicle.
The imaging device 123 may be, for example, a camera or video camera for capturing images. The imaging device 123 can communicate with the flight controller and shoot under the flight controller's control, and the flight controller can also control the unmanned aerial vehicle 110 according to images captured by the imaging device 123. The imaging device 123 of this embodiment includes at least a photosensitive element, for example a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor. It can be understood that the imaging device 123 may also be fixed directly to the unmanned aerial vehicle 110, in which case the gimbal 120 can be omitted.
The display device 130 is located on the ground side of the unmanned flight system 100, can communicate wirelessly with the unmanned aerial vehicle 110, and can be used to display the attitude information of the unmanned aerial vehicle 110. In addition, the images captured by the imaging device can also be displayed on the display device 130. It should be understood that the display device 130 may be a device independent of the unmanned aerial vehicle 110.
It should be understood that the above naming of the components of the unmanned flight system is for identification purposes only and should not be construed as limiting the embodiments of the present invention.
The relevant concepts involved in the embodiments of the present invention are introduced below.
1) Image coordinate system
The image coordinate system is a two-dimensional plane, also called the image plane, and can be understood as the surface of the sensor in the imaging device. Each sensor has a certain physical size and a certain resolution, which determines the conversion between millimetres and pixels. The coordinates of a point in the image coordinate system can be expressed in pixels as (u, v) or in millimetres as (x, y). In other words, the image coordinate system can be divided into an image pixel coordinate system and an image physical coordinate system. The unit of the image pixel coordinate system is the pixel, and its two axes are called the U axis and the V axis. The unit of the image physical coordinate system is the millimetre, and its two axes are called the X axis and the Y axis.
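As a toy illustration of the pixel/millimetre relationship just described, the snippet below converts image pixel coordinates (u, v) into physical image-plane coordinates (x, y). The principal point and pixel pitch are illustrative assumptions, not values from this document; this is a minimal sketch, not part of the patented method.

```python
# Minimal sketch: pixel coordinates (u, v) -> physical image-plane
# coordinates (x, y) in millimetres. The principal point (cx, cy) and the
# pixel pitch are assumed example values.

def pixel_to_physical(u, v, cx=320.0, cy=240.0, pitch_mm=0.003):
    """cx, cy: principal point in pixels; pitch_mm: size of one pixel in mm."""
    x_mm = (u - cx) * pitch_mm
    y_mm = (v - cy) * pitch_mm
    return x_mm, y_mm

x_mm, y_mm = pixel_to_physical(420, 240)  # 100 pixels right of the centre
print(x_mm, y_mm)  # about 0.3 mm, 0.0 mm
```

With a different sensor, only the pixel pitch and principal point change; the conversion itself stays linear.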
2) Camera coordinate system
The camera coordinate system is a three-dimensional coordinate system. Its origin is the optical centre of the camera (lens); its X axis (also called the U axis) and Y axis (also called the V axis) are parallel to the X axis (U axis) and Y axis (V axis) of the image coordinate system, respectively; and its Z axis is the optical axis of the camera.
3) Geodetic (ground) coordinate system
The geodetic coordinate system is a three-dimensional coordinate system, also called the navigation coordinate system, the local horizontal coordinate system, or the North-East-Down (NED) coordinate system, and is usually used for navigation calculations.
In the geodetic coordinate system, the X axis points north, the Y axis points east, and the Z axis points toward the centre of the earth (down). The X and Y axes are tangent to the earth's surface.
4) Body coordinate system (body frame)
The body coordinate system (also called the body frame) is a three-dimensional coordinate system fixed to the drone's airframe. Its origin is at the centre of gravity of the aircraft. Its X axis points forward along the longitudinal axis of the airframe, i.e., toward the direction of travel of the nose. Its Y axis points to the right along the lateral axis of the airframe, i.e., from the origin toward the right side of the aircraft. The direction of the Z axis is determined from the X and Y axes by the right-hand rule.
5) Body horizontal coordinate system
The body horizontal coordinate system is a three-dimensional coordinate system. Its origin is at the centre of gravity of the aircraft. Its X axis points forward along the longitudinal axis of the airframe, i.e., toward the direction of travel of the nose. The positive direction of its Z axis points toward the centre of the earth. The direction of the Y axis is determined from the X and Z axes by the right-hand rule.
FIG. 2 is a flowchart of a control method for a drone according to Embodiment 1 of the present invention. The control method provided by this embodiment may be executed by a control apparatus for a drone, which may be arranged in the drone. As shown in FIG. 2, the control method provided by this embodiment may include:
S101: Acquire an image containing a target pattern.
Specifically, after the drone enters a pattern-follow mode, it can photograph its surroundings through an image acquisition device on the drone and acquire an image containing the target pattern. This embodiment does not limit the type of the image acquisition device; it may be, for example, the imaging device 123 shown in FIG. 1, or an image sensor on the drone.
The target pattern (marker) can be used to control the drone so that, in pattern-follow mode, the drone follows the target pattern. This embodiment does not limit the shape and size of the target pattern. Optionally, the target pattern may be a preset pattern, for example an easily recognised regular pattern such as a circle or a square, or a specific pattern such as a QR code. Optionally, the target pattern may be a human hand. The present invention does not limit the number of target patterns; in specific application scenarios, different numbers of target patterns enable different kinds of control over the drone.
This is illustrated with an example below.
FIG. 3 is a schematic diagram of a target pattern and an image according to Embodiment 1 of the present invention. As shown in FIG. 3, after entering pattern-follow mode the drone acquires an image 11. The image 11 contains a target pattern 12, specifically one circular dot. The dot can be fixed on a person's hand, so that moving the hand moves the dot.
S102: Obtain, from the image, pose information of the target pattern in a three-dimensional coordinate system.
This embodiment does not limit the type of the three-dimensional coordinate system. Optionally, the three-dimensional coordinate system may include at least one of the camera coordinate system, the geodetic coordinate system, and the body horizontal coordinate system. It should be noted that fixed conversion relationships exist among the camera, geodetic, and body horizontal coordinate systems; this embodiment does not limit these relationships, and existing conversions can be used. That is, once the pose information of the target pattern in any one of these coordinate systems is obtained, its pose information in the others can be obtained through the conversion relationships.
Optionally, if the pose information of the target pattern in the three-dimensional coordinate system is its pose information in the camera coordinate system, the method may further include:
converting the pose information of the target pattern in the camera coordinate system into pose information of the target pattern in the geodetic coordinate system.
Specifically, after the pose information of the target pattern in the camera coordinate system is obtained, it can be converted into the geodetic coordinate system according to the drone's own pose in the geodetic coordinate system and the positional and attitudinal relationship between the image acquisition device and the drone, yielding the pose information of the target pattern in the geodetic coordinate system.
For example, take obtaining the Z-axis component of the target pattern in the geodetic coordinate system. This component indicates the height of the target pattern above the ground. The drone can obtain its own height above the ground in the geodetic coordinate system. From the pose information of the target pattern in the camera coordinate system, the positional and attitudinal relationship between the image acquisition device and the drone, and the drone's own height above the ground, the height of the target pattern above the ground, i.e., its Z-axis component in the geodetic coordinate system, can be obtained.
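The conversion just described can be sketched as two rigid-body transforms: camera frame to body frame, then body frame to ground (NED) frame. The rotation and translation values below, and the axis conventions (camera z forward, body x forward, NED z down), are illustrative assumptions; a real system would use its calibrated camera-mount extrinsics and the drone's estimated pose.

```python
import numpy as np

# Sketch: marker position in camera frame -> ground (NED) frame.
# R_cb, t_cb: camera-to-body rotation/translation (mount calibration).
# R_bg, t_bg: body-to-ground rotation/translation (drone pose estimate).

def camera_to_ground(p_cam, R_cb, t_cb, R_bg, t_bg):
    p_body = R_cb @ p_cam + t_cb
    return R_bg @ p_body + t_bg

# Assumed example: forward-looking camera at the body origin (camera z maps
# to body x, camera x to body y, camera y to body z), drone level at 10 m
# altitude (NED z = -10), marker 5 m ahead on the optical axis.
R_cb = np.array([[0.0, 0.0, 1.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0]])
p_ground = camera_to_ground(np.array([0.0, 0.0, 5.0]), R_cb, np.zeros(3),
                            np.eye(3), np.array([0.0, 0.0, -10.0]))
print(p_ground)  # marker 5 m north of the drone, at the drone's 10 m altitude
```

The marker's height above ground is then just the negated NED z component of `p_ground`.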
The pose information of the target pattern in the three-dimensional coordinate system includes the position (translation) information of the target pattern in the three-dimensional coordinate system (which may be denoted T) and/or the attitude (rotation) information of the target pattern in the three-dimensional coordinate system (which may be denoted R). The position information can be expressed as coordinate values in the three-dimensional coordinate system. The attitude information indicates the degree of tilt and/or torsion of the target pattern in the three-dimensional coordinate system.
This is illustrated with a specific example below.
FIG. 4 is another schematic diagram of a target pattern and an image according to Embodiment 1 of the present invention. Refer to FIG. 3 and FIG. 4 together. The target patterns are both circular dots. In the scene shown in FIG. 3, the optical axis of the drone's image acquisition device may be perpendicular to the plane of the image 11. The position information of the target pattern 12 in the camera coordinate system can be expressed as (x1, y1, z1), and its attitude information as R1. In the scene shown in FIG. 4, there may be an angle between the optical axis of the drone's image acquisition device and the plane of the image 21. The position information of the target pattern 22 in the camera coordinate system can be expressed as (x2, y2, z2), and its attitude information as R2.
Optionally, obtaining the pose information of the target pattern in the three-dimensional coordinate system from the image may include:
obtaining, from the image, the pixel coordinate values of the center point of the target pattern in the image coordinate system; and
obtaining the pose information of the target pattern in the three-dimensional coordinate system from the pixel coordinate values.
In this implementation, the image is processed directly: the target pattern in the image is recognised, the pixel coordinate values of its center point in the image coordinate system are obtained, and the pose information of the target pattern in the three-dimensional coordinate system is obtained from the pixel coordinate values. Since the image is processed directly and the resolution of the original image is usually high, the accuracy of recognising the target pattern and of obtaining its pose information is improved.
This embodiment does not limit the method for processing the image to recognise the target pattern. For example, a detection algorithm, a template matching algorithm, or a convolutional neural network (CNN) algorithm can be used.
This embodiment does not limit the method for obtaining the pose information of the target pattern in the three-dimensional coordinate system from the pixel coordinate values; existing algorithms can be used. For example, the position information T and attitude information R of the target pattern in the camera coordinate system can be obtained with a Perspective-n-Point (PnP) algorithm. A PnP solver uses multiple pairs of matched 3D and 2D points to solve for the camera extrinsics by minimising the reprojection error, with known or unknown camera intrinsics. PnP solvers are commonly used in pose tracking.
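A full 6-DoF pose would come from a PnP solver such as OpenCV's solvePnP; that solver is not reproduced here. As a much simpler illustration of the pinhole geometry a PnP solver builds on, the depth of a fronto-parallel marker of known physical size can be estimated by similar triangles. The focal length and sizes are assumed example values.

```python
# Similar-triangles depth estimate for a fronto-parallel marker of known
# size (an illustration of pinhole geometry, NOT a PnP solver).

def marker_depth(focal_px, real_size_m, apparent_size_px):
    """Depth along the optical axis: z = f * S / s."""
    return focal_px * real_size_m / apparent_size_px

z = marker_depth(focal_px=600.0, real_size_m=0.20, apparent_size_px=60.0)
print(z)  # about 2.0 m
```

This single-measurement estimate degrades when the marker tilts away from the camera, which is exactly the case where a proper PnP solution over several marker points is needed.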
Optionally, obtaining the pose information of the target pattern in the three-dimensional coordinate system from the image may include:
downsampling the image to obtain a low-resolution picture;
determining a region of interest in the low-resolution picture, the region of interest containing the target pattern;
determining, in a high-resolution picture corresponding to the low-resolution picture, a to-be-processed area corresponding to the region of interest, the resolution of the high-resolution picture being higher than that of the low-resolution picture;
obtaining, from the to-be-processed area, the pixel coordinate values of the center point of the target pattern in the image coordinate system; and
obtaining the pose information of the target pattern in the three-dimensional coordinate system from the pixel coordinate values.
Specifically, the image is downsampled to obtain a low-resolution picture whose resolution is lower than that of the image. This embodiment does not limit the specific resolution of the low-resolution picture; it may be, for example, 640*480. The target pattern is recognised in the low-resolution picture and a region of interest (ROI) is determined, in which the target pattern may be located at the centre. Then, a to-be-processed area corresponding to the region of interest is determined in a high-resolution picture corresponding to the low-resolution picture, the high-resolution picture having a higher resolution than the low-resolution picture. Optionally, the high-resolution picture may be the image itself, or may be obtained by downsampling the image. From the to-be-processed area, the pixel coordinate values of the center point of the target pattern in the image coordinate system can be obtained, and then the pose information of the target pattern in the three-dimensional coordinate system can be obtained.
By downsampling the original image, the ROI containing the target pattern can be determined in the low-resolution picture, which reduces the amount of computation in recognising the target pattern and increases the processing speed.
Optionally, if the resolution of the image is not a standard resolution, the method may further include, before downsampling the image to obtain the low-resolution picture:
cropping the image according to the standard resolution.
Cropping the original image ensures that its resolution is a standard resolution and improves the data-processing effect. Here, a standard resolution is a resolution commonly set according to the image display size. This embodiment does not limit its specific value, which can be set as needed.
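The downsample/ROI steps above can be sketched as follows. Detection is stubbed out (the brightest pixel stands in for the marker detector), and the scale factor and ROI size are illustrative assumptions.

```python
import numpy as np

def find_roi_in_highres(img_hi, scale=4, roi_half=8):
    # 1) Downsample by striding (a real pipeline would low-pass filter first).
    img_lo = img_hi[::scale, ::scale]
    # 2) "Detect" the marker in the low-resolution picture (stub: argmax).
    r_lo, c_lo = np.unravel_index(np.argmax(img_lo), img_lo.shape)
    # 3) Map the ROI back to a to-be-processed area in the high-resolution picture.
    r_hi, c_hi = r_lo * scale, c_lo * scale
    half = roi_half * scale
    r0, c0 = max(r_hi - half, 0), max(c_hi - half, 0)
    return img_hi[r0:r_hi + half, c0:c_hi + half]

img = np.zeros((480, 640))
img[300, 200] = 1.0              # a bright "marker" pixel
region = find_roi_in_highres(img)
print(region.shape)              # a small window around the marker
```

Only the small returned window needs full-resolution processing, which is where the computational saving described above comes from.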
S103: Control the drone to track the target pattern according to the pose information.
Specifically, after the pose information of the target pattern in the three-dimensional coordinate system is obtained, the drone can be controlled to track the target pattern according to the pose information, thereby realising control of the drone. For example, the target pattern can control the flight height of the drone; or the moving distance or moving speed of the drone as it flies toward or away from the target pattern; or the rotation angle or rotational angular velocity of the drone as it orbits the target pattern; and so on.
It can be seen that, with the control method provided by this embodiment, by acquiring an image containing a target pattern, the pose information of the target pattern in a three-dimensional coordinate system can be obtained from the image, and the drone can then be controlled to track the target pattern according to the pose information. Since a follow effect can be achieved simply with a specific pattern, the operator no longer needs to control the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
Optionally, before acquiring the image containing the target pattern, the control method provided by this embodiment may further include:
determining, among a plurality of preset exposure parameters, a first exposure parameter to be used for shooting.
Specifically, this step can be applied in the initialisation stage after the drone enters pattern-follow mode. During initialisation, the preset exposure parameters can be polled in turn; that is, different pictures containing the target pattern are captured with different exposure parameters. By detecting each picture separately, the first exposure parameter can be determined as the one corresponding to the picture with the shortest exposure time in which the target pattern can still be recognised. Thereafter, images containing the target pattern can be acquired with the first exposure parameter.
It should be noted that this embodiment does not limit the plurality of preset exposure parameters.
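The initialisation step above can be sketched as follows; `capture` and `detect` are hypothetical stand-ins for the real shooting and marker-detection routines, and the exposure list is an assumed example.

```python
def pick_exposure(exposures_ms, capture, detect):
    """Return the shortest preset exposure whose frame still yields a detection."""
    usable = [e for e in exposures_ms if detect(capture(e))]
    return min(usable) if usable else None

# Toy model: the marker is only detected at exposures of 4 ms and longer.
capture = lambda e: e
detect = lambda frame: frame >= 4
print(pick_exposure([1, 2, 4, 8, 16], capture, detect))  # -> 4
```

Preferring the shortest usable exposure keeps motion blur low, which matters when the marker is held in a moving hand.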
In some embodiments, the three-dimensional coordinate system may include the camera coordinate system. The pose information of the target pattern in the three-dimensional coordinate system may include the U-axis component, V-axis component, and Z-axis component of the center point of the target pattern in the camera coordinate system.
The U-axis component may also be called the X-axis component, and the V-axis component may also be called the Y-axis component.
Optionally, in one implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
controlling the moving speed of the drone in the Z-axis direction of the body coordinate system according to the V-axis component of the center point of the target pattern in the camera coordinate system.
Specifically, the desired height of the drone can be determined from the V-axis component of the center point of the target pattern in the camera coordinate system, so that the moving speed of the drone in the Z-axis direction of the body coordinate system can be controlled to adjust the drone's height. The desired height of the drone may be the same as the height of the target pattern; or the drone may be above the target pattern, with a first preset value between its desired height and the target pattern; or the drone may be below the target pattern, with a second preset value between its desired height and the target pattern. This embodiment does not limit the specific values of the first and second preset values.
It should be noted that, in this embodiment, controlling the moving speed of the drone in the Z-axis direction of the body coordinate system according to its desired height can use existing implementations, which this embodiment does not limit and which differ according to the motion model of the drone; for example, the drone can move at constant speed, with acceleration, and so on.
A specific example is described in detail below.
FIG. 5 is a schematic diagram of controlling a drone to adjust its height according to an embodiment of the present invention. As shown in FIG. 5, the left side shows the image 31 containing the target pattern 32 and the axes (u, v, z) of the camera coordinate system; the right side shows the current height of the drone 33 and the axes (X, Y, Z) of the body coordinate system. The U-axis, V-axis, and Z-axis components of the center point of the target pattern 32 in the camera coordinate system are u1, v1, and z1, respectively. If the desired height of the drone 33 is the same as that of the target pattern 32, the desired height of the drone 33 may be height A (v1). If the drone 33 is above the target pattern 32 and its desired height differs from the target pattern 32 by a first preset value (d1), the desired height of the drone 33 may be height B (v1 + d1). Thus, the moving speed v of the drone in the Z-axis direction of the body coordinate system can be controlled according to the V-axis component of the center point of the target pattern in the camera coordinate system.
By controlling the moving speed of the drone in the Z-axis direction of the body coordinate system through the V-axis component of the center point of the target pattern in the camera coordinate system, the operator no longer needs to control the flight height of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
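The document deliberately leaves the speed law open. One minimal choice, shown here purely as an assumed sketch, is a proportional controller with a speed limit: the vertical speed command is proportional to the gap between the desired height (marker height plus an offset such as d1) and the current height.

```python
def climb_speed(marker_height, current_height, offset=0.0, kp=0.8, v_max=2.0):
    """Proportional vertical-speed command, clamped to +/- v_max (assumed gains)."""
    error = (marker_height + offset) - current_height
    return max(-v_max, min(v_max, kp * error))

print(climb_speed(marker_height=12.0, current_height=10.0))  # climb at 1.6 m/s
```

The same proportional pattern applies unchanged to the forward/backward and left/right channels described next; only the measured component and the commanded axis differ.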
Optionally, in another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
controlling the moving speed of the drone in the X-axis direction of the body coordinate system according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
Specifically, the desired forward/backward position of the drone can be determined from the Z-axis component of the center point of the target pattern in the camera coordinate system, so that the moving speed of the drone in the X-axis direction of the body coordinate system can be controlled to adjust the distance between the drone and the target pattern. The desired distance between the drone and the target pattern may be a third preset value, whose specific value this embodiment does not limit.
It should be noted that, in this embodiment, controlling the moving speed of the drone in the X-axis direction of the body coordinate system according to its desired forward/backward position can use existing implementations, which this embodiment does not limit and which differ according to the motion model of the drone; for example, the drone can move at constant speed, with acceleration, and so on.
A specific example is described in detail below.
The pose information of the target pattern in the camera coordinate system can be seen in FIG. 5. The Z-axis component of the center point of the target pattern 32 in the camera coordinate system is z1. If the desired distance between the drone 33 and the target pattern 32 is P (P > 0), the desired position of the drone 33 may be z1 + P, from which the moving speed of the drone 33 in the X-axis direction of the body coordinate system can be controlled.
By controlling the moving speed of the drone in the X-axis direction of the body coordinate system through the Z-axis component of the center point of the target pattern in the camera coordinate system, the operator no longer needs to control the distance of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
Optionally, in yet another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
controlling the moving speed of the drone in the Y-axis direction of the body coordinate system according to the U-axis component of the center point of the target pattern in the camera coordinate system.
Specifically, the desired left/right position of the drone can be determined from the U-axis component of the center point of the target pattern in the camera coordinate system, so that the moving speed of the drone in the Y-axis direction of the body coordinate system can be controlled to adjust the drone's left/right movement. The drone may face the target pattern directly, i.e., it moves left or right until the target pattern lies on the X axis of the body coordinate system; or the drone may be to the left of the target pattern, at a fourth preset value from it; or to the right of the target pattern, at a fifth preset value from it. This embodiment does not limit the specific values of the fourth and fifth preset values.
It should be noted that, in this embodiment, controlling the moving speed of the drone in the Y-axis direction of the body coordinate system according to its desired left/right position can use existing implementations, which this embodiment does not limit and which differ according to the motion model of the drone; for example, the drone can move at constant speed, with acceleration, and so on.
A specific example is described in detail below.
The pose information of the target pattern in the camera coordinate system can be seen in FIG. 5. The U-axis component of the center point of the target pattern 32 in the camera coordinate system is u1. If the drone 33 faces the target pattern 32 directly, the desired position of the drone 33 is u1. If the drone 33 is to the left of the target pattern 32 and the desired distance between them is Q (Q > 0), the desired position of the drone 33 is u1 - Q; if the drone 33 is to the right of the target pattern 32 and the desired distance between them is Q, the desired position of the drone 33 is u1 + Q. Thus, the moving speed of the drone 33 in the Y-axis direction of the body coordinate system can be controlled according to the U-axis component of the center point of the target pattern in the camera coordinate system.
By controlling the moving speed of the drone in the Y-axis direction of the body coordinate system through the U-axis component of the center point of the target pattern in the camera coordinate system, the operator no longer needs to control the left/right movement of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
In some embodiments, the three-dimensional coordinate system may include the geodetic coordinate system. The pose information of the target pattern in the three-dimensional coordinate system may include the X-axis, Y-axis, and Z-axis components of the center point of the target pattern in the geodetic coordinate system, as well as the attitude information of the target pattern in the geodetic coordinate system.
Optionally, in yet another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
controlling the moving speed of the drone in the Z-axis direction of the body coordinate system according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
Specifically, the Z-axis component of the center point of the target pattern in the geodetic coordinate system indicates the height of the target pattern above the ground. The desired height of the drone can be determined from this component, so that the moving speed of the drone in the Z-axis direction of the body coordinate system can be controlled to adjust the drone's height. The desired height of the drone may be the same as the height of the target pattern; or the drone may be above the target pattern, with the first preset value between its desired height and the target pattern; or the drone may be below the target pattern, with the second preset value between its desired height and the target pattern. The first and second preset values are as described above and are not repeated here.
It should be noted that, in this embodiment, controlling the moving speed of the drone in the Z-axis direction of the body coordinate system according to its desired height can use existing implementations, as described above and not repeated here.
A specific example is described in detail below.
FIG. 6 is another schematic diagram of controlling a drone to adjust its height according to an embodiment of the present invention. As shown in FIG. 6, the left side shows the target pattern 42 and the axes (x, y, z) of the geodetic coordinate system; the right side shows the current height of the drone 43 and the axes (X, Y, Z) of the body coordinate system. The X-axis, Y-axis, and Z-axis components of the center point of the target pattern 42 in the geodetic coordinate system are x3, y3, and z3, respectively. If the desired height of the drone 43 is the same as that of the target pattern 42, the desired height of the drone 43 may be height C (y3). If the drone 43 is above the target pattern 42 and its desired height differs from the target pattern 42 by the first preset value (d3), the desired height of the drone 43 may be height D (y3 + d3). Thus, the moving speed v of the drone 43 in the Z-axis direction of the body coordinate system can be controlled according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
By controlling the moving speed of the drone in the Z-axis direction of the body coordinate system through the Z-axis component of the center point of the target pattern in the geodetic coordinate system, the operator no longer needs to control the flight height of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
Optionally, in yet another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
determining the distance between the drone and the target pattern according to the X-axis and Y-axis components of the center point of the target pattern in the geodetic coordinate system; and
controlling the moving speed of the drone in the X-axis direction of the body coordinate system according to the distance.
Specifically, the desired position between the drone and the target pattern can be determined from the X-axis and Y-axis components of the center point of the target pattern in the geodetic coordinate system, so that the moving speed of the drone in the X-axis direction of the body coordinate system can be controlled to adjust the distance between the drone and the target pattern.
By determining the distance between the drone and the target pattern from the X-axis and Y-axis components of the center point of the target pattern in the geodetic coordinate system and controlling the moving speed of the drone in the X-axis direction of the body coordinate system according to the distance, the operator no longer needs to control the distance of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
Optionally, in yet another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
determining a heading offset angle according to the attitude information of the target pattern in the geodetic coordinate system, the heading offset angle being the angle between the headings of the drone in the body horizontal coordinate system and in the local horizontal coordinate system; and
controlling the moving speed of the drone in the Y-axis direction of the body coordinate system according to the heading offset angle.
Specifically, the attitude information of the target pattern in the geodetic coordinate system can be decomposed into tilt information and torsion information, from which the heading offset angle and the desired left/right displacement of the drone can be determined, so that the moving speed of the drone in the Y-axis direction of the body coordinate system can be controlled to adjust the drone's left/right movement.
By determining the heading offset angle from the attitude information of the target pattern in the geodetic coordinate system and controlling the moving speed of the drone in the Y-axis direction of the body coordinate system according to the heading offset angle, the operator no longer needs to control the left/right movement of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
In some embodiments, the three-dimensional coordinate system may include the body horizontal coordinate system. The pose information of the target pattern in the three-dimensional coordinate system may include the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system.
Optionally, in yet another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
determining the angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system; and
controlling the angular velocity of the drone on the yaw axis in the body coordinate system according to the angle.
In this implementation, the drone can be controlled to rotate around the target pattern, thereby tracking the target pattern. Specifically, the angle between the target pattern and the yaw direction of the drone in the body coordinate system, together with the desired position of the drone, can be determined from the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system, and the angular velocity of the drone on the yaw axis in the body coordinate system can be controlled according to the angle.
By determining the angle between the target pattern and the yaw direction of the drone in the body coordinate system from the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system, and controlling the angular velocity of the drone on the yaw axis in the body coordinate system according to the angle, the operator no longer needs to control the movement of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
Optionally, in yet another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
determining, on the planned route, the angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system;
determining the tangential linear velocity of the drone on the circular path around the target pattern according to the angle; and
controlling the angular velocity of the drone on the yaw axis in the body coordinate system according to the tangential linear velocity.
In this implementation, the drone can be controlled to rotate around the target pattern, thereby tracking the target pattern. Specifically, the angle between the target pattern and the yaw direction of the drone in the body coordinate system can be determined from the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system; the tangential linear velocity of the drone on the circular path around the target pattern can be determined from the angle; and the yaw-axis angular velocity can be controlled from the tangential linear velocity and the arc radius.
By determining the tangential linear velocity of the drone on the circular path around the target pattern and controlling the angular velocity of the drone on the yaw axis in the body coordinate system according to the tangential linear velocity, the operator no longer needs to control the movement of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
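For the orbit case, once the tangential linear speed and the arc radius are known, the yaw-axis angular velocity follows from uniform circular motion, omega = v / r. The sketch below uses illustrative assumed numbers.

```python
import math

def yaw_rate(tangential_speed, radius):
    """Angular velocity (rad/s) for uniform circular motion: omega = v / r."""
    return tangential_speed / radius

omega = yaw_rate(tangential_speed=2.0, radius=10.0)
print(round(math.degrees(omega), 2))  # -> 11.46
```

For a fixed tangential speed, a tighter orbit radius therefore demands a proportionally faster yaw rate.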
In some embodiments, there may be two target patterns. The three-dimensional coordinate system may include the camera coordinate system, and the pose information may include the X-axis, Y-axis, and Z-axis components of the center points of the two target patterns in the camera coordinate system, respectively.
Optionally, in yet another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
controlling the yaw angle of rotation, in the body coordinate system, of the drone or of the gimbal on the drone according to the X-axis and Y-axis components of the center points of the two target patterns in the camera coordinate system.
This is illustrated with a specific example below.
FIG. 7 is a schematic diagram of controlling a drone with two target patterns according to an embodiment of the present invention. As shown in FIG. 7, the left side shows an image 51 containing two target patterns: the target pattern 52 and the target pattern 53. The middle of FIG. 7 is a top view of the current scene, showing the positional relationship among the target pattern 52, the target pattern 53, and the drone 54. The position of the center point of the target pattern 52 in the camera coordinate system is denoted P1, with coordinate value P1 = [x1 y1 z1]^T, where x1, y1, and z1 are the X-axis, Y-axis, and Z-axis components of the target pattern 52 in the camera coordinate system. The position of the center point of the target pattern 53 in the camera coordinate system is denoted P2, with coordinate value P2 = [x2 y2 z2]^T, where x2, y2, and z2 are the X-axis, Y-axis, and Z-axis components of the target pattern 53 in the camera coordinate system. According to the target patterns 52 and 53, the drone 54, or the gimbal 55 on the drone 54, can be controlled to rotate in the horizontal direction so that the drone 54 or the gimbal 55 is perpendicular to the line connecting P1 and P2. For example, the right side of FIG. 7 is a top view of the scene after the drone has rotated, showing the positional relationship among the target pattern 52, the target pattern 53, and the drone 54.
The angle θ through which the drone, or the gimbal on the drone, needs to rotate can be determined by the following formula. Mapped into the geodetic coordinate system, this angle θ is the heading change (yaw angle) of the drone, or the change in orientation of the drone's gimbal in the geodetic coordinate system:
θ = arctan((y1 - y2) / (x1 - x2))
It should be noted that this embodiment does not limit the order of the subtractions in the above formula; that is, the formula can also be expressed as:
θ = arctan((y2 - y1) / (x2 - x1))
By controlling the yaw angle of rotation, in the body coordinate system, of the drone or of the gimbal on the drone according to the X-axis and Y-axis components of the center points of the two target patterns in the camera coordinate system, the operator no longer needs to control the rotation of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
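Under the reconstruction of the yaw formula above (θ as the arctangent of the ratio of the y- and x-differences of the two marker centres), a minimal sketch follows. The marker coordinates are assumed example values, and the check confirms that swapping the subtraction order in both numerator and denominator leaves the angle unchanged, as the text notes.

```python
import math

def yaw_angle(p1, p2):
    """theta = arctan((y1 - y2) / (x1 - x2)) for marker centres p1, p2."""
    x1, y1, _ = p1
    x2, y2, _ = p2
    return math.atan((y1 - y2) / (x1 - x2))

p1, p2 = (1.0, 0.5, 4.0), (-1.0, -0.5, 4.0)
assert math.isclose(yaw_angle(p1, p2), yaw_angle(p2, p1))  # order-independent
print(round(math.degrees(yaw_angle(p1, p2)), 2))  # -> 26.57
```

A production implementation would guard against x1 == x2 (markers vertically aligned in the image), where the ratio is undefined.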
Optionally, in yet another implementation, controlling the drone to track the target pattern according to the pose information in S103 may include:
controlling the pitch angle of rotation, in the body coordinate system, of the gimbal on the drone according to the X-axis and Z-axis components of the center points of the two target patterns in the camera coordinate system.
This is illustrated with a specific example below.
FIG. 8 is another schematic diagram of controlling a drone with two target patterns according to an embodiment of the present invention. As shown in FIG. 8, the left side shows an image 61 containing two target patterns: the target pattern 62 and the target pattern 63. The middle of FIG. 8 is a top view of the current scene, showing the positional relationship among the target pattern 62, the target pattern 63, and the drone 64. The position of the center point of the target pattern 62 in the camera coordinate system is denoted Q1, with coordinate value Q1 = [x1 y1 z1]^T, where x1, y1, and z1 are the X-axis, Y-axis, and Z-axis components of the target pattern 62 in the camera coordinate system. The position of the center point of the target pattern 63 in the camera coordinate system is denoted Q2, with coordinate value Q2 = [x2 y2 z2]^T, where x2, y2, and z2 are the X-axis, Y-axis, and Z-axis components of the target pattern 63 in the camera coordinate system. According to the target patterns 62 and 63, the gimbal 65 on the drone 64 can be controlled to rotate in the vertical direction so that the gimbal 65 is perpendicular to the line connecting Q1 and Q2. The right side of FIG. 8 is a view of the scene after the gimbal on the drone has rotated, showing the positional relationship among the target pattern 62, the target pattern 63, the drone 64, and the gimbal 65.
The angle α through which the gimbal on the drone needs to rotate can be determined by the following formula. Mapped into the geodetic coordinate system, this angle α is the pitch angle of rotation of the gimbal on the drone in the body coordinate system:
α = arctan((z1 - z2) / (x1 - x2))
It should be noted that this embodiment does not limit the order of the subtractions in the above formula; that is, the formula can also be expressed as:
α = arctan((z2 - z1) / (x2 - x1))
It should be noted that, in the above implementations, controlling the yaw angle of rotation of the drone or of the drone's gimbal in the body coordinate system with two target patterns, and controlling the pitch angle of rotation of the gimbal on the drone in the body coordinate system, can be combined and performed simultaneously.
By controlling the pitch angle of rotation, in the body coordinate system, of the gimbal on the drone according to the X-axis and Z-axis components of the center points of the two target patterns in the camera coordinate system, the operator no longer needs to control the rotation of the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
This embodiment provides a control method for a drone, including: acquiring an image containing a target pattern; obtaining, from the image, pose information of the target pattern in a three-dimensional coordinate system; and controlling the drone to track the target pattern according to the pose information. With the control method provided by this embodiment, a follow effect can be achieved simply with a specific pattern, so the operator no longer needs to control the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
FIG. 9 is a flowchart of a control method for a drone according to another embodiment of the present invention. On the basis of the above embodiments, this embodiment provides a specific implementation of the control method. As shown in FIG. 9, the control method provided by this embodiment may include:
S201: Acquire an image containing a target pattern.
S202: Downsample the image to obtain a low-resolution picture.
S203: Determine an ROI in the low-resolution picture.
The ROI contains the target pattern.
S204: Determine, in a high-resolution picture corresponding to the low-resolution picture, a to-be-processed area corresponding to the region of interest.
The resolution of the high-resolution picture is higher than that of the low-resolution picture.
S205: Obtain, from the to-be-processed area, the pose information of the target pattern in the camera coordinate system.
Specifically, the pixel coordinate values of the center point of the target pattern in the image coordinate system are obtained from the to-be-processed area, and the pose information of the target pattern in the three-dimensional coordinate system is obtained from the pixel coordinate values.
The pose information of the target pattern in the camera coordinate system may include the U-axis, V-axis, and Z-axis components of the center point of the target pattern in the camera coordinate system.
S206: Control the moving speed of the drone in the Z-axis direction of the body coordinate system according to the V-axis component of the center point of the target pattern in the camera coordinate system.
S207: Control the moving speed of the drone in the X-axis direction of the body coordinate system according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
S208: Control the moving speed of the drone in the Y-axis direction of the body coordinate system according to the U-axis component of the center point of the target pattern in the camera coordinate system.
For S201 to S208, refer to the descriptions in the above embodiments; the principles are similar and are not repeated here.
This embodiment provides a control method for a drone with which a follow effect can be achieved simply with a specific pattern, so the operator no longer needs to control the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
In another embodiment, another specific implementation of the control method for a drone is provided. Referring to FIG. 9, the control method provided by this embodiment may include S201 to S207, for which refer to the descriptions in the above embodiments, not repeated here. The control method provided by this embodiment may further include:
S209: Obtain, from the to-be-processed area, the pose information of the target pattern in the body horizontal coordinate system.
The pose information of the target pattern in the body horizontal coordinate system may include the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system.
S210: Determine the angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system.
S211: Control the angular velocity of the drone on the yaw axis in the body coordinate system according to the angle.
For S209 to S211, refer to the descriptions in the above embodiments; the principles are similar and are not repeated here.
This embodiment does not limit the execution order between S205 to S207 and S209 to S211.
This embodiment provides a control method for a drone with which a follow effect can be achieved simply with a specific pattern, so the operator no longer needs to control the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
FIG. 10 is a flowchart of a control method for a drone according to yet another embodiment of the present invention. On the basis of the above embodiments, this embodiment provides yet another specific implementation of the control method. As shown in FIG. 10, the control method provided by this embodiment may include:
S301: Acquire an image containing a target pattern.
S302: Downsample the image to obtain a low-resolution picture.
S303: Determine an ROI in the low-resolution picture.
The ROI contains the target pattern.
S304: Determine, in a high-resolution picture corresponding to the low-resolution picture, a to-be-processed area corresponding to the region of interest.
The resolution of the high-resolution picture is higher than that of the low-resolution picture.
S305: Obtain, from the to-be-processed area, the pose information of the target pattern in the camera coordinate system.
Specifically, the pixel coordinate values of the center point of the target pattern in the image coordinate system are obtained from the to-be-processed area, and the pose information of the target pattern in the three-dimensional coordinate system is obtained from the pixel coordinate values.
S306: Convert the pose information of the target pattern in the camera coordinate system into pose information of the target pattern in the geodetic coordinate system.
The pose information of the target pattern in the geodetic coordinate system may include the X-axis, Y-axis, and Z-axis components of the center point of the target pattern in the geodetic coordinate system, as well as the attitude information of the target pattern in the geodetic coordinate system.
S307: Control the moving speed of the drone in the Z-axis direction of the body coordinate system according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
S308: Control the moving speed of the drone in the X-axis direction of the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the geodetic coordinate system.
Specifically, the distance between the drone and the target pattern is determined according to the X-axis and Y-axis components of the center point of the target pattern in the geodetic coordinate system, and the moving speed of the drone in the X-axis direction of the body coordinate system is controlled according to the distance.
S309: Control the moving speed of the drone in the Y-axis direction of the body coordinate system according to the attitude information of the target pattern in the geodetic coordinate system.
Specifically, a heading offset angle is determined according to the attitude information of the target pattern in the geodetic coordinate system, the heading offset angle being the angle between the headings of the drone in the body horizontal coordinate system and in the local horizontal coordinate system; and the moving speed of the drone in the Y-axis direction of the body coordinate system is controlled according to the heading offset angle.
For S301 to S309, refer to the descriptions in the above embodiments; the principles are similar and are not repeated here.
This embodiment provides a control method for a drone with which a follow effect can be achieved simply with a specific pattern, so the operator no longer needs to control the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
In yet another embodiment, another specific implementation of the control method for a drone is provided. Referring to FIG. 10, the control method provided by this embodiment may include S301 to S308, for which refer to the descriptions in the above embodiments, not repeated here. The control method provided by this embodiment may further include:
S310: Obtain, from the to-be-processed area, the pose information of the target pattern in the body horizontal coordinate system.
The pose information of the target pattern in the body horizontal coordinate system may include the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system.
S311: Determine the angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system.
S312: Control the angular velocity of the drone on the yaw axis in the body coordinate system according to the angle.
For S310 to S312, refer to the descriptions in the above embodiments; the principles are similar and are not repeated here.
This embodiment does not limit the execution order between S305 to S308 and S310 to S312.
This embodiment provides a control method for a drone with which a follow effect can be achieved simply with a specific pattern, so the operator no longer needs to control the drone through complex operations on professional control equipment such as remote-controller sticks, which lowers the difficulty of controlling the drone and the operator's learning cost, and improves the user interaction experience.
FIG. 11 is a schematic structural diagram of a control apparatus for a drone according to an embodiment of the present invention. The control apparatus provided by this embodiment is used to execute the control method for a drone provided by the method embodiments of the present invention. As shown in FIG. 11, the control apparatus provided by this embodiment may include:
an acquisition module 71, configured to acquire an image containing a target pattern and to obtain, from the image, pose information of the target pattern in a three-dimensional coordinate system; and
a control module 72, configured to control the drone to track the target pattern according to the pose information.
Optionally, the three-dimensional coordinate system includes the camera coordinate system, and the pose information includes the U-axis, V-axis, and Z-axis components of the center point of the target pattern in the camera coordinate system.
Optionally, the control module 72 is specifically configured to:
control the moving speed of the drone in the Z-axis direction of the body coordinate system according to the V-axis component of the center point of the target pattern in the camera coordinate system.
Optionally, the control module 72 is specifically configured to:
control the moving speed of the drone in the X-axis direction of the body coordinate system according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
Optionally, the control module 72 is specifically configured to:
control the moving speed of the drone in the Y-axis direction of the body coordinate system according to the U-axis component of the center point of the target pattern in the camera coordinate system.
Optionally, the three-dimensional coordinate system includes the geodetic coordinate system, and the pose information includes the X-axis, Y-axis, and Z-axis components of the center point of the target pattern in the geodetic coordinate system, as well as the attitude information of the target pattern in the geodetic coordinate system.
Optionally, the control module 72 is specifically configured to:
control the moving speed of the drone in the Z-axis direction of the body coordinate system according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
Optionally, the control module 72 is specifically configured to:
determine the distance between the drone and the target pattern according to the X-axis and Y-axis components of the center point of the target pattern in the geodetic coordinate system; and
control the moving speed of the drone in the X-axis direction of the body coordinate system according to the distance.
Optionally, the control module 72 is specifically configured to:
determine a heading offset angle according to the attitude information of the target pattern in the geodetic coordinate system, the heading offset angle being the angle between the headings of the drone in the body horizontal coordinate system and in the local horizontal coordinate system; and
control the moving speed of the drone in the Y-axis direction of the body coordinate system according to the heading offset angle.
Optionally, the three-dimensional coordinate system further includes the body horizontal coordinate system, and the pose information includes the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system.
Optionally, the control module 72 is specifically configured to:
determine the angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system; and
control the angular velocity of the drone on the yaw axis in the body coordinate system according to the angle.
Optionally, the control module 72 is specifically configured to:
determine, on the planned route, the angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system;
determine the tangential linear velocity of the drone on the circular path around the target pattern according to the angle; and
control the angular velocity of the drone on the yaw axis in the body coordinate system according to the tangential linear velocity.
Optionally, there are two target patterns, the three-dimensional coordinate system is the camera coordinate system, and the pose information includes the X-axis, Y-axis, and Z-axis components of the center points of the two target patterns in the camera coordinate system, respectively.
Optionally, the control module 72 is specifically configured to:
control the yaw angle of rotation, in the body coordinate system, of the drone or of the gimbal on the drone according to the X-axis and Y-axis components of the center points of the two target patterns in the camera coordinate system.
Optionally, the yaw angle θ is
θ = arctan((y1 - y2) / (x1 - x2))
where the two target patterns include a first target pattern and a second target pattern, x1 is the X-axis component of the first target pattern in the camera coordinate system, y1 is the Y-axis component of the first target pattern in the camera coordinate system, x2 is the X-axis component of the second target pattern in the camera coordinate system, and y2 is the Y-axis component of the second target pattern in the camera coordinate system.
Optionally, the control module 72 is specifically configured to:
control the pitch angle of rotation, in the body coordinate system, of the gimbal on the drone according to the X-axis and Z-axis components of the center points of the two target patterns in the camera coordinate system.
Optionally, the pitch angle α is
α = arctan((z1 - z2) / (x1 - x2))
where the two target patterns include a first target pattern and a second target pattern, x1 is the X-axis component of the first target pattern in the camera coordinate system, z1 is the Z-axis component of the first target pattern in the camera coordinate system, x2 is the X-axis component of the second target pattern in the camera coordinate system, and z2 is the Z-axis component of the second target pattern in the camera coordinate system.
Optionally, the control module 72 is further configured to:
determine, among a plurality of preset exposure parameters, a first exposure parameter to be used for shooting.
Optionally, the acquisition module 71 is specifically configured to:
downsample the image to obtain a low-resolution picture;
determine a region of interest in the low-resolution picture, the region of interest containing the target pattern;
determine, in a high-resolution picture corresponding to the low-resolution picture, a to-be-processed area corresponding to the region of interest, the resolution of the high-resolution picture being higher than that of the low-resolution picture;
obtain, from the to-be-processed area, the pixel coordinate values of the center point of the target pattern in the image coordinate system; and
obtain the pose information of the target pattern in the three-dimensional coordinate system from the pixel coordinate values.
The control apparatus for a drone provided by this embodiment is used to execute the control method for a drone provided by the method embodiments of the present invention; the principles are similar and are not repeated here.
FIG. 12 is a schematic structural diagram of a control apparatus for a drone according to another embodiment of the present invention. The control apparatus provided by this embodiment is used to execute the control method for a drone provided by the method embodiments of the present invention. As shown in FIG. 12, the control apparatus provided by this embodiment may include: a memory 81 and a processor 82.
The memory 81 is configured to store program code.
The processor 82 calls the program code and, when the program code is executed, performs the following operations:
acquiring an image containing a target pattern;
obtaining, from the image, pose information of the target pattern in a three-dimensional coordinate system; and
controlling the drone to track the target pattern according to the pose information.
Optionally, the three-dimensional coordinate system includes the camera coordinate system, and the pose information includes the U-axis, V-axis, and Z-axis components of the center point of the target pattern in the camera coordinate system.
Optionally, the processor 82 is specifically configured to:
control the moving speed of the drone in the Z-axis direction of the body coordinate system according to the V-axis component of the center point of the target pattern in the camera coordinate system.
Optionally, the processor 82 is specifically configured to:
control the moving speed of the drone in the X-axis direction of the body coordinate system according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
Optionally, the processor 82 is specifically configured to:
control the moving speed of the drone in the Y-axis direction of the body coordinate system according to the U-axis component of the center point of the target pattern in the camera coordinate system.
Optionally, the three-dimensional coordinate system includes the geodetic coordinate system, and the pose information includes the X-axis, Y-axis, and Z-axis components of the center point of the target pattern in the geodetic coordinate system, as well as the attitude information of the target pattern in the geodetic coordinate system.
Optionally, the processor 82 is specifically configured to:
control the moving speed of the drone in the Z-axis direction of the body coordinate system according to the Z-axis component of the center point of the target pattern in the geodetic coordinate system.
Optionally, the processor 82 is specifically configured to:
determine the distance between the drone and the target pattern according to the X-axis and Y-axis components of the center point of the target pattern in the geodetic coordinate system; and
control the moving speed of the drone in the X-axis direction of the body coordinate system according to the distance.
Optionally, the processor 82 is specifically configured to:
determine a heading offset angle according to the attitude information of the target pattern in the geodetic coordinate system, the heading offset angle being the angle between the headings of the drone in the body horizontal coordinate system and in the local horizontal coordinate system; and
control the moving speed of the drone in the Y-axis direction of the body coordinate system according to the heading offset angle.
Optionally, the three-dimensional coordinate system further includes the body horizontal coordinate system, and the pose information includes the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system.
Optionally, the processor 82 is specifically configured to:
determine the angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system; and
control the angular velocity of the drone on the yaw axis in the body coordinate system according to the angle.
Optionally, the processor 82 is specifically configured to:
determine, on the planned route, the angle between the target pattern and the yaw direction of the drone in the body coordinate system according to the X-axis and Y-axis components of the center point of the target pattern in the body horizontal coordinate system;
determine the tangential linear velocity of the drone on the circular path around the target pattern according to the angle; and
control the angular velocity of the drone on the yaw axis in the body coordinate system according to the tangential linear velocity.
Optionally, there are two target patterns, the three-dimensional coordinate system is the camera coordinate system, and the pose information includes the X-axis, Y-axis, and Z-axis components of the center points of the two target patterns in the camera coordinate system, respectively.
Optionally, the processor 82 is specifically configured to:
control the yaw angle of rotation, in the body coordinate system, of the drone or of the gimbal on the drone according to the X-axis and Y-axis components of the center points of the two target patterns in the camera coordinate system.
Optionally, the yaw angle θ is
θ = arctan((y1 - y2) / (x1 - x2))
where the two target patterns include a first target pattern and a second target pattern, x1 is the X-axis component of the first target pattern in the camera coordinate system, y1 is the Y-axis component of the first target pattern in the camera coordinate system, x2 is the X-axis component of the second target pattern in the camera coordinate system, and y2 is the Y-axis component of the second target pattern in the camera coordinate system.
Optionally, the processor 82 is specifically configured to:
control the pitch angle of rotation, in the body coordinate system, of the gimbal on the drone according to the X-axis and Z-axis components of the center points of the two target patterns in the camera coordinate system.
Optionally, the pitch angle α is
α = arctan((z1 - z2) / (x1 - x2))
where the two target patterns include a first target pattern and a second target pattern, x1 is the X-axis component of the first target pattern in the camera coordinate system, z1 is the Z-axis component of the first target pattern in the camera coordinate system, x2 is the X-axis component of the second target pattern in the camera coordinate system, and z2 is the Z-axis component of the second target pattern in the camera coordinate system.
Optionally, the processor 82 is further configured to:
determine, among a plurality of preset exposure parameters, a first exposure parameter to be used for shooting.
Optionally, the processor 82 is specifically configured to:
downsample the image to obtain a low-resolution picture;
determine a region of interest in the low-resolution picture, the region of interest containing the target pattern;
determine, in a high-resolution picture corresponding to the low-resolution picture, a to-be-processed area corresponding to the region of interest, the resolution of the high-resolution picture being higher than that of the low-resolution picture;
obtain, from the to-be-processed area, the pixel coordinate values of the center point of the target pattern in the image coordinate system; and
obtain the pose information of the target pattern in the three-dimensional coordinate system from the pixel coordinate values.
The control apparatus for a drone provided by this embodiment is used to execute the control method for a drone provided by the method embodiments of the present invention; the principles are similar and are not repeated here.
An embodiment of the present invention further provides a drone, including the control apparatus for a drone provided by the embodiment shown in FIG. 11 or FIG. 12.
The control apparatus included in the drone provided by this embodiment is used to execute the control method for a drone provided by the method embodiments of the present invention; the principles are similar and are not repeated here.
A person of ordinary skill in the art can understand that all or part of the steps of the above method embodiments can be implemented by hardware related to program instructions. The aforementioned program can be stored in a computer-readable storage medium. When executed, the program performs the steps of the above method embodiments; and the aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
本发明的说明书和权利要求书及上述附图中的术语“第一”、“第二”、“第三”、“第四”等(如果存在)是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便这里描述的本发明的实施例例如能够以除了在这里图示或描述的那些以外的顺序实施。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元的过程、方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
最后应说明的是:以上各实施例仅用以说明本发明实施例的技术方案,而非对其限制;尽管参照前述各实施例对本发明实施例进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明实施例技术方案的范围。

Claims (39)

  1. A control method for an unmanned aerial vehicle (UAV), comprising:
    acquiring an image containing a target pattern;
    obtaining pose information of the target pattern in a three-dimensional coordinate system from the image; and
    controlling the UAV to track the target pattern according to the pose information.
  2. The method according to claim 1, wherein the three-dimensional coordinate system comprises a camera coordinate system, and the pose information comprises a U-axis component, a V-axis component and a Z-axis component of a center point of the target pattern in the camera coordinate system.
  3. The method according to claim 2, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    controlling a movement speed of the UAV in a Z-axis direction of a body coordinate system according to the V-axis component of the center point of the target pattern in the camera coordinate system.
  4. The method according to claim 2, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    controlling a movement speed of the UAV in an X-axis direction of a body coordinate system according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
  5. The method according to claim 2, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    controlling a movement speed of the UAV in a Y-axis direction of a body coordinate system according to the U-axis component of the center point of the target pattern in the camera coordinate system.
  6. The method according to claim 1, wherein the three-dimensional coordinate system comprises an earth coordinate system, and the pose information comprises an X-axis component, a Y-axis component and a Z-axis component of the center point of the target pattern in the earth coordinate system, and attitude information of the target pattern in the earth coordinate system.
  7. The method according to claim 6, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    controlling a movement speed of the UAV in a Z-axis direction of a body coordinate system according to the Z-axis component of the center point of the target pattern in the earth coordinate system.
  8. The method according to claim 6, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    determining a distance between the UAV and the target pattern according to the X-axis component and the Y-axis component of the center point of the target pattern in the earth coordinate system; and
    controlling a movement speed of the UAV in an X-axis direction of a body coordinate system according to the distance.
  9. The method according to claim 6, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    determining a heading offset angle according to the attitude information of the target pattern in the earth coordinate system, the heading offset angle being the angle between the headings of the UAV in a body horizontal coordinate system and in a local horizontal coordinate system; and
    controlling a movement speed of the UAV in a Y-axis direction of a body coordinate system according to the heading offset angle.
  10. The method according to any one of claims 1-4 and 6-8, wherein the three-dimensional coordinate system further comprises a body horizontal coordinate system, and the pose information comprises an X-axis component and a Y-axis component of the center point of the target pattern in the body horizontal coordinate system.
  11. The method according to claim 10, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    determining an angle between the target pattern and a yaw direction of the UAV in a body coordinate system according to the X-axis component and the Y-axis component of the center point of the target pattern in the body horizontal coordinate system; and
    controlling an angular velocity of the UAV about a yaw axis of the body coordinate system according to the angle.
  12. The method according to claim 10, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    determining, on a planned route, an angle between the target pattern and a yaw direction of the UAV in a body coordinate system according to the X-axis component and the Y-axis component of the center point of the target pattern in the body horizontal coordinate system;
    determining a tangential linear velocity of the circle along which the UAV orbits the target pattern according to the angle; and
    controlling an angular velocity of the UAV about a yaw axis of the body coordinate system according to the tangential linear velocity.
  13. The method according to claim 1, wherein there are two target patterns, the three-dimensional coordinate system is a camera coordinate system, and the pose information comprises X-axis components, Y-axis components and Z-axis components of the center points of the two target patterns in the camera coordinate system.
  14. The method according to claim 13, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    controlling a yaw angle through which the UAV, or a gimbal on the UAV, rotates in a body coordinate system according to the X-axis components and the Y-axis components of the center points of the two target patterns in the camera coordinate system.
  15. The method according to claim 14, wherein the yaw angle θ is
    Figure PCTCN2018080605-appb-100001
    where the two target patterns include a first target pattern and a second target pattern, x1 is the X-axis component of the first target pattern in the camera coordinate system, y1 is the Y-axis component of the first target pattern in the camera coordinate system, x2 is the X-axis component of the second target pattern in the camera coordinate system, and y2 is the Y-axis component of the second target pattern in the camera coordinate system.
  16. The method according to claim 13, wherein the controlling the UAV to track the target pattern according to the pose information comprises:
    controlling a pitch angle through which a gimbal on the UAV rotates in a body coordinate system according to the X-axis components and the Z-axis components of the center points of the two target patterns in the camera coordinate system.
  17. The method according to claim 16, wherein the pitch angle α is
    Figure PCTCN2018080605-appb-100002
    where the two target patterns include a first target pattern and a second target pattern, x1 is the X-axis component of the first target pattern in the camera coordinate system, z1 is the Z-axis component of the first target pattern in the camera coordinate system, x2 is the X-axis component of the second target pattern in the camera coordinate system, and z2 is the Z-axis component of the second target pattern in the camera coordinate system.
  18. The method according to any one of claims 1 to 17, further comprising, before the acquiring an image containing a target pattern:
    selecting a first exposure parameter for shooting from a plurality of preset exposure parameters.
  19. The method according to any one of claims 1 to 17, wherein the obtaining pose information of the target pattern in a three-dimensional coordinate system from the image comprises:
    downsampling the image to obtain a low-resolution picture;
    determining a region of interest in the low-resolution picture, the region of interest containing the target pattern;
    determining, in a high-resolution picture corresponding to the low-resolution picture, a to-be-processed region corresponding to the region of interest, the resolution of the high-resolution picture being higher than that of the low-resolution picture;
    obtaining pixel coordinates of the center point of the target pattern in an image coordinate system from the to-be-processed region; and
    obtaining the pose information of the target pattern in the three-dimensional coordinate system from the pixel coordinates.
  20. A control apparatus for a UAV, comprising a memory and a processor, wherein:
    the memory is configured to store program code; and
    the processor invokes the program code and, when the program code is executed, is configured to:
    acquire an image containing a target pattern;
    obtain pose information of the target pattern in a three-dimensional coordinate system from the image; and
    control the UAV to track the target pattern according to the pose information.
  21. The apparatus according to claim 20, wherein the three-dimensional coordinate system comprises a camera coordinate system, and the pose information comprises a U-axis component, a V-axis component and a Z-axis component of a center point of the target pattern in the camera coordinate system.
  22. The apparatus according to claim 21, wherein the processor is specifically configured to:
    control a movement speed of the UAV in a Z-axis direction of a body coordinate system according to the V-axis component of the center point of the target pattern in the camera coordinate system.
  23. The apparatus according to claim 21, wherein the processor is specifically configured to:
    control a movement speed of the UAV in an X-axis direction of a body coordinate system according to the Z-axis component of the center point of the target pattern in the camera coordinate system.
  24. The apparatus according to claim 21, wherein the processor is specifically configured to:
    control a movement speed of the UAV in a Y-axis direction of a body coordinate system according to the U-axis component of the center point of the target pattern in the camera coordinate system.
  25. The apparatus according to claim 20, wherein the three-dimensional coordinate system comprises an earth coordinate system, and the pose information comprises an X-axis component, a Y-axis component and a Z-axis component of the center point of the target pattern in the earth coordinate system, and attitude information of the target pattern in the earth coordinate system.
  26. The apparatus according to claim 25, wherein the processor is specifically configured to:
    control a movement speed of the UAV in a Z-axis direction of a body coordinate system according to the Z-axis component of the center point of the target pattern in the earth coordinate system.
  27. The apparatus according to claim 25, wherein the processor is specifically configured to:
    determine a distance between the UAV and the target pattern according to the X-axis component and the Y-axis component of the center point of the target pattern in the earth coordinate system; and
    control a movement speed of the UAV in an X-axis direction of a body coordinate system according to the distance.
  28. The apparatus according to claim 25, wherein the processor is specifically configured to:
    determine a heading offset angle according to the attitude information of the target pattern in the earth coordinate system, the heading offset angle being the angle between the headings of the UAV in a body horizontal coordinate system and in a local horizontal coordinate system; and
    control a movement speed of the UAV in a Y-axis direction of a body coordinate system according to the heading offset angle.
  29. The apparatus according to any one of claims 20-23 and 25-27, wherein the three-dimensional coordinate system further comprises a body horizontal coordinate system, and the pose information comprises an X-axis component and a Y-axis component of the center point of the target pattern in the body horizontal coordinate system.
  30. The apparatus according to claim 29, wherein the processor is specifically configured to:
    determine an angle between the target pattern and a yaw direction of the UAV in a body coordinate system according to the X-axis component and the Y-axis component of the center point of the target pattern in the body horizontal coordinate system; and
    control an angular velocity of the UAV about a yaw axis of the body coordinate system according to the angle.
  31. The apparatus according to claim 29, wherein the processor is specifically configured to:
    determine, on a planned route, an angle between the target pattern and a yaw direction of the UAV in a body coordinate system according to the X-axis component and the Y-axis component of the center point of the target pattern in the body horizontal coordinate system;
    determine a tangential linear velocity of the circle along which the UAV orbits the target pattern according to the angle; and
    control an angular velocity of the UAV about a yaw axis of the body coordinate system according to the tangential linear velocity.
  32. The apparatus according to claim 20, wherein there are two target patterns, the three-dimensional coordinate system is a camera coordinate system, and the pose information comprises X-axis components, Y-axis components and Z-axis components of the center points of the two target patterns in the camera coordinate system.
  33. The apparatus according to claim 32, wherein the processor is specifically configured to:
    control a yaw angle through which the UAV, or a gimbal on the UAV, rotates in a body coordinate system according to the X-axis components and the Y-axis components of the center points of the two target patterns in the camera coordinate system.
  34. The apparatus according to claim 33, wherein the yaw angle θ is
    Figure PCTCN2018080605-appb-100003
    where the two target patterns include a first target pattern and a second target pattern, x1 is the X-axis component of the first target pattern in the camera coordinate system, y1 is the Y-axis component of the first target pattern in the camera coordinate system, x2 is the X-axis component of the second target pattern in the camera coordinate system, and y2 is the Y-axis component of the second target pattern in the camera coordinate system.
  35. The apparatus according to claim 32, wherein the processor is specifically configured to:
    control a pitch angle through which a gimbal on the UAV rotates in a body coordinate system according to the X-axis components and the Z-axis components of the center points of the two target patterns in the camera coordinate system.
  36. The apparatus according to claim 35, wherein the pitch angle α is
    Figure PCTCN2018080605-appb-100004
    where the two target patterns include a first target pattern and a second target pattern, x1 is the X-axis component of the first target pattern in the camera coordinate system, z1 is the Z-axis component of the first target pattern in the camera coordinate system, x2 is the X-axis component of the second target pattern in the camera coordinate system, and z2 is the Z-axis component of the second target pattern in the camera coordinate system.
  37. The apparatus according to any one of claims 20-36, wherein the processor is further configured to:
    select a first exposure parameter for shooting from a plurality of preset exposure parameters.
  38. The apparatus according to any one of claims 20-36, wherein the processor is specifically configured to:
    downsample the image to obtain a low-resolution picture;
    determine a region of interest in the low-resolution picture, the region of interest containing the target pattern;
    determine, in a high-resolution picture corresponding to the low-resolution picture, a to-be-processed region corresponding to the region of interest, the resolution of the high-resolution picture being higher than that of the low-resolution picture;
    obtain pixel coordinates of the center point of the target pattern in an image coordinate system from the to-be-processed region; and
    obtain the pose information of the target pattern in the three-dimensional coordinate system from the pixel coordinates.
  39. A UAV, comprising the control apparatus for a UAV according to any one of claims 20-38.
PCT/CN2018/080605 2018-03-27 2018-03-27 Control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle WO2019183789A1 (zh)

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
PCT/CN2018/080605 (WO2019183789A1) | 2018-03-27 | 2018-03-27 | Control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle
CN201880002830.4A (CN109661631A) | 2018-03-27 | 2018-03-27 | Control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle


Publications (1)

Publication Number: WO2019183789A1





Also Published As

Publication number Publication date
CN109661631A (zh) 2019-04-19


Legal Events

Date | Code | Title | Description
— | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 18912879; Country of ref document: EP; Kind code of ref document: A1
— | NENP | Non-entry into the national phase | Ref country code: DE
— | 122 | EP: PCT application non-entry in European phase | Ref document number: 18912879; Country of ref document: EP; Kind code of ref document: A1