WO2020048365A1 - Flight control method and device for aircraft, terminal device, and flight control system - Google Patents

Flight control method and device for aircraft, terminal device, and flight control system

Info

Publication number
WO2020048365A1
Authority
WO
WIPO (PCT)
Prior art keywords
aircraft
flight
target point
angle
determining
Prior art date
Application number
PCT/CN2019/103096
Other languages
English (en)
French (fr)
Inventor
冯银华
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司
Publication of WO2020048365A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • Embodiments of the present invention relate to the technical field of aircraft, and in particular, to a flight control method for an aircraft, a flight control device for an aircraft, a terminal device, and a flight control system.
  • In recent years, aircraft such as unmanned aerial vehicles (Unmanned Aerial Vehicles, UAVs), commonly referred to as drones, have been applied in many fields, such as aerial photography, agriculture, plant protection, selfie capture, express delivery, and disaster relief.
  • the user can manually input the target point and control the flight of the aircraft to make the aircraft fly to the location specified by the user to complete the specified task.
  • This method of controlling flight is called "pointing flight".
  • Taking disaster relief as an example, the aircraft's shooting device is used to take aerial photographs of the affected area and to search for and rescue victims. When a victim is found trapped somewhere, the pointing flight function can be used: the user inputs a target point with a finger on an input interface such as the screen of the aircraft's remote controller, and the aircraft is controlled to fly to that target point, i.e., the victim's location, so that various kinds of help, such as food and medicine, can be delivered as soon as possible.
  • In the process of implementing the present invention, the inventor found that the related art has at least the following problems: with the current pointing flight control, the aircraft is controlled to fly continuously in a predetermined direction according to the target point input by the user; however, when the aircraft reaches the target position it does not stop, but continues to fly. That is, the existing pointing flight function only controls the flight direction of the aircraft and cannot control its flight distance. After the aircraft flies to the desired position, the user still needs to operate the aircraft manually to stop it, so the effect of flying to wherever the user points cannot truly be achieved.
  • Embodiments of the present invention provide an aircraft flight control method, an aircraft flight control device, a terminal device, and a flight control system, which can truly achieve the effect of flying to wherever the user points.
  • an embodiment of the present invention provides a flight control method for an aircraft, which is applied to a terminal device that is communicatively connected to the aircraft.
  • the aircraft includes a gimbal and a shooting device mounted on the gimbal.
  • the method includes:
  • determining the flight direction of the aircraft according to the target point includes:
  • obtaining the focal length of the photographing device specifically includes: determining the pixels corresponding to the field of view of the photographing device according to a preset correspondence between field-of-view angles and pixels; and
  • a focal length of the photographing device is determined according to a field of view of the photographing device and a pixel corresponding to the field of view of the photographing device.
  • determining the flight distance of the aircraft according to the target point includes:
  • the attitude information includes an attitude angle.
  • in the calculation formula for determining the flying distance of the aircraft, L is the flight distance of the aircraft, θ is the pitch angle in the attitude angle of the gimbal, h is the flight height of the aircraft, and α is the deflection angle.
  • the optical axis direction of the photographing device is determined by a pitch angle in an attitude angle of the gimbal.
  • the determining a target point in the image specifically includes:
  • the selected target point is converted into a target point in the image.
  • an embodiment of the present invention provides a flight control device for an aircraft, which is configured on a terminal device, the terminal device is communicatively connected to the aircraft, and the aircraft includes a gimbal and a shooting device mounted on the gimbal, the device comprising:
  • a first determining module configured to determine a target point in an image, where the image is an image captured by the shooting device when the aircraft is at the current position;
  • a second determining module configured to determine a flight direction of the aircraft according to the target point
  • a third determining module configured to determine a flying distance of the aircraft according to the target point
  • a flight control module is configured to control the flight of the aircraft according to the flight direction and the flight distance.
  • the second determining module includes:
  • a focal length acquisition unit configured to acquire a focal length of the photographing device
  • a pixel difference determining unit configured to determine a pixel difference between a center point of the image and the target point according to the target point
  • a flight direction determining unit is configured to determine a flight direction of the aircraft according to the focal length and a pixel difference between a center point of the image and the target point.
  • the focal length obtaining unit is specifically configured to:
  • a focal length of the photographing device is determined according to a field of view angle of the photographing device and a pixel corresponding to the field of view angle of the photographing device.
  • the third determining module includes:
  • an attitude information obtaining unit configured to obtain attitude information of the gimbal
  • a flying height acquisition unit configured to acquire a flying height of the aircraft
  • a deflection angle acquisition unit configured to determine a deflection angle of the aircraft according to the target point, where the deflection angle is an angle deflected by a flight direction of the aircraft relative to an optical axis direction of the photographing device;
  • a flying distance determining unit is configured to determine a flying distance of the aircraft according to the attitude information, the flying height, and a deflection angle.
  • the attitude information includes an attitude angle.
  • in the calculation formula for determining the flight distance of the aircraft based on the attitude information, the flight height, and the deflection angle, L is the flight distance of the aircraft, θ is the pitch angle in the attitude angle of the gimbal, h is the flight height of the aircraft, and α is the deflection angle.
  • the optical axis direction of the photographing device is determined by a pitch angle in an attitude angle of the gimbal.
  • the first determining module is specifically configured to:
  • the selected target point is converted into a target point in the image.
  • an embodiment of the present invention provides a terminal device, including:
  • at least one processor; and
  • a memory connected in communication with the at least one processor; wherein,
  • the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute a flight control method of an aircraft as described above.
  • an embodiment of the present invention provides a flight control system, including: an aircraft and the terminal device described above, the terminal device is connected to the aircraft.
  • an embodiment of the present invention provides a computer program product.
  • the computer program product includes a computer program stored on a non-volatile computer-readable storage medium.
  • the computer program includes program instructions. When the instructions are executed by a computer, the computer is caused to execute the flight control method of the aircraft as described above.
  • an embodiment of the present invention further provides a non-volatile computer-readable storage medium, where the computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions are used to cause a computer to execute the flight control method of the aircraft as described above.
  • In the embodiments of the present invention, the target point in the image is determined first, then the flight direction and the flight distance of the aircraft are determined according to the target point, and the flight of the aircraft is controlled based on the flight direction and the flight distance, so that the aircraft flies to the point in the actual scene space that corresponds to the determined target point in the image. In this way, the aircraft can be accurately controlled to fly to the target point entered by the user and can be stopped immediately after it reaches that point, truly achieving the effect of flying to wherever the user points. The user no longer needs to operate the aircraft manually to stop it after it has flown to the desired position, which simplifies operation, reduces the possibility of the aircraft flying away and being lost, and effectively improves the user experience.
  • FIG. 1 is a schematic diagram of an application environment of a flight control method for an aircraft according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of another application environment of a flight control method for an aircraft according to an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of a flight control method for an aircraft according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of calculating a flying distance provided by an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of determining a flight direction of the aircraft according to the target point according to an embodiment of the present invention
  • FIG. 6 is a schematic flowchart of determining a flight distance of the aircraft according to the target point according to an embodiment of the present invention
  • FIG. 7 is a schematic diagram of a flight control device for an aircraft according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of a flight control system according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of an application environment of a flight control method for an aircraft provided by the present invention.
  • the application environment includes: an aircraft 10, a terminal device 20, and a user (not shown).
  • the terminal device 20 is communicatively connected to the aircraft 10 to implement data or information transmission with the aircraft 10.
  • the user can hold the terminal device 20 to perform various operations and the like on the terminal device 20.
  • the aircraft 10 can be used to capture images (or videos, footage, or frames) and transmit the captured images to the terminal device 20, so that the images captured by the aircraft 10 are displayed on the screen of the terminal device 20.
  • When holding the terminal device 20, the user can click or touch any point on the screen of the terminal device 20 as the target point for the aircraft 10 to fly to.
  • After the user selects a point on the screen of the terminal device 20, the terminal device 20 can, through processing, obtain the coordinates of the target point selected by the user on the screen of the terminal device 20, and then convert, through a mapping relationship, the coordinates of the selected target point into the coordinates of the target point in the image, thereby determining the target point in the image.
  • After determining the target point in the image, the terminal device 20 combines the state parameters of the aircraft 10 (such as the attitude information of the aircraft's gimbal and the flying height of the aircraft) with the fixed parameters of the aircraft 10 (such as the focal length of the aircraft's shooting device) to determine the flight direction and the flight distance of the aircraft 10, and then controls the aircraft 10 to fly to the point in the actual scene space that corresponds to the target point in the image. The aircraft can thus be accurately controlled to fly to the target point entered by the user and stopped immediately after it reaches that point, achieving the effect of flying to wherever the user points; the user does not need to operate the aircraft manually to stop it after it has flown to the desired position, which simplifies operation and reduces the possibility of the aircraft being lost.
  • Most current flight control of aircraft is done by operating the joysticks of a remote controller (for flight such as up/down, left/right, forward/backward, and rotation). To make the aircraft fly to a specified location, the joysticks of the remote controller must be operated frequently, which is cumbersome. To reduce this burden, pointing flight technology can be used to control the flight of the aircraft.
  • Although pointing flight technology avoids frequent operation of the remote controller's joysticks, in current pointing flight technology, when a user selects a target point through the screen of a terminal device (for example, a remote controller), the terminal device processes the user's input and then sends the target point to the vision module of the aircraft. The vision module combines the received data with the selected coordinate point to calculate the bearing of the target object, and the aircraft flies in the calculated direction. However, the aircraft cannot calculate the depth information of the target (such as the flight distance). Therefore, the aircraft keeps flying in the predetermined direction; even when it reaches the target position it does not stop but continues to fly, so the effect of flying to wherever the user points cannot truly be achieved. In addition, if the aircraft keeps flying in the predetermined direction, there is a risk that it will fly away and be lost.
  • Compared with the above two approaches, the flight control method provided by the embodiments of the present invention, on the one hand, does not require frequent operation of the joysticks of the remote controller, which reduces the tedium of controlling the flight of the aircraft 10 and improves the user experience.
  • On the other hand, controlling the flight of the aircraft 10 according to the flight direction and the flight distance, so that the aircraft 10 flies to the point in the actual scene space corresponding to the target point in the image, controls not only the direction in which the aircraft 10 flies but also the distance it needs to fly, achieving the effect of flying to wherever the user points. The user therefore does not have to operate the aircraft 10 manually to stop it after it has flown to the desired position, which simplifies operation, reduces the possibility of the aircraft 10 flying away and being lost, and effectively improves the user experience.
  • The aircraft 10 may be any suitable flying apparatus; for example, the aircraft 10 may be an unmanned aerial vehicle, an unmanned ship, or another movable device. The following description uses an unmanned aerial vehicle (UAV), i.e., a drone, as an example of the aircraft 10.
  • A drone is an unpiloted aircraft that carries a mission payload and is operated by remote control equipment or an on-board program control device.
  • the drone may be various types of drones.
  • the drone may be a rotorcraft, for example, a multi-rotor aircraft propelled by multiple propulsion devices through air.
  • The embodiments of the present invention are not limited to this; the drone may also be another type of drone, such as a fixed-wing drone, an unmanned airship, a para-wing drone, a flapping-wing drone, and the like.
  • A drone includes, but is not limited to, a fuselage, a power system, a flight control component, a gimbal, a shooting device, an image transmission module, and so on.
  • the flight control component and the image transmission module are arranged in the fuselage, the power system and the gimbal are both mounted on the fuselage, and the shooting device is mounted on the gimbal.
  • the flight control component can be coupled with the power system, the gimbal, the shooting device, and the image transmission module to enable communication.
  • the fuselage may include a center frame and one or more arms connected to the center frame, and one or more arms extend radially from the center frame.
  • the number of the arms can be 2, 4, 6, and so on.
  • One or more arms are used to carry the power system.
  • the power system may include an electronic speed controller (ESC), one or more propellers, and one or more first motors corresponding to the one or more propellers, where each first motor is connected between the ESC and a propeller, and the first motor and the propeller are arranged on the corresponding arm; the ESC is used to receive the driving signal generated by the flight control component and to provide a driving current to the first motor according to the driving signal, so as to control the speed of the first motor.
  • the first motor is used to drive the propeller to rotate, so as to provide power for the flight of the drone, and the power enables the drone to achieve one or more degrees of freedom of motion, such as forward and backward motion, up and down motion, and the like.
  • the drone may rotate about one or more axes of rotation.
  • the rotation axis may include a roll axis, a pan axis, and a pitch axis.
  • the first motor may be a DC motor or an AC motor.
  • the first motor may be a brushless motor or a brush motor.
  • the flight control component has the ability to monitor and control the flight and mission of the drone, and includes a set of equipment for drone launch and recovery control.
  • the flight control component is used to control the flight of the drone.
  • Flight control components may include flight controllers and sensing systems.
  • the sensor system is used to measure the position information and status information of the drone and various parts of the drone, for example, three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration and three-dimensional angular velocity, flying height, and the like.
  • the sensing system may include, for example, at least one of an infrared sensor, an acoustic wave sensor, a gyroscope, an electronic compass, an Inertial Measurement Unit (IMU), a vision sensor, a global navigation satellite system, and a barometer.
  • the global navigation satellite system may be a Global Positioning System (Global Positioning System, GPS).
  • the flight controller is used to control the flight of the drone. It can be understood that the flight controller can control the drone according to a pre-programmed program instruction, and can also control the drone by responding to one or more control instructions from other devices.
  • For example, the terminal device 20 generates a control instruction according to the flight direction and the flight distance and sends the control instruction to the flight controller, and the flight controller receives the control instruction and controls the flight of the drone through the control instruction; or, the flight controller receives the flight direction and the flight distance sent by the terminal device 20, generates a control instruction according to the flight direction and the flight distance, and controls the flight of the drone through that control instruction.
  • the gimbal is used to carry a camera.
  • the gimbal is provided with a second motor, and the flight control component can control the gimbal, and specifically control the movement (such as the rotation speed) of the second motor to adjust the angle of the image taken by the drone.
  • the second motor may be a brushless motor or a brush motor.
  • the gimbal can be located at the top or bottom of the fuselage.
  • In the embodiments of the present invention, the gimbal is part of the UAV; it can be understood that, in some other embodiments, the gimbal can be independent of the UAV.
  • the shooting device may be a device for capturing images, such as a camera, a camera phone, or a video camera.
  • the shooting device may communicate with the flight control component and perform shooting under the control of the flight control component.
  • the flight control component controls the shooting frequency of the images captured by the shooting device, that is, how many times are captured per unit time.
  • Or, the flight control component controls, through the gimbal, the angle of the images captured by the shooting device.
  • the image transmission module is used to transmit images, pictures or videos captured by a drone photographing device in the sky in a real-time and stable manner to a ground wireless image transmission receiving device, such as the terminal device 20 and the like.
  • the terminal device 20 may be any suitable electronic device. For example, smartphones, tablets, personal computers, wearables, and more.
  • the terminal device 20 includes a communication module for communicating with the aircraft 10 described above.
  • the communication module may be a wireless communication module, such as a WiFi module, a Bluetooth module, an infrared module, a General Packet Radio Service (GPRS) module, and so on.
  • the terminal device 20 also includes a screen, which can implement input and output functions; for example, the screen receives the target point selected by the user on the screen, and displays the images, frames, or videos captured by the aircraft 10.
  • FIG. 2 is a schematic diagram of another application environment of a flight control method for an aircraft provided by the present invention.
  • the application environment further includes a remote controller 30.
  • the remote controller 30 is communicatively connected with the aircraft 10 and the terminal device 20 respectively.
  • the remote controller 30 and the aircraft 10 can perform wireless communication, and the remote controller 30 and the terminal device 20 can be connected through USB.
  • the remote control 30 may be any suitable remote control device.
  • the remote controller 30 is a remote control unit, located on the ground (or on a ship) or on an aerial platform, that controls the flight of the aircraft through the flight control component.
  • the remote controller 30 is used to transfer data, information, or instructions.
  • For example, the remote controller 30 receives data or information sent by the aircraft 10 (such as an image captured by the photographing device) and forwards the data, information, or instructions to the terminal device 20; or, the remote controller 30 receives data or information sent by the terminal device 20 (such as a control instruction generated according to the flight direction and the flight distance) and forwards the data or information to the aircraft 10.
  • Usually, while the aircraft 10 is flying, there is a certain distance between the aircraft 10 and the terminal device 20; in particular, for some high-altitude shooting, the aircraft 10 and the terminal device 20 are usually far apart. To control the flight of the aircraft 10 over a long distance, the remote controller 30 relays data, information, and instructions.
  • the remote controller 30 is not necessary, that is, the terminal device 20 can directly send a control instruction to the aircraft 10 to control the flight of the aircraft 10.
  • FIG. 3 is a schematic flowchart of a flight control method for an aircraft according to an embodiment of the present invention. This method is suitable for controlling the flight of various aircraft, for example, aircraft 10 in FIG. 1. This method may be performed by various terminal devices, for example, the terminal device 20 in FIG. 1.
  • the terminal device is communicatively connected to the aircraft, and the aircraft includes a gimbal and a photographing device mounted on the gimbal.
  • the flight control method of the aircraft includes:
  • the image is an image captured by the shooting device when the aircraft is at the current position.
  • the aircraft can send the image to the terminal device.
  • After the terminal device receives the image, it performs certain processing on the image, such as scaling the image to fit the display of the terminal device's screen, and the processed image is displayed on the screen of the terminal device.
  • the target point can be selected through this screen in order to determine the target point in the image.
  • the terminal device determining the target point in the image includes: acquiring a target point selected by a user on the screen of the terminal device; and converting the selected target point into a target point in the image.
  • a terminal device receives an input operation of a user to obtain a target point selected by the user on a screen of the terminal device.
  • the user can select a point P0 on the screen as a target point by input operations such as clicking or touching the screen of the terminal device.
  • the terminal device can calculate the coordinates (x0, y0) of P0 in the plane coordinate system corresponding to the screen; that is, the coordinates (x0, y0) represent the position of the selected target point P0. Converting the selected target point P0 into a target point P1 in the image is then equivalent to converting the coordinates (x0, y0) into the coordinates (x1, y1) of P1 in the corresponding image coordinate system.
  • the user's input operation may further include directly inputting the coordinates (x0, y0) of P0.
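  • As an illustration of the screen-to-image conversion described above, the following is a minimal sketch assuming the image is uniformly scaled and centered (letterboxed) on the screen; the function name and the letterboxing assumption are illustrative and not taken from the patent.

```python
def screen_to_image(x0, y0, screen_w, screen_h, img_w, img_h):
    """Map a tap at screen coordinates (x0, y0) to image coordinates (x1, y1).

    Assumes the image is uniformly scaled to fit the screen and centered
    (letterboxed); other display layouts would need a different mapping.
    """
    scale = min(screen_w / img_w, screen_h / img_h)   # uniform "fit" scale factor
    offset_x = (screen_w - img_w * scale) / 2.0       # horizontal letterbox margin
    offset_y = (screen_h - img_h * scale) / 2.0       # vertical letterbox margin
    x1 = (x0 - offset_x) / scale
    y1 = (y0 - offset_y) / scale
    return x1, y1

# Example: a tap at the center of a 1920x1080 screen showing a 4000x2250 image
print(screen_to_image(960.0, 540.0, 1920, 1080, 4000, 2250))  # -> (2000.0, 1125.0)
```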
  • the terminal device determining the flight direction of the aircraft according to the target point includes the following steps:
  • Acquiring the focal length of the photographing device by the terminal device includes: determining the pixels corresponding to the field of view of the photographing device according to a preset correspondence between field-of-view angles and pixels; and determining the focal length of the photographing device according to the field of view of the photographing device and the pixels corresponding to the field of view of the photographing device.
  • the field of view (FOV) of the shooting device is the angle formed by the lens center point of the shooting device to the two edges of the imaging plane.
  • the size of the field-of-view angle determines the field of view of the shooting device: the larger the field-of-view angle, the larger the field of view. Because the imaging plane has four sides, there are field-of-view angles in two directions.
  • In the embodiments of the present invention, the field-of-view angle refers to the field-of-view angle corresponding to the pitch-axis direction (such as angle AOB in FIG. 4).
  • A mapping table indicating the preset correspondence between field-of-view angles and pixels may be pre-configured in the terminal device; after the field-of-view angle of the photographing device is obtained, the mapping table can be used to look up the pixels corresponding to that field-of-view angle. For example, assuming that the FOV of the shooting device is 48 degrees, combining this with the preset correspondence between FOV and pixels gives a corresponding pixel count of 360. Then, according to the field-of-view angle of the photographing device and the pixels corresponding to it, the focal length of the photographing device can be determined.
  • in the calculation formula for determining the focal length of the photographing device, f is the focal length of the shooting device and FOV is the field-of-view angle of the shooting device.
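  • The focal-length formula itself is published only as a figure, so the following sketch uses the standard pinhole-camera relation f = n / (2 * tan(FOV / 2)), which is consistent with the surrounding text but should be treated as an assumed reconstruction; the function name is illustrative.

```python
import math

def focal_length_pixels(fov_deg, n_pixels):
    """Focal length (in pixels) from a field-of-view angle and the pixel count
    spanning that angle, using the pinhole relation f = n / (2 * tan(FOV / 2)).
    This is a reconstruction; the patent's exact formula is shown only as a figure."""
    return n_pixels / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

# Example from the text: FOV = 48 degrees, corresponding pixels = 360
print(round(focal_length_pixels(48.0, 360), 1))  # ~404.3 pixels
```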
  • In some other embodiments, the terminal device may also obtain the focal length of the photographing device in other ways, for example: by receiving the focal length of the photographing device as input; by reading the focal length directly from the terminal device when the focal length is pre-configured in the terminal device; or by reading the focal length from the aircraft or another device when the focal length is pre-configured in the aircraft or that other device; and so on.
  • the pixel difference Δx between the center point of the image and the target point refers to the pixel difference in the y-axis direction in the image coordinate system.
  • the flying direction of the aircraft is a direction in which the current position of the aircraft is directed to a target point in the image.
  • the flying direction of the aircraft is a direction from point O to point P1.
  • O is the current position of the aircraft.
  • The deflection angle α can be determined according to the focal length f and the pixel difference Δx between the center point C of the image and the target point P1, where the deflection angle α is the angle by which the flight direction of the aircraft is deflected relative to the optical axis direction of the photographing device; the flight direction of the aircraft can thus be determined. In the formula for the deflection angle, α is the deflection angle, Δx is the pixel difference between the center point C of the image and the target point P1, and f is the focal length of the photographing device.
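  • A natural way to obtain the deflection angle from the quantities above, assuming the same pinhole model, is α = arctan(Δx / f); since the patent's exact expression is shown only as an image, the sketch below is an assumption rather than the published formula.

```python
import math

def deflection_angle(delta_x_pixels, focal_length_pixels):
    """Deflection angle alpha (in radians) of the target direction relative to
    the optical axis, assuming the pinhole relation alpha = atan(delta_x / f).
    The exact formula in the patent is published only as an image."""
    return math.atan(delta_x_pixels / focal_length_pixels)

# Example: target 120 pixels from the image center, focal length ~404.3 pixels
print(math.degrees(deflection_angle(120.0, 404.3)))  # ~16.5 degrees
```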
  • Obtaining the attitude information of the gimbal by the terminal device specifically includes: the attitude information of the gimbal is first acquired by an attitude acquisition sensor provided on the gimbal and is then sent to the terminal device, so that the terminal device obtains the attitude information.
  • the attitude acquisition sensor may be an inertial measurement unit (Inertial Measurement Unit, IMU) and the like.
  • IMU is a sensor that measures the attitude information (or angular rate) and acceleration of an object in three axes.
  • IMUs include six-axis IMUs and nine-axis IMUs.
  • A six-axis IMU includes three single-axis accelerometers and three single-axis gyroscopes. The accelerometers detect the acceleration signals of the object along the three independent axes of the carrier coordinate system, and the gyroscopes detect the angular velocity signals of the carrier relative to the navigation coordinate system; by measuring the angular velocity and acceleration of the object in three-dimensional space, the attitude angle of the object can be solved.
  • A nine-axis IMU includes three single-axis accelerometers, three single-axis gyroscopes, and three single-axis geomagnetometers. The accelerometers and gyroscopes of a nine-axis IMU are similar to those of a six-axis IMU, and the geomagnetometer of the nine-axis IMU is used to detect the horizontal component of the geomagnetic field in the inertial frame; the direction of this component always points toward the north pole.
  • the attitude information includes an attitude angle.
  • The attitude angle of the gimbal is expressed by Euler angles; that is, the attitude angle of the gimbal is described by the Euler angles (θ, ψ, φ), where θ is the pitch angle in the attitude angle of the gimbal, ψ is the yaw angle in the attitude angle of the gimbal, and φ is the roll angle in the attitude angle of the gimbal.
  • the flying height of the aircraft refers to the height of the current flight position of the aircraft relative to the take-off point of the aircraft.
  • At least one distance acquisition sensor may be provided on the aircraft to measure a first height of the current position relative to the ground, and the aircraft sends the first height to the terminal device, so that the terminal device obtains the first height of the current position relative to the ground.
  • the at least one distance acquisition sensor may include, but is not limited to, an ultrasonic sensor, an infrared sensor, a microwave sensor, and the like.
  • the second height of the take-off point of the aircraft relative to the ground can be measured by the at least one distance acquisition sensor when the aircraft takes off. Generally, to ensure the safety of the aircraft, the second height of the take-off point relative to the ground is about 2 meters.
  • the aircraft sends the second height to the terminal device, so that the terminal device obtains the second height of the takeoff point relative to the ground.
  • the difference in height between the first altitude and the second altitude is the flying altitude of the aircraft.
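  • A trivial sketch of the height computation just described (the argument names are illustrative):

```python
def flight_height(current_height_above_ground, takeoff_height_above_ground):
    """Flying height h relative to the take-off point: the difference between
    the first height (current position above the ground) and the second
    height (take-off point above the ground)."""
    return current_height_above_ground - takeoff_height_above_ground

# Example: currently 52 m above the ground, take-off point measured at about 2 m
print(flight_height(52.0, 2.0))  # 50.0
```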
  • the deflection angle is an angle that the flight direction of the aircraft is deflected with respect to the optical axis direction of the photographing device.
  • the terminal device determining the deflection angle of the aircraft according to the target point specifically includes: determining the deflection angle α according to the focal length f and the pixel difference Δx between the center point C of the image and the target point P1.
  • the deflection angle is an angle that the flight direction of the aircraft is deflected with respect to the optical axis direction of the photographing device.
  • the attitude information includes an attitude angle.
  • the optical axis direction of the photographing device is determined by a pitch angle among attitude angles of the gimbal. As shown in FIG. 4, the included angle between the optical axis direction and the pitch axis direction of the photographing device is the pitch angle in the attitude angle of the gimbal.
  • Therefore, the optical axis direction of the photographing device can be adjusted by adjusting the pitch angle, thereby adjusting the angle of the image captured by the shooting device.
  • in the calculation formula for determining the flying distance of the aircraft according to the attitude information, the flying height, and the deflection angle, L is the flight distance of the aircraft, θ is the pitch angle in the attitude angle of the gimbal, h is the flight height of the aircraft, and α is the deflection angle.
  • In this way, the flying distance of the aircraft is obtained; that is, the point P2 in the actual scene space that corresponds to the target point in the image is determined, so that the aircraft flies the specified distance (such as L in FIG. 4) in the specified direction (such as the direction from O to P1 in FIG. 4), arrives at the position of P2, and achieves the effect of flying to wherever the user points.
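  • The distance formula is likewise published only as an image. From the geometry of FIG. 4 as described here (optical axis tilted below the horizon by the gimbal pitch angle θ, target offset from the axis by the deflection angle α, aircraft at height h above the ground point P2), one consistent reconstruction is the slant range L = h / sin(θ + α) along the line from O through P1 to P2; the sketch below implements that assumption and is not the patent's exact expression.

```python
import math

def flight_distance(pitch_deg, height_m, deflection_deg):
    """Slant distance L from the aircraft position O to the ground point P2,
    assuming L = h / sin(theta + alpha), where theta is the gimbal pitch angle
    below the horizon and alpha is the deflection toward the ground. This is a
    reconstruction of a formula that appears only as a figure in the source."""
    angle = math.radians(pitch_deg + deflection_deg)
    if angle <= 0:
        raise ValueError("the line of sight does not intersect the ground plane")
    return height_m / math.sin(angle)

# Example: gimbal pitch 30 degrees, flight height 50 m, deflection 16.5 degrees
print(round(flight_distance(30.0, 50.0, 16.5), 1))  # ~68.9
```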
  • The terminal device controlling the flight of the aircraft according to the flight direction and the flight distance includes: the terminal device generates a control instruction according to the flight direction and the flight distance and sends the control instruction to the aircraft, so that the flight of the aircraft is controlled through the control instruction; or, the terminal device sends the flight direction and the flight distance to the aircraft, so that the aircraft generates a control instruction from the flight direction and the flight distance and controls its flight through that control instruction.
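  • Putting the preceding steps together, a hypothetical end-to-end computation on the terminal-device side might look like the sketch below; all function names, argument names, and the returned dictionary are illustrative, and the actual format of the control instruction sent to the aircraft is not specified in this text.

```python
import math

def build_fly_to_command(target_px_offset, fov_deg, fov_pixels,
                         pitch_deg, height_m):
    """Combine the steps described above into one (direction, distance) result.

    target_px_offset: pixel offset of the target point from the image center
    along the pitch direction. Assumes f = n / (2 * tan(FOV / 2)),
    alpha = atan(offset / f), and L = h / sin(theta + alpha), all of which are
    reconstructions of formulas shown only as figures in the source."""
    f = fov_pixels / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
    alpha = math.atan(target_px_offset / f)
    line_angle = math.radians(pitch_deg) + alpha
    if line_angle <= 0:
        raise ValueError("target direction does not reach the ground plane")
    distance = height_m / math.sin(line_angle)
    return {"deflection_deg": math.degrees(alpha), "distance_m": distance}

# Example: target 120 px below the image center, FOV 48 deg spanning 360 px,
# gimbal pitch 30 deg, flight height 50 m
print(build_fly_to_command(120.0, 48.0, 360, 30.0, 50.0))
```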
  • It should be noted that there is no strict ordering between steps 3021-3023 and steps 3024-3027; they can be executed in different orders. For example, step 3022 may be performed before step 3021, or step 3022 may be performed simultaneously with step 3021, and so on.
  • In the embodiments of the present invention, the target point in the image is determined first, then the flying direction and the flying distance of the aircraft are determined according to the target point, and the flight of the aircraft is controlled based on the flying direction and the flying distance, so that the aircraft flies to the point in the actual scene space that corresponds to the determined target point in the image. The aircraft can therefore be accurately controlled to fly to the target point entered by the user and stopped immediately after it reaches that point, achieving the effect of flying to wherever the user points; the user does not need to operate the aircraft manually to stop it after it has flown to the desired position, which simplifies operation, reduces the possibility of the aircraft flying away and being lost, and effectively improves the user experience.
  • FIG. 7 is a schematic diagram of a flight control device for an aircraft according to an embodiment of the present invention.
  • the flight control device 70 of the aircraft may be configured in various terminal devices, for example, the terminal device 20 in FIG. 1.
  • the terminal device is communicatively connected to the aircraft, and the aircraft includes a gimbal and a photographing device mounted on the gimbal.
  • the flight control device 70 of the aircraft includes a first determination module 701, a second determination module 702, a third determination module 703, and a flight control module 704.
  • the first determining module 701 is configured to determine a target point in the image.
  • the image is an image captured by the shooting device when the aircraft is at the current position.
  • the first determining module 701 is specifically configured to: obtain a target point selected by a user on a screen of the terminal device; and convert the selected target point into a target point in an image.
  • the first determining module 701 receives an input operation of a user to obtain a target point selected by the user on a screen of the terminal device.
  • the input operation includes: tapping the screen, touching the screen, or inputting the coordinates of the selected target point, and the like.
  • the second determining module 702 is configured to determine a flight direction of the aircraft according to the target point.
  • the second determination module 702 includes a focal length acquisition unit 7021, a pixel difference determination unit 7022, and a flight direction determination unit 7023.
  • the focal length acquisition unit 7021 is configured to acquire a focal length of the photographing device.
  • the focal length obtaining unit 7021 is specifically configured to: determine the pixels corresponding to the field of view of the photographing device according to a preset correspondence between field-of-view angles and pixels; and determine the focal length of the photographing device according to the field of view of the photographing device and the pixels corresponding to the field of view of the photographing device.
  • A mapping table indicating the preset correspondence between field-of-view angles and pixels can be pre-configured in the terminal device; the pixels corresponding to the field-of-view angle of the shooting device can then be obtained by reading the mapping table configured in the terminal device. For example, assuming that the FOV of the shooting device is 48 degrees, combining this with the preset correspondence between FOV and pixels gives a corresponding pixel count of 360. Then, according to the field-of-view angle of the photographing device and the pixels corresponding to it, the focal length of the photographing device can be determined.
  • in the calculation formula used by the focal length obtaining unit 7021 to determine the focal length of the photographing device, f is the focal length of the shooting device and FOV is the field-of-view angle of the shooting device.
  • In some other embodiments, the focal length acquisition unit 7021 may also acquire the focal length of the photographing device in other ways, for example: by receiving the focal length of the photographing device as input; by reading the focal length directly from the terminal device when the focal length is pre-configured in the terminal device; or by reading the focal length from the aircraft or other equipment when the focal length is pre-configured in the aircraft or that other equipment; and so on.
  • the pixel difference determining unit 7022 is configured to determine a pixel difference between a center point of the image and the target point according to the target point.
  • the flight direction determining unit 7023 is configured to determine a flight direction of the aircraft according to the focal length and a pixel difference between a center point of the image and the target point.
  • the flying direction of the aircraft is a direction in which the current position of the aircraft is directed to a target point in the image.
  • the flight direction determining unit 7023 can determine the deflection angle α according to the focal length f and the pixel difference Δx between the center point of the image and the target point, where the deflection angle α is the angle by which the flight direction of the aircraft is deflected relative to the optical axis direction of the photographing device; the flying direction of the aircraft can thus be determined.
  • in the formula used by the flight direction determination unit 7023 to determine the deflection angle α, α is the deflection angle, Δx is the pixel difference between the center point of the image and the target point, and f is the focal length of the photographing device.
  • the third determining module 703 is configured to determine a flying distance of the aircraft according to the target point.
  • the third determining module 703 includes an attitude information obtaining unit 7031, a flying height obtaining unit 7032, a deflection angle obtaining unit 7033, and a flying distance determining unit 7034.
  • the attitude information obtaining unit 7031 is configured to obtain the attitude information of the gimbal.
  • Specifically, the attitude information of the gimbal is first acquired by an attitude acquisition sensor provided on the gimbal and is then sent to the terminal device, so that the terminal device obtains the attitude information.
  • the attitude acquisition sensor may be an IMU and the like.
  • the attitude information includes an attitude angle.
  • The attitude angle of the gimbal is expressed by Euler angles; that is, the attitude angle of the gimbal is described by the Euler angles (θ, ψ, φ), where θ is the pitch angle in the attitude angle of the gimbal, ψ is the yaw angle in the attitude angle of the gimbal, and φ is the roll angle in the attitude angle of the gimbal.
  • the flying height obtaining unit 7032 is configured to obtain a flying height of the aircraft.
  • the flying height of the aircraft refers to the height of the current flight position of the aircraft relative to the take-off point of the aircraft. Specifically, the flying height of the aircraft may be determined according to a difference in height between a first height of the current position of the aircraft relative to the ground and a second height of the take-off point of the aircraft relative to the ground.
  • the deflection angle obtaining unit 7033 is configured to determine a deflection angle of the aircraft according to the target point, where the deflection angle is an angle deflected by a flight direction of the aircraft relative to an optical axis direction of the photographing device.
  • the optical axis direction of the photographing device is determined by a pitch angle among attitude angles of the gimbal.
  • the flying distance determining unit 7034 is configured to determine a flying distance of the aircraft according to the attitude information, the flying height, and the deflection angle.
  • in the calculation formula used by the flying distance determining unit 7034 to determine the flying distance of the aircraft according to the attitude information, the flying height, and the deflection angle, L is the flight distance of the aircraft, θ is the pitch angle in the attitude angle of the gimbal, h is the flight height of the aircraft, and α is the deflection angle.
  • In this way, the flying distance of the aircraft is obtained; that is, the point in the actual scene space that corresponds to the target point in the image is determined, so that the aircraft flies the specified distance in the specified direction and achieves the effect of flying to wherever the user points.
  • the flight control module 704 is configured to control the flight of the aircraft according to the flight direction and flight distance.
  • The flight control module 704 is specifically configured to: generate a control instruction according to the flight direction and the flight distance and send the control instruction to the aircraft, so that the flight of the aircraft is controlled through the control instruction; or send the flight direction and the flight distance to the aircraft, so that the aircraft controls its flight through a control instruction generated from the flight direction and the flight distance.
  • The flight control device 70 of the aircraft can execute the flight control method of the aircraft provided by any of the method embodiments, and has the corresponding functional modules and beneficial effects for executing the method. For technical details not described in detail in this embodiment, refer to the flight control method of the aircraft provided in the method embodiments.
  • FIG. 8 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention. As shown in FIG. 8, the terminal device 80 includes at least one processor 801 and a memory 802 communicatively connected to the at least one processor; one processor 801 is taken as an example in FIG. 8.
  • the processor 801 and the memory 802 may be connected through a bus or other manners. In FIG. 8, the connection through the bus is taken as an example.
  • The memory 802, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the flight control method of an aircraft in the embodiments of the present invention (for example, the first determination module 701, the second determination module 702, the third determination module 703, and the flight control module 704 shown in FIG. 7).
  • The processor 801 executes various functional applications and data processing of the terminal device 80 by running the non-volatile software programs, instructions, and modules stored in the memory 802, thereby implementing the flight control method of the aircraft provided by the method embodiments.
  • the memory 802 may include a storage program area and a storage data area, where the storage program area may store an operating system and application programs required for at least one function; the storage data area may store data created according to the use of the terminal device 80 and the like.
  • the memory 802 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
  • the memory 802 may optionally include a memory remotely set relative to the processor 801, and these remote memories may be connected to the terminal device 80 through a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • The one or more modules are stored in the memory 802 and, when executed by the one or more processors 801, execute the flight control method of the aircraft in any of the method embodiments, for example, performing the method steps 301 to 303 in FIG. 3 described above and implementing the functions of modules 701 to 704 in FIG. 7.
  • The terminal device 80 can execute the flight control method of the aircraft provided by any of the method embodiments, and has the corresponding functional modules and beneficial effects for executing the method.
  • An embodiment of the present invention provides a computer program product.
  • the computer program product includes a computer program stored on a non-volatile computer-readable storage medium.
  • the computer program includes program instructions which, when executed by a computer, cause the computer to execute the flight control method of the aircraft in any of the method embodiments, for example, performing the method steps 301 to 303 in FIG. 3 described above to implement the functions of modules 701 to 704 in FIG. 7.
  • An embodiment of the present invention provides a non-volatile computer-readable storage medium that stores computer-executable instructions, where the computer-executable instructions are used to cause a computer to execute the flight control method of the aircraft in any of the method embodiments, for example, performing the method steps 301 to 303 in FIG. 3 described above to implement the functions of modules 701 to 704 in FIG. 7.
  • FIG. 9 is a schematic diagram of a flight control system according to an embodiment of the present invention.
  • The flight control system 90 includes an aircraft 901, the terminal device 80 described above, and a remote controller 902 that is connected to both the aircraft 901 and the terminal device 80, where the remote controller 902 and the aircraft 901 can communicate wirelessly, and the remote controller 902 and the terminal device 80 can be connected through USB.
  • The terminal device 80 is configured to control the flight of the aircraft 901 according to the determined flight direction and flight distance, so that the aircraft 901 flies to the point in the actual scene space that corresponds to the target point in the image, without requiring frequent operation of the joysticks of the remote controller 902; this reduces the tedium of controlling the flight of the aircraft 901 and effectively improves the user experience.
  • the terminal device 80 includes, but is not limited to, a smart phone, a tablet, a personal computer, a wearable device, and the like.
  • the remote controller 902 serves as an intermediate device between the aircraft 901 and the terminal device 80, and is used for transferring data, information, or instructions.
  • the remote controller 902 is not necessary, that is, the terminal device 80 directly communicates with the aircraft 901 to implement flight control of the aircraft 901.
  • The device embodiments described above are merely illustrative. The modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules; they may be located in one place or distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the embodiments can be implemented by means of software plus a general hardware platform, and of course, also by hardware.
  • The program may be stored in a computer-readable storage medium, and when the program is executed, the processes of the embodiments of the above methods may be included.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A flight control method and device for an aircraft, a terminal device, and a flight control system. The flight control method is applied to a terminal device (20) that is communicatively connected to an aircraft (10); the aircraft (10) includes a gimbal and a shooting device mounted on the gimbal. The flight control method includes: determining a target point in an image, the image being an image captured by the shooting device when the aircraft (10) is at its current position; determining the flight direction and the flight distance of the aircraft (10) according to the target point; and controlling the flight of the aircraft (10) according to the flight direction and the flight distance. The aircraft (10) can be accurately controlled to fly to the target point input by the user and controlled to stop flying immediately after it reaches the target point, thereby achieving the effect of flying to wherever the user points, so that the user does not have to operate the aircraft manually to stop it after it has flown to the desired position, which effectively improves the user experience.

Description

Flight control method and device for aircraft, terminal device, and flight control system
CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to Chinese patent application No. 201811033713.2, filed on September 5, 2018 and entitled "Flight control method and device for aircraft, terminal device, and flight control system", the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
Embodiments of the present invention relate to the technical field of aircraft, and in particular to a flight control method for an aircraft, a flight control device for an aircraft, a terminal device, and a flight control system.
BACKGROUND
In recent years, aircraft such as unmanned aerial vehicles (Unmanned Aerial Vehicle, UAV), commonly referred to as drones, have been applied in many fields, such as aerial photography, agriculture, plant protection, selfie capture, express delivery, and disaster relief. In practical applications of an aircraft, the user can manually input a target point and control the flight of the aircraft so that the aircraft flies to the location specified by the user to complete a specified task; this method of controlling flight is called "pointing flight". Taking disaster relief as an example, the aircraft's shooting device is used to take aerial photographs of the affected area and to search for and rescue victims. When a victim is found trapped somewhere, the pointing flight function can be used: the user inputs a target point with a finger on an input interface such as the screen of the aircraft's remote controller, and the aircraft is controlled to fly to that target point, i.e., the victim's location, so that various kinds of help, such as food and medicine, can be provided as soon as possible.
In the process of implementing the present invention, the inventor found that the related art has at least the following problems: with the current pointing flight control, the aircraft is controlled to fly continuously in a predetermined direction according to the target point input by the user; however, when the aircraft reaches the target position it does not stop, but continues to fly. That is, the existing pointing flight function only controls the flight direction of the aircraft and cannot control its flight distance. After the aircraft flies to the desired position, the user still needs to operate the aircraft manually to stop it, so the effect of flying to wherever the user points cannot truly be achieved.
SUMMARY
Embodiments of the present invention provide a flight control method for an aircraft, a flight control device for an aircraft, a terminal device, and a flight control system, which can truly achieve the effect of flying to wherever the user points.
Embodiments of the present invention disclose the following technical solutions:
In a first aspect, an embodiment of the present invention provides a flight control method for an aircraft, applied to a terminal device, the terminal device being communicatively connected to the aircraft, and the aircraft including a gimbal and a shooting device mounted on the gimbal, the method including:
determining a target point in an image, the image being an image captured by the shooting device when the aircraft is at its current position;
determining the flight direction of the aircraft and the flight distance of the aircraft according to the target point;
controlling the flight of the aircraft according to the flight direction and the flight distance.
Optionally, determining the flight direction of the aircraft according to the target point specifically includes:
obtaining the focal length of the shooting device;
determining the pixel difference between the center point of the image and the target point according to the target point;
determining the flight direction of the aircraft according to the focal length and the pixel difference between the center point of the image and the target point.
Optionally, obtaining the focal length of the shooting device specifically includes:
determining the pixels corresponding to the field-of-view angle of the shooting device according to a preset correspondence between field-of-view angles and pixels;
determining the focal length of the shooting device according to the field-of-view angle of the shooting device and the pixels corresponding to the field-of-view angle of the shooting device.
Optionally, determining the flight distance of the aircraft according to the target point specifically includes:
obtaining the attitude information of the gimbal;
obtaining the flight height of the aircraft;
determining the deflection angle of the aircraft according to the target point, the deflection angle being the angle by which the flight direction of the aircraft is deflected relative to the optical axis direction of the shooting device;
determining the flight distance of the aircraft according to the attitude information, the flight height, and the deflection angle.
Optionally, the attitude information includes an attitude angle.
Optionally, the calculation formula for determining the flight distance of the aircraft according to the attitude information, the flight height, and the deflection angle is:
Figure PCTCN2019103096-appb-000001
where L is the flight distance of the aircraft; θ is the pitch angle in the attitude angle of the gimbal; h is the flight height of the aircraft; and α is the deflection angle.
Optionally, the optical axis direction of the shooting device is determined by the pitch angle in the attitude angle of the gimbal.
Optionally, determining the target point in the image specifically includes:
obtaining the target point selected by the user on the screen of the terminal device;
converting the selected target point into a target point in the image.
In a second aspect, an embodiment of the present invention provides a flight control device for an aircraft, configured on a terminal device, the terminal device being communicatively connected to the aircraft, and the aircraft including a gimbal and a shooting device mounted on the gimbal, the device including:
a first determining module, configured to determine a target point in an image, the image being an image captured by the shooting device when the aircraft is at its current position;
a second determining module, configured to determine the flight direction of the aircraft according to the target point;
a third determining module, configured to determine the flight distance of the aircraft according to the target point;
a flight control module, configured to control the flight of the aircraft according to the flight direction and the flight distance.
Optionally, the second determining module includes:
a focal length obtaining unit, configured to obtain the focal length of the shooting device;
a pixel difference determining unit, configured to determine the pixel difference between the center point of the image and the target point according to the target point;
a flight direction determining unit, configured to determine the flight direction of the aircraft according to the focal length and the pixel difference between the center point of the image and the target point.
Optionally, the focal length obtaining unit is specifically configured to:
determine the pixels corresponding to the field-of-view angle of the shooting device according to a preset correspondence between field-of-view angles and pixels;
determine the focal length of the shooting device according to the field-of-view angle of the shooting device and the pixels corresponding to the field-of-view angle of the shooting device.
Optionally, the third determining module includes:
an attitude information obtaining unit, configured to obtain the attitude information of the gimbal;
a flight height obtaining unit, configured to obtain the flight height of the aircraft;
a deflection angle obtaining unit, configured to determine the deflection angle of the aircraft according to the target point, the deflection angle being the angle by which the flight direction of the aircraft is deflected relative to the optical axis direction of the shooting device;
a flight distance determining unit, configured to determine the flight distance of the aircraft according to the attitude information, the flight height, and the deflection angle.
Optionally, the attitude information includes an attitude angle.
Optionally, the calculation formula used by the flight distance determining unit to determine the flight distance of the aircraft according to the attitude information, the flight height, and the deflection angle is:
Figure PCTCN2019103096-appb-000002
where L is the flight distance of the aircraft; θ is the pitch angle in the attitude angle of the gimbal; h is the flight height of the aircraft; and α is the deflection angle.
Optionally, the optical axis direction of the shooting device is determined by the pitch angle in the attitude angle of the gimbal.
Optionally, the first determining module is specifically configured to:
obtain the target point selected by the user on the screen of the terminal device;
convert the selected target point into a target point in the image.
In a third aspect, an embodiment of the present invention provides a terminal device, including:
at least one processor; and
a memory communicatively connected to the at least one processor, where
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the flight control method of an aircraft as described above.
In a fourth aspect, an embodiment of the present invention provides a flight control system, including an aircraft and the terminal device as described above, the terminal device being connected to the aircraft.
In a fifth aspect, an embodiment of the present invention provides a computer program product. The computer program product includes a computer program stored on a non-volatile computer-readable storage medium; the computer program includes program instructions which, when executed by a computer, cause the computer to execute the flight control method of the aircraft as described above.
In a sixth aspect, an embodiment of the present invention further provides a non-volatile computer-readable storage medium. The computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions are used to cause a computer to execute the flight control method of the aircraft as described above.
In the embodiments of the present invention, the target point in the image is determined first, then the flight direction and the flight distance of the aircraft are determined according to the target point, and the flight of the aircraft is controlled based on the flight direction and the flight distance, so that the aircraft flies to the point in the actual scene space that corresponds to the determined target point in the image. The aircraft can thus be accurately controlled to fly to the target point input by the user and stopped immediately after it reaches that point, truly achieving the effect of flying to wherever the user points; the user does not need to operate the aircraft manually to stop it after it has flown to the desired position, which simplifies operation, reduces the possibility of the aircraft flying away and being lost, and effectively improves the user experience.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments are exemplarily described by the figures in the corresponding drawings. These exemplary descriptions do not constitute a limitation on the embodiments; elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures in the drawings are not drawn to scale.
FIG. 1 is a schematic diagram of an application environment of a flight control method for an aircraft according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of another application environment of a flight control method for an aircraft according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of a flight control method for an aircraft according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of calculating a flight distance according to an embodiment of the present invention;
FIG. 5 is a schematic flowchart of determining the flight direction of the aircraft according to the target point according to an embodiment of the present invention;
FIG. 6 is a schematic flowchart of determining the flight distance of the aircraft according to the target point according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a flight control device for an aircraft according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a flight control system according to an embodiment of the present invention.
DETAILED DESCRIPTION
The technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative work fall within the protection scope of the present invention.
In addition, the technical features involved in the various implementations of the present invention described below may be combined with each other as long as they do not conflict with each other.
FIG. 1 is a schematic diagram of one application environment of the flight control method for an aircraft provided by the present invention. The application environment includes an aircraft 10, a terminal device 20, and a user (not shown). The terminal device 20 is communicatively connected to the aircraft 10 to transmit data or information to and from the aircraft 10. The user can hold the terminal device 20 and perform various operations on the terminal device 20.
The aircraft 10 can be used to capture images (or videos, footage, or frames) and transmit the captured images to the terminal device 20, so that the images captured by the aircraft 10 are displayed on the screen of the terminal device 20. When holding the terminal device 20, the user can click or touch any point on the screen of the terminal device 20 as the target point for the aircraft 10 to fly to. After the user selects a point on the screen of the terminal device 20, the terminal device 20 can, through processing, obtain the coordinates of the target point selected by the user on the screen of the terminal device 20, and then convert, through a mapping relationship, the coordinates of the selected target point into the coordinates of the target point in the image, thereby determining the target point in the image. After determining the target point in the image, the terminal device 20 combines the state parameters of the aircraft 10 (such as the attitude information of the aircraft's gimbal and the flight height of the aircraft) with the fixed parameters of the aircraft 10 (such as the focal length of the aircraft's shooting device) to determine the flight direction and the flight distance of the aircraft 10, and then controls the aircraft 10 to fly to the point in the actual scene space that corresponds to the target point in the image. The aircraft can thus be accurately controlled to fly to the target point input by the user and stopped immediately after it reaches that point, achieving the effect of flying to wherever the user points; the user does not need to operate the aircraft manually to stop it after it has flown to the desired position, which simplifies operation, reduces the possibility of the aircraft being lost, and effectively improves the user experience.
Most current flight control of aircraft is done by operating the joysticks of a remote controller (for flight such as up/down, left/right, forward/backward, and rotation). To make the aircraft fly to a specified location, the joysticks of the remote controller must be operated frequently, which is cumbersome. To reduce this burden, pointing flight technology can be used to control the flight of the aircraft. Although pointing flight technology avoids frequent operation of the remote controller's joysticks, in current pointing flight technology, when the user selects a target point through the screen of a terminal device (for example, a remote controller), the terminal device processes the user's input and sends the target point to the vision module of the aircraft; the vision module combines the received data with the selected coordinate point to calculate the bearing of the target object, and the aircraft flies in the calculated direction. The aircraft cannot calculate the depth information of the target (such as the flight distance). Therefore, the aircraft keeps flying in the predetermined direction; even when it reaches the target position it does not stop but continues to fly, and the effect of flying to wherever the user points cannot truly be achieved. Moreover, if the aircraft keeps flying in the predetermined direction, there is a risk that it will fly away and be lost.
Compared with the above two flight control approaches, the flight control approach provided by the embodiments of the present invention, on the one hand, does not require frequent operation of the joysticks of the remote controller, which reduces the tedium of controlling the flight of the aircraft 10 and improves the user experience. On the other hand, controlling the flight of the aircraft 10 according to the flight direction and the flight distance, so that the aircraft 10 flies to the point in the actual scene space corresponding to the target point in the image, controls not only the direction in which the aircraft 10 flies but also the distance it needs to fly, achieving the effect of flying to wherever the user points; the user does not have to operate the aircraft 10 manually to stop it after it has flown to the desired position, which simplifies operation, reduces the possibility of the aircraft 10 being lost, and effectively improves the user experience.
下面分别对飞行器10及终端设备20进行具体描述。
该飞行器10可以为任何合适的飞行的器械,例如,飞行器10可以为无人机、无人船或其它可移动装置等等。以下对本发明的描述使用无人机(Unmanned Aerial Vehicle,UAV)作为飞行器10的示例。无人机是由遥控设备或自备程序控制装置操纵,带任务载荷的不载人航空器。该无人机可以为各种类型的无人机,例如,该无人机可以是旋翼飞行器(rotorcraft),例如,由多个推动装置通过空气推动的多旋翼飞行器,本发明的实施例并不限于此,无人机也可以是其它类型的无人机,如固 定翼无人机、无人飞艇、伞翼无人机、扑翼无人机等等。
无人机包括但不限于:机身、动力系统、飞控组件、云台、拍摄装置、图传模块等。其中,飞控组件、图传模块设置于机身内,动力系统、云台均安装于机身上,拍摄装置搭载于云台上。飞控组件可以与动力系统、云台、拍摄装置、图传模块进行耦合,以实现通信。
机身可以包括中心架以及与中心架连接的一个或多个机臂,一个或多个机臂呈辐射状从中心架延伸出。该机臂的数量可以为2个、4个、6个等等。一个或多个机臂用于承载动力系统。
动力系统可以包括电子调速器(简称为电调)、一个或多个螺旋桨以及与一个或多个螺旋桨相对应的一个或多个第一电机,其中第一电机连接在电子调速器与螺旋桨之间,第一电机和螺旋桨设置在对应的机臂上;电子调速器用于接收飞控组件产生的驱动信号,并根据驱动信号提供驱动电流给第一电机,以控制第一电机的转速。第一电机用于驱动螺旋桨旋转,从而为无人机的飞行提供动力,该动力使得无人机能够实现一个或多个自由度的运动,如前后运动、上下运动等等。在某些实施例中,无人机可以围绕一个或多个旋转轴旋转。例如,上述旋转轴可以包括横滚轴、平移轴和俯仰轴。可以理解的是,第一电机可以是直流电机,也可以交流电机。另外,第一电机可以是无刷电机,也可以有刷电机。
飞控组件具有对无人机的飞行和任务进行监控和操纵的能力,包含对无人机发射和回收控制的一组设备。飞控组件用于实现对无人机的飞行的控制。飞控组件可以包括飞行控制器和传感系统。传感系统用于测量无人机及无人机的各个部件的位置信息和状态信息等等,例如,三维位置、三维角度、三维速度、三维加速度和三维角速度、飞行高度等等。传感系统例如可以包括红外传感器、声波传感器、陀螺仪、电子罗盘、惯性测量单元(Inertial Measurement Unit,IMU)、视觉传感器、全球导航卫星系统和气压计等传感器中的至少一种。例如,全球导航卫星系统可以是全球定位系统(Global Positioning System,GPS)。飞行控制 器用于控制无人机的飞行。可以理解的是,飞行控制器可以按照预先编好的程序指令对无人机进行控制,也可以通过响应来自其它设备的一个或多个控制指令对无人机进行控制。例如,终端设备20根据飞行方向及飞行距离生成的控制指令,并将该控制指令发送给飞行控制器,飞行控制器接收该控制指令,以通过该控制指令控制无人机的飞行;或者,飞行控制器接收终端设备20发送的飞行方向及飞行距离,并根据飞行方向及飞行距离生成的控制指令,通过该控制指令控制无人机的飞行。
云台用于搭载拍摄装置。云台上设置有第二电机,飞控组件可以控制云台,具体的控制第二电机的运动(如转速),来调节无人机拍摄的图像的角度。其中,第二电机可以是无刷电机,也可以有刷电机。云台可以位于机身的顶部,也可以位于机身的底部。另外,在本发明实施例中,云台作为无人机的一部分,可以理解的是,在一些其它实施例中,云台可以独立于无人机。
拍摄装置可以是照相机、拍照手机或摄像机等用于采集图像的装置,拍摄装置可以与飞控组件通信,并在飞控组件的控制下进行拍摄。例如,飞控组件控制拍摄装置拍摄图像的拍摄频率,也即每单位时间内拍摄多少次。或者,飞控组件通过云台控制拍摄装置的拍摄图像的角度。
图传模块用于将天空中处于飞行状态的无人机的拍摄装置所拍摄的图像、画面或视频等实时稳定的发射给地面无线图传接收设备,如终端设备20等。
可以理解的是,上述对于无人机各组成部分的命名仅是出于标识的目的,并不应理解为对本发明的实施例的限制。
终端设备20可以为任何合适的电子设备。例如,智能手机、平板、个人计算机、可穿戴设备等等。终端设备20中包括通信模块,该通信模块用于与上述飞行器10进行通信,该通信模块可为无线通信模块,如WiFi模块、蓝牙模块、红外模块、通用分组无线服务(General Packet Radio Service,GPRS)模块等等。终端设备20还包括屏幕,该屏幕可 以实现输入输出功能。例如,通过该屏幕接收用户在该屏幕上选择目标点,以及通过该屏幕显示飞行器10所拍摄的图像、画面或视频等。
图2为本发明提供的飞行器的飞行控制方法的另一种应用环境的示意图。其中,该应用环境中还包括:遥控器30。该遥控器30分别与飞行器10及终端设备20通信连接。其中,遥控器30与飞行器10可以进行无线通信,遥控器30与终端设备20可以通过USB连接。
该遥控器30可以是任何合适的遥控设备。遥控器30为受地(舰)面或空中平台上的遥控单元通过飞控组件控制飞行的航空器。在本发明实施例中,该遥控器30用于进行数据、信息或指令的中转,例如,遥控器30接收飞行器10发送的数据或信息(如所述拍摄装置所拍摄的图像),并将该数据、信息或指令发送给终端设备20;或者,遥控器30接收终端设备20发送的数据或信息(如根据所述飞行方向及飞行距离生成的控制指令),并将该数据或信息发送给飞行器10。
通常在飞行器10进行飞行的过程中,飞行器10与终端设备20有一定的距离,特别是对于一些高空拍摄,飞行器10与终端设备20通常距离较远,为了实现远距离控制飞行器10的飞行,需要通过该遥控器30进行数据、信息或指令等的中转。
可以理解的是,在一些实施例中,该遥控器30不是必须的,也即,可以通过终端设备20直接给飞行器10发送控制指令,以实现对飞行器10的飞行的控制。
下面结合附图,对本发明实施例作进一步阐述。
Embodiment 1:
FIG. 3 is a schematic flowchart of an aircraft flight control method according to an embodiment of the present invention. The method is applicable to controlling the flight of various aircraft, for example the aircraft 10 in FIG. 1, and may be executed by various terminal devices, for example the terminal device 20 in FIG. 1. The terminal device is communicatively connected to the aircraft, and the aircraft includes a gimbal and a camera carried on the gimbal.
Referring to FIG. 3, the aircraft flight control method includes the following steps.
301: Determine a target point in an image.
The image is the image captured by the camera while the aircraft is at its current position. After the aircraft establishes a connection with the terminal device, the aircraft may send the image to the terminal device. On receiving the image, the terminal device performs some processing on it, such as scaling, so that it fits the display of the terminal device's screen, and the processed image is shown on that screen. When the user holds the terminal device, the user can select a target point through the screen, so that the target point in the image can be determined.
Specifically, determining the target point in the image by the terminal device includes: obtaining the target point selected by the user on the screen of the terminal device, and converting the selected point into the target point in the image.
Taking FIG. 4 as an example, the terminal device receives an input operation from the user to obtain the target point selected on its screen. For example, the user may select a point P0 on the screen as the target point by clicking or touching the screen. After the target point P0 is selected, the terminal device can compute the coordinates (x0, y0) of P0 in the plane coordinate system of the screen, that is, the coordinates (x0, y0) represent the position of the selected target point P0. The selected target point P0 is then converted into the target point P1 in the image, which amounts to converting the coordinates (x0, y0) into the coordinates (x1, y1) of P1 in the corresponding image coordinate system.
In some other embodiments, the user's input operation may also consist of directly entering the coordinates (x0, y0) of P0.
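How the screen coordinates are mapped to image coordinates is not spelled out beyond the mention of a mapping relationship; the sketch below shows one plausible mapping, under the assumption that the screen preview is simply the camera image scaled per axis to fill the display. The function name, the 1920x1080 screen and the 480x360 image size are illustrative (the image size is borrowed from the worked example later in this embodiment).

```python
def screen_to_image(x0, y0, screen_w, screen_h, image_w=480, image_h=360):
    """Map a point tapped on the terminal device's screen to image coordinates.

    Assumes the camera image is shown stretched to fill the whole screen,
    so the mapping is a simple per-axis rescaling.
    """
    x1 = x0 * image_w / screen_w
    y1 = y0 * image_h / screen_h
    return x1, y1

# A tap at (960, 540) on a 1920x1080 screen maps to (240.0, 180.0),
# i.e. the center of a 480x360 image.
print(screen_to_image(960, 540, 1920, 1080))
```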
302: Determine the flight direction of the aircraft and the flight distance of the aircraft according to the target point.
As shown in FIG. 5, determining the flight direction of the aircraft from the target point by the terminal device includes the following steps.
3021: Obtain the focal length of the camera.
In one implementation, obtaining the focal length of the camera by the terminal device includes: determining, according to a preset correspondence between field of view and pixels, the number of pixels corresponding to the field of view of the camera; and determining the focal length of the camera according to its field of view and the number of pixels corresponding to that field of view.
The field of view (FOV) of the camera is the angle formed at the center of the camera lens by the two edges of the imaging plane. The FOV determines how much of the scene the camera can see: the larger the FOV, the wider the view. Since the imaging plane has four edges, there is an FOV in each of two directions; in this embodiment of the invention, the FOV refers to the one in the pitch direction (angle AOB in FIG. 4).
A mapping table representing the preset correspondence between field of view and pixels may be configured in the terminal device in advance; once the FOV of the camera has been obtained, the number of pixels corresponding to that FOV can be looked up in the table. For example, if the FOV of the camera is 48 degrees, the preset correspondence may give 360 as the corresponding number of pixels. The focal length of the camera can then be determined from the FOV and the number of pixels corresponding to it.
Specifically, the focal length of the camera is calculated as:
f = p / (2 · tan(FOV / 2))
where f is the focal length of the camera, FOV is the field of view of the camera, and p is the number of pixels corresponding to the field of view. For example, with FOV = 48 and p = 360, f = 180 / tan 24°.
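As a quick cross-check of the worked example (FOV = 48 degrees, p = 360 pixels), the relation above can be evaluated with a few lines of Python; this is an illustrative snippet, not code from the patent.

```python
import math

def focal_length_pixels(fov_deg, p):
    """Focal length in pixels, from the field of view (in degrees) and the
    number of pixels p spanned by that field of view."""
    return (p / 2) / math.tan(math.radians(fov_deg) / 2)

# FOV = 48 degrees corresponding to 360 pixels, as in the example above:
print(round(focal_length_pixels(48, 360), 1))   # 180 / tan(24 deg) ~= 404.3
```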
In some other embodiments, the terminal device may also obtain the focal length of the camera in other ways: the focal length may be received as input; it may be preconfigured in the terminal device and read directly from it; or it may be preconfigured in the aircraft or in another device, from which the terminal device reads it; and so on.
3022: Determine, according to the target point, the pixel difference between the center point of the image and the target point.
Taking FIG. 4 as an example, the pixel difference Δx between the center point of the image and the target point refers to the pixel difference along the y-axis of the image coordinate system. The pixel difference Δx is determined from the coordinates of the image center point C and the coordinates of the target point P1. For example, assuming the center point is (480/2, 360/2) and the coordinates of the target point P1 are (x1, y1), the pixel difference between the center point of the image and the target point is Δx = |y1 - 360/2| = |y1 - 180|, where the absolute value is taken to ensure that Δx is positive.
3023: Determine the flight direction of the aircraft according to the focal length and the pixel difference between the center point of the image and the target point.
The flight direction of the aircraft is the direction from the aircraft's current position toward the target point in the image. Taking FIG. 4 as an example, the flight direction is the direction from point O toward point P1, where O is the current position of the aircraft. From the focal length f and the pixel difference Δx between the image center C and the target point P1, the deflection angle α can be determined, where α is the angle by which the flight direction of the aircraft deviates from the direction of the optical axis of the camera; the flight direction of the aircraft is thereby determined.
Specifically, the deflection angle α is given by:
α = arctan(Δx / f)
where α is the deflection angle, Δx is the pixel difference between the image center C and the target point P1, and f is the focal length of the camera.
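Continuing the sketch, the deflection angle follows from the pixel difference and the focal length; the 360-pixel image height and the sample target coordinate below are assumptions carried over from the worked example.

```python
import math

def deflection_angle_deg(y1, f, image_h=360):
    """Deflection angle of the flight direction from the optical axis, from the
    y-coordinate of the target point P1 and the focal length f (both in pixels)."""
    dx = abs(y1 - image_h / 2)          # pixel difference to the image center
    return math.degrees(math.atan(dx / f))

f = 404.3                               # from the FOV = 48 deg, p = 360 example
print(round(deflection_angle_deg(300, f), 1))   # 120 px from center -> ~16.5 deg
```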
As shown in FIG. 6, determining the flight distance of the aircraft from the target point by the terminal device includes the following steps.
3024: Obtain the attitude information of the gimbal.
Specifically, an attitude sensor mounted on the gimbal first measures the attitude information of the gimbal and sends it to the terminal device, so that the terminal device obtains the attitude information.
The attitude sensor may be an inertial measurement unit (IMU) or the like. An IMU is a sensor that measures attitude information such as the three-axis attitude angles (or angular rates) and the acceleration of an object. IMUs are commonly six-axis or nine-axis. A six-axis IMU contains three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the acceleration signals of the object along the three independent axes of the body coordinate system, the gyroscopes detect the angular velocity of the body relative to the navigation coordinate system, and from the angular velocity and acceleration measured in three-dimensional space the attitude angles of the object are resolved. A nine-axis IMU contains three single-axis accelerometers, three single-axis gyroscopes and three single-axis magnetometers; its accelerometers and gyroscopes are similar to those of the six-axis IMU, and its magnetometers detect the component of the geomagnetic field in the horizontal plane of the inertial frame, a component that always points north.
The attitude information includes attitude angles. The attitude angles of the gimbal are expressed as Euler angles, that is, the Euler angles (θ, ψ, φ) describe the attitude of the gimbal, where θ is the pitch angle, ψ is the yaw angle and φ is the roll angle of the gimbal.
3025: Obtain the flight height of the aircraft.
The flight height of the aircraft refers to the height of the aircraft's current flight position relative to the takeoff point of the aircraft.
In one implementation, at least one ranging sensor may be provided on the aircraft to measure a first height of the current position relative to the ground, and the aircraft sends this first height to the terminal device so that the terminal device obtains it. The ranging sensor may include, but is not limited to, an ultrasonic sensor, an infrared sensor, a microwave sensor, and so on. A second height of the aircraft's takeoff point relative to the ground can be measured by the ranging sensor when the aircraft takes off; to keep the aircraft safe, this second height is usually about 2 meters. The aircraft likewise sends the second height to the terminal device. The difference between the first height and the second height is the flight height of the aircraft.
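The height bookkeeping described above reduces to a subtraction; a small sketch follows, with the roughly 2-meter takeoff clearance from the text used as a default and everything else illustrative.

```python
def flight_height(current_agl_m, takeoff_agl_m=2.0):
    """Flight height of the aircraft relative to its takeoff point.

    current_agl_m: height above ground at the current position, e.g. from an
                   ultrasonic, infrared or microwave ranging sensor (meters).
    takeoff_agl_m: height of the takeoff point above ground, measured at
                   takeoff (about 2 m in the text above).
    """
    return current_agl_m - takeoff_agl_m

print(flight_height(52.0))   # 50.0 meters above the takeoff point
```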
3026: Determine the deflection angle of the aircraft according to the target point.
The deflection angle is the angle by which the flight direction of the aircraft deviates from the direction of the optical axis of the camera. As described in step 3023, the terminal device determines the deflection angle of the aircraft from the target point by computing α from the focal length f and the pixel difference Δx between the image center C and the target point P1.
3027: Determine the flight distance of the aircraft according to the attitude information, the flight height and the deflection angle.
As before, the deflection angle is the angle by which the flight direction deviates from the direction of the optical axis of the camera, and the attitude information includes the attitude angles. The direction of the optical axis of the camera is determined by the pitch angle of the gimbal. As shown in FIG. 4, the angle between the optical axis of the camera and the pitch-axis direction is the pitch angle of the gimbal; by adjusting this pitch angle, the direction of the optical axis, and hence the angle at which the camera captures images, can be adjusted.
The flight distance of the aircraft is calculated from the attitude information, the flight height and the deflection angle as:
L = h / sin(θ + α)
where L is the flight distance of the aircraft, θ is the pitch angle of the gimbal, h is the flight height of the aircraft, and α is the deflection angle.
With the above formula, the flight distance of the aircraft is obtained, that is, the point P2 in the real scene corresponding to the target point in the image is determined, so that the aircraft flies the specified distance (L in FIG. 4) in the specified direction (from O toward P1 in FIG. 4), reaches the position of P2, and the effect of flying wherever the user points is achieved.
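Tying the quantities together, the sketch below evaluates the distance formula. Treating the depression angle of the flight line as θ + α is an assumption that holds when the tapped target appears below the image center (the original FIG. 4 is not reproduced in this text), and the sample numbers are illustrative.

```python
import math

def flight_distance(pitch_deg, height_m, deflection_deg):
    """Flight distance L from the aircraft O to the ground point P2.

    pitch_deg:      gimbal pitch angle theta, in degrees below the horizontal.
    height_m:       flight height h of the aircraft, in meters.
    deflection_deg: deflection angle alpha of the flight direction from the
                    optical axis, in degrees.

    Assumes the target appears below the image center, so the flight line
    dips theta + alpha below the horizontal (see the caveat above).
    """
    return height_m / math.sin(math.radians(pitch_deg + deflection_deg))

# Gimbal pitched 30 deg down, flying 50 m above the takeoff plane,
# target deflected 16.5 deg from the optical axis:
print(round(flight_distance(30.0, 50.0, 16.5), 1))   # ~68.9 m
```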
303: Control the flight of the aircraft according to the flight direction and the flight distance.
Controlling the flight of the aircraft by the terminal device according to the flight direction and the flight distance includes either of the following: the terminal device generates a control instruction from the flight direction and the flight distance and sends the control instruction to the aircraft, so that the instruction controls the flight of the aircraft; or the terminal device sends the flight direction and the flight distance to the aircraft, and the aircraft generates a control instruction from them and controls its flight through that instruction.
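The exact format of the control instruction is not specified; the snippet below merely illustrates the first alternative (the terminal device builds the instruction itself) with a hypothetical JSON message whose field names are invented for the example and do not belong to any real flight-controller API.

```python
import json

def build_pointing_command(deflection_deg, distance_m):
    """Build an illustrative 'fly this way, this far' instruction.

    The message layout is hypothetical; an actual flight controller would
    define its own protocol for receiving the direction and the distance.
    """
    return json.dumps({
        "type": "pointing_flight",
        "deflection_deg": deflection_deg,  # flight direction relative to the optical axis
        "distance_m": distance_m,          # stop after covering this distance
    })

print(build_pointing_command(16.5, 68.9))
```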
It should be noted that, as a person of ordinary skill in the art will understand from the description of the embodiments of the present invention, steps 3021-3023 and steps 3024-3027 may be executed in different orders in different embodiments provided there is no contradiction; for example, step 3022 may be executed before step 3021, or step 3022 and step 3021 may be executed at the same time, and so on.
In this embodiment of the invention, the target point in the image is first determined, the flight direction of the aircraft and the flight distance of the aircraft are then determined from the target point, and the flight of the aircraft is controlled on the basis of the flight direction and the flight distance, so that the aircraft flies to the point in the real scene corresponding to the determined target point in the image. The aircraft can therefore be accurately controlled to fly to the target point entered by the user and be stopped as soon as it reaches that point, truly achieving the effect of flying wherever the user points: the user no longer needs to operate the aircraft manually again to stop it after it reaches the desired position, operation is more convenient, the risk of the aircraft flying away and being lost is reduced, and the user experience is effectively improved.
实施例2:
图7为本发明实施例提供的一种飞行器的飞行控制装置示意图。该飞行器的飞行控制装置70可配置于各种终端设备中,例如,配置于图1中的终端设备20中。所述终端设备与所述飞行器通信连接,所述飞行器包括云台及搭载于所述云台上的拍摄装置。
参照图7,所述飞行器的飞行控制装置70包括:第一确定模块701、第二确定模块702、第三确定模块703以及飞行控制模块704。
具体的,第一确定模块701用于确定图像中的目标点。
其中,所述图像为飞行器在当前位置时所述拍摄装置所拍摄的图像。第一确定模块701具体用于:获取用户在所述终端设备的屏幕上所选择的目标点;将所述所选择的目标点转换为图像中的目标点。
第一确定模块701接收用户的输入操作以获取用户在所述终端设备的屏幕上所选择的目标点。其中该输入操作包括:点击屏幕、触摸屏幕或者输入所选择的目标点的坐标等等。
具体的,第二确定模块702用于根据所述目标点确定所述飞行器的 飞行方向。
其中,第二确定模块702包括焦距获取单元7021、像素差确定单元7022以及飞行方向确定单元7023。
其中,焦距获取单元7021用于获取所述拍摄装置的焦距。
在一种实现方式中,焦距获取单元7021具体用于:根据预设的视场角与像素的对应关系,确定所述拍摄装置的视场角所对应的像素;根据所述拍摄装置的视场角及所述拍摄装置的视场角所对应的像素,确定所述拍摄装置的焦距。
其中,用于表示该预设的视场角与像素的对应关系的映射关系表可以预先配置于终端设备中,当焦距获取单元7021获取得到拍摄装置的视场角后可以,便通过读取配置于终端设备中的映射关系表得到所述拍摄装置的视场角所对应的像素。例如,假设拍摄装置的视场角FOV为48度,结合预设的视场角与像素的对应关系,可以得到其对应的像素为360。然后,再根据所述拍摄装置的视场角及所述拍摄装置的视场角所对应的像素,便可确定所述拍摄装置的焦距。
具体的,焦距获取单元7021确定所述拍摄装置的焦距的计算公式如下所示:
f = p / (2 · tan(FOV / 2))
其中,f表示为拍摄装置的焦距;FOV表示为拍摄装置的视场角;p表示为拍摄装置的视场角所对应的像素。例如,假设FOV=48,其对应的像素p=360,则f=180/tan24。
在一些其它实施例中,焦距获取单元7021获取所述拍摄装置的焦距的方式还可以包括:通过接收输入的拍摄装置的焦距以获取所述拍摄装置的焦距;所述拍摄装置的焦距预先配置终端设备中,焦距获取单元7021直接从终端设备读取所述拍摄装置的焦距;所述拍摄装置的焦距预先配置于飞行器或其它设备中,焦距获取单元7021从飞行器或其它设备中读取所述拍摄装置的焦距等等。
其中,像素差确定单元7022用于根据所述目标点确定所述图像的中心点与所述目标点的像素差。
其中,飞行方向确定单元7023用于根据所述焦距及所述图像的中心点与所述目标点的像素差,确定所述飞行器的飞行方向。
飞行器的飞行方向为飞行器当前所在位置指向图像中的目标点的方向。飞行方向确定单元7023根据焦距f、以及所述图像的中心点与所述目标点的像素差Δx,可以确定偏转角度α,其中,所述偏转角度α为所述飞行器的飞行方向相对于所述拍摄装置的光轴方向所偏转的角度,从而可以确定所述飞行器的飞行方向。
具体的,飞行方向确定单元7023确定偏转角度α的公式如下所示:
α = arctan(Δx / f)
其中,α表示为偏转角度;Δx表示为所述图像的中心点与所述目标点的像素差;f表示为拍摄装置的焦距。
具体的,第三确定模块703用于根据所述目标点确定所述飞行器的飞行距离。
其中,第三确定模块703包括:姿态信息获取单元7031、飞行高度获取单元7032、偏转角度获取单元7033以及飞行距离确定单元7034。
其中,姿态信息获取单元7031用于获取所述云台的姿态信息。
姿态信息获取单元7031获取云台的姿态信息具体包括:首先由设置于云台上的姿态采集传感器得到云台的姿态信息,并将该云台的姿态信息发送至终端设备,以使终端设备获取得到该姿态信息。其中,姿态采集传感器可以为IMU等。
所述姿态信息包括姿态角。所述云台的姿态角用欧拉角表示,也即通过欧拉角(θ,ψ,φ)描述云台的姿态角。θ表示为所述云台的姿态角中的俯仰角,ψ表示为所述云台的姿态角中的偏航角,φ表示为所述云台的姿态角中的翻滚角。
其中,飞行高度获取单元7032用于获取所述飞行器的飞行高度。
其中,所述飞行器的飞行高度是指飞行器的当前飞行位置相对于飞行器的起飞点的高度。具体的,可以根据飞行器当前位置相对于地面的第一高度与飞行器的起飞点相对于地面的第二高度的高度差以确定所述飞行器的飞行高度。
其中,偏转角度获取单元7033用于根据所述目标点确定所述飞行器的偏转角度,所述偏转角度为所述飞行器的飞行方向相对于所述拍摄装置的光轴方向所偏转的角度。所述拍摄装置的光轴方向由所述云台的姿态角中的俯仰角所确定。
其中,飞行距离确定单元7034用于根据所述姿态信息、所述飞行高度及偏转角度,确定所述飞行器的飞行距离。
飞行距离确定单元7034根据所述姿态信息、所述飞行高度及所述偏转角度确定所述飞行器的飞行距离的计算公式为:
L = h / sin(θ + α)
其中,L表示为所述飞行器的飞行距离;θ表示为所述云台的姿态角中的俯仰角;h表示为所述飞行器的飞行高度;α表示为偏转角度。
通过上述公式,即可得到飞行器的飞行距离,也即确定图像中的目标点在实际场景空间下所对应的点,以便飞行器朝指定方向飞行指定距离,实现指哪飞哪的效果。
具体的,飞行控制模块704用于根据所述飞行方向及飞行距离,控制所述飞行器的飞行。
飞行控制模块704具体用于:终端设备根据所述飞行方向及飞行距离生成的控制指令,并将该控制指令发送给飞行器,以通过该控制指令控制飞行器的飞行;或者,终端设备将所述飞行方向及飞行距离发送给飞行器,以使飞行器通过所述飞行方向及飞行距离生成的控制指令,以通过该控制指令控制飞行器的飞行。
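For readers who prefer code to prose, the module split of the apparatus 70 can be mirrored by a small illustrative class. The class and method names are free renderings of the module names above, not identifiers from any real SDK, and the θ + α depression angle is the same assumption as in the method embodiment.

```python
import math

class FlightControlApparatus:
    """Illustrative grouping of the determination and control modules of apparatus 70."""

    def __init__(self, fov_deg=48.0, p=360, image_h=360):
        # Focal length in pixels, as obtained by the focal length obtaining unit.
        self.f = (p / 2) / math.tan(math.radians(fov_deg) / 2)
        self.image_h = image_h

    def determine_direction(self, y1):
        """Second determination module: deflection angle from the image point."""
        dx = abs(y1 - self.image_h / 2)
        return math.degrees(math.atan(dx / self.f))

    def determine_distance(self, pitch_deg, height_m, deflection_deg):
        """Third determination module: distance from pitch, height and deflection.
        Assumes the target appears below the image center (theta + alpha)."""
        return height_m / math.sin(math.radians(pitch_deg + deflection_deg))

    def control_flight(self, y1, pitch_deg, height_m):
        """Flight control module: combine direction and distance into one command."""
        alpha = self.determine_direction(y1)
        return {"deflection_deg": alpha,
                "distance_m": self.determine_distance(pitch_deg, height_m, alpha)}

print(FlightControlApparatus().control_flight(y1=300, pitch_deg=30.0, height_m=50.0))
```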
需要说明的是,在本发明实施例中,所述飞行器的飞行控制装置70可执行任意方法实施例所提供的飞行器的飞行控制方法,具备执行方法 相应的功能模块和有益效果。未在飞行器的飞行控制装置70的实施例中详尽描述的技术细节,可参见方法实施例所提供的飞行器的飞行控制方法。
实施例3:
图8是本发明实施例提供的终端设备硬件结构示意图,如图8所示,所述终端设备80包括:
一个或多个处理器801以及存储器802,图8中以一个处理器801为例。
处理器801和存储器802可以通过总线或者其他方式连接,图8中以通过总线连接为例。
存储器802作为一种非易失性计算机可读存储介质,可用于存储非易失性软件程序、非易失性计算机可执行程序以及模块,如本发明实施例中的飞行器的飞行控制方法对应的程序指令/模块(例如,附图7所示的第一确定模块701、第二确定模块702、第三确定模块703以及飞行控制模块704)。处理器801通过运行存储在存储器802中的非易失性软件程序、指令以及模块,从而执行终端设备80的各种功能应用以及数据处理,即实现所述方法实施例所提供的飞行器的飞行控制方法。
存储器802可以包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需要的应用程序;存储数据区可存储根据终端设备80使用所创建的数据等。此外,存储器802可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。在一些实施例中,存储器802可选包括相对于处理器801远程设置的存储器,这些远程存储器可以通过网络连接至终端设备80。所述网络的实施例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
所述一个或者多个模块存储在所述存储器802中,当被所述一个或 者多个处理器801执行时,执行所述任意方法实施例中的飞行器的飞行控制方法,例如,执行以上描述的图3中的方法步骤301-步骤303,实现图7中的701-704模块的功能。
所述终端设备80可执行任意方法实施例所提供的飞行器的飞行控制方法,具备执行方法相应的功能模块和有益效果。未在终端设备实施例中详尽描述的技术细节,可参见任意方法实施例所提供的飞行器的飞行控制方法。
本发明实施例提供一种计算机程序产品,所述计算机程序产品包括存储在非易失性计算机可读存储介质上的计算机程序,所述计算机程序包括程序指令,当所述程序指令被计算机执行时,使所述计算机执行所述任意方法实施例中的飞行器的飞行控制方法,例如,执行以上描述的图3中的方法步骤301-步骤303,实现图7中的701-704模块的功能。
本发明实施例提供一种非易失性计算机可读存储介质,所述计算机可读存储介质存储有计算机可执行指令,所述计算机可执行指令用于使计算机执行所述任意方法实施例中的飞行器的飞行控制方法,例如执行以上描述的图3中的方法步骤301-步骤303,实现图7中的701-704模块的功能。
实施例4:
图9是本发明实施例提供的飞行控制系统的示意图,如图9所示,所述飞行控制系统90包括:飞行器901、如上所述的终端设备80以及遥控器902,所述遥控器902分别与所述飞行器901和终端设备80相连,其中,遥控器902与飞行器901可以进行无线通信,遥控器902与终端设备80可以通过USB连接。
所述终端设备80用于根据确定的飞行方向及飞行距离,控制所述飞行器901的飞行,以使飞行器901飞往图像中的目标点在实际场景空间下所对应的点,而无需频繁的操作遥控器902的摇杆,降低了控制飞 行器901的飞行的操作繁琐度,有效提高用户体验。
其中,该终端设备80包括但不限于:智能手机、平板、个人计算机、可穿戴设备等等。
所述遥控器902作为飞行器901与终端设备80的中间设备,用于进行数据、信息或指令等的中转。
在一些其它实施例中,该遥控器902不是必须的,也即,终端设备80与飞行器901直接进行通信,以实现对飞行器901的飞行控制。
需要说明的是,以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的模块可以是或者也可以不是物理上分开的,作为模块显示的部件可以是或者也可以不是物理模块,即可以位于一个地方,或者也可以分布到多个网络模块上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。
通过以上的实施例的描述,本领域普通技术人员可以清楚地了解到各实施例可借助软件加通用硬件平台的方式来实现,当然也可以通过硬件。本领域普通技术人员可以理解实现所述实施例方法中的全部或部分流程是可以通过计算机程序指令相关的硬件来完成,所述的程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如所述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)或随机存储记忆体(RandomAccessMemory,RAM)等。
最后应说明的是:以上实施例仅用以说明本发明的技术方案,而非对其限制;在本发明的思路下,以上实施例或者不同实施例中的技术特征之间也可以进行组合,步骤可以以任意顺序实现,并存在如上所述的本发明的不同方面的许多其它变化,为了简明,它们没有在细节中提供;尽管参照前述实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改, 或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的范围。

Claims (18)

  1. 一种飞行器的飞行控制方法,应用于终端设备,其特征在于,所述终端设备与所述飞行器通信连接,所述飞行器包括云台及搭载于所述云台上的拍摄装置,所述方法包括:
    确定图像中的目标点,所述图像为飞行器在当前位置时所述拍摄装置所拍摄的图像;
    根据所述目标点确定所述飞行器的飞行方向及所述飞行器的飞行距离;
    根据所述飞行方向及所述飞行距离,控制所述飞行器的飞行。
  2. 根据权利要求1所述的方法,其特征在于,根据所述目标点确定所述飞行器的飞行方向,具体包括:
    获取所述拍摄装置的焦距;
    根据所述目标点确定所述图像的中心点与所述目标点的像素差;
    根据所述焦距及所述图像的中心点与所述目标点的像素差,确定所述飞行器的飞行方向。
  3. 根据权利要求2所述的方法,其特征在于,所述获取所述拍摄装置的焦距,具体包括:
    根据预设的视场角与像素的对应关系,确定所述拍摄装置的视场角所对应的像素;
    根据所述拍摄装置的视场角及所述拍摄装置的视场角所对应的像素,确定所述拍摄装置的焦距。
  4. 根据权利要求1-3中任一项所述的方法,其特征在于,根据所述目标点确定所述飞行器的飞行距离,具体包括:
    获取所述云台的姿态信息;
    获取所述飞行器的飞行高度;
    根据所述目标点确定所述飞行器的偏转角度,所述偏转角度为所述飞行器的飞行方向相对于所述拍摄装置的光轴方向所偏转的角度;
    根据所述姿态信息、所述飞行高度及所述偏转角度,确定所述飞行器的飞行距离。
  5. 根据权利要求4所述的方法,其特征在于,所述姿态信息包括姿态角。
  6. 根据权利要求4或5所述的方法,其特征在于,根据所述姿态信息、所述飞行高度及所述偏转角度,确定所述飞行器的飞行距离的计算公式为:
    L = h / sin(θ + α)
    其中,L表示为所述飞行器的飞行距离;θ表示为所述云台的姿态角中的俯仰角;h表示为所述飞行器的飞行高度;α表示为偏转角度。
  7. 根据权利要求4-6中任一项所述的方法,其特征在于,所述拍摄装置的光轴方向由所述云台的姿态角中的俯仰角所确定。
  8. 根据权利要求1-7中任一项所述的方法,其特征在于,所述确定图像中的目标点,具体包括:
    获取用户在所述终端设备的屏幕上所选择的目标点;
    将所述所选择的目标点转换为图像中的目标点。
  9. 一种飞行器的飞行控制装置,配置于终端设备,其特征在于,所述终端设备与所述飞行器通信连接,所述飞行器包括云台及搭载于所 述云台上的拍摄装置,所述装置包括:
    第一确定模块,用于确定图像中的目标点,所述图像为飞行器在当前位置时所述拍摄装置所拍摄的图像;
    第二确定模块,用于根据所述目标点确定所述飞行器的飞行方向;
    第三确定模块,用于根据所述目标点确定所述飞行器的飞行距离;
    飞行控制模块,用于根据所述飞行方向及所述飞行距离,控制所述飞行器的飞行。
  10. 根据权利要求9所述的装置,其特征在于,所述第二确定模块包括:
    焦距获取单元,用于获取所述拍摄装置的焦距;
    像素差确定单元,用于根据所述目标点确定所述图像的中心点与所述目标点的像素差;
    飞行方向确定单元,用于根据所述焦距及所述图像的中心点与所述目标点的像素差,确定所述飞行器的飞行方向。
  11. 根据权利要求10所述的装置,其特征在于,所述焦距获取单元具体用于:
    根据预设的视场角与像素的对应关系,确定所述拍摄装置的视场角所对应的像素;
    根据所述拍摄装置的视场角及所述拍摄装置的视场角所对应的像素,确定所述拍摄装置的焦距。
  12. 根据权利要求9-11中任一项所述的装置,其特征在于,所述第三确定模块包括:
    姿态信息获取单元,用于获取所述云台的姿态信息;
    飞行高度获取单元,用于获取所述飞行器的飞行高度;
    偏转角度获取单元,用于根据所述目标点确定所述飞行器的偏转角度,所述偏转角度为所述飞行器的飞行方向相对于所述拍摄装置的光轴方向所偏转的角度;
    飞行距离确定单元,用于根据所述姿态信息、所述飞行高度及偏转角度,确定所述飞行器的飞行距离。
  13. 根据权利要求12所述的装置,其特征在于,所述姿态信息包括姿态角。
  14. 根据权利要求12或13所述的装置,其特征在于,所述飞行距离确定单元根据所述姿态信息、所述飞行高度及所述偏转角度,确定所述飞行器的飞行距离的计算公式为:
    L = h / sin(θ + α)
    其中,L表示为所述飞行器的飞行距离;θ表示为所述云台的姿态角中的俯仰角;h表示为所述飞行器的飞行高度;α表示为偏转角度。
  15. 根据权利要求12-14中任一项所述的装置,其特征在于,所述拍摄装置的光轴方向由所述云台的姿态角中的俯仰角所确定。
  16. 根据权利要求9-15中任一项所述的装置,其特征在于,所述第一确定模块具体用于:
    获取用户在所述终端设备的屏幕上所选择的目标点;
    将所述所选择的目标点转换为图像中的目标点。
  17. 一种终端设备,其特征在于,包括:
    至少一个处理器;以及,
    与所述至少一个处理器通信连接的存储器;其中,
    所述存储器存储有可被所述至少一个处理器执行的指令,所述指令被所述至少一个处理器执行,以使所述至少一个处理器能够执行权利要求1-8的任一项所述的方法。
  18. 一种飞行控制系统,其特征在于,包括飞行器及如权利要求17所述的终端设备,所述终端设备与所述飞行器连接。
PCT/CN2019/103096 2018-09-05 2019-08-28 飞行器的飞行控制方法、装置、终端设备及飞行控制系统 WO2020048365A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811033713.2A CN109032184B (zh) 2018-09-05 2018-09-05 飞行器的飞行控制方法、装置、终端设备及飞行控制系统
CN201811033713.2 2018-09-05

Publications (1)

Publication Number Publication Date
WO2020048365A1 true WO2020048365A1 (zh) 2020-03-12

Family

ID=64624128

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/103096 WO2020048365A1 (zh) 2018-09-05 2019-08-28 飞行器的飞行控制方法、装置、终端设备及飞行控制系统

Country Status (2)

Country Link
CN (1) CN109032184B (zh)
WO (1) WO2020048365A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109032184B (zh) * 2018-09-05 2021-07-09 深圳市道通智能航空技术股份有限公司 飞行器的飞行控制方法、装置、终端设备及飞行控制系统
CN110083180A (zh) * 2019-05-22 2019-08-02 深圳市道通智能航空技术有限公司 云台控制方法、装置、控制终端及飞行器系统
CN112771842A (zh) * 2020-06-02 2021-05-07 深圳市大疆创新科技有限公司 成像方法、成像装置、计算机可读存储介质
CN114253284A (zh) * 2021-12-22 2022-03-29 湖北襄开电力设备有限公司 无人机自动控制方法、装置、设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160304198A1 (en) * 2014-12-03 2016-10-20 Google Inc. Systems and methods for reliable relative navigation and autonomous following between unmanned aerial vehicle and a target object
CN106444797A (zh) * 2016-12-01 2017-02-22 腾讯科技(深圳)有限公司 一种控制飞行器降落的方法以及相关装置
CN107000839A (zh) * 2016-12-01 2017-08-01 深圳市大疆创新科技有限公司 无人机的控制方法、装置、设备和无人机的控制系统
CN107807659A (zh) * 2017-10-24 2018-03-16 北京臻迪科技股份有限公司 一种无人机飞行控制方法及装置
CN108351653A (zh) * 2015-12-09 2018-07-31 深圳市大疆创新科技有限公司 用于uav飞行控制的系统和方法
CN109032184A (zh) * 2018-09-05 2018-12-18 深圳市道通智能航空技术有限公司 飞行器的飞行控制方法、装置、终端设备及飞行控制系统

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103149939B (zh) * 2013-02-26 2015-10-21 北京航空航天大学 一种基于视觉的无人机动态目标跟踪与定位方法
FR3058238B1 (fr) * 2016-10-28 2019-01-25 Parrot Drones Systeme autonome de prise de vues animees par un drone avec poursuite de cible et maintien de l'angle de prise de vue de la cible.
CN106919186A (zh) * 2017-04-21 2017-07-04 南京模幻天空航空科技有限公司 无人飞行器飞行控制操作方法以及装置
CN113163118A (zh) * 2017-05-24 2021-07-23 深圳市大疆创新科技有限公司 拍摄控制方法及装置
CN107576329B (zh) * 2017-07-10 2020-07-03 西北工业大学 基于机器视觉的固定翼无人机着降引导合作信标设计方法
CN107727079B (zh) * 2017-11-30 2020-05-22 湖北航天飞行器研究所 一种微小型无人机全捷联下视相机的目标定位方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160304198A1 (en) * 2014-12-03 2016-10-20 Google Inc. Systems and methods for reliable relative navigation and autonomous following between unmanned aerial vehicle and a target object
CN108351653A (zh) * 2015-12-09 2018-07-31 深圳市大疆创新科技有限公司 用于uav飞行控制的系统和方法
CN106444797A (zh) * 2016-12-01 2017-02-22 腾讯科技(深圳)有限公司 一种控制飞行器降落的方法以及相关装置
CN107000839A (zh) * 2016-12-01 2017-08-01 深圳市大疆创新科技有限公司 无人机的控制方法、装置、设备和无人机的控制系统
CN107807659A (zh) * 2017-10-24 2018-03-16 北京臻迪科技股份有限公司 一种无人机飞行控制方法及装置
CN109032184A (zh) * 2018-09-05 2018-12-18 深圳市道通智能航空技术有限公司 飞行器的飞行控制方法、装置、终端设备及飞行控制系统

Also Published As

Publication number Publication date
CN109032184B (zh) 2021-07-09
CN109032184A (zh) 2018-12-18

Similar Documents

Publication Publication Date Title
US11649052B2 (en) System and method for providing autonomous photography and videography
US11604479B2 (en) Methods and system for vision-based landing
US20200346753A1 (en) Uav control method, device and uav
WO2020048365A1 (zh) 飞行器的飞行控制方法、装置、终端设备及飞行控制系统
JP6803919B2 (ja) 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体
CN108769531B (zh) 控制拍摄装置的拍摄角度的方法、控制装置及遥控器
WO2020143677A1 (zh) 一种飞行控制方法及飞行控制系统
WO2018098704A1 (zh) 控制方法、设备、系统、无人机和可移动平台
US11798172B2 (en) Maximum temperature point tracking method, device and unmanned aerial vehicle
WO2019227289A1 (zh) 延时拍摄控制方法和设备
WO2020062178A1 (zh) 基于地图识别目标对象的方法与控制终端
WO2021168819A1 (zh) 无人机的返航控制方法和设备
CN203845021U (zh) 飞行器全景航拍装置系统
WO2018214155A1 (zh) 用于设备姿态调整的方法、设备、系统和计算机可读存储介质
WO2021217371A1 (zh) 可移动平台的控制方法和装置
WO2019230604A1 (ja) 検査システム
WO2019183789A1 (zh) 无人机的控制方法、装置和无人机
WO2020062356A1 (zh) 控制方法、控制装置、无人飞行器的控制终端
WO2021168821A1 (zh) 可移动平台的控制方法和设备
WO2020244648A1 (zh) 一种飞行器控制方法、装置及飞行器
WO2020237429A1 (zh) 遥控设备的控制方法和遥控设备
JP2018078433A (ja) 移動撮像装置およびその制御方法、ならびに撮像装置およびその制御方法、無人機、プログラム、記憶媒体
WO2020168519A1 (zh) 拍摄参数的调整方法、拍摄设备以及可移动平台
WO2020042186A1 (zh) 可移动平台的控制方法、可移动平台、终端设备和系统
CN112882645B (zh) 航道规划方法、控制端、飞行器及航道规划系统

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19857346; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122  Ep: pct application non-entry in european phase (Ref document number: 19857346; Country of ref document: EP; Kind code of ref document: A1)