WO2022193081A1 - Control method and device for unmanned aerial vehicle, and unmanned aerial vehicle - Google Patents

Control method and device for unmanned aerial vehicle, and unmanned aerial vehicle

Info

Publication number
WO2022193081A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
orientation
sensor
uav
camera
Prior art date
Application number
PCT/CN2021/080837
Other languages
English (en)
French (fr)
Inventor
杜劼熹
周游
林毅
Original Assignee
SZ DJI Technology Co., Ltd. (深圳市大疆创新科技有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co., Ltd. (深圳市大疆创新科技有限公司)
Priority to PCT/CN2021/080837 (WO2022193081A1)
Priority to CN202180084779.8A (CN116745720A)
Publication of WO2022193081A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw

Definitions

  • the present application relates to the technical field of unmanned aerial vehicles, and in particular, to a control method and device of an unmanned aerial vehicle, and an unmanned aerial vehicle.
  • with the development of UAV (unmanned aerial vehicle) technology, UAVs are generally provided with sensing sensors to sense the surrounding environment information, so as to detect obstacles in the environment in time and avoid collisions.
  • how to better use the camera and the sensing sensor to perceive the flight environment and improve the flight safety of the UAV is therefore very important.
  • the present application provides a control method and device for an unmanned aerial vehicle, and an unmanned aerial vehicle.
  • a control method of an unmanned aerial vehicle: the unmanned aerial vehicle includes a perception sensor and a camera, the perception sensor is fixedly connected to the fuselage of the unmanned aerial vehicle, and the camera is movably connected to the fuselage of the unmanned aerial vehicle via a gimbal;
  • the unmanned aerial vehicle further includes a power mechanism, and the fuselage of the unmanned aerial vehicle can adjust its pose under the action of the power mechanism;
  • the method includes:
  • controlling the camera to work and sending the image captured by the camera to a remote control device for controlling the drone, and the image is used for display on the remote control device;
  • controlling the power mechanism and the gimbal to work, so that the camera faces the target movement direction, and the sensing sensor faces other directions different from the target movement direction;
  • adjusting the movement of the UAV based on the environmental observation information collected by the perception sensor.
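The claimed control flow can be sketched as a small yaw-allocation routine. This is an illustrative reading only; the function names and the opposite-direction policy in `choose_sensor_heading` are assumptions, not part of the application:

```python
def choose_sensor_heading(movement_heading_deg):
    """Pick a heading for the fixed perception sensor that differs from
    the target movement direction (hypothetical policy: look the opposite
    way, so camera and sensor together cover the widest field of view)."""
    return (movement_heading_deg + 180.0) % 360.0

def control_step(movement_heading_deg):
    # The body is yawed so the fixed sensor faces its chosen heading;
    # the gimbal then turns the camera back toward the movement direction.
    sensor_heading = choose_sensor_heading(movement_heading_deg)
    gimbal_yaw = (movement_heading_deg - sensor_heading) % 360.0
    return sensor_heading, gimbal_yaw
```

With the opposite-direction policy the gimbal always compensates by 180°; any other policy that keeps the two headings distinct would fit the claim equally well.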
  • a control method of an unmanned aerial vehicle: the unmanned aerial vehicle includes a perception sensor and a camera, the perception sensor is fixedly connected to the fuselage of the unmanned aerial vehicle, and the camera is movably connected to the fuselage of the unmanned aerial vehicle via a gimbal; the fuselage of the unmanned aerial vehicle can adjust its pose under the action of the power mechanism, and the method includes:
  • controlling the camera to work and sending the image captured by the camera to a remote control device for controlling the drone, and the image is used for display on the remote control device;
  • adjusting the movement of the UAV based on the environmental observation information collected by the perception sensor.
  • a control method of an unmanned aerial vehicle: the unmanned aerial vehicle includes a power mechanism and a sensing sensor, the sensing sensor is fixedly connected to the fuselage of the unmanned aerial vehicle, and the fuselage of the unmanned aerial vehicle can adjust its pose under the action of the power mechanism, and the method includes:
  • the pose of the fuselage is adjusted by the power mechanism based on the task type, so as to adjust the orientation of the perception sensor to the target orientation; wherein the accuracy with which the perception sensor performs the task to be executed when it is located in the target orientation is higher than the accuracy with which it performs that task when located in other orientations.
  • a control device for an unmanned aerial vehicle: the unmanned aerial vehicle includes a sensing sensor and a camera, the sensing sensor is fixedly connected to the fuselage of the unmanned aerial vehicle, and the camera is movably connected to the fuselage of the unmanned aerial vehicle via a gimbal;
  • the unmanned aerial vehicle further includes a power mechanism, and the fuselage of the unmanned aerial vehicle can adjust its pose under the action of the power mechanism;
  • the device includes a processor, a memory, and a computer program stored on the memory for execution by the processor, where the processor implements the following steps when executing the computer program:
  • controlling the camera to work and sending the image captured by the camera to a remote control device for controlling the drone, and the image is used for display on the remote control device;
  • adjusting the movement of the UAV based on the environmental observation information collected by the perception sensor.
  • a control device for an unmanned aerial vehicle: the unmanned aerial vehicle includes a sensing sensor and a camera, the sensing sensor is fixedly connected to the fuselage of the unmanned aerial vehicle, and the camera is movably connected to the fuselage of the unmanned aerial vehicle via a gimbal;
  • the unmanned aerial vehicle further includes a power mechanism, and the fuselage of the unmanned aerial vehicle can adjust its pose under the action of the power mechanism;
  • the device includes a processor, a memory, and a computer program stored on the memory for execution by the processor, where the processor implements the following steps when executing the computer program:
  • adjusting the movement of the UAV based on the environmental observation information collected by the perception sensor.
  • a control device for an unmanned aerial vehicle: the unmanned aerial vehicle includes a sensing sensor, and the sensing sensor is fixedly connected to the fuselage of the unmanned aerial vehicle; the unmanned aerial vehicle further includes a power mechanism, and the fuselage of the UAV can adjust its pose under the action of the power mechanism; the device includes a processor, a memory, and a computer program stored on the memory for execution by the processor, and the processor implements the following steps when executing the computer program:
  • the pose of the fuselage is adjusted by the power mechanism based on the task type, so as to adjust the orientation of the perception sensor to the target orientation; wherein the accuracy with which the perception sensor performs the task to be executed when it is located in the target orientation is higher than the accuracy with which it performs that task when located in other orientations.
  • an unmanned aerial vehicle comprising a perception sensor, a camera, a power mechanism, and the control device according to any one of the fourth aspect, the fifth aspect or the sixth aspect;
  • the perception sensor is fixedly connected to the fuselage of the unmanned aerial vehicle
  • the camera is movably connected to the fuselage of the unmanned aerial vehicle via the gimbal, and the fuselage of the unmanned aerial vehicle can adjust its pose under the action of the power mechanism.
  • an operation command input by the user through the remote control device of the drone can be received, and the target direction that the user expects the camera of the drone to face can be determined based on the operation command.
  • Control the movement of the gimbal equipped with the camera on the drone to adjust the camera to the target direction and collect images in the target direction.
  • the motion of the power mechanism on the UAV can be controlled to drive the UAV to adjust the pose, and the orientation of the perception sensor fixedly connected to the fuselage on the UAV can be adjusted to other directions different from the target direction.
  • FIG. 1( a ) is a schematic diagram of controlling an unmanned aerial vehicle in a normal flight mode according to an embodiment of the present application.
  • FIG. 1( b ) is a schematic diagram of controlling a drone in a course lock mode according to an embodiment of the present application.
  • FIG. 1( c ) is a schematic diagram of controlling an unmanned aerial vehicle in an intelligent following mode according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of an observation blind spot of a perception sensor of an unmanned aerial vehicle according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an unmanned aerial vehicle according to an embodiment of the present application.
  • FIG. 4 is a flowchart of a control method of an unmanned aerial vehicle according to an embodiment of the present application.
  • FIG. 5 is a flowchart of a control method of an unmanned aerial vehicle according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of adjusting the orientation of the binocular sensor of the unmanned aerial vehicle in a direction favorable for positioning according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of adjusting the binocular sensor of the UAV to face a direction favorable for obstacle avoidance according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a logical structure of a control device according to an embodiment of the present application.
  • Drones are widely used in various fields. Usually there is a camera on the drone.
  • the camera can be movably connected to the drone body through the gimbal to collect images of the flight environment and transmit them to the remote control device of the drone. Users can view the images captured by the drone through the remote control device, and make corresponding adjustments to the motion state of the UAV based on the images.
  • the drone is generally equipped with a perception sensor, which is usually fixedly connected to the fuselage and is used to detect obstacles, so as to assist the UAV to adjust its motion state and avoid collisions.
  • FIG. 1(a) is a schematic diagram of controlling the drone in normal flight mode.
  • assuming the user's orientation is the same as that of the drone's nose, and the user pushes the joystick of the remote controller in a 45° oblique direction, the drone will fly in the 45° oblique direction based on its current orientation (that is, the direction of the movement velocity is the 45° oblique direction), but the nose and the camera will still keep their original orientation, which is convenient for the user to control the drone (the original orientation is used as the reference during control, and the viewing angle of the camera is the same as the viewing angle of the user, so no coordinate system conversion is required).
  • FIG. 1(b) is a schematic diagram of controlling the drone in the course lock mode.
  • assuming the user's orientation is the same as that of the drone's nose, and the user pushes the joystick of the remote controller in a 45° oblique direction, the drone will fly in the 45° oblique direction based on its current orientation, the nose also turns toward the 45° oblique direction, and the camera orientation stays consistent with the real-time nose orientation.
  • in this mode, an initial orientation (the nose orientation when entering the mode) is recorded, and subsequent control is based on this initial orientation, regardless of the real-time nose orientation.
  • FIG. 1(c) is a schematic diagram of controlling the UAV in the intelligent follow mode.
  • in this mode, the user needs to select a follow target, and the UAV will then automatically control the nose orientation to ensure that the camera keeps facing the target in real time.
  • the control is based on the positional relationship between the drone and the target: the user pushes the joystick of the remote control to the left, and the drone starts to circle the target to the left; pushes the stick to the right to circle to the right; pushes the stick forward, and the drone approaches the target; pushes the stick backward, and the drone moves away from the target.
  • the embodiments of the present application provide a method for controlling a drone.
  • the orientation of the camera of the drone can be controlled to be different from the orientation of the sensing sensor, so that the combination of the two can achieve a larger observation range, which can improve the safety of the UAV during flight.
  • the unmanned aerial vehicle in the embodiment of the present application includes a perception sensor (31 in the figure) and a camera (32 in the figure).
  • the sensor is fixedly connected to the fuselage of the drone.
  • the sensing sensor can be set in various parts of the fuselage according to actual needs, such as one or more of the head, the tail, and the upper and lower surfaces of the fuselage.
  • since the perception sensor is fixedly connected to the fuselage, the orientation of the perception sensor is adjusted along with the pose of the drone body, and the perception sensor can be used to collect environmental observation information in the flying environment of the drone.
  • the camera in the UAV can be movably connected to the body of the UAV through the gimbal, and the gimbal can drive the camera to rotate and change the orientation of the camera to capture images of objects within a certain angle range in the flight environment.
  • the gimbal can be set at the head of the fuselage of the drone, or the tail of the fuselage, or other positions with little occlusion to the camera.
  • the UAV also includes a power mechanism, which can drive the UAV to move and adjust the posture of the UAV body.
  • the UAV control method provided by the embodiment of the present application includes the following steps:
  • S402. Control the camera to work, and send the image captured by the camera to a remote control device for controlling the drone, and the image is used for display on the remote control device;
  • the drone can control the camera to work to capture images of targets in the flight environment, such as video or still images, and send the images captured by the camera to the remote control device that controls the drone, for example, a remote controller used in conjunction with the drone, or other terminal equipment that can control the drone.
  • the remote control device can display the image on the screen so that the user can view the image.
  • the user can also adjust the motion state of the UAV based on the image. For example, if an obstacle is found in front of the UAV through the image, the user can adjust the moving direction or speed of the UAV; the user can also adjust the movement of the UAV based on shooting needs, in order to collect images that meet those needs.
  • the user can control the drone through the remote control device.
  • the direction of the camera can be controlled by the remote control device to shoot the target, or the movement direction or speed of the drone can be adjusted by the joystick on the remote control device.
  • the UAV can receive the operation instruction input by the user through the remote control device, and determine the target direction of the UAV according to the operation instruction.
  • the target direction may be the direction the user expects the camera to face.
  • the gimbal movement of the UAV can be controlled to drive the camera toward the target direction and shoot the scene in the target direction.
  • the UAV can also control the movement of the power mechanism to adjust the pose of the fuselage, so that the sensing sensor on the fuselage faces other directions different from the target direction, and the movement of the UAV is adjusted based on the environmental observation information collected by the sensing sensor.
  • in this way, the camera can capture images in the target direction expected by the user and observe the environment in that direction, while the perception sensor is adjusted toward other directions different from the target direction to observe the environment in those directions.
  • the perception field of view and observation range of the drone can be increased.
  • the user can determine whether there are obstacles in the target direction according to the images collected by the camera, so as to adjust the movement of the UAV; at the same time, the UAV itself can combine the environmental information collected by the perception sensor to determine whether there are obstacles in other directions and adjust the movement of the drone, which improves the safety of the drone during flight.
  • the sensing sensor in the embodiment of the present application may be any sensor capable of sensing the three-dimensional spatial information of each object in the environment, that is, a sensor that can perceive the distance between the three-dimensional point in the three-dimensional space and the drone, thereby determining the distribution of each object in the environment.
  • the perception sensor includes one or more of the following sensors: lidar, millimeter-wave radar, multi-view cameras (e.g., binocular), monocular cameras, and spectral cameras.
  • the camera can be used to capture images in the direction of movement of the drone, so that the user can observe the situation in the direction of movement of the drone based on the images captured by the camera.
  • the target direction may be a target movement direction determined in response to a user's motion control operations on the remote control device.
  • the user can control the movement direction of the drone through the control device on the remote control device (for example, the joystick on the remote controller), so the target movement direction of the drone can be determined based on the user's motion control operation on the remote control device, and the camera can then be made to face the target movement direction.
  • in other words, the target movement direction can be determined as the target direction.
  • the target direction may also be a fixed direction set in response to the user's orientation setting operation on the remote control device.
  • the user may want the camera to face a fixed direction to capture objects in a fixed direction.
  • the remote control device can provide the function of setting the camera to face a certain fixed direction, and the user can set the fixed direction the camera faces through the remote control device.
  • the drone receives the fixed direction set by the user through the remote control device, the fixed direction can be used as the target direction.
  • the fixed direction may be set based on the pointing of the remote control device.
  • some remote control devices may include pose sensors such as gyroscopes, so the data collected by the pose sensor can be used to determine the current pointing of the remote control device. Therefore, if the user wants to set a fixed direction, the user can directly rotate the remote control device to point it in a certain direction, and the pose sensor can detect this direction as the fixed direction.
  • users can also turn their own bodies to change the orientation of the remote controller in their hands, thereby setting a fixed direction. In this case, the camera is pointed toward the direction the remote controller points, and the angle of view of the image captured by the camera is consistent with the angle of view observed by the user, which is convenient for the user to control the drone.
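As a sketch of how the remote controller's pointing might be read out (assuming its pose sensor reports an orientation quaternion; the application does not specify the sensor interface), the heading can be extracted as the yaw angle:

```python
import math

def yaw_from_quaternion(w, x, y, z):
    """Heading (yaw, degrees in [0, 360)) of the remote controller,
    extracted from a unit orientation quaternion (w, x, y, z).
    Standard ZYX (yaw-pitch-roll) convention."""
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    return math.degrees(math.atan2(siny_cosp, cosy_cosp)) % 360.0
```

The yaw returned here would then serve as the fixed direction the user has set for the camera.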
  • the orientation of the perceptual sensor may be determined based on the observation range of the camera and/or environmental observation information collected by the perceptual sensor. For example, in order to realize that the drone can observe a larger range during the flight, so as to detect obstacles in all directions in time and avoid collision, the orientation of the perception sensor can be determined based on the observation range of the camera.
  • the orientation of the perception sensor can also be determined based on the environmental observation information collected by the perception sensor. For example, in order to better avoid obstacles and prevent drone collisions, the perception sensor can be oriented toward the direction with more obstacles, so as to observe that direction. In some scenarios, in order to use the perception sensor for localization, the perception sensor can be oriented in a direction that is beneficial to the positioning of the UAV.
  • in some embodiments, the observation range of the perception sensor is at least partially different from the observation range of the camera, so that the perception sensor and the camera together can obtain a larger observation range and increase the receptive field of the UAV to observe in all directions, avoid obstacles, and improve flight safety.
  • in some embodiments, the environmental observation information collected by the perception sensor can be used to determine a target area in the environment where the drone is located.
  • if the direction of the target area relative to the UAV is different from the target movement direction, that is, the target area is not in the target movement direction of the UAV, the attitude of the UAV fuselage can be adjusted to orient the sensor toward the target area so that the target area is perceived, and the camera pose is adjusted so that the camera faces the target movement direction.
  • the target area can be determined according to the task to be performed by the perception sensor.
  • for example, if the perception sensor is used to observe dynamic obstacles, the target area can be an area with dynamic obstacles; if the perception sensor is used to locate the UAV, the target area can be an area with rich textures.
  • since the camera can face a fixed direction set by the user, that direction may be inconsistent with the movement direction of the drone.
  • in this case, the power mechanism and the gimbal are controlled so that the camera faces the target direction and the sensing sensor faces the movement direction of the drone.
  • specifically, the gimbal can be controlled to move to drive the camera toward the target direction, and the pose of the drone can be adjusted through the power mechanism, so that the orientation of the sensor is consistent with the real-time movement direction of the drone.
  • the movement direction of the drone may be the direction input by the user through the remote control device, for example, the movement direction may be determined based on the user's control operation of the joystick on the remote control device.
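The decomposition described above, where the body yaw follows the real-time movement direction while the gimbal holds the camera on the user-set fixed direction, can be sketched as follows (function and parameter names are illustrative, not from the application):

```python
def split_yaw(camera_heading_deg, movement_heading_deg):
    """Allocate rotation between body and gimbal.

    The body (and the fixed sensor) yaws to the movement direction;
    the gimbal takes up the remainder, wrapped to (-180, 180], so the
    camera keeps facing the fixed target direction."""
    body_yaw = movement_heading_deg % 360.0
    gimbal_yaw = (camera_heading_deg - body_yaw) % 360.0
    if gimbal_yaw > 180.0:
        gimbal_yaw -= 360.0  # take the shorter rotation
    return body_yaw, gimbal_yaw
```

For example, with the camera fixed at 350° while the drone moves toward 10°, the gimbal only needs to turn 20° in the negative direction relative to the body.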
  • the perception sensor can be used to locate the UAV.
  • in order to locate the UAV, the perception sensor can be directed to an area with rich textures in the environment, which makes it convenient for the UAV to locate itself according to the environmental observation information collected by the perception sensor. Therefore, the environmental observation information collected by the perception sensor may be the texture information of the environment, and the target area may be determined based on the texture information. For example, an area close to the drone and rich in texture can be determined as the target area.
  • in some embodiments, when the movement of the drone is adjusted based on the environmental observation information collected by the perception sensor, the movement of the drone may be adjusted so that the drone avoids colliding with obstacles in the environment. For example, when an obstacle is detected in the environment, the drone can be made to hover, the flying speed of the drone can be reduced, or the movement direction of the drone can be adjusted to avoid collision with the obstacle.
  • in some embodiments, when the movement of the UAV is adjusted based on the environmental observation information collected by the perception sensor, the drone can also be controlled to maintain a preset relative motion state with a target object in the environment, for example, following the target object in the environment.
  • in some embodiments, the movement state data of the drone itself may be determined according to the environmental observation information collected by the perception sensor, and the movement of the drone may then be adjusted according to the movement state data.
  • the motion state data may be various data representing the motion of the drone, for example, one or more of acceleration, speed, moving distance, and rotation angle.
  • in some embodiments, the motion monitoring data collected by the pose sensor on the UAV can also be obtained, and the environmental observation information collected by the sensing sensor and the motion monitoring data collected by the pose sensor can then be fused to obtain the motion state data.
  • the pose sensor can be various sensors that can detect the position and attitude of the UAV.
  • for example, the pose sensor can be one or more of a gyroscope, an accelerometer, an inertial measurement unit, GPS, and GNSS.
  • in some embodiments, the displacement change data of the UAV relative to fixed objects in the environment may be calculated according to the environmental observation information, and the motion state data of the UAV may then be determined according to the displacement change data. For example, it is possible to determine how far the drone has currently flown based on changes in the relative distance between the drone and each building in the environment.
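A minimal sketch of the two steps above, assuming range measurements to a fixed landmark and a simple weighted blend with the pose sensor's estimate (the weight `alpha` is an illustrative choice, not a value from the application):

```python
def speed_from_displacement(range_t0_m, range_t1_m, dt_s):
    """Closing speed toward a fixed object in the environment, from two
    range observations taken dt_s seconds apart."""
    return (range_t0_m - range_t1_m) / dt_s

def fuse_speed(perception_speed, pose_sensor_speed, alpha=0.7):
    """Blend the displacement-based estimate with the pose sensor's
    estimate (complementary-filter style; alpha is hypothetical)."""
    return alpha * perception_speed + (1.0 - alpha) * pose_sensor_speed
```

In practice the fusion would run per axis and per time step; a Kalman filter is a common alternative to the fixed-weight blend shown here.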
  • in some embodiments, the accuracy information of the pose sensors of the UAV can be monitored; if the accuracy information of the pose sensors does not meet a preset condition, it indicates that the positioning result obtained with the pose sensors is not accurate enough, so the perception sensor can be used to locate the UAV, that is, the perception sensor is directed toward a target area with rich texture.
  • the orientation of the sensing sensor can be adjusted to a target area with richer textures.
  • on the contrary, if the accuracy information of the pose sensors meets the preset condition, the perception sensor does not need to be used for positioning; in that case, for example, the orientation of the perception sensor can be adjusted to be consistent with the movement direction of the UAV.
  • the orientation of the perception sensor is also adjusted when the posture of the fuselage is adjusted.
  • in current control modes, the orientation of the nose is generally adjusted only in a way that is conducive to the camera taking pictures or to the user controlling the drone.
  • for example, in the course lock mode, the nose of the drone always keeps the same orientation as the initial direction; even if the movement direction of the drone changes, the orientation of the sensing sensor will not be adjusted, so the sensing sensor may not perceive the environmental information in the movement direction.
  • in the intelligent follow mode, the nose of the drone always faces the target, so the sensing sensor also faces the target and cannot observe other directions. It can be seen that in current UAV control modes, the orientation of the sensing sensor is only adjusted mechanically following the adjustment of the nose orientation, and the role of the sensing sensor is not fully utilized.
  • the embodiment of the present application also provides another control method of the UAV, which can determine, based on various factors in the working process of the UAV (for example, the flight status information of the UAV, the accuracy of the positioning signal, and the obstacle distribution), the task type of the UAV's current task to be performed, and then adjust the orientation of the UAV's sensing sensor to a direction that is conducive to the execution of that task, so as to improve the accuracy with which the sensor executes the task, make full use of the perception sensor, and improve flight safety.
  • the UAV in the embodiment of the present application includes a power mechanism, which can drive the UAV to move and adjust the posture of the UAV body.
  • the UAV also includes a sensing sensor, which is fixedly connected to the fuselage of the drone; adjusting the pose of the fuselage of the drone will also drive the adjustment of the orientation of the sensor.
  • the method includes the following steps:
  • various kinds of information in the working process of the UAV can be combined, for example, the flight status information of the UAV, the strength of the positioning signal received by the UAV, the distribution of obstacles in the UAV flight environment, and whether the obstacles are moving or static, to determine the type of the task the UAV currently needs to perform.
  • the orientation of the sensing sensor is adjusted to the target orientation, wherein the accuracy of executing the to-be-executed task when the sensing sensor is located in the target orientation is higher than the accuracy of executing the to-be-executed task when the sensing sensor is located in other orientations.
  • for example, if the current task to be performed by the perception sensor is obstacle avoidance, the perception sensor can be directed toward the movement direction of the drone. If the current task to be performed by the perception sensor is positioning, the perception sensor is adjusted to a direction that is conducive to the positioning of the UAV. If the current task of the perception sensor is to observe dynamic obstacles and prevent them from colliding with the UAV, the orientation of the perception sensor can be adjusted to a direction in which moving obstacles can be observed.
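The orientation rule just described can be written as a small dispatch; this is a sketch only, and the task names and heading inputs are hypothetical:

```python
def target_orientation(task, movement_heading,
                       texture_rich_heading, obstacle_heading):
    """Pick the sensor's target orientation for the pending task type."""
    if task == "obstacle_avoidance":
        return movement_heading      # watch where the UAV is going
    if task == "positioning":
        return texture_rich_heading  # texture-rich areas aid localization
    if task == "dynamic_obstacle":
        return obstacle_heading      # keep moving obstacles in view
    raise ValueError(f"unknown task: {task}")
```

The power mechanism would then yaw the fuselage until the fixed sensor reaches the returned heading, while the gimbal keeps the camera on its own target.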
  • for example, the perception sensor can be a lidar, a millimeter-wave radar, a multi-view camera (e.g., binocular), a monocular camera, or a spectral camera.
  • the tasks to be performed by the perception sensor can be set according to the flight status of the UAV and the functions that the perception sensor can achieve, which is not limited in the embodiments of this application.
  • in this way, the role of the sensing sensor can be fully exerted, and at the same time the accuracy with which the sensing sensor executes the task to be performed can be improved, thereby improving the flight safety of the UAV.
  • Perception sensors can be arranged in various parts of the fuselage, such as the head of the fuselage, the tail of the fuselage, and/or the upper and lower surfaces of the fuselage, which can be set according to actual needs.
  • the perception sensor may be located at the nose position of the UAV, and when adjusting the pose of the body of the UAV based on the mission type, the orientation of the nose of the UAV may be adjusted based on the mission type.
  • the drone is further provided with a camera, which is used to collect images or videos of the photographed target, and transmit them to the remote control device of the drone, so as to be displayed to the user through the remote control device.
  • the camera can be movably connected to the drone through the gimbal, and the gimbal can drive the camera to rotate to change the orientation of the camera.
  • the camera can always face the target to be photographed to ensure that the camera shoots the target.
  • after adjusting the pose of the body, the drone can also control the gimbal to rotate so as to keep the camera facing the orientation set by the user.
  • the orientation set by the user is the orientation that is convenient for shooting the target.
  • the information collected by the perception sensor can be used for positioning.
• the type of the task to be performed by the perception sensor can be determined according to the positioning accuracy of the positioning signal received by the drone. For example, in some embodiments, when it is determined that the positioning accuracy of the positioning signal received by the UAV is less than a preset accuracy, the task type of the task to be performed is a positioning task. Of course, if the positioning accuracy of the positioning signal received by the drone is good, the positioning of the drone can be accurately completed without the perception sensor, and the perception sensor can then be used to perform other tasks.
  • the positioning signal may be a GPS signal, of course, may also be other signals with a positioning function.
  • the positioning accuracy of the positioning signal is determined based on one or more of the following data: the precision factor of the positioning signal, the number of satellites transmitting the positioning signal, and data collected by the position and attitude sensor of the UAV.
• the UAV can also be positioned based on the data collected by the pose sensor. If the position determined based on the data collected by the pose sensor differs greatly from the position determined based on the positioning signal, the accuracy of the positioning signal may be insufficient.
  • the task type of the task to be performed by the perception sensor may also be determined according to the space environment information where the UAV is located. For example, when it is determined that the flying environment of the drone is surrounded by high-rise buildings, the positioning signal received by the drone is likely to be blocked by the high-rise buildings. At this time, the task to be performed by the drone's perception sensor can be determined as a positioning task. Alternatively, if it is determined that there are many obstacles in the space where the UAV is located, the task to be performed by the perception sensor can be determined as an obstacle avoidance task at this time.
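The task-selection logic described above can be sketched as follows. This is a minimal, illustrative sketch: the function and threshold names are assumptions for the example, not terms defined by this document.

```python
from enum import Enum

class Task(Enum):
    POSITIONING = "positioning"
    OBSTACLE_AVOIDANCE = "obstacle_avoidance"

def select_task(positioning_accuracy, preset_accuracy, num_obstacles, obstacle_threshold=5):
    """Pick the perception sensor's task from signal quality and environment."""
    # Poor positioning signal (e.g. GPS blocked by high-rise buildings):
    # use the perception sensor to localize the UAV.
    if positioning_accuracy < preset_accuracy:
        return Task.POSITIONING
    # Signal is good: in a cluttered space, watch for obstacles instead.
    if num_obstacles > obstacle_threshold:
        return Task.OBSTACLE_AVOIDANCE
    # Default when positioning is reliable: observe the movement direction.
    return Task.OBSTACLE_AVOIDANCE
```

The default branch is one possible policy; the document leaves the choice open when positioning is already reliable and the space is not cluttered.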
• the pose of the fuselage can be adjusted so that the orientation of the perception sensor is adjusted to a target orientation that is conducive to positioning.
  • the orientation of the sensing sensor should be adjusted as far as possible to a direction with more distributed objects and richer object textures.
• taking a binocular vision sensor as an example of the perception sensor: when the position of the drone relative to an object in the environment is determined by the binocular vision sensor, the sensor can collect an image of the object, extract feature points from the image, and determine the distance of the drone from the object based on the parallax of the feature points.
• if the scene captured by the perception sensor is an open area, or an area with few surfaces and little texture, feature points cannot be extracted and the distance cannot be determined, so the positioning performance of the perception sensor is also very poor.
• if the object is far from the drone, its parallax is small and the distance calculated from the parallax will be less accurate. Therefore, when the perception sensor is used for positioning, the closer the object is to the drone, the higher the accuracy of the positioning result.
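The parallax-to-distance relation referred to above follows the standard stereo model Z = f·B/d. The sketch below (parameter names are illustrative, not from this document) also shows why the depth error grows quadratically with distance, which is the reason nearby objects localize better.

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from binocular parallax: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("object too far: disparity vanishes")
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px, baseline_m, depth_m, disparity_err_px=0.5):
    """First-order depth uncertainty: dZ ~ Z^2 * dd / (f * B).
    Error grows quadratically with depth, so nearby, well-textured
    objects give more accurate positioning."""
    return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)
```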
• the target orientation may meet the following condition: when the perception sensor faces the target orientation, the perception sensor can observe a target object, the target object being an object in the environment whose distance from the drone is less than a first preset distance and whose feature point density is greater than a second preset threshold.
• by orienting the perception sensor toward a target object that is close to the UAV and has rich texture (that is, a high feature point density), the positioning accuracy of the perception sensor can be improved.
  • the first preset distance and the second preset threshold may be set according to actual requirements, which are not limited in the embodiment of the present application.
• the posture of the fuselage may be continuously adjusted to change the orientation of the perception sensor, and the perception sensor can then capture multiple frames of images of the environment in which the UAV is located; the feature points in the multiple frames of images can be extracted to determine the feature point density of each frame.
  • depth information of feature points can be determined to determine the distance of objects in the environment from the drone.
• the orientation of the perception sensor can be adjusted to the target orientation based on the feature point density of the multi-frame images and the depth information of the feature points.
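A minimal sketch of this selection step, assuming each candidate orientation sampled during the sweep has been summarized by a feature density and a mean depth. The scoring rule (density divided by depth) is an illustrative choice, not one specified by this document.

```python
def pick_target_orientation(samples):
    """samples: (yaw_deg, feature_density, mean_depth_m) tuples gathered
    while sweeping the fuselage. Favor dense texture and nearby objects."""
    def score(sample):
        _, density, depth = sample
        return density / max(depth, 1e-6)  # dense texture, small depth wins
    best = max(samples, key=score)
    return best[0]
```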
  • the received positioning signal can be used for accurate positioning, so there is no need to consider the use of the perception sensor for positioning.
  • the orientation is adjusted to the orientation that is conducive to avoiding obstacles.
• the target orientation should meet the following condition: when the perception sensor is in the target orientation, the current movement direction of the drone is within the field of view of the perception sensor.
• the orientation of the perception sensor can be consistent with the movement direction, or deviate only slightly from the movement direction, so as to ensure that the perception sensor can observe obstacles in the movement direction.
• the orientation of the perception sensor is adjusted to the target orientation. Since adjusting the sensor's orientation requires adjusting the orientation of the fuselage, the orientation of the camera mounted on the gimbal would also change; the gimbal therefore rotates to keep the camera orientation unchanged. Consequently, when adjusting the perception sensor so that it can observe the drone's movement direction, the angle through which the fuselage can be rotated must be determined from the maximum rotation angle of the gimbal and the current movement direction, so that after the drone's posture is adjusted the gimbal is still able to rotate by the same angle in the opposite direction and keep the camera orientation unchanged.
• when adjusting the orientation of the perception sensor to the target orientation according to the movement direction and the maximum rotation angle of the gimbal, a target angle may be determined according to the movement direction and the current orientation of the perception sensor, the target angle being the angle by which the orientation of the perception sensor needs to be adjusted. If the target angle is smaller than a preset angle, the target orientation is the orientation obtained after adjusting the perception sensor by the target angle, wherein the preset angle is determined based on the maximum rotation angle.
  • the preset angle may be the maximum rotation angle of the gimbal, or an angle obtained by subtracting a certain buffer angle from the maximum rotation angle.
• if the target angle is greater than the preset angle, the target orientation is the orientation obtained after adjusting the perception sensor by the preset angle.
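The two cases above amount to clamping the requested rotation by the preset angle. A sketch, using the ±80° gimbal limit and 15° smoothing buffer that appear later in this document as illustrative defaults:

```python
def clamp_rotation(target_angle_deg, max_gimbal_deg=80.0, buffer_deg=15.0):
    """Limit the fuselage rotation so the gimbal can still counter-rotate
    to keep the camera orientation unchanged."""
    preset = max_gimbal_deg - buffer_deg  # e.g. 80 deg minus 15 deg margin = 65 deg
    if abs(target_angle_deg) <= preset:
        return target_angle_deg          # target angle within limits: use it
    return preset if target_angle_deg > 0 else -preset  # otherwise clamp
```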
  • the drone shown in FIG. 3 includes a binocular vision sensor 31 that is fixedly connected to the nose of the drone, and a main camera 32 that is movably connected to the nose through a gimbal.
• in the current control mode of the UAV, in order to make it easier for the user to control the UAV and for the main camera to shoot video, the nose of the drone usually faces the same direction as the main camera.
• as a result, the binocular vision sensor and the main camera also face the same direction, so the binocular vision sensor cannot be pointed in the direction in which it would be most useful.
• this embodiment proposes a solution in which the orientation of the nose of the drone can be adjusted according to the flying state of the drone, thereby adjusting the orientation of the binocular vision sensor so that it faces a direction conducive to positioning or obstacle avoidance, while the direction of the main camera is adjusted through the gimbal so that the main camera keeps the original direction set by the user, affecting neither the user's operating feel nor the main camera's shooting.
• when the GPS signal received by the drone is poor (for example, when the drone is flying near a building), the drone cannot accurately obtain its own positioning information, which causes position drift and makes control dangerous.
  • the nose can be pointed towards the nearest target object with rich texture, and the UAV can be positioned through the image collected by the binocular vision sensor, so as to ensure the flight safety.
• when the drone can effectively obtain its own accurate motion information, the binocular vision sensor is instead used to observe obstacles in the drone's movement direction, making up for the sensor's blind spots and ensuring flight safety.
• for example, when the drone takes off, both the drone and the user face forward; after the user pushes the stick forward on the remote controller, the drone flies directly ahead of the user.
• to assess the positioning accuracy of the GPS signal, one can, for example, refer to the DOP (dilution of precision) index and the sAcc (speed accuracy estimate) index of GPS, as well as the number of satellites, and verify the GPS signal against the data collected by the inertial measurement unit on the UAV, so as to comprehensively judge whether the current GPS positioning is accurate and usable.
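A hedged sketch of such a cross-check. All thresholds here are illustrative; this document does not prescribe specific DOP, sAcc, satellite-count, or divergence values.

```python
def gps_reliable(dop, sacc, num_sats, gps_pos, imu_pos, max_divergence_m=10.0):
    """Cross-check GPS quality: DOP and sAcc indices, satellite count, and
    agreement with the position propagated from the IMU."""
    # Quality indices and satellite count (illustrative thresholds).
    if dop > 2.0 or sacc > 1.0 or num_sats < 6:
        return False
    # Verify GPS against the IMU-propagated position: a large divergence
    # suggests the positioning signal is not trustworthy.
    dx = gps_pos[0] - imu_pos[0]
    dy = gps_pos[1] - imu_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= max_divergence_m
```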
• with the gimbal facing straight ahead (the initial direction), the fuselage is rotated left and right within the limit range of the gimbal to find, in the drone's flight environment, the direction of the nearest richly textured target object; the nose is then kept in this direction so that the binocular vision sensor faces the target object, and positioning is completed based on the captured images of the target object.
• when determining the richly textured target object closest to the UAV in the environment, the nose can be rotated to different directions in turn while images are collected through the binocular vision sensor, and the average density of the feature points in the images collected for each nose direction can be determined according to formula (1): dc = P0 / P1, where dc represents the average density of feature points in the image, P0 represents the number of extracted feature points, and P1 represents the total number of pixels in the image.
  • the average depth of the feature points in the image can be calculated as follows:
• if the depth cannot be accurately calculated from the feature point matching results of the binocular sensor or of the monocular sensor, a relatively large value, such as 500 m, is assigned to the average depth.
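Formula (1) and the average-depth fallback just described can be sketched as follows (function names are illustrative):

```python
def average_feature_density(num_features, num_pixels):
    """Formula (1): dc = P0 / P1, feature points per pixel."""
    return num_features / num_pixels

def average_depth(depths, fallback_m=500.0):
    """Mean depth of matched feature points; if matching failed and no
    depth could be computed, fall back to a large value such as 500 m."""
    valid = [d for d in depths if d is not None]
    if not valid:
        return fallback_m
    return sum(valid) / len(valid)
```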
• provided the rotation angle of the gimbal does not exceed its limit (the maximum rotation angle is ±80°; with a 15° margin reserved for smoothing, it is generally ±65°) and the field of view is not blocked (for example, the landing gear or a propeller becomes visible at 60°), the fuselage is oriented as far as possible in the current direction of motion so as to observe the environment in that direction, while the camera gimbal keeps facing straight ahead.
• the current orientation of the drone's nose and the movement direction commanded by the user can be obtained; according to the difference between the two directions, combined with the perception range of the binocular vision sensor and the limit angle of the gimbal, the nose orientation is adjusted while the gimbal keeps facing the original direction. For example:
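This nose/gimbal coordination can be sketched as below, with yaw angles in degrees. The 65° limit is the margin-reduced gimbal range used here as an illustrative default; the function names are assumptions for the example.

```python
def plan_yaw(nose_yaw_deg, motion_dir_deg, gimbal_limit_deg=65.0):
    """Rotate the nose toward the motion direction, limited by how far the
    gimbal can counter-rotate; return (new_nose_yaw, gimbal_yaw) so the
    camera stays at the original nose heading."""
    # Shortest signed turn from the nose heading to the motion direction.
    delta = (motion_dir_deg - nose_yaw_deg + 180) % 360 - 180
    # Clamp so the gimbal can still undo the rotation.
    delta = max(-gimbal_limit_deg, min(gimbal_limit_deg, delta))
    new_nose = nose_yaw_deg + delta
    return new_nose, -delta  # gimbal rotates opposite to keep the camera fixed
```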
• correspondingly, an embodiment of the present application also provides a control device for an unmanned aerial vehicle. The UAV includes a perception sensor and a camera; the perception sensor is fixedly connected to the fuselage of the UAV, and the camera is movably connected to the fuselage via a gimbal. The UAV also includes a power mechanism, and the fuselage of the UAV can adjust its pose under the action of the power mechanism. As shown in FIG. 8,
  • the device includes a processor 81, a memory 82, and a computer program stored in the memory 82 for the processor 81 to execute.
• when the processor 81 executes the computer program, the following steps can be implemented: controlling the camera to work and sending the image captured by the camera to a remote control device for controlling the drone, the image being used for display on the remote control device;
• controlling the power mechanism and the gimbal to work so that the camera faces the target movement direction and the perception sensor faces another direction different from the target movement direction;
  • the movement of the UAV is adjusted based on the environmental observation information collected by the perception sensor.
  • the orientation of the perception sensor is determined based on the observation range of the camera and/or the environmental observation information.
  • the viewing range of the perceptual sensor is at least partially different from the viewing range of the camera.
  • the processor is further configured to:
• a target area in the environment where the UAV is located is determined based on the environmental observation information of the UAV, and the direction of the target area relative to the UAV is different from the target movement direction;
  • the pose of the fuselage is adjusted, and the gimbal is controlled to work to adjust the pose of the camera, so that the camera is directed toward the target movement direction, and the perception sensor is directed toward the target area.
  • the environment observation information includes texture information of the environment, and the target area is determined based on the texture information.
  • the processor when the processor is configured to adjust the movement of the drone based on the environmental observation information collected by the perception sensor, the processor is specifically configured to:
• the movement of the drone is adjusted based on the environmental observation information collected by the perception sensor so that the drone avoids collision with obstacles in the environment, and/or maintains a preset relative motion state with respect to a target object in the environment.
  • the processor when the processor is configured to adjust the movement of the drone based on the environmental observation information collected by the perception sensor, the processor is specifically configured to:
  • the motion of the drone is adjusted based on the motion state data.
  • the processor when the processor is configured to determine the motion state data of the drone itself based on the environmental observation information collected by the perception sensor, the processor is specifically configured to:
  • the motion state data is determined based on the displacement change data.
  • the processor is further configured to:
  • the motion state data is obtained based on the fusion of the environment observation information and the motion monitoring data.
  • the pose sensors include one or more of the following sensors: gyroscopes, accelerometers, inertial measurement units, GPS, and GNSS.
  • the motion state data includes one or more of the following information: acceleration, speed, movement distance, and rotation angle.
  • the processor is configured to control the power mechanism and the pan/tilt to work, make the camera face the target movement direction, and make the sensing sensor face a different direction from the target movement direction Before the other directions, also used to:
• if the accuracy information does not meet the preset accuracy conditions, a target area with rich textures in the environment where the UAV is located is acquired, and the perception sensor is controlled to face the target area.
  • the perception sensor includes one or more of the following sensors: lidar, millimeter-wave radar, multi-camera, monocular, and spectral cameras.
• correspondingly, an embodiment of the present application also provides a control device for an unmanned aerial vehicle. The UAV includes a perception sensor and a camera; the perception sensor is fixedly connected to the fuselage of the UAV, and the camera is movably connected to the fuselage via a gimbal; the fuselage of the UAV can adjust its pose under the action of the power mechanism. As shown in FIG. 8, the device includes a processor 81, a memory 82, and a computer program stored in the memory 82 for execution by the processor 81; when the processor 81 executes the computer program, the following steps can be implemented:
  • controlling the camera to work and sending the image captured by the camera to a remote control device for controlling the drone, and the image is used for display on the remote control device;
  • the movement of the UAV is adjusted based on the environmental observation information collected by the perception sensor.
  • the target direction is a target movement direction determined in response to a user's motion control operation on the remote controller.
  • the target direction is a fixed direction set in response to a user's orientation setting operation on the remote control.
  • the fixed direction is based on a pointing setting of the remote control.
  • the target direction is a fixed direction set in response to a user's orientation setting operation on the remote control
• when the processor is used to control the power mechanism and the gimbal to work so that the camera faces the target direction and the perception sensor faces other directions different from the target direction, it is specifically used for: controlling the power mechanism and the gimbal to work so that the camera faces the target direction and the perception sensor faces the real-time movement direction of the movable platform.
  • the processor is further configured to:
  • the orientation of the sensing sensor is determined based on the accuracy information of the data collected by the pose sensor of the UAV.
  • the processor when the processor is configured to determine the orientation of the perception sensor based on the accuracy information of the data collected by the pose sensor of the UAV, the processor is specifically configured to:
  • a target area with rich textures in the environment where the UAV is located is acquired, and the perception sensor is controlled to face the target area.
• correspondingly, an embodiment of the present application also provides a control device for an unmanned aerial vehicle. The UAV includes a power mechanism and a perception sensor; the perception sensor is fixedly connected to the fuselage of the UAV, and the fuselage of the UAV can adjust its pose under the action of the power mechanism.
  • the device includes a processor 81, a memory 82, and a computer program stored in the memory 82 for the processor 81 to execute, When the processor 81 executes the computer program, the following steps can be implemented:
• the pose of the fuselage is adjusted by the power mechanism based on the task type, so as to adjust the orientation of the perception sensor to a target orientation, wherein the accuracy with which the perception sensor performs the task to be performed when it is in the target orientation is higher than the accuracy when it is in other orientations.
  • the UAV is further provided with a camera, the camera is movably connected to the UAV through a gimbal, and the processor is further configured to:
  • the pan/tilt is controlled to rotate to keep the orientation of the camera as the orientation set by the user.
  • the processor when the processor is configured to determine the task type of the task to be performed by the sensing sensor, it is specifically configured to:
  • the task type of the task to be performed by the perception sensor is determined based on the space environment information where the UAV is located.
  • the processor when the processor is configured to determine the task type of the task to be performed by the perception sensor based on the positioning accuracy of the positioning signal received by the UAV, the processor is specifically configured to:
  • the task type is a positioning task.
  • the task type is a positioning task
  • the target orientation meets the following conditions:
  • a target object can be observed, the distance between the target object and the UAV is less than a first preset threshold and the feature point density of the target object is greater than a second preset threshold.
  • the processor when the processor is configured to adjust the orientation of the sensing sensor to the target orientation based on the task type, the processor is specifically configured to:
  • the sensing sensor is used to collect images of the environment to obtain multiple frames of images;
  • the orientation of the sensing sensor is adjusted to a target orientation according to the density of the feature points in the multi-frame images and the depth information of the feature points.
  • the processor when the processor is configured to determine the task type of the task to be performed by the perception sensor based on the positioning accuracy of the positioning signal received by the UAV, the processor is specifically configured to:
  • the task type is an obstacle avoidance task.
  • the task type is an obstacle avoidance task, and when the perception sensor is facing the target, the current movement direction of the drone is within the field of view of the perception sensor.
  • the processor when the processor is configured to adjust the orientation of the sensing sensor to the target orientation based on the task type, the processor is specifically configured to:
  • the orientation of the sensing sensor is adjusted to the target orientation according to the current movement direction of the drone and the maximum rotation angle of the gimbal.
  • the processor when the processor is configured to adjust the orientation of the sensing sensor to the target orientation according to the movement direction and the maximum rotation angle of the pan/tilt head, the processor is specifically configured to:
  • the target orientation is the orientation after adjusting the target angle of the sensing sensor, wherein the preset angle is determined based on the maximum rotation angle.
  • the processor when the processor is configured to adjust the orientation of the sensing sensor to the target orientation according to the movement direction and the maximum rotation angle of the pan/tilt head, the processor is specifically configured to:
  • the target orientation is the orientation of the sensing sensor after adjusting the preset angle.
• the perception sensor is located at the nose position of the UAV, and when the processor is configured to adjust the orientation of the fuselage of the UAV based on the mission type, it is specifically used for:
  • the orientation of the drone nose is adjusted based on the mission type.
  • the positioning signal is a GPS signal
  • the positioning accuracy of the positioning signal is determined based on one or more of the following data: a precision factor of the positioning signal, the number of satellites transmitting the positioning signal, and the The data measured by the inertial measurement unit of the UAV.
• the present application also provides an unmanned aerial vehicle, which includes a perception sensor, a camera, a power mechanism, and the control device according to any one of the above embodiments; the perception sensor is fixedly connected to the fuselage of the drone, the camera is movably connected to the fuselage via a gimbal, and the fuselage of the drone can adjust its pose under the action of the power mechanism.
  • an embodiment of the present specification further provides a computer storage medium, where a program is stored in the storage medium, and when the program is executed by a processor, the control method of the drone in any of the foregoing embodiments is implemented.
  • Embodiments of the present specification may take the form of a computer program product embodied on one or more storage media having program code embodied therein, including but not limited to disk storage, CD-ROM, optical storage, and the like.
  • Computer-usable storage media includes permanent and non-permanent, removable and non-removable media, and storage of information can be accomplished by any method or technology.
  • Information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), Flash Memory or other memory technology, Compact Disc Read Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, Magnetic tape cassettes, magnetic tape magnetic disk storage or other magnetic storage devices or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
  • the apparatus embodiments since they basically correspond to the method embodiments, reference may be made to the partial descriptions of the method embodiments for related parts.
• the device embodiments described above are only illustrative, wherein the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution in this embodiment. Those of ordinary skill in the art can understand and implement it without creative effort.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A control method and apparatus for a UAV, and a UAV. During the movement of the UAV, an operation instruction input by the user via the UAV's remote control device can be received, and a target direction in which the user expects the UAV's camera to face is determined based on the operation instruction. The gimbal carrying the camera on the UAV can then be controlled to move so as to adjust the camera to the target direction and capture images in that direction. At the same time, the power mechanism on the UAV can be controlled to move so as to drive the UAV to adjust its pose, and the orientation of the perception sensor fixedly connected to the fuselage of the UAV is adjusted to another direction different from the target direction. By making the orientation of the perception sensor different from that of the camera, a larger combined observation range can be obtained, the receptive field of the UAV is enlarged, and the flight safety of the UAV is improved.

Description

Control method and apparatus for an unmanned aerial vehicle, and unmanned aerial vehicle. Technical Field
The present application relates to the technical field of unmanned aerial vehicles, and in particular to a control method and apparatus for a UAV, and a UAV.
Background
Unmanned aerial vehicles are used more and more widely. A UAV is usually equipped with a camera that captures images of the flight environment and transmits them to the UAV's remote control device; the user can view the captured images on the remote control device and control the UAV's motion state accordingly. In addition, a UAV is generally also provided with a perception sensor for sensing surrounding environment information, so that obstacles in the environment can be detected in time and collisions avoided. During flight, how to make better use of the camera and the perception sensor to perceive the flight environment and improve the flight safety of the UAV is critical.
Summary
In view of this, the present application provides a control method and apparatus for a UAV, and a UAV.
According to a first aspect of the present application, a control method for a UAV is provided. The UAV includes a perception sensor and a camera; the perception sensor is fixedly connected to the fuselage of the UAV, and the camera is movably connected to the fuselage of the UAV via a gimbal. The UAV further includes a power mechanism, and the fuselage of the UAV can adjust its pose under the action of the power mechanism. The method includes:
controlling the camera to work and sending the images captured by the camera to a remote control device used to control the UAV, the images being displayed on the remote control device;
receiving an operation instruction from the remote control device, and determining a target movement direction of the UAV based on the operation instruction;
controlling the power mechanism and the gimbal to work so that the camera faces the target movement direction and the perception sensor faces another direction different from the target movement direction;
adjusting the motion of the UAV based on environment observation information collected by the perception sensor.
According to a second aspect of the present application, a control method for a UAV is provided. The UAV includes a perception sensor and a camera; the perception sensor is fixedly connected to the fuselage of the UAV, and the camera is movably connected to the fuselage of the UAV via a gimbal. The fuselage of the UAV can adjust its pose under the action of the power mechanism. The method includes:
controlling the camera to work and sending the images captured by the camera to a remote control device used to control the UAV, the images being displayed on the remote control device;
receiving an operation instruction from the remote control device, and determining a target direction of the UAV based on the operation instruction;
controlling the power mechanism and the gimbal to work so that the camera faces the target direction and the perception sensor faces another direction different from the target direction;
adjusting the motion of the UAV based on environment observation information collected by the perception sensor.
According to a third aspect of the present application, a method for a UAV is provided. The UAV includes a power mechanism and a perception sensor; the perception sensor is fixedly connected to the fuselage of the UAV, and the fuselage of the UAV can adjust its pose under the action of the power mechanism. The method includes:
determining a task type of a task to be performed by the perception sensor;
adjusting the pose of the fuselage via the power mechanism based on the task type, so as to adjust the orientation of the perception sensor to a target orientation, wherein the accuracy with which the perception sensor performs the task to be performed when it is in the target orientation is higher than the accuracy with which it performs the task when it is in other orientations.
According to a fourth aspect of the present application, a control apparatus for a UAV is provided. The UAV includes a perception sensor and a camera; the perception sensor is fixedly connected to the fuselage of the UAV, and the camera is movably connected to the fuselage of the UAV via a gimbal. The UAV further includes a power mechanism, and the fuselage of the UAV can adjust its pose under the action of the power mechanism. The apparatus includes a processor, a memory, and a computer program stored in the memory and executable by the processor; when the processor executes the computer program, the following steps are implemented:
controlling the camera to work and sending the images captured by the camera to a remote control device used to control the UAV, the images being displayed on the remote control device;
receiving an operation instruction from the remote control device, and determining a target direction of the UAV based on the operation instruction;
controlling the power mechanism and the gimbal to work so that the camera faces the target direction and the perception sensor faces another direction different from the target direction;
adjusting the motion of the UAV based on environment observation information collected by the perception sensor.
According to a fifth aspect of the present application, a control apparatus for a UAV is provided. The UAV includes a perception sensor and a camera; the perception sensor is fixedly connected to the fuselage of the UAV, and the camera is movably connected to the fuselage of the UAV via a gimbal. The UAV further includes a power mechanism, and the fuselage of the UAV can adjust its pose under the action of the power mechanism. The apparatus includes a processor, a memory, and a computer program stored in the memory and executable by the processor; when the processor executes the computer program, the following steps are implemented:
controlling the camera to work and sending the images captured by the camera to a remote control device used to control the UAV, the images being displayed on the remote control device;
receiving an operation instruction from the remote control device, and determining a target direction of the UAV based on the operation instruction;
controlling the power mechanism and the gimbal to work so that the camera faces the target direction and the perception sensor faces another direction different from the target direction;
adjusting the motion of the UAV based on environment observation information collected by the perception sensor.
According to a sixth aspect of the present application, a control apparatus for controlling a UAV is provided. The UAV includes a perception sensor fixedly connected to the fuselage of the UAV; the UAV further includes a power mechanism, and the fuselage of the UAV can adjust its pose under the action of the power mechanism. The apparatus includes a processor, a memory, and a computer program stored in the memory and executable by the processor; when the processor executes the computer program, the following steps are implemented:
determining a task type of a task to be performed by the perception sensor;
adjusting the pose of the fuselage via the power mechanism based on the task type, so as to adjust the orientation of the perception sensor to a target orientation, wherein the accuracy with which the perception sensor performs the task to be performed when it is in the target orientation is higher than the accuracy with which it performs the task when it is in other orientations.
According to a seventh aspect of the present application, a UAV is provided. The UAV includes a perception sensor, a camera, a power mechanism, and the control apparatus according to any one of the fourth, fifth, or sixth aspects above; the perception sensor is fixedly connected to the fuselage of the UAV, the camera is movably connected to the fuselage of the UAV via a gimbal, and the fuselage of the UAV can adjust its pose under the action of the power mechanism.
With the solution provided by the present application, during the movement of the UAV, an operation instruction input by the user via the UAV's remote control device can be received, and the target direction in which the user expects the UAV's camera to face is determined based on the operation instruction. The gimbal carrying the camera can then be controlled to move so as to adjust the camera to the target direction and capture images in that direction. At the same time, the power mechanism of the UAV can be controlled to move so as to drive the UAV to adjust its pose, and the orientation of the perception sensor fixedly connected to the fuselage is adjusted to another direction different from the target direction. By making the orientation of the perception sensor different from that of the camera, a larger combined observation range is obtained, the receptive field of the UAV is enlarged, and the flight safety of the UAV is improved.
Brief Description of the Drawings
In order to describe the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1(a) is a schematic diagram of controlling a UAV in the normal flight mode according to an embodiment of the present application.
FIG. 1(b) is a schematic diagram of controlling a UAV in the heading-lock mode according to an embodiment of the present application.
FIG. 1(c) is a schematic diagram of controlling a UAV in the intelligent-follow mode according to an embodiment of the present application.
FIG. 2 is a schematic diagram of the observation blind zone of the perception sensor of a UAV according to an embodiment of the present application.
FIG. 3 is a schematic diagram of a UAV according to an embodiment of the present application.
FIG. 4 is a flowchart of a control method for a UAV according to an embodiment of the present application.
FIG. 5 is a flowchart of a control method for a UAV according to an embodiment of the present application.
FIG. 6 is a schematic diagram of adjusting the binocular sensor of a UAV toward a direction conducive to positioning according to an embodiment of the present application.
FIG. 7 is a schematic diagram of adjusting the binocular sensor of a UAV toward a direction conducive to obstacle avoidance according to an embodiment of the present application.
FIG. 8 is a schematic diagram of the logical structure of a control apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application rather than all of them. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present application.
UAVs are widely used in various fields. A UAV is usually equipped with a camera, which can be movably connected to the fuselage via a gimbal and is used to capture images of the flight environment and transmit them to the UAV's remote control device; the user can view the captured images on the remote control device and adjust the UAV's motion state accordingly. Of course, to avoid collisions caused by improper user control, a UAV is generally also provided with a perception sensor, which is usually fixedly connected to the fuselage. The perception sensor can sense surrounding environment information and detect obstacles in the environment in time, thereby assisting the UAV in adjusting its motion state and avoiding collisions.
At present, there are three main control modes for UAVs: the normal flight mode, the heading-lock mode, and the intelligent-follow mode. FIG. 1(a) is a schematic diagram of controlling a UAV in the normal flight mode. When the user faces the same direction as the UAV's nose and pushes the remote controller's control stick 45° diagonally, the UAV flies 45° diagonally relative to its current heading (i.e., its velocity points in the 45° diagonal direction), but the nose and camera keep their original orientation, which makes the UAV easy for the user to control (control is referenced to the original orientation, the camera's viewing angle matches the user's, and no coordinate transformation is needed). FIG. 1(b) is a schematic diagram of controlling a UAV in the heading-lock mode. When the user faces the same direction as the nose and pushes the stick 45° diagonally, the UAV flies 45° diagonally relative to its current heading, the nose also turns to the 45° direction, and the camera follows the real-time nose orientation. In the heading-lock mode, an initial heading (the nose orientation when the mode was entered) is recorded, and subsequent control is referenced to that initial heading regardless of the real-time nose orientation. FIG. 1(c) is a schematic diagram of controlling a UAV in the intelligent-follow mode, in which the user selects a follow target and the UAV then automatically controls its nose orientation so that the camera faces the target in real time; control here is referenced to the positional relationship between the UAV and the target. Pushing the stick left makes the UAV orbit the target to the left, pushing right orbits it to the right, pushing forward moves the UAV toward the target, and pushing backward moves it away.
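In the normal flight mode described above, the stick direction maps to a velocity direction relative to the nose heading while the nose itself keeps its orientation. A minimal sketch of that mapping (illustrative names; a 2D world-frame velocity is assumed):

```python
import math

def normal_mode_velocity(nose_yaw_deg, stick_angle_deg, speed):
    """Normal flight mode: the stick angle is interpreted relative to the
    UAV's current nose heading; the nose itself does not turn."""
    heading = math.radians(nose_yaw_deg + stick_angle_deg)
    return speed * math.cos(heading), speed * math.sin(heading)
```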
从目前无人机的几种控制模式可知,在对无人机进行控制时,主要是基于方便相机采集被拍摄目标的图像以及方便用户对无人机进行控制的目的调整无人机机头朝向,但是没有考虑无人机的飞行安全。比如,如图2所示,无人机感知传感器覆盖范围是有限制的,由于有螺旋桨和机臂遮挡,斜45°方向一般是观测盲区,如图中的灰色区域即为感知传感器的观测盲区。在如图1(a)的模式中,运动方向刚好为观测盲区,非常危险。在如图1(c)的模式中,由于感知传感器和相机都朝向跟随目标,也无法观测侧面的障碍物。并且,在上述各种模式下,无人机的感知传感器和相机的朝向是一致的,因而两者感知的飞行环境的范围存在很大重叠,无法最大化利用感知传感器和相机的观测范围,以对飞行环境中更大的角度范围进行观测。
基于此,本申请实施例提供一种无人机的控制方法,在无人机飞行过程中,可以控制无人机的相机的朝向和感知传感器的朝向不同,以使两者组合可以得到更大的观测范围,提升无人机在飞行过程中的安全性。
如图3所示,为本申请实施例一种无人机的示意图,本申请实施例中的无人机包括感知传感器(如图中的31)和相机(如图中的32),该感知传感器固定连接于无人机的机身,感知传感器可以根据实际需求设置在机身的头部、尾部、机身上下表面等一个或者多个部位。由于感知传感器固定连接于机身,感知传感器的朝向可以随着无人机机身位姿的调整而调整,感知传感器可以用于采集无人机飞行环境中的环境观测信息。
无人机中的相机可以通过云台可活动地连接于无人机的机身,云台可以驱动相机转动,改变相机朝向,以对飞行环境中一定角度范围内的对象进行图像采集。其中,云台可以设置在无人机机身的头部、或者机身的尾部,或者其他对相机遮挡较小的位置。此外,无人机还包括动力机构,动力机构可以驱动无人机运动,并调整无人机机身的位姿。
具体的,如图4所示,本申请实施例提供的无人机控制方法包括以下步骤:
S402、控制所述相机工作,将所述相机采集的影像发送至用于对所述无人机进行控制的遥控设备,所述影像用于在所述遥控设备上展示;
S404、接收所述遥控设备的操作指令,并基于所述操作指令确定所述无人机的目标方向;
S406、控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器朝向与所述目标方向不同的其他方向;
S408、基于所述感知传感器采集到的环境观测信息调整所述无人机的运动。
无人机可以控制相机工作,以采集飞行环境中被拍摄目标的影像,如视频或图像,并将相机采集的影像发送给对无人机进行控制的遥控设备,比如,与无人机配套使用的遥控器、或者是其他可以控制无人机的终端设备。遥控设备可以通过屏幕显示影像,以便用户查看。同时,用户也可以基于该影像对无人机的运动状态进行调整,比如,通过影像发现前方有障碍物,即调整无人机的运动方向或速度,或者基于用户需求调整无人机运动,以采集得到符合用户需求的影像。用户可以通过遥控设备对无人机进行控制,比如,可以通过遥控设备控制相机的朝向,以对被拍摄目标进行拍摄,或者可以通过遥控设备上的遥杆调整无人机的运动方向或者速度。无人机可以接收用户通过遥控设备输入的操作指令,并根据该操作指令确定无人机的目标方向。其中,目标方向可以是用户期望相机朝向的方向。当无人机基于用户操作指令确定当前的目标方向时,可以控制无人机的云台运动,从而驱动相机朝向该目标方向,对该目标方向上的场景进行拍摄。此外,无人机还可以控制动力机构运动,从而调整机身的位姿,使得机身上的感知传感器朝向与该目标方向不同的其他方向,并基于感知传感器采集到的环境观测信息调整无人机的运动。通过将相机的朝向调整至与基于用户输入的指令确定的目标方向一致,可以通过相机采集用户期望的目标方向上的图像,对目标方向上的环境进行观测,同时将感知传感器调整至与目标方向不同的其他方向,以对其他方向上的环境进行观测。通过将相机和感知传感器朝向不同的方向,可以增大无人机的感知视野和观测范围。这样用户既可以根据相机采集的图像确定目标方向是否存在障碍物,从而调整无人机的运动,同时,无人机自身也可以结合感知传感器采集的环境信息确定其他方向是否存在障碍物,并调整无人机的运动,提升了无人机在飞行过程中的安全。
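上述"接收操作指令、确定目标方向、云台指向目标方向、机身将感知传感器错开至其他方向"的流程,可以用如下Python代码作一个简化示意。其中的函数名、角度约定(以初始机头方向为0°、角度单位为度)以及90°的错开角均为本文之外的假设,仅用于说明思路,并非本申请方案的实际实现:

```python
def plan_orientations(stick_angle_deg, sensor_offset_deg=90.0):
    """根据打杆方向计算相机(云台)目标朝向与机身(感知传感器)目标朝向。

    stick_angle_deg: 用户打杆方向,即期望的目标方向(度)
    sensor_offset_deg: 感知传感器与相机朝向之间的期望夹角(假设值)
    返回 (相机目标朝向, 感知传感器目标朝向)
    """
    camera_target = stick_angle_deg % 360.0
    # 将感知传感器错开一个角度,使两者观测范围尽量不重叠
    sensor_target = (camera_target + sensor_offset_deg) % 360.0
    return camera_target, sensor_target

# 用户向斜45°方向打杆:云台指向45°,机身使感知传感器指向135°
cam, sensor = plan_orientations(45.0)
```

实际系统中,云台与机身的控制是闭环的姿态控制问题,此处仅示意两路朝向的分配关系。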
本申请实施例中的感知传感器可以是任意能够感知环境中各对象的三维空间信息的传感器,即可以感知三维空间的三维点与无人机距离的传感器,从而确定各对象在环境中的分布。在一些实施例中,感知传感器包括以下传感器中的一种或者多种:激光雷达、毫米波雷达、多目摄像头、单目摄像头和光谱摄像头。
在一些实施例中,相机可以用于采集无人机运动方向上的图像,从而用户可以基于相机采集的图像观测无人机运动方向上的情况。因而,目标方向可以是响应于用户在遥控设备上的运动控制操作确定的目标运动方向。用户可以通过遥控设备上的控制装置(比如,遥控器上的遥杆)控制无人机的运动方向,因而可以基于用户在遥控设备上的运动控制操作确定无人机的目标运动方向,然后将相机朝向该目标运动方向。比如,用户希望无人机往当前机头方向的斜45°方向前进时,可以将遥控设备的遥杆打杆至当前机头方向的斜45°方向,无人机根据用户的打杆操作确定目标运动方向后,即可以将该目标运动方向确定为目标方向。
在一些实施例中,该目标方向也可以是响应于用户在遥控设备上的朝向设置操作设置的固定方向。在一些场景,用户可能希望相机朝着固定方向,以对固定方向上的对象进行拍摄。此时,即便无人机的运动方向改变,相机的方向一直保持固定不变。因而,遥控设备可以提供设置相机朝向某个固定方向的功能,用户可以通过遥控设备设置相机朝向的固定方向。当无人机接收到用户通过遥控设备设置的固定方向时,即可以将固定方向作为目标方向。
在一些实施例中,该固定方向可以基于遥控设备的指向设置。比如,有些遥控设备自身可能包括陀螺仪等位姿传感器,因而可以结合位姿传感器采集的数据确定当前遥控设备的指向,所以,用户如果想设置固定方向,可以直接转动遥控设备,将遥控设备指向某个方向,此时位姿传感器可以检测到该方向,作为该固定方向。此外,用户也可以转动自身的朝向,以改变手中的遥控器的指向,从而设置固定方向。此时,将相机朝向遥控器指向的方向,相机采集的图像的视角与用户观测的视角一致,方便用户控制无人机。
由于感知传感器可以感知环境中各对象的分布,以及各对象与无人机的距离,因而,利用感知传感器采集的环境观测信息,既可以实现对无人机进行定位,也可以实现无人机的避障。在一些实施例中,感知传感器的朝向可以基于相机的观测范围和/或感知传感器采集的环境观测信息确定。比如,为了使无人机在飞行过程中可以观测到更大的范围,从而及时检测到各个方向的障碍物,避免撞机,感知传感器的朝向可以基于相机的观测范围确定,比如,可以调整感知传感器的观测范围,使其与相机的观测范围尽量不重合,从而得到更大的组合观测范围。当然,在一些场景,为了更好地利用感知传感器完成避障或者定位任务,感知传感器的朝向也可以基于感知传感器采集的环境观测信息确定,比如,为了更好地避障,避免无人机撞机,感知传感器可以朝向障碍物分布较多的方向,以对障碍物分布较多的方向进行观测。在一些场景,为了利用感知传感器进行定位,也可以将感知传感器朝向有利于对无人机进行定位的方向。
在一些实施例中,感知传感器的观测范围至少部分与相机的观测范围不同,从而,可以利用感知传感器和相机得到更大的观测范围,增大无人机的感受野,以对各个方向进行观测,避开障碍物,提升飞行安全。
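"观测范围尽量不重合可以得到更大的组合观测范围"这一点,可以用如下Python代码作一个粗粒度的量化示意。其中以1°为粒度统计水平面内的覆盖角度,78°的视场角为假设值,仅为说明错开朝向带来的覆盖增益:

```python
def coverage_deg(centers, fov=78.0):
    """统计若干朝向(各具 fov 视场角,单位:度)在水平面内合计覆盖的角度数。

    centers: 各观测设备(相机、感知传感器)朝向角度的列表
    fov: 单个设备的水平视场角(假设值)
    """
    covered = set()
    half = fov / 2.0
    for c in centers:
        # 以 1° 为粒度,把该朝向覆盖的角度加入集合(自动去重即为"组合观测范围")
        for a in range(int(c - half), int(c + half)):
            covered.add(a % 360)
    return len(covered)

same = coverage_deg([0.0, 0.0])     # 相机与传感器同向:范围完全重叠
apart = coverage_deg([0.0, 90.0])   # 错开 90°:组合观测范围更大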
在一些实施例中,如果目标方向为无人机的目标运动方向,即相机朝向无人机的运动方向时,可以利用感知传感器采集的环境观测信息确定无人机所处环境中的目标区域,该目标区域相对于无人机的方向与目标运动方向不同,即该目标区域不在无人机的目标运动方向上,然后可以调整无人机机身的姿态,将感知传感器朝向该目标区域,以对目标区域进行感知,并调整相机的位姿,以使相机朝向目标运动方向。其中,目标区域可以根据感知传感器待执行的任务确定,比如,要利用感知传感器避障,则目标区域可以是存在动态障碍物的区域,要利用感知传感器定位,则目标区域可以是纹理丰富的区域。
由于相机可以朝向用户设置的固定方向,其朝向可以与无人机的运动方向不一致。在一些实施例中,在相机对固定方向进行图像采集的同时,为了方便感知传感器对无人机运动方向进行观测,在控制动力机构和云台工作,使相机朝向目标方向,并使感知传感器朝向与目标方向不同的其他方向时,可以控制云台运动,以驱动相机朝向该目标方向,同时可以通过动力机构调整无人机的位姿,使该感知传感器的朝向与无人机的实时运动方向一致。其中,无人机的运动方向可以是用户通过遥控设备输入的方向,比如,可以基于用户对遥控设备上的遥杆的控制操作确定的运动方向。
在一些实施例中,可以利用感知传感器对无人机进行定位,为了获得比较准确的定位结果,可以将感知传感器朝向环境中纹理比较丰富的区域,方便无人机根据感知传感器采集的环境观测信息进行定位。因此,感知传感器采集的环境观测信息可以是环境的纹理信息,目标区域也可以基于纹理信息确定。比如,可以将与无人机距离比较近且纹理比较丰富的区域确定为目标区域。
在一些实施例中,在基于感知传感器采集到的环境观测信息调整无人机的运动时,可以基于感知传感器采集到的环境观测信息调整无人机的运动,以使无人机避免与环境中的障碍物发生碰撞。比如,当检测到环境中存在障碍物时,可以将无人机悬停、减小无人机飞行速度、或者调整无人机运动方向,避免与障碍物碰撞。在一些实施例中,基于感知传感器采集到的环境观测信息调整无人机的运动时,也可以基于感知传感器采集的环境信息控制无人机与环境中的目标对象保持预设的相对运动状态,比如,跟随环境中的目标对象。
在一些实施例中,在根据感知传感器采集到的环境观测信息调整无人机的运动时,可以先根据感知传感器采集到的环境观测信息确定无人机自身的运动状态数据,然后根据运动状态数据调整无人机的运动。其中,运动状态数据可以是各种表征无人机运动情况的数据,比如,可以是加速度、速度、移动距离和转动角度等一种或者多种。
在一些实施例中,在确定无人机的运动状态数据,以调整无人机的运动时,也可以获取无人机上的位姿传感器采集到的运动监测数据,然后将感知传感器采集到的环境观测信息和位姿传感器采集到的运动监测数据融合,得到运动状态数据。其中,位姿传感器可以是各种能够检测无人机的位置和姿态的传感器,比如,在一些实施例中,位姿传感器可以是陀螺仪、加速度计、惯性测量单元、GPS和GNSS中的一种或者多种。
在一些实施例中,在根据感知传感器采集到的环境观测信息确定无人机自身的运动状态数据时,可以根据环境观测信息计算无人机相对于环境中的固定物体的位移变化数据,然后根据该位移变化数据确定无人机的运动状态数据。比如,可以基于无人机与环境中每个建筑物的相对距离的变化,确定无人机当前飞行了多远的距离。
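"根据无人机与环境中固定物体的相对距离变化确定运动状态数据"这一步,可以用如下Python代码作一个一维的简化示意。其中假设已获得同一组固定物体在前后两个时刻沿运动方向的距离观测,函数名与数据形式均为示例假设:

```python
def motion_from_landmarks(prev_dists, curr_dists, dt):
    """根据无人机到环境中固定物体距离的变化,估计平均接近速度。

    prev_dists / curr_dists: 同一组固定物体在前后两个时刻的距离观测(米)
    dt: 两次观测的时间间隔(秒)
    返回平均速度(米/秒),正值表示靠近这些固定物体
    """
    deltas = [p - c for p, c in zip(prev_dists, curr_dists)]
    avg_displacement = sum(deltas) / len(deltas)  # 平均位移变化
    return avg_displacement / dt

# 两个建筑物的距离分别从 50m/80m 变为 48m/78m,间隔 1 秒 → 前进约 2 m/s
v = motion_from_landmarks([50.0, 80.0], [48.0, 78.0], 1.0)
```

实际系统中还需结合位姿传感器的运动监测数据进行融合(如卡尔曼滤波),此处仅示意由环境观测推算自身运动的基本思路。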
在一些实施例中,可以监测无人机的位姿传感器的精度信息,在无人机的位姿传感器的精度信息不满足预设条件的情况下,说明此时利用这些位姿传感器进行定位的定位结果不够准确,因而可以利用感知传感器对无人机进行定位,即将感知传感器朝向纹理比较丰富的目标区域。比如,当检测到无人机接收到的GPS定位信号的精度不满足预设的精度时,则可以将感知传感器的朝向调整至纹理比较丰富的目标区域。
当然,如果检测到无人机的位姿传感器的精度信息满足预设条件,即精度较高,则无需利用感知传感器进行定位,可以利用感知传感器执行其他任务,比如将感知传感器的朝向调整至与无人机运动方向一致的方向,以进行避障。
此外,由于无人机上的感知传感器通常都是与无人机的机身固定连接的,因而通常机身的位姿调整,感知传感器的朝向也会随之调整。以感知传感器设置在机身头部为例,目前,无人机的控制模式中,机头的调整一般是考虑有利于相机拍照或者有利于用户控制无人机,比如,在正常飞行模式中,为了方便用户控制无人机,以及方便主相机拍照,机头朝向一直朝向初始方向,即便无人机运动方向改变,也不会调整感知传感器的朝向,造成感知传感器可能感知不到运动方向上的环境信息。以如图1(c)所示的智能跟随模式为例,无人机机头朝向一直朝向被拍摄目标,从而感知传感器也朝向被拍摄目标,无法观测其他方向。可见,目前无人机的控制模式中,感知传感器的朝向只是机械地跟随机头朝向的调整而调整,并未充分发挥感知传感器的作用。
基于此,本申请实施例还提供了另一种无人机的控制方法,可以基于无人机的工作过程中的各类因素(比如,无人机的飞行状态信息、定位信号精度、障碍物的分布等)确定无人机当前待执行任务的任务类型,然后将无人机的感知传感器的朝向调整至有利于执行该待执行任务的方向,以提高感知传感器执行待执行任务时的准确度,充分利用感知传感器,提升飞行安全。
其中,本申请实施例中的无人机包括动力机构,动力机构可以驱动无人机运动,并调整无人机机身的位姿,此外,无人机还包括感知传感器,感知传感器固定连接于无人机的机身,无人机机身位姿的调整,也会带动感知传感器朝向的调整。
具体的,所述方法如图5所示,包括以下步骤:
S502、确定所述感知传感器待执行任务的任务类型;
S504、基于所述任务类型通过所述动力机构调整所述机身的位姿,以将所述感知传感器的朝向调整至目标朝向;其中,所述感知传感器位于所述目标朝向时执行所述待执行任务的准确度高于所述感知传感器位于其他朝向时执行所述待执行任务的准确度。
在无人机工作过程中,可以结合无人机工作过程中的各类信息,比如,无人机的飞行状态信息、无人机接收到的定位信号强度、无人机飞行环境中障碍物的分布、障碍物处于运动状态还是静止状态等,确定无人机当前待执行任务的任务类型,然后可以基于无人机待执行任务的任务类型通过无人机的动力机构调整机身的位姿,以将感知传感器的朝向调整至目标朝向,其中,感知传感器位于目标朝向时执行该待执行任务的准确度高于感知传感器位于其他朝向时执行该待执行任务的准确度。比如,如果感知传感器当前待执行任务是对无人机运动方向进行观测,以对运动方向进行避障,则感知传感器可以朝向无人机运动方向。如果感知传感器当前待执行任务是定位,则将感知传感器调整至有利于无人机定位的方向。如果感知传感器当前的任务是对动态障碍物进行观测,避免动态障碍物碰撞无人机,则可以将感知传感器的朝向调整至可以观测到运动障碍物的方向。其中,感知传感器可以是激光雷达、毫米波雷达、多目摄像头、单目摄像头和光谱摄像头中的一种或多种,感知传感器的待执行任务可以根据无人机的飞行状况和感知传感器可实现的功能设置,本申请实施例不作限制。
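按任务类型给感知传感器分配朝向策略的决策逻辑,可以用如下Python代码作一个极简示意。其中将定位精度归一化为0~1的质量评分、阈值0.8等均为本文之外的假设,仅为说明"先定任务类型、再定目标朝向"的分派思路:

```python
def decide_task(positioning_precision, preset_precision=0.8):
    """根据定位信号精度确定感知传感器的待执行任务类型。

    positioning_precision: 定位质量评分(假设为 0~1,越大越准)
    preset_precision: 预设精度阈值(假设值)
    """
    if positioning_precision < preset_precision:
        # 定位精度不足:优先利用感知传感器定位,朝向纹理丰富的近处目标
        return "定位任务"
    # 定位可靠:感知传感器用于避障,朝向无人机当前运动方向
    return "避障任务"
```

实际判断还可以进一步结合空间环境信息(如周围是否高楼林立、障碍物分布),此处仅保留单一条件以突出分派结构。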
通过根据无人机工作过程的各类信息确定感知传感器待执行任务,将感知传感器的朝向调整至有利于感知传感器执行待执行任务的朝向,可以更加充分的发挥感知传感器的作用,同时,可以提升感知传感器的执行待执行任务的准确度,进而提升无人机飞行安全。
感知传感器可以设置在机身的各个部位,比如,机身头部、机身尾部和/或机身的上下表面,具体可以根据实际需求设置。在一些实施例中,感知传感器可以位于无人机的机头位置,在基于该任务类型调整无人机的机身的位姿时,可以基于任务类型调整无人机机头的朝向。
感知传感器采集的信息可以用于定位,在一些实施例中,在确定感知传感器待执行任务的任务类型时,可以根据无人机接收到的定位信号的定位精度确定感知传感器待执行任务的任务类型。比如,在一些实施例中,当确定无人机接收到的定位信号的定位精度小于预设精度,则待执行任务的任务类型为定位任务。当然,如果无人机接收到的定位信号的定位精度较好,此时无需感知传感器也可以准确地完成无人机的定位,那么则可以利用感知传感器执行其他的任务。
在一些实施例中,定位信号可以是GPS信号,当然,也可以是其他具有定位功能的信号。定位信号的定位精度基于以下一种或多种数据确定:定位信号的精度因子、发送定位信号的卫星的数量和无人机的位姿传感器采集的数据,比如,无人机上的惯性测量单元、陀螺仪、加速度计采集的数据。基于位姿传感器采集的数据也可以对无人机进行定位,如果发现基于位姿传感器采集的数据确定的位置和基于定位信号确定的位置差别很大,则有可能定位信号精度不够。
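综合精度因子、卫星数以及位姿传感器推算位置的交叉校验来判断定位信号是否可用,可以用如下Python代码作一个简化示意。其中HDOP上限2.0、最少6颗卫星、10米残差上限等阈值均为示例假设,并非本申请限定的取值:

```python
def gps_quality_ok(hdop, num_sats, imu_pos, gps_pos,
                   max_hdop=2.0, min_sats=6, max_residual_m=10.0):
    """综合精度因子、卫星数与位姿传感器推算位置,校验定位信号是否可用。

    hdop: 水平精度因子(越小越好)
    num_sats: 发送定位信号的卫星数量
    imu_pos / gps_pos: 位姿传感器推算位置与定位信号给出的位置 (x, y),单位:米
    各阈值均为假设值
    """
    # 两种定位结果差别过大,说明定位信号可能不可靠
    residual = ((imu_pos[0] - gps_pos[0]) ** 2 +
                (imu_pos[1] - gps_pos[1]) ** 2) ** 0.5
    return hdop <= max_hdop and num_sats >= min_sats and residual <= max_residual_m
```

判断为不可用时,即可将感知传感器的待执行任务确定为定位任务。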
在一些实施例中,在确定感知传感器待执行任务的任务类型时,也可以根据无人机所处的空间环境信息确定感知传感器待执行任务的任务类型。比如,当确定无人机飞行环境四周高楼林立,则无人机接收到的定位信号很可能被高楼遮挡,则此时可以将无人机感知传感器待执行任务确定为定位任务。或者,如果确定无人机所处空间中障碍物分布较多,则此时可以将感知传感器待执行的任务确定为避障任务。
当确定无人机待执行的任务为定位任务时,则可以调整机身的位姿,将感知传感器的朝向调整至有利于感知传感器定位的目标朝向。通常,为了方便感知传感器定位,应尽可能将感知传感器的朝向调整至分布物体较多、物体纹理也较丰富的方向。以感知传感器为双目视觉传感器为例,通过双目视觉传感器确定无人机相对于环境中某个对象的位置时,双目视觉传感器可以采集该对象的图像,然后从图像提取特征点,基于特征点的视差确定无人机与该对象的距离。如果感知传感器采集的图像是一片空旷区域,或者是平面、纹理很少的区域,则无法提取特征点和确定距离,进而用感知传感器定位的定位效果也很差。当然,如果该对象与无人机相距较远,那么根据视差计算得到的距离准确度也会降低。因而,在采用感知传感器定位时,该对象与无人机距离近一些,则定位结果准确度也会高一些。
所以,在一些实施例中,目标朝向可以符合以下条件:即感知传感器朝向目标朝向时,感知传感器可以观测到目标对象,目标对象为无人机所处环境中与无人机的距离小于第一预设距离且特征点密度大于第二预设阈值的对象。通过将无人机朝向与无人机距离较近且纹理比较丰富(即特征点密度较大)的目标对象,可以提升感知传感器的定位精度。其中,第一预设距离和第二预设阈值可以根据实际需求设置,本申请实施例不作限制。
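"距离小于第一预设距离且特征点密度大于第二预设阈值"这一目标对象筛选条件,可以用如下Python代码直接表达。其中30米与0.02两个阈值均为示例假设:

```python
def is_candidate_target(distance_m, feature_density,
                        max_distance_m=30.0, min_density=0.02):
    """判断某对象是否满足目标对象条件:距离足够近且纹理足够丰富。

    distance_m: 对象与无人机的距离(米)
    feature_density: 对象图像的特征点密度
    max_distance_m / min_density: 第一预设距离与第二预设阈值(均为假设值)
    """
    return distance_m < max_distance_m and feature_density > min_density
```
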
在一些实施例中,在根据任务类型将感知传感器的朝向调整至目标朝向时,可以不断调整机身位姿,以调整感知传感器的朝向,然后通过该感知传感器对无人机所处的环境进行图像采集,得到多帧图像,然后可以提取多帧图像中的特征点,确定各帧图像中特征点的密度。此外,还可以确定特征点的深度信息,以确定环境中各对象与无人机的距离。然后可以基于多帧图像的特征点密度和特征点的深度信息将感知传感器的朝向调整至目标朝向,即将感知传感器朝向环境中与无人机距离较近且纹理较丰富的目标对象。
在一些实施例中,在根据无人机接收到的定位信号的定位精度确定感知传感器待执行任务的任务类型时,如果确定无人机接收到的定位信号的定位精度大于预设精度,说明此时采用接收到的定位信号即可以准确定位,则无需考虑采用感知传感器进行定位,此时,可以将感知传感器的任务类型确定为避障任务,即优先考虑使用感知传感器避障,将感知传感器的朝向调整至有利于避障的朝向。
由于无人机运动方向上的障碍物需要重点关注,所以,在一些实施例中,当该任务类型为避障任务时,则目标朝向应符合以下条件:即当感知传感器处于目标朝向时,无人机当前的运动方向在感知传感器的视野范围内,比如,感知传感器的朝向可以与运动方向一致,或者感知传感器的朝向与运动方向偏差很小,以确保感知传感器可以观测到运动方向上的障碍物。
在一些实施例中,如果待执行任务为避障任务,则在基于该任务类型将感知传感器的朝向调整至目标朝向时,可以根据无人机当前的运动方向以及云台的最大旋转角度将感知传感器的朝向调整至目标朝向。由于调整感知传感器的朝向时,需调整机身的位姿,此时云台搭载的相机的朝向也会调整,为了保证相机可以继续对被拍摄目标进行图像采集,此时,需旋转云台,以保证相机的朝向不变。因而,在调整感知传感器的朝向,使其可以观测到无人机的运动方向时,需根据云台的最大旋转角度以及无人机当前的运动方向确定可以将机身旋转的角度,避免调整机身位姿后,云台无法反向旋转同样的角度,以保证相机朝向不变。
在一些实施例中,在根据运动方向以及云台的最大旋转角度将感知传感器的朝向调整至目标朝向时,可以先根据运动方向和感知传感器的当前朝向确定目标角度,目标角度即为将感知传感器的朝向调整至与运动方向一致时,需调整的角度,若目标角度小于预设角度,则目标朝向为将感知传感器调整目标角度后的朝向,其中,所述预设角度基于最大旋转角度确定,比如,预设角度可以是云台的最大旋转角度,或者是最大旋转角度减去一定缓冲角度后的角度。在一些实施例中,若目标角度大于预设角度,则目标朝向为将感知传感器调整预设角度后的朝向。
为了进一步解释本申请的无人机控制的方法,以下结合一个具体的实施例对本申请的无人机控制方法加以解释。
如图3所示的无人机,包括固定连接于无人机机头的双目视觉传感器31,以及通过云台可活动地连接于机头的主相机32。目前无人机的控制模式中,为了方便用户对无人机的控制,以及方便主相机拍摄录像,机头通常与主相机朝向同一个方向,由于双目视觉传感器与机身固定连接,使得双目视觉传感器与主相机的朝向也一致,从而未能将双目视觉传感器指向合适的方向,以充分发挥双目视觉传感器的作用。
本实施例提出一种方案,可以根据无人机的飞行状态来调整机头朝向,从而调整双目视觉传感器的朝向,使双目视觉传感器朝向有利于定位或避障的方向,并且可以通过云台调整主相机的方向,使得主相机保持在用户设置的原方向上,不影响用户的操作手感,同时也不会影响主相机的拍摄工作。
双目视觉传感器的方向调整的大体原则如下:
当无人机接收到的GPS信号不好时(比如,无人机近楼飞行的场景),无人机无法准确获取自身的定位信息,会造成位置漂移,控制起来有一定的危险性,这时候可以将机头朝向最近的纹理丰富的目标对象,通过双目视觉传感器采集的图像对无人机进行定位,从而保证飞行安全。
当无人机接收到的GPS信号良好时,无人机能有效获取到自身准确的运动信息,则将利用双目视觉传感器观测无人机运动方向上的障碍物,弥补双目视觉传感器盲区的缺陷,从而保证飞行安全。
1、利用双目视觉传感器定位
如图6所示,无人机起飞时,无人机与用户均面向前方,用户通过遥控器向前推杆后,无人机向用户的正前方飞行。飞行过程中,可以实时检测接收到的GPS信号的定位精度,比如,可以参考GPS的DOP(Dilution of precision,精度因子)指标和sAcc(Speed Accuracy Estimate)指标,以及卫星数,并结合无人机上的惯性测量单元采集的数据对GPS信号进行校验,综合判断当前GPS定位是否准确可用。
当检测到GPS定位信号的定位精度较差时,可以保持云台朝正前方(初始方向)的同时,在云台限位范围内左右旋转机身,查找无人机飞行环境中与无人机最近的、纹理丰富的目标对象所在方向,并将机头保持在这个方向上,以便双目视觉传感器朝向该目标对象,并基于采集的该目标对象的图像完成定位。
在确定环境中与无人机最近的、纹理丰富的目标对象时,可以依次将机头旋转至不同的方向,通过双目视觉传感器采集图像,并确定机头朝向各方向时采集的图像中特征点的平均密度,特征点的平均密度可以根据公式(1)确定:
d_c = P0 / P1          (1)
其中,d_c表示图像中特征点的平均密度,P0表示提取的特征点的数量,P1表示图像中像素点的总数量;d_c越大,纹理越丰富。
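公式(1)的计算可以直接写成如下Python代码(函数名为示例假设):

```python
def feature_density(num_features, num_pixels):
    """按公式(1)计算特征点平均密度 d_c = P0 / P1,d_c 越大表示纹理越丰富。

    num_features: 提取的特征点数量 P0
    num_pixels: 图像中像素点总数量 P1
    """
    return num_features / num_pixels

# 例如 100000 像素的图像中提取到 500 个特征点
d = feature_density(500, 100000)
```
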
然后,针对机头朝向每个方向时采集的图像,可以计算图像中特征点的平均深度,具体如下:
(1)如果双目视觉传感器中左右目传感器最新采集的图像匹配成功的点数大于一定阈值,比如100个点,则使用左右目传感器采集的图像进行特征点匹配,计算特征点的深度,并求取平均深度。
(2)如果不满足(1),则采用其中一个传感器采集的图像进行帧间特征点匹配,以计算平均深度;
当然,如果通过双目传感器的特征点匹配结果或者单目传感器的特征点匹配结果均无法准确计算深度,就给平均深度赋一个比较大的值,比如500m。
然后从机头朝向的各个方向中选取平均深度最小,特征点密度大于一定阈值的方向,将机头朝向该方向,并保持机头朝着该方向持续飞行。
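上述"从各候选朝向中选取平均深度最小、且特征点密度超过阈值的方向"的选择过程,可以用如下Python代码作一个示意。其中密度阈值0.003为示例假设,深度无法计算时按文中做法赋一个较大的兜底值(如500米):

```python
def pick_heading(observations, density_threshold=0.003, fallback_depth_m=500.0):
    """从各候选机头朝向中选出特征点密度达标且平均深度最小的朝向。

    observations: {朝向角度: (特征点平均密度, 平均深度或 None)}
                  平均深度为 None 表示双目/单目匹配均无法准确计算深度
    density_threshold / fallback_depth_m: 假设的阈值与兜底深度
    """
    best, best_depth = None, float("inf")
    for heading, (density, depth) in observations.items():
        if density <= density_threshold:
            continue  # 纹理不够丰富,跳过该朝向
        if depth is None:
            depth = fallback_depth_m  # 深度无法计算时赋一个比较大的值
        if depth < best_depth:
            best, best_depth = heading, depth
    return best

obs = {0: (0.001, 20.0),   # 纹理太少,排除
       45: (0.005, 35.0),
       90: (0.006, 18.0),  # 密度达标且最近 → 选中
       135: (0.004, None)}
h = pick_heading(obs)
```
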
当然,对于用户对无人机的控制操作,可以无需考虑机头朝向,依然按照最初的方向作为基准进行控制。
2、利用双目视觉传感器避障
当GPS信号良好,则倾向于控制双目视觉传感器朝向无人机当前的运动方向,以进行避障。
如图7所示,在不超过云台的旋转角度(最大旋转角度为±80°,由于还要预留15°裕量来平滑,因而一般为±65°),且视野不被遮挡的前提下(比如旋转60°会看到脚架或螺旋桨),将机身尽可能朝向当前的运动方向,对运动方向上的环境进行感知观测,同时保持相机云台朝向正前方。
可以获取无人机机头当前的朝向,获取用户控制无人机的运动方向,根据两个方向之间的差,结合双目视觉传感器的感知的范围以及云台限位角度调整机头朝向,同时保持云台朝向原方向。比如:
基于用户的控制操作确定的目标运动方向与当前机头方向的夹角为θ,若θ<(80°-15°)=65°,则按θ角度旋转机身,观测运动方向上的障碍物;若θ>65°,则只按照65°旋转机身,此时观测视角还有±35.5°的范围,仍可以观测到飞行方向上的障碍物。
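这一"目标角度超过云台限位裕量时按预设角度旋转"的钳位逻辑,可以用如下Python代码作一个示意(80°最大旋转角与15°裕量取自上文示例,函数名为假设):

```python
MAX_GIMBAL_DEG = 80.0            # 云台最大旋转角度(°)
SMOOTH_MARGIN_DEG = 15.0         # 预留的平滑裕量(°)
PRESET_DEG = MAX_GIMBAL_DEG - SMOOTH_MARGIN_DEG  # 预设角度:65°

def body_rotation(theta_deg):
    """目标运动方向与当前机头方向的夹角为 θ:θ 未超预设角度时旋转 θ,否则只旋转预设角度。"""
    if abs(theta_deg) < PRESET_DEG:
        return theta_deg
    return PRESET_DEG if theta_deg > 0 else -PRESET_DEG
```
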
相应的,本申请实施例还提供了一种无人机的控制装置,所述无人机包括感知传感器和相机,所述感知传感器固定连接于所述无人机的机身,所述相机基于云台可活动的连接于所述无人机的机身;所述无人机还包括动力机构,所述无人机的机身可在所述动力机构的作用下调整位姿;如图8所示,所述装置包括处理器81、存储器82、存储于所述存储器82上可供所述处理器81执行的计算机程序,所述处理器81执行所述计算机程序时,可实现以下步骤:
控制所述相机工作,将所述相机采集的影像发送至用于对所述无人机进行控制的遥控设备,所述影像用于在所述遥控设备上展示;
接收所述遥控设备的操作指令,并基于所述操作指令确定所述无人机的目标运动方向;
控制所述动力机构和所述云台工作,使所述相机朝向所述目标运动方向,并使所述感知传感器朝向与所述目标运动方向不同的其他方向;
基于所述感知传感器采集到的环境观测信息调整所述无人机的运动。
在一些实施例中,所述感知传感器的朝向是基于所述相机的观测范围和/或所述环境观测信息确定的。
在一些实施例中,所述感知传感器的观测范围至少部分的与所述相机的观测范围不同。
在一些实施例中,所述处理器还用于:
获取所述无人机所处环境中的目标区域,所述目标区域是基于所述环境观测信息确定的,所述目标区域相对于所述无人机的方向,与所述目标运动方向不同;
调整所述机身的位姿,并控制所述云台工作以调整所述相机的位姿,以使所述相机朝向所述目标运动方向,并使所述感知传感器朝向所述目标区域。
在一些实施例中,所述环境观测信息包括所述环境的纹理信息,所述目标区域是基于所述纹理信息确定的。
在一些实施例中,所述处理器用于基于所述感知传感器采集到的环境观测信息调整所述无人机的运动时,具体用于:
基于所述感知传感器采集到的环境观测信息调整所述无人机的运动以使所述无人机避免与所述环境中的障碍物发生碰撞,和/或,相对于所述环境中的目标物保持预设的相对运动状态。
在一些实施例中,所述处理器用于基于所述感知传感器采集到的环境观测信息调整所述无人机的运动时,具体用于:
基于所述感知传感器采集到的环境观测信息确定所述无人机自身的运动状态数据;
基于所述运动状态数据调整所述无人机的运动。
在一些实施例中,所述处理器用于基于所述感知传感器采集到的环境观测信息确定所述无人机自身的运动状态数据时,具体用于:
基于所述环境观测信息计算所述无人机相对于所述环境中的固定物体的位移变化数据;
基于所述位移变化数据确定所述运动状态数据。
在一些实施例中,所述处理器还用于:
获取所述无人机的位姿传感器采集到的运动监测数据;
所述运动状态数据是基于所述环境观测信息与所述运动监测数据融合得到的。
在一些实施例中,所述位姿传感器包括以下传感器中的一种或者多种:陀螺仪、加速度计、惯性测量单元、GPS和GNSS。
在一些实施例中,所述运动状态数据包括以下信息中的一种或者多种:加速度、速度、移动距离和转动角度。
在一些实施例中,所述处理器在控制所述动力机构和所述云台工作,使所述相机朝向所述目标运动方向,并使所述感知传感器朝向与所述目标运动方向不同的其他方向之前,还用于:
监测所述无人机的位姿传感器的精度信息;
若所述精度信息不满足预设精度条件,则获取所述无人机所处环境中纹理丰富的目标区域,控制所述感知传感器朝向所述目标区域。
在一些实施例中,所述感知传感器包括以下传感器中的一种或者多种:激光雷达、毫米波雷达、多目摄像头、单目摄像头和光谱摄像头。
此外,本申请实施例还提供了一种无人机的控制装置,所述无人机包括感知传感器和相机,所述感知传感器固定连接于所述无人机的机身,所述相机基于云台可活动的连接于所述无人机的所述机身;所述无人机还包括动力机构,所述无人机的机身可在所述动力机构的作用下调整位姿;如图8所示,所述装置包括处理器81、存储器82、存储于所述存储器82上可供所述处理器81执行的计算机程序,所述处理器81执行所述计算机程序时,可实现以下步骤:
控制所述相机工作,将所述相机采集的影像发送至用于对所述无人机进行控制的遥控设备,所述影像用于在所述遥控设备上展示;
接收所述遥控设备的操作指令,并基于所述操作指令确定所述无人机的目标方向;
控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器朝向与所述目标方向不同的其他方向;
基于所述感知传感器采集到的环境观测信息调整所述无人机的运动。
在一些实施例中,所述目标方向为响应于用户在所述遥控设备上的运动控制操作确定的目标运动方向。
在一些实施例中,所述目标方向为响应于用户在所述遥控设备上的朝向设置操作设置的固定方向。
在一些实施例中,所述固定方向基于所述遥控设备的指向设置。
在一些实施例中,所述目标方向为响应于用户在所述遥控设备上的朝向设置操作设置的固定方向;
所述处理器用于控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器朝向与所述目标方向不同的其他方向时,具体用于:
控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器的朝向与所述无人机的实时运动方向一致。
在一些实施例中,所述处理器还用于:
基于所述无人机的位姿传感器采集的数据的精度信息,确定所述感知传感器的朝向。
在一些实施例中,所述处理器用于基于所述无人机的位姿传感器采集的数据的精度信息,确定所述感知传感器的朝向时,具体用于:
在所述精度信息不满足预设精度条件的情况下,获取所述无人机所处环境中的纹理丰富的目标区域,控制所述感知传感器朝向所述目标区域。
此外,本申请实施例还提供一种无人机的控制装置,所述无人机包括动力机构和感知传感器,所述感知传感器固定连接于所述无人机的机身,所述无人机的机身可在所述动力机构的作用下调整位姿,如图8所示,所述装置包括处理器81、存储器82、存储于所述存储器82上可供所述处理器81执行的计算机程序,所述处理器81执行所述计算机程序时,可实现以下步骤:
确定所述感知传感器待执行任务的任务类型;
基于所述任务类型通过所述动力机构调整所述机身的位姿,以将所述感知传感器的朝向调整至目标朝向;其中,所述感知传感器位于所述目标朝向时执行所述待执行任务的准确度高于所述感知传感器位于其他朝向时执行所述待执行任务的准确度。
在一些实施例中,所述无人机还设有相机,所述相机通过云台可活动地连接于所述无人机,所述处理器还用于:
控制所述云台转动以保持所述相机的朝向为用户设置的朝向。
在一些实施例中,所述处理器用于确定所述感知传感器待执行任务的任务类型时,具体用于:
基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型;和/或
基于所述无人机所处的空间环境信息确定所述感知传感器待执行任务的任务类型。
在一些实施例中,所述处理器用于基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型时,具体用于:
当确定所述定位精度小于预设精度,则所述任务类型为定位任务。
在一些实施例中,所述任务类型为定位任务,所述目标朝向符合以下条件:
能够观测到目标对象,所述目标对象与所述无人机的距离小于第一预设阈值且所述目标对象的特征点密度大于第二预设阈值。
在一些实施例中,所述处理器用于基于所述任务类型将所述感知传感器的朝向调整至目标朝向时,具体用于:
在调整所述感知传感器的朝向的过程中,通过所述感知传感器对所述环境进行图像采集,得到多帧图像;
根据所述多帧图像中的特征点的密度以及所述特征点的深度信息将所述感知传感器的朝向调整至目标朝向。
在一些实施例中,所述处理器用于基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型时,具体用于:
当确定所述定位精度大于预设精度,则所述任务类型为避障任务。
在一些实施例中,所述任务类型为避障任务,在所述感知传感器处于所述目标朝向时,所述无人机当前的运动方向在所述感知传感器的视野范围内。
在一些实施例中,所述处理器用于基于所述任务类型将所述感知传感器的朝向调整至目标朝向时,具体用于:
根据所述无人机当前的运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向。
在一些实施例中,所述处理器用于根据所述运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向时,具体用于:
根据所述运动方向和所述感知传感器的当前朝向确定目标角度;
若所述目标角度小于预设角度,所述目标朝向为将所述感知传感器调整所述目标角度后的朝向,其中,所述预设角度基于所述最大旋转角度确定。
在一些实施例中,所述处理器用于根据所述运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向时,具体用于:
若所述目标角度大于预设角度,所述目标朝向为将所述感知传感器调整所述预设角度后的朝向。
在一些实施例中,所述感知传感器位于所述无人机的机头位置,所述处理器用于基于所述任务类型调整所述无人机的机身的朝向时,具体用于:
基于所述任务类型调整所述无人机机头的朝向。
在一些实施例中,所述定位信号为GPS信号,所述定位信号的定位精度基于以下一种或多种数据确定:所述定位信号的精度因子、发送所述定位信号的卫星的数量和所述无人机的惯性测量单元测量的数据。
此外,本申请还提供一种无人机,所述无人机包括感知传感器、相机、动力机构以及上述任一项实施例所述的控制装置,所述感知传感器固定连接于所述无人机的机身,所述相机基于云台可活动的连接于所述无人机的机身,所述无人机的机身可在所述动力机构的作用下调整位姿。
相应地,本说明书实施例还提供一种计算机存储介质,所述存储介质中存储有程序,所述程序被处理器执行时实现上述任一实施例中无人机的控制方法。
本说明书实施例可采用在一个或多个其中包含有程序代码的存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。计算机可用存储介质包括永久性和非永久性、可移动和非可移动媒体,可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括但不限于:相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带、磁带磁盘存储或其他磁性存储设备或任何其他非传输介质,可用于存储可以被计算设备访问的信息。
对于装置实施例而言,由于其基本对应于方法实施例,所以相关之处参见方法实施例的部分说明即可。以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。本领域普通技术人员在不付出创造性劳动的情况下,即可以理解并实施。
需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
以上对本发明实施例所提供的方法和装置进行了详细介绍,本文中应用了具体个例对本发明的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本发明的方法及其核心思想;同时,对于本领域的一般技术人员,依据本发明的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本发明的限制。

Claims (67)

  1. 一种无人机的控制方法,其特征在于,所述无人机包括感知传感器和相机,所述感知传感器固定连接于所述无人机的机身,所述相机基于云台可活动的连接于所述无人机的机身;所述无人机还包括动力机构,所述无人机的机身可在所述动力机构的作用下调整位姿;
    所述方法包括:
    控制所述相机工作,将所述相机采集的影像发送至用于对所述无人机进行控制的遥控设备,所述影像用于在所述遥控设备上展示;
    接收所述遥控设备的操作指令,并基于所述操作指令确定所述无人机的目标运动方向;
    控制所述动力机构和所述云台工作,使所述相机朝向所述目标运动方向,并使所述感知传感器朝向与所述目标运动方向不同的其他方向;
    基于所述感知传感器采集到的环境观测信息调整所述无人机的运动。
  2. 根据权利要求1所述的方法,其特征在于,所述感知传感器的朝向是基于所述相机的观测范围和/或所述环境观测信息确定的。
  3. 根据权利要求1所述的方法,其特征在于,所述感知传感器的观测范围至少部分地与所述相机的观测范围不同。
  4. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    获取所述无人机所处环境中的目标区域,所述目标区域是基于所述环境观测信息确定的,所述目标区域相对于所述无人机的方向,与所述目标运动方向不同;
    调整所述机身的位姿,并控制所述云台工作以调整所述相机的位姿,以使所述相机朝向所述目标运动方向,并使所述感知传感器朝向所述目标区域。
  5. 根据权利要求4所述的方法,其特征在于,所述环境观测信息包括所述环境的纹理信息,所述目标区域是基于所述纹理信息确定的。
  6. 根据权利要求1所述的方法,其特征在于,基于所述感知传感器采集到的环境观测信息调整所述无人机的运动,包括:
    基于所述感知传感器采集到的环境观测信息调整所述无人机的运动以使所述无人机避免与所述环境中的障碍物发生碰撞,和/或,相对于所述环境中的目标物保持预设的相对运动状态。
  7. 根据权利要求1所述的方法,其特征在于,基于所述感知传感器采集到的环境观测信息调整所述无人机的运动,包括:
    基于所述感知传感器采集到的环境观测信息确定所述无人机自身的运动状态数据;
    基于所述运动状态数据调整所述无人机的运动。
  8. 根据权利要求7所述的方法,其特征在于,基于所述感知传感器采集到的环境观测信息确定所述无人机自身的运动状态数据,包括:
    基于所述环境观测信息计算所述无人机相对于所述环境中的固定物体的位移变化数据;
    基于所述位移变化数据确定所述运动状态数据。
  9. 根据权利要求7所述的方法,其特征在于,所述方法还包括:
    获取所述无人机的位姿传感器采集到的运动监测数据;
    所述运动状态数据是基于所述环境观测信息与所述运动监测数据融合得到的。
  10. 根据权利要求9所述的方法,其特征在于,所述位姿传感器包括以下传感器中的一种或者多种:陀螺仪、加速度计、惯性测量单元、GPS和GNSS。
  11. 根据权利要求7所述的方法,其特征在于,所述运动状态数据包括以下信息中的一种或者多种:加速度、速度、移动距离和转动角度。
  12. 根据权利要求1所述的方法,其特征在于,所述控制所述动力机构和所述云台工作,使所述相机朝向所述目标运动方向,并使所述感知传感器朝向与所述目标运动方向不同的其他方向之前,还包括:
    监测所述无人机的位姿传感器采集的数据的精度信息;
    若所述精度信息不满足预设精度条件,则获取所述无人机所处环境中纹理丰富的目标区域,控制所述感知传感器朝向所述目标区域。
  13. 根据权利要求1-12任一项所述的方法,其特征在于,所述感知传感器包括以下传感器中的一种或者多种:激光雷达、毫米波雷达、多目摄像头、单目摄像头和光谱摄像头。
  14. 一种无人机的控制方法,其特征在于,所述无人机包括感知传感器和相机,所述感知传感器固定连接于所述无人机的机身,所述相机基于云台可活动的连接于所述无人机的所述机身;所述无人机还包括动力机构,所述无人机的机身可在所述动力机构的作用下调整位姿;
    所述方法包括:
    控制所述相机工作,将所述相机采集的影像发送至用于对所述无人机进行控制的遥控设备,所述影像用于在所述遥控设备上展示;
    接收所述遥控设备的操作指令,并基于所述操作指令确定所述无人机的目标方向;
    控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器朝向与所述目标方向不同的其他方向;
    基于所述感知传感器采集到的环境观测信息调整所述无人机的运动。
  15. 根据权利要求14所述的方法,其特征在于,所述目标方向为响应于用户在所述遥控设备上的运动控制操作确定的目标运动方向。
  16. 根据权利要求14所述的方法,其特征在于,所述目标方向为响应于用户在所述遥控设备上的朝向设置操作设置的固定方向。
  17. 根据权利要求16所述的方法,其特征在于,所述固定方向基于所述遥控设备的指向设置。
  18. 根据权利要求14所述的方法,其特征在于,所述目标方向为响应于用户在所述遥控设备上的朝向设置操作设置的固定方向;
    所述控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器朝向与所述目标方向不同的其他方向,包括:
    控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器的朝向与所述无人机的实时运动方向一致。
  19. 根据权利要求14所述的方法,其特征在于,所述方法还包括:
    基于所述无人机的位姿传感器采集的数据的精度信息,确定所述感知传感器的朝向。
  20. 根据权利要求19所述的方法,其特征在于,基于所述无人机的位姿传感器采集的数据的精度信息,确定所述感知传感器的朝向包括:
    在所述精度信息不满足预设精度条件的情况下,获取所述无人机所处环境中的纹理丰富的目标区域,控制所述感知传感器朝向所述目标区域。
  21. 一种无人机的控制方法,其特征在于,所述无人机包括动力机构和感知传感器,所述感知传感器固定连接于所述无人机的机身,所述无人机的机身可在所述动力机构的作用下调整位姿,所述方法包括:
    确定所述感知传感器待执行任务的任务类型;
    基于所述任务类型通过所述动力机构调整所述机身的位姿,以将所述感知传感器的朝向调整至目标朝向;其中,所述感知传感器位于所述目标朝向时执行所述待执行任务的准确度高于所述感知传感器位于其他朝向时执行所述待执行任务的准确度。
  22. 根据权利要求21所述的方法,其特征在于,所述无人机还设有相机,所述相机通过云台可活动地连接于所述无人机,所述方法还包括:
    控制所述云台转动以保持所述相机的朝向为用户设置的朝向。
  23. 根据权利要求21所述的方法,其特征在于,确定所述感知传感器待执行任务的任务类型,包括:
    基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型;和/或
    基于所述无人机所处的空间环境信息确定所述感知传感器待执行任务的任务类型。
  24. 根据权利要求23所述的方法,其特征在于,基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型,包括:
    当确定所述定位精度小于预设精度,则所述任务类型为定位任务。
  25. 根据权利要求24所述的方法,其特征在于,所述任务类型为定位任务,所述目标朝向符合以下条件:
    能够观测到目标对象,所述目标对象与所述无人机的距离小于第一预设阈值且所述目标对象的特征点密度大于第二预设阈值。
  26. 根据权利要求24或25所述的方法,其特征在于,基于所述任务类型将所述感知传感器的朝向调整至目标朝向,包括:
    在调整所述感知传感器的朝向的过程中,通过所述感知传感器对所述环境进行图像采集,得到多帧图像;
    根据所述多帧图像中的特征点的密度以及所述特征点的深度信息将所述感知传感器的朝向调整至目标朝向。
  27. 根据权利要求24-26任一项所述的方法,其特征在于,基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型,包括:
    当确定所述定位精度大于预设精度,则所述任务类型为避障任务。
  28. 根据权利要求27所述的方法,其特征在于,所述任务类型为避障任务,在所述感知传感器处于所述目标朝向时,所述无人机当前的运动方向在所述感知传感器的视野范围内。
  29. 根据权利要求27或28所述的方法,其特征在于,基于所述任务类型将所述感知传感器的朝向调整至目标朝向,包括:
    根据所述无人机当前的运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向。
  30. 根据权利要求29所述的方法,其特征在于,根据所述运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向,包括:
    根据所述运动方向和所述感知传感器的当前朝向确定目标角度;
    若所述目标角度小于预设角度,所述目标朝向为将所述感知传感器调整所述目标角度后的朝向,其中,所述预设角度基于所述最大旋转角度确定。
  31. 根据权利要求30所述的方法,其特征在于,根据所述运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向,包括:
    若所述目标角度大于预设角度,所述目标朝向为将所述感知传感器调整所述预设角度后的朝向。
  32. 根据权利要求22-31任一项所述的方法,其特征在于,所述感知传感器位于所述无人机的机头位置,基于所述任务类型调整所述无人机的机身的朝向,包括:
    基于所述任务类型调整所述无人机机头的朝向。
  33. 根据权利要求21-32任一项所述的方法,其特征在于,所述定位信号为GPS信号,所述定位信号的定位精度基于以下一种或多种数据确定:所述定位信号的精度因子、发送所述定位信号的卫星的数量和所述无人机的惯性测量单元测量的数据。
  34. 一种无人机的控制装置,其特征在于,所述无人机包括感知传感器和相机,所述感知传感器固定连接于所述无人机的机身,所述相机基于云台可活动的连接于所述无人机的机身;所述无人机还包括动力机构,所述无人机的机身可在所述动力机构的作用下调整位姿;
    所述装置包括处理器、存储器、存储于所述存储器可供所述处理器执行的计算机程序,所述处理器执行所述计算机程序时,可实现以下步骤:
    控制所述相机工作,将所述相机采集的影像发送至用于对所述无人机进行控制的遥控设备,所述影像用于在所述遥控设备上展示;
    接收所述遥控设备的操作指令,并基于所述操作指令确定所述无人机的目标运动方向;
    控制所述动力机构和所述云台工作,使所述相机朝向所述目标运动方向,并使所述感知传感器朝向与所述目标运动方向不同的其他方向;
    基于所述感知传感器采集到的环境观测信息调整所述无人机的运动。
  35. 根据权利要求34所述的装置,其特征在于,所述感知传感器的朝向是基于所述相机的观测范围和/或所述环境观测信息确定的。
  36. 根据权利要求34所述的装置,其特征在于,所述感知传感器的观测范围至少部分地与所述相机的观测范围不同。
  37. 根据权利要求34所述的装置,其特征在于,所述处理器还用于:
    获取所述无人机所处环境中的目标区域,所述目标区域是基于所述环境观测信息确定的,所述目标区域相对于所述无人机的方向,与所述目标运动方向不同;
    调整所述机身的位姿,并控制所述云台工作以调整所述相机的位姿,以使所述相机朝向所述目标运动方向,并使所述感知传感器朝向所述目标区域。
  38. 根据权利要求37所述的装置,其特征在于,所述环境观测信息包括所述环境的纹理信息,所述目标区域是基于所述纹理信息确定的。
  39. 根据权利要求34所述的装置,其特征在于,所述处理器用于基于所述感知传感器采集到的环境观测信息调整所述无人机的运动时,具体用于:
    基于所述感知传感器采集到的环境观测信息调整所述无人机的运动以使所述无人机避免与所述环境中的障碍物发生碰撞,和/或,相对于所述环境中的目标物保持预设的相对运动状态。
  40. 根据权利要求34所述的装置,其特征在于,所述处理器用于基于所述感知传感器采集到的环境观测信息调整所述无人机的运动时,具体用于:
    基于所述感知传感器采集到的环境观测信息确定所述无人机自身的运动状态数据;
    基于所述运动状态数据调整所述无人机的运动。
  41. 根据权利要求40所述的装置,其特征在于,所述处理器用于基于所述感知传感器采集到的环境观测信息确定所述无人机自身的运动状态数据时,具体用于:
    基于所述环境观测信息计算所述无人机相对于所述环境中的固定物体的位移变化数据;
    基于所述位移变化数据确定所述运动状态数据。
  42. 根据权利要求40所述的装置,其特征在于,所述处理器还用于:
    获取所述无人机的位姿传感器采集到的运动监测数据;
    所述运动状态数据是基于所述环境观测信息与所述运动监测数据融合得到的。
  43. 根据权利要求42所述的装置,其特征在于,所述位姿传感器包括以下传感器中的一种或者多种:陀螺仪、加速度计、惯性测量单元、GPS和GNSS。
  44. 根据权利要求40所述的装置,其特征在于,所述运动状态数据包括以下信息中的一种或者多种:加速度、速度、移动距离和转动角度。
  45. 根据权利要求34所述的装置,其特征在于,所述处理器在控制所述动力机构和所述云台工作,使所述相机朝向所述目标运动方向,并使所述感知传感器朝向与所述目标运动方向不同的其他方向之前,还用于:
    监测所述无人机的位姿传感器的精度信息;
    若所述精度信息不满足预设精度条件,则获取所述无人机所处环境中纹理丰富的目标区域,控制所述感知传感器朝向所述目标区域。
  46. 根据权利要求34-45任一项所述的装置,其特征在于,所述感知传感器包括以下传感器中的一种或者多种:激光雷达、毫米波雷达、多目摄像头、单目摄像头和光谱摄像头。
  47. 一种无人机的控制装置,其特征在于,所述无人机包括感知传感器和相机,所述感知传感器固定连接于所述无人机的机身,所述相机基于云台可活动的连接于所述无人机的所述机身;所述无人机还包括动力机构,所述无人机的机身可在所述动力机构的作用下调整位姿;
    所述装置包括处理器、存储器、存储于所述存储器可供所述处理器执行的计算机程序,所述处理器执行所述计算机程序时,可实现以下步骤:
    控制所述相机工作,将所述相机采集的影像发送至用于对所述无人机进行控制的遥控设备,所述影像用于在所述遥控设备上展示;
    接收所述遥控设备的操作指令,并基于所述操作指令确定所述无人机的目标方向;
    控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器朝向与所述目标方向不同的其他方向;
    基于所述感知传感器采集到的环境观测信息调整所述无人机的运动。
  48. 根据权利要求47所述的装置,其特征在于,所述目标方向为响应于用户在所述遥控设备上的运动控制操作确定的目标运动方向。
  49. 根据权利要求47所述的装置,其特征在于,所述目标方向为响应于用户在所述遥控设备上的朝向设置操作设置的固定方向。
  50. 根据权利要求49所述的装置,其特征在于,所述固定方向基于所述遥控设备的指向设置。
  51. 根据权利要求47所述的装置,其特征在于,所述目标方向为响应于用户在所述遥控设备上的朝向设置操作设置的固定方向;
    所述处理器用于控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器朝向与所述目标方向不同的其他方向时,具体用于:
    控制所述动力机构和所述云台工作,使所述相机朝向所述目标方向,并使所述感知传感器的朝向与所述无人机的实时运动方向一致。
  52. 根据权利要求47所述的装置,其特征在于,所述处理器还用于:
    基于所述无人机的位姿传感器采集的数据的精度信息,确定所述感知传感器的朝向。
  53. 根据权利要求52所述的装置,其特征在于,所述处理器用于基于所述无人机的位姿传感器采集的数据的精度信息,确定所述感知传感器的朝向时,具体用于:
    在所述精度信息不满足预设精度条件的情况下,获取所述无人机所处环境中的纹理丰富的目标区域,控制所述感知传感器朝向所述目标区域。
  54. 一种无人机的控制装置,其特征在于,所述无人机包括动力机构和感知传感器,所述感知传感器固定连接于所述无人机的机身,所述无人机的机身可在所述动力机构的作用下调整位姿,所述装置包括处理器、存储器、存储于所述存储器可供所述处理器执行的计算机程序,所述处理器执行所述计算机程序时,可实现以下步骤:
    确定所述感知传感器待执行任务的任务类型;
    基于所述任务类型通过所述动力机构调整所述机身的位姿,以将所述感知传感器的朝向调整至目标朝向;其中,所述感知传感器位于所述目标朝向时执行所述待执行任务的准确度高于所述感知感器位于其他朝向时执行所述待执行任务的准确度。
  55. 根据权利要求54所述的装置,其特征在于,所述无人机还设有相机,所述相机通过云台可活动地连接于所述无人机,所述处理器还用于:
    控制所述云台转动以保持所述相机的朝向为用户设置的朝向。
  56. 根据权利要求54所述的装置,其特征在于,所述处理器用于确定所述感知传感器待执行任务的任务类型时,具体用于:
    基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型;和/或
    基于所述无人机所处的空间环境信息确定所述感知传感器待执行任务的任务类型。
  57. 根据权利要求56所述的装置,其特征在于,所述处理器用于基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型时,具体用于:
    当确定所述定位精度小于预设精度,则所述任务类型为定位任务。
  58. 根据权利要求57所述的装置,其特征在于,所述任务类型为定位任务,所述目标朝向符合以下条件:
    能够观测到目标对象,所述目标对象与所述无人机的距离小于第一预设阈值且所述目标对象的特征点密度大于第二预设阈值。
  59. 根据权利要求57或58所述的装置,其特征在于,所述处理器用于基于所述任务类型将所述感知传感器的朝向调整至目标朝向时,具体用于:
    在调整所述感知传感器的朝向的过程中,通过所述感知传感器对所述环境进行图像采集,得到多帧图像;
    根据所述多帧图像中的特征点的密度以及所述特征点的深度信息将所述感知传感器的朝向调整至目标朝向。
  60. 根据权利要求57-59任一项所述的装置,其特征在于,所述处理器用于基于所述无人机接收到的定位信号的定位精度确定所述感知传感器待执行任务的任务类型时,具体用于:
    当确定所述定位精度大于预设精度,则所述任务类型为避障任务。
  61. 根据权利要求60所述的装置,其特征在于,所述任务类型为避障任务,在所述感知传感器处于所述目标朝向时,所述无人机当前的运动方向在所述感知传感器的视野范围内。
  62. 根据权利要求60或61所述的装置,其特征在于,所述处理器用于基于所述任务类型将所述感知传感器的朝向调整至目标朝向时,具体用于:
    根据所述无人机当前的运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向。
  63. 根据权利要求62所述的装置,其特征在于,所述处理器用于根据所述运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向时,具体用于:
    根据所述运动方向和所述感知传感器的当前朝向确定目标角度;
    若所述目标角度小于预设角度,所述目标朝向为将所述感知传感器调整所述目标角度后的朝向,其中,所述预设角度基于所述最大旋转角度确定。
  64. 根据权利要求63所述的装置,其特征在于,所述处理器用于根据所述运动方向以及所述云台的最大旋转角度将所述感知传感器的朝向调整至所述目标朝向时,具体用于:
    若所述目标角度大于预设角度,所述目标朝向为将所述感知传感器调整所述预设角度后的朝向。
  65. 根据权利要求54-64任一项所述的装置,其特征在于,所述感知传感器位于所述无人机的机头位置,所述处理器用于基于所述任务类型调整所述无人机的机身的朝向时,具体用于:
    基于所述任务类型调整所述无人机机头的朝向。
  66. 根据权利要求54-65任一项所述的装置,其特征在于,所述定位信号为GPS信号,所述定位信号的定位精度基于以下一种或多种数据确定:所述定位信号的精度因子、发送所述定位信号的卫星的数量和所述无人机的惯性测量单元测量的数据。
  67. 一种无人机,其特征在于,所述无人机包括感知传感器、相机、动力机构以及如权利要求34-66任一项所述的装置,所述感知传感器固定连接于所述无人机的机身,所述相机基于云台可活动的连接于所述无人机的机身,所述无人机的机身可在所述动力机构的作用下调整位姿。
PCT/CN2021/080837 2021-03-15 2021-03-15 无人机的控制方法、装置及无人机 WO2022193081A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/080837 WO2022193081A1 (zh) 2021-03-15 2021-03-15 无人机的控制方法、装置及无人机
CN202180084779.8A CN116745720A (zh) 2021-03-15 2021-03-15 无人机的控制方法、装置及无人机

Publications (1)

Publication Number Publication Date
WO2022193081A1 true WO2022193081A1 (zh) 2022-09-22



Also Published As

Publication number Publication date
CN116745720A (zh) 2023-09-12

