CN117693722A - Unmanned aerial vehicle control method, unmanned aerial vehicle control device, unmanned aerial vehicle and storage medium - Google Patents

Info

Publication number
CN117693722A
CN117693722A (application CN202180100503.4A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
obstacle
route
detour
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180100503.4A
Other languages
Chinese (zh)
Inventor
黄兴鸿
高翔
高文良
田原原
王璐
贾向华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN117693722A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10: Simultaneous control of position or course in three dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A control method for an unmanned aerial vehicle, comprising: when the unmanned aerial vehicle performs ground-imitating (terrain-following) flight, if sensing by both the distance sensor and the vision sensor is valid, identifying obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor and the vision sensor (S101); and controlling, according to the obstacle information, the unmanned aerial vehicle to fly around the obstacle in the route direction in a first detour mode, where the first detour mode satisfies the following condition: while the unmanned aerial vehicle flies around the obstacle, the difference between its height above the target work object and its ground-imitating flight height is less than or equal to a preset threshold (S102). The method improves the work efficiency and work quality of the unmanned aerial vehicle while ensuring its operational safety.

Description

Unmanned aerial vehicle control method, unmanned aerial vehicle control device, unmanned aerial vehicle and storage medium

Technical Field

The present application relates to the field of unmanned aerial vehicles, and in particular to a control method and control device for an unmanned aerial vehicle, an unmanned aerial vehicle, and a storage medium.
Background
With the rapid development of the unmanned aerial vehicle industry, more and more users are adopting unmanned aerial vehicles for agricultural work, in particular for pesticide spraying, fertilizer spreading, and seed sowing, which offer advantages such as little damage to crops, high pesticide utilization, and reduced labor intensity. At present, when an unmanned aerial vehicle detects an obstacle during operation, it avoids the obstacle; however, while avoiding the obstacle the unmanned aerial vehicle easily deviates from the work object, which degrades work efficiency and work quality and results in a poor user experience.
Disclosure of Invention
Based on this, the embodiments of the present application provide a control method and control device for an unmanned aerial vehicle, an unmanned aerial vehicle, and a storage medium, aiming to improve the work efficiency and work quality of the unmanned aerial vehicle while ensuring its operational safety.
In a first aspect, an embodiment of the present application provides a control method for an unmanned aerial vehicle that is capable of acquiring sensing data from a distance sensor and a vision sensor. The method includes:
when the unmanned aerial vehicle performs ground-imitating flight, if sensing by both the distance sensor and the vision sensor is valid, identifying obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor and the vision sensor;
controlling, according to the obstacle information, the unmanned aerial vehicle to fly around the obstacle in the route direction in a first detour mode;
wherein the first detour mode satisfies: while the unmanned aerial vehicle flies around the obstacle, the difference between its height above the target work object and its ground-imitating flight height is less than or equal to a preset threshold, the ground-imitating flight height being the constant height above the target work object that the unmanned aerial vehicle maintains during ground-imitating flight operation.
In a second aspect, an embodiment of the present application further provides a control device for an unmanned aerial vehicle that is capable of acquiring sensing data from a distance sensor and a vision sensor. The control device includes a memory and a processor;
the memory is used to store a computer program;
the processor is configured to execute the computer program and, when executing it, implement the control method of the unmanned aerial vehicle described above.
In a third aspect, an embodiment of the present application further provides an unmanned aerial vehicle, including:
a body;
and the power system is arranged on the machine body and is used for providing flight power for the unmanned aerial vehicle.
The distance sensor and the visual sensor are fixedly connected or detachably connected with the machine body;
the control device of the unmanned aerial vehicle is arranged in the machine body and used for controlling the unmanned aerial vehicle.
In a fourth aspect, embodiments of the present application further provide a storage medium storing a computer program, which when executed by a processor causes the processor to implement the method for controlling a drone as described above.
With the control method, the control device, the unmanned aerial vehicle, and the storage medium described above, when the unmanned aerial vehicle performs ground-imitating flight and sensing by both the distance sensor and the vision sensor is valid, the two sensors are used to accurately identify obstacle information in the route direction of the unmanned aerial vehicle, and the unmanned aerial vehicle is then controlled, according to that obstacle information, to fly around the obstacle in the route direction. The unmanned aerial vehicle can thus detour accurately while keeping the difference between its height above the target work object and its ground-imitating flight height less than or equal to a preset threshold. This prevents the unmanned aerial vehicle from deviating from the work object while avoiding the obstacle, which would degrade work quality and efficiency, and thereby improves both the work efficiency and the work quality of the unmanned aerial vehicle.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; a person of ordinary skill in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic view of a scenario for implementing a control method of an unmanned aerial vehicle according to an embodiment of the present application;
Fig. 2 is a schematic flow chart of the steps of a control method of an unmanned aerial vehicle according to an embodiment of the present application;
Fig. 3 is a schematic view of a three-dimensional coordinate system of the unmanned aerial vehicle according to an embodiment of the present application;
Fig. 4 is a schematic flow chart of the steps of another control method of the unmanned aerial vehicle according to an embodiment of the present application;
Fig. 5 is a schematic flow chart of a sub-step of the control method of the unmanned aerial vehicle of Fig. 3;
Fig. 6 is a schematic flow chart of another sub-step of the control method of the unmanned aerial vehicle of Fig. 3;
Fig. 7 is a schematic block diagram of a control device of an unmanned aerial vehicle according to an embodiment of the present application;
Fig. 8 is a schematic block diagram of the structure of an unmanned aerial vehicle according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The flow diagrams depicted in the figures are merely illustrative and not necessarily all of the elements and operations/steps are included or performed in the order described. For example, some operations/steps may be further divided, combined, or partially combined, so that the order of actual execution may be changed according to actual situations.
With the rapid development of the unmanned aerial vehicle industry, more and more users are adopting unmanned aerial vehicles for agricultural work, in particular for pesticide spraying, fertilizer spreading, and seed sowing, which offer advantages such as little damage to crops, high pesticide utilization, and reduced labor intensity. At present, when an unmanned aerial vehicle detects an obstacle during operation, it avoids the obstacle; however, while avoiding the obstacle the unmanned aerial vehicle easily deviates from the work object, which degrades work efficiency and work quality and results in a poor user experience.
To solve the above problems, the embodiments of the present application provide a control method and control device for an unmanned aerial vehicle, an unmanned aerial vehicle, and a storage medium. When the unmanned aerial vehicle performs ground-imitating flight and sensing by both the distance sensor and the vision sensor is valid, the two sensors are used to accurately identify obstacle information in the route direction of the unmanned aerial vehicle, and the unmanned aerial vehicle is then controlled, according to that obstacle information, to fly around the obstacle in the route direction in a first detour mode. The unmanned aerial vehicle can thus detour accurately while keeping the difference between its height above the target work object and its ground-imitating flight height less than or equal to a preset threshold. This prevents the unmanned aerial vehicle from deviating from the work object while avoiding the obstacle, which would degrade work quality and efficiency, and thereby improves both the work efficiency and the work quality of the unmanned aerial vehicle.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
Referring to Fig. 1, Fig. 1 is a schematic view of a scenario for implementing a control method of an unmanned aerial vehicle according to an embodiment of the present application. As shown in Fig. 1, the scenario includes a drone 100 and a control terminal 200; the control terminal 200 is communicatively connected to the drone 100 and is used to control the drone 100.
In an embodiment, the unmanned aerial vehicle 100 includes a body 110, a power system 120, a distance sensor 130, a vision sensor 140, and a control device (not shown in Fig. 1). The power system 120 is disposed on the body 110 and is used to provide flight power for the unmanned aerial vehicle 100; the distance sensor 130 and the vision sensor 140 are fixedly or detachably connected to the body 110 and are used to sense the environment around the unmanned aerial vehicle 100 to generate sensing data; the control device is used to control the flight of the unmanned aerial vehicle 100. The distance sensor 130 includes a radar device, which may be a millimeter-wave radar or a lidar.
In an embodiment, the unmanned aerial vehicle 100 may not be equipped with the distance sensor 130 and the vision sensor 140, but may be in communication connection with a device equipped with the distance sensor and the vision sensor, so that the unmanned aerial vehicle 100 can acquire sensing data of the distance sensor and the vision sensor. The device on which the distance sensor and the vision sensor are mounted can recognize obstacle information in the course direction of the unmanned aerial vehicle 100 by using the distance sensor and/or the vision sensor.
The power system 120 may include one or more propellers 121, one or more motors 122 corresponding to the one or more propellers, and one or more electronic speed controllers (ESCs). The motor 122 is connected between the ESC and the propeller 121, and the motor 122 and the propeller 121 are disposed on the body 110 of the unmanned aerial vehicle 100; the ESC receives a driving signal generated by the control device and provides a driving current to the motor 122 according to the driving signal, so as to control the rotational speed of the motor 122. The motor 122 drives the propeller 121 to rotate, powering the flight of the drone 100 and enabling one or more degrees of freedom of movement. In certain embodiments, the drone 100 may rotate about one or more rotation axes, for example a yaw axis and a pitch axis. It should be appreciated that the motor 122 may be a DC motor or an AC motor, and may be a brushless motor or a brushed motor.
In an embodiment, when the control device controls the unmanned aerial vehicle 100 to perform ground-imitating flight: if sensing by both the distance sensor 130 and the vision sensor 140 is valid, the distance sensor 130 and the vision sensor 140 are used to identify obstacle information in the route direction of the unmanned aerial vehicle 100, and according to the obstacle information the unmanned aerial vehicle 100 is controlled to fly around the obstacle in the route direction in a first detour mode; if sensing by the distance sensor 130 is valid but sensing by the vision sensor 140 is invalid, the distance sensor 130 alone is used to identify the obstacle information, and according to the obstacle information the unmanned aerial vehicle 100 is controlled to fly around the obstacle in the route direction in a second detour mode.
The first detour mode satisfies: while the unmanned aerial vehicle 100 flies around an obstacle, the difference between its height above the target work object and its ground-imitating flight height is less than or equal to a preset threshold, where the ground-imitating flight height is the constant height above the target work object maintained during ground-imitating flight operation. The second detour mode satisfies: while the unmanned aerial vehicle 100 flies around the obstacle, its height above the target work object equals its ground-imitating flight height.
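As an illustration, the mode-selection logic described above can be sketched as follows. This is a minimal sketch under the assumption of a single validity flag per sensor; the names `SensorState`, `choose_detour_mode`, and `height_constraint_ok` are illustrative and do not come from the patent, and the 2-meter threshold is only the example value mentioned in the text.

```python
from dataclasses import dataclass


@dataclass
class SensorState:
    distance_valid: bool  # e.g. radar powered on and reporting
    vision_valid: bool    # e.g. enough light, clean lens


def choose_detour_mode(s: SensorState) -> str:
    """Return which detour mode the controller would apply."""
    if s.distance_valid and s.vision_valid:
        # Both sensors valid: fuse them and detour in 3D (first detour mode),
        # keeping |height_above_object - follow_height| <= threshold.
        return "first_detour_mode_3d"
    if s.distance_valid and not s.vision_valid:
        # Distance sensor only: detour horizontally at the unchanged
        # ground-imitating flight height (second detour mode).
        return "second_detour_mode_2d"
    return "no_autonomous_detour"


def height_constraint_ok(height_above_object: float,
                         follow_height: float,
                         threshold: float = 2.0) -> bool:
    """First-detour-mode height constraint (2 m is the example threshold)."""
    return abs(height_above_object - follow_height) <= threshold
```

In this sketch the lidar special case discussed later (a lidar-only configuration that still allows a 3D detour) is deliberately omitted to keep the decision rule minimal.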
The control terminal 200 is communicatively connected to a display device 210, and the display device 210 is configured to display an image sent by the unmanned aerial vehicle 100. The display device 210 includes a display screen provided on the control terminal 200 or a display independent of the control terminal 200, and the display independent of the control terminal 200 may include a mobile phone, a tablet computer, a personal computer, or other electronic devices with a display screen. The display screen comprises an LED display screen, an OLED display screen, an LCD display screen and the like.
The unmanned aerial vehicle 100 includes rotary-wing unmanned aerial vehicles, for example a twin-rotor, quad-rotor, six-rotor, or eight-rotor unmanned aerial vehicle; it may also be a fixed-wing unmanned aerial vehicle, or a hybrid of rotary-wing and fixed-wing, which is not specifically limited here. The control terminal 200 may include, but is not limited to: a smart phone/mobile phone, a tablet computer, a personal digital assistant (PDA), a desktop computer, a media content player, a video game station/system, a virtual reality system, an augmented reality system, or a wearable device (e.g., a watch, glasses, gloves, or headwear such as a hat, a helmet, a virtual reality headset, an augmented reality headset, a head-mounted device (HMD), or a headband).
The following describes in detail the control method of the unmanned aerial vehicle provided in the embodiments of the present application with reference to the scenario in Fig. 1. It should be noted that the scenario in Fig. 1 is only used to explain the control method and does not limit its application scenarios.
Referring to Fig. 2, Fig. 2 is a schematic flow chart of the steps of a control method of an unmanned aerial vehicle according to an embodiment of the present application.
As shown in fig. 2, the control method of the unmanned aerial vehicle includes steps S101 to S102.
Step S101: when the unmanned aerial vehicle performs ground-imitating flight, if sensing by both the distance sensor and the vision sensor is valid, identify obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor and the vision sensor.
Step S102: according to the obstacle information, control the unmanned aerial vehicle to fly around the obstacle in the route direction in a first detour mode.
The unmanned aerial vehicle can acquire sensing data from the distance sensor and the vision sensor: the sensors may be carried on the unmanned aerial vehicle, detachably connected to it, or mounted on a device that is communicatively connected to the unmanned aerial vehicle. When sensing by both the distance sensor and the vision sensor is valid, using both sensors makes it possible to accurately identify obstacle information in the route direction of the unmanned aerial vehicle, improving the recognition accuracy of the obstacle information so that the unmanned aerial vehicle can accurately detour around obstacles on the route during ground-imitating flight, ensuring its operational safety.
The route direction of the unmanned aerial vehicle refers to its flight direction when performing ground-imitating flight along the planned ground-imitating route. The first detour mode satisfies: while the unmanned aerial vehicle flies around the obstacle, the difference between its height above the target work object and its ground-imitating flight height is less than or equal to a preset threshold, where the ground-imitating flight height is the constant height above the target work object maintained during ground-imitating flight operation; the target work object may include crops, fruit trees, buildings, and the like. The preset threshold may be set according to the actual situation and is not specifically limited in this embodiment; for example, the preset threshold is 2 meters.
In one embodiment, the first detour mode satisfies: the unmanned aerial vehicle flies around the obstacle in the route direction in the horizontal direction and/or the vertical direction, that is, in a three-dimensional detour mode. When detouring horizontally, the unmanned aerial vehicle flies horizontally along a first, second, third, or fourth direction (forward, backward, left, or right); when detouring vertically, it flies vertically along a fifth or sixth direction (for example, vertically upward or vertically downward). Because the unmanned aerial vehicle can detour around obstacles in the route direction in three dimensions, it can follow an optimal detour path, reducing its energy consumption.
For example, as shown in Fig. 3, a three-dimensional coordinate system of the unmanned aerial vehicle is established with the center of the unmanned aerial vehicle as the origin O, the nose direction as the X axis, the vertical direction as the Z axis, and the horizontal lateral direction as the Y axis. When the unmanned aerial vehicle detours around an obstacle in the route direction horizontally, it flies along the first direction (positive X), the second direction (negative X), the third direction (positive Y), or the fourth direction (negative Y) in the XOY plane; when it detours vertically, it flies along the fifth direction (positive Z) or the sixth direction (negative Z) in the YOZ plane.
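To make the direction naming concrete, the body frame of Fig. 3 can be encoded as unit vectors. The mapping below follows the text, but the names `DETOUR_DIRECTIONS` and `is_horizontal` are illustrative assumptions, not identifiers from the patent.

```python
# Body frame from Fig. 3: origin O at the UAV center, X along the nose,
# Y horizontal-lateral, Z vertical. Each detour direction is a unit vector.
DETOUR_DIRECTIONS: dict[str, tuple[float, float, float]] = {
    "first":  ( 1.0,  0.0,  0.0),  # +X, forward
    "second": (-1.0,  0.0,  0.0),  # -X, backward
    "third":  ( 0.0,  1.0,  0.0),  # +Y, one lateral direction
    "fourth": ( 0.0, -1.0,  0.0),  # -Y, the opposite lateral direction
    "fifth":  ( 0.0,  0.0,  1.0),  # +Z, vertically upward
    "sixth":  ( 0.0,  0.0, -1.0),  # -Z, vertically downward
}


def is_horizontal(name: str) -> bool:
    """Horizontal detour legs stay in the XOY plane (zero Z component)."""
    return DETOUR_DIRECTIONS[name][2] == 0.0
```

A 3D (first-mode) detour may combine any of the six directions, while a 2D (second-mode) detour is restricted to the four directions for which `is_horizontal` returns True.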
In an embodiment, as shown in fig. 4, the control method of the unmanned aerial vehicle includes steps S201 to S202.
Step S201: when the unmanned aerial vehicle performs ground-imitating flight, if sensing by the vision sensor is invalid and sensing by the distance sensor is valid, identify obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor.
Step S202: according to the obstacle information, control the unmanned aerial vehicle to fly around the obstacle in the route direction in a second detour mode.
The second detour mode satisfies: while the unmanned aerial vehicle flies around the obstacle in the route direction, its height above the target work object equals its ground-imitating flight height. When sensing by the vision sensor is invalid but sensing by the distance sensor is valid, the obstacle information identified by the distance sensor is used to control the unmanned aerial vehicle to fly around the obstacle in the second detour mode, so that its height above the target work object remains equal to the ground-imitating flight height during the detour. This prevents the unmanned aerial vehicle from deviating from the work object while avoiding the obstacle, which would degrade work quality and efficiency, and thereby improves its work efficiency and work quality.
In one embodiment, the second detour mode satisfies: the unmanned aerial vehicle flies around the obstacle in the horizontal direction only, that is, in a two-dimensional detour mode. Here the distance sensor includes a millimeter-wave radar. When sensing by the vision sensor is invalid and sensing by the distance sensor is valid, the millimeter-wave radar, limited by its antenna radio-frequency scheme and by object reflection intensity, cannot sense the specific height of the obstacle. Controlling the unmanned aerial vehicle to detour only in the horizontal direction therefore ensures flight safety, avoiding the collision that could occur if the unmanned aerial vehicle detoured vertically around an obstacle whose height the radar cannot sense.
In an embodiment, when sensing by the vision sensor is invalid, sensing by the distance sensor is valid, and the distance sensor is a lidar, the lidar is used to identify obstacle information in the route direction of the unmanned aerial vehicle, and the unmanned aerial vehicle is controlled, according to the obstacle information, to fly around the obstacle in the first detour mode. Because a lidar can sense the specific height of an obstacle and the relative position between the obstacle and the unmanned aerial vehicle, a three-dimensional detour both ensures flight safety and allows the unmanned aerial vehicle to follow an optimal detour path, reducing energy consumption.
In an embodiment, the unmanned aerial vehicle is controlled to continue working while it flies around an obstacle in the route direction. The work performed by the unmanned aerial vehicle includes spraying, sowing, surveying and mapping, power-line inspection, and the like. Controlling the unmanned aerial vehicle to work while it detours around the obstacle improves its work efficiency.
In an embodiment, while the unmanned aerial vehicle flies around an obstacle in the route direction, the distance between the unmanned aerial vehicle and the target work object is obtained; when this distance is less than or equal to a preset distance, the unmanned aerial vehicle is controlled to work, and when it is greater than the preset distance, the unmanned aerial vehicle is controlled to pause the work. Gating the work on the distance to the target work object during the detour both guarantees the work quality and reduces the consumption of pesticide, fertilizer, or seed.
The preset distance may be set according to the actual situation and is not specifically limited in this embodiment; for example, the preset distance is 1 meter. For instance, if the unmanned aerial vehicle is spraying pesticide on crops while detouring around an obstacle on the route, spraying is paused when the height between the unmanned aerial vehicle and the crop canopy is greater than 1 meter and carried out when it is within 1 meter. In this way the spraying effect for the crops is guaranteed and pesticide consumption is reduced.
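The distance-gated work rule above reduces to a single comparison. The sketch below assumes the 1-meter example value; `should_spray` is an illustrative name, not from the patent.

```python
PRESET_DISTANCE_M = 1.0  # example value given in the text


def should_spray(height_above_canopy_m: float,
                 preset_distance_m: float = PRESET_DISTANCE_M) -> bool:
    """True: keep spraying; False: pause spraying during the detour.

    Spraying continues only while the UAV is within the preset distance
    of the crop canopy, so pesticide is not wasted on out-of-range passes.
    """
    return height_above_canopy_m <= preset_distance_m
```

A detour controller would call this on every height update and toggle the sprayer accordingly.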
In an embodiment, the unmanned aerial vehicle is instead controlled to pause the work while it flies around the obstacle in the route direction, and the work is restarted once the unmanned aerial vehicle has passed the obstacle and resumed ground-imitating flight. Pausing the work during the detour and restarting it afterwards reduces the influence of the obstacle on the work result and improves the work quality of the unmanned aerial vehicle.
In one embodiment, sensing by the distance sensor is determined to be valid when the distance sensor is in an on state, and invalid when it is in an off state.
In an embodiment, the ambient light intensity, the image brightness difference between the plurality of sensors of the vision sensor, and/or the lens smudge level of the vision sensor are obtained. Sensing by the vision sensor is determined to be invalid when the ambient light intensity is less than a preset light intensity, the image brightness difference is greater than a preset brightness difference, and/or the lens smudge level is greater than a preset smudge level; it is determined to be valid when the ambient light intensity is greater than or equal to the preset light intensity, the image brightness difference is less than or equal to the preset brightness difference, and/or the lens smudge level is less than or equal to the preset smudge level.
Alternatively, only one of the above three conditions may be checked, or any two or all three of them. The preset light intensity, preset brightness difference, and preset smudge level may be set according to the actual situation and are not specifically limited in this embodiment. Using the ambient light intensity, the image brightness difference between the sensors of the vision sensor, and/or the lens smudge level, it is possible to accurately evaluate whether sensing by the vision sensor is valid. For example, the vision sensor is a binocular vision sensor including a left vision sensor and a right vision sensor, and the image brightness difference refers to the brightness difference between the images acquired by the left and right vision sensors.
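The three-condition validity test above can be sketched as a threshold check. The function name and parameters below are illustrative assumptions; real thresholds would be tuned to the platform, and a deployment might check only a subset of the conditions, as the text notes.

```python
def vision_sensing_valid(ambient_light: float,
                         brightness_diff: float,
                         smudge_level: float,
                         min_light: float,
                         max_brightness_diff: float,
                         max_smudge: float) -> bool:
    """Evaluate whether sensing by the vision sensor should be trusted.

    Sensing is invalid if the scene is too dark, the two stereo images
    disagree too much in brightness, or the lens is too dirty.
    """
    if ambient_light < min_light:
        return False
    if brightness_diff > max_brightness_diff:
        return False
    if smudge_level > max_smudge:
        return False
    return True
```

When this returns False and the distance sensor is still valid, the controller would fall back to the second (two-dimensional) detour mode.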
In an embodiment, the ambient light intensity collected by a photosensor onboard the unmanned aerial vehicle is obtained. When the ambient light intensity is smaller than the preset light intensity, the sensing of the vision sensor is invalid; the distance sensor is then used to identify the obstacle information in the route direction of the unmanned aerial vehicle, and according to the obstacle information, the unmanned aerial vehicle is controlled to bypass the obstacle in the route direction in a two-dimensional detour mode. By controlling the unmanned aerial vehicle to detour in two dimensions when the ambient light is too dark, the flight safety of the unmanned aerial vehicle can be ensured.
In an embodiment, the vision sensor includes at least a first sensor and a second sensor; the first sensor may be a first vision module, and the second sensor may be a second vision module. The image brightness difference between the plurality of sensors in the vision sensor may be obtained by: acquiring a first image acquired by the first sensor and a second image acquired by the second sensor; determining a brightness difference between the first image and the second image; and determining the brightness difference as the image brightness difference.
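A minimal sketch of this computation, assuming each image is represented as a 2-D list of grayscale pixel values and the brightness of an image is its mean pixel value (the embodiment does not fix a particular brightness measure):

```python
def mean_brightness(image):
    # image: 2-D list of grayscale pixel values in [0, 255]
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def image_brightness_difference(first_image, second_image):
    # Absolute difference between the mean brightness of the two images,
    # e.g. the left and right images of a binocular vision sensor.
    return abs(mean_brightness(first_image) - mean_brightness(second_image))
```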
The image brightness difference between the plurality of sensors in the vision sensor describes the brightness consistency of the images acquired by those sensors. When the image brightness difference is larger than the preset brightness difference, the brightness consistency is low, and it can be determined that the sensing of the vision sensor is invalid. In that case, the distance sensor is used to identify the obstacle information in the route direction of the unmanned aerial vehicle, and according to the obstacle information, the unmanned aerial vehicle is controlled to bypass the obstacle in the route direction in a two-dimensional detour mode; by controlling the unmanned aerial vehicle to detour in two dimensions when the image brightness is inconsistent, the flight safety of the unmanned aerial vehicle can be ensured.
In an embodiment, the lens smudge degree of the vision sensor may be obtained by: determining a light attenuation map and/or a light refraction map of an image acquired by the vision sensor; and determining the lens smudge degree according to the light attenuation map and/or the light refraction map. The inventor finds that when the lens is smudged, light is attenuated as it passes through the lens into the vision sensor, and the refractive index of the lens also changes; therefore, by determining the light attenuation map and/or the light refraction map of the image acquired by the vision sensor, the lens smudge degree of the vision sensor can be accurately determined.
In an embodiment, the lens dirt degree may be determined according to the light attenuation map and/or the light refraction map as follows: the light attenuation degree of the lens of the vision sensor is determined according to the light attenuation map, and the lens dirt degree is determined according to the light attenuation degree; or the light refractive index of the lens of the vision sensor is determined according to the light refraction map, and the lens dirt degree is determined according to the light refractive index of the lens of the vision sensor.
Illustratively, a pre-stored conversion relation between the light attenuation degree and the lens smudge degree is obtained, and the lens smudge degree of the vision sensor is determined according to the conversion relation and the light attenuation degree of the lens of the vision sensor. The conversion relation between the light attenuation degree and the lens smudge degree can be obtained through multiple tests.
Illustratively, a target light refractive index is obtained, where the target light refractive index is the light refractive index when the lens of the vision sensor is in a clean state; a refractive index difference between the target light refractive index and the light refractive index determined based on the light refraction map is determined; and the lens smudge degree of the vision sensor is determined according to a pre-stored conversion relation between the refractive index difference and the lens smudge degree, and the determined refractive index difference. The conversion relation between the refractive index difference and the lens smudge degree is obtained through multiple tests.
In one embodiment, the light attenuation degree of the lens of the vision sensor is determined according to the light attenuation map, the light refractive index of the lens is determined according to the light refraction map, and the lens smudge degree is determined according to the light attenuation degree and the light refractive index. Illustratively, a first smudge degree is determined from the light attenuation degree and a second smudge degree is determined from the light refractive index; the first smudge degree and the second smudge degree are then weighted and summed to obtain the lens smudge degree of the vision sensor.
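The weighted combination can be sketched as follows; the conversion relations (identity for the attenuation degree, a factor of 2.0 applied to the refractive-index difference), the clean-lens refractive index and the weights are all illustrative stand-ins for the pre-stored, test-derived relations described above.

```python
def lens_smudge_degree(attenuation_degree, refractive_index,
                       clean_refractive_index=1.5,
                       w_attenuation=0.6, w_refraction=0.4):
    """Weighted sum of the two smudge estimates (assumed relations/weights)."""
    first_smudge = attenuation_degree                                      # from light attenuation map
    second_smudge = 2.0 * abs(refractive_index - clean_refractive_index)   # from light refraction map
    return w_attenuation * first_smudge + w_refraction * second_smudge
```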
In an embodiment, as shown in fig. 5, step S102 may include: substeps S1021 to S1022.
Step S1021, planning a first detour route of the unmanned aerial vehicle according to the obstacle information and the first detour mode;
and step S1022, controlling the unmanned aerial vehicle to fly around the obstacle in the direction of the route according to the first detour mode according to the first detour route.
The first detour mode needs to satisfy the condition that the difference between the height of the unmanned aerial vehicle from its target work object when flying around the obstacle and the ground-imitating flying height of the unmanned aerial vehicle is smaller than or equal to a preset threshold, and/or that the unmanned aerial vehicle flies around the obstacle in the horizontal direction and/or the vertical direction. Therefore, the first detour route planned according to the obstacle information and the first detour mode also satisfies the above condition, and the first detour route may be a three-dimensional detour route or a two-dimensional detour route.
In an embodiment, if the first detour route is a three-dimensional detour route, when the unmanned aerial vehicle is controlled to fly around the obstacle in the first detour mode, the unmanned aerial vehicle is controlled to fly around the obstacle in the horizontal direction at a first ground-imitating detour waypoint on the first detour route, and to fly around the obstacle in the vertical direction at a second ground-imitating detour waypoint on the first detour route, so as to realize a three-dimensional detour. If the first detour route is a two-dimensional detour route, when the unmanned aerial vehicle is controlled to fly around the obstacle in the first detour mode, the unmanned aerial vehicle is controlled to fly around the obstacle in the horizontal direction at all the ground-imitating detour waypoints of the first detour route, so as to realize a two-dimensional detour.
In an embodiment, a ground-imitating detour waypoint of the unmanned aerial vehicle is determined from a three-dimensional dense map of the environment in which the unmanned aerial vehicle is located, according to the obstacle information and the first detour mode; and the first detour route of the unmanned aerial vehicle is planned according to the ground-imitating detour waypoint. The three-dimensional dense map is established based on the first sensing data acquired by the distance sensor and the second sensing data acquired by the vision sensor, and therefore has high accuracy. From this highly accurate three-dimensional dense map, an accurate and optimal ground-imitating detour waypoint satisfying the constraint condition can be determined according to the obstacle information and the first detour mode, and the unmanned aerial vehicle can bypass the obstacle when flying through the ground-imitating detour waypoint.
In an embodiment, determining the ground-imitating detour waypoints of the unmanned aerial vehicle from the three-dimensional dense map of the environment in which the unmanned aerial vehicle is located may include: determining the ground-imitating detour waypoints of the unmanned aerial vehicle from the three-dimensional dense map in the horizontal direction and/or the vertical direction of the unmanned aerial vehicle, that is, determining the ground-imitating detour waypoints from the three-dimensional dense map in a three-dimensional planning mode. Determining the ground-imitating detour waypoints in a three-dimensional planning mode makes the first detour route formed by the determined waypoints optimal, which can reduce the energy loss of obstacle avoidance.
In an embodiment, the difference between the height of some or all of the ground-imitating detour waypoints from the target work object and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold. The ground-imitating detour waypoints include a first ground-imitating detour waypoint in the horizontal direction of the unmanned aerial vehicle and/or a second ground-imitating detour waypoint in the vertical direction of the unmanned aerial vehicle. The difference between the height of all the first ground-imitating detour waypoints from the target work object and the ground-imitating flying height is less than or equal to the preset threshold, and the difference between the height of some or all of the second ground-imitating detour waypoints from the target work object and the ground-imitating flying height is less than or equal to the preset threshold.
It can be appreciated that when the ground-imitating detour waypoints of the unmanned aerial vehicle are determined from the three-dimensional dense map according to the obstacle information and the first detour mode, the waypoints are generally required to satisfy the constraint condition that the difference between the height of the waypoint from the target work object and the ground-imitating flying height is less than or equal to the preset threshold. In some cases, however, the waypoints may not be required to meet this constraint. For example, when the obstacle cannot be bypassed horizontally but only vertically, and the obstacle is tall, the difference between the height of a detour waypoint from the target work object and the ground-imitating flying height may need to exceed the preset threshold in order to bypass the obstacle. Therefore, the difference between the height of a determined ground-imitating detour waypoint from the target work object and the ground-imitating flying height may be greater than the preset threshold.
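The height constraint, together with its relaxation for a vertical-only detour around a tall obstacle, can be sketched as follows; the function names and the threshold value are illustrative assumptions.

```python
def waypoint_meets_height_constraint(height_above_target, ground_following_height,
                                     preset_threshold=0.5):
    """Constraint on a ground-imitating detour waypoint: the difference between
    its height above the target work object and the ground-imitating flying
    height must not exceed the preset threshold (value assumed, in metres)."""
    return abs(height_above_target - ground_following_height) <= preset_threshold

def accept_waypoint(height_above_target, ground_following_height,
                    vertical_detour_only=False, preset_threshold=0.5):
    # When only a vertical detour can clear a tall obstacle, the height
    # constraint is relaxed so the drone can still bypass the obstacle.
    if vertical_detour_only:
        return True
    return waypoint_meets_height_constraint(height_above_target,
                                            ground_following_height,
                                            preset_threshold)
```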
In an embodiment, a target route segment corresponding to the obstacle in the route direction is obtained from the ground-imitating flight route of the unmanned aerial vehicle, and the target route segment is adjusted according to the ground-imitating detour waypoints to obtain the first detour route of the unmanned aerial vehicle. The ground-imitating flight route is a flight route planned in advance or in real time. By adjusting the target route segment corresponding to the obstacle with the determined ground-imitating detour waypoints, the deviation between the first detour route and the target route segment is kept small; when the unmanned aerial vehicle is then controlled to bypass the obstacle along the first detour route, it does not seriously deviate from the ground-imitating flight route, so that flight safety can be guaranteed while the operation effect and operation efficiency are improved.
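The adjustment of the target route segment can be sketched as a splice of waypoint lists; representing the route as a list indexed by waypoint, and the segment by its start and end indices, is an assumed representation, not the embodiment's own data structure.

```python
def build_first_detour_route(ground_following_route, segment_start, segment_end,
                             detour_waypoints):
    """Replace the target route segment (waypoint indices segment_start through
    segment_end, inclusive) that crosses the obstacle with the ground-imitating
    detour waypoints, yielding the first detour route."""
    return (ground_following_route[:segment_start]
            + detour_waypoints
            + ground_following_route[segment_end + 1:])
```

The rest of the ground-imitating flight route is left untouched, which is what keeps the deviation between the detour route and the original route small.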
In an embodiment, as shown in fig. 6, step S102 may include: substeps S1023 to S1024.
Step S1023, acquiring movement state information of the unmanned aerial vehicle;
and step S1024, controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to the obstacle information and the movement state information of the unmanned aerial vehicle.
The movement state information of the unmanned aerial vehicle may include a movement direction and/or a movement speed of the unmanned aerial vehicle. Based on the obstacle information and the movement state information, the unmanned aerial vehicle can be controlled more optimally to fly around the obstacle in the route direction in the first detour mode.
In an embodiment, a first detour route of the unmanned aerial vehicle is planned according to the obstacle information, the movement state information of the unmanned aerial vehicle and the first detour mode; and the unmanned aerial vehicle is controlled to fly around the obstacle in the route direction in the first detour mode according to the first detour route. Based on the obstacle information, the movement state information and the first detour mode, a first detour route with lower flight energy consumption can be planned, so that the energy loss of obstacle avoidance can be reduced while the safe flight of the unmanned aerial vehicle is ensured, further improving the endurance of the unmanned aerial vehicle.
In an embodiment, a ground-imitating detour waypoint of the unmanned aerial vehicle is determined from the three-dimensional dense map of the environment in which the unmanned aerial vehicle is located, according to the obstacle information, the movement state information of the unmanned aerial vehicle and the first detour mode; and the first detour route of the unmanned aerial vehicle is planned according to the ground-imitating detour waypoint. For example, a plurality of candidate waypoints are determined based on the obstacle information and the first detour mode, including a first candidate waypoint in the horizontal direction of the unmanned aerial vehicle and a second candidate waypoint in the vertical direction of the unmanned aerial vehicle; since the movement direction of the unmanned aerial vehicle is forward flight along the nose direction, the first candidate waypoint may be determined as the ground-imitating detour waypoint.
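One way to realize this preference is to score each candidate waypoint by how well its direction from the current position aligns with the current velocity; a drone flying forward along its nose direction then selects the horizontal candidate. This cosine-alignment rule is an illustrative assumption, not the embodiment's specified selection method.

```python
import math

def select_ground_detour_waypoint(candidates, position, velocity):
    """Pick the candidate waypoint (x, y, z) whose offset from the current
    position best aligns with the movement direction (assumed scoring rule)."""
    def alignment(waypoint):
        offset = [waypoint[i] - position[i] for i in range(3)]
        offset_norm = math.sqrt(sum(c * c for c in offset)) or 1.0
        velocity_norm = math.sqrt(sum(c * c for c in velocity)) or 1.0
        # Cosine of the angle between the offset and the velocity.
        return sum(offset[i] * velocity[i] for i in range(3)) / (offset_norm * velocity_norm)
    return max(candidates, key=alignment)
```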
In an embodiment, a target relative distance between the unmanned aerial vehicle and the obstacle in the route direction is obtained; a flight speed limit value of the unmanned aerial vehicle is determined according to the target relative distance; and the unmanned aerial vehicle is controlled to fly around the obstacle in the route direction in the first detour mode according to the flight speed limit value and the obstacle information. The flight speed limit value is the maximum flight speed of the unmanned aerial vehicle. Determining the flight speed limit value from the relative distance between the unmanned aerial vehicle and the obstacle, and controlling the unmanned aerial vehicle to fly around the obstacle in the first detour mode based on the flight speed limit value and the obstacle information, can ensure the flight safety of the unmanned aerial vehicle.
In an embodiment, the manner of obtaining the target relative distance between the unmanned aerial vehicle and the obstacle in the course direction may be: determining a first relative distance between the unmanned aerial vehicle and an obstacle in the direction of the route according to a three-dimensional dense map of the environment where the unmanned aerial vehicle is located; determining a second relative distance between the unmanned aerial vehicle and the obstacle in the direction of the route according to the depth map acquired by the vision sensor of the unmanned aerial vehicle; and carrying out weighted summation on the first relative distance and the second relative distance to obtain the target relative distance. By comprehensively considering the three-dimensional dense map and the depth map acquired by the visual sensor, the relative distance between the unmanned aerial vehicle and the obstacle in the direction of the route can be more accurately determined, and the unmanned aerial vehicle can be accurately controlled to bypass the obstacle.
Illustratively, the flight speed limit value of the unmanned aerial vehicle may be determined from the target relative distance as follows: a pre-stored conversion relation between the flight speed limit value and the distance is acquired; and the flight speed limit value of the unmanned aerial vehicle is determined according to the conversion relation and the target relative distance between the unmanned aerial vehicle and the obstacle in the route direction. The conversion relation between the flight speed limit value and the distance may be set based on actual conditions, which is not specifically limited in this embodiment.
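The two steps above, fusing the map- and depth-derived distances into a target relative distance and converting that distance into a speed limit, can be sketched as follows; the weights, the piecewise conversion table and the cruise limit are all illustrative assumptions standing in for the pre-stored conversion relation.

```python
def target_relative_distance(map_distance, depth_distance,
                             w_map=0.5, w_depth=0.5):
    # Weighted sum of the distance from the 3-D dense map and the distance
    # from the vision depth map (weights assumed).
    return w_map * map_distance + w_depth * depth_distance

def flight_speed_limit(distance,
                       conversion=((2.0, 1.0), (5.0, 3.0), (10.0, 6.0)),
                       cruise_limit=10.0):
    """Map the target relative distance (m) to a maximum flight speed (m/s)
    via a piecewise table: each (max_distance, speed) pair caps the speed
    while the obstacle is within that distance. All values assumed."""
    for max_distance, speed in conversion:
        if distance <= max_distance:
            return speed
    return cruise_limit  # far from the obstacle: assumed unconstrained limit
```

The closer the obstacle, the lower the permitted speed, which is what guarantees the flight safety described above.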
In an embodiment, first sensing data acquired by the distance sensor and second sensing data acquired by the vision sensor are acquired; a three-dimensional dense map of the environment in which the unmanned aerial vehicle is located is established according to the first sensing data and the second sensing data; and the obstacle information in the route direction of the unmanned aerial vehicle is acquired from the three-dimensional dense map. The first sensing data includes raw point cloud data acquired by the radar device, and the second sensing data includes image data acquired by the vision sensor. By fusing the point cloud data and the image data, the three-dimensional dense map of the environment in which the unmanned aerial vehicle is located can be accurately established.
For example, the three-dimensional dense map of the environment in which the unmanned aerial vehicle is located may be established from the first sensing data and the second sensing data as follows: point clouds corresponding to false targets are filtered from the raw point cloud data to obtain first point cloud data; a rigid-body transformation is applied to the first point cloud data according to the pose of the unmanned aerial vehicle to obtain second point cloud data; a tracking target is determined, and the point cloud data corresponding to the tracking target is acquired from the second point cloud data to obtain target point cloud data; a depth map is determined according to the image data acquired by the vision sensor; and the three-dimensional dense map of the environment in which the unmanned aerial vehicle is located is established according to the target point cloud data and the depth map.
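The mapping pipeline above can be sketched as a sequence of transformations over the point cloud; every helper passed in (the false-target filter, the pose transform, the tracking-target membership test, and the final fusion into a dictionary) is an assumption, since the embodiment does not specify these implementations.

```python
def build_dense_map(raw_point_cloud, is_false_target, apply_pose,
                    in_tracked_target, depth_map):
    """Sketch of the map-building steps (all helpers are assumed interfaces).

    raw_point_cloud: list of (x, y, z) points from the radar device.
    apply_pose: rigid-body transformation derived from the drone pose.
    """
    # 1. Filter out point clouds corresponding to false targets.
    first_point_cloud = [p for p in raw_point_cloud if not is_false_target(p)]
    # 2. Rigid-body transform according to the pose of the drone.
    second_point_cloud = [apply_pose(p) for p in first_point_cloud]
    # 3. Keep only the points belonging to the tracking target.
    target_point_cloud = [p for p in second_point_cloud if in_tracked_target(p)]
    # 4. Fuse the target point cloud with the vision depth map.
    return {"points": target_point_cloud, "depth_map": depth_map}
```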
In one embodiment, the three-dimensional dense map is transmitted to a control terminal communicatively coupled to the unmanned aerial vehicle for display by the control terminal. While displaying the three-dimensional dense map, the control terminal also displays the position of the unmanned aerial vehicle, the position of the obstacle, and the movement direction of the unmanned aerial vehicle in the three-dimensional dense map; the displayed position includes the current position of the unmanned aerial vehicle and its predicted position within a period of time after the current system moment. By displaying the three-dimensional dense map together with the position of the unmanned aerial vehicle, the position of the obstacle and the movement direction of the unmanned aerial vehicle, the user can conveniently know the relative relation between the unmanned aerial vehicle and the obstacle, which facilitates controlling the unmanned aerial vehicle.
In an embodiment, controlling the unmanned aerial vehicle to fly around the obstacle in the route direction in the second detour mode according to the obstacle information may include: planning a second detour route of the unmanned aerial vehicle according to the obstacle information and the second detour mode; and controlling the unmanned aerial vehicle to fly around the obstacle in the route direction in the second detour mode according to the second detour route. The second detour route is a two-dimensional detour route; that is, when the unmanned aerial vehicle is controlled to bypass the obstacle in the second detour mode, the unmanned aerial vehicle is controlled to fly around the obstacle in the horizontal direction at all the ground-imitating detour waypoints of the second detour route, so as to realize a two-dimensional detour.
In an embodiment, according to the obstacle information and the second detour mode, determining a ground-simulated detour waypoint of the unmanned aerial vehicle from a three-dimensional dense map of the environment in which the unmanned aerial vehicle is located; and planning a second detour route of the unmanned aerial vehicle according to the simulated ground detour waypoint. The three-dimensional dense map of the environment where the unmanned aerial vehicle is located is established based on first sensing data acquired by the distance sensor, and through obstacle information and a second detour mode, an accurate optimal simulated ground detour waypoint meeting constraint conditions can be determined from the three-dimensional dense map, and the unmanned aerial vehicle can detour the obstacle when flying around the simulated ground detour waypoint.
In an embodiment, determining the ground-imitating detour waypoints of the unmanned aerial vehicle from the three-dimensional dense map of the environment in which the unmanned aerial vehicle is located may include: determining the ground-imitating detour waypoints of the unmanned aerial vehicle from the three-dimensional dense map in a two-dimensional planning mode. The height of all the ground-imitating detour waypoints in the second detour route from the target work object is the same as the ground-imitating flying height of the unmanned aerial vehicle. Determining the ground-imitating detour waypoints in a two-dimensional planning mode makes the second detour route formed by the determined waypoints optimal, which can reduce the energy loss of obstacle avoidance.
It can be understood that the planning of the second detour route is similar to that of the first detour route; for the specific planning process of the second detour route, reference may be made to the planning process of the first detour route, which is not repeated herein.
Referring to fig. 7, fig. 7 is a schematic block diagram of a control device of an unmanned aerial vehicle according to an embodiment of the present application.
As shown in fig. 7, the control device 300 of the unmanned aerial vehicle includes a processor 310 and a memory 320, and the processor 310 and the memory 320 are connected through a bus 330, for example, an I2C (Inter-integrated Circuit) bus. The unmanned aerial vehicle can acquire sensing data of the distance sensor and the vision sensor. The unmanned aerial vehicle can carry the distance sensor and the vision sensor, or the distance sensor and the vision sensor are detachably connected with the unmanned aerial vehicle, or the unmanned aerial vehicle is in communication connection with equipment carrying the distance sensor and the vision sensor.
Specifically, the processor 310 may be a Micro-controller Unit (MCU), a central processing Unit (Central Processing Unit, CPU), a digital signal processor (Digital Signal Processor, DSP), or the like.
Specifically, the memory 320 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disc, a USB flash drive, a removable hard disk, or the like.
Wherein the processor 310 is configured to run a computer program stored in the memory 320 and to implement the following steps when the computer program is executed:
when the unmanned aerial vehicle performs ground-imitating flight, if the sensing of the distance sensor and the visual sensor are effective, identifying obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor and the visual sensor;
according to the obstacle information, controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a first bypass mode;
wherein the first bypass mode satisfies: the difference between the height of the unmanned aerial vehicle from its target operation object when flying around the obstacle and the ground-imitating flying height of the unmanned aerial vehicle is smaller than or equal to a preset threshold, the ground-imitating flying height being the constant height of the unmanned aerial vehicle from the target operation object when the unmanned aerial vehicle performs ground-imitating flight operation.
Optionally, the first bypass mode satisfies: the unmanned aerial vehicle bypasses the obstacle in the horizontal direction and/or the vertical direction for flying.
Optionally, the processor is further configured to implement the following steps:
when the unmanned aerial vehicle flies in an imitation ground, if the sensing of the visual sensor is invalid and the sensing of the distance sensor is valid, identifying obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor;
according to the obstacle information, controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a second bypass mode;
wherein the second bypass mode satisfies: the height of the unmanned aerial vehicle from its target operation object when flying around the obstacle is the same as the ground-imitating flying height of the unmanned aerial vehicle.
Optionally, the second bypass mode satisfies: the unmanned aerial vehicle flies around the obstacle in the horizontal direction.
Optionally, the processor is further configured to implement the following steps:
and when the unmanned aerial vehicle flies around the obstacle in the route direction, controlling the unmanned aerial vehicle to operate.
Optionally, when the processor controls the unmanned aerial vehicle to operate when the unmanned aerial vehicle is implemented to fly around an obstacle in the route direction, the processor is configured to implement:
When the unmanned aerial vehicle flies around an obstacle in the route direction, acquiring the distance between the unmanned aerial vehicle and the target working object;
and when the distance between the unmanned aerial vehicle and the target working object is smaller than or equal to a preset distance, controlling the unmanned aerial vehicle to work.
Optionally, the processor is further configured to implement the following steps:
and controlling the unmanned aerial vehicle to pause the operation when the distance between the unmanned aerial vehicle and the target operation object is larger than the preset distance.
Optionally, the processor is further configured to implement the following steps:
controlling the unmanned aerial vehicle to pause operation in the process of bypassing the obstacle flight in the route direction;
and starting the operation when the ground-imitating flight is carried out again after the unmanned aerial vehicle bypasses the obstacle in the route direction.
Optionally, the processor is further configured to implement the following steps:
acquiring ambient light intensity, image brightness differences among a plurality of visual sensors and/or lens fouling degree of the visual sensors;
and determining that the sensing of the vision sensor is invalid when the ambient light intensity is smaller than the preset light intensity, the image brightness difference value is larger than the preset brightness difference value and/or the lens dirt degree is larger than the preset dirt degree.
Optionally, the vision sensor includes at least a first sensor and a second sensor, and the processor is configured, when implementing obtaining the image brightness difference between the plurality of sensors in the vision sensor, to implement:
acquiring a first image acquired by the first sensor and a second image acquired by the second sensor;
a luminance difference between the first image and the second image is determined, and the luminance difference is determined as the image luminance difference.
Optionally, when the processor is configured to obtain the lens contamination level of the vision sensor, the processor is configured to:
determining a light attenuation diagram and/or a light refraction diagram of an image acquired by the vision sensor;
and determining the dirt degree of the lens according to the light attenuation diagram and/or the light refraction diagram.
Optionally, when the processor determines the lens contamination level according to the light attenuation diagram and/or the light refraction diagram, the processor is configured to implement:
determining the light attenuation degree of the lens of the vision sensor according to the light attenuation diagram;
determining the light refractive index of the lens of the vision sensor according to the light refractive index map;
And determining the dirt degree of the lens according to the light attenuation degree and the light refractive index.
Optionally, the processor is further configured to implement the following steps:
and determining that the sensing of the vision sensor is effective when the ambient light intensity is greater than or equal to a preset light intensity, the image brightness difference value is less than or equal to a preset brightness difference value and/or the lens dirt degree is less than or equal to a preset dirt degree.
Optionally, when the processor controls the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to the obstacle information, the processor is configured to implement:
planning a first detour route of the unmanned aerial vehicle according to the obstacle information and the first detour mode;
and controlling the unmanned aerial vehicle to bypass the obstacle flight in the direction of the route according to the first bypass route.
Optionally, when the processor is configured to plan the first detour route of the unmanned aerial vehicle according to the obstacle information and the first detour mode, the processor is configured to implement:
according to the obstacle information and the first detour mode, determining a simulated ground detour waypoint of the unmanned aerial vehicle from a three-dimensional dense map of the environment where the unmanned aerial vehicle is located;
and planning a first detour route of the unmanned aerial vehicle according to the simulated ground detour waypoint.
Optionally, when the processor is configured to plan the first detour route of the unmanned aerial vehicle according to the simulated ground detour waypoint, the processor is configured to implement:
acquiring a target route segment corresponding to the obstacle in the route direction from a ground-imitating flight route of the unmanned aerial vehicle;
and adjusting the target route segment according to the simulated ground detour waypoint to obtain the first detour route of the unmanned aerial vehicle.
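As a non-limiting sketch, the segment adjustment can be pictured as splicing the detour waypoints in place of the blocked segment of the ground-imitating flight route; representing the route as a plain list of waypoints is an assumption:

```python
# Illustrative sketch: replace the blocked segment of a ground-imitating
# flight route with detour waypoints. The list-of-waypoints representation
# is an assumption, not the disclosed data structure.

def adjust_route_segment(route, seg_start, seg_end, detour_waypoints):
    """Splice detour waypoints in place of the segment [seg_start, seg_end]
    (inclusive indices), keeping the rest of the route unchanged."""
    return route[:seg_start] + list(detour_waypoints) + route[seg_end + 1:]
```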
Optionally, the difference between the height of some or all of the simulated ground detour waypoints from the target working object and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold.
Optionally, the simulated ground detour waypoints include a first simulated ground detour waypoint in the horizontal direction of the unmanned aerial vehicle and/or a second simulated ground detour waypoint in the vertical direction of the unmanned aerial vehicle.
Optionally, the difference between the height of all of the first simulated ground detour waypoints from the target working object and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold, and the difference between the height of some or all of the second simulated ground detour waypoints from the target working object and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to the preset threshold.
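The two height constraints above can be expressed as a small predicate. Representing waypoint heights as plain numbers is an assumption, and "some" is read here as "at least one":

```python
# Illustrative predicate for the detour-waypoint height constraints.
# Heights are the distances from the target working object; representing
# them as floats and reading "some" as "at least one" are assumptions.

def detour_heights_acceptable(first_wp_heights, second_wp_heights,
                              ground_follow_height, threshold):
    """First (horizontal-direction) waypoints must all stay within the
    preset threshold of the ground-imitating flying height; for second
    (vertical-direction) waypoints it suffices that at least one does."""
    def within(height):
        return abs(height - ground_follow_height) <= threshold
    horizontal_ok = all(within(h) for h in first_wp_heights)
    vertical_ok = (not second_wp_heights) or any(within(h) for h in second_wp_heights)
    return horizontal_ok and vertical_ok
```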
Optionally, when the processor controls the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to the obstacle information, the processor is configured to implement:
acquiring motion state information of the unmanned aerial vehicle;
and controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a first bypass mode according to the obstacle information and the movement state information of the unmanned aerial vehicle.
Optionally, when the processor controls the unmanned aerial vehicle to fly around the obstacle in the route direction according to the obstacle information and the motion state information of the unmanned aerial vehicle, the processor is configured to implement:
planning a first detour route of the unmanned aerial vehicle according to the obstacle information, the movement state information of the unmanned aerial vehicle and the first detour mode;
and controlling the unmanned aerial vehicle to bypass the obstacle flight in the direction of the route according to the first bypass route.
Optionally, when implementing planning the first detour route of the unmanned aerial vehicle according to the obstacle information, the movement state information of the unmanned aerial vehicle and the first detour mode, the processor is configured to implement:
according to the obstacle information, the movement state information of the unmanned aerial vehicle and the first detour mode, determining a simulated ground detour waypoint of the unmanned aerial vehicle from a three-dimensional dense map of the environment where the unmanned aerial vehicle is located;
and planning a first detour route of the unmanned aerial vehicle according to the simulated ground detour waypoint.
Optionally, when the processor controls the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to the obstacle information, the processor is configured to implement:
acquiring a target relative distance between the unmanned aerial vehicle and an obstacle in the route direction;
determining a flight speed limit value of the unmanned aerial vehicle according to the target relative distance;
and controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a first bypass mode according to the flight speed limit value and the obstacle information.
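The mapping from the target relative distance to a flight speed limit is not specified in detail; one plausible sketch is a linear ramp-down inside a braking distance, with every numeric value below hypothetical:

```python
# Illustrative sketch: derive a flight speed limit from the target relative
# distance to the obstacle. v_max, brake_distance and v_min are assumptions.

def flight_speed_limit(target_distance: float,
                       v_max: float = 10.0,
                       brake_distance: float = 15.0,
                       v_min: float = 1.0) -> float:
    """Linearly reduce the allowed speed as the obstacle gets closer,
    never dropping below a small creep speed."""
    if target_distance >= brake_distance:
        return v_max
    fraction = max(target_distance, 0.0) / brake_distance
    return max(v_min, v_max * fraction)
```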
Optionally, when implementing obtaining the target relative distance between the unmanned aerial vehicle and the obstacle in the course direction, the processor is configured to implement:
determining a first relative distance between the unmanned aerial vehicle and an obstacle in the direction of the route according to a three-dimensional dense map of the environment where the unmanned aerial vehicle is located;
determining a second relative distance between the unmanned aerial vehicle and the obstacle in the route direction according to the depth map acquired by the vision sensor of the unmanned aerial vehicle;
and carrying out weighted summation on the first relative distance and the second relative distance to obtain the target relative distance.
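The weighted summation of the two distance estimates can be sketched directly; the weight value is an assumption, as the application does not state how the weights are chosen:

```python
# Illustrative weighted summation of the dense-map distance estimate and
# the depth-map distance estimate. The weight value is an assumption.

def fused_target_distance(map_distance: float,
                          depth_distance: float,
                          map_weight: float = 0.4) -> float:
    """Weighted sum of the two relative-distance estimates; the two
    weights sum to one."""
    return map_weight * map_distance + (1.0 - map_weight) * depth_distance
```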
Optionally, the distance sensor includes a millimeter wave radar.
Optionally, when implementing identifying obstacle information in the course direction of the unmanned aerial vehicle by using the distance sensor and the vision sensor, the processor is configured to implement:
acquiring first sensing data acquired by the distance sensor and second sensing data acquired by the vision sensor;
establishing a three-dimensional dense map of the environment where the unmanned aerial vehicle is located according to the first sensing data and the second sensing data;
and acquiring obstacle information in the route direction of the unmanned aerial vehicle from the three-dimensional dense map.
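A minimal sketch of this step, assuming both sensors yield 3-D points in a common frame and using a coarse voxel grid as a stand-in for the three-dimensional dense map; the cell size and hit threshold are assumptions:

```python
# Illustrative fusion of radar and vision points into a sparse voxel
# occupancy map, then extraction of occupied cells along the planned route.
# Cell size and hit threshold are assumptions.

def build_dense_map(radar_points, vision_points, cell=1.0):
    """Count how many sensed points fall into each voxel of a sparse grid
    keyed by integer (x, y, z) cell indices."""
    occupancy = {}
    for x, y, z in list(radar_points) + list(vision_points):
        key = (int(x // cell), int(y // cell), int(z // cell))
        occupancy[key] = occupancy.get(key, 0) + 1
    return occupancy

def obstacles_along_route(occupancy, route_cells, min_hits=1):
    """Return the route cells that the map marks as occupied."""
    return [c for c in route_cells if occupancy.get(c, 0) >= min_hits]
```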
Optionally, the processor is further configured to implement the following steps:
and controlling the unmanned aerial vehicle to send the three-dimensional dense map to a control terminal in communication connection with the unmanned aerial vehicle so as to enable the control terminal to display the three-dimensional dense map.
It should be noted that, for convenience and brevity of description, a person skilled in the art can clearly understand that, for the specific working process of the control device of the unmanned aerial vehicle described above, reference may be made to the corresponding process in the foregoing embodiment of the control method of the unmanned aerial vehicle, which is not repeated here.
Referring to fig. 8, fig. 8 is a schematic structural block diagram of an unmanned aerial vehicle according to an embodiment of the present application.
As shown in fig. 8, the unmanned aerial vehicle 400 includes a body 410, a power system 420, a distance sensor 430, a vision sensor 440, and a control device 450. The power system 420 is disposed on the body 410 and is used for providing flight power for the unmanned aerial vehicle 400, the distance sensor 430 and the vision sensor 440 are fixedly connected or detachably connected with the body 410 and are used for collecting sensing data, and the control device 450 is disposed in the body 410 and is used for controlling the unmanned aerial vehicle 400. The control device 450 may be the control device 300 shown in fig. 7.
It should be noted that, for convenience and brevity of description, a person skilled in the art can clearly understand that, for the specific working process of the unmanned aerial vehicle described above, reference may be made to the corresponding process in the foregoing embodiment of the control method of the unmanned aerial vehicle, which is not repeated here.
The embodiment of the application further provides a storage medium storing a computer program, the computer program including program instructions, and when the program instructions are executed by a processor, the steps of the control method of the unmanned aerial vehicle provided by the foregoing embodiments are implemented.
The storage medium may be an internal storage unit of the unmanned aerial vehicle according to any of the foregoing embodiments, for example, a hard disk or a memory of the unmanned aerial vehicle. The storage medium may also be an external storage device of the unmanned aerial vehicle, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, or a Flash memory Card (Flash Card) equipped on the unmanned aerial vehicle.
It is to be understood that the terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (56)

  1. A control method of an unmanned aerial vehicle, wherein the unmanned aerial vehicle is capable of acquiring sensing data of a distance sensor and a vision sensor, the method comprising:
    when the unmanned aerial vehicle performs ground-imitating flight, if the sensing of both the distance sensor and the vision sensor is effective, identifying obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor and the vision sensor;
    according to the obstacle information, controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a first bypass mode;
    wherein the first bypass mode satisfies: when the unmanned aerial vehicle flies around the obstacle, the difference between the height of the unmanned aerial vehicle from a target operation object of the unmanned aerial vehicle and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold value, and the ground-imitating flying height is the constant height of the unmanned aerial vehicle from the target operation object when the unmanned aerial vehicle performs a ground-imitating flight operation.
  2. The method of claim 1, wherein the first detour approach satisfies: the unmanned aerial vehicle bypasses the obstacle in the horizontal direction and/or the vertical direction for flying.
  3. The method of controlling a drone of claim 1, wherein the method further comprises:
    when the unmanned aerial vehicle flies in an imitation ground, if the sensing of the visual sensor is invalid and the sensing of the distance sensor is valid, identifying obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor;
    according to the obstacle information, controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a second bypass mode;
    wherein the second bypass mode satisfies: when the unmanned aerial vehicle flies around the obstacle, the height of the unmanned aerial vehicle from the target operation object of the unmanned aerial vehicle is the same as the ground-imitating flying height of the unmanned aerial vehicle.
  4. A control method of a drone according to claim 3, wherein the second detour approach satisfies: the unmanned aerial vehicle flies around the obstacle in the horizontal direction.
  5. The method of controlling a drone of any one of claims 1-4, wherein the method further comprises:
    And when the unmanned aerial vehicle flies around the obstacle in the route direction, controlling the unmanned aerial vehicle to operate.
  6. The method of claim 5, wherein controlling the unmanned aerial vehicle to operate while the unmanned aerial vehicle is flying around the obstacle in the course direction comprises:
    when the unmanned aerial vehicle flies around an obstacle in the route direction, acquiring the distance between the unmanned aerial vehicle and the target working object;
    and when the distance between the unmanned aerial vehicle and the target working object is smaller than or equal to a preset distance, controlling the unmanned aerial vehicle to work.
  7. The method of controlling a drone of claim 6, wherein the method further comprises:
    and controlling the unmanned aerial vehicle to pause the operation when the distance between the unmanned aerial vehicle and the target working object is greater than the preset distance.
  8. The method of controlling a drone of any one of claims 1-4, wherein the method further comprises:
    controlling the unmanned aerial vehicle to pause operation in the process of bypassing the obstacle flight in the route direction;
    and starting the operation when the ground-imitating flight is carried out again after the unmanned aerial vehicle bypasses the obstacle in the route direction.
  9. The method of controlling a drone of any one of claims 1-8, wherein the method further comprises:
    acquiring ambient light intensity, image brightness differences among a plurality of visual sensors and/or lens fouling degree of the visual sensors;
    and determining that the sensing of the vision sensor is invalid when the ambient light intensity is smaller than the preset light intensity, the image brightness difference value is larger than the preset brightness difference value and/or the lens dirt degree is larger than the preset dirt degree.
  10. The method of claim 9, wherein the vision sensor includes at least a first sensor and a second sensor, and wherein obtaining the image brightness difference between the plurality of vision sensors includes:
    acquiring a first image acquired by the first sensor and a second image acquired by the second sensor;
    determining a luminance difference between the first image and the second image, and determining the luminance difference as the image brightness difference.
  11. The method of claim 9, wherein obtaining the lens contamination level of the vision sensor comprises:
    Determining a light attenuation diagram and/or a light refraction diagram of an image acquired by the vision sensor;
    and determining the dirt degree of the lens according to the light attenuation diagram and/or the light refraction diagram.
  12. The method according to claim 11, wherein determining the degree of lens fouling according to the light attenuation map and/or the light refraction map comprises:
    determining the light attenuation degree of the lens of the vision sensor according to the light attenuation diagram;
    determining the light refractive index of the lens of the vision sensor according to the light refraction diagram;
    and determining the dirt degree of the lens according to the light attenuation degree and the light refractive index.
  13. The method of controlling a drone of claim 9, wherein the method further comprises:
    and determining that the sensing of the vision sensor is effective when the ambient light intensity is greater than or equal to a preset light intensity, the image brightness difference value is less than or equal to a preset brightness difference value and/or the lens dirt degree is less than or equal to a preset dirt degree.
  14. The method for controlling a drone according to any one of claims 1 to 13, wherein controlling the drone to fly around the obstacle in the course direction in a first detour manner according to the obstacle information includes:
    Planning a first detour route of the unmanned aerial vehicle according to the obstacle information and the first detour mode;
    and controlling the unmanned aerial vehicle to bypass the obstacle flight in the direction of the route according to the first bypass route.
  15. The method of claim 14, wherein planning a first detour route of the unmanned aerial vehicle according to the obstacle information and the first detour pattern comprises:
    according to the obstacle information and the first detour mode, determining a simulated ground detour navigation point of the unmanned aerial vehicle from a three-dimensional dense map of the environment where the unmanned aerial vehicle is located;
    and planning a first detour route of the unmanned aerial vehicle according to the simulated ground detour waypoint.
  16. The method of claim 15, wherein planning a first detour route for the unmanned aerial vehicle based on the simulated ground detour waypoint comprises:
    acquiring a target route segment corresponding to an obstacle in the route direction from a ground-imitating flight route of the unmanned aerial vehicle;
    and adjusting the target route segment according to the ground-simulated detour route point to obtain a first detour route of the unmanned aerial vehicle.
  17. The method of claim 15, wherein a difference between a height of some or all of the simulated ground detour waypoints from the target work object and a ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold.
  18. The method of claim 17, wherein the simulated ground detour waypoints comprise a first simulated ground detour waypoint in a horizontal direction of the unmanned aerial vehicle and/or a second simulated ground detour waypoint in a vertical direction of the unmanned aerial vehicle.
  19. The method of claim 18, wherein a difference between a height of all of the first simulated ground detour waypoints from the target work object and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold, and a difference between a height of some or all of the second simulated ground detour waypoints from the target work object and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to the preset threshold.
  20. The method for controlling a drone according to any one of claims 1 to 13, wherein controlling the drone to fly around the obstacle in the course direction in a first detour manner according to the obstacle information includes:
    Acquiring motion state information of the unmanned aerial vehicle;
    and controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a first bypass mode according to the obstacle information and the movement state information of the unmanned aerial vehicle.
  21. The method for controlling a unmanned aerial vehicle according to claim 20, wherein controlling the unmanned aerial vehicle to fly around the obstacle in the course direction in the first detour manner according to the obstacle information and the movement state information of the unmanned aerial vehicle comprises:
    planning a first detour route of the unmanned aerial vehicle according to the obstacle information, the movement state information of the unmanned aerial vehicle and the first detour mode;
    and controlling the unmanned aerial vehicle to bypass the obstacle flight in the direction of the route according to the first bypass route.
  22. The method of claim 21, wherein planning a first detour route of the unmanned aerial vehicle according to the obstacle information, the movement state information of the unmanned aerial vehicle, and the first detour manner comprises:
    according to the obstacle information, the movement state information of the unmanned aerial vehicle and the first detour mode, determining a simulated ground detour navigation point of the unmanned aerial vehicle from a three-dimensional dense map of the environment where the unmanned aerial vehicle is located;
    And planning a first detour route of the unmanned aerial vehicle according to the simulated ground detour waypoint.
  23. The method of controlling a drone according to any one of claims 1-22, wherein controlling the drone to fly around the obstacle in the course direction in a first detour manner according to the obstacle information includes:
    acquiring a target relative distance between the unmanned aerial vehicle and an obstacle in the route direction;
    determining a flight speed limit value of the unmanned aerial vehicle according to the target relative distance;
    and controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a first bypass mode according to the flight speed limit value and the obstacle information.
  24. The method of claim 23, wherein the obtaining a target relative distance between the drone and the obstacle in the course direction comprises:
    determining a first relative distance between the unmanned aerial vehicle and an obstacle in the direction of the route according to a three-dimensional dense map of the environment where the unmanned aerial vehicle is located;
    determining a second relative distance between the unmanned aerial vehicle and the obstacle in the route direction according to the depth map acquired by the vision sensor of the unmanned aerial vehicle;
    And carrying out weighted summation on the first relative distance and the second relative distance to obtain the target relative distance.
  25. The method of controlling a drone of any of claims 1-22, wherein the distance sensor comprises a millimeter wave radar.
  26. The method of controlling a drone according to any one of claims 1-22, wherein the identifying obstacle information in the course direction of the drone using the distance sensor and the vision sensor comprises:
    acquiring first sensing data acquired by the distance sensor and second sensing data acquired by the vision sensor;
    establishing a three-dimensional dense map of the environment where the unmanned aerial vehicle is located according to the first sensing data and the second sensing data;
    and acquiring obstacle information in the route direction of the unmanned aerial vehicle from the three-dimensional dense map.
  27. The method of claim 26, further comprising:
    and sending the three-dimensional dense map to a control terminal in communication connection with the unmanned aerial vehicle so as to enable the control terminal to display the three-dimensional dense map.
  28. A control device of an unmanned aerial vehicle is characterized in that the unmanned aerial vehicle can acquire sensing data of a distance sensor and a vision sensor, the control device comprises a memory and a processor,
    the memory is used for storing a computer program;
    the processor is configured to execute the computer program and when executing the computer program, implement the following steps:
    when the unmanned aerial vehicle performs ground-imitating flight, if the sensing of both the distance sensor and the vision sensor is effective, identifying obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor and the vision sensor;
    according to the obstacle information, controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a first bypass mode;
    wherein the first bypass mode satisfies: when the unmanned aerial vehicle flies around the obstacle, the difference between the height of the unmanned aerial vehicle from a target operation object of the unmanned aerial vehicle and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold value, and the ground-imitating flying height is the constant height of the unmanned aerial vehicle from the target operation object when the unmanned aerial vehicle performs a ground-imitating flight operation.
  29. The unmanned aerial vehicle control of claim 28, wherein the first detour pattern satisfies: the unmanned aerial vehicle bypasses the obstacle in the horizontal direction and/or the vertical direction for flying.
  30. The unmanned aerial vehicle control of claim 28, wherein the processor is further configured to implement the steps of:
    when the unmanned aerial vehicle flies in an imitation ground, if the sensing of the visual sensor is invalid and the sensing of the distance sensor is valid, identifying obstacle information in the route direction of the unmanned aerial vehicle by using the distance sensor;
    according to the obstacle information, controlling the unmanned aerial vehicle to bypass the obstacle flight in the route direction according to a second bypass mode;
    wherein the second bypass mode satisfies: when the unmanned aerial vehicle flies around the obstacle, the height of the unmanned aerial vehicle from the target operation object of the unmanned aerial vehicle is the same as the ground-imitating flying height of the unmanned aerial vehicle.
  31. The unmanned aerial vehicle control of claim 30, wherein the second detour pattern satisfies: the unmanned aerial vehicle flies around the obstacle in the horizontal direction.
  32. The unmanned aerial vehicle control of any of claims 28-31, wherein the processor is further configured to:
    And when the unmanned aerial vehicle flies around the obstacle in the route direction, controlling the unmanned aerial vehicle to operate.
  33. The control device of claim 32, wherein, when implementing controlling the unmanned aerial vehicle to operate while the unmanned aerial vehicle flies around the obstacle in the route direction, the processor is configured to implement:
    when the unmanned aerial vehicle flies around an obstacle in the route direction, acquiring the distance between the unmanned aerial vehicle and the target working object;
    and when the distance between the unmanned aerial vehicle and the target working object is smaller than or equal to a preset distance, controlling the unmanned aerial vehicle to work.
  34. The unmanned aerial vehicle control of claim 33, wherein the processor is further configured to implement the steps of:
    and controlling the unmanned aerial vehicle to pause the operation when the distance between the unmanned aerial vehicle and the target working object is greater than the preset distance.
  35. The unmanned aerial vehicle control of any of claims 28-31, wherein the processor is further configured to:
    controlling the unmanned aerial vehicle to pause operation in the process of bypassing the obstacle flight in the route direction;
    And starting the operation when the ground-imitating flight is carried out again after the unmanned aerial vehicle bypasses the obstacle in the route direction.
  36. The unmanned aerial vehicle control of any of claims 28-35, wherein the processor is further configured to:
    acquiring ambient light intensity, image brightness differences among a plurality of visual sensors and/or lens fouling degree of the visual sensors;
    and determining that the sensing of the vision sensor is invalid when the ambient light intensity is smaller than the preset light intensity, the image brightness difference value is larger than the preset brightness difference value and/or the lens dirt degree is larger than the preset dirt degree.
  37. The unmanned aerial vehicle control of claim 36, wherein the vision sensor comprises at least a first sensor and a second sensor, and wherein the processor, when configured to obtain the image brightness differences between the plurality of vision sensors, is configured to:
    acquiring a first image acquired by the first sensor and a second image acquired by the second sensor;
    determining a luminance difference between the first image and the second image, and determining the luminance difference as the image brightness difference.
  38. The unmanned aerial vehicle control of claim 36, wherein the processor, when configured to obtain the lens contamination level of the vision sensor, is configured to:
    determining a light attenuation diagram and/or a light refraction diagram of an image acquired by the vision sensor;
    and determining the dirt degree of the lens according to the light attenuation diagram and/or the light refraction diagram.
  39. The unmanned aerial vehicle control of claim 38, wherein the processor, when configured to determine the degree of lens fouling from the light attenuation map and/or the light refraction map, is configured to:
    determining the light attenuation degree of the lens of the vision sensor according to the light attenuation diagram;
    determining the light refractive index of the lens of the vision sensor according to the light refraction diagram;
    and determining the dirt degree of the lens according to the light attenuation degree and the light refractive index.
  40. The unmanned aerial vehicle control of claim 36, wherein the processor is further configured to implement the steps of:
    and determining that the sensing of the vision sensor is effective when the ambient light intensity is greater than or equal to a preset light intensity, the image brightness difference value is less than or equal to a preset brightness difference value and/or the lens dirt degree is less than or equal to a preset dirt degree.
  41. The control device of any one of claims 28-40, wherein the processor, when configured to control the unmanned aerial vehicle to fly around the obstacle in the course direction in a first detour manner based on the obstacle information, is configured to:
    planning a first detour route of the unmanned aerial vehicle according to the obstacle information and the first detour mode;
    and controlling the unmanned aerial vehicle to bypass the obstacle flight in the direction of the route according to the first bypass route.
  42. The control device of claim 41, wherein the processor, when implementing planning a first detour route of the unmanned aerial vehicle based on the obstacle information and the first detour pattern, is configured to:
    according to the obstacle information and the first detour mode, determining a simulated ground detour navigation point of the unmanned aerial vehicle from a three-dimensional dense map of the environment where the unmanned aerial vehicle is located;
    and planning a first detour route of the unmanned aerial vehicle according to the simulated ground detour waypoint.
  43. The control device of claim 42, wherein the processor, when implementing planning a first detour route of the unmanned aerial vehicle based on the ground-engaging detour waypoint, is configured to:
    Acquiring a target route segment corresponding to an obstacle in the route direction from a ground-imitating flight route of the unmanned aerial vehicle;
    and adjusting the target route segment according to the ground-simulated detour route point to obtain a first detour route of the unmanned aerial vehicle.
  44. The control device of claim 42, wherein a difference between a height of some or all of the simulated ground detour waypoints from the target work object and a ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold.
  45. The control device of claim 44, wherein the simulated ground detour waypoints comprise a first simulated ground detour waypoint in a horizontal direction of the unmanned aerial vehicle and/or a second simulated ground detour waypoint in a vertical direction of the unmanned aerial vehicle.
  46. The control device of claim 45, wherein a difference between a height of all of the first simulated ground detour waypoints from the target work object and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to a preset threshold, and a difference between a height of some or all of the second simulated ground detour waypoints from the target work object and the ground-imitating flying height of the unmanned aerial vehicle is less than or equal to the preset threshold.
  47. The control device of any one of claims 28-40, wherein the processor, when implementing controlling the unmanned aerial vehicle to fly around the obstacle in the route direction in a first detour manner based on the obstacle information, is configured to implement:
    acquiring motion state information of the unmanned aerial vehicle;
    and controlling the unmanned aerial vehicle to fly around the obstacle in the route direction in the first detour manner according to the obstacle information and the motion state information of the unmanned aerial vehicle.
  48. The control device of claim 47, wherein the processor, when implementing controlling the unmanned aerial vehicle to fly around the obstacle in the route direction according to the obstacle information and the motion state information of the unmanned aerial vehicle, is configured to implement:
    planning a first detour route of the unmanned aerial vehicle according to the obstacle information, the motion state information of the unmanned aerial vehicle and the first detour manner;
    and controlling the unmanned aerial vehicle to fly around the obstacle in the route direction according to the first detour route.
  49. The control device of claim 48, wherein the processor, when implementing planning a first detour route of the unmanned aerial vehicle according to the obstacle information, the motion state information of the unmanned aerial vehicle and the first detour manner, is configured to implement:
    determining ground-following detour waypoints of the unmanned aerial vehicle from a three-dimensional dense map of the environment in which the unmanned aerial vehicle is located, according to the obstacle information, the motion state information of the unmanned aerial vehicle and the first detour manner;
    and planning a first detour route of the unmanned aerial vehicle according to the ground-following detour waypoints.
  50. The control device of any one of claims 28-49, wherein the processor, when implementing controlling the unmanned aerial vehicle to fly around the obstacle in the route direction in a first detour manner based on the obstacle information, is configured to implement:
    acquiring a target relative distance between the unmanned aerial vehicle and the obstacle in the route direction;
    determining a flight speed limit value of the unmanned aerial vehicle according to the target relative distance;
    and controlling the unmanned aerial vehicle to fly around the obstacle in the route direction in the first detour manner according to the flight speed limit value and the obstacle information.
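One common way to realize a distance-dependent speed limit such as that of claim 50 is a braking-distance bound. The sketch below is illustrative only; the constants, names, and the square-root law itself are assumptions, not taken from the claims:

```python
def flight_speed_limit(target_relative_distance, max_speed=10.0,
                       safety_margin=2.0, max_deceleration=3.0):
    """Map the relative distance to an obstacle to a speed limit (m/s).

    Uses v = sqrt(2 * a * d): the highest speed from which the craft can
    still brake to a stop within the distance remaining after a safety
    margin. All parameter values are hypothetical.
    """
    braking_distance = max(0.0, target_relative_distance - safety_margin)
    limit = (2.0 * max_deceleration * braking_distance) ** 0.5
    return min(max_speed, limit)

print(flight_speed_limit(20.0))  # far from the obstacle: full 10 m/s
print(flight_speed_limit(3.5))   # close: limit drops to 3 m/s
print(flight_speed_limit(1.0))   # inside the safety margin: 0 m/s
```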
  51. The control device of claim 50, wherein the processor, when implementing acquiring a target relative distance between the unmanned aerial vehicle and the obstacle in the route direction, is configured to implement:
    determining a first relative distance between the unmanned aerial vehicle and the obstacle in the route direction according to a three-dimensional dense map of the environment in which the unmanned aerial vehicle is located;
    determining a second relative distance between the unmanned aerial vehicle and the obstacle in the route direction according to a depth map acquired by a vision sensor of the unmanned aerial vehicle;
    and performing weighted summation on the first relative distance and the second relative distance to obtain the target relative distance.
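The weighted summation of claim 51 fuses the map-derived and depth-map-derived distances into a single estimate. The weights below are hypothetical; a real system would tune them to the relative reliability of the two sources:

```python
def fuse_relative_distance(first_distance, second_distance,
                           w_map=0.4, w_depth=0.6):
    """Weighted sum of the two distance estimates; weights must sum to 1."""
    assert abs(w_map + w_depth - 1.0) < 1e-9
    return w_map * first_distance + w_depth * second_distance

# Dense map says 8.0 m, depth map says 7.5 m; depth is trusted slightly more.
print(fuse_relative_distance(8.0, 7.5))  # 7.7
```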
  52. The control device of any one of claims 28-49, wherein the distance sensor comprises a millimeter-wave radar.
  53. The control device of any one of claims 28-49, wherein the processor, when implementing identifying obstacle information in the route direction of the unmanned aerial vehicle using the distance sensor and the vision sensor, is configured to implement:
    acquiring first sensing data acquired by the distance sensor and second sensing data acquired by the vision sensor;
    establishing, according to the first sensing data and the second sensing data, a three-dimensional dense map of the environment in which the unmanned aerial vehicle is located;
    and acquiring obstacle information in the route direction of the unmanned aerial vehicle from the three-dimensional dense map.
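A toy version of the claim-53 pipeline: fuse 3-D points from the two sensor streams into a set of occupied voxels (a stand-in for the three-dimensional dense map) and query it ahead of the craft along the route direction. Every name, the voxel size, and the +x route convention are illustrative assumptions:

```python
def build_dense_map(radar_points, vision_points, voxel=0.5):
    """Fuse 3-D points from both sensors into a set of occupied voxel indices."""
    occupied = set()
    for x, y, z in list(radar_points) + list(vision_points):
        occupied.add((int(x // voxel), int(y // voxel), int(z // voxel)))
    return occupied

def obstacles_ahead(occupied, position, lookahead_x, voxel=0.5):
    """Occupied voxels ahead of `position` along the +x route direction."""
    px, py, pz = position
    return sorted(v for v in occupied
                  if px < v[0] * voxel <= lookahead_x)

# Radar and vision observe the same obstacle at ~3.2 m; a second obstacle
# at 9 m lies beyond the 6 m lookahead window.
grid = build_dense_map([(3.2, 0.1, 5.0)], [(3.3, 0.1, 5.1), (9.0, 4.0, 5.0)])
print(obstacles_ahead(grid, (0.0, 0.0, 5.0), 6.0))  # [(6, 0, 10)]
```

Note how the radar and vision points for the near obstacle fall into the same voxel, so fusing the two streams deduplicates them in the map.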
  54. The control device of claim 53, wherein the processor is further configured to:
    and controlling the unmanned aerial vehicle to send the three-dimensional dense map to a control terminal in communication connection with the unmanned aerial vehicle so as to enable the control terminal to display the three-dimensional dense map.
  55. An unmanned aerial vehicle, comprising:
    a body;
    a power system, disposed on the body, for providing flight power for the unmanned aerial vehicle;
    a distance sensor and a vision sensor, fixedly or detachably connected to the body; and
    the control device of the unmanned aerial vehicle of any one of claims 28-54, disposed within the body, for controlling the unmanned aerial vehicle.
  56. A storage medium storing a computer program which, when executed by a processor, causes the processor to implement the unmanned aerial vehicle control method according to any one of claims 1 to 27.
CN202180100503.4A 2021-11-15 2021-11-15 Unmanned aerial vehicle control method, unmanned aerial vehicle control device, unmanned aerial vehicle and storage medium Pending CN117693722A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/130663 WO2023082257A1 (en) 2021-11-15 2021-11-15 Control method and control device for unmanned aerial vehicle, unmanned aerial vehicle, and storage medium

Publications (1)

Publication Number Publication Date
CN117693722A true CN117693722A (en) 2024-03-12

Family

ID=86334951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180100503.4A Pending CN117693722A (en) 2021-11-15 2021-11-15 Unmanned aerial vehicle control method, unmanned aerial vehicle control device, unmanned aerial vehicle and storage medium

Country Status (2)

Country Link
CN (1) CN117693722A (en)
WO (1) WO2023082257A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117472082B (en) * 2023-12-26 2024-03-22 众芯汉创(江苏)科技有限公司 Unmanned aerial vehicle route generation method and device based on AI vision assistance

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106292699B * 2016-08-03 2017-12-12 广州极飞科技有限公司 Ground-following flight method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle
CN110989652A * 2019-11-05 2020-04-10 北京金景科技有限公司 Method for ground-following flight of unmanned aerial vehicle using laser radar
CN112581590B * 2020-12-28 2021-06-08 广东工业大学 Cloud-edge-terminal cooperative control method for unmanned aerial vehicles in 5G security and rescue networking
CN113281785A * 2021-05-20 2021-08-20 风迈智能科技(重庆)有限公司 Early warning method for tree obstacles in power transmission channels based on unmanned aerial vehicle laser radar

Also Published As

Publication number Publication date
WO2023082257A1 (en) 2023-05-19

Similar Documents

Publication Publication Date Title
US11460844B2 (en) Unmanned aerial image capture platform
US11787543B2 (en) Image space motion planning of an autonomous vehicle
US11635775B2 (en) Systems and methods for UAV interactive instructions and control
US11323680B2 (en) Detecting optical discrepancies in captured images
JP7465615B2 (en) Smart aircraft landing
US11861892B2 (en) Object tracking by an unmanned aerial vehicle using visual sensors
CN104808675A (en) Intelligent terminal-based somatosensory flight operation and control system and terminal equipment
CN208126205U Unmanned aerial vehicle with automatic obstacle avoidance
CN117693722A (en) Unmanned aerial vehicle control method, unmanned aerial vehicle control device, unmanned aerial vehicle and storage medium
CN220518585U Ultra-low-altitude close-range reconnaissance unmanned aerial vehicle device capable of automatically avoiding obstacles
CN118567383A Vehicle extrication method, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination