WO2021146972A1 - Airspace detection method, movable platform, device, and storage medium - Google Patents


Info

Publication number
WO2021146972A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone, preset, view, distance, field
Prior art date
Application number
PCT/CN2020/073659
Other languages
English (en)
Chinese (zh)
Inventor
刘宝恩
王涛
李鑫超
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/073659 priority Critical patent/WO2021146972A1/fr
Priority to CN202080004081.6A priority patent/CN112567308A/zh
Publication of WO2021146972A1 publication Critical patent/WO2021146972A1/fr

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions

Definitions

  • The present invention relates to the technical field of image processing, and in particular to an airspace detection method, a movable platform, a device, and a storage medium.
  • A UAV (unmanned aerial vehicle) is an aircraft operated by radio remote-control equipment or by an onboard program control device. Compared with manned aircraft, it is small and low-cost, and has been widely used in many fields, such as street-scene shooting, power inspection, traffic monitoring, and post-disaster rescue.
  • The present invention provides an airspace detection method, a movable platform, a device, and a storage medium, which are used to realize accurate detection of the flyable airspace above the drone.
  • the first aspect of the present invention is to provide an airspace detection method, the method includes:
  • the airspace above the drone is a flyable airspace.
  • the second aspect of the present invention is to provide a movable platform, the movable platform includes: a body, a power system and a control device;
  • the power system is arranged on the body and used to provide power for the movable platform
  • the control device includes a memory and a processor
  • the memory is used to store a computer program
  • the processor is configured to run a computer program stored in the memory to realize: responding to flight control instructions to make the UAV fly in a preset flight mode;
  • the airspace above the drone is a flyable airspace.
  • the third aspect of the present invention is to provide an airspace detection device, which includes:
  • Memory used to store computer programs
  • the processor is configured to run a computer program stored in the memory to realize: responding to flight control instructions to make the UAV fly in a preset flight mode;
  • the airspace above the drone is a flyable airspace.
  • The fourth aspect of the present invention is to provide a computer-readable storage medium storing program instructions, and the program instructions are used to implement the airspace detection method of the first aspect.
  • the airspace detection method, movable platform, equipment and storage medium provided by the present invention respond to flight control instructions, and make the UAV fly according to the preset flight mode according to the response result.
  • the environment image in the preset field of view above the drone is acquired and the environment image is recognized. If it is recognized that there is no object belonging to the obstacle category in the environmental image, it can be considered that there is no obstacle above the drone, and the airspace above the drone is determined to be flyable.
  • Compared with prior-art approaches that simply assume the airspace above the drone is flyable, the present invention provides an airspace detection method that determines, through detection, whether the airspace above the drone is flyable.
  • When it is detected that there are no obstacles in the airspace above, the drone can fly upwards and then return home, avoiding damage to the drone.
  • The airspace detection method provided by the present invention does not detect the entire field of view above the drone, but detects the preset field of view obtained by flying according to the preset flight mode, so that the detection range is reduced, the amount of calculation decreases, and detection efficiency improves.
  • FIG. 1 is a schematic flowchart of an airspace detection method provided by an embodiment of the present invention
  • FIG. 2 is a schematic structural diagram of a pan/tilt configured for drones in different states according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a circular field of view corresponding to an environmental image provided by an embodiment of the present invention
  • FIG. 4 is a schematic flowchart of another airspace detection method provided by an embodiment of the present invention.
  • FIG. 5a is a schematic flowchart of yet another airspace detection method according to an embodiment of the present invention.
  • FIG. 5b is a schematic flowchart of yet another airspace detection method according to an embodiment of the present invention.
  • FIG. 5c is a schematic flowchart of another airspace detection method according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of the relationship between the first distance and the second distance provided by an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of an airspace detection device provided by an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a movable platform provided by an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of an airspace detection device provided by an embodiment of the present invention.
  • During flight, the UAV is often beyond visual range.
  • When the UAV completes its flight mission, or encounters a harsh natural environment during flight, such as a protruding mountain peak blocking the link to the ground base station, the communication connection may be disconnected. To ensure the safety of the drone and avoid damage accidents, the drone then often needs to return home automatically. Since the automatic return of the UAV includes an ascending flight stage, whether there are obstacles above the UAV becomes an important factor affecting the automatic return.
  • the airspace detection method provided by the following embodiments can be used to determine whether there is an obstacle above the drone, that is, to determine whether the drone can automatically return home.
  • FIG. 1 is a schematic flowchart of an airspace detection method provided by an embodiment of the present invention.
  • The execution subject of the airspace detection method is an airspace detection device. It is understandable that the airspace detection device can be implemented as software, or as a combination of software and hardware.
  • the airspace detection device executes the airspace detection method to detect whether the airspace above the drone is a flightable airspace.
  • the airspace detection equipment in this embodiment and the following embodiments may specifically be a movable platform, such as a drone.
  • the method may include:
  • When the drone needs to return home automatically, it generates a flight control command.
  • the drone responds to this command to make itself fly according to the preset flight mode, and the preset flight mode has a one-to-one correspondence with the flight control command.
  • The flight control command can be to rotate and fly in place at a first position, or to fly from the first position to a second position a preset distance away. The first position can be the position at which the drone generated the flight control command, and can be any position during the entire flight of the UAV.
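As a minimal sketch of the one-to-one correspondence between flight control commands and preset flight modes described above, the command names and mode descriptions below are illustrative assumptions, not identifiers from the patent:

```python
# Hypothetical one-to-one mapping from flight control commands to preset
# flight modes; names are illustrative, not from the patent.
FLIGHT_MODES = {
    "ROTATE_IN_PLACE": "rotate and fly one circle at the first position",
    "FLY_TO_SECOND_POSITION": "fly a preset distance to a second position",
}

def respond_to_command(command: str) -> str:
    """Return the preset flight mode corresponding to a flight control command."""
    if command not in FLIGHT_MODES:
        raise ValueError(f"unknown flight control command: {command}")
    return FLIGHT_MODES[command]

print(respond_to_command("ROTATE_IN_PLACE"))
# -> rotate and fly one circle at the first position
```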
  • S102 Acquire an environment image corresponding to the preset field of view above the drone during the flight in the preset flight mode.
  • During flight in the preset flight mode, the upward-viewing camera configured on the UAV can continuously capture images of the area above the UAV to obtain environment images.
  • The particular flight mode of the UAV directly determines the field of view corresponding to the obtained environment images, that is, the images cover a certain field of view; its direct effect is to obtain an image of the environment within the preset field of view.
  • the top-viewing camera configured on the UAV can be a monocular camera, and the top-viewing function of the monocular camera can be realized with the help of a pan-tilt, that is, the monocular camera is placed on a pan-tilt that can be lifted upwards.
  • the monocular camera can take images corresponding to the environment above the drone by lifting up the pan/tilt.
  • The lifted and non-lifted states of the pan-tilt are shown in Figure 2.
  • an alternative way may be to control the UAV to rotate and fly one circle at the current position.
  • The field of view corresponding to the obtained environment image is an annular field of view, as shown in Figure 3, and the ring height of the annular field of view is determined by the field-of-view angle of the monocular camera.
  • the drone will recognize the acquired environment image to determine the category of each object in the environment image.
  • This recognition is actually at the pixel level, that is, to identify the category to which each pixel in the environmental image belongs.
  • Optionally, the pixel-level recognition process can be performed by a neural network model.
  • the neural network model may be a convolutional neural network (Convolutional Neural Networks, CNN) model.
  • The neural network model can include multiple computing nodes; each computing node can include a convolution (Conv) layer, batch normalization (BN), and a ReLU activation function.
  • The computing nodes can be connected by means of skip connections.
  • Input data of shape K × H × W can be fed into the neural network model, and after processing by the neural network model, output data of shape C × H × W can be obtained.
  • K can represent the number of input channels; K can be equal to 4, corresponding to the four channels red (R), green (G), blue (B), and depth (D).
  • H can represent the height of the input image (i.e., the environment image), W can represent its width, and C can represent the number of categories.
  • When the input image is too large, it can be cut into N sub-images.
  • The input data is then of shape N × K × H′ × W′ and the output data of shape N × C × H′ × W′, where H′ can represent the height of a sub-image and W′ its width.
  • the feature map may also be obtained in other ways, which is not limited in this application.
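The shapes described above can be sketched as follows; the 1 × 1 channel-mixing stand-in for the real convolutional network, and all sizes, are assumptions for illustration only:

```python
import numpy as np

# Illustrative stand-in for the neural network described above: it maps a
# K x H x W input (R, G, B, depth channels) to C x H x W per-category
# confidence maps. A 1x1 "convolution" (pure channel mixing) keeps it tiny.
K, C, H, W = 4, 4, 8, 8                  # channel counts and image size (assumed)
rng = np.random.default_rng(0)
weights = rng.standard_normal((C, K))    # hypothetical learned 1x1 conv weights

def forward(image: np.ndarray) -> np.ndarray:
    """Map a K x H x W image to C x H x W confidence maps."""
    assert image.shape == (K, H, W)
    return np.einsum("ck,khw->chw", weights, image)

image = rng.standard_normal((K, H, W))
confidence = forward(image)
print(confidence.shape)                  # -> (4, 8, 8)

# When the input is too large, it can be cut into N sub-images, turning the
# data into N x K x H' x W' input and N x C x H' x W' output.
n, h_sub, w_sub = 4, 4, 4
subs = (image.reshape(K, 2, h_sub, 2, w_sub)
             .transpose(1, 3, 0, 2, 4)
             .reshape(n, K, h_sub, w_sub))
print(subs.shape)                        # -> (4, 4, 4, 4)
```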
  • Using the above-mentioned pre-trained neural network model to process the environment image to obtain a feature map may specifically include the following steps:
  • Step 1 Input the environment image into the neural network model to obtain the model output result of the neural network model.
  • The model output result of the neural network model may include confidence feature maps output by multiple output channels, where the multiple output channels correspond one-to-one to multiple object categories, and the pixel values of the confidence feature map of a single object category characterize the probability that each pixel belongs to that category.
  • Step 2 According to the model output result of the neural network model, a feature map containing semantic information is obtained.
  • Optionally, among the multiple confidence feature maps corresponding one-to-one to the multiple output channels, the object category whose confidence feature map has the largest pixel value at a given pixel location may be taken as the object category of that pixel location, thereby obtaining the feature map.
  • For example, suppose the number of output channels of the neural network model is 4 and the output of each channel is a confidence feature map, i.e., confidence feature map 1 to confidence feature map 4: confidence feature map 1 corresponds to sky, confidence feature map 2 to buildings, confidence feature map 3 to trees, and confidence feature map 4 to "other". Among these categories, all except sky can be regarded as obstacles.
  • If the pixel value at pixel location (100, 100) is 70 in confidence feature map 1, 50 in confidence feature map 2, 20 in confidence feature map 3, and 20 in confidence feature map 4, it can be determined that pixel location (100, 100) is sky.
  • Similarly, if the pixel value at pixel location (100, 80) is 20 in confidence feature map 1, 30 in confidence feature map 2, 20 in confidence feature map 3, and 70 in confidence feature map 4, it can be determined that pixel location (100, 80) belongs to "other", which is regarded as an obstacle.
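The per-pixel category choice described above is an argmax across the four confidence feature maps. A sketch using the example pixel values from the text (all other map values are filler):

```python
import numpy as np

# Per-pixel classification as an argmax over four confidence feature maps.
# Values at the two example pixel locations come from the text above.
CATEGORIES = ["sky", "building", "tree", "other"]   # "other" counts as obstacle

maps = np.zeros((4, 101, 101))                      # 4 confidence feature maps
maps[:, 100, 100] = [70, 50, 20, 20]                # example pixel (100, 100)
maps[:, 100, 80] = [20, 30, 20, 70]                 # example pixel (100, 80)

labels = np.argmax(maps, axis=0)                    # category index per pixel

print(CATEGORIES[labels[100, 100]])                 # -> sky
print(CATEGORIES[labels[100, 80]])                  # -> other (an obstacle)
```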
  • When the airspace above is determined to be flyable, the UAV can start to fly upwards and return home automatically.
  • the airspace above the drone is determined to be non-flyable airspace. At this time, the drone needs to continue hovering at the current position.
  • the airspace detection method provided in this embodiment responds to flight control instructions, and makes the UAV fly in a preset flight mode according to the response result.
  • the environment image in the preset field of view above the drone is acquired and the environment image is recognized. If it is recognized that there are no objects belonging to the obstacle category in the environmental image, it can be considered that there is no obstacle above the drone, and the airspace above the drone is determined to be flyable.
  • the present invention provides an airspace detection method to determine whether the airspace above the drone is flyable airspace through detection.
  • When it is detected that there are no obstacles in the airspace above, the drone can fly upwards and further realize automatic return, avoiding damage to the drone.
  • The airspace detection method provided by the present invention does not detect the entire field of view above the drone, but detects the preset field of view obtained by flying according to the preset flight mode, so that the detection range is reduced, the amount of calculation decreases, and detection efficiency improves.
  • In this embodiment, only the images taken by the upward-viewing camera are used to realize airspace detection; that is, a brand-new airspace detection method is provided. Compared with methods that use binocular cameras or depth sensors for airspace detection, this embodiment is more suitable for drones that are not equipped with binocular cameras or depth sensors.
  • The drone is equipped with an upward-looking camera, which usually has a large field-of-view angle, such as 45 degrees.
  • With the camera's own field-of-view angle as the upper limit, the field-of-view angle can also be adjusted in a targeted manner to obtain the preset field of view corresponding to the environment image.
  • FIG. 4 is a schematic flowchart of another airspace detection method provided by an embodiment of the present invention. As shown in FIG. 4, after step 101, the airspace detection method may further include the following steps:
  • S201 Determine the field of view angle of the preset annular field of view according to the field of view angle of the upward-looking camera configured on the drone.
  • the angle of view of the top-view camera itself is referred to as the first angle
  • the angle of the preset circular field of view corresponding to the preset flying mode is referred to as the second angle.
  • the second angle is less than or equal to the first angle.
  • the second angle can be set equal to the first angle.
  • a second angle smaller than the first angle can also be determined. After the angle of the field of view is adjusted to the second angle, the ring width of the annular field of view also becomes smaller.
  • In addition to determining the second angle based on the first angle, the second angle can optionally be determined in a more refined manner, based on the volume of the drone and the flight environment of the drone.
  • Optionally, the second angle can be determined as an angle slightly smaller than the first angle. Since the original images captured by the up-view camera correspond to the first angle, once the second angle is determined the original image is cropped, and the cropped image is the environment image. After cropping, the environment image is smaller than the original image; that is, the field of view of airspace detection is reduced, thereby reducing the amount of calculation and improving detection efficiency.
  • Optionally, the second angle can be determined to be equal to the first angle, to ensure that the airspace above the drone is detected within the maximum field-of-view angle and to avoid, as far as possible, missed detections that could damage the drone.
  • the first angle of the up-view camera itself can also be adjusted according to parameters such as the flight environment and the volume of the drone to obtain the second angle that meets the actual flight.
  • By adjusting the angle, the field of view of airspace detection can be reduced, the amount of calculation decreases, and detection efficiency improves.
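Under a simple pinhole-camera assumption, cropping the original image (first angle) down to the environment image (second angle) can be sketched as below; the 45-degree figure comes from the text, while the image width and the smaller target angle are illustrative assumptions:

```python
import math

# Cropped pixel width needed to cover the second angle, given an image whose
# full width covers the first angle (pinhole model; sizes are assumptions).
def cropped_size(width_px: int, first_angle_deg: float, second_angle_deg: float) -> int:
    """Pixel width covering second_angle within an image covering first_angle."""
    if second_angle_deg > first_angle_deg:
        raise ValueError("second angle must not exceed the first angle")
    scale = (math.tan(math.radians(second_angle_deg) / 2)
             / math.tan(math.radians(first_angle_deg) / 2))
    return round(width_px * scale)

print(cropped_size(1920, 45.0, 45.0))   # -> 1920 (second angle equal to first)
print(cropped_size(1920, 45.0, 30.0))   # smaller crop, smaller detection range
```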
  • FIG. 5a is a schematic flowchart of another airspace detection method provided by an embodiment of the present invention. As shown in Figure 5a, the method may include the following steps:
  • S301 Respond to the first flight control instruction to make the UAV rotate and fly one circle in situ at the first position.
  • the first flight control instruction may specifically be a control instruction for controlling the drone to rotate and fly one circle in situ at the first position.
  • the drone responds to this instruction and starts to fly one circle in the first position.
  • S303 Perform merging processing on multiple images to obtain an environmental image.
  • The field of view corresponding to this environment image is the first preset annular field of view; the scene included in the environment image is the scene within the first preset annular field of view above the drone, and the first preset annular field of view corresponds to the airspace above the fuselage of the drone.
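A minimal sketch of the merging step, assuming frames taken every preset angle during one in-place rotation are simply concatenated side by side (a real system would align and blend overlapping regions):

```python
import numpy as np

# Merge frames from one in-place rotation into a single environment image
# covering the annular field of view. Pure concatenation is an assumption;
# production stitching would handle overlap and alignment.
def merge_images(images: list[np.ndarray]) -> np.ndarray:
    """Concatenate equally sized H x W x 3 frames along the width axis."""
    heights = {img.shape[0] for img in images}
    if len(heights) != 1:
        raise ValueError("all frames must share the same height")
    return np.concatenate(images, axis=1)

preset_angle = 45                       # degrees between shots (assumed)
frames = [np.zeros((480, 640, 3), dtype=np.uint8)
          for _ in range(360 // preset_angle)]
panorama = merge_images(frames)
print(panorama.shape)                   # -> (480, 5120, 3)
```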
  • step 304 and step 305 The execution process of the foregoing step 304 and step 305 is similar to the corresponding steps of the foregoing embodiment, and reference may be made to the related description in the embodiment shown in FIG. 1, which will not be repeated here.
  • the UAV in response to the first flight control instruction, rotates and flies once in the first position. It is precisely because of this special flight mode that the acquired environment image corresponds to a special field of view, that is, the first preset annular field of view above the drone. Finally, through the recognition of environmental images, it can be accurately determined whether there are obstacles in the airspace above the drone, that is, whether the drone can return home automatically.
  • The detection range above is actually a minimum detection range.
  • In practice, obstacle detection is often also required above the wings of the drone.
  • Moreover, the UAV may be affected by strong winds or other environmental factors during the ascending flight, and may shake while ascending. Based on the above two considerations, the detection range for obstacles can also be appropriately expanded.
  • Fig. 5b is a schematic flowchart of yet another airspace detection method according to an embodiment of the present invention. As shown in Figure 5b, the method may include the following steps:
  • S401 Respond to the first flight control instruction to make the UAV rotate and fly one circle in situ at the first position.
  • S402 During the rotating flight at the first position, acquire multiple images taken at intervals of a preset angle, and the multiple images acquired at the first location correspond to the first preset annular field of view above the drone.
  • step 401 and step 402 The execution process of the foregoing step 401 and step 402 is similar to the corresponding steps of the foregoing embodiment, and reference may be made to the related description in the embodiment shown in FIG. 5a, which will not be repeated here.
  • S403 Respond to the second flight control instruction to make the drone fly to a second position that is a preset distance from the first position.
  • S404 Acquire an image taken at the second position, where the image obtained at the second position corresponds to a preset angle of view above the drone.
  • S405 Generate an environment image based on the multiple images acquired at the first location and the images acquired at the second location.
  • After responding to the first flight control instruction, the drone may continue to respond to the second flight control instruction, so that the drone flies from the current first position to the second position.
  • the first position and the second position are separated by a preset distance.
  • the up-view camera on the drone can continue to take images in this second position.
  • the number of images may be one, and the field of view of this image is the preset field of view angle above the drone when it is at the second position.
  • the preset field of view is composed of the first preset annular field of view and the preset angle of field of view.
  • the preset distance should not be less than the difference between the first distance L1 and the second distance L2.
  • the first distance L1 is the distance between the center of the drone's fuselage and the top-viewing camera
  • the second distance L2 is the maximum distance between the center of the fuselage and the edge of the drone's wing.
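The constraint on the preset distance can be checked with a one-liner; the centimetre figures below are hypothetical dimensions, not values from the patent:

```python
# The preset distance between the first and second positions should not be
# less than the difference between L1 (fuselage centre to up-view camera)
# and L2 (fuselage centre to wing edge).
def min_preset_distance(l1_cm: int, l2_cm: int) -> int:
    """Smallest allowed preset distance between the two positions."""
    return abs(l1_cm - l2_cm)

l1_cm, l2_cm = 10, 35                   # hypothetical camera offset and wing reach
print(min_preset_distance(l1_cm, l2_cm))  # -> 25
```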
  • the multiple images taken by the rotating flight at the first position are merged, and the merged result and the image taken at the second position are jointly determined as the environment image.
  • step 406 and step 407 are similar to the corresponding steps of the foregoing embodiment, and reference may be made to the related description in the embodiment shown in FIG. 1, which will not be repeated here.
  • When it is determined that the airspace above the drone is a flyable airspace, the drone returns to the first position and starts the return home at the first position.
  • the drone will first rotate and fly in the first position and then fly to the second position, thereby completing the preset flight mode.
  • multiple images will be taken at the first position, and images will also be taken at the second position, thereby generating environmental images based on all the captured images.
  • This environment image corresponds to the first preset annular field of view above the drone at the first position, and also corresponds to the preset angle of view above the drone at the second position.
  • the detection range of obstacles in this embodiment is expanded from the first position to the second position. This expansion of the range can determine whether there are obstacles above the UAV fuselage and wings at the same time. In this way, it is possible to more accurately determine whether the UAV can return home automatically.
  • The embodiment shown in Fig. 5b performs obstacle detection at the second position based on only one captured image, which obviously cannot fully detect whether there are obstacles in the airspace above the drone.
  • FIG. 5c is a schematic flowchart of another airspace detection method provided by an embodiment of the present invention. As shown in Figure 5c, the method may include the following steps:
  • S501 Respond to the first flight control instruction to make the UAV rotate and fly one circle in situ at the first position.
  • S502 During the rotating flight at the first position, acquire multiple images taken at intervals of a preset angle, and the multiple images acquired at the first location correspond to the first preset annular field of view above the drone.
  • S503 Respond to the third flight control instruction to make the UAV rotate and fly one circle in situ at a second position that is a preset distance from the first position.
  • S504 During the rotating flight at the second position, acquire multiple images taken at intervals of a preset angle, and the multiple images acquired at the second location correspond to the second preset annular field of view above the drone.
  • Step 503 to step 504 are actually basically similar to the execution process of step 501 to step 502, and the specific content can refer to the corresponding content in the foregoing embodiment.
  • S505 Perform merging processing on the multiple images respectively obtained at the first position and the second position respectively, to obtain an environment image.
  • multiple images taken at the first position and multiple images taken at the second position have been acquired.
  • the images taken at different positions are merged, so as to obtain the environment image corresponding to the first position and the environment image corresponding to the second position, so that the two environmental images can be subsequently identified.
  • step 506 and step 507 are similar to the corresponding steps of the foregoing embodiment, and reference may be made to the related description in the embodiment shown in FIG. 1, which will not be repeated here.
  • the UAV will rotate and fly one circle in the first position and the second position respectively, and recognize based on multiple images taken at the two positions.
  • Through the special flight mode of rotating flight, it is possible to comprehensively determine whether there are obstacles in the airspace above the drone's fuselage and above the drone's wings.
  • Compared with the embodiment shown in Figure 5b, this embodiment can therefore more comprehensively and accurately determine whether the UAV can return home automatically.
  • When the airspace above is flyable, the drone's return can be controlled. It is easy to understand that any flight of the drone consumes battery power. Therefore, before controlling the drone to return home, the power required during the return process can also be determined; when the current remaining power exceeds the power required for the return, the drone is controlled to return home.
  • An optional way is to first estimate the wind speed information from the current position to the return destination based on historical wind speed information, then determine the ground speed information from the current position to the return destination, and finally determine the power required for the drone's return process based on the wind speed information and ground speed information.
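The return-power check described above can be sketched as follows: flight time follows from distance and ground speed (airspeed minus headwind), and energy from cruise power times time. All figures are illustrative assumptions:

```python
# Estimate the energy a return flight needs and compare it with the battery.
# Airspeed, headwind, cruise power, and distance are hypothetical values.
def return_energy_wh(distance_m: float, airspeed_ms: float, headwind_ms: float,
                     cruise_power_w: float) -> float:
    """Energy (Wh) to cover distance_m at ground speed airspeed - headwind."""
    ground_speed = airspeed_ms - headwind_ms
    if ground_speed <= 0:
        raise ValueError("headwind too strong: no forward progress")
    flight_time_s = distance_m / ground_speed
    return cruise_power_w * flight_time_s / 3600.0

needed = return_energy_wh(distance_m=3600, airspeed_ms=12.0, headwind_ms=2.0,
                          cruise_power_w=180.0)
remaining_wh = 40.0
print(needed)                           # -> 18.0
print(remaining_wh > needed)            # -> True (safe to start the return)
```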
  • FIG. 7 is a schematic structural diagram of an airspace detection device provided by an embodiment of the present invention. Referring to FIG. 7, this embodiment provides an airspace detection device that can execute the above-mentioned airspace detection method. Specifically,
  • the airspace detection device includes:
  • the response module 11 is used to respond to flight control commands to make the UAV fly in a preset flight mode.
  • the acquiring module 12 is configured to acquire an environment image corresponding to the preset field of view above the drone during the flight in the preset flight mode.
  • the recognition module 13 is used to recognize the category to which the object in the environmental image belongs.
  • the determining module 14 is configured to determine that the airspace above the drone is a flyable airspace if there is no object belonging to the obstacle category in the environment image.
  • the device shown in FIG. 7 can also execute the methods of the embodiments shown in FIGS. 1 to 6.
  • FIGS. 1 to 6 For parts that are not described in detail in this embodiment, reference may be made to the related descriptions of the embodiments shown in FIGS. 1 to 6.
  • the implementation process and technical effects of this technical solution please refer to the description in the embodiment shown in FIG. 1 to FIG. 6, which will not be repeated here.
  • FIG. 8 is a schematic structural diagram of a movable platform provided by an embodiment of the present invention. Referring to FIG. 8, an embodiment of the present invention provides a movable platform, such as the aforementioned unmanned aerial vehicle. Specifically,
  • the movable platform includes: a body 21, a power system 22, and a control device 23.
  • the power system 22 is arranged on the body 21 and is used to provide power for the movable platform.
  • the control device 23 includes a memory 231 and a processor 232.
  • the memory is used to store a computer program
  • the processor is configured to run a computer program stored in the memory to realize: responding to flight control instructions to make the UAV fly in a preset flight mode;
  • the airspace above the drone is a flyable airspace.
  • the processor 232 is further configured to: if an object belonging to an obstacle category is recognized in the environment image, determine that the airspace above the drone is a non-flyable airspace.
  • the body 21 is provided with a camera 24 capable of looking upwards.
  • the processor responds to the flight control instruction to make the drone fly in a preset flight mode, so that the processor obtains the environment image in the preset circular field of view above the drone;
  • the processor 232 is further configured to determine the field of view angle of the preset annular field of view according to the field of view angle of the up-view camera.
  • Optionally, the processor 232 is further configured to: determine the field-of-view angle of the preset annular field of view according to the field-of-view angle of the up-view camera, the volume of the drone, and the flight environment of the drone.
  • the processor 232 is further configured to: respond to the first flight control instruction, so that the UAV rotates and flies once at the first position.
  • The processor 232 is further configured to: during the rotating flight at the first position, acquire a plurality of images taken at intervals of a preset angle, where the plurality of images acquired at the first position correspond to the first preset annular field of view above the drone;
  • The multiple images are merged to obtain the environment image.
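The interval-shooting-and-merge step above can be sketched as follows. `take_photo` stands in for whichever camera API the drone exposes, and `merge` is a placeholder for the actual stitching of the interval shots into one overhead environment image; both names and the 60-degree interval are assumptions for illustration.

```python
def capture_rotation(take_photo, preset_angle_deg=60):
    """During one in-place rotation, take a photo every preset_angle_deg
    degrees of heading; together the shots cover the first preset annular
    field of view above the drone."""
    return [take_photo(heading) for heading in range(0, 360, preset_angle_deg)]

def merge(images):
    """Placeholder for stitching the interval shots into one environment
    image; a real implementation would project each view into a common
    overhead panorama."""
    return {"views": images}

shots = capture_rotation(lambda heading: f"img@{heading}")
print(len(merge(shots)["views"]))  # 6 shots at 60-degree intervals
```

Shooting at fixed angular intervals during one rotation, rather than scanning the whole sky, is what narrows the detection range to the preset annular field of view.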
  • The processor 232 is further configured to: respond to a second flight control instruction to cause the drone to fly to a second position that is a preset distance from the first position, where the preset distance is not less than the difference between a first distance and a second distance; the first distance is the distance between the center of the drone's fuselage and the up-view camera, and the second distance is the maximum distance between the center of the fuselage and the edge of the drone's wing.
  • The processor 232 is further configured to: acquire an image taken at the second position, where the image acquired at the second position corresponds to a preset field of view above the drone;
  • the environment image is generated based on a plurality of images acquired at the first location and an image acquired at the second location.
  • The processor 232 is further configured to: respond to a third flight control instruction, so that the UAV rotates and flies in place at a second position that is a preset distance from the first position, where the preset distance is not less than the difference between the first distance and the second distance; the first distance is the distance between the center of the drone's fuselage and the up-view camera, and the second distance is the maximum distance between the center of the fuselage and the edge of the drone's wing.
  • The processor 232 is further configured to: during the rotating flight at the second position, acquire a plurality of images taken at intervals of a preset angle, where the plurality of images acquired at the second position correspond to the second preset annular field of view above the drone;
  • the multiple images respectively acquired at the first position and the second position are merged to obtain the environment image.
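The translation constraint stated above — the preset distance between the two shooting positions must be at least the difference between the first and second distances — can be checked directly. The metre values in the example are illustrative, not from the patent.

```python
def second_position_ok(preset_distance, first_distance, second_distance):
    """preset_distance: spacing between the first and second positions;
    first_distance: fuselage centre to up-view camera;
    second_distance: maximum fuselage centre to wing edge.
    The text above requires preset_distance >= first_distance - second_distance."""
    return preset_distance >= first_distance - second_distance

# e.g. camera offset 0.30 m from the fuselage centre, wing edge at most 0.25 m out:
print(second_position_ok(0.10, 0.30, 0.25))  # True  (0.10 >= 0.05)
print(second_position_ok(0.02, 0.30, 0.25))  # False
```

Spacing the two positions this way ensures the second annular field of view covers the region around the camera's blind spot at the first position before the images from both positions are merged.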
  • The movable platform shown in FIG. 8 can execute the methods of the embodiments shown in FIGS. 1 to 6.
  • For the parts not described in detail in this embodiment, and for the implementation process and technical effects of this technical solution, refer to the related descriptions of the embodiments shown in FIGS. 1 to 6, which will not be repeated here.
  • the structure of the airspace detection device shown in FIG. 9 can be implemented as an electronic device, and the electronic device can be a drone.
  • the electronic device may include: one or more processors 31 and one or more memories 32.
  • the memory 32 is used to store a program that supports the electronic device to execute the airspace detection method provided in the above-mentioned embodiments shown in FIGS. 1 to 6.
  • the processor 31 is configured to execute a program stored in the memory 32.
  • The program includes one or more computer instructions, and the following steps can be implemented when the one or more computer instructions are executed by the processor 31: respond to a flight control instruction so that the drone flies in a preset flight mode; during the flight, acquire an environment image within a preset field of view above the drone; and, if no object belonging to an obstacle category is recognized in the environment image, determine that the airspace above the drone is a flyable airspace.
  • the structure of the airspace detection device may also include a communication interface 33 for the electronic device to communicate with other devices or a communication network.
  • the processor 31 is further configured to: if it is recognized that there is an object belonging to the obstacle category in the environment image, determine that the airspace above the drone is a non-flyable airspace.
  • When responding to the flight control instruction to make the drone fly in the preset flight mode, the processor acquires the environment image within a preset annular field of view above the drone.
  • the processor 31 is further configured to determine the field of view angle of the preset annular field of view according to the field of view angle of the up-view camera configured on the drone.
  • The processor 31 is further configured to: determine the field of view angle of the preset annular field of view according to the field of view angle of the up-view camera, the volume of the drone, and the flight environment of the drone.
  • The processor 31 is further configured to: respond to a first flight control instruction so that the UAV performs one full turn of rotating flight at a first position.
  • The processor 31 is further configured to: during the rotating flight at the first position, acquire a plurality of images taken at intervals of a preset angle, where the plurality of images acquired at the first position correspond to the first preset annular field of view above the drone;
  • The multiple images are merged to obtain the environment image.
  • The processor 31 is further configured to: respond to a second flight control instruction to cause the drone to fly to a second position that is a preset distance from the first position, where the preset distance is not less than the difference between a first distance and a second distance; the first distance is the distance between the center of the drone's fuselage and the up-view camera, and the second distance is the maximum distance between the center of the fuselage and the edge of the drone's wing.
  • The processor 31 is further configured to: acquire an image taken at the second position, where the image acquired at the second position corresponds to a preset field of view above the drone;
  • the environment image is generated based on a plurality of images acquired at the first location and an image acquired at the second location.
  • The processor 31 is further configured to: respond to a third flight control instruction, so that the UAV rotates and flies in place at a second position that is a preset distance from the first position, where the preset distance is not less than the difference between the first distance and the second distance; the first distance is the distance between the center of the drone's fuselage and the up-view camera, and the second distance is the maximum distance between the center of the fuselage and the edge of the drone's wing.
  • The processor 31 is further configured to: during the rotating flight at the second position, acquire a plurality of images taken at intervals of a preset angle, where the plurality of images acquired at the second position correspond to the second preset annular field of view above the drone;
  • the multiple images respectively acquired at the first position and the second position are merged to obtain the environment image.
  • The device shown in FIG. 9 can execute the methods of the embodiments shown in FIGS. 1 to 6.
  • For the parts not described in detail in this embodiment, and for the implementation process and technical effects of this technical solution, refer to the related descriptions of the embodiments shown in FIGS. 1 to 6, which will not be repeated here.
  • An embodiment of the present invention provides a computer-readable storage medium. The computer-readable storage medium stores program instructions, and the program instructions are used to implement the airspace detection method of the embodiments shown in FIGS. 1 to 6 above.
  • The related detection device is, for example, an inertial measurement unit (IMU).
  • The device embodiments described above are merely illustrative.
  • The division of the modules or units is only a division by logical function; in actual implementation, there may be other division methods. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The mutual coupling, direct coupling, or communication connection displayed or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • The technical solution of the present invention in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to cause a computer processor to execute all or part of the steps of the method described in each embodiment of the present invention.
  • The aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.

Abstract

Disclosed are an airspace detection method, a movable platform, a device, and a storage medium. The method comprises: responding to a flight control instruction so that an unmanned aerial vehicle flies in a preset flight mode (S101); during the flight, acquiring an environment image within a preset field of view above the unmanned aerial vehicle (S102); and, if it is recognized that no object belonging to an obstacle category is present in the environment image, determining that the airspace above the unmanned aerial vehicle is flyable (S104). On the one hand, compared with methods that treat the airspace above an unmanned aerial vehicle as flyable by default, the detection makes it possible to determine whether the airspace above the unmanned aerial vehicle is actually flyable; when the airspace above is flyable, the unmanned aerial vehicle can fly upward, and an automatic return can further be carried out, avoiding damage to the unmanned aerial vehicle. On the other hand, the detection method is designed not to inspect the entire field of view above the unmanned aerial vehicle but only the preset field of view obtained by flying in the preset flight mode, so that the detection range is narrowed, the amount of computation is reduced, and the detection efficiency is improved.
PCT/CN2020/073659 2020-01-21 2020-01-21 Airspace detection method, movable platform, device, and storage medium WO2021146972A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/073659 WO2021146972A1 (fr) 2020-01-21 2020-01-21 Airspace detection method, movable platform, device, and storage medium
CN202080004081.6A CN112567308A (zh) 2020-01-21 2020-01-21 Airspace detection method, movable platform, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/073659 WO2021146972A1 (fr) 2020-01-21 2020-01-21 Airspace detection method, movable platform, device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021146972A1 true WO2021146972A1 (fr) 2021-07-29

Family

ID=75034938

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/073659 WO2021146972A1 (fr) 2020-01-21 2020-01-21 Airspace detection method, movable platform, device, and storage medium

Country Status (2)

Country Link
CN (1) CN112567308A (fr)
WO (1) WO2021146972A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444837A (zh) * 2016-10-17 2017-02-22 北京理工大学 一种无人机避障方法及系统
CN108171116A (zh) * 2017-12-01 2018-06-15 北京臻迪科技股份有限公司 飞行器辅助避障方法、装置和辅助避障系统
CN108594851A (zh) * 2015-10-22 2018-09-28 飞智控(天津)科技有限公司 一种基于双目视觉的无人机自主障碍物检测系统、方法及无人机
US10121117B1 (en) * 2016-09-08 2018-11-06 Amazon Technologies, Inc. Drone location signature filters
CN208544411U (zh) * 2018-06-11 2019-02-26 视海博(中山)科技股份有限公司 应用于受限空间安全探测的带有下置式旋翼的无人机
CN110171565A (zh) * 2019-05-17 2019-08-27 南京绿新能源研究院有限公司 一种用于光伏电站故障检测的无人机及其检测方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105468029B (zh) * 2015-09-23 2018-03-02 杨珊珊 一种无人机航拍装置及方法
GB2568369B (en) * 2015-11-13 2019-11-27 Walmart Apollo Llc Product delivery methods and systems utilizing portable unmanned delivery aircraft
CN106292704A (zh) * 2016-09-07 2017-01-04 四川天辰智创科技有限公司 规避障碍物的方法及装置
CN208044406U (zh) * 2018-03-05 2018-11-02 仲恺农业工程学院 一种基于植保无人机多重检测的避障系统及无人机
CN108769569B (zh) * 2018-04-10 2021-04-13 昆山微电子技术研究院 一种用于无人机的360度立体全景观测系统及方法


Also Published As

Publication number Publication date
CN112567308A (zh) 2021-03-26

Similar Documents

Publication Publication Date Title
US11797028B2 (en) Unmanned aerial vehicle control method and device and obstacle notification method and device
US20200402410A1 (en) Unmanned Aerial Vehicle Visual Line Of Sight Control
  • WO2020187095A1 (fr) Target tracking method and apparatus, and unmanned aerial vehicle
  • WO2021189456A1 (fr) Unmanned aerial vehicle inspection method and apparatus, and unmanned aerial vehicle
  • WO2020107372A1 (fr) Control method and apparatus for a photographing device, and device and storage medium
US20220091608A1 (en) Method, apparatus, and electronic device for obstacle avoidance
US11079242B2 (en) System and method for determining autonomous vehicle location using incremental image analysis
  • CN112180955B (zh) Visual-feedback-based secondary review method and system for an automatic inspection UAV
  • CN106502257B (zh) Anti-interference control method for precise landing of a UAV
  • JP6515367B1 (ja) Imaging system and imaging method
  • KR102195051B1 (ko) System and method for generating spatial information using image information from a drone, and computer program therefor
US10725479B2 (en) Aerial vehicle landing method, aerial vehicle, and computer readable storage medium
  • WO2021129351A1 (fr) Drone protection method and device, and drone
  • WO2022016534A1 (fr) Unmanned aerial vehicle flight control method and unmanned aerial vehicle
US20200115050A1 (en) Control device, control method, and program
  • EP3893078A1 (fr) Relay point generation method and apparatus, and unmanned aerial vehicle
  • WO2021146973A1 (fr) Unmanned aerial vehicle return-to-home control method, device, movable platform, and storage medium
  • WO2018018514A1 (fr) Target-based image exposure adjustment
  • WO2021056139A1 (fr) Landing position acquisition method and device, unmanned aerial vehicle, system, and storage medium
  • WO2020114432A1 (fr) Water detection method and apparatus, and unmanned aerial vehicle
  • WO2020237422A1 (fr) Aerial surveying method, aircraft, and storage medium
  • KR20210047490A (ko) Fire risk prediction system using an unmanned aerial vehicle, and method therefor
  • WO2021146972A1 (fr) Airspace detection method, movable platform, device, and storage medium
  • KR20190097350A (ko) Method for precision landing of a drone, recording medium for performing the same, and drone applying the same
  • JP7501535B2 (ja) Information processing device, information processing method, and information processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20916200

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20916200

Country of ref document: EP

Kind code of ref document: A1