CN116627155A - Multi-rotor unmanned aerial vehicle obstacle avoidance system and method for greenhouse - Google Patents

Info

Publication number
CN116627155A
CN116627155A
Authority
CN
China
Prior art keywords
distance
unmanned aerial
aerial vehicle
obstacle
ultrasonic sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310788698.7A
Other languages
Chinese (zh)
Inventor
岑峰
吕壹凡
戴兴平
杨兴杰
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202310788698.7A priority Critical patent/CN116627155A/en
Publication of CN116627155A publication Critical patent/CN116627155A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/25Greenhouse technology, e.g. cooling systems therefor

Abstract

The application relates to a multi-rotor unmanned aerial vehicle obstacle avoidance system for a greenhouse, comprising an unmanned aerial vehicle, an onboard flight control module fixed on the unmanned aerial vehicle, and a depth camera and ultrasonic sensors connected with the onboard flight control module. The ultrasonic sensors comprise a first sub-ultrasonic sensor and a plurality of second sub-ultrasonic sensors. The depth camera is fixed at the front end of the unmanned aerial vehicle; the first sub-ultrasonic sensor is fixed at the top of the unmanned aerial vehicle and is used for acquiring the distance between the unmanned aerial vehicle and the greenhouse ceiling; the second sub-ultrasonic sensors are uniformly distributed around the lower end of the unmanned aerial vehicle and are used for acquiring the distances between the unmanned aerial vehicle and surrounding obstacles. Compared with the prior art, the unmanned aerial vehicle obstacle avoidance method is simple and has a wide application range.

Description

Multi-rotor unmanned aerial vehicle obstacle avoidance system and method for greenhouse
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to a multi-rotor unmanned aerial vehicle obstacle avoidance system and method for a greenhouse.
Background
The greenhouse is a common production facility in modern agriculture. At present, fertilizer application in greenhouses depends on manual spraying, which has high labor cost and a low degree of automation.
Compared with traditional manual spraying, the greatest advantage of the unmanned aerial vehicle is its high degree of automation. In modern agricultural production, unmanned aerial vehicles have been widely used for crop plant protection, inspection, pest monitoring and the like, with advantages such as high efficiency, labor saving and environmental friendliness. At present, agricultural unmanned aerial vehicle operation can be divided into two modes: manually controlled flight and autonomous flight. In the manually controlled mode, the unmanned aerial vehicle is controlled by an operator at a ground station to complete various tasks. However, this mode still requires manual intervention, offers low flight autonomy, and demands a high level of operating skill. In the autonomous flight mode, the unmanned aerial vehicle can autonomously plan a path according to a preset task and environmental conditions and execute the task.
At present, unmanned aerial vehicles are generally applied in outdoor environments, where they are positioned through satellite navigation and, with suitable sensors [1]-[10], realize functions such as hovering, obstacle avoidance and path planning. In indoor environments, the main technologies applied to unmanned aerial vehicles are simultaneous localization and mapping (SLAM), optical flow detection, and the like.
The optical flow method estimates the motion of a target in the image from the gray-level changes between successive frames produced by the relative motion of the target and the imaging device. Many scholars have proposed optical flow algorithms based on different hypotheses and theoretical foundations, such as differential, energy-based, phase-based and matching methods, among which the LK (Lucas-Kanade) local smoothing, HS (Horn-Schunck) global smoothing and SIFT (Scale-Invariant Feature Transform) [11] algorithms are the most widely used. Sun [12] denoised the optical flow through median filtering on the basis of the HS algorithm and introduced asymmetric pyramid features, improving the applicability of the algorithm to image optical flow with large-displacement motion. The optical flow method can detect various potential obstacle targets in the camera field of view, but it is easily affected by environmental noise; in particular, feature points are difficult to extract in texture-free environments, causing the algorithm to fail.
SLAM is widely used in robot systems, autonomous vehicles, unmanned aerial vehicle systems and the like, and many researchers have developed autonomous obstacle avoidance systems for unmanned aerial vehicles based on SLAM. The Mono-SLAM method [13] estimates camera motion and constructs a map from a continuous image sequence; to further improve real-time performance, a monocular vision SLAM system based on the key-frame technique was presented [14]. Because visual SLAM accumulates a large amount of feature information and camera poses as the system runs, its computational load grows linearly, so early research designed visual SLAM systems in a filtering framework. In recent years, researchers have exploited the sparsity of the BA (Bundle Adjustment) problem in visual SLAM; with improved computing performance, graph-optimization-based visual SLAM [15] has become mainstream and has been successfully applied to obstacle detection for unmanned aerial vehicles. Mueller et al. [16] installed a binocular camera on a quadrotor, completed localization and mapping through binocular SLAM, and successfully realized autonomous obstacle avoidance of the quadrotor in an unknown environment. Fast-Planner [17] of the Hong Kong University of Science and Technology builds a map through SLAM and, through a carefully designed path planner, sends trajectory commands including position, velocity, acceleration and yaw angle to the flight controller to avoid obstacles. Because the visual SLAM method must store both the pose information of the unmanned aerial vehicle and the map information, it places high demands on computing performance and memory.
In addition, both the optical flow and visual SLAM methods require proper parameter settings, tuned to the circumstances, to perform effectively, and their generalization is poor.
These methods build models in advance by acquiring information from sensors such as lidar and depth cameras. However, they are currently unsuitable for use in greenhouses, for the following main reasons:
1. To ensure crop growth, greenhouses are often insulated with materials of high light transmittance, such as films and glass. These materials carry little texture information and transmit much light; under strong illumination, a vision sensor suffers from reflections, light spots and similar problems that impair obstacle recognition and tracking. Illumination also affects lidar: under intense light, the lidar may suffer reflection interference, increasing measurement error. In addition, sensors such as the lidars used outdoors are expensive and difficult to popularize in agricultural facilities. Obstacles in a greenhouse are distributed irregularly, so the unmanned aerial vehicle needs the ability to detect obstacles at different distances.
2. The greenhouse is filled with dynamic and static obstacles of different sizes and shapes, such as spray heads, brackets and water pipes. Traditional obstacle avoidance methods often require modeling in advance and place high accuracy requirements on obstacle shape, so they are unsuitable for a greenhouse environment full of complex obstacles.
Therefore, there is a need for a complete unmanned aerial vehicle obstacle avoidance scheme that can use a variety of sensors to obtain environmental information and can accommodate complications such as strong illumination and high light transmittance. Such a scheme would automatically control the unmanned aerial vehicle to complete tasks such as plant protection inspection in the greenhouse while avoiding collisions with other objects or facilities in the greenhouse environment, thereby improving plant growth efficiency and quality and reducing the workload and cost of manual inspection.
Disclosure of Invention
The application aims to provide a multi-rotor unmanned aerial vehicle obstacle avoidance system and method for a greenhouse that overcome the defects of the prior art: when a vision sensor and a lidar are used to detect obstacles, they are greatly affected by the greenhouse environment, so that measurement error increases and cost is high, and the existing obstacle avoidance methods are complex in procedure and unsuitable for a greenhouse full of complex environmental obstacles.
The aim of the application can be achieved by the following technical scheme:
the multi-rotor unmanned aerial vehicle obstacle avoidance system for the greenhouse comprises an unmanned aerial vehicle, an onboard flight control module, a depth camera, an onboard flight control module and an ultrasonic sensor, wherein the depth camera, the onboard flight control module and the ultrasonic sensor are connected with the onboard flight control module; the number of the second sub ultrasonic sensors is multiple, and the second sub ultrasonic sensors are uniformly distributed around the lower end of the unmanned aerial vehicle and used for obtaining the distance between the unmanned aerial vehicle and surrounding obstacles.
Further, the unmanned aerial vehicle further comprises a fixing plate and a U-shaped fastener. One end of the fixing plate is provided with a U-shaped groove; the second sub-ultrasonic sensor is located in the U-shaped groove and is fixed on the fixing plate through the U-shaped fastener, and the other end of the fixing plate is connected to the unmanned aerial vehicle.
The scheme also provides a multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse, which comprises the following steps:
acquiring data of a flight path or a flight direction of the unmanned aerial vehicle;
acquiring data of an ultrasonic sensor and a depth camera, and fusing and synchronizing the acquired data;
judging whether an obstacle exists around the unmanned aerial vehicle according to the fused data information, and if the obstacle does not exist, keeping the original flight path to continue to fly; if an obstacle exists, carrying out obstacle avoidance path planning on the unmanned aerial vehicle;
judging whether the unmanned aerial vehicle safely avoids the obstacle according to the fused data information, transmitting a judging result to the onboard flight control module, and navigating according to a planned obstacle avoidance path if the unmanned aerial vehicle can avoid the obstacle; if the obstacle cannot be avoided, the on-board flight control module controls the unmanned aerial vehicle to navigate.
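The four steps above can be sketched as a single decision cycle. This is a minimal illustration only; the helper names, the per-heading distance map and the threshold value are assumptions, not taken from the patent:

```python
def obstacle_avoidance_step(fused_distances, planned_path, safe_distance=1.0):
    """One cycle of the method: keep the path, replan, or hand control back.

    fused_distances: per-heading fused obstacle distances in meters
                     (assumed representation, not from the patent).
    safe_distance:   assumed preset obstacle-avoidance threshold in meters.
    """
    if min(fused_distances.values()) >= safe_distance:
        return ("keep_path", planned_path)        # no obstacle: continue original flight path
    # obstacle present: collect headings whose fused clearance is still safe
    candidates = [h for h, d in fused_distances.items() if d >= safe_distance]
    if candidates:
        return ("replan", candidates)             # avoidable: plan among safe headings
    return ("flight_control", None)               # not avoidable: onboard flight control navigates

state, detail = obstacle_avoidance_step({0: 0.5, 90: 2.0, 180: 2.4}, planned_path=[0, 0])
```

Here the call reports a replanning decision because the 0-degree heading is blocked while the 90- and 180-degree headings remain clear.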
Further, a depth map of the environment of the unmanned aerial vehicle is obtained through the depth camera to derive the distance between an obstacle and the unmanned aerial vehicle; the distances to obstacles that the depth camera has difficulty capturing are obtained through the ultrasonic sensors, and the obstacle distances obtained by the depth camera are revised accordingly to obtain the fused obstacle distance information.
Further, the specific step of fusing the obstacle distance information acquired by the depth camera with that acquired by the ultrasonic sensor comprises: filtering the obstacle distance information acquired by the depth camera and by the ultrasonic sensor separately through Kalman filters, and fusing the filtered obstacle distance information through a fuzzy logic algorithm to obtain the fused obstacle distance information.
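The pre-filtering step above can be sketched as a scalar (one-dimensional) Kalman filter applied to each range stream before fuzzy fusion. The noise variances and the sample measurements below are illustrative assumptions, not values from the patent:

```python
class ScalarKalman:
    """1-D Kalman filter for a single sensor's distance stream (meters)."""

    def __init__(self, q=0.01, r=0.25):
        self.q = q      # process noise variance (assumed value)
        self.r = r      # measurement noise variance (assumed value)
        self.x = None   # filtered distance estimate
        self.p = 1.0    # estimate variance

    def update(self, z):
        if self.x is None:                 # initialize on the first measurement
            self.x = z
            return self.x
        self.p += self.q                   # predict: uncertainty grows between measurements
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct estimate toward the measurement
        self.p *= (1.0 - k)
        return self.x

us_filter = ScalarKalman()
noisy = [2.0, 2.1, 1.9, 2.05, 2.0]         # illustrative ultrasonic readings
smoothed = [us_filter.update(z) for z in noisy]
```

A second `ScalarKalman` instance would be run in parallel on the depth camera stream, with `r` chosen to reflect that sensor's noise.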
Further, the specific steps of the obstacle distance information fusion through the fuzzy logic algorithm comprise:
acquiring measurement data of an ultrasonic sensor and a depth camera, wherein the measurement data is an input fuzzy variable;
defining a membership function for each fuzzy variable, wherein the membership function is used to convert the fuzzy variable into a fuzzy set, and each element of the fuzzy set represents the degree to which the variable corresponds to that element;
the measurement data of the ultrasonic sensor is represented as a first fuzzy set, and the measurement data of the depth camera is represented as a second fuzzy set;
and obtaining a fuzzy set of the obstacle distance through a fuzzy reasoning algorithm according to the first fuzzy set, the second fuzzy set and a preset fuzzy rule.
Further, the detection ranges of the ultrasonic sensor and the depth camera are each divided into a far interval, a middle interval and a near interval, and the preset fuzzy rules are specifically as follows:
when the ultrasonic sensor measures far and the depth camera measures far, the obstacle distance is far;
when the ultrasonic sensor measures middle and the depth camera measures far, the obstacle distance is middle;
when the ultrasonic sensor measures near and the depth camera measures far, the obstacle distance is near;
when the ultrasonic sensor measures far and the depth camera measures middle, the obstacle distance is middle;
when the ultrasonic sensor measures middle and the depth camera measures middle, the obstacle distance is middle;
when the ultrasonic sensor measures near and the depth camera measures middle, the obstacle distance is near;
when the ultrasonic sensor measures far and the depth camera measures near, the obstacle distance is middle;
when the ultrasonic sensor measures middle and the depth camera measures near, the obstacle distance is near;
when the ultrasonic sensor measures near and the depth camera measures near, the obstacle distance is near.
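The nine rules above form a lookup table from the two sensors' linguistic distances to a fused linguistic distance; note the table is asymmetric, trusting a near ultrasonic reading over a farther camera reading. A minimal sketch (the labels and function name are illustrative, not from the patent):

```python
# (ultrasonic label, depth camera label) -> fused obstacle-distance label
FUZZY_RULES = {
    ("far",  "far"):  "far",
    ("mid",  "far"):  "mid",
    ("near", "far"):  "near",
    ("far",  "mid"):  "mid",
    ("mid",  "mid"):  "mid",
    ("near", "mid"):  "near",
    ("far",  "near"): "mid",
    ("mid",  "near"): "near",
    ("near", "near"): "near",
}

def fused_label(us_label, dc_label):
    """Look up the fused obstacle-distance label for one rule firing."""
    return FUZZY_RULES[(us_label, dc_label)]
```

In a full fuzzy inference step, each rule fires with a strength given by the two membership degrees, and the fired outputs are combined before defuzzification.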
Further, the membership functions have the expression:
where D_us(d_1) represents the first fuzzy set of the ultrasonic sensor measurement distance, d_1 represents the distance measured by the ultrasonic sensor, D_dc(d_2) represents the second fuzzy set of the depth camera measurement distance, d_2 represents the distance measured by the depth camera, d_short represents the boundary value of the near interval into which the sensor's detection range is divided, d_mid represents the boundary value of the middle interval, d_long represents the boundary value of the far interval, and d_w represents a parameter of the depth camera dynamically adjusted according to the trust weight.
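Membership functions over boundaries such as d_short, d_mid and d_long are commonly taken as piecewise-linear (trapezoidal) partitions. Purely as an illustration of this assumed form, and not the patent's actual expression, the near and far sets over the ultrasonic distance d_1 could be written as:

```latex
% Illustrative trapezoidal membership functions (assumed form, not from the patent)
\mu_{\text{near}}(d_1) =
\begin{cases}
1, & d_1 \le d_{\text{short}}, \\[4pt]
\dfrac{d_{\text{mid}} - d_1}{d_{\text{mid}} - d_{\text{short}}}, & d_{\text{short}} < d_1 \le d_{\text{mid}}, \\[4pt]
0, & d_1 > d_{\text{mid}},
\end{cases}
\qquad
\mu_{\text{far}}(d_1) =
\begin{cases}
0, & d_1 \le d_{\text{mid}}, \\[4pt]
\dfrac{d_1 - d_{\text{mid}}}{d_{\text{long}} - d_{\text{mid}}}, & d_{\text{mid}} < d_1 \le d_{\text{long}}, \\[4pt]
1, & d_1 > d_{\text{long}},
\end{cases}
```

with the middle set taken so that the three memberships sum to one at every distance.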
Further, the function of the fuzzy set of obstacle distances is expressed as:
where D_fs(d) represents the fuzzy distance value of the obstacle, D_i represents the distance value estimated by the i-th sensor, and w_i is the weight of the i-th sensor.
Further, the specific obstacle avoidance path planning process after the unmanned aerial vehicle judges that an obstacle exists is as follows:
after the unmanned aerial vehicle detects an obstacle, it enters a hovering state;
the unmanned aerial vehicle changes its course angle and performs obstacle detection over the full 360 degrees around it; using the obstacle distance data acquired by the ultrasonic sensor and the depth camera, it navigates a preset distance along the direction that deviates least from the original course while satisfying the preset obstacle avoidance distance;
the unmanned aerial vehicle detects the region within 45 degrees on either side of the current navigation direction to confirm its direction of advance.
Compared with the prior art, the application has the following advantages:
(1) In this scheme, a depth camera is arranged on the unmanned aerial vehicle while ultrasonic sensors are arranged around its lower end and at its top. Obstacles affected by the environment, such as the films and glass in the greenhouse that interfere with the depth camera's detection results, can be detected through the ultrasonic sensors, avoiding the situation in which the camera detects nothing under strongly lit conditions. Moreover, the ultrasonic sensor at the top ensures that the unmanned aerial vehicle keeps a safe distance from the greenhouse roof, guaranteeing all-round obstacle avoidance inside the greenhouse and safe navigation.
(2) This scheme combines the depth camera and the ultrasonic sensors and fuses the obstacle distance data they acquire through fuzzy processing, yielding obstacle distance data with more accurate position information. This avoids the need for advance modeling and the high requirements on obstacle shape of traditional obstacle avoidance methods, gives the unmanned aerial vehicle the ability to detect obstacles at different distances, allows remote obstacles to be detected and tracked effectively, and ensures that the unmanned aerial vehicle flies stably at a safe distance.
Drawings
Fig. 1 is a schematic structural diagram of an obstacle avoidance system of an unmanned aerial vehicle provided by the application;
fig. 2 is a flowchart of an obstacle avoidance method of an unmanned aerial vehicle provided by the application;
FIG. 3 is a schematic view of an ultrasonic sensor mounting structure provided by the present application;
in the figure: 1. unmanned aerial vehicle, 2, on-board flight control module, 3, depth camera, 4, first sub-ultrasonic sensor, 5, second sub-ultrasonic sensor, 6, fixed plate, 7, U type fastener.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present application, it should be noted that, directions or positional relationships indicated by terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., are directions or positional relationships based on those shown in the drawings, or are directions or positional relationships conventionally put in use of the inventive product, are merely for convenience of describing the present application and simplifying the description, and are not indicative or implying that the apparatus or element to be referred to must have a specific direction, be constructed and operated in a specific direction, and thus should not be construed as limiting the present application.
It should be noted that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the present application, "a plurality" means two or more, unless explicitly defined otherwise.
Furthermore, terms such as "horizontal" and "vertical" do not require that the component be absolutely horizontal or vertical; it may be slightly inclined. "Horizontal" merely means that the direction is closer to horizontal than to vertical, and does not mean that the structure must be perfectly horizontal.
Example 1
This embodiment provides a multi-rotor unmanned aerial vehicle obstacle avoidance system for a greenhouse. As shown in fig. 1, the system comprises an unmanned aerial vehicle 1, an onboard flight control module 2 fixed on the unmanned aerial vehicle 1, and a depth camera 3 and ultrasonic sensors connected with the onboard flight control module 2. The ultrasonic sensors comprise a first sub-ultrasonic sensor 4 and a plurality of second sub-ultrasonic sensors 5. The depth camera 3 is fixed at the front end of the unmanned aerial vehicle 1; the first sub-ultrasonic sensor 4 is fixed at the top of the unmanned aerial vehicle and is used for acquiring the distance between the unmanned aerial vehicle and the greenhouse ceiling; the second sub-ultrasonic sensors 5 are uniformly distributed around the lower end of the unmanned aerial vehicle and are used for acquiring the distances between the unmanned aerial vehicle and surrounding obstacles.
By arranging the depth camera on the unmanned aerial vehicle and ultrasonic sensors around its lower end and at its top, obstacles affected by the environment, such as the films and glass in the greenhouse that interfere with the depth camera's detection results, can be detected through the ultrasonic sensors, avoiding the situation in which the camera detects nothing under strongly lit conditions. Moreover, the ultrasonic sensor at the top ensures that the unmanned aerial vehicle keeps a safe distance from the greenhouse roof, guaranteeing all-round obstacle avoidance inside the greenhouse and safe navigation.
As a preferred embodiment, as shown in fig. 3, the unmanned aerial vehicle 1 further includes a fixing plate 6 and a U-shaped fastener 7. One end of the fixing plate 6 is provided with a U-shaped groove; the second sub-ultrasonic sensor 5 is located in the U-shaped groove and is fixed on the fixing plate 6 through the U-shaped fastener 7, and the other end of the fixing plate 6 is connected with the unmanned aerial vehicle 1. This fixes the ultrasonic sensor stably on the unmanned aerial vehicle and facilitates detection of obstacles in its environment.
The embodiment also provides a multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse, as shown in fig. 2, comprising the following steps:
acquiring data of a flight path or a flight direction of the unmanned aerial vehicle;
acquiring data of an ultrasonic sensor and a depth camera, and fusing and synchronizing the acquired data;
judging whether an obstacle exists around the unmanned aerial vehicle according to the fused data information, and if the obstacle does not exist, keeping the original flight path to continue to fly; if an obstacle exists, carrying out obstacle avoidance path planning on the unmanned aerial vehicle;
judging whether the unmanned aerial vehicle safely avoids the obstacle according to the fused data information, transmitting a judging result to the onboard flight control module, and navigating according to a planned obstacle avoidance path if the unmanned aerial vehicle can avoid the obstacle; if the obstacle cannot be avoided, the on-board flight control module controls the unmanned aerial vehicle to navigate.
Through the combination of the depth camera and the ultrasonic sensors, the obstacle distance data they acquire are fused through fuzzy processing, yielding obstacle distance data with more accurate position information. This avoids the need for advance modeling and the high requirements on obstacle shape of traditional obstacle avoidance methods, gives the unmanned aerial vehicle the ability to detect obstacles at different distances, allows remote obstacles to be detected and tracked effectively, and ensures that the unmanned aerial vehicle flies stably at a safe distance.
Specifically, a depth map of the environment of the unmanned aerial vehicle is obtained through the depth camera to derive the distance between an obstacle and the unmanned aerial vehicle; the distances to obstacles that the depth camera has difficulty capturing are obtained through the ultrasonic sensors, and the obstacle distances obtained by the depth camera are revised accordingly to obtain the fused obstacle distance information.
The specific steps of fusing the obstacle distance information acquired by the depth camera with that acquired by the ultrasonic sensor comprise: filtering the obstacle distance information acquired by the depth camera and by the ultrasonic sensor separately through Kalman filters, and fusing the filtered obstacle distance information through a fuzzy logic algorithm to obtain the fused obstacle distance information.
The specific steps of the obstacle distance information fusion through the fuzzy logic algorithm comprise:
acquiring measurement data of an ultrasonic sensor and a depth camera, wherein the measurement data is an input fuzzy variable;
defining a membership function for each fuzzy variable, wherein the membership function is used to convert the fuzzy variable into a fuzzy set, and each element of the fuzzy set represents the degree to which the variable corresponds to that element;
the measurement data of the ultrasonic sensor is represented as a first fuzzy set, and the measurement data of the depth camera is represented as a second fuzzy set;
and obtaining a fuzzy set of the obstacle distance through a fuzzy reasoning algorithm according to the first fuzzy set, the second fuzzy set and a preset fuzzy rule.
Specifically, the detection ranges of the ultrasonic sensor and the depth camera are each divided into a far interval, a middle interval and a near interval, and the preset fuzzy rules are specifically:
when the ultrasonic sensor measured distance is far and the depth camera measured distance is far, the obstacle distance is far;
when the ultrasonic sensor measured distance is middle and the depth camera measured distance is far, the obstacle distance is middle;
when the ultrasonic sensor measured distance is near and the depth camera measured distance is far, the obstacle distance is near;
when the ultrasonic sensor measured distance is far and the depth camera measured distance is middle, the obstacle distance is middle;
when the ultrasonic sensor measured distance is middle and the depth camera measured distance is middle, the obstacle distance is middle;
when the ultrasonic sensor measured distance is near and the depth camera measured distance is middle, the obstacle distance is near;
when the ultrasonic sensor measured distance is far and the depth camera measured distance is near, the obstacle distance is middle;
when the ultrasonic sensor measured distance is middle and the depth camera measured distance is near, the obstacle distance is near;
when the ultrasonic sensor measured distance is near and the depth camera measured distance is near, the obstacle distance is near.
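The nine preset rules form a simple lookup from the two sensors' interval labels to a fused label. A minimal Python sketch, where the string labels "near"/"mid"/"far" and the dictionary layout are illustrative assumptions rather than anything specified in the patent:

```python
# Hypothetical encoding of the nine preset fuzzy rules.
# Key: (ultrasonic label, depth-camera label) -> fused obstacle-distance label.
FUZZY_RULES = {
    ("far", "far"): "far",
    ("mid", "far"): "mid",
    ("near", "far"): "near",
    ("far", "mid"): "mid",
    ("mid", "mid"): "mid",
    ("near", "mid"): "near",
    ("far", "near"): "mid",
    ("mid", "near"): "near",
    ("near", "near"): "near",
}

def infer_obstacle_distance(us_label: str, dc_label: str) -> str:
    """Return the fused obstacle-distance label for one rule firing."""
    return FUZZY_RULES[(us_label, dc_label)]
```

Note how the table is biased toward safety: whenever either sensor reports "near", the fused result is never "far".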
The membership function has the expression:
wherein D_us(d1) represents the first fuzzy set of the ultrasonic sensor measured distance, d1 represents the distance measured by the ultrasonic sensor, D_dc(d2) represents the second fuzzy set of the depth camera measured distance, d2 represents the distance measured by the depth camera, d_short represents the boundary value of the near interval divided from the sensor detection range, d_mid represents the boundary value of the middle interval divided from the sensor detection range, d_long represents the boundary value of the far interval divided from the sensor detection range, and d_w represents a parameter of the depth camera dynamically adjusted according to the trust weight.
the function of the fuzzy set of obstacle distances is expressed as:
wherein D is fs (d) A fuzzy distance value D representing an obstacle i Representing the distance value, w, estimated by the ith sensor i Is the weight of the i-th sensor.
The specific obstacle avoidance path planning process after the unmanned aerial vehicle determines that an obstacle exists is as follows:
after detecting an obstacle, the unmanned aerial vehicle enters a hovering state;
the unmanned aerial vehicle changes its course angle and performs obstacle detection over the full 360 degrees around it; using the obstacle distance data acquired by the ultrasonic sensor and the depth camera, it navigates a preset distance along the direction that deviates least from the original course while satisfying the preset obstacle avoidance distance;
the unmanned aerial vehicle then detects the regions 45 degrees to the left and right of the current navigation direction, and confirms its direction of advance.
In combination with the foregoing, this embodiment also provides a more specific implementation:
a multi-rotor unmanned aerial vehicle obstacle avoidance system used in a greenhouse is shown in fig. 1, and specifically comprises an unmanned aerial vehicle body, a rotor, an onboard flight control system, a depth camera and an ultrasonic sensor.
As shown in fig. 1, a sensor layout method is provided for the multi-rotor unmanned aerial vehicle obstacle avoidance system in the greenhouse. Specifically, a depth camera is installed facing forward on the unmanned aerial vehicle. Because a visual camera is insensitive to the film and glass structures in the greenhouse, a plurality of ultrasonic sensors are installed in the horizontal plane of the unmanned aerial vehicle to provide 360-degree coverage, so that the unmanned aerial vehicle can fly reliably within a safe distance. The forward ultrasonic sensor and the depth camera are fused to detect obstacles effectively, and the fused data are used for path planning. In addition, to ensure that the unmanned aerial vehicle flies stably and reliably in the greenhouse and avoids colliding with the greenhouse roof, an ultrasonic module is additionally installed above the unmanned aerial vehicle to detect the structure overhead.
Specifically, the on-board flight control system comprises an on-board computer and a flight controller. A Jetson Orin Nano is adopted as the on-board computer: it offers strong computing power in a small volume and integrates development environments such as ROS (Robot Operating System) and Python. The flight controller adopts a CUAV V6+, which controls the unmanned aerial vehicle flexibly and offers high computing performance. Regarding sensor selection, the environment inside the greenhouse is harsh: during plant protection tasks, the high-speed rotation of the rotors produces a downwash flow field that stirs up large amounts of water vapor and dust, which places certain requirements on the waterproof and dustproof performance of the equipment. Considering these problems, the KS114 is adopted as the ultrasonic sensor; it has IP67-level waterproof performance, a small blind area, a large detection range and high precision, and detects obstacles such as films in the greenhouse well. The ZED2i camera from Stereolabs is adopted as the forward vision module; this depth camera has IP67-level waterproof and dustproof performance, its built-in polarization technology reduces the influence of strong light in the greenhouse, and its detection range is large (up to 20 m), giving the unmanned aerial vehicle enough reaction time to avoid obstacles and effectively ensuring safe plant protection and inspection operations in the greenhouse.
The unmanned aerial vehicle obstacle avoidance method for the greenhouse specifically comprises the following steps of:
and step 1, inputting a flight path or a flight direction.
And 2, acquiring ultrasonic sensor and depth camera data, and fusing and synchronizing the sensor data.
And 3, acquiring data of the sensor in real time to detect and judge whether obstacles exist around the unmanned aerial vehicle in the flight process of the unmanned aerial vehicle in the greenhouse, and continuing to fly if the obstacles do not exist. If so, a planning algorithm is used for obstacle avoidance.
And step 4, judging whether the waypoints are reachable, if not, returning to the upper control system for processing, and carrying out flying according to the control instruction given by the control system. The unmanned aerial vehicle can be controlled to fly according to the planned obstacle avoidance flight path after obstacle avoidance is completed.
For step 1, according to the crop layout of the greenhouse, the flight direction can be generated from remote controller input in manual mode, or a flight path generated from waypoints can be input in autonomous flight mode.
For step 2, a depth map is acquired through the depth camera to obtain fine distance information, and the ultrasonic sensor acquires close-range obstacle information, including information that is difficult for the depth camera to obtain, such as films and glass.
The application uses the depth camera and the ultrasonic sensor for obstacle avoidance: the depth camera detects obstacles at both long and short range, while the ultrasonic sensor detects close-range obstacles and obstacles that are difficult for the depth camera to detect, and corrects the distance detected by the depth camera.
The forward sensor data are fused for accurate route planning. The sensors in the horizontal plane detect obstacles in real time so that the unmanned aerial vehicle flies stably within a safe distance. The ultrasonic sensor installed above the unmanned aerial vehicle detects the distance to the greenhouse structure in real time; when the distance between the unmanned aerial vehicle and the greenhouse roof falls below a threshold, the unmanned aerial vehicle is controlled to descend by a set height.
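The roof-clearance check described above reduces to a simple guard on the upward-facing ultrasonic reading. A minimal sketch; the threshold and descent step values are illustrative assumptions, not values from the patent:

```python
def ceiling_guard(top_distance_m: float, threshold_m: float = 1.0,
                  descend_step_m: float = 0.3) -> float:
    """Return an altitude correction (negative = descend) when the
    upward-facing ultrasonic sensor reports the greenhouse roof is
    closer than the safety threshold; zero correction otherwise.
    threshold_m and descend_step_m are illustrative, not from the patent."""
    if top_distance_m < threshold_m:
        return -descend_step_m
    return 0.0
```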
For the ultrasonic sensor and the visual sensor in the flight direction, a multi-sensor information fusion method is used to obtain more accurate distance information for fine route planning. Sensor data fusion can adopt a Bayesian network, fuzzy control, Kalman filtering, and so on. For example, a Kalman filter can first filter the sensor distance information, and fuzzy logic can then fuse the filtered data; compared with the data before fusion, the variance of the fused data is greatly reduced, making the distance information more reliable.
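As one possible realization of the filtering step mentioned above, a minimal scalar Kalman filter for smoothing a single range channel might look as follows. The process and measurement noise variances are illustrative assumptions, not values from the patent:

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter for smoothing one range channel
    (constant-distance process model). q and r are illustrative."""

    def __init__(self, q: float = 0.01, r: float = 0.25,
                 x0: float = 0.0, p0: float = 1.0):
        self.q, self.r = q, r      # process / measurement noise variances
        self.x, self.p = x0, p0    # state estimate and its variance

    def update(self, z: float) -> float:
        # Predict: distance assumed constant, uncertainty grows by q.
        self.p += self.q
        # Correct with the new range measurement z.
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Feeding the filter a noisy but stationary range reading drives the estimate toward the true distance while its variance shrinks, which is the variance reduction the paragraph above refers to.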
For step 3, sensor data are acquired in real time during flight to judge whether an obstacle exists. If no obstacle lies on the route, the original flight path is continued. If an obstacle is present, a path planning algorithm is used.
When the unmanned aerial vehicle detects an obstacle in a certain direction, it first hovers and enters the first stage: it changes its course angle, searches over 360 degrees, and, according to the data from the forward depth camera and ultrasonic sensor, selects the direction that deviates least from the original course while satisfying the preset obstacle avoidance distance, then sails a preset distance. It then enters the second-stage algorithm: it searches the current course and the directions 45 degrees to the left and right, confirms the final course, and advances.
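The first-stage heading selection can be sketched as a constrained minimization: among all headings whose fused clearance meets the preset obstacle avoidance distance, pick the one deviating least from the original course. The discrete sampling of the 360-degree sweep as a dictionary is an assumption made for illustration:

```python
from typing import Dict, Optional

def pick_heading(original_heading_deg: float,
                 clearance_by_heading: Dict[float, float],
                 safe_clearance_m: float) -> Optional[float]:
    """Stage-1 search sketch: clearance_by_heading maps a candidate
    heading (deg) to the fused obstacle distance (m) in that direction.
    Returns the safe heading closest to the original course, or None
    if no sampled heading satisfies the obstacle avoidance distance."""

    def deviation(h: float) -> float:
        # Angular distance on a circle, in [0, 180].
        d = abs(h - original_heading_deg) % 360.0
        return min(d, 360.0 - d)

    safe = [h for h, c in clearance_by_heading.items()
            if c >= safe_clearance_m]
    if not safe:
        return None
    return min(safe, key=deviation)
```

The second stage would then re-check the chosen course and its ±45-degree neighbours with the same predicate before committing to an advance.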
For step 4, an algorithm judges whether the unmanned aerial vehicle can avoid the obstacle and continue the mission waypoints, and the result is returned to the upper control system. If the obstacle cannot be avoided (for example, the waypoint is at an unreachable coordinate), the upper control system controls the movement of the unmanned aerial vehicle. If the obstacle can be avoided, the unmanned aerial vehicle is controlled to fly according to the planned obstacle avoidance track. The obstacle avoidance distance can be set manually from experience: obstacles closer than this distance are avoided, and obstacles farther away are ignored.
For the unmanned aerial vehicle obstacle avoidance method, the specific implementation method is as follows:
1. For autonomous flight, the longitude and latitude of the current coordinate are read through the Mavlink protocol as the departure point; information such as the course angle and flight distance is input and converted into longitude and latitude, and a route is generated. For manual flight, an estimated flight path or direction is generated from the remote controller input.
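The conversion from a course angle and flight distance to waypoint coordinates can be sketched with a local flat-earth approximation, which is adequate at greenhouse scale. The formula below is a common small-offset approximation, not one given in the patent:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; flat-earth approximation

def offset_waypoint(lat_deg: float, lon_deg: float,
                    heading_deg: float, distance_m: float):
    """Offset (lat, lon) by distance_m along heading_deg (0 = north,
    90 = east). Valid for short distances such as greenhouse flights."""
    heading = math.radians(heading_deg)
    dlat = distance_m * math.cos(heading) / EARTH_RADIUS_M
    dlon = distance_m * math.sin(heading) / (
        EARTH_RADIUS_M * math.cos(math.radians(lat_deg)))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)
```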
2. The smbus2 library of Python is used to read data from the ultrasonic sensor, and the data of the depth camera are read through the ZED2i SDK and OpenCV; the data are fused and synchronized, then sent to the on-board flight control system for processing.
3. The waypoint task is executed along the existing path; if an obstacle exists, the obstacle avoidance algorithm is executed, and if not, the original flight path is continued.
4. Whether the waypoint is reachable is judged (for example, no path exists between a waypoint and the current position); if not, the result is returned to the upper control system, which controls the unmanned aerial vehicle to land or return to the take-off point. If it is reachable, a route is generated with the path planning algorithm and the obstacle avoidance task is completed.
For the data fusion in step 2, one method the application can employ is to fuse the forward ultrasonic sensor and the depth camera using fuzzy logic.
The distance measurement data of the ultrasonic sensor and the depth camera can be represented as two fuzzy sets, specifically:
the fuzzy set of the ultrasonic measured distance is D_us(d), where d represents distance and D_us(d) represents the membership degree of an obstacle at distance d;
the fuzzy set of the depth camera measured distance is D_dc(d), where d represents distance and D_dc(d) represents the membership degree of an obstacle at distance d.
Then fuzzy rules are defined. For example, to fuse the distance data of the ultrasonic sensor and the depth camera, the following fuzzy rules can be designed: first, the detection ranges of the ultrasonic sensor and the depth camera are each divided into a far interval, a middle interval and a near interval, and the fuzzy rules are then designed over these intervals.
If the ultrasonic distance is far and the depth camera distance is far, the obstacle distance is far. If the ultrasonic distance is middle and the depth camera distance is far, the obstacle distance is middle. If the ultrasonic distance is near and the depth camera distance is far, the obstacle distance is near. If the ultrasonic distance is far and the depth camera distance is middle, the obstacle distance is middle. If the ultrasonic distance is middle and the depth camera distance is middle, the obstacle distance is middle. If the ultrasonic distance is near and the depth camera distance is middle, the obstacle distance is near. If the ultrasonic distance is far and the depth camera distance is near, the obstacle distance is middle. If the ultrasonic distance is middle and the depth camera distance is near, the obstacle distance is near. If the ultrasonic distance is near and the depth camera distance is near, the obstacle distance is near.
Then the membership functions are designed: for each fuzzy variable, a membership function is defined, which converts an input fuzzy variable into a fuzzy set in which each element represents the degree to which the variable corresponds to that element. The application divides each detection range into three intervals. For example, the detection range of the ultrasonic sensor is 0.3-5.2 m: 0.3-2.5 m is the near interval, 2.5-4 m is the middle interval, and beyond 4 m is the far interval. The detection range of the depth camera is 0.2-20 m: 0.2-2.5 m is the near interval, 2.5-4 m is the middle interval, and beyond 4 m is the far interval. For the above sensors and intervals, the following membership functions are designed:
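Since the membership-function expression itself is not reproduced in this text, the sketch below assumes a common piecewise-linear (triangular/trapezoidal) form over the stated 2.5 m and 4 m boundaries; the exact shape used in the patent may differ:

```python
def mu_near(d: float, lo: float = 2.5, hi: float = 4.0) -> float:
    """Membership in 'near': 1 below the near boundary, ramping to 0
    at the far boundary. Piecewise-linear form is an assumption."""
    if d <= lo:
        return 1.0
    if d >= hi:
        return 0.0
    return (hi - d) / (hi - lo)

def mu_far(d: float, lo: float = 2.5, hi: float = 4.0) -> float:
    """Membership in 'far': complement of 'near' over the same ramp."""
    return 1.0 - mu_near(d, lo, hi)

def mu_mid(d: float, lo: float = 2.5, hi: float = 4.0) -> float:
    """Membership in 'middle': triangular, peaking between the boundaries."""
    peak = (lo + hi) / 2.0
    half = (hi - lo) / 2.0
    return max(0.0, 1.0 - abs(d - peak) / half)
```

The same functions serve both sensors, since the stated interval boundaries (2.5 m and 4 m) coincide; only the overall detection ranges differ.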
According to the fuzzy rules and the two fuzzy sets, the fuzzy set of the obstacle distance, D_fs(d), can be calculated using a fuzzy inference method; this is the fused data.
Taking the distance between the unmanned aerial vehicle and the obstacle as an example, a specific distance value can be obtained using the center-of-gravity method, which takes a weighted average over the fuzzy sets of the output variable, the weight of each fuzzy quantity being determined by its membership degree. Specifically, for the three fuzzy quantities of the obstacle distance (near, middle and far), the membership degrees and corresponding weights are calculated, and the weighted average of the three gives the final output.
Wherein D_fs(d) is the final fuzzy distance value, D_i is the distance value estimated by the i-th sensor, and w_i is the weight of the i-th sensor. Specifically, when the obstacle distance is far, the depth camera data serve as the main information source; when the obstacle distance is near or middle, the data of the two sensors are fused to obtain more reliable data. The fuzzy set can be defuzzified by this method to obtain an accurate distance value, which is sent to the on-board flight control system to execute the obstacle avoidance process.
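The center-of-gravity defuzzification described above can be sketched as a membership-weighted average. The representative distance assigned to each label is an illustrative assumption (the patent does not state these values):

```python
from typing import Dict

def defuzzify_centroid(memberships: Dict[str, float],
                       representative: Dict[str, float]) -> float:
    """Center-of-gravity defuzzification: each fuzzy label's
    representative distance is weighted by its membership degree.
    memberships: label -> membership degree (the weights).
    representative: label -> representative distance in metres
    (illustrative values, not from the patent)."""
    num = sum(memberships[k] * representative[k] for k in memberships)
    den = sum(memberships.values())
    return num / den if den > 0 else 0.0
```

For example, an obstacle mostly "middle" with a little "near" membership defuzzifies to a value between the two representative distances, closer to the middle one.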
The foregoing describes in detail preferred embodiments of the present application. It should be understood that numerous modifications and variations can be made in accordance with the concepts of the application by one of ordinary skill in the art without undue burden. Therefore, all technical solutions which can be obtained by logic analysis, reasoning or limited experiments based on the prior art by the person skilled in the art according to the inventive concept shall be within the scope of protection defined by the claims.

Claims (10)

1. A multi-rotor unmanned aerial vehicle obstacle avoidance system for a greenhouse, comprising an unmanned aerial vehicle (1), characterized by further comprising an onboard flight control module (3), and a depth camera (2) and an ultrasonic sensor connected with the onboard flight control module (3), wherein the ultrasonic sensor comprises a first sub-ultrasonic sensor (4) and a second sub-ultrasonic sensor (5); the onboard flight control module (3) is fixed on the unmanned aerial vehicle (1); the depth camera (2) is fixed at the front end of the unmanned aerial vehicle (1); the first sub-ultrasonic sensor (4) is fixed at the top of the unmanned aerial vehicle and is used for acquiring the distance between the unmanned aerial vehicle and the greenhouse roof; the second sub-ultrasonic sensors (5) are multiple in number and are uniformly distributed around the lower end of the unmanned aerial vehicle, and are used for acquiring the distance between the unmanned aerial vehicle and surrounding obstacles.
2. The multi-rotor unmanned aerial vehicle obstacle avoidance system for a greenhouse according to claim 1, wherein the unmanned aerial vehicle (1) further comprises a fixing plate (6) and a U-shaped fastener (7), one end of the fixing plate (6) is provided with a U-shaped groove, the second sub-ultrasonic sensor (5) is located in the U-shaped groove, the second sub-ultrasonic sensor (5) is fixed on the fixing plate (6) through the U-shaped fastener (7), and the other end of the fixing plate (6) is connected with the unmanned aerial vehicle (1).
3. An obstacle avoidance method using the multi-rotor unmanned aerial vehicle obstacle avoidance system for a greenhouse according to any one of claims 1 to 2, comprising the steps of:
acquiring data of a flight path or a flight direction of the unmanned aerial vehicle;
acquiring data of an ultrasonic sensor and a depth camera, and fusing and synchronizing the acquired data;
judging whether an obstacle exists around the unmanned aerial vehicle according to the fused data information, and if the obstacle does not exist, keeping the original flight path to continue to fly; if an obstacle exists, carrying out obstacle avoidance path planning on the unmanned aerial vehicle;
judging whether the unmanned aerial vehicle safely avoids the obstacle according to the fused data information, transmitting a judging result to the onboard flight control module, and navigating according to a planned obstacle avoidance path if the unmanned aerial vehicle can avoid the obstacle; if the obstacle cannot be avoided, the on-board flight control module controls the unmanned aerial vehicle to navigate.
4. The multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse according to claim 3, wherein a depth image of the environment of the unmanned aerial vehicle is acquired through the depth camera to obtain the distance between an obstacle and the unmanned aerial vehicle, the distance between the unmanned aerial vehicle and obstacles that are difficult for the depth camera to capture is acquired through the ultrasonic sensor, and the obstacle distance obtained by the depth camera is corrected to obtain the fused obstacle distance information.
5. The multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse of claim 4, wherein the specific step of fusing the obstacle distance information acquired by the depth camera with the obstacle distance information acquired by the ultrasonic sensor comprises the following steps: and respectively carrying out filtering treatment on the obstacle distance information acquired by the depth camera and the ultrasonic sensor through a Kalman filter, and fusing the obstacle distance information after the filtering treatment through a fuzzy logic algorithm to obtain the fused obstacle distance information.
6. The multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse of claim 5, wherein the specific step of performing obstacle distance information fusion by a fuzzy logic algorithm comprises the following steps:
acquiring measurement data of the ultrasonic sensor and the depth camera as input fuzzy variables;
defining a membership function for each fuzzy variable, wherein the membership function converts the fuzzy variable into a fuzzy set whose elements represent the degree to which the variable corresponds to each element;
representing the measurement data of the ultrasonic sensor as a first fuzzy set, and the measurement data of the depth camera as a second fuzzy set;
and obtaining a fuzzy set of the obstacle distance through a fuzzy inference algorithm according to the first fuzzy set, the second fuzzy set and a preset fuzzy rule.
7. The multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse according to claim 6, wherein the detection ranges of the ultrasonic sensor and the depth camera are each divided into a far interval, a middle interval and a near interval, and the preset fuzzy rules are specifically:
when the ultrasonic sensor measured distance is far and the depth camera measured distance is far, the obstacle distance is far;
when the ultrasonic sensor measured distance is middle and the depth camera measured distance is far, the obstacle distance is middle;
when the ultrasonic sensor measured distance is near and the depth camera measured distance is far, the obstacle distance is near;
when the ultrasonic sensor measured distance is far and the depth camera measured distance is middle, the obstacle distance is middle;
when the ultrasonic sensor measured distance is middle and the depth camera measured distance is middle, the obstacle distance is middle;
when the ultrasonic sensor measured distance is near and the depth camera measured distance is middle, the obstacle distance is near;
when the ultrasonic sensor measured distance is far and the depth camera measured distance is near, the obstacle distance is middle;
when the ultrasonic sensor measured distance is middle and the depth camera measured distance is near, the obstacle distance is near;
when the ultrasonic sensor measured distance is near and the depth camera measured distance is near, the obstacle distance is near.
8. The multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse according to claim 6, wherein the membership function has the expression:
wherein D_us(d1) represents the first fuzzy set of the ultrasonic sensor measured distance, d1 represents the distance measured by the ultrasonic sensor, D_dc(d2) represents the second fuzzy set of the depth camera measured distance, d2 represents the distance measured by the depth camera, d_short represents the boundary value of the near interval divided from the sensor detection range, d_mid represents the boundary value of the middle interval divided from the sensor detection range, d_long represents the boundary value of the far interval divided from the sensor detection range, and d_w represents a parameter of the depth camera dynamically adjusted according to the trust weight.
9. The multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse according to claim 6, wherein the function of the fuzzy set of the obstacle distance is expressed as:
wherein D_fs(d) represents the fuzzy distance value of the obstacle, D_i represents the distance value estimated by the i-th sensor, and w_i is the weight of the i-th sensor.
10. The multi-rotor unmanned aerial vehicle obstacle avoidance method for a greenhouse according to claim 3, wherein the specific obstacle avoidance path planning process after the unmanned aerial vehicle determines that an obstacle exists is as follows:
after detecting an obstacle, the unmanned aerial vehicle enters a hovering state;
the unmanned aerial vehicle changes its course angle and performs obstacle detection over the full 360 degrees around it; using the obstacle distance data acquired by the ultrasonic sensor and the depth camera, it navigates a preset distance along the direction that deviates least from the original course while satisfying the preset obstacle avoidance distance;
the unmanned aerial vehicle then detects the regions 45 degrees to the left and right of the current navigation direction, and confirms its direction of advance.
CN202310788698.7A 2023-06-30 2023-06-30 Multi-rotor unmanned aerial vehicle obstacle avoidance system and method for greenhouse Pending CN116627155A (en)

Publications (1)

Publication Number Publication Date
CN116627155A true CN116627155A (en) 2023-08-22

Family

ID=87610048



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination