WO2019080924A1 - Navigation map configuration method, obstacle avoidance method and apparatus, terminal, and unmanned aerial vehicle - Google Patents

Navigation map configuration method, obstacle avoidance method and apparatus, terminal, and unmanned aerial vehicle

Info

Publication number
WO2019080924A1
WO2019080924A1 (PCT/CN2018/112077)
Authority
WO
WIPO (PCT)
Prior art keywords
area
point
map
navigation map
obstacle
Prior art date
Application number
PCT/CN2018/112077
Other languages
English (en)
French (fr)
Inventor
Zheng Liqiang (郑立强)
Liu Peng (刘鹏)
Original Assignee
Guangzhou Xaircraft Technology Co., Ltd. (广州极飞科技有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co., Ltd. (广州极飞科技有限公司)
Priority to EP18871786.2A (published as EP3702731A4)
Priority to AU2018355491A (published as AU2018355491B2)
Priority to US16/641,763 (published as US20200394924A1)
Priority to KR1020207005722A (published as KR102385820B1)
Priority to JP2020517902A (published as JP2020535545A)
Publication of WO2019080924A1

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047: Navigation or guidance aids for a single aircraft
    • G08G5/006: Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00: Equipment not otherwise provided for
    • B64D47/08: Arrangements of cameras
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106: Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0004: Transmission of traffic-related information to or from an aircraft
    • G08G5/0013: Transmission of traffic-related information to or from an aircraft with a ground station
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047: Navigation or guidance aids for a single aircraft
    • G08G5/0069: Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/04: Anti-collision systems
    • G08G5/045: Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00: UAVs specially adapted for particular uses or applications
    • B64U2101/30: UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00: UAVs specially adapted for particular uses or applications
    • B64U2101/40: UAVs specially adapted for particular uses or applications for agriculture or forestry operations
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00: UAVs characterised by their flight controls
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00: UAVs characterised by their flight controls
    • B64U2201/10: UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/104: UAVs characterised by their flight controls autonomous, using satellite radio beacon positioning systems, e.g. GPS
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Definitions

  • the present application relates to the field of aircraft technology, and in particular to a navigation map configuration method, an automatic obstacle avoidance method, a device, and a terminal.
  • aircraft such as drones have been widely used in aerial photography, agriculture, plant protection, selfie capture, express delivery, disaster relief, wildlife observation, infectious disease surveillance, mapping, news reporting, power-line inspection, and film and television shooting.
  • in the related art, the operating route of the aircraft is a scanning route generated in advance by the ground control device based on parcel information. Because the scanning route, and even the return path after the operation, is generated before takeoff, the aircraft has no way to cope with temporary task changes. For example, while the aircraft operates along the pre-generated scan route, the user may need it to also cover a work area for which no route was planned, or to automatically avoid unknown obstacles that were not planned for in advance.
  • to address these shortcomings of the prior art, the present application provides a navigation map configuration method, an automatic obstacle avoidance method, a device, and a terminal, which dynamically generate work routes so that the temporary task changes that existing work route planning methods cannot handle are dealt with effectively.
  • an embodiment of the present application provides a navigation map configuration method, including the steps of:
  • acquiring a current flight position and attitude information of the aircraft, and a depth map detected at the current flight position; obtaining three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map; and projecting the three-dimensional position information of each point, according to respective set weights, into a local navigation map centered on the current flight position.
  • an embodiment of the present application further provides an obstacle avoidance method, including the steps of:
  • the sub-area is set as an obstacle area to indicate that the aircraft performs obstacle avoidance on the obstacle area;
  • An obstacle area and a job boundary area are set in a preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
  • Embodiments of the present application further provide an aircraft navigation map configuration apparatus, including:
  • An information acquisition module configured to acquire a current flight position, attitude information of the aircraft, and a depth map detected at the current flight position
  • a three-dimensional position information obtaining module configured to obtain three-dimensional position information of each point according to the current flight position, the posture information, and the depth map;
  • the projection module is configured to project the three-dimensional position information of each point, according to respective set weights, into the local navigation map centered on the current flight position.
  • an embodiment of the present application further provides an obstacle avoidance device, including:
  • a first information acquiring module configured to acquire a current flight position, posture information of the aircraft, and a depth map detected at the current flight position
  • a three-dimensional position information obtaining module configured to obtain three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map;
  • a projection module configured to project the three-dimensional position information of each point, according to respective set weights, into a local navigation map centered on the current flight position, wherein the local navigation map includes a plurality of sub-regions;
  • a first area setting module configured to set the sub-area as an obstacle area when the weight of all points in the sub-area is greater than a preset threshold, to instruct the aircraft to perform obstacle avoidance on the obstacle area;
  • a second information acquiring module configured to acquire mapping data set by the user for indicating an obstacle area and a job boundary area, and three-dimensional position information for indicating an obstacle area in the partial navigation map;
  • the second area setting module is configured to set an obstacle area and a job boundary area in the preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
  • Embodiments of the present application further provide a terminal comprising a memory, a processor, and a computer program stored on the memory and operable on the processor, wherein the processor, when executing the program, implements the steps of the navigation map configuration method described above.
  • Embodiments of the present application further provide a terminal comprising a memory, a processor, and a computer program stored on the memory and operable on the processor, wherein the processor, when executing the program, implements the steps of the obstacle avoidance method described above.
  • According to a seventh aspect, embodiments of the present application further provide a storage medium comprising a stored program, wherein, when the program runs, the device where the storage medium is located is controlled to perform the navigation map configuration method of the first aspect described above.
  • Embodiments of the present application further provide a storage medium comprising a stored program, wherein, when the program runs, the device where the storage medium is located is controlled to perform the obstacle avoidance method of the second aspect described above.
  • an embodiment of the present application further provides an unmanned aerial vehicle including a communication module, sensors, a controller, and a storage medium; the sensors include an image sensor, a GPS receiver, an RTK positioning sensor, and an inertial sensor.
  • the communication module is configured to communicate with a ground control device
  • the GPS receiver and the positioning sensor are configured to determine a current flight position of the unmanned aerial vehicle
  • the inertial sensor is configured to determine posture information of the unmanned aerial vehicle
  • the image sensor is configured to detect a depth map at a current flight position
  • the controller is coupled to the storage medium, the storage medium being configured to store a program, the program being operative to perform the steps of the method of the first aspect described above.
  • an embodiment of the present application further provides an unmanned aerial vehicle including a communication module, sensors, a controller, and a storage medium; the sensors include an image sensor, a GPS receiver, an RTK positioning sensor, and an inertial sensor.
  • the communication module is configured to communicate with a ground control device
  • the GPS receiver and the positioning sensor are configured to determine a current flight position of the unmanned aerial vehicle
  • the inertial sensor is configured to determine posture information of the unmanned aerial vehicle
  • the image sensor is configured to detect a depth map at a current flight position
  • the controller is coupled to the storage medium, the storage medium being configured to store a program, the program being operative to perform the steps of the method of the second aspect described above.
  • the navigation map configuration method, automatic obstacle avoidance method, device, and terminal dynamically generate a local navigation map centered on the current flight position of the aircraft, and compute the three-dimensional position information of each point from the position information, attitude information, and depth map acquired by the aircraft during flight. These points may correspond to unknown obstacles encountered during flight, or to other unplanned objects the aircraft encounters. The three-dimensional position information of each point is projected into the local navigation map, and real-time operation route planning can then be performed on that map.
  • because the operation route is dynamically generated from information acquired during flight, temporary task changes can be handled effectively, for example simultaneously working on a work area with no previously planned route, or automatically avoiding the area where an unknown obstacle is located.
  • FIG. 1 is a schematic flowchart of an embodiment of a navigation map configuration method according to the present application.
  • FIG. 2 is a schematic diagram of a specific embodiment of a method for determining a preset area according to the present application
  • FIG. 3 is a schematic structural diagram of an embodiment of an aircraft navigation map configuration apparatus according to the present application.
  • FIG. 4 is a schematic flow chart of an embodiment of an obstacle avoidance method according to the present application.
  • FIG. 5 is a schematic diagram of a specific embodiment of a method for acquiring a global navigation map boundary according to the present application.
  • FIG. 6 is a schematic diagram of a specific embodiment of setting an obstacle area and a job boundary area in a global navigation map according to the present application
  • FIG. 7 is a schematic structural view of an embodiment of an obstacle avoidance device according to the present application.
  • FIG. 8 is a schematic structural diagram of an unmanned aerial vehicle 800 according to an embodiment of the present application.
  • the aircraft obstacle avoidance system designed in the present application is divided into two parts: a global obstacle avoidance planning part based mainly on a global navigation map, and a local obstacle avoidance planning part based mainly on a local navigation map.
  • Both the global navigation map and the local navigation map are used to indicate the flight of the aircraft.
  • the global navigation map and the local navigation map are independent of each other: the problems they face are different, and their mapping strategies also differ (more on this later). The purpose is to suit agricultural applications while reducing resource consumption.
  • the global obstacle avoidance plan is used for return flights or point-to-point flights, mainly uses the global navigation map, and deals with known obstacles.
  • the work route is a scan route generated in advance by the ground station based on parcel information. Because the scan route, and even the post-operation return route, is generated before takeoff, there is no way to cope with temporary task changes, such as the pesticide suddenly running out, the battery being nearly exhausted, or the user suddenly wanting the aircraft to fly back.
  • the global navigation map is used to handle such scenarios: an obstacle-free path can be planned across the whole map at any time. Such planning is long-distance, and spraying need not be considered. In this scenario the required map area is large, the map granularity need not be very fine, and the map area can be determined before takeoff.
  • the local obstacle avoidance plan is used when flying along the work route, or when unknown obstacles are encountered while flying along the globally planned route.
  • the local navigation map is mainly used to encounter unknown obstacles during the operation.
  • the local map needs to fit the original route as closely as possible to minimize missed coverage, so the corresponding planning is generally short-distance, the map can be relatively small, and the map center moves with the aircraft.
  • besides obstacle avoidance, the local navigation map designed in the present application can be applied to other tasks, for example operating on a fruit tree area that was not planned in advance; the present application does not limit the scenarios to which the local navigation map is applied.
  • a navigation map configuration method includes the steps of:
  • the aircraft can be a plant protection drone or the like.
  • the current flight position refers to the geographic location of the aircraft at the current time, such as the latitude and longitude information of the aircraft.
  • the attitude information refers to the flight attitude of the aircraft at the current time, such as the pitch angle, the roll angle, and the yaw angle.
  • the current flight position and attitude information of the aircraft can be obtained by the flight control of the aircraft.
  • the depth map is a two-dimensional image of the captured target that includes the distance of each point from the current flight position; that is, the gray value of each pixel in the depth map represents the distance between the captured object and the aircraft's current position. In practice, the depth map can be detected by the aircraft's binocular system.
  • the depth map used may be a plurality of depth maps or a single depth map, which is not limited in this application.
  • the size of the depth map can be set according to needs, and the present application does not limit this.
  • the detected depth map is a 640×480 image.
  • the obtained current flight position, attitude information, and depth map are used for the construction of the local navigation map.
  • since each pixel in the depth map encodes the distance between the captured object and the aircraft, the three-dimensional position corresponding to each pixel can be calculated from the pixel coordinates and gray values together with the current flight position and attitude information; that is, each two-dimensional pixel in the depth map corresponds to a three-dimensional point.
  • the navigation coordinate system that is, the local horizontal coordinate system, is a coordinate system selected as a navigation reference according to the needs of the navigation system during navigation, and is used for navigation calculation. Considering that these three-dimensional points are used for navigation of the aircraft, the calculated three-dimensional points generally refer to points in the navigation coordinate system.
  • the navigation coordinate system is a northeast coordinate system.
  • the obtained point cloud may include points that are far away, so only the points that affect the flight of the aircraft need be retained, for example only points within a certain range of the aircraft.
  • the local navigation map is used for local navigation of the aircraft and is dynamic: the center of the map is the current position of the aircraft, and the size of the map can be preset manually or by a program.
  • as the aircraft flies, the center of the local navigation map moves with it.
  • each time the center moves, the content of the entire map is translated; after the translation, content that falls outside the map range is deleted, and the newly exposed cells are set to zero.
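The recentering step described above can be sketched as follows. The grid layout, the cell-shift interface, and the zero-fill policy are illustrative assumptions; the patent does not give an implementation.

```python
import numpy as np

def translate_local_map(grid: np.ndarray, shift_rows: int, shift_cols: int) -> np.ndarray:
    """Translate the local navigation map when its center (the aircraft) moves.

    Cells shifted outside the map are discarded; newly exposed cells are zero.
    shift_rows/shift_cols give the center displacement in grid cells.
    (Illustrative sketch, not the patent's implementation.)
    """
    h, w = grid.shape
    out = np.zeros_like(grid)
    # Region of the old grid that stays visible after the shift.
    src_r0, src_r1 = max(0, shift_rows), min(h, h + shift_rows)
    src_c0, src_c1 = max(0, shift_cols), min(w, w + shift_cols)
    dst_r0, dst_c0 = max(0, -shift_rows), max(0, -shift_cols)
    out[dst_r0:dst_r0 + (src_r1 - src_r0),
        dst_c0:dst_c0 + (src_c1 - src_c0)] = grid[src_r0:src_r1, src_c0:src_c1]
    return out
```

A shift of one cell forward discards the trailing row of map content and exposes a fresh zero row ahead of the aircraft, matching the delete/zero behavior described above.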
  • the local navigation map is a two-dimensional map. After the point cloud in the navigation coordinate system is acquired, the points need to be superimposed into the local navigation map with certain weights for dynamically planning the operation route. There are many ways to calculate the weight of each point. For example, in one embodiment, the weight of each point is obtained as the product of a preset weight and a distance factor, where the distance factor is proportional to the distance information, as shown in the following formula: point_weight = point_weight_com × distance_factor, where:
  • point_weight is the weight of a point
  • point_weight_com is the common weight of the point, that is, the preset weight, obtained empirically; the common weight is the same for all points
  • distance_factor is a distance-related factor proportional to the distance: its value increases linearly as the distance information increases and decreases linearly as it decreases.
  • the distance information is distance information represented by a gray value of each pixel in the aforementioned depth map.
  • the distance factor is proportional to the distance information because a distant object produces only a few points in the point cloud, so each of its points should carry more weight than a nearby point.
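The weighting rule just described, a common preset weight scaled by a factor that grows linearly with distance, might be sketched as below. The linear form `1 + k * distance` and both constants are illustrative assumptions; in practice they would be tuned empirically.

```python
def point_weight(distance_m: float,
                 point_weight_com: float = 0.2,
                 k: float = 0.1) -> float:
    """Weight of one projected point: common weight times a distance factor.

    distance_factor grows linearly with distance, so the sparse returns from a
    far-away object still accumulate enough weight to mark an obstacle.
    (point_weight_com and k are assumed example values, not from the patent.)
    """
    distance_factor = 1.0 + k * distance_m
    return point_weight_com * distance_factor
```

With this form, a point 10 m away carries twice the weight of a point at zero distance, compensating for the thinner point cloud of distant objects.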
  • the operation route planning can be performed according to the local navigation map, for example, obstacle avoidance is performed, a detected new area is operated, and the like.
  • obstacle avoidance is performed, a detected new area is operated, and the like.
  • the local navigation map includes a plurality of sub-areas. After the three-dimensional position information of each point is projected, according to its set weight, into the local navigation map centered on the current flight position, the method further comprises: if the total weight of the points in a sub-area is greater than a preset threshold, setting the sub-area as an obstacle area to instruct the aircraft to avoid it.
  • the local navigation map is divided into sub-areas, and the specific division rules can be set according to actual conditions, which is not limited in this application.
  • the local navigation map can be a raster map, and each grid is a sub-region.
  • each sub-area in the local navigation map may contain multiple points, and the total weight calculation is performed on each sub-area according to the following formula to obtain the total weight of each sub-area. value.
  • map_value += point_weight
  • map_value represents the weight of a sub-area.
  • the preset threshold can be set according to experience, for example to 1; the specific form used to indicate an obstacle area can also be set according to actual needs.
  • a grid weight of 0 indicates a free position through which the aircraft may pass, and a grid weight of 1 indicates that there is an obstacle at the position and the aircraft needs to bypass it. The sub-area can then be set accordingly: once the accumulated weight of a grid exceeds the threshold, the grid is marked with weight 1 (obstacle).
  • after the three-dimensional position information of each point is projected, according to its set weight, into the local navigation map centered on the current flight position, the method further comprises: if the total weight of all points in a sub-area is less than or equal to the preset threshold, setting the sub-area as a passable area to allow the aircraft to pass, since the sub-area is then not a real obstacle area. For example, in a raster map, if the sum of the weights of all points in a grid is less than 1, the weight of the grid can be set to 0 to indicate that the aircraft can pass.
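Putting the accumulation and classification rules together, the per-grid update might look like this minimal sketch (the array layout and the threshold value of 1 follow the raster-map example above; everything else is assumed):

```python
import numpy as np

OBSTACLE_THRESHOLD = 1.0  # example threshold from the text

def accumulate(map_value: np.ndarray, cells, weights) -> None:
    """map_value += point_weight for every point projected into the map.

    cells   : iterable of (row, col) grid indices, one per projected point
    weights : the corresponding per-point weights
    """
    for (r, c), w in zip(cells, weights):
        map_value[r, c] += w

def classify_cells(map_value: np.ndarray) -> np.ndarray:
    """1 = obstacle (accumulated weight above threshold), 0 = passable."""
    return (map_value > OBSTACLE_THRESHOLD).astype(np.int8)
```

A grid that collects several point weights summing above 1 is flagged as an obstacle; grids below the threshold remain passable.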
  • the depth map needs to be preprocessed.
  • the method further includes: performing sparse processing on the depth map.
  • distant obstacles are concentrated near the center of the image, while near obstacles appear larger because of their proximity. Therefore, in one embodiment, the depth map is thinned using a varying step size, where the step size controls the sampling so that the retained pixels become gradually denser from the edge toward the center.
  • unequally spaced thinning can be performed starting from the image boundary, so that pixels near the image center are dense and pixels at the image edge are sparse.
  • the pseudo code for sparse processing is as follows:
  • img_height and img_width are the height and width of the image, respectively;
  • i_step and j_step are the step sizes for traversing the image; their initial values are both 1;
  • Height_step and width_step are the sparse factors of the image in the vertical and horizontal directions, respectively;
  • HandleImage() represents the subsequent processing of the depth map.
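The pseudocode itself is not reproduced in this text, but from the variable descriptions above a plausible reconstruction is the following sketch, in which the traversal step grows with distance from the image center, so the center is sampled densely and the edges sparsely. The step schedule and the default sparse factors are assumptions; the patent's exact pseudocode may differ.

```python
def sparse_indices(length: int, step_factor: int):
    """Yield sample indices along one image axis, dense near the center.

    The step grows with distance from the center, so center pixels are kept
    densely and edge pixels sparsely. (Reconstruction under assumptions.)
    """
    center = length // 2
    i = 0
    while i < length:
        yield i
        i += 1 + abs(i - center) // step_factor  # step shrinks toward center

def sparse_depth_map(depth, height_step: int = 40, width_step: int = 40):
    """Return the thinned (row, col) pixel coordinates of a depth map.

    height_step / width_step play the role of the vertical and horizontal
    sparse factors described above; HandleImage() would then be applied to
    the retained pixels.
    """
    rows = list(sparse_indices(len(depth), height_step))
    cols = list(sparse_indices(len(depth[0]), width_step))
    return [(r, c) for r in rows for c in cols]
```

On a 480-row image this keeps every row near the center but skips several rows at a time near the top and bottom edges, matching the dense-center/sparse-edge behavior described in the text.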
  • the original depth map, or the sparse depth map after thinning, can be converted into the navigation coordinate system by coordinate transformation, and the resulting obstacle information can be updated into the local navigation map. Therefore, in one embodiment, obtaining the three-dimensional position information of each point from the current flight position, the attitude information, and the depth map comprises: performing coordinate conversion on the depth map to obtain each point in the navigation coordinate system; and obtaining the three-dimensional position information of each point from the points in the navigation coordinate system, the current flight position, and the attitude information.
  • performing coordinate conversion on the depth map to obtain the points in the navigation coordinate system includes: converting each point in the depth map into a point in the camera coordinate system according to the camera intrinsic matrix; converting each point in the camera coordinate system into a point in the body coordinate system according to the transformation matrix from the camera coordinate system to the body coordinate system; and converting each point in the body coordinate system into a point in the navigation coordinate system according to the transformation matrix from the body coordinate system to the navigation coordinate system.
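The three-stage chain (pixel → camera → body → navigation) can be written compactly in matrix form. The intrinsic matrix `K` and the two rotation/translation pairs below are placeholders for the calibration, attitude, and position values the aircraft would actually carry; this is a sketch of the standard back-projection, not the patent's code.

```python
import numpy as np

def pixel_to_navigation(u, v, depth,
                        K,                        # 3x3 camera intrinsic matrix
                        R_cam2body, t_cam2body,   # camera -> body extrinsics
                        R_body2nav, t_body2nav):  # body -> navigation (attitude, position)
    """Back-project one depth-map pixel into the navigation coordinate system.

    Implements the chain described in the text: pixel -> camera -> body ->
    navigation. All matrices/vectors are placeholders for calibrated values.
    """
    # Pixel coordinates plus depth -> a 3D point in the camera frame.
    p_cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    # Camera frame -> body frame.
    p_body = R_cam2body @ p_cam + t_cam2body
    # Body frame -> navigation frame (uses attitude and current flight position).
    return R_body2nav @ p_body + t_body2nav
```

With identity rotations and zero translations the result reduces to the camera-frame point, which is a quick sanity check on the chain.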
  • the weights of points in a preset area of the local navigation map are attenuated before the total weight of each sub-area is calculated to determine whether it is an obstacle area; this reduces the influence of noise on the obstacle judgment and improves the accuracy of identifying obstacle areas.
  • attenuating the weight of each point in the preset area of the local navigation map comprises multiplying the weight of each point in the preset area by a preset attenuation factor.
  • the attenuation factor can be set empirically. If the preset area exactly covers N sub-areas, the attenuation operation can be applied to each of them according to the following formula:
  • map_value *= damping_factor
  • map_value represents the total weight of a sub-area within the preset area
  • damping_factor represents the attenuation factor
  • the preset area is determined from the center of the local navigation map, the horizontal field of view of the binocular system used by the aircraft to acquire the depth map, and a set attenuation distance.
  • O represents the map center of the local navigation map, that is, the current flight position of the aircraft
  • the angle of the sector represents the horizontal field of view of the binocular system
  • d denotes the attenuation distance, a fixed value set according to experience
  • the sector determined by these three parameters is the attenuation area: the weights of points inside it are attenuated, while the weights of points outside it need not be.
  • FIG. 2 only shows the attenuation area for a binocular system installed at the front of the aircraft. If a binocular system is also installed at the rear or on the side of the aircraft, a symmetric attenuation area is likewise set at the corresponding position and the point weights within it are attenuated; that is, the attenuation area depends on how the binocular system is installed. If the depth map is acquired by another device, the preset area is determined in the same way as for the binocular system.
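A minimal membership test for the forward sector, and the attenuation it gates, might look like the following sketch. The heading/bearing convention and the example damping factor of 0.5 are assumptions for illustration.

```python
import math

def in_attenuation_sector(px, py, cx, cy, heading_rad, fov_rad, d):
    """True if point (px, py) lies in the forward attenuation sector:
    within attenuation distance d of the map center (cx, cy) and within
    half the horizontal field of view of the aircraft heading."""
    dx, dy = px - cx, py - cy
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return True           # the center itself is inside the sector
    if dist > d:
        return False
    bearing = math.atan2(dy, dx)
    # Wrap the angular difference into (-pi, pi] before comparing.
    diff = (bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= fov_rad / 2

def attenuate(map_value, sector_cells, damping_factor=0.5):
    """Apply map_value *= damping_factor to every sub-area in the sector."""
    for (r, c) in sector_cells:
        map_value[r][c] *= damping_factor
```

A rear- or side-mounted binocular system would use the same test with a heading rotated by 180° or 90°, mirroring the symmetric sectors described above.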
  • an aircraft navigation map configuration apparatus includes:
  • the information acquisition module 110 is configured to acquire a current flight position, attitude information of the aircraft, and a depth map detected at the current flight position.
  • the aircraft can be a plant protection drone and so on.
  • the current flight position refers to the geographic location of the aircraft at the current time, such as the latitude and longitude information of the aircraft.
  • the attitude information refers to the flight attitude of the aircraft at the current time, such as the pitch angle, the roll angle, and the yaw angle.
  • the current flight position and attitude information of the aircraft can be obtained by the flight control of the aircraft.
  • the depth map is a two-dimensional image of the captured target that includes the distance of each point from the current flight position; that is, the gray value of each pixel in the depth map represents the distance between the captured object and the aircraft's current position. In practice, the depth map can be detected by the aircraft's binocular system.
  • the depth map used may be a plurality of depth maps or a single depth map, which is not limited in this application.
  • the size of the depth map can be set as needed, and this application does not limit this.
  • the obtained current flight position, attitude information, and depth map are used for the construction of the local navigation map.
  • the three-dimensional position information obtaining module 120 is configured to obtain three-dimensional position information of each point according to the current flight position, the posture information, and the depth map.
  • since each pixel in the depth map encodes the distance between the captured object and the aircraft, the three-dimensional position corresponding to each pixel can be calculated from the pixel coordinates and gray value of each pixel together with the current flight position and attitude information; that is, each two-dimensional pixel in the depth map corresponds to a three-dimensional point.
  • the navigation coordinate system that is, the local horizontal coordinate system, is a coordinate system selected as a navigation reference according to the needs of the navigation system during navigation, and is used for navigation calculation. Considering that these three-dimensional points are used for navigation of the aircraft, the calculated three-dimensional points generally refer to points in the navigation coordinate system.
  • the navigation coordinate system is a northeast coordinate system.
  • the obtained point cloud may include points that are very far away, so only the points affecting the flight of the aircraft need to be retained, for example, only points within a certain distance of the aircraft.
  • the projection module 130 is configured to project the three-dimensional position information of each point into the partial navigation map centered on the current flight position according to the respective set weights.
  • the local navigation map is used for local navigation of the aircraft; it is dynamic, its center is the current position of the aircraft, and its size can be preset manually or by program.
  • each time the center of the local navigation map moves, the content of the entire map is translated once; after the translation, the original content that falls outside the map range is deleted, and the newly exposed content is set to zero.
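The translation step described above can be sketched as follows. This is an illustrative sketch only, assuming the local map is stored as a 2-D weight grid (the patent does not prescribe a data structure), and the helper name `translate_local_map` is an assumption:

```python
import numpy as np

def translate_local_map(grid, shift_rows, shift_cols):
    """Translate the local navigation map when its center moves.

    Content shifted beyond the map range is discarded, and the newly
    exposed area is initialized to zero, mirroring the behavior
    described in the text.
    """
    rows, cols = grid.shape
    shifted = np.zeros_like(grid)
    # Source and destination slices that remain inside the map bounds.
    src_r = slice(max(0, -shift_rows), min(rows, rows - shift_rows))
    dst_r = slice(max(0, shift_rows), min(rows, rows + shift_rows))
    src_c = slice(max(0, -shift_cols), min(cols, cols - shift_cols))
    dst_c = slice(max(0, shift_cols), min(cols, cols + shift_cols))
    shifted[dst_r, dst_c] = grid[src_r, src_c]
    return shifted
```

For example, shifting a map down by one row moves every retained weight one row down and zeroes the newly exposed top row.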
  • the local navigation map is a two-dimensional map. After the point cloud in the navigation coordinate system is acquired, its points need to be superimposed into the local navigation map with a certain weight for dynamically planning the operation route. There are many ways to calculate the weight of each point; for example, in one embodiment, the weight of each point is obtained as the product of a preset weight and a distance factor, where the distance factor is proportional to the distance information.
  • the operation route planning can then be performed according to the local navigation map, for example, performing obstacle avoidance, operating on a newly detected area, and the like.
  • the following is a detailed description of the obstacle avoidance based on the local navigation map.
  • the local navigation map includes a plurality of sub-areas; the apparatus further includes an obstacle area setting module connected to the projection module 130, configured to set a sub-area as an obstacle area when the sum of the weights of all points in the sub-area is greater than a preset threshold, to instruct the aircraft to implement obstacle avoidance.
  • the local navigation map is divided into sub-areas, and the specific division rules can be set according to actual conditions, which is not limited in this application.
  • the local navigation map can be a raster map, and each grid is a sub-region.
  • each sub-area in the local navigation map may contain multiple points, and a total weight calculation is performed for each sub-area to obtain its total weight. If the total weight is greater than the preset threshold, the sub-area is set as an obstacle area.
  • the apparatus further includes a transit area setting module connected to the projection module 130, configured to set a sub-area as a transit area, allowing the aircraft to pass, when the sum of the weights of all points in the sub-area is less than or equal to a preset threshold. If the total weight of the sub-area is less than or equal to the preset threshold, the sub-area is not a real obstacle area, so it can be identified as a transit area.
  • the apparatus may further include a sparse processing module connected between the information acquisition module 110 and the three-dimensional position information obtaining module 120, the sparse processing module being configured to perform sparse processing on the depth map.
  • the distant obstacles are concentrated at the center of the image, while near obstacles appear larger because they are close; therefore, in one embodiment, the sparse processing module performs sparse processing on the depth map using a changing step size, where the changing step size is used to make the retained pixel points in the depth map gradually denser from the edge toward the center.
  • the original depth map, or the sparse depth map after sparse processing, can be converted to the navigation coordinate system by coordinate transformation, and the obstacle information can then be updated to the local navigation map. Therefore, in one embodiment, the three-dimensional position information obtaining module 120 performs coordinate transformation on the depth map to obtain each point in the navigation coordinate system, and obtains the three-dimensional position information of each point according to each point in the navigation coordinate system, the current flight position, and the attitude information.
  • the three-dimensional position information obtaining module 120 converts each point in the depth map into a point in the camera coordinate system according to the camera intrinsic matrix; converts each point in the camera coordinate system into a point in the body coordinate system according to the transformation matrix from the camera coordinate system to the body coordinate system; and converts each point in the body coordinate system into a point in the navigation coordinate system according to the transformation matrix from the body coordinate system to the navigation coordinate system.
  • the apparatus further includes an attenuation module coupled between the projection module 130 and the obstacle area setting module (and/or the transit area setting module), the attenuation module being configured to attenuate the weight of each point in the preset area of the local navigation map and to obtain the sum of the weights of all points in each sub-area after attenuation.
  • the attenuation module first attenuates the weights of the points in the preset area in the local navigation map, and then calculates the total weight of each sub-area. Based on the total weight value, it is judged whether each sub-area is an obstacle area, thereby reducing the influence of noise on the obstacle judgment and improving the accuracy of the obstacle area judgment.
  • the attenuation module multiplies the weight of each point within the predetermined area by a predetermined attenuation factor.
  • the attenuation factor can be set empirically. If the preset area exactly covers N sub-areas, the total weight of each of these sub-areas can be multiplied by the attenuation factor.
  • the preset area is determined based on the center of the local navigation map, the horizontal field of view of the binocular system in the aircraft used to acquire the depth map, and a preset attenuation distance.
  • the attenuation area is related to the way the binocular system is installed. If the depth map is acquired by other devices, the manner of determining the preset area is the same as the concept of determining the preset area based on the binocular system.
  • the present application also provides a terminal, which may be an aircraft or another device, including a memory, a processor, and a computer program stored on the memory and operable on the processor, the processor implementing the steps of any of the methods described above when executing the program.
  • an obstacle avoidance method includes the steps of:
  • the aircraft can be a plant protection drone and so on.
  • the current flight position refers to the geographic location of the aircraft at the current time, such as the latitude and longitude information of the aircraft.
  • the attitude information refers to the flight attitude of the aircraft at the current time, such as the pitch angle, the roll angle, and the yaw angle.
  • the current flight position and attitude information of the aircraft can be obtained by the flight control of the aircraft.
  • the depth map is a two-dimensional image of the captured target that includes the distance of each point from the current flight position; that is, the gray value of each pixel in the depth map represents the distance between the captured object and the aircraft's current position. In practice, the depth map can be detected by the aircraft's binocular system.
  • the depth map used may be a plurality of depth maps or a single depth map, which is not limited in this application.
  • the size of the depth map can be set according to needs, and the present application does not limit this.
  • the detected depth map is a 640*480 size image.
  • the obtained current flight position, attitude information, and depth map are used for the construction of the local navigation map.
  • since each pixel in the depth map encodes the distance between the captured object and the aircraft, the three-dimensional position corresponding to each pixel can be calculated from the pixel coordinates and gray value of each pixel together with the current flight position and attitude information; that is, each two-dimensional pixel in the depth map corresponds to a three-dimensional point.
  • the navigation coordinate system that is, the local horizontal coordinate system, is a coordinate system selected as a navigation reference according to the needs of the navigation system during navigation, and is used for navigation calculation. Considering that these three-dimensional points are used for navigation of the aircraft, the calculated three-dimensional points generally refer to points in the navigation coordinate system.
  • the navigation coordinate system is a northeast coordinate system.
  • the obtained point cloud may include points that are very far away, so only the points affecting the flight of the aircraft need to be retained, for example, only points within a certain distance of the aircraft.
  • the local navigation map is used for local navigation of the aircraft; it is dynamic, its center is the current position of the aircraft, and its size can be preset manually or by program.
  • each time the center of the local navigation map moves, the content of the entire map is translated once; after the translation, the original content that falls outside the map range is deleted, and the newly exposed content is set to zero.
  • the local navigation map is divided into sub-areas, and the specific division rules can be set according to actual conditions, which is not limited in this application.
  • the local navigation map can be a raster map, and each grid is a sub-region.
  • the local navigation map is a two-dimensional map. After the point cloud in the navigation coordinate system is acquired, its points need to be superimposed into the local navigation map with a certain weight to determine which areas are obstacle areas. There are many ways to calculate the weight of each point; for example, in one embodiment, the weight of each point is obtained as the product of a preset weight and a distance factor, where the distance factor is proportional to the distance information, as shown in the following formula:
  • point_weight = point_weight_com * distance_factor
  • point_weight is the weight of a point
  • point_weight_com is the common weight of the point, that is, the preset weight, which can be obtained empirically; the common weight is the same for all points
  • distance_factor is the distance-related factor, which is proportional to the distance: its value increases linearly as the distance information increases and decreases linearly as the distance information decreases.
  • the distance information is distance information represented by a gray value of each pixel in the aforementioned depth map.
  • the distance factor is proportional to the distance information because a distant object yields only a small number of point-cloud points, so the weight of each distant point should be greater than that of a nearby point.
  • each sub-area in the local navigation map may contain multiple points, and the total weight of each sub-area is calculated according to the following formula:
  • map_value = map_value + point_weight
  • map_value represents the total weight of a sub-area.
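The two formulas above can be combined in a short sketch. The numeric values of `POINT_WEIGHT_COM` and `distance_scale` below are illustrative assumptions, not values from the patent, which sets the common weight empirically:

```python
POINT_WEIGHT_COM = 0.1  # preset common weight (illustrative value)

def point_weight(distance, distance_scale=0.05):
    """point_weight = point_weight_com * distance_factor, with the
    distance factor growing linearly with the measured distance."""
    distance_factor = distance * distance_scale  # proportional to distance
    return POINT_WEIGHT_COM * distance_factor

def accumulate(map_value, distances):
    """map_value = map_value + point_weight, applied for every point
    whose projection falls into the same sub-area."""
    for d in distances:
        map_value += point_weight(d)
    return map_value
```

A sub-area receiving two points at distances 10 and 20 would accumulate 0.05 + 0.10 = 0.15 under these illustrative constants.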
  • the preset threshold can be set according to experience, for example to 1, and the specific form used to indicate an obstacle area can also be set according to actual needs.
  • for example, a grid weight of 0 indicates that the position is free and the aircraft may pass, while a grid weight of 1 indicates that there is an obstacle at the position and the aircraft needs to bypass it. The sub-areas can then be set accordingly.
  • after the three-dimensional position information of each point is projected, according to the respective set weights, into the local navigation map centered on the current flight position, the method further includes: if the sum of the weights of all points in a sub-area is less than or equal to a preset threshold, setting the sub-area as a transit area to allow the aircraft to pass. If the total weight of the sub-area is less than or equal to the preset threshold, the sub-area is not a real obstacle area, so it can be identified as a transit area. For example, for a raster map, if the sum of the weights of all the points in a grid is less than 1, the weight of the grid can be set to 0 to indicate that the aircraft can pass.
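The thresholding rule described above can be sketched as a simple classification over accumulated cell weights; the function name and the list representation are assumptions:

```python
THRESHOLD = 1.0  # the example threshold value from the text

def classify_cells(cell_weights):
    """Mark each grid cell as 1 (obstacle, the aircraft must bypass)
    or 0 (free, the aircraft may pass) by comparing the accumulated
    weight of the cell with the preset threshold."""
    return [1 if w > THRESHOLD else 0 for w in cell_weights]
```

A cell whose accumulated weight only just reaches the threshold remains a transit cell, matching the "less than or equal to" rule in the text.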
  • survey data including boundaries and obstacles, and the data of the local navigation map, are acquired; the data of the local navigation map is updated to the global navigation map at a certain period.
  • the survey data can be data manually surveyed by the user, or data selected by the user through a map interface.
  • the survey data contains the boundary points of the obstacle area and the boundary points of the job boundary area.
  • the user-mapped map data can be uploaded to the aircraft through the data link, for example, uploaded to the aircraft's binocular system for mapping operations of the global navigation map.
  • the three-dimensional position information is data that has been set as an obstacle area in the local navigation map.
  • the local navigation map will be updated in a certain period.
  • the position of the obstacle area can be selected and placed in an obstacle queue.
  • the obstacle queue supports deletion and addition: when an area is determined to be an obstacle in the local navigation map, the information of the obstacle area is added to the queue; when the obstacle moves or disappears, the information of the obstacle area is deleted from the queue.
  • the information of the obstacle area in the obstacle queue is updated to the global navigation map in a certain cycle, so that the global navigation map also contains the information detected by the binocular system. It should be noted that the present application is not limited to updating the data of the global navigation map in the form of a queue, and the user may also update the information of the obstacle area in the partial navigation map to the global navigation map by other forms as needed.
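The queue maintenance and periodic flush to the global map might be sketched as follows; cell coordinates as `(row, col)` tuples and the function names are assumptions for illustration:

```python
def update_obstacle_queue(queue, detected, vanished):
    """Maintain the obstacle queue: add cells newly judged as obstacles
    in the local map; remove cells whose obstacle moved or disappeared."""
    for cell in detected:
        if cell not in queue:
            queue.append(cell)
    for cell in vanished:
        if cell in queue:
            queue.remove(cell)
    return queue

def flush_to_global(global_map, queue):
    """Periodically write the queued obstacle cells into the global
    grid map, marking each as prohibited (weight 1)."""
    for r, c in queue:
        global_map[r][c] = 1
    return global_map
```

As the text notes, a queue is only one possible representation; any structure that lets obstacle information be added, removed, and periodically synchronized would serve.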
  • by updating the information of the obstacle area of the local navigation map to the global navigation map, obstacles can be directly avoided when global planning is performed, and the shortest path can be found in the global scope, avoiding the situation in which a path found using only survey data cannot avoid unsurveyed obstacles.
  • S260 Set an obstacle area and a job boundary area in a preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
  • the global navigation map can be a raster map.
  • the global navigation map is much larger than the local navigation map.
  • the local navigation map can be small.
  • the global navigation map needs to include the flight range of the aircraft.
  • the corresponding area can be identified in the global navigation map according to the position information of the job boundary area and the position information of the obstacle area included in the survey data. For example, set the weight of some of the grids in the global navigation map to 1 based on the survey data.
  • the global navigation map is updated, and the corresponding position in the global navigation map is updated to the obstacle area.
  • the aircraft can perform obstacle avoidance on the obstacle area and the work boundary area according to the global navigation map.
  • the global navigation map has to be initialized before the aircraft takes off.
  • the content of the initialization is to determine the size of the global navigation map and the location of the center point.
  • the center and size of the preset global navigation map are obtained based on the position of the aircraft prior to takeoff and the survey data.
  • the information for initializing the global navigation map comes from the mapping data set by the user.
  • the geographic location represented by the map center and the size of the map are determined at the time of initialization. After determining this information, it is possible to allocate storage space for the global navigation map, and determine the storage location of the obstacle information according to the geographical location of the obstacle, which is convenient for storing and accessing obstacle information.
  • the horizontal boundaries of the global navigation map are determined by expanding the maximum and minimum values on the Y-axis of the pre-takeoff position and the survey data, and the vertical boundaries of the global navigation map are determined by expanding the maximum and minimum values on the X-axis of the pre-takeoff position and the survey data.
  • for example, the maximum value on the Y-axis is found from the survey data and the pre-takeoff position information and is expanded by a certain distance to obtain the upper horizontal boundary of the global navigation map; the other boundaries are obtained in the same way from the survey data and the pre-takeoff position.
  • FIG. 5 it is a schematic diagram of a global navigation map boundary acquisition of a specific embodiment.
  • d is the expansion distance.
  • the global navigation map uses polygons to represent the work area boundary B and the obstacle area O, and the map uploaded to the binocular system contains the position information of the vertices of these polygons.
  • from the vertex positions of the work-area boundary B and of the obstacle area O, it is calculated that the maximum value on the X-axis in the navigation coordinate system is the rightmost X value of the obstacle area, the minimum value on the X-axis is the X value of the pre-takeoff position, the maximum value on the Y-axis is the uppermost Y value of the work-area boundary B, and the minimum value on the Y-axis is the Y value of the pre-takeoff position. Each of these four values is then expanded by the expansion distance d to obtain the boundary of the global navigation map shown in FIG. 5. At this point, the size, boundary, and center-point position of the global navigation map are obtained, completing the global map initialization.
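The initialization step can be sketched as a min/max pass over the pre-takeoff position and all survey vertices, each boundary expanded by d. The function name and the tuple layout are illustrative assumptions:

```python
def init_global_map_bounds(takeoff_xy, survey_points, d):
    """Boundary of the global navigation map: the min/max X and Y over
    the pre-takeoff position and all survey vertices, each expanded by
    the expansion distance d; also returns the map center and size."""
    xs = [takeoff_xy[0]] + [p[0] for p in survey_points]
    ys = [takeoff_xy[1]] + [p[1] for p in survey_points]
    x_min, x_max = min(xs) - d, max(xs) + d
    y_min, y_max = min(ys) - d, max(ys) + d
    center = ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)
    size = (x_max - x_min, y_max - y_min)
    return (x_min, x_max, y_min, y_max), center, size
```

With the boundary, center, and size known, storage for the global map can be allocated and each obstacle's storage location derived from its geographic position, as the text describes.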
  • setting the obstacle area and the job boundary area in the preset global navigation map includes: obtaining a first obstacle area and a first job boundary area according to the acquired survey data and the three-dimensional position information; expanding the first obstacle area and the first job boundary area to obtain a second obstacle area and a second job boundary area; and setting the second obstacle area and the second job boundary area as areas used to instruct the aircraft to implement obstacle avoidance.
  • the expansion distance of the first obstacle area and the first job boundary area may be set according to actual needs. The expanded areas are also dangerous areas in which the passage of the aircraft is prohibited; therefore they too must be set as obstacle-avoidance areas, so that the aircraft keeps a safe distance from the obstacle and from the boundary of the work area.
  • the present application does not limit the manner of setting the obstacle area and the job boundary area in the global navigation map; the user may also directly set prohibited areas in the global navigation map according to the survey data and the obstacle information in the local navigation map.
  • the distances in which the respective directions are expanded may be set to be the same, or different expansion distances may be set for the respective directions.
  • FIG. 6 is a schematic diagram of setting an obstacle area and a job boundary area in a global navigation map according to a specific embodiment, wherein the global navigation map is a grid map; when the weight of a grid in the grid map is 1, the grid is an area in which passage is prohibited, and when the weight of a grid is 0, the grid is an area in which passage is allowed. As shown in FIG. 6,
  • the weights of the original job boundary area and the original obstacle area in the global navigation map are all set to 1, indicating that these areas are completely occupied by obstacles and the passage of the aircraft is prohibited; a depth-first algorithm or another algorithm is used to expand the original boundary area and the original obstacle area, and the weight of the expansion area is also set to 1, indicating that the expansion area is likewise a dangerous area that the aircraft is not allowed to approach, which keeps the aircraft a safe distance from the obstacle.
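The expansion of prohibited areas can be sketched as a grid dilation. The text names a depth-first algorithm; the sketch below uses a breadth-first traversal as a stand-in (the choice of traversal does not affect the resulting expanded area), and the function name and `radius` parameter are assumptions:

```python
from collections import deque

def expand_prohibited(grid, radius):
    """Expand (dilate) cells already marked 1 by `radius` cells so the
    aircraft keeps a safe distance from obstacles and boundaries."""
    rows, cols = len(grid), len(grid[0])
    q = deque((r, c, 0)
              for r in range(rows) for c in range(cols) if grid[r][c] == 1)
    while q:
        r, c, dist = q.popleft()
        if dist == radius:
            continue  # reached the expansion distance
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                grid[nr][nc] = 1  # mark the expansion cell prohibited
                q.append((nr, nc, dist + 1))
    return grid
```

With 4-connected neighbors the expansion grows in a diamond shape; 8-connected neighbors would give a square expansion, and either can be chosen per the desired safety margin.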
  • the depth map needs to be preprocessed.
  • the method further includes: performing sparse processing on the depth map.
  • the distant obstacles are concentrated at the center of the image, while near obstacles appear larger because they are close; therefore, in one embodiment, the sparse processing performed on the depth map uses a changing step size, where the changing step size is used to make the retained pixel points in the depth map gradually denser from the edge toward the center.
  • sparse processing with unequal spacing can be performed starting from the image boundary, so that pixel points near the center of the image are dense and pixel points at the edge of the image are sparse.
  • the pseudo code for sparse processing is as follows:
  • img_height and img_width are the height and width of the image, respectively;
  • i_step and j_step are the step sizes for traversing the image, with initial values of 1;
  • Height_step and width_step are the sparse factors of the image in the vertical and horizontal directions, respectively;
  • HandleImage() represents the subsequent processing of the depth map.
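The variable descriptions above suggest a traversal whose step grows with distance from the image center. The patent's actual pseudocode is not reproduced here; the sketch below is one plausible reading, and the function name, the normalization, and the default sparse factors are all assumptions:

```python
def sparse_sample(img_height, img_width, height_step=2, width_step=2):
    """Return sampled (i, j) pixel indices whose spacing grows with
    distance from the image center, so samples are dense at the center
    (where distant obstacles concentrate) and sparse at the edges."""
    rows, i = [], 0
    while i < img_height:
        rows.append(i)
        # normalized distance of row i from the image center, in [0, 1]
        norm = abs(i - img_height / 2) / (img_height / 2)
        i += 1 + int(norm * height_step)  # step of 1 at center, larger at edge
    cols, j = [], 0
    while j < img_width:
        cols.append(j)
        norm = abs(j - img_width / 2) / (img_width / 2)
        j += 1 + int(norm * width_step)
    return [(i, j) for i in rows for j in cols]
```

Each retained `(i, j)` pixel would then be passed on to the subsequent processing that the pseudocode calls HandleImage().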
  • the original depth map, or the sparse depth map after sparse processing, can be converted to the navigation coordinate system by coordinate transformation, and the obstacle information can then be updated to the local navigation map. Therefore, in one embodiment, obtaining the three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map includes: performing coordinate conversion on the depth map to obtain each point in the navigation coordinate system; and obtaining the three-dimensional position information of each point according to each point in the navigation coordinate system, the current flight position, and the attitude information.
  • performing the coordinate transformation on the depth map to obtain each point in the navigation coordinate system includes: converting each point in the depth map into a point in the camera coordinate system according to the camera intrinsic matrix; converting each point in the camera coordinate system into a point in the body coordinate system according to the transformation matrix from the camera coordinate system to the body coordinate system; and converting each point in the body coordinate system into a point in the navigation coordinate system according to the transformation matrix from the body coordinate system to the navigation coordinate system.
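The transformation chain can be sketched as follows, assuming a pinhole camera model with intrinsic matrix K and 4x4 homogeneous transforms supplied by calibration and by the flight controller's position/attitude estimate; all names are illustrative:

```python
import numpy as np

def pixel_to_navigation(u, v, depth, K, T_cam_to_body, T_body_to_nav):
    """Chain of transforms from a depth-map pixel to the navigation
    frame: pixel -> camera (via intrinsics K) -> body -> navigation."""
    # Back-project the pixel into the camera frame (pinhole model).
    p_cam = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])
    p_h = np.append(p_cam, 1.0)       # homogeneous coordinates
    p_body = T_cam_to_body @ p_h      # camera frame -> body frame
    p_nav = T_body_to_nav @ p_body    # body frame -> navigation frame
    return p_nav[:3]
```

In practice `T_body_to_nav` changes every frame with the aircraft's attitude and position, while `T_cam_to_body` and K are fixed by calibration.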
  • the point cloud calculated from the depth map also contains noise points, which will accumulate in the local navigation map as the above steps are repeated, leading to erroneous detection of obstacles, referred to as false detection.
  • after the three-dimensional position information of each point is projected into the local navigation map according to the weight set for each point, the method further includes: attenuating the weight of each point in the preset area of the local navigation map, and obtaining the sum of the weights of all points in each sub-area after attenuation.
  • the weight of the points in the preset area in the local navigation map is attenuated, and then the total weight of each sub-area is calculated to determine whether each sub-area is an obstacle area, thereby reducing the influence of noise on the obstacle judgment. Improve the accuracy of the judgment of obstacle areas.
  • the attenuating of the weight of each point in the preset area of the local navigation map comprises: multiplying the weight of each point in the preset area by a preset attenuation factor.
  • the attenuation factor can be set empirically. If the preset area exactly covers N sub-areas, the attenuation operation can be performed according to the following formula:
  • map_value = map_value * damping_factor
  • map_value represents the total weight of a sub-area within the preset area
  • damping_factor represents the attenuation factor
  • the preset area is determined based on the center of the local navigation map, the horizontal field of view of the binocular system in the aircraft used to acquire the depth map, and a preset attenuation distance.
  • O represents the map center of the local navigation map, that is, the current flight position of the aircraft
  • represents the size of the field of view of the binocular system.
  • d denotes the attenuation distance, which is a fixed value set according to experience
  • the sector area determined by the above three parameters is the attenuation area; the weights of points within the attenuation area are attenuated, while the weights of points outside the attenuation area need not be attenuated.
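Assuming the sector is defined by the map center O, the horizontal field of view centered on the aircraft's heading, and the attenuation distance d, sector membership and the attenuation step might be sketched as follows (illustrative only; the angle conventions and function names are assumptions):

```python
import math

def in_attenuation_sector(px, py, cx, cy, heading, fov, d):
    """True if point (px, py) lies in the sector attenuation area
    defined by map center (cx, cy), the binocular horizontal field of
    view `fov` (radians) about `heading`, and attenuation distance d."""
    dx, dy = px - cx, py - cy
    if math.hypot(dx, dy) > d:
        return False  # beyond the attenuation distance
    bearing = math.atan2(dy, dx)
    # Wrap the angular difference into (-pi, pi].
    diff = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= fov / 2

def attenuate(weight, inside, damping_factor=0.5):
    """map_value *= damping_factor for points inside the preset area;
    weights outside the area are left unchanged."""
    return weight * damping_factor if inside else weight
```

A binocular system mounted at the rear or side would simply use a different `heading` for its own sector, matching the text's note that the attenuation area follows the installation.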
  • FIG. 2 shows only the attenuation area of a binocular system installed at the front of the aircraft. If a binocular system is also installed at the rear or on the side of the aircraft, a symmetric attenuation area is likewise set at that position and the weights of the points in it are attenuated; that is, the attenuation area is related to how the binocular system is installed. If the depth map is acquired by another device, the preset area is determined following the same idea as determining it for the binocular system.
  • the present application further provides an obstacle avoidance device, and a specific implementation manner of the device of the present application is described in detail below with reference to the accompanying drawings.
  • an obstacle avoidance device includes:
  • the first information acquiring module 210 is configured to acquire a current flight position, posture information of the aircraft, and a depth map detected at the current flight position.
  • the aircraft can be a plant protection drone and so on.
  • the current flight position refers to the geographic location of the aircraft at the current time, such as the latitude and longitude information of the aircraft.
  • the attitude information refers to the flight attitude of the aircraft at the current time, such as the pitch angle, the roll angle, and the yaw angle.
  • the current flight position and attitude information of the aircraft can be obtained by the flight control of the aircraft.
  • the depth map is a two-dimensional image of the captured target that includes the distance of each point from the current flight position; that is, the gray value of each pixel in the depth map represents the distance between the captured object and the aircraft's current position. In practice, the depth map can be detected by the aircraft's binocular system.
  • the depth map used may be a plurality of depth maps or a single depth map, which is not limited in this application.
  • the size of the depth map can be set as needed, and this application does not limit this.
  • the obtained current flight position, attitude information, and depth map are used for the construction of the local navigation map.
  • the three-dimensional position information obtaining module 220 is configured to obtain three-dimensional position information of each point according to the current flight position, the posture information, and the depth map.
  • since each pixel in the depth map encodes the distance between the captured object and the aircraft, the three-dimensional position corresponding to each pixel can be calculated from the pixel coordinates and gray value of each pixel together with the current flight position and attitude information; that is, each two-dimensional pixel in the depth map corresponds to a three-dimensional point.
  • the navigation coordinate system that is, the local horizontal coordinate system, is a coordinate system selected as a navigation reference according to the needs of the navigation system during navigation, and is used for navigation calculation. Considering that these three-dimensional points are used for navigation of the aircraft, the calculated three-dimensional points generally refer to points in the navigation coordinate system.
  • the navigation coordinate system is a northeast coordinate system.
  • the obtained point cloud may include points that are very far away, so only the points affecting the flight of the aircraft need to be retained, for example, only points within a certain distance of the aircraft.
  • the projection module 230 is configured to project the three-dimensional position information of each point into the partial navigation map centered on the current flight position according to the respective set weights, wherein the partial navigation map includes a plurality of sub-areas.
  • the local navigation map is used for local navigation of the aircraft and is dynamic: the center of the map is the current position of the aircraft, and the size of the map can be preset manually or by a program.
  • as the aircraft moves, the center of the local navigation map moves with it, and the content of the entire map is translated accordingly. After the translation, original content that falls outside the map range is deleted, and newly added cells are set to zero.
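The translation step described above can be sketched as follows. This is an illustrative NumPy grid model, not code from the application; the function name and the shift convention (positive shift moves content toward row/column zero) are assumptions:

```python
import numpy as np

def recenter_local_map(grid, shift_rows, shift_cols):
    """Translate the local navigation map when the aircraft (map center) moves.

    Cells shifted out of range are discarded; newly exposed cells are zeroed,
    matching the behavior described in the text.
    """
    result = np.zeros_like(grid)
    rows, cols = grid.shape
    # Source region that remains visible after the shift, and where it lands.
    src_r = slice(max(0, shift_rows), rows + min(0, shift_rows))
    src_c = slice(max(0, shift_cols), cols + min(0, shift_cols))
    dst_r = slice(max(0, -shift_rows), rows + min(0, -shift_rows))
    dst_c = slice(max(0, -shift_cols), cols + min(0, -shift_cols))
    result[dst_r, dst_c] = grid[src_r, src_c]
    return result
```

A shift of one cell keeps the overlapping region, drops the row that leaves the map, and leaves the newly exposed row at zero.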
  • the local navigation map is divided into sub-areas, and the specific division rules can be set according to actual conditions, which is not limited in this application.
  • the local navigation map can be a raster map, and each grid is a sub-region.
  • the local navigation map is a two-dimensional map. After the point cloud in the navigation coordinate system is acquired, the points need to be superimposed into the local navigation map with certain weights to determine which areas are obstacle areas. There are many ways to calculate the weight of each point. For example, in one embodiment, the weight of each point is obtained as the product of a preset weight and a distance factor, where the distance factor is proportional to the distance information.
  • the first area setting module 240 is configured to set a sub-area as an obstacle area when the sum of the weights of all points in the sub-area is greater than a preset threshold, to instruct the aircraft to implement obstacle avoidance on the obstacle area.
  • each sub-area in the local navigation map may contain multiple points, and the total weight calculation is performed for each sub-area to obtain the total weight of each sub-area. If the total weight is greater than the preset threshold, the sub-area is set as the obstacle area.
  • the obstacle avoidance device further includes a transit area setting module connected to the projection module 230, configured to set a sub-area as a passable area, to allow the aircraft to pass, when the sum of the weights of all points in the sub-area is less than or equal to the preset threshold. If the total weight of the sub-area is less than or equal to the preset threshold, the sub-area is not a real obstacle area, so it can be identified as a passable area.
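As a concrete illustration of the weighting, projection, and thresholding described above (all function names, the grid parameters, and the proportionality constant are assumptions, not taken from the application):

```python
import numpy as np

def point_weight(base_weight, distance, k=1.0):
    # Per the description: weight = preset weight x distance factor,
    # where the distance factor is proportional to the distance information.
    return base_weight * (k * distance)

def project_points(points_xy, weights, center, map_size, cell):
    """Accumulate point weights into a square grid centered on the aircraft."""
    n = int(map_size / cell)
    grid = np.zeros((n, n))
    for (x, y), w in zip(points_xy, weights):
        i = int((x - center[0] + map_size / 2) / cell)
        j = int((y - center[1] + map_size / 2) / cell)
        if 0 <= i < n and 0 <= j < n:   # discard points outside the local map
            grid[i, j] += w
    return grid

def classify(grid, threshold):
    # Sub-areas whose summed weight exceeds the threshold become obstacle
    # areas (1); the rest are treated as passable areas (0).
    return (grid > threshold).astype(int)
```

Several nearby returns from the same object accumulate in one cell and can push it over the threshold, while an isolated noise point stays below it.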
  • the second information acquiring module 250 is configured to acquire survey data set by the user for indicating the obstacle area and the work boundary area, and three-dimensional position information for indicating the obstacle area in the local navigation map.
  • the mapping data includes boundaries and obstacles; the data of the local navigation map is updated to the global navigation map with a certain period.
  • the survey data can be data manually measured by the user, or data selected by the user through a map interface.
  • the survey data contains the boundary points of the obstacle area and the boundary points of the job boundary area.
  • the user-mapped map data can be uploaded to the aircraft through the data link, for example, uploaded to the aircraft's binocular system for mapping operations of the global navigation map.
  • the three-dimensional position information is data that has already been set as an obstacle area in the local navigation map.
  • the local navigation map will be updated in a certain period.
  • the position determined as the obstacle area can be selected and placed in an obstacle queue.
  • entries can be deleted from or added to the obstacle queue: when an obstacle is determined in the local navigation map, the information of the obstacle area is added to the queue; when the obstacle moves or disappears, the information of the obstacle area is deleted from the queue.
  • the information of the obstacle areas in the obstacle queue is updated to the global navigation map in a certain cycle, so that the global navigation map also contains the information detected by the binocular system. It should be noted that the present application is not limited to updating the data of the global navigation map in the form of a queue; the user may also update the information of the obstacle areas in the local navigation map to the global navigation map in other forms as needed.
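The queue-based update described above might look like the following sketch; the class shape, cell representation, and flush cycle are illustrative assumptions rather than the patented implementation:

```python
class ObstacleQueue:
    """Obstacle areas detected in the local navigation map, periodically
    flushed into the global navigation map."""

    def __init__(self):
        self.cells = set()

    def add(self, cell):
        # Called when a sub-area is determined to be an obstacle.
        self.cells.add(cell)

    def remove(self, cell):
        # Called when the obstacle moves or disappears.
        self.cells.discard(cell)

    def flush_to(self, global_grid):
        # Run on a fixed cycle: mark every queued cell in the global map.
        for i, j in self.cells:
            global_grid[i][j] = 1
```

Removed cells never reach the global map on the next flush, which is how a moving or vanished obstacle drops out of global planning.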
  • by updating the information of the obstacle areas of the local navigation map to the global navigation map, obstacles can be avoided directly during global planning, and the shortest path can be found in the global scope, which avoids the situation in which a path found using only the survey data cannot avoid unmapped obstacles.
  • the second area setting module 260 is configured to set an obstacle area and a work boundary area in the preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
  • the global navigation map can be a raster map.
  • the global navigation map is much larger than the local navigation map.
  • the local navigation map can be small.
  • the global navigation map needs to include the flight range of the aircraft.
  • the corresponding areas can be identified in the global navigation map according to the position information of the work boundary area and the position information of the obstacle area included in the survey data, for example, by setting the weight of some of the grids in the global navigation map to 1 based on the survey data.
  • the global navigation map is updated, and the corresponding position in the global navigation map is updated to the obstacle area.
  • the aircraft can perform obstacle avoidance on the obstacle area and the work boundary area according to the global navigation map.
  • the global navigation map has to be initialized before the aircraft takes off.
  • the content of the initialization is to determine the size of the global navigation map and the location of the center point.
  • the center and size of the preset global navigation map are obtained based on the position of the aircraft prior to takeoff and the survey data.
  • the information for initializing the global navigation map comes from the mapping data set by the user.
  • the geographic location represented by the map center and the size of the map are determined at the time of initialization. After this information is determined, storage space can be allocated for the global navigation map, the storage location of obstacle information can be determined according to the geographical location of the obstacle, and the obstacle information can be conveniently stored and accessed.
  • the horizontal boundaries of the global navigation map are determined by expanding the maximum and minimum values, on the Y-axis, of the pre-takeoff position and the survey data, and the vertical boundaries of the global navigation map are determined by expanding the maximum and minimum values of the position and the survey data on the X-axis.
  • for example, the maximum value on the Y-axis is found from the survey data and the position information before the take-off of the aircraft, and expanded by a certain distance to obtain the upper horizontal boundary of the global navigation map; the remaining boundaries are obtained in the same way from the corresponding minimum and maximum values of the survey data and the pre-takeoff position.
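A minimal sketch of that boundary computation; the margin value, the axis convention, and the function name are assumptions for illustration:

```python
def global_map_bounds(survey_xy, takeoff_xy, margin):
    """Axis-aligned bounds of the global navigation map: the min/max of the
    survey points and the pre-takeoff position on each axis, expanded
    outward by a fixed margin."""
    xs = [p[0] for p in survey_xy] + [takeoff_xy[0]]
    ys = [p[1] for p in survey_xy] + [takeoff_xy[1]]
    return (min(xs) - margin, max(xs) + margin,
            min(ys) - margin, max(ys) + margin)
```

Including the takeoff position in the min/max guarantees the aircraft's starting point always lies inside the global map, even if it is outside the surveyed plot.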
  • the second area setting module 260 obtains, based on the acquired survey data and the three-dimensional position information, a first obstacle area and a first work boundary area; expands the first obstacle area and the first work boundary area to obtain a second obstacle area and a second work boundary area; and sets the second obstacle area and the second work boundary area as areas indicating that the aircraft implements obstacle avoidance.
  • the distance by which the first obstacle area and the first work boundary area are expanded may be set according to actual needs. These expanded areas are also dangerous areas in which passage of the aircraft is prohibited; therefore, they also need to be set as obstacle-avoidance areas so that the aircraft keeps a safe distance from obstacles and from the boundary of the work area.
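On a raster map this expansion amounts to a simple dilation of the marked cells. The following sketch (expansion radius in cells, and the brute-force loop, are illustrative choices) shows the idea:

```python
import numpy as np

def expand_area(mask, radius):
    """Grow obstacle/boundary cells outward by `radius` cells so the aircraft
    keeps a safety margin around them (simple grid dilation)."""
    n_r, n_c = mask.shape
    out = mask.copy()
    for i in range(n_r):
        for j in range(n_c):
            if mask[i, j]:
                # Mark the square neighborhood around each occupied cell.
                out[max(0, i - radius):i + radius + 1,
                    max(0, j - radius):j + radius + 1] = 1
    return out
```

A single marked cell with radius 1 becomes a 3x3 block; using different radii per direction would correspond to the per-direction expansion distances mentioned later in the text.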
  • the present application does not limit the manner of setting the obstacle area and the work boundary area in the global navigation map; the user may directly set, in the global navigation map, the areas through which the aircraft is prohibited from passing according to the survey data and the obstacle information in the local navigation map, without expansion or the like, or may expand only the obstacle area or only the work boundary area.
  • the distances in which the respective directions are expanded may be set to be the same, or different expansion distances may be set for the respective directions.
  • the obstacle avoidance device may further include a sparse processing module connected between the first information acquisition module 210 and the three-dimensional position information obtaining module 220, the sparse processing module being configured to perform sparse processing on the depth map.
  • distant obstacles are concentrated at the center of the image, while near obstacles appear larger because of their distance; therefore, in one embodiment, the sparse processing module uses a variable step size to perform sparse processing on the depth map, where the variable step size controls the density of retained pixel points in the depth map to gradually increase from the edge to the center.
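One way to realize such a variable step, under the assumption of a linear step schedule (the application does not specify one; the step values here are purely illustrative), is to grow the sampling step with distance from the image center:

```python
def sparse_indices(length, min_step=1, max_step=4):
    """Return sample indices along one image axis, stepping coarsely at the
    edges and finely near the center, so more pixels are kept where distant
    obstacles appear."""
    center = max(1, length // 2)
    idx, i = [], 0
    while i < length:
        idx.append(i)
        frac = abs(i - center) / center   # 0 at the center, ~1 at the edges
        step = max(min_step, round(min_step + frac * (max_step - min_step)))
        i += step
    return idx
```

Applying this to both image axes yields a sparse depth map whose retained-pixel density increases from the edge toward the center.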
  • the original depth map, or the sparse depth map after sparse processing, can be converted into the navigation coordinate system by coordinate transformation, and the obstacle information can then be updated into the local navigation map. Therefore, in one embodiment, the three-dimensional position information obtaining module 220 performs coordinate transformation on the depth map to obtain the points in the navigation coordinate system, and obtains the three-dimensional position information of each point according to each point in the navigation coordinate system, the current flight position, and the attitude information.
  • in one embodiment, the three-dimensional position information obtaining module 220 converts each point in the depth map into a point in the camera coordinate system according to the camera intrinsic matrix; converts each point in the camera coordinate system into a point in the body coordinate system according to the transformation matrix from the camera coordinate system to the body coordinate system; and converts each point in the body coordinate system into a point in the navigation coordinate system according to the transformation matrix from the body coordinate system to the navigation coordinate system.
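The chain of transformations can be written compactly with homogeneous coordinates. In the sketch below, `K` stands for the camera intrinsic matrix and the 4x4 transforms are placeholders for the mounting and attitude matrices, none of which are given numerically in the application:

```python
import numpy as np

def pixel_to_navigation(u, v, depth, K, T_cam_to_body, T_body_to_nav):
    """Chain described above: pixel -> camera -> body -> navigation frame."""
    # Back-project the pixel using the intrinsics and the measured depth.
    p_cam = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])
    p_h = np.append(p_cam, 1.0)            # homogeneous coordinates
    p_body = T_cam_to_body @ p_h           # camera frame -> body frame
    p_nav = T_body_to_nav @ p_body         # body frame -> navigation frame
    return p_nav[:3]
```

With identity intrinsics and identity transforms, a pixel (u, v) at depth d simply maps to (d·u, d·v, d), which is a convenient sanity check.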
  • the obstacle avoidance device further includes an attenuation module coupled between the projection module 230 and the first area setting module 240 (and/or the transit area setting module), the attenuation module being configured to attenuate the weight of each point in a preset area in the local navigation map, and to obtain the sum of the weights of all points in each sub-area after the attenuation.
  • the attenuation module first attenuates the weights of the points in the preset area in the local navigation map, and then calculates the total weight of each sub-area. Based on the total weight value, it is judged whether each sub-area is an obstacle area, thereby reducing the influence of noise on the obstacle judgment and improving the accuracy of the obstacle area judgment.
  • the attenuation module multiplies the weight of each point within the predetermined area by a predetermined attenuation factor.
  • the attenuation factor can be set empirically. If the preset area includes exactly N sub-areas, the total weight of each sub-area can be multiplied by the attenuation factor.
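A sketch of the attenuation step described above; the default factor 0.5 and the boolean-mask representation of the preset region are illustrative assumptions:

```python
import numpy as np

def attenuate(grid, region_mask, factor=0.5):
    """Multiply the accumulated weights inside the preset region by an
    empirically chosen attenuation factor before thresholding, reducing
    the influence of noise on the obstacle decision."""
    out = grid.copy()
    out[region_mask] *= factor
    return out
```

The attenuated grid is then fed to the same threshold comparison as before, so spurious low-weight returns inside the preset region are less likely to be classified as obstacles.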
  • the preset area is determined based on the center of the local navigation map, the horizontal field of view of the binocular system in the aircraft used to acquire the depth map, and a set attenuation distance.
  • the attenuation area is related to the way the binocular system is installed. If the depth map is acquired by another device, the preset area is determined following the same concept as for the binocular system.
  • the present application also provides a terminal, which may be an aircraft or another device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the navigation map configuration method and the automatic obstacle avoidance method described above are implemented.
  • the present application also provides a storage medium, the storage medium including a stored program, wherein the device in which the storage medium is located is controlled to execute the navigation map configuration method according to the first aspect described above when the program is running.
  • the application further provides a storage medium, the storage medium comprising a stored program, wherein the device in which the storage medium is located is controlled to perform the obstacle avoidance method according to the second aspect described above when the program is running.
  • FIG. 8 is a schematic structural view of an unmanned aerial vehicle 800 according to an embodiment of the present application.
  • unmanned aerial vehicle 800 includes a controller 810 that is coupled to one or more sensors or sensing systems 801a-c in a wired or wireless manner.
  • the sensor can be connected to the controller via a controller area network (CAN).
  • the controller 810 can also be coupled to one or more actuators 820 to control the state of the UAV.
  • the sensor may include any of the sensors described herein, such as an inertial sensor, a GPS receiver, a compass, an RTK positioning sensor, a magnetometer, an altimeter, a distance sensor (e.g., an infrared sensor or a lidar sensor), a visual or image sensor (e.g., a camera or video camera), a photoelectric sensor, a motion sensor, a touch sensor, a pressure sensor, a temperature sensor, a magnetic sensor, etc.
  • the inertial sensor, also called an IMU, can be configured to determine the attitude information of the aircraft, and includes a three-axis gyroscope, a three-axis acceleration sensor, a three-axis geomagnetic sensor, and a barometer.
  • the three axes of the gyroscope, acceleration sensor, and geomagnetic sensor refer to the left-right, front-rear, and vertical directions of the UAV. The gyroscope measures the inclination about the X, Y, and Z axes; the three-axis acceleration sensor measures the acceleration of the UAV along the X, Y, and Z axes; the geomagnetic sensor senses the geomagnetic field, allowing the UAV to know its own heading and flight direction in order to find the mission position; and the barometer obtains the current altitude by measuring the air pressure at different positions and calculating the pressure difference.
  • in this way, the IMU inertial measurement unit can sense changes in the attitude of the aircraft.
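The barometric altitude mentioned above is conventionally obtained from the international standard atmosphere approximation. The sketch below uses the textbook constants, which are not taken from the application:

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """Altitude in metres from static pressure (hPa) via the standard
    atmosphere formula, relative to sea-level pressure p0."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

Differencing two such altitudes gives the relative height change the text attributes to the barometer.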
  • some sensors may be coupled to a field programmable gate array (FPGA, not shown).
  • the field programmable gate array can be coupled to the controller (e.g., via a general purpose memory controller (GPMC)).
  • some sensors and/or the field programmable gate arrays can be coupled to the transmission module.
  • the transmission module can be used to communicate data acquired by the sensor (eg, image data) to any suitable external device or system, such as a terminal or remote device as described herein.
  • the controller can include one or more programmable processors (eg, a central processing unit).
  • the controller can be coupled to a storage medium such as a non-transitory computer readable medium 830.
  • the storage medium may include one or more storage units (eg, removable media or external storage such as an SD card or a random access memory).
  • data from the sensor (e.g., a camera) can be transferred to the storage unit of the storage medium via a direct memory access (DMA) connection.
  • the storage unit of the storage medium may store code and/or program instructions.
  • the controller executes the code and/or program instructions to perform the method embodiments described herein.
  • the controller can execute instructions such that one or more processors of the controller analyze data generated by one or more sensors or sensing systems to determine the orientation of the UAV described in this specification and / or motion information, detected external contact information and / or detected external signal information.
  • the controller can execute an instruction such that one or more processors of the controller determine whether to control the UAV to take off or land autonomously.
  • the storage unit of the storage medium 830 stores sensed data from the one or more sensing systems that will be processed by the controller.
  • the storage unit may store the UAV azimuth and/or motion information, detected external contact information, and/or detected external signal information.
  • the storage unit may store predetermined or pre-stored data to control the UAV (eg, a threshold of predetermined sensing data, parameters to control the actuator, the The intended flight path, speed, acceleration or direction of the unmanned aerial vehicle).
  • the actuators can include motors, electronic governors, mechanical transmissions, hydraulic transmissions, pneumatic transmissions, and the like.
  • the motor may include a magnetic motor, an electrostatic motor, or a piezoelectric motor.
  • the actuator comprises a brushed or brushless DC motor.
  • the controller can be coupled to the communication module 840 for transmitting and/or receiving data from one or more external devices (eg, terminals, display devices, ground controls, or other remote controls).
  • the communication module can use any suitable communication means, such as wired communication or wireless communication.
  • the communication module can employ one or more local area networks, wide area networks, infrared, radio waves, WiFi, point-to-point (P2P) networks, telecommunications networks, cloud communications, and the like.
  • a relay station such as a tower, satellite or mobile station may be employed.
  • the wireless communication may or may not be affected by distance. In some embodiments, communication can be within line of sight or beyond line of sight.
  • the communication module can transmit and/or receive one or more sensed data, orientation and/or motion information from the sensing system, external contact information obtained by processing the sensed data, and/or an external signal Information, predetermined control data, user commands from the terminal or remote control, and the like.
  • the components of the UAV can be configured in any suitable manner.
  • one or more components of the UAV may be disposed on the UAV, carrier, load, terminal, sensing system, or any other remote device or system in communication with one or more of the devices described above.
  • although FIG. 8 depicts a single controller and a single storage medium, those skilled in the art will appreciate that this description is not a limitation on the UAV, which may include multiple controllers and/or storage media.
  • one or more of the plurality of controllers and/or storage media may be disposed at different locations, such as in the UAV, carrier, load, terminal, sensing system, or any Other remote devices or systems in which one or more of the above devices are in communication, or a suitable combination thereof, such that the UAV facilitates performing processing and/or storage functions at one or more of the locations described above.
  • the UAV includes, but is not limited to, a single rotor aircraft, a multi-rotor aircraft, and a rotorcraft.
  • rotorcraft typically generate lift by rotating blades about a shaft or axis.
  • rotorcraft include, for example, helicopters, cyclogyros, autogyros, gyrodynes, and the like.
  • the rotorcraft may have a plurality of rotors mounted at a plurality of locations of the aircraft.
  • the UAV may include a quadrotor helicopter, a six-rotor helicopter, a ten-rotor helicopter, and the like.
  • the UAV can move freely with respect to six degrees of freedom (eg, three translational degrees of freedom and three degrees of rotational freedom).
  • the UAV may be limited to one or more degrees of freedom motion, such as being limited to a predetermined track or trajectory.
  • the motion can be driven by any suitable drive mechanism, such as by an engine or motor.
  • the UAV can be driven by a propulsion system.
  • the propulsion system may include, for example, an engine, a motor, a wheel, an axle, a magnet, a rotor, a propeller, a paddle, a nozzle, or any suitable combination of the above.
  • the motion of the UAV may be powered by any suitable source of energy, such as electrical energy, magnetic energy, solar energy, wind energy, gravity energy, chemical energy, nuclear energy, or any suitable combination of energy sources.
  • the UAV may be of different sizes, dimensions, and/or configurations.
  • the UAV may be a multi-rotor UAV, and the axial spacing of the counter-rotating rotors does not exceed a certain threshold.
  • the threshold may be about 5 m, 4 m, 3 m, 2 m, 1 m, and the like.
  • the value of the axial spacing of the counter-rotating rotor may be 350 mm, 450 mm, 800 mm, 900 mm, or the like.
  • the UAV is sized and/or sized to accommodate a person in or on it.
  • the UAV is not sized and/or sized to accommodate a person in or on it.
  • the UAV's largest dimension does not exceed 5 m, 4 m, 3 m, 2 m, 1 m, 0.5 m, or 0.1 m.
  • the axial distance of the counter-rotating rotor may not exceed 5 m, 4 m, 3 m, 2 m, 1 m, 0.5 m or 0.1 m.
  • the UAV may have a volume of less than 100 cm x 100 cm x 100 cm.
  • the UAV may have a volume of less than 50 cm x 50 cm x 30 cm. In certain embodiments, the UAV may have a volume of less than 5 cm x 5 cm x 3 cm. In certain embodiments, the footprint of the UAV (the area of the cross-section of the UAV) may be less than approximately 32,000 cm², 20,000 cm², 10,000 cm², 1,000 cm², 500 cm², 100 cm², or smaller. In some cases, the unmanned aerial vehicle may weigh no more than 1000 kg, 500 kg, 100 kg, 10 kg, 5 kg, 1 kg, or 0.5 kg.
  • the UAV can carry a load.
  • the load may include one or more cargo, devices, instruments, and the like.
  • the load can have a housing. Alternatively, part or all of the load may have no housing.
  • the load may be rigidly fixed relative to the UAV. Alternatively, the load may be moved relative to the UAV (eg, translated or rotated relative to the UAV).
  • the carrier can be moved relative to the UAV (e.g., with respect to one, two, or three translational degrees of freedom and/or one, two, or three rotational degrees of freedom) such that the load maintains its position and/or orientation relative to a suitable reference coordinate system regardless of the movement of the UAV.
  • the reference coordinate system may be a fixed reference coordinate system (eg, a surrounding environment).
  • the reference coordinate system may be a motion reference coordinate system (eg, the unmanned aerial vehicle, load).
  • the carrier can move the load relative to the carrier and/or the unmanned aerial vehicle.
  • the motion may be a translation with respect to up to three degrees of freedom (e.g., along one, two, or three axes), a rotation with respect to up to three degrees of freedom (e.g., about one, two, or three axes), or any combination thereof.
  • the carrier can include a frame assembly and an actuator assembly.
  • the frame assembly can provide structural support for the load.
  • the frame assembly can include a plurality of separate frame members, some of which can move relative to each other.
  • the frame assembly and/or the separate frame member can be coupled to a drive assembly that drives the frame assembly to move.
  • the drive assembly can include one or more actuators (e.g., motors) configured to urge the separate frame members to move.
  • the actuator may cause a plurality of frame members to move simultaneously or only one frame member to move at a time.
  • the movement of the frame member can cause the load to move accordingly.
  • the drive assembly can drive one or more frame members to rotate about one or more axes of rotation, such as a roll axis, a pitch axis, or a heading axis. Rotation of the one or more frame members may cause the load to rotate about the one or more axes of rotation relative to the UAV.
  • the drive assembly can drive one or more frame members to translate along one or more translation axes to translate the load relative to the unmanned aerial vehicle along one or more corresponding translation axes .
  • the load may be coupled to the UAV by the carrier either directly (e.g., in direct contact with the UAV) or indirectly (e.g., without contacting the UAV).
  • the load may be mounted on the UAV without a carrier.
  • the load may be integral with the carrier.
  • the load can be detachably coupled to the carrier.
  • the load may include one or more load elements that, as previously described, may move relative to the UAV and/or carrier.
  • the load can include one or more sensors configured to measure one or more targets.
  • the load may comprise any suitable sensor, such as an image acquisition device (such as a camera), a sound acquisition device (such as a parabolic microphone), an infrared imaging device, or an ultraviolet imaging device.
  • the sensor can provide static sensing data (eg, photos) or dynamic sensing data (eg, video).
  • the sensor provides sensed data to a sensed object of the load.
  • the load may include one or more transmitters arranged to provide signals to one or more sensing objects.
  • the transmitter can be any suitable transmitter, such as a light source or a sound source.
  • the load includes one or more transceivers, for example for communicating with a module remote from the UAV.
  • by calling a program stored in the storage medium 830, the controller 810 is configured to acquire second map data and a flight position when flying according to the flight route; match the work area of the first map data with the second map data to calculate a flight offset of the flight position from the flight route; and perform flight correction according to the flight offset so as to return to the flight route.
  • controller 810 is further configured to:
  • the three-dimensional position information of each point is projected into the local navigation map centered on the current flight position according to the respectively set weights.
  • the controller 810 is further configured to:
  • if the sum of the weights of all points in a sub-area is greater than a preset threshold, the sub-area is set as an obstacle area to indicate that the aircraft performs obstacle avoidance on the obstacle area;
  • An obstacle area and a job boundary area are set in a preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
  • the navigation map configuration method, the automatic obstacle avoidance method, the device, the terminal, and the unmanned aerial vehicle dynamically generate a local navigation map centered on the current flight position of the aircraft according to the position information, attitude information, and depth map acquired by the aircraft during flight.
  • the three-dimensional position information of each point is analyzed; it may describe an unknown obstacle encountered by the aircraft during flight, or another object encountered during flight for which no operation route was planned in advance.
  • the three-dimensional position information of each point is projected into the local navigation map, and real-time operation route planning can be performed according to the local navigation map.
  • because the operation route is dynamically generated from information acquired during the flight of the aircraft, the method can effectively cope with temporary task changes, for example, simultaneously working on a work area for which no operation route was planned in advance, or automatically avoiding the area where an unknown obstacle is located.
  • obstacles can be directly avoided when global planning is performed, and the shortest path can be found in the global scope, which avoids the situation in which a path found using only the mapping data cannot avoid unmapped obstacles.
  • the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated modules, if implemented in the form of software functional modules and sold or used as stand-alone products, may also be stored in a computer readable storage medium.
  • the storage medium includes, but is not limited to, any type of disk (including a floppy disk, a hard disk, an optical disk, a CD-ROM, and a magneto-optical disk), a ROM (Read-Only Memory), and a RAM (Random Access Memory).
  • a storage medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer), such as a read-only memory, a magnetic disk, or an optical disc.
  • the steps, measures, and solutions in the various operations, methods, and processes discussed in the present application may be alternated, modified, rearranged, decomposed, combined, or deleted.
  • the solution provided by the present application can be applied to the field of UAV navigation. The navigation map configuration method includes the steps of: acquiring the current flight position and attitude information of the UAV and a depth map detected at the current flight position; obtaining three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map; and projecting the three-dimensional position information of each point, according to respectively set weights, into a local navigation map centered on the current flight position.
  • the solution of the present application can dynamically generate a work route and effectively cope with temporary task changes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Instructional Devices (AREA)

Abstract

一种导航图配置方法,包括步骤:获取无人飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图(S110);根据当前飞行位置、姿态信息以及深度图,获得每个点的三维位置信息(S120);将每个点的三维位置信息按照各自设定的权值投影到以当前飞行位置为中心的局部导航图中(S130)。还提供一种自动避障方法、一种避障装置、一种终端以及一种无人飞行器(800)。

Description

导航图配置方法、避障方法以及装置、终端、无人飞行器

技术领域
本申请涉及飞行器技术领域,具体而言,本申请涉及一种导航图配置方法、自动避障方法以及装置、终端。
背景技术
随着科技的发展,飞行器,例如无人机等,在航拍、农业、植保、微型自拍、快递运输、灾难救援、观察野生动物、监控传染病、测绘、新闻报道、电力巡检、救灾、影视拍摄等等领域得到广泛的应用。
传统技术中,飞行器的作业航线是由地面控制装置根据地块信息,预先自动生成的扫描航线。由于扫描航线是预先生成的,连作业后的返航路径都是起飞前就生成好的,所以没办法应对临时的任务变化,例如,在飞行器根据预先生成的扫描航线作业时,用户需要飞行器对事先未规划作业航线的一片作业区域同时进行作业,或者,在飞行器根据预先生成的扫描航线作业时,用户需要飞行器对事先未规划的未知障碍物能够自动避障等等。
发明内容
本申请针对现有方式的缺点,提出一种导航图配置方法、自动避障方法以及装置、终端,用以解决现有技术中存在的作业航线规划方式无法应对临时的任务变化的问题,以能够动态生成作业航线,有效应对临时的任务变化。
本申请的实施例根据第一个方面,提供了一种导航图配置方法,包括步骤:
获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中。
本申请的实施例根据第二个方面,还提供了一种避障方法,包括步骤:
获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,其中,所述局部导航图包括若干个子区域;
若子区域中所有点的权值和大于预设阈值,将所述子区域设置为障碍物区域,以指示所述飞行器对所述障碍物区域实施避障;
获取用户设定的用于指示障碍物区域和作业边界区域的测绘数据,以及用于指示所述局部导航图中障碍物区域的三维位置信息;
在预设的全局导航图中设置障碍物区域和作业边界区域,以指示所述飞行器对所述障碍物区域和所述作业边界区域实施避障。
本申请的实施例根据第三个方面,还提供了一种飞行器导航图配置装置,包括:
信息获取模块,设置为获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
三维位置信息获得模块,设置为根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
投影模块,设置为将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中。
本申请的实施例根据第四个方面,还提供了一种避障装置,包括:
第一信息获取模块,设置为获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
三维位置信息获得模块,设置为根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
投影模块,设置为将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,其中,所述局部导航图包括若干个子区域;
第一区域设置模块,设置为在子区域中所有点的权值和大于预设阈值时,将所述子区域设置为障碍物区域,以指示所述飞行器对所述障碍物区域实施避障;
第二信息获取模块,设置为获取用户设定的用于指示障碍物区域和作业边界区域的测绘数据,以及用于指示所述局部导航图中障碍物区域的三维位置信息;
第二区域设置模块,设置为在预设的全局导航图中设置障碍物区域和作业边界区域,以指示所述飞行器对所述障碍物区域和所述作业边界区域实施避障。
本申请的实施例根据第五个方面,还提供了一种终端,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时执行任意实施例所述方法的步骤。
本申请的实施例根据第六个方面,还提供了一种终端,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时执行任意实施例所述方法的步骤。
本申请的实施例根据第七个方面,还提供了一种存储介质,所述存储介质包括存储的程序,其中,在所述程序运行时控制所述存储介质所在设备执行上述第一方面所述的导航图配置方法。
本申请的实施例根据第八个方面,还提供了一种存储介质,所述存储介质包括存储的程序,其中,在所述程序运行时控制所述存储介质所在设备执行上述第二方面所述的避障方法。
本申请的实施例根据第九个方面,还提供了一种无人飞行器,包括通信模块、传感器、控制器、存储介质;所述传感器包括图像传感器、GPS接收器、RTK定位传感器、惯性传感器,
所述通信模块,设置为与地面控制装置进行通信;
所述GPS接收器和定位传感器,设置为确定无人飞行器的当前飞行位置;
所述惯性传感器,设置为确定无人飞行器的姿态信息;
所述图像传感器,设置为在当前飞行位置探测深度图;
所述控制器与所述存储介质连接,所述存储介质设置为存储程序,所述程序运行时用于执行上述第一方面所述方法的步骤。
本申请的实施例根据第十个方面,还提供了一种无人飞行器,包括通信模块、传感器、控制器、存储介质;所述传感器包括图像传感器、GPS接收器、RTK定位传感器、惯性传感器,
所述通信模块,设置为与地面控制装置进行通信;
所述GPS接收器和定位传感器,设置为确定无人飞行器的当前飞行位置;
所述惯性传感器,设置为确定无人飞行器的姿态信息;
所述图像传感器,设置为在当前飞行位置探测深度图;
所述控制器与所述存储介质连接,所述存储介质设置为存储程序,所述程序运行时用于执行上述第二方面所述方法的步骤。
上述导航图配置方法、自动避障方法以及装置、终端,动态生成以飞行器当前飞行位置为中心的局部导航图,根据飞行器在飞行过程中获取到的位置信息、姿态信息以及深度图,分析出每个点的三维位置信息,该每个点的三维位置信息可能是飞行器在飞行过程中所遇到的未知障碍物的信息,也可能是飞行器在飞行过程中所遇到的事先未被规划的其它物体的信息,将各个点的三维位置信息投影到局部导航图中,就可以根据该局部导航图进行实时作业航线规划。由于作业航线根据飞行器飞行过程中获取的信息动态生成,因此可以有效应对临时的任务变化,例如,对事先未规划作业航线的一片作业区域同时进行作业,或者,自动对未知障碍物所在区域进行避障等等。
附图说明
本申请上述的和/或附加的方面和优点从下面结合附图对实施例的描述中将变得明显和容易理解,其中:
图1为本申请导航图配置方法一实施例的流程示意图;
图2为本申请预设区域确定方法一具体实施例的示意图;
图3为本申请飞行器导航图配置装置一实施例的结构示意图;
图4为本申请避障方法一实施例的流程示意图;
图5为本申请全局导航图边界获取方式一具体实施例的示意图;
图6为本申请在全局导航图中设置障碍物区域和作业边界区域一具体实施例的示意图;
图7为本申请避障装置一实施例的结构示意图。
图8为本申请一个实施例的一种无人飞行器800的结构示意图。
具体实施方式
下面详细描述本申请的实施例,所述实施例的示例在附图中示出,其中自始至终相同或类似的标号表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施例是示例性的,仅用于解释本申请,而不能解释为对本申请的限制。
本技术领域技术人员可以理解,除非特意声明,这里使用的单数形式“一”、“一个”、“所述”和“该”也可包括复数形式。应该进一步理解的是,本申请的说明书中使用的措辞“包括”是指存在所述特征、整数、步骤、操作、元件和/或组件,但是并不排除存在或添加一个或多个其他特征、整数、步骤、操作、元件、组件和/或它们的组。
本技术领域技术人员可以理解,除非另外定义,这里使用的所有术语(包括技术术语和科学术语),具有与本申请所属领域中的普通技术人员的一般理解相同的意义。还应该理解的是,诸如通用字典中定义的那些术语,应该被理解为具有与现有技术的上下文中的意义一致的意义,并且除非像这里一样被特定定义,否则不会用理想化或过于正式的含义来解释。
为了更好的理解本申请,首先对本申请的整体技术构思进行简单介绍。
本申请所设计的飞行器避障系统分为两个部分,一个是主要基于全局导航图的全局避障规划部分,另一个是主要基于局部导航图的局部避障规划部分。全局导航图和局部导航图均用于指示飞行器的飞行,其中,全局导航图和局部导航图的建立不相互依赖,所面对的问题不一样,建图策略也不一样(以后部分会详细介绍),目的是在减少资源消耗的同时贴合农业应用。
全局避障规划用于返航或指点飞行,主要使用全局导航图,面对的是已知的障碍。在飞行器的应用中,例如,农业无人机植保应用中,作业航线是由地面站根据地块信息,预先自动生成的扫描航线。由于扫描航线是预先生成的,连作业后的返航路径都是起飞前就生成好的,所以没办法应对临时的任务变化,例如药量突然用完、电量近乎用完或用户突然不想飞要返航。全局导航图就是用来应对这种场景,随时能够在整张地图中进行无障碍路径规划,这种规划是长距离的,而且不需要考虑喷药等问题。这种场景下,需要的地图区域较大,地图粒度不需要很细,地图区域在起飞前就能确定。
局部避障规划用于沿着作业航线飞行、或沿着全局规划的路径飞行过程中遇到未知障碍物的情况,局部导航图主要用于在作业过程中遇到未知障碍物的情景,这种情景下,需要地图粒度较小,因为需要尽量贴合原有航线,尽量减少漏喷,所以相应的规划一般是短距的,地图可以比较小,地图中心跟飞行器一起移动。
另外,本申请所设计的局部导航图除了应用于避障,还可以应用于其它方面,例如,对事先未规划的一片果树区域进行作业等等,本申请并不对局部导航图所应用的场景进行限定。
基于上述技术构思,下面对本申请所提供的导航图配置方法、自动避障方法以及装置、终端的具体实施方式进行详细介绍。
首先,从基于局部导航图规划的角度对本申请进行详细介绍。
如图1所示,在一个实施例中,一种导航图配置方法,包括步骤:
S110、获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图。
飞行器可以为植保无人机等等。当前飞行位置指的是飞行器在当前时刻所处的地理位置,例如飞行器的经纬度信息等。姿态信息指的是飞行器在当前时刻的飞行姿态,例如俯仰角、横滚角和偏航角等等。飞行器的当前飞行位置和姿态信息均可以通过飞行器的飞控获取。深度图为拍摄到的目标的二维图像,所述深度图包括每个点与所述当前飞行位置的距离信息,即深度图中每个像素点的灰度值用于表征所拍摄目标与飞行器当前位置的距离信息。实际应用中,深度图可以通过飞行器的双目系统探测到。在进行局部避障时,所采用的深度图可以为多张深度图,也可以为单张深度图,本申请并不对此做出限定。另外,深度图的大小可以根据需要自行设定,本申请对此也不做出限定,例如,探测到的深度图是一张640*480大小的图像。获取到的当前飞行位置、姿态信息和深度图用于局部导航图的建图。
S120、根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息。
由于深度图中每个像素点的灰度值包含了所拍摄目标与所述飞行器的距离信息,因此根据当前飞行位置、姿态信息以及所述深度图中每个像素点的像素坐标以及灰度值,可以计算出每个像素点所对应的三维位置信息,也即是深度图中每一个二维的像素点对应得到一个三维的点。
导航坐标系即当地水平坐标系,是在导航时根据导航系统工作的需要而选取的作为导航基准的坐标系,是用来做导航计算时使用的。考虑到这些三维的点用于飞行器的导航,所以计算出来的三维的点一般指的是导航坐标系下的点。可选的,导航坐标系为北东地坐标系。
由于局部导航图只需要对飞行器附近的障碍物进行指示,而得到的点云(各个三维的点)可能包括较远范围的点,因此可以只保留影响飞行器飞行的点,例如只保留距离飞行器上下一定范围内的点。
S130、将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中。
局部导航图用于为飞行器进行局部导航,其是动态的,地图的中心即飞行器的当前位置,地图的大小可以通过人为或者程序预先设定。当飞行器的当前位置变化时,就移动局部导航地图的中心,同时,对整张地图的内容做一次平移,平移后,超出地图范围的原有内容会被删除,新进的内容会被置零。
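上述"地图中心随飞行器移动、整图内容平移、超界内容删除、新进区域置零"的过程,可以用如下Python草图示意(仅为按本段文字含义给出的示例实现,函数名与参数均为示例假设,非本申请原始代码):

```python
import numpy as np

def shift_local_map(grid, dr, dc):
    """局部导航图中心移动 (dr, dc) 个栅格后,对整张地图内容做一次平移:
    超出地图范围的原有内容被删除,新进入的区域被置零。"""
    h, w = grid.shape
    shifted = np.zeros_like(grid)
    if abs(dr) >= h or abs(dc) >= w:
        return shifted  # 移动超过整张地图,原有内容全部被删除
    # 平移后 shifted[i, j] 对应原图 grid[i + dr, j + dc]
    src_r, dst_r = slice(max(0, dr), min(h, h + dr)), slice(max(0, -dr), min(h, h - dr))
    src_c, dst_c = slice(max(0, dc), min(w, w + dc)), slice(max(0, -dc), min(w, w - dc))
    shifted[dst_r, dst_c] = grid[src_r, src_c]
    return shifted
```

例如中心向前移动一格时,原图第二行的内容移到第一行,最后一行被置零。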
局部导航图为二维地图,获取到点云后,也即是导航坐标系的点云后,需要将这些点云以一定的权值叠加到局部导航图中,以用于动态规划作业航线。计算每个点的权值的方式有很多,例如,在一个实施例中,每个点的所述权值根据预设权值与距离因子的乘积获得,其中,所述距离因子与所述距离信息为正比例关系,即如下公式所示:
point_weight=point_weight_com*distance_factor
其中,point_weight为一个点的权值;point_weight_com为点的通用权值,即预设权值,可以根据经验得到,该通用权值对于所有的点都一致;distance_factor为与距离相关的因子,与距离为正比例关系,即其值随距离信息的增大而线性增大,随距离信息的减小而线性减小。该距离信息为前述深度图中每个像素点的灰度值所表征的距离信息。距离因子与距离信息为正比例关系,原因在于远距离的物体,其点云数目较少,因而每个点的权值理应大于近距离的点。
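该权值公式可以用如下Python代码示意(其中 point_weight_com 与比例系数 k 的取值仅为示例假设,实际按经验设定):

```python
def point_weight(distance, point_weight_com=0.2, k=0.05):
    """点的权值 = 通用权值 point_weight_com * 距离因子 distance_factor,
    其中距离因子与该点的距离信息成正比例(线性)关系。"""
    distance_factor = k * distance  # 距离越远,距离因子线性越大
    return point_weight_com * distance_factor
```

距离越远的点权值越大,用于补偿远处物体点云数目较少的问题。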
根据步骤S130得到更新后的局部导航图后,就可以根据该局部导航图进行作业航线规划,例如对障碍物进行避障,对探测到的一块新的区域进行作业等等。为了更好的理解本申请,下面以基于局部导航图进行避障为例进行详细介绍。
在一个实施例中,所述局部导航图包括若干个子区域;所述将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,之后,还包括:若子区域中所有点的权值和大于预设阈值,将所述子区域设置为障碍物区域,以指示所述飞行器实施避障。
按照一定的规则将局部导航图划分为各个子区域,具体的划分规则可以根据实际情况进行设定,本申请并不对此做出限定。例如,在一个实施例中,所述局部导航图可以为栅格地图,每一个栅格为一个子区域。
由于获得的点可能是障碍物的点,也可能是噪声点等等,因此需要对这些点的权值进行进一步计算,以得到实际的障碍物区域。将各个点按照权值叠加到局部导航图中后,局部导航图中的每个子区域可能包含多个点,按照下述公式对每一个子区域进行总权值计算,得到每个子区域的总权值。
map_value+=point_weight。
其中,map_value表示一个子区域的权值。
根据经验设定预设阈值,例如将预设阈值设置为1,另外还需要根据实际需要设置表示障碍物区域的具体形式,例如,对于栅格地图来说,栅格的权值为0表示该位置为自由位置,飞行器可以自由通行,栅格权值为1则标识该位置存在障碍,飞行器需要绕行。那么就可以按照下述公式对子区域进行设置:
If(map_value>1)then map_value=1。
在一个实施例中,所述将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,之后,还包括:若子区域中所有点的权值和小于等于预设阈值,将所述子区域设置为通行区域,以允许所述飞行器通过。如果子区域的总权值小于等于预设阈值,意味着该子区域不是真正的障碍物区域,因此就可以将该子区域标识为通行区域。例如,对于栅格地图来说,如果一个栅格内所有点的权值和小于1,则可以将该栅格的权值设置为0,以指示飞行器可以通行。
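把各点按权值累加到栅格(map_value += point_weight),再按阈值把子区域标记为障碍(1)或通行(0)的过程,可用如下草图表达(栅格尺寸、栅格边长与阈值均为示例假设):

```python
import numpy as np

def classify_cells(points, weights, size, cell_size, threshold=1.0):
    """points 为导航坐标系下各点的 (x, y),地图以当前飞行位置为中心。
    每个栅格累加落入其中的点权值,权值和大于阈值的栅格置 1(障碍物区域),
    小于等于阈值的栅格为 0(通行区域)。"""
    acc = np.zeros((size, size))
    half = size // 2
    for (x, y), pw in zip(points, weights):
        r, c = int(y // cell_size) + half, int(x // cell_size) + half
        if 0 <= r < size and 0 <= c < size:
            acc[r, c] += pw  # map_value += point_weight
    return (acc > threshold).astype(int)  # If(map_value>1) then map_value=1
```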
因为深度图每个像素点都包含了目标的距离信息,包含的信息量过大,如果根据原始深度图获得每个点的三维位置信息,则计算量过大。因此,需要对深度图进行预处理。对深度图进行预处理以降低计算量的方式有很多,例如,在一个实施例中,所述获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图,之后,所述获得每个点的三维位置信息,之前,还可以包括:对所述深度图进行稀疏处理。
根据相机小孔成像模型,远处的障碍会较为集中在图像的中心处,而近处的障碍则因为距离近,在图像中的面积较大,因此,在一个实施例中,所述对所述深度图进行稀疏处理,包括:采用变化步长对所述深度图进行稀疏处理,其中,所述变化步长用于控制深度图中的像素点从边缘到中心逐步增多。
具体实施时,可以从图像边界出发做不等距的稀疏处理,使得靠近图像中心的像素点较为稠密,而图像边缘的像素点则较为稀疏。稀疏处理的伪代码如下所示:
Figure PCTCN2018112077-appb-000001
其中:
img_height与img_width分别为图像的高度和宽度;
i_step与j_step为遍历图像的步长,初始值都为1;
height_step与width_step分别为图像纵向和横向的稀疏因子;
HandleImage()表示的是对深度图的后续处理。
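上述伪代码(原文为图像占位符)的思路,可按本段变量说明重构为如下Python草图。其中步长的更新公式为示例假设,仅体现"靠近图像中心步长小、像素稠密,靠近边缘步长大、像素稀疏"的意图;HandleImage() 的后续处理在此以收集像素坐标代替:

```python
def sparse_traverse(img_height, img_width, height_step=2, width_step=2):
    """用变化步长遍历深度图像素,返回被选中处理的像素坐标列表。"""
    selected = []
    i, i_step = 0, 1
    while i < img_height:
        j, j_step = 0, 1
        while j < img_width:
            selected.append((i, j))  # 对应 HandleImage() 处理该像素
            # 离图像中心越远,步长越大 → 边缘稀疏、中心稠密
            j_step = 1 + abs(j - img_width // 2) // width_step
            j += j_step
        i_step = 1 + abs(i - img_height // 2) // height_step
        i += i_step
    return selected
```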
原始深度图,或者经过稀疏处理后的稀疏深度图,要经过坐标转换才能转换到导航坐标系下,才能作为障碍物的信息更新到局部导航图中。因此,在一个实施例中,所述根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息,包括:对所述深度图进行坐标转换,获得在导航坐标系中的各个点;根据导航坐标系中的各个点、所述当前飞行位置以及所述姿态信息,获得每个点的三维位置信息。
坐标转换的方式有很多,例如,在一个实施例中,所述对所述深度图进行坐标转换,获得在导航坐标系中的各个点,包括:根据相机内参矩阵,将所述深度图中每个点转化为相机坐标系下的各个点;根据相机坐标系到机体坐标系的转换矩阵
Figure PCTCN2018112077-appb-000002
将相机坐标系下的各个点转化为机体坐标系的各个点;根据机体坐标系到导航坐标系的转换矩阵
Figure PCTCN2018112077-appb-000003
将机体坐标系下的各个点转换为导航坐标系的各个点。
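整个转换链(像素+深度 → 相机坐标系 → 机体坐标系 → 导航坐标系)可以示意如下。其中相机内参矩阵 K 与两个旋转矩阵(对应文中两处公式占位图)的具体数值由相机标定和姿态信息给出,此处仅给出计算顺序,矩阵取值为示例假设:

```python
import numpy as np

def depth_pixel_to_nav(u, v, depth, K, R_cam2body, R_body2nav, t_nav):
    """像素坐标 (u, v) + 深度值 depth 转换为导航坐标系下的三维点。
    K 为相机内参矩阵,R_* 为坐标系间的旋转矩阵,t_nav 为当前飞行位置。"""
    # 相机内参反投影:像素坐标 -> 相机坐标系下的三维点
    p_cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    p_body = R_cam2body @ p_cam             # 相机坐标系 -> 机体坐标系
    p_nav = R_body2nav @ p_body + t_nav     # 机体坐标系 -> 导航坐标系(如北东地)
    return p_nav
```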
由于深度图本身存在噪声以及障碍物可能会移动(例如树叶随风飘动),使用深度图计算出来的点云也会存在噪声点,这些噪声会随着上述步骤的循环进行而在局部导航图中累积,导致对障碍物的错误测量,简称误测。为了减小误测概率,在一个实施例中,所述将每个点的三维位置信息按照各自设定的权值投影到局部导航图中,之后,还包括:对所述局部导航图中预设区域内的每个点的权值进行衰减;获得衰减后每个子区域中所有点的权值和。先对局部导航图中的预设区域内的点的权值进行衰减,然后再计算每个子区域的总权值,判断每个子区域是否为障碍物区域,从而降低噪声对障碍物判断的影响,提高障碍物区域判断的准确性。
在一个实施例中,所述对所述局部导航图中预设区域内的每个点的权值进行衰减,包括:将所述预设区域内的每个点的权值与预设衰减因子相乘。衰减因子可以根据经验设定。如果预设区域刚好包括N个子区域,则可以根据下述公式进行衰减操作:
map_value*=damping_factor。
其中,map_value表示预设区域内的一个子区域的总权值,damping_factor表示衰减因子。
预设区域确定的方式有很多,例如,在一个实施例中,所述预设区域根据所述局部导航图的中心、所述飞行器中用于获取深度图的双目系统的水平视场角以及设定衰减距离确定。
如图2所示,为一具体实施例的预设区域确定方法的示意图,其中,O表示局部导航图的地图中心,也即是飞行器的当前飞行位置,θ表示双目系统视场角的大小,由双目系统的参数确定,d表示衰减距离,为根据经验设定的固定值,那么由以上三个参数所确定的扇形区域即为衰减区域,对该衰减区域内的点的权值进行衰减,而衰减区域外的点的权值无需进行衰减。
需要说明的是,图2仅对飞行器中前方安装双目系统的衰减区域进行示意,如果飞行器后方或者侧面也安装有双目系统,则在图示衰减区域对称的位置或者侧面的位置也设置衰减区域,对衰减区域内的点的权值进行衰减,即衰减区域与双目系统安装的方式相关。如果通过其它设备获取深度图,则确定预设区域的方式与基于双目系统确定预设区域的构思相同。
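图2所述扇形衰减区域的判定与权值衰减,可用如下草图示意(机头朝向 heading、视场角 fov、衰减距离 d 与衰减因子均为示例参数;若飞行器多个方向安装双目系统,可对每个安装方向各调用一次):

```python
import math

def decay_sector(cells, center, heading, fov, d, damping_factor=0.8):
    """对以地图中心 center 为顶点、朝向 heading、张角 fov、半径 d 的
    扇形衰减区域内的栅格权值乘以衰减因子;区域外的权值保持不变。
    heading 与 fov 单位为弧度,cells 为 {(x, y): 权值} 字典。"""
    out = {}
    cx, cy = center
    for (x, y), w in cells.items():
        dx, dy = x - cx, y - cy
        dist = math.hypot(dx, dy)
        ang = math.atan2(dy, dx)
        # 与机头朝向的最小夹角(归一化到 [-pi, pi])
        diff = abs((ang - heading + math.pi) % (2 * math.pi) - math.pi)
        if dist <= d and diff <= fov / 2:
            out[(x, y)] = w * damping_factor  # map_value *= damping_factor
        else:
            out[(x, y)] = w
    return out
```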
基于同一发明构思,本申请还提供一种飞行器导航图配置装置,下面结合附图对本申请装置的具体实施方式进行详细介绍。
如图3所示,在一个实施例中,一种飞行器导航图配置装置,包括:
信息获取模块110,设置为获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图。
飞行器可以为植保无人机等等。当前飞行位置指的是飞行器在当前时刻所处的地理位置,例如飞行器的经纬度信息等。姿态信息指的是飞行器在当前时刻的飞行姿态,例如俯仰角、横滚角和偏航角等等。飞行器的当前飞行位置和姿态信息均可以通过飞行器的飞控获取。深度图为拍摄到的目标的二维图像,所述深度图包括每个点与所述当前飞行位置的距离信息,即深度图中每个像素点的灰度值用于表征所拍摄目标与飞行器当前位置的距离信息。实际应用中,深度图可以通过飞行器的双目系统探测到。在进行局部避障时,所采用的深度图可以为多张深度图,也可以为单张深度图,本申请并不对此做出限定。另外,深度图的大小可以根据需要自行设定,本申请对此也不做出限定。获取到的当前飞行位置、姿态信息和深度图用于局部导航图的建图。
三维位置信息获得模块120,设置为根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息。
由于深度图中每个像素点的灰度值包含了所拍摄目标与所述飞行器的距离信息,因此根据当前飞行位置、姿态信息以及所述深度图中每个像素点的像素坐标以及灰度值,可以计算出每个像素点所对应的三维位置信息,也即是深度图中每一个二维的像素点对应得到一个三维的点。
导航坐标系即当地水平坐标系,是在导航时根据导航系统工作的需要而选取的作为导航基准的坐标系,是用来做导航计算时使用的。考虑到这些三维的点用于飞行器的导航,所以计算出来的三维的点一般指的是导航坐标系下的点。可选的,导航坐标系为北东地坐标系。
由于局部导航图只需要对飞行器附近的障碍物进行指示,而得到的点云(各个三维的点)可能包括较远范围的点,因此可以只保留影响飞行器飞行的点,例如只保留距离飞行器上下一定范围内的点。
投影模块130,设置为将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中。
局部导航图用于为飞行器进行局部导航,其是动态的,地图的中心即飞行器的当前位置,地图的大小可以通过人为或者程序预先设定。当飞行器的当前位置变化时,就移动局部导航地图的中心,同时,对整张地图的内容做一次平移,平移后,超出地图范围的原有内容会被删除,新进的内容会被置零。
局部导航图为二维地图,获取到点云后,也即是导航坐标系的点云后,需要将这些点云以一定的权值叠加到局部导航图中,以用于动态规划作业航线。计算每个点的权值的方式有很多,例如,在一个实施例中,每个点的所述权值根据预设权值与距离因子的乘积获得,其中,所述距离因子与所述距离信息为正比例关系。
投影模块130得到更新后的局部导航图后,就可以根据该局部导航图进行作业航线规划,例如对障碍物进行避障,对探测到的一块新的区域进行作业等等。为了更好的理解本申请,下面以基于局部导航图进行避障为例进行详细介绍。
在一个实施例中,所述局部导航图包括若干个子区域;所述装置还包括与所述投影模块130相连的障碍物区域设置模块,设置为在子区域中所有点的权值和大于预设阈值时,将所述子区域设置为障碍物区域,以指示所述飞行器实施避障。
按照一定的规则将局部导航图划分为各个子区域,具体的划分规则可以根据实际情况进行设定,本申请并不对此做出限定。例如,在一个实施例中,所述局部导航图可以为栅格地图,每一个栅格为一个子区域。
由于获得的点可能是障碍物的点,也可能是噪声点等等,因此需要对这些点的权值进行进一步计算,以得到实际的障碍物区域。将各个点按照权值叠加到局部导航图中后,局部导航图中的每个子区域可能包含多个点,对每一个子区域进行总权值计算,得到每个子区域的总权值。如果总权值大于预设阈值,则将该子区域设置为障碍物区域。
在一个实施例中,所述装置还包括与所述投影模块130相连的通行区域设置模块,设置为在子区域中所有点的权值和小于等于预设阈值时,将所述子区域设置为通行区域,以允许所述飞行器通过。如果子区域的总权值小于等于预设阈值,意味着该子区域不是真正的障碍物区域,因此就可以将该子区域标识为通行区域。
因为深度图每个像素点都包含了目标的距离信息,包含的信息量过大,如果根据原始深度图获得每个点的三维位置信息,则计算量过大。因此,需要对深度图进行预处理。对深度图进行预处理以降低计算量的方式有很多,例如,在一个实施例中,所述装置还可以包括连接在所述信息获取模块110以及所述三维位置信息获得模块120之间的稀疏处理模块,所述稀疏处理模块设置为对所述深度图进行稀疏处理。
根据相机小孔成像模型,远处的障碍会较为集中在图像的中心处,而近处的障碍则因为距离近,在图像中的面积较大,因此,在一个实施例中,稀疏处理模块采用变化步长对所述深度图进行稀疏处理,其中,所述变化步长用于控制深度图中的像素点从边缘到中心逐步增多。
原始深度图,或者经过稀疏处理后的稀疏深度图,要经过坐标转换才能转换到导航坐标系下,才能作为障碍物的信息更新到局部导航图中。因此,在一个实施例中,所述三维位置信息获得模块120对所述深度图进行坐标转换,获得在导航坐标系中的各个点;根据导航坐标系中的各个点、所述当前飞行位置以及所述姿态信息,获得每个点的三维位置信息。
坐标转换的方式有很多,例如,在一个实施例中,三维位置信息获得模块120根据相机内参矩阵,将所述深度图中每个点转化为相机坐标系下的各个点;根据相机坐标系到机体坐标系的转换矩阵
Figure PCTCN2018112077-appb-000004
将相机坐标系下的各个点转化为机体坐标系的各个点;根据机体坐标系到导航坐标系的转换矩阵
Figure PCTCN2018112077-appb-000005
将机体坐标系下的各个点转换为导航坐标系的各个点。
由于深度图本身存在噪声以及障碍物可能会移动(例如树叶随风飘动),使用深度图计算出来的点云也会存在噪声点,这些噪声会随着上述步骤的循环进行而在局部导航图中累积,导致对障碍物的错误测量,简称误测。为了减小误测概率,在一个实施例中,所述装置还包括连接在投影模块130和障碍物区域设置模块(和/或通行区域设置模块)之间的衰减模块,所述衰减模块设置为对所述局部导航图中预设区域内的每个点的权值进行衰减;获得衰减后每个子区域中所有点的权值和。衰减模块先对局部导航图中的预设区域内的点的权值进行衰减,然后再计算每个子区域的总权值。基于该总权值判断每个子区域是否为障碍物区域,从而降低噪声对障碍物判断的影响,提高障碍物区域判断的准确性。
在一个实施例中,所述衰减模块将所述预设区域内的每个点的权值与预设衰减因子相乘。衰减因子可以根据经验设定。如果预设区域刚好包括N个子区域,则可以分别将每个子区域的总权值与衰减因子相乘。
预设区域确定的方式有很多,例如,在一个实施例中,所述预设区域根据所述局部导航图的中心、所述飞行器中用于获取深度图的双目系统的水平视场角以及设定衰减距离确定。衰减区域与双目系统安装的方式相关。如果通过其它设备获取深度图,则确定预设区域的方式与基于双目系统确定预设区域的构思相同。
本申请还提供一种终端,该终端可以为飞行器或者其他设备,其包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现上述任意一项所述方法的步骤。
下面,从基于全局导航图的全局避障规划的角度对本申请进行详细介绍。
如图4所示,在一个实施例中,一种避障方法,包括步骤:
S210、获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图。
飞行器可以为植保无人机等等。当前飞行位置指的是飞行器在当前时刻所处的地理位置,例如飞行器的经纬度信息等。姿态信息指的是飞行器在当前时刻的飞行姿态,例如俯仰角、横滚角和偏航角等等。飞行器的当前飞行位置和姿态信息均可以通过飞行器的飞控获取。深度图为拍摄到的目标的二维图像,所述深度图包括每个点与所述当前飞行位置的距离信息,即深度图中每个像素点的灰度值用于表征所拍摄目标与飞行器当前位置的距离信息。实际应用中,深度图可以通过飞行器的双目系统探测到。在进行局部避障时,所采用的深度图可以为多张深度图,也可以为单张深度图,本申请并不对此做出限定。另外,深度图的大小可以根据需要自行设定,本申请对此也不做出限定,例如,探测到的深度图是一张640*480大小的图像。获取到的当前飞行位置、姿态信息和深度图用于局部导航图的建图。
S220、根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息。
由于深度图中每个像素点的灰度值包含了所拍摄目标与所述飞行器的距离信息,因此根据当前飞行位置、姿态信息以及所述深度图中每个像素点的像素坐标以及灰度值,可以计算出每个像素点所对应的三维位置信息,也即是深度图中每一个二维的像素点对应得到一个三维的点。
导航坐标系即当地水平坐标系,是在导航时根据导航系统工作的需要而选取的作为导航基准的坐标系,是用来做导航计算时使用的。考虑到这些三维的点用于飞行器的导航,所以计算出来的三维的点一般指的是导航坐标系下的点。可选的,导航坐标系为北东地坐标系。
由于局部导航图只需要对飞行器附近的障碍物进行指示,而得到的点云(各个三维的点)可能包括较远范围的点,因此可以只保留影响飞行器飞行的点,例如只保留距离飞行器上下一定范围内的点。
S230、将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,其中,所述局部导航图包括若干个子区域。
局部导航图用于为飞行器进行局部导航,其是动态的,地图的中心即飞行器的当前位置,地图的大小可以通过人为或者程序预先设定。当飞行器的当前位置变化时,就移动局部导航地图的中心,同时,对整张地图的内容做一次平移,平移后,超出地图范围的原有内容会被删除,新进的内容会被置零。
按照一定的规则将局部导航图划分为各个子区域,具体的划分规则可以根据实际情况进行设定,本申请并不对此做出限定。例如,在一个实施例中,所述局部导航图可以为栅格地图,每一个栅格为一个子区域。
局部导航图为二维地图,获取到点云后,也即是导航坐标系的点云后,需要将这些点云以一定的权值叠加到局部导航图中,以用于判断具体哪些区域为障碍物区域。计算每个点的权值的方式有很多,例如,在一个实施例中,每个点的所述权值根据预设权值与距离因子的乘积获得,其中,所述距离因子与所述距离信息为正比例关系,即如下公式所示:
point_weight=point_weight_com*distance_factor
其中,point_weight为一个点的权值;point_weight_com为点的通用权值,即预设权值,可以根据经验得到,该通用权值对于所有的点都一致;distance_factor为与距离相关的因子,与距离为正比例关系,即其值随距离信息的增大而线性增大,随距离信息的减小而线性减小。该距离信息为前述深度图中每个像素点的灰度值所表征的距离信息。距离因子与距离信息为正比例关系,原因在于远距离的物体,其点云数目较少,因而每个点的权值理应大于近距离的点。
S240、若子区域中所有点的权值和大于预设阈值,将所述子区域设置为障碍物区域,以指示所述飞行器对所述障碍物区域实施避障。
由于获得的点可能是障碍物的点,也可能是噪声点等等,因此需要对这些点的权值进行进一步计算,以得到实际的障碍物区域。将各个点按照权值叠加到局部导航图中后,局部导航图中的每个子区域可能包含多个点,按照下述公式对每一个子区域进行总权值计算,得到每个子区域的总权值。
map_value+=point_weight。
其中,map_value表示一个子区域的权值。
根据经验设定预设阈值,例如将预设阈值设置为1,另外还需要根据实际需要设置表示障碍物区域的具体形式,例如,对于栅格地图来说,栅格的权值为0表示该位置为自由位置,飞行器可以自由通行,栅格权值为1则标识该位置存在障碍,飞行器需要绕行。那么就可以按照下述公式对子区域进行设置:
If(map_value>1)then map_value=1。
在一个实施例中,所述将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,之后,还包括:若子区域中所有点的权值和小于等于预设阈值,将所述子区域设置为通行区域,以允许所述飞行器通过。如果子区域的总权值小于等于预设阈值,意味着该子区域不是真正的障碍物区域,因此就可以将该子区域标识为通行区域。例如,对于栅格地图来说,如果一个栅格内所有点的权值和小于1,则可以将该栅格的权值设置为0,以指示飞行器可以通行。
S250、获取用户设定的用于指示障碍物区域和作业边界区域的测绘数据,以及用于指示所述局部导航图中障碍物区域的三维位置信息。
全局避障规划时数据来源有两方面:一是用户的测绘数据(包含边界以及障碍物),二是局部导航图的数据,局部导航图的数据会以一定的周期更新到全局导航图中。
测绘数据可以是用户手动测绘的数据,也可以是用户通过地图界面选择的各个数据。一般情况下,测绘数据包含障碍物区域的边界点以及作业边界区域的边界点。在飞行器起飞之前,通过数据链路就可以将用户测绘的地图数据上传到飞行器中,例如,上传到飞行器的双目系统中,用于全局导航图的建图运算。
三维位置信息为局部导航图中已设置为障碍物区域的数据。局部导航图会以一定的周期进行更新,在更新的同时,确定为障碍物区域的位置可以选择填放至一个障碍物队列里面。该障碍物队列可以删除也可以添加:当在局部导航图中确定为障碍物时,就将障碍物区域的信息添加到队列里面;障碍物移动或者消失时,则将障碍物区域的信息从队列里面删除。以一定的周期,将障碍物队列里面的障碍物区域的信息更新到全局导航图中,这样全局导航图也包含了双目系统探测出来的信息。需要说明的是,本申请并不限定于通过队列的形式对全局导航图的数据进行更新,用户还可以根据需要通过其他形式将局部导航图中的障碍物区域的信息更新至全局导航图中。
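该障碍物队列的增删与周期性同步到全局导航图的逻辑,可用如下草图示意(类名与数据结构均为示例假设,仅体现"入队、出队、周期写入全局图"的流程):

```python
class ObstacleQueue:
    """局部导航图中障碍物区域的队列:障碍确认时入队,障碍移动或
    消失时出队,并以一定周期把队列中的障碍信息更新到全局导航图。"""

    def __init__(self):
        self._cells = set()

    def add(self, cell):
        """在局部导航图中确认为障碍物区域时调用。"""
        self._cells.add(cell)

    def remove(self, cell):
        """障碍物移动或者消失时调用。"""
        self._cells.discard(cell)

    def flush_to_global(self, global_map):
        """周期性调用:将队列中的障碍区域写入全局导航图(栅格权值置 1)。"""
        for r, c in self._cells:
            global_map[r][c] = 1
        return global_map
```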
将局部导航图的障碍物区域的信息更新到全局导航图中,那么在进行全局规划时就可以直接避开障碍物,可以在全局范围内找到最短路径,从而避免只有测绘数据而导致的找到的路径无法避开未测绘的障碍物。
S260、在预设的全局导航图中设置障碍物区域和作业边界区域,以指示所述飞行器对所述障碍物区域和所述作业边界区域实施避障。
全局导航图可以为栅格地图。全局导航图比局部导航图大很多,局部导航图可以很小,只要满足局部避障需求,全局导航图需要把飞行器当次飞行的飞行范围都包含在内。获得测绘数据后,就可以根据测绘数据所包含的作业边界区域的位置信息以及障碍物区域的位置信息,在全局导航图中标识出相应的区域。例如,根据测绘数据将全局导航图中的某一些栅格的权值设置为1。获得局部导航图中的障碍物区域的位置信息后,就对全局导航图进行更新,将全局导航图中相应的位置更新为障碍物区域。设置好障碍物区域和作业边界区域后,飞行器根据该全局导航图就可以对障碍物区域和作业边界区域进行避障。
全局导航图在飞行器起飞作业前就得完成初始化。初始化的内容为确定全局导航图的大小以及中心点位置。在一个实施例中,所述预设的全局导航图的中心和大小根据所述飞行器起飞前的位置以及所述测绘数据获得。初始化全局导航图的信息来自于用户设定的测绘数据。地图中心所表示的地理位置以及地图的大小在初始化的时候就已经确定。确定这些信息后,就可以为全局导航图分配存储空间,根据障碍物的地理位置确定障碍物信息的存储位置,方便存储和访问障碍物信息。
在一个实施例中,所述全局导航图的水平边界由所述位置和所述测绘数据在Y轴上的最大值和最小值膨胀后确定,所述全局导航图的竖直边界由所述位置和所述测绘数据在X轴上的最大值和最小值膨胀后确定。具体的,从测绘数据和飞行器起飞前的位置信息中找到在Y轴上的最大值,将该最大值膨胀一定的距离后得到全局导航图上面的水平边界;从测绘数据和飞行器起飞前的位置信息中找到在Y轴上的最小值,将该最小值膨胀一定的距离后得到全局导航图下面的水平边界;从测绘数据和飞行器起飞前的位置信息中找到在X轴上的最大值,将该最大值膨胀一定的距离后得到全局导航图右边的竖直边界;从测绘数据和飞行器起飞前的位置信息中找到在X轴上的最小值,将该最小值膨胀一定的距离后得到全局导航图左边的竖直边界,其中膨胀的距离可以根据实际需要进行设定。得到全局导航图的边界信息后,就可以获得全局导航图的大小以及中心点的位置,完成全局导航图的初始化。
如图5所示,为一具体实施例的全局导航图边界获取的示意图。图5中,d为膨胀距离,从该图可以看出,该全局导航图使用多边形表示作业区域边界B以及障碍物区域O,上传到双目系统的地图就包含了这些多边形顶点的位置信息。在确定全局导航图的边界时,从飞行器起飞前的当前位置、作业区域边界B的顶点位置以及障碍物区域O的顶点位置中,计算出:导航坐标系下X轴上的最大值为右边的障碍物区域最大的X值,导航坐标系下X轴上的最小值为飞行器起飞前位置的X值,导航坐标系下Y轴上的最大值为作业区域边界B最上面的Y值,导航坐标系下Y轴上的最小值为飞行器起飞前位置的Y值,将上述计算出的四个值分别按照膨胀距离d进行膨胀,就可以得到图5所示的全局导航图的边界。此时可以获得全局导航图的大小、边界以及中心点位置信息,完成全局地图初始化。
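按本段描述,由起飞位置与测绘多边形顶点在X/Y轴上的极值,各向外膨胀距离d,即可确定全局导航图的四条边界,示意如下(坐标与膨胀距离均为示例数值):

```python
def global_map_bounds(takeoff, vertices, d):
    """takeoff 为起飞前位置 (x, y),vertices 为测绘的作业边界与
    障碍物区域多边形顶点列表;各极值按膨胀距离 d 向外膨胀,
    返回全局导航图的左右竖直边界与上下水平边界。"""
    xs = [takeoff[0]] + [v[0] for v in vertices]
    ys = [takeoff[1]] + [v[1] for v in vertices]
    return {
        "left":   min(xs) - d,  # 左边的竖直边界
        "right":  max(xs) + d,  # 右边的竖直边界
        "bottom": min(ys) - d,  # 下面的水平边界
        "top":    max(ys) + d,  # 上面的水平边界
    }
```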
根据测绘数据和局部导航图中的数据设置全局导航图中障碍物区域和作业边界区域的方式有很多,例如,在一个实施例中,所述在预设的全局导航图中设置障碍物区域和作业边界区域,包括:根据获取的所述测绘数据以及三维位置信息,获得第一障碍物区域和第一作业边界区域;对所述第一障碍物区域和所述第一作业边界区域进行膨胀,获得第二障碍物区域和第二作业边界区域;将所述第二障碍物区域和所述第二作业边界区域设置为用于指示飞行器实施避障的区域。对第一障碍物区域和第一作业边界区域膨胀的距离可以根据实际需要进行设置,这些膨胀的区域也是危险区域,禁止飞行器通行,因此也需要设置为飞行器实施避障的区域,以使飞行器保持与障碍物和作业区域边界的安全距离。
需要说明的是,本申请并不限制于上述设置全局导航图中障碍物区域和作业边界区域的方式,用户也可以直接根据测绘数据和局部导航图中的障碍物信息在全局导航图中设置禁止飞行器通行的区域,而不进行膨胀,或者只对障碍物区域或者作业边界区域进行膨胀。另外,在膨胀时,可以将各个方向膨胀的距离设置为相同,也可以针对各个方向分别设置不同的膨胀距离。
如图6所示,为一具体实施例的在全局导航图中设置障碍物区域和作业边界区域的示意图,其中全局导航图为栅格地图,栅格地图中栅格的权值为1时,表示该栅格为禁止通行的区域,栅格的权值为0时,表示该栅格为允许通行的区域。如图6所示,根据测绘数据和局部导航图中障碍物的信息,将原有作业边界区域和原有障碍物区域在全局导航图中的权值全部置为1,表明该区域已经完全被障碍物占有,禁止飞行器通行;使用深度优先算法或者其他算法对原有边界区域以及原有障碍物区域进行膨胀,并将膨胀区域的权值全部置1,表明膨胀区域也为危险区域,不允许飞行器接近,可用于飞行器保持与障碍物的安全距离。
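对禁飞区域做膨胀(文中提到可使用深度优先或其他算法),这里以广度优先搜索给出一个四邻域膨胀的示意实现(膨胀半径以栅格步数计,取值为示例假设):

```python
from collections import deque

def inflate(grid, radius):
    """对栅格地图中权值为 1 的禁飞栅格做膨胀:用广度优先搜索把
    距离原区域 radius 步(四邻域)以内的栅格权值也置 1,
    使膨胀区域同样被标记为不允许飞行器接近的危险区域。"""
    h, w = len(grid), len(grid[0])
    q = deque((r, c, 0) for r in range(h) for c in range(w) if grid[r][c] == 1)
    seen = {(r, c) for r, c, _ in q}
    while q:
        r, c, dist = q.popleft()
        if dist == radius:
            continue
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in seen:
                seen.add((nr, nc))
                grid[nr][nc] = 1  # 膨胀区域的权值同样置 1
                q.append((nr, nc, dist + 1))
    return grid
```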
因为深度图每个像素点都包含了目标的距离信息,包含的信息量过大,如果根据原始深度图获得每个点的三维位置信息,则计算量过大。因此,需要对深度图进行预处理。对深度图进行预处理以降低计算量的方式有很多,例如,在一个实施例中,所述获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图,之后,所述获得每个点的三维位置信息,之前,还可以包括:对所述深度图进行稀疏处理。
根据相机小孔成像模型,远处的障碍会较为集中在图像的中心处,而近处的障碍则因为距离近,在图像中的面积较大,因此,在一个实施例中,所述对所述深度图进行稀疏处理,包括:采用变化步长对所述深度图进行稀疏处理,其中,所述变化步长用于控制深度图中的像素点从边缘到中心逐步增多。
具体实施时,可以从图像边界出发做不等距的稀疏处理,使得靠近图像中心的像素点较为稠密,而图像边缘的像素点则较为稀疏。稀疏处理的伪代码如下所示:
Figure PCTCN2018112077-appb-000006
其中:
img_height与img_width分别为图像的高度和宽度;
i_step与j_step为遍历图像的步长,初始值都为1;
height_step与width_step分别为图像纵向和横向的稀疏因子;
HandleImage()表示的是对深度图的后续处理。
原始深度图,或者经过稀疏处理后的稀疏深度图,要经过坐标转换才能转换到导航坐标系下,才能作为障碍物的信息更新到局部导航图中。因此,在一个实施例中,所述根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息,包括:对所述深度图进行坐标转换,获得在导航坐标系中的各个点;根据导航坐标系中的各个点、所述当前飞行位置以及所述姿态信息,获得每个点的三维位置信息。
坐标转换的方式有很多,例如,在一个实施例中,所述对所述深度图进行坐标转换,获得在导航坐标系中的各个点,包括:根据相机内参矩阵,将所述深度图中每个点转化为相机坐标系下的各个点;根据相机坐标系到机体坐标系的转换矩阵
Figure PCTCN2018112077-appb-000007
将相机坐标系下的各个点转化为机体坐标系的 各个点;根据机体坐标系到导航坐标系的转换矩阵
Figure PCTCN2018112077-appb-000008
将机体坐标系下的各个点转换为导航坐标系的各个点。
由于深度图本身存在噪声以及障碍物可能会移动(例如树叶随风飘动),使用深度图计算出来的点云也会存在噪声点,这些噪声会随着上述步骤的循环进行而在局部导航图中累积,导致对障碍物的错误测量,简称误测。为了减小误测概率,在一个实施例中,所述将每个点的三维位置信息按照各自设定的权值投影到局部导航图中,之后,还包括:对所述局部导航图中预设区域内的每个点的权值进行衰减;获得衰减后每个子区域中所有点的权值和。先对局部导航图中的预设区域内的点的权值进行衰减,然后再计算每个子区域的总权值,判断每个子区域是否为障碍物区域,从而降低噪声对障碍物判断的影响,提高障碍物区域判断的准确性。
在一个实施例中,所述对所述局部导航图中预设区域内的每个点的权值进行衰减,包括:将所述预设区域内的每个点的权值与预设衰减因子相乘。衰减因子可以根据经验设定。如果预设区域刚好包括N个子区域,则可以根据下述公式进行衰减操作:
map_value*=damping_factor。
其中,map_value表示预设区域内的一个子区域的总权值,damping_factor表示衰减因子。
预设区域确定的方式有很多,例如,在一个实施例中,所述预设区域根据所述局部导航图的中心、所述飞行器中用于获取深度图的双目系统的水平视场角以及设定衰减距离确定。
如图2所示,为一具体实施例的预设区域确定方法的示意图,其中,O表示局部导航图的地图中心,也即是飞行器的当前飞行位置,θ表示双目系统视场角的大小,由双目系统的参数确定,d表示衰减距离,为根据经验设定的固定值,那么由以上三个参数所确定的扇形区域即为衰减区域,对该衰减区域内的点的权值进行衰减,而衰减区域外的点的权值无需进行衰减。
需要说明的是,图2仅对飞行器中前方安装双目系统的衰减区域进行示意,如果飞行器后方或者侧面也安装有双目系统,则在图示衰减区域对称的位置或者侧面的位置也设置衰减区域,对衰减区域内的点的权值进行衰减,即衰减区域与双目系统安装的方式相关。如果通过其它设备获取深度图,则确定预设区域的方式与基于双目系统确定预设区域的构思相同。
基于同一发明构思,本申请还提供一种避障装置,下面结合附图对本申请装置的具体实施方式进行详细介绍。
如图7所示,在一个实施例中,一种避障装置,包括:
第一信息获取模块210,设置为获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图。
飞行器可以为植保无人机等等。当前飞行位置指的是飞行器在当前时刻所处的地理位置,例如飞行器的经纬度信息等。姿态信息指的是飞行器在当前时刻的飞行姿态,例如俯仰角、横滚角和偏航角等等。飞行器的当前飞行位置和姿态信息均可以通过飞行器的飞控获取。深度图为拍摄到的目标的二维图像,所述深度图包括每个点与所述当前飞行位置的距离信息,即深度图中每个像素点的灰度值用于表征所拍摄目标与飞行器当前位置的距离信息。实际应用中,深度图可以通过飞行器的双目系统探测到。在进行局部避障时,所采用的深度图可以为多张深度图,也可以为单张深度图,本申请并不对此做出限定。另外,深度图的大小可以根据需要自行设定,本申请对此也不做出限定。获取到的当前飞行位置、姿态信息和深度图用于局部导航图的建图。
三维位置信息获得模块220,设置为根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息。
由于深度图中每个像素点的灰度值包含了所拍摄目标与所述飞行器的距离信息,因此根据当前飞行位置、姿态信息以及所述深度图中每个像素点的像素坐标以及灰度值,可以计算出每个像素点所对应的三维位置信息,也即是深度图中每一个二维的像素点对应得到一个三维的点。
导航坐标系即当地水平坐标系,是在导航时根据导航系统工作的需要而选取的作为导航基准的坐标系,是用来做导航计算时使用的。考虑到这些三维的点用于飞行器的导航,所以计算出来的三维的点一般指的是导航坐标系下的点。可选的,导航坐标系为北东地坐标系。
由于局部导航图只需要对飞行器附近的障碍物进行指示,而得到的点云(各个三维的点)可能包括较远范围的点,因此可以只保留影响飞行器飞行的点,例如只保留距离飞行器上下一定范围内的点。
投影模块230,设置为将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,其中,所述局部导航图包括若干个子区域。
局部导航图用于为飞行器进行局部导航,其是动态的,地图的中心即飞行器的当前位置,地图的大小可以通过人为或者程序预先设定。当飞行器的当前位置变化时,就移动局部导航地图的中心,同时,对整张地图的内容做一次平移,平移后,超出地图范围的原有内容会被删除,新进的内容会被置零。
按照一定的规则将局部导航图划分为各个子区域,具体的划分规则可以根据实际情况进行设定,本申请并不对此做出限定。例如,在一个实施例中,所述局部导航图可以为栅格地图,每一个栅格为一个子区域。
局部导航图为二维地图,获取到点云后,也即是导航坐标系的点云后,需要将这些点云以一定的权值叠加到局部导航图中,以用于判断具体哪些区域为障碍物区域。计算每个点的权值的方式有很多,例如,在一个实施例中,每个点的所述权值根据预设权值与距离因子的乘积获得,其中,所述距离因子与所述距离信息为正比例关系。
第一区域设置模块240,设置为在子区域中所有点的权值和大于预设阈值时,将所述子区域设置为障碍物区域,以指示所述飞行器对所述障碍物区域实施避障。
由于获得的点可能是障碍物的点,也可能是噪声点等等,因此需要对这些点的权值进行进一步计算,以得到实际的障碍物区域。将各个点按照权值叠加到局部导航图中后,局部导航图中的每个子区域可能包含多个点,对每一个子区域进行总权值计算,得到每个子区域的总权值。如果总权值大于预设阈值,则将该子区域设置为障碍物区域。
在一个实施例中,避障装置还包括与所述投影模块230相连的通行区域设置模块,设置为在子区域中所有点的权值和小于等于预设阈值时,将所述子区域设置为通行区域,以允许所述飞行器通过。如果子区域的总权值小于等于预设阈值,意味着该子区域不是真正的障碍物区域,因此就可以将该子区域标识为通行区域。
第二信息获取模块250,设置为获取用户设定的用于指示障碍物区域和作业边界区域的测绘数据,以及用于指示所述局部导航图中障碍物区域的三维位置信息。
全局避障规划时数据来源有两方面:一是用户的测绘数据(包含边界以及障碍物),二是局部导航图的数据,局部导航图的数据会以一定的周期更新到全局导航图中。
测绘数据可以是用户手动测绘的数据,也可以是用户通过地图界面选择的各个数据。一般情况下,测绘数据包含障碍物区域的边界点以及作业边界区域的边界点。在飞行器起飞之前,通过数据链路就可以将用户测绘的地图数据上传到飞行器中,例如,上传到飞行器的双目系统中,用于全局导航图的建图运算。
三维位置信息为局部导航图中已设置为障碍物区域的数据。局部导航图会以一定的周期进行更新,在更新的同时,确定为障碍物区域的位置可以选择填放至一个障碍物队列里面。该障碍物队列可以删除也可以添加:当在局部导航图中确定为障碍物时,就将障碍物区域的信息添加到队列里面;障碍物移动或者消失时,则将障碍物区域的信息从队列里面删除。以一定的周期,将障碍物队列里面的障碍物区域的信息更新到全局导航图中,这样全局导航图也包含了双目系统探测出来的信息。需要说明的是,本申请并不限定于通过队列的形式对全局导航图的数据进行更新,用户还可以根据需要通过其他形式将局部导航图中的障碍物区域的信息更新至全局导航图中。
将局部导航图的障碍物区域的信息更新到全局导航图中,那么在进行全局规划时就可以直接避开障碍物,可以在全局范围内找到最短路径,从而避免只有测绘数据而导致的找到的路径无法避开未测绘的障碍物。
第二区域设置模块260,设置为在预设的全局导航图中设置障碍物区域和作业边界区域,以指示所述飞行器对所述障碍物区域和所述作业边界区域实施避障。
全局导航图可以为栅格地图。全局导航图比局部导航图大很多,局部导航图可以很小,只要满足局部避障需求,全局导航图需要把飞行器当次飞行的飞行范围都包含在内。获得测绘数据后,就可以根据测绘数据所包含的作业边界区域的位置信息以及障碍物区域的位置信息,在全局导航图中标识出相应的区域。例如,根据测绘数据将全局导航图中的某一些栅格的权值设置为1。获得局部导航图中的障碍物区域的位置信息后,就对全局导航图进行更新,将全局导航图中相应的位置更新为障碍物区域。设置好障碍物区域和作业边界区域后,飞行器根据该全局导航图就可以对障碍物区域和作业边界区域进行避障。
全局导航图在飞行器起飞作业前就得完成初始化。初始化的内容为确定全局导航图的大小以及中心点位置。在一个实施例中,所述预设的全局导航图的中心和大小根据所述飞行器起飞前的位置以及所述测绘数据获得。初始化全局导航图的信息来自于用户设定的测绘数据。地图中心所表示的地理位置以及地图的大小在初始化的时候就已经确定。确定这些信息后,就可以为全局导航图分配存储空间,根据障碍物的地理位置确定障碍物信息的存储位置,方便存储和访问障碍物信息。
在一个实施例中,所述全局导航图的水平边界由所述位置和所述测绘数据在Y轴上的最大值和最小值膨胀后确定,所述全局导航图的竖直边界由所述位置和所述测绘数据在X轴上的最大值和最小值膨胀后确定。具体的,从测绘数据和飞行器起飞前的位置信息中找到在Y轴上的最大值,将该最大值膨胀一定的距离后得到全局导航图上面的水平边界;从测绘数据和飞行器起飞前的位置信息中找到在Y轴上的最小值,将该最小值膨胀一定的距离后得到全局导航图下面的水平边界;从测绘数据和飞行器起飞前的位置信息中找到在X轴上的最大值,将该最大值膨胀一定的距离后得到全局导航图右边的竖直边界;从测绘数据和飞行器起飞前的位置信息中找到在X轴上的最小值,将该最小值膨胀一定的距离后得到全局导航图左边的竖直边界,其中膨胀的距离可以根据实际需要进行设定。得到全局导航图的边界信息后,就可以获得全局导航图的大小以及中心点的位置,完成全局导航图的初始化。
根据测绘数据和局部导航图中的数据设置全局导航图中障碍物区域和作业边界区域的方式有很多,例如,在一个实施例中,所述第二区域设置模块260根据获取的所述测绘数据以及三维位置信息,获得第一障碍物区域和第一作业边界区域;对所述第一障碍物区域和所述第一作业边界区域进行膨胀,获得第二障碍物区域和第二作业边界区域;将所述第二障碍物区域和所述第二作业边界区域设置为用于指示飞行器实施避障的区域。对第一障碍物区域和第一作业边界区域膨胀的距离可以根据实际需要进行设置,这些膨胀的区域也是危险区域,禁止飞行器通行,因此也需要设置为飞行器实施避障的区域,以使飞行器保持与障碍物和作业区域边界的安全距离。
需要说明的是,本申请并不限制于上述设置全局导航图中障碍物区域和作业边界区域的方式,用户也可以直接根据测绘数据和局部导航图中的障碍物信息在全局导航图中设置禁止飞行器通行的区域,而不进行膨胀等,或者只对障碍物区域或者作业边界区域进行膨胀。另外,在膨胀时,可以将各个方向膨胀的距离设置为相同,也可以针对各个方向分别设置不同的膨胀距离。
因为深度图每个像素点都包含了目标的距离信息,包含的信息量过大,如果根据原始深度图获得每个点的三维位置信息,则计算量过大。因此,需要对深度图进行预处理。对深度图进行预处理以降低计算量的方式有很多,例如,在一个实施例中,避障装置还可以包括连接在所述第一信息获取模块210以及所述三维位置信息获得模块220之间的稀疏处理模块,所述稀疏处理模块设置为对所述深度图进行稀疏处理。
根据相机小孔成像模型,远处的障碍会较为集中在图像的中心处,而近处的障碍则因为距离近,在图像中的面积较大,因此,在一个实施例中,稀疏处理模块采用变化步长对所述深度图进行稀疏处理,其中,所述变化步长用于控制深度图中的像素点从边缘到中心逐步增多。
原始深度图,或者经过稀疏处理后的稀疏深度图,要经过坐标转换才能转换到导航坐标系下,才能作为障碍物的信息更新到局部导航图中。因此,在一个实施例中,所述三维位置信息获得模块220对所述深度图进行坐标转换,获得在导航坐标系中的各个点;根据导航坐标系中的各个点、所述当前飞行位置以及所述姿态信息,获得每个点的三维位置信息。
坐标转换的方式有很多,例如,在一个实施例中,三维位置信息获得模块220根据相机内参矩阵,将所述深度图中每个点转化为相机坐标系下的各个点;根据相机坐标系到机体坐标系的转换矩阵
Figure PCTCN2018112077-appb-000009
将相机坐标系下的各个点转化为机体坐标系的各个点;根据机体坐标系到导航坐标系的转换矩阵
Figure PCTCN2018112077-appb-000010
将机体坐标系下的各个点转换为导航坐标系的各个点。
由于深度图本身存在噪声以及障碍物可能会移动(例如树叶随风飘动),使用深度图计算出来的点云也会存在噪声点,这些噪声会随着上述步骤的循环进行而在局部导航图中累积,导致对障碍物的错误测量,简称误测。为了减小误测概率,在一个实施例中,避障装置还包括连接在投影模块230和第一区域设置模块240(和/或通行区域设置模块)之间的衰减模块,所述衰减模块设置为对所述局部导航图中预设区域内的每个点的权值进行衰减;获得衰减后每个子区域中所有点的权值和。衰减模块先对局部导航图中的预设区域内的点的权值进行衰减,然后再计算每个子区域的总权值。基于该总权值判断每个子区域是否为障碍物区域,从而降低噪声对障碍物判断的影响,提高障碍物区域判断的准确性。
在一个实施例中,所述衰减模块将所述预设区域内的每个点的权值与预设衰减因子相乘。衰减因子可以根据经验设定。如果预设区域刚好包括N个子区域,则可以分别将每个子区域的总权值与衰减因子相乘。
预设区域确定的方式有很多,例如,在一个实施例中,所述预设区域根据所述局部导航图的中心、所述飞行器中用于获取深度图的双目系统的水平视场角以及设定衰减距离确定。衰减区域与双目系统安装的方式相关。如果通过其它设备获取深度图,则确定预设区域的方式与基于双目系统确定预设区域的构思相同。
本申请还提供一种终端,该终端可以为飞行器或者其他设备,其包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现上述导航图配置方法和自动避障方法中所述方法的步骤。
本申请还提供了一种存储介质,所述存储介质包括存储的程序,其中,在所述程序运行时控制所述存储介质所在设备执行上述第一方面所述的导航图配置方法。
本申请还提供了一种存储介质,所述存储介质包括存储的程序,其中,在所述程序运行时控制所述存储介质所在设备执行上述第二方面所述的避障方法。
图8是本申请一个实施例的一种无人飞行器800的结构示意图。如图8所示,无人飞行器800包括控制器810,所述控制器810以有线或无线方式与一个或多个传感器或感测系统801a-c连接。所述传感器可以通过控制器局域网(controller area network,CAN)与所述控制器连接。所述控制器810也可以与一个或多个致动器820连接以控制所述无人飞行器的状态。
所述传感器可以包括本说明书描述的任意传感器,例如惯性传感器、GPS接收器、指南针、RTK定位传感器、磁力计、高度计、距离传感器(例如红外线传感器或激光雷达传感器)、视觉或图像传感器(例如相机或摄像机)、光电传感器、运动传感器、触控传感器、压力传感器、温度传感器、磁传感器等等。
所述惯性传感器,也称IMU,可以设置为确定飞行器的姿态信息,包括三轴陀螺仪、三轴加速度传感器、三轴地磁传感器和气压计等,其中三轴陀螺仪、三轴加速度传感器、三轴地磁传感器中的三轴指的就是无人机左右、前后、垂直方向上下这三个轴:三轴陀螺仪主要负责测量XYZ三个轴的倾角;三轴加速度传感器负责测量无人机XYZ三个轴的加速度;地磁传感器感知地磁,可以让无人机知道自己的机头和飞行朝向,找到任务位置;气压计可通过测量不同位置的气压,计算压差获得当前的高度。通过组合上述传感器,IMU惯性测量单元可以感知无人机姿态的变化,例如无人机当前是前倾还是左右倾斜,以及机头朝向、高度等最基本的姿态数据。
所述图像传感器可以设置为确定无人飞行器各个方向的障碍物信息,所述图像传感器包括双目系统,该双目系统至少包括两个摄像头,通过图像处理算法可以确定物体的三维信息,构建物体的三维模型。
在某些实施例中,可以将一些传感器(例如视觉传感器)与现场可编程门阵列(field programmable gate array,FPGA,图上未示出)连接。可以将所述现场可编程门阵列与所述控制器连接(例如通过通用存储控制器(general purpose memory controller,GPMC)连接)。在某些实施例中,可以将一些传感器(例如视觉传感器)及/或所述现场可编程门阵列与传输模块连接。所述传输模块可以用来将所述传感器获取的数据(例如图像数据)传送给任意适合的外部设备或系统,例如本说明书描述的终端或远程设备。
所述控制器可以包括一个或多个可编程处理器(例如中央处理器)。所述控制器可以与存储介质(如非易失性计算机可读介质)830连接。所述存储介质可以包括一个或多个存储单元(例如可移动介质或外部存储器,如SD卡或随机存储器)。在某些实施例中,来自于所述传感器(例如相机)的数据可以直接传送及存储于所述存储介质的存储单元中(例如通过直接内存访问连接(DMA))。所述存储介质的存储单元可以存储代码及/或程序指令。所述控制器执行该代码及/或程序指令,以执行本说明书描述的方法实施例。例如,所述控制器可以执行指令,使得所述控制器的一个或多个处理器分析一个或多个传感器或感测系统产生的数据,以确定本说明书描述的所述无人飞行器的方位及/或运动信息、检测的外部接触信息及/或检测的外部信号信息。又如,所述控制器可以执行指令,使得所述控制器的一个或多个处理器决定是否控制所述无人飞行器自主起飞或降落。
所述存储介质830的存储单元存储来自于所述一个或多个感测系统的感测数据,该感测数据将由所述控制器处理。在某些实施例中,所述存储单元可以存储所述无人飞行器方位及/或运动信息、检测的外部接触信息及/或检测的外部信号信息。可选地或结合地,所述存储单元可以存储用以控制所述无人飞行器的预定或预存的数据(例如预定的感测数据的阈值、用以控制所述致动器的参数、所述无人飞行器的预定飞行路径、速度、加速度或方向)。
如前所述,所述控制器810可以通过一个或多个致动器820调整所述无人飞行器的状态。例如,所述控制器可以控制所述无人飞行器的转子(例如控制转子的旋转速度),因而调整所述无人飞行器或其部件(例如负载、负载的载体)相对于多达六个自由度(沿X、Y及Z轴的平移运动及横滚轴、俯仰轴及航向轴的旋转运动)的空间布局。可选地或结合地,所述控制器可以调整所述无人飞行器相对于六个自由度的速度或加速度。在某些实施例中,所述控制器可以基于预定的控制数据或所述无人飞行器的位置、外部接触或外部信号信息来控制所述无人飞行器。通过处理来自于一个或多个感测系统的感测数据,可以获得所述无人飞行器的方位、外部接触或外部信号信息。例如,所述控制器可以基于是否需要起飞或降落来为所述致动器提供加速或减速信号。
在不同的实施例中,所述致动器可以包括电机、电子调速器、机械传动装置、液压传动装置、气压传动装置等等。所述电机可以包括磁力电机、静电电机或压电电机。例如,在某个实施例中,所述致动器包括有刷或无刷直流电机。
所述控制器可以与通信模块840连接,用以传送及/或接收来自于一个或多个外部设备(例如终端、显示设备、地面控制装置或其他遥控器)的数据。所述通信模块可以使用任意适用的通信方式,例如有线通信或无线通信。例如,所述通信模块可以采用一个或多个局域网、广域网、红外线、无线电波、WiFi、点对点(point-to-point,P2P)网络、电信网络、云通信等等。可选地,可以采用中继站,例如发射塔、卫星或移动站。所述无线通信可以受距离的影响也可以不受距离的影响。在某些实施例中,可以在视线之内通信也可以在视线之外通信。所述通信模块可以传送及/或接收来自于所述感测系统的一个或多个感测数据、方位及/或运动信息、通过处理所述感测数据获得的外部接触信息及/或外部信号信息、预定的控制数据、来自于终端或遥控器的用户命令等等。
所述无人飞行器的部件可以进行任意适合的配置。例如,该无人飞行器的一个或多个部件可以设置在所述无人飞行器、载体、负载、终端、感测系统或任意与上述一个或多个设备相通信的其他远程设备或系统上。此外,尽管图8描述单个控制器及单个存储介质,本领域的技术人员应当知道,该描述并非对所述无人飞行器的限制,所述无人飞行器可以包括多个控制器及/或存储介质。在某些实施例中,所述多个控制器及/或存储介质中的一个或多个可以设置在不同位置,例如在所述无人飞行器、载体、负载、终端、感测系统或任意与上述一个或多个设备相通信的其他远程设备或系统或其适当的组合上,使得所述无人飞行器便于在上述一个或多个位置执行处理及/或存储功能。
所述无人飞行器包括但不限于单旋翼飞行器、多旋翼飞行器及旋翼飞行器。旋翼飞行器通常利用螺旋桨绕杆或轴旋转产生升力。所述旋翼飞行器包括例如直升机、滚翼机、自转旋翼机、旋翼式直升飞机等等。所述旋翼飞行器可以有多个安装在所述飞行器的多个位置的转子。例如,所述无人飞行器可以包括四旋翼直升机、六旋翼直升机、十旋翼直升机等等。
在不同的实施例中,所述无人飞行器可以相对于六个自由度(例如三个平移自由度及三个旋转自由度)自由运动。或者,所述无人飞行器可以限制在一个或多个自由度运动,例如限制在预定轨道或轨迹。所述运动可以由任意适合的驱动机制驱动,例如由引擎或电机驱动。在某些实施例中,所述无人飞行器可以受推进系统驱动。推进系统可以包括例如引擎、电机、轮子、轮轴、磁铁、转子、螺旋桨、桨叶、喷嘴或任何适合的上述部件的组合。可以由任意适合的能源,例如电能、磁能、太阳能、风能、重力能、化学能、核能或任何适合的能源的组合为所述无人飞行器的运动提供动力。
在不同的实施例中,所述无人飞行器可以采用不同的大小、尺寸及/或结构。例如,在一个实施例中,所述无人飞行器可以是多旋翼无人飞行器,反向转动的转子的轴间距不超过某一阈值。所述阈值可以是大约5m、4m、3m、2m、1m等等。例如,所述反向转动的转子的轴间距的数值可以是350mm、450mm、800mm、900mm等等。
在某些实施例中,所述无人飞行器的大小及/或尺寸足以容纳一个人在其中或其上。或者,所述无人飞行器的大小及/或尺寸不足以容纳一个人在其中或其上。在某些情况下,所述无人飞行器的最大的尺寸(例如长、宽、高、直径、对角线)不超过5m、4m、3m、2m、1m、0.5m或0.1m。例如,所述反向转动的转子的轴间距可以不超过5m、4m、3m、2m、1m、0.5m或0.1m。在某些实施例中,所述无人飞行器的体积可以小于100cm x 100cm x 100cm。在某些实施例中,所述无人飞行器的体积可以小于50cm x 50cm x 30cm。在某些实施例中,所述无人飞行器的体积可以小于5cm x 5cm x 3cm。在某些实施例中,所述无人飞行器的占地面积(所述无人飞行器的横截面的面积)可以小于大约32,000cm²、20,000cm²、10,000cm²、1,000cm²、500cm²、100cm²或更小。在某些情况下,所述无人飞行器的重量可以不超过1000kg、500kg、100kg、10kg、5kg、1kg或者0.5kg。
在不同的实施例中,所述无人飞行器可以搭载载荷。所述载荷可以包括一个或多个货物、装置、仪器等等。所述载荷可以有壳体。可选地,所述载荷的部分或整个可以没有壳体。所述载荷可以相对于所述无人飞行器刚性固定。或者,所述载荷可以相对于所述无人飞行器运动(例如相对于所述无人飞行器平移或旋转)。
在某些实施例中,所述载荷包括负载及搭载所述负载的载体,例如,药箱。所述载体可以与所述无人飞行器一体成型。或者,所述载体可以可拆卸地连接到所述无人飞行器。所述载体可以与所述无人飞行器直接或间接连接。所述载体可以支撑所述负载(例如至少支撑所述负载的部分重量)。所述载体可以包括适合的安装结构(例如云台),能够稳定及/或控制所述负载的运动。在某些实施例中,所述载体可以适用于控制所述负载相对于所述无人飞行器的状态(例如位置及/或方向)。例如,所述载体可以相对于所述无人飞行器运动(例如相对于一个、两个或三个平移自由度及/或一个、两个或三个旋转自由度运动),使得所述负载相对于适合的参考坐标系保持其位置及/或方向而不受所述无人飞行器运动的影响。所述参考坐标系可以是固定参考坐标系(例如周围环境)。或者,所述参考坐标系可以是运动参考坐标系(例如所述无人飞行器、负载)。
在某些实施例中,所述载体可以使得所述负载相对于所述载体及/或无人飞行器运动。所述运动可以是相对于达到三个自由度(例如沿一个、两个或三个轴)的平移、相对于达到三个自由度(例如沿一个、两个或三个轴)的旋转或者其任意组合。例如,所述载体可以包括框架组件及致动器组件。所述框架组件可以为所述负载提供结构支撑。所述框架组件可以包括多个单独的框架部件,其中一些框架部件可以相互运动。
所述框架组件及/或单独的框架部件可以与驱动组件连接,该驱动组件驱使所述框架组件运动。所述驱动组件可以包括一个或多个致动器(例如电机),设置为驱使所述单独的框架部件运动。所述致动器可以使得多个框架部件同时运动或每次只有一个框架部件运动。所述框架部件的运动可以使得所述负载相应运动。例如,所述驱动组件可以驱使一个或多个框架部件绕一个或多个旋转轴(例如横滚轴、俯仰轴或航向轴)旋转。所述一个或多个框架部件的旋转可以使得负载相对于所述无人飞行器绕一个或多个旋转轴旋转。可选地或结合地,所述驱动组件可以驱使一个或多个框架部件沿一个或多个平移轴平移,从而使所述负载相对于所述无人飞行器沿一个或多个对应的平移轴平移。
所述负载可以通过所述载体与所述无人飞行器直接(例如直接接触所述无人飞行器)或间接(例如不接触所述无人飞行器)连接。可选地,所述负载可以无需载体安装在所述无人飞行器上。所述负载可以与所述载体形成一个整体。或者,所述负载可以可拆卸地与所述载体连接。在某些实施例中,所述负载可以包括一个或多个负载元件,如前所述,所述负载元件可以相对于所述无人飞行器及/或载体运动。所述负载可以包括设置为测量一个或多个目标的一个或多个传感器。所述负载可以包含任意适合的传感器,例如图像获取设备(如相机)、声音获取设备(如抛物面麦克风)、红外线成像设备或紫外线成像设备。所述传感器可以提供静态感测数据(例如照片)或动态感测数据(例如视频)。在某些实施例中,所述传感器将感测数据提供给所述负载的感测对象。可选地或结合地,所述负载可以包括一个或多个发射器,设置为将信号提供给一个或多个感测对象。所述发射器可以是任意适合的发射器,例如光源或声源。在某些实施例中,所述负载包括一个或多个收发器,例如用于与远离所述无人飞行器的模组通信。在本申请实施例中,通过调用存储在存储介质830内的程序,控制器810设置为当按照所述飞行航线进行飞行时,获取第二地图数据及飞行位置;对所述第一地图数据的作业区域和所述第二地图数据进行匹配,以计算所述飞行位置偏离所述飞行航线的飞行偏移量;依据所述飞行偏移量进行飞行修正,以修正至所述飞行航线。
可选地,控制器810还设置为:
获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中。
可选地,控制器810还设置为:
获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,其中,所述局部导航图包括若干个子区域;
若子区域中所有点的权值和大于预设阈值,将所述子区域设置为障碍物区域,以指示所述飞行器对所述障碍物区域实施避障;
获取用户设定的用于指示障碍物区域和作业边界区域的测绘数据,以及用于指示所述局部导航图中障碍物区域的三维位置信息;
在预设的全局导航图中设置障碍物区域和作业边界区域,以指示所述飞行器对所述障碍物区域和所述作业边界区域实施避障。
上述导航图配置方法、自动避障方法以及装置、终端、无人飞行器,动态生成以飞行器当前飞行位置为中心的局部导航图,根据飞行器在飞行过程中获取到的位置信息、姿态信息以及深度图,分析出每个点的三维位置信息,该每个点的三维位置信息可能是飞行器在飞行过程中所遇到的未知障碍物的信息,也可能是飞行器在飞行过程中所遇到的事先未被规划的其它物体的信息,将各个点的三维位置信息投影到局部导航图中,就可以根据该局部导航图进行实时作业航线规划。由于作业航线根据飞行器飞行过程中获取的信息动态生成,因此可以有效应对临时的任务变化,例如,对事先未规划作业航线的一片作业区域同时进行作业,或者,自动对未知障碍物所在区域进行避障等等。另外,将局部导航图的障碍物区域的信息更新到全局导航图中,那么在进行全局规划时就可以直接避开障碍物,可以在全局范围内找到最短路径,从而避免只有测绘数据而导致的找到的路径无法避开未测绘的障碍物。
在本申请各实施例中的各功能单元可集成在一个处理模块中,也可以各个单元单独物理存在,也可以两个或两个以上单元集成于一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取的存储介质中。所述存储介质包括但不限于任何类型的盘(包括软盘、硬盘、光盘、CD-ROM和磁光盘)、ROM(Read-Only Memory,只读存储器)、RAM(Random Access Memory,随机存储器)、EPROM(Erasable Programmable Read-Only Memory,可擦写可编程只读存储器)、EEPROM(Electrically Erasable Programmable Read-Only Memory,电可擦可编程只读存储器)、闪存、磁性卡片或光线卡片。也就是,存储介质包括由设备(例如,计算机)以能够读取的形式存储或传输信息的任何介质,可以是只读存储器、磁盘或光盘等。
本技术领域技术人员可以理解,本申请中已经讨论过的各种操作、方法、流程中的步骤、措施、方案可以被交替、更改、组合或删除。可选地,具有本申请中已经讨论过的各种操作、方法、流程中的其他步骤、措施、方案也可以被交替、更改、重排、分解、组合或删除。可选地,现有技术中与本申请中公开的各种操作、方法、流程中的步骤、措施、方案相同或相似的步骤、措施、方案也可以被交替、更改、重排、分解、组合或删除。
以上所述仅是本申请的部分实施方式,应当指出,对于本技术领域的普通技术人员来说,在不脱离本申请原理的前提下,还可以做出若干改进和润饰,这些改进和润饰也应视为本申请的保护范围。
工业实用性
本申请提供的方案,可应用于无人机导航领域,导航图配置方法,包括步骤:获取所述无人飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中。本申请的方案能够动态生成作业航线,有效应对临时的任务变化。

Claims (34)

  1. 一种导航图配置方法,包括步骤:
    获取飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
    根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
    将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中。
  2. 根据权利要求1所述的导航图配置方法,其中,所述局部导航图包括若干个子区域;
    所述将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,之后,还包括:
    若子区域中所有点的权值和大于预设阈值,将所述子区域设置为障碍物区域,以指示所述飞行器实施避障。
  3. 根据权利要求2所述的导航图配置方法,其中,所述将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,之后,还包括:
    若子区域中所有点的权值和小于等于预设阈值,将所述子区域设置为通行区域,以允许所述飞行器通过。
  4. 根据权利要求1所述的导航图配置方法,其中,所述深度图包括每个点与所述当前飞行位置的距离信息;每个点的所述权值根据预设权值与距离因子的乘积获得,其中,所述距离因子与所述距离信息为正比例关系。
  5. 根据权利要求2所述的导航图配置方法,其中,所述将每个点的三维位置信息按照各自设定的权值投影到局部导航图中,之后,还包括:
    对所述局部导航图中预设区域内的每个点的权值进行衰减;
    获得衰减后每个子区域中所有点的权值和。
  6. 根据权利要求5所述的导航图配置方法,其中,所述对所述局部导航图中预设区域内的每个点的权值进行衰减,包括:
    将所述预设区域内的每个点的权值与预设衰减因子相乘。
  7. 根据权利要求5所述的导航图配置方法,其中,所述预设区域根据所述局部导航图的中心、所述飞行器中用于获取深度图的双目系统的水平视场角以及设定衰减距离确定。
  8. 根据权利要求1所述的导航图配置方法,其中,所述根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息,包括:
    对所述深度图进行坐标转换,获得在导航坐标系中的各个点;
    根据导航坐标系中的各个点、所述当前飞行位置以及所述姿态信息,获得每个点的三维位置信息。
  9. 根据权利要求8所述的导航图配置方法,其中,所述对所述深度图进行坐标转换,获得在导航坐标系中的各个点,包括:
    根据相机内参矩阵,将所述深度图中每个点转化为相机坐标系下的各个点;
    根据相机坐标系到机体坐标系的转换矩阵,将相机坐标系下的各个点转化为机体坐标系的各个点;
    根据机体坐标系到导航坐标系的转换矩阵,将机体坐标系下的各个点转换为导航坐标系的各个点。
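权利要求8与9所述"深度图像素 → 相机坐标系 → 机体坐标系 → 导航坐标系"的转换链,可用如下 Python 片段示意(各矩阵的取值、函数名与参数均为示意性假设):

```python
import numpy as np

def depth_pixel_to_nav(u, v, depth, K, R_cb, R_bn, t_nav):
    """将深度图中一个像素 (u, v) 及其深度值转换为导航坐标系下的三维点。

    K:    相机内参矩阵
    R_cb: 相机坐标系到机体坐标系的旋转矩阵
    R_bn: 机体坐标系到导航坐标系的旋转矩阵(由姿态信息得到)
    t_nav: 当前飞行位置(导航坐标系下的平移)
    """
    p_cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))  # 相机坐标系
    p_body = R_cb @ p_cam                                       # 机体坐标系
    p_nav = R_bn @ p_body + t_nav                               # 导航坐标系
    return p_nav
```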
  10. 根据权利要求1所述的导航图配置方法,其中,所述获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图,之后,所述获得每个点的三维位置信息,之前,还包括:
    对所述深度图进行稀疏处理。
  11. 根据权利要求10所述的导航图配置方法,其中,所述对所述深度图进行稀疏处理,包括:
    采用变化步长对所述深度图进行稀疏处理,其中,所述变化步长用于控制深度图中的像素点从边缘到中心逐步增多。
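权利要求10与11所述采用变化步长的稀疏处理(像素点从边缘到中心逐步增多),可示意如下(Python,仅演示一行像素的采样;步长策略与参数均为示意性假设):

```python
def sparse_indices(width, min_step=1, max_step=8):
    """按变化步长对一行像素采样:离中心越近步长越小(保留点越密),靠近边缘步长越大。

    返回被保留像素的列索引列表。
    """
    center = width // 2
    kept, u = [], 0
    while u < width:
        kept.append(u)
        # 步长随"到中心的归一化距离"线性增大
        ratio = abs(u - center) / center
        u += max(min_step, int(round(min_step + (max_step - min_step) * ratio)))
    return kept
```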
  12. 根据权利要求2至11任意一项所述的导航图配置方法,其中,所述局部导航图为栅格地图,每一个栅格为一个子区域。
  13. 一种避障方法,包括步骤:
    获取飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
    根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
    将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,其中,所述局部导航图包括若干个子区域;
    若子区域中所有点的权值和大于预设阈值,将所述子区域设置为障碍物区域,以指示所述飞行器对所述障碍物区域实施避障;
    获取用户设定的用于指示障碍物区域和作业边界区域的测绘数据,以及用于指示所述局部导航图中障碍物区域的三维位置信息;
    在预设的全局导航图中设置障碍物区域和作业边界区域,以指示所述飞行器对所述障碍物区域和所述作业边界区域实施避障。
  14. 根据权利要求13所述的避障方法,其中,所述在预设的全局导航图中设置障碍物区域和作业边界区域,包括:
    根据获取的所述测绘数据以及三维位置信息,获得第一障碍物区域和第一作业边界区域;
    对所述第一障碍物区域和所述第一作业边界区域进行膨胀,获得第二障碍物区域和第二作业边界区域;
    将所述第二障碍物区域和所述第二作业边界区域设置为用于指示飞行器实施避障的区域。
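权利要求14所述对第一障碍物区域/第一作业边界区域进行膨胀的操作,在栅格地图上可示意为按安全半径向四周扩展若干栅格(Python,半径取值为示意性假设):

```python
def inflate(cells, radius=1):
    """对一组障碍物/边界栅格做膨胀:每个栅格向四周扩展 radius 个栅格。

    cells: {(row, col), ...};返回膨胀后的栅格集合
    """
    grown = set(cells)
    for r, c in cells:
        for dr in range(-radius, radius + 1):
            for dc in range(-radius, radius + 1):
                grown.add((r + dr, c + dc))
    return grown
```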
  15. 根据权利要求13所述的避障方法,其中,所述预设的全局导航图的中心和大小根据所述飞行器起飞前的位置以及所述测绘数据获得。
  16. 根据权利要求15所述的避障方法,其中,所述全局导航图的水平边界由所述位置和所述测绘数据在Y轴上的最大值和最小值膨胀后确定,所述全局导航图的竖直边界由所述位置和所述测绘数据在X轴上的最大值和最小值膨胀后确定。
  17. 根据权利要求13所述的避障方法,其中,所述将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,之后,还包括:
    若子区域中所有点的权值和小于等于预设阈值,将所述子区域设置为通行区域,以允许所述飞行器通过。
  18. 根据权利要求13所述的避障方法,其中,所述深度图包括每个点与所述当前飞行位置的距离信息;每个点的所述权值根据预设权值与距离因子的乘积获得,其中,所述距离因子与所述距离信息为正比例关系。
  19. 根据权利要求13所述的避障方法,其中,所述将每个点的三维位置信息按照各自设定的权值投影到局部导航图中,之后,还包括:
    对所述局部导航图中预设区域内的每个点的权值进行衰减;
    获得衰减后每个子区域中所有点的权值和。
  20. 根据权利要求19所述的避障方法,其中,所述对所述局部导航图中预设区域内的每个点的权值进行衰减,包括:
    将所述预设区域内的每个点的权值与预设衰减因子相乘。
  21. 根据权利要求19所述的避障方法,其中,所述预设区域根据所述局部导航图的中心、所述飞行器中用于获取深度图的双目系统的水平视场角以及设定衰减距离确定。
  22. 根据权利要求13所述的避障方法,其中,所述根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息,包括:
    对所述深度图进行坐标转换,获得在导航坐标系中的各个点;
    根据导航坐标系中的各个点、所述当前飞行位置以及所述姿态信息,获得每个点的三维位置信息。
  23. 根据权利要求22所述的避障方法,其中,所述对所述深度图进行坐标转换,获得在导航坐标系中的各个点,包括:
    根据相机内参矩阵,将所述深度图中每个点转化为相机坐标系下的各个点;
    根据相机坐标系到机体坐标系的转换矩阵,将相机坐标系下的各个点转化为机体坐标系的各个点;
    根据机体坐标系到导航坐标系的转换矩阵,将机体坐标系下的各个点转换为导航坐标系的各个点。
  24. 根据权利要求13所述的避障方法,其中,所述获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图,之后,所述获得每个点的三维位置信息,之前,还包括:
    对所述深度图进行稀疏处理。
  25. 根据权利要求24所述的避障方法,其中,所述对所述深度图进行稀疏处理,包括:
    采用变化步长对所述深度图进行稀疏处理,其中,所述变化步长用于控制深度图中的像素点从边缘到中心逐步增多。
  26. 根据权利要求13至25任意一项所述的避障方法,其中,所述局部导航图和所述全局导航图为栅格地图,每一个栅格为一个子区域。
  27. 一种飞行器导航图配置装置,包括:
    信息获取模块,设置为获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
    三维位置信息获得模块,设置为根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
    投影模块,设置为将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中。
  28. 一种避障装置,包括:
    第一信息获取模块,设置为获取所述飞行器的当前飞行位置、姿态信息以及在当前飞行位置探测到的深度图;
    三维位置信息获得模块,设置为根据所述当前飞行位置、所述姿态信息以及所述深度图,获得每个点的三维位置信息;
    投影模块,设置为将每个点的三维位置信息按照各自设定的权值投影到以所述当前飞行位置为中心的局部导航图中,其中,所述局部导航图包括若干个子区域;
    第一区域设置模块,设置为在子区域中所有点的权值和大于预设阈值时,将所述子区域设置为障碍物区域,以指示所述飞行器对所述障碍物区域实施避障;
    第二信息获取模块,设置为获取用户设定的用于指示障碍物区域和作业边界区域的测绘数据,以及用于指示所述局部导航图中障碍物区域的三维位置信息;
    第二区域设置模块,设置为在预设的全局导航图中设置障碍物区域和作业边界区域,以指示所述飞行器对所述障碍物区域和所述作业边界区域实施避障。
  29. 一种终端,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现权利要求1-12中任意一项所述方法的步骤。
  30. 一种终端,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现权利要求13-26中任意一项所述方法的步骤。
  31. 一种存储介质,所述存储介质包括存储的程序,其中,在所述程序运行时控制所述存储介质所在设备执行权利要求1-12中任意一项所述的导航图配置方法。
  32. 一种存储介质,所述存储介质包括存储的程序,其中,在所述程序运行时控制所述存储介质所在设备执行权利要求13-26中任意一项所述的避障方法。
  33. 一种无人飞行器,包括通信模块、传感器、控制器、存储介质;所述传感器包括图像传感器、GPS接收器、RTK定位传感器、惯性传感器,
    所述通信模块,设置为与地面控制装置进行通信;
    所述GPS接收器和RTK定位传感器,设置为确定无人飞行器的当前飞行位置;
    所述惯性传感器,设置为确定无人飞行器的姿态信息;
    所述图像传感器,设置为在当前飞行位置探测深度图;
    所述控制器与所述存储介质连接,所述存储介质设置为存储程序,所述程序运行时用于执行权利要求1-12任一项所述方法的步骤。
  34. 一种无人飞行器,包括通信模块、传感器、控制器、存储介质;所述传感器包括图像传感器、GPS接收器、RTK定位传感器、惯性传感器,
    所述通信模块,设置为与地面控制装置进行通信;
    所述GPS接收器和RTK定位传感器,设置为确定无人飞行器的当前飞行位置;
    所述惯性传感器,设置为确定无人飞行器的姿态信息;
    所述图像传感器,设置为在当前飞行位置探测深度图;
    所述控制器与所述存储介质连接,所述存储介质设置为存储程序,所述程序运行时用于执行权利要求13-26任一项所述方法的步骤。
PCT/CN2018/112077 2017-10-26 2018-10-26 导航图配置方法、避障方法以及装置、终端、无人飞行器 WO2019080924A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
EP18871786.2A EP3702731A4 (en) 2017-10-26 2018-10-26 NAVIGATION DIAGRAM CONFIGURATION METHOD, OBSTACLE AVOIDANCE PROCESS AND DEVICE, TERMINAL, UNPILOT AIR VEHICLE
AU2018355491A AU2018355491B2 (en) 2017-10-26 2018-10-26 Method for configuring navigation chart, obstacle avoidance method and device, terminal, unmanned aerial vehicle
US16/641,763 US20200394924A1 (en) 2017-10-26 2018-10-26 Method for Configuring Navigation Chart, Method for Avoiding Obstacle and Device, Terminal and Unmanned Aerial Vehicle
KR1020207005722A KR102385820B1 (ko) 2017-10-26 2018-10-26 내비게이션 차트 구성 방법, 장애물 회피 방법 및 장치, 단말기, 무인 항공기
JP2020517902A JP2020535545A (ja) 2017-10-26 2018-10-26 ナビゲーションチャート構成方法、障害物回避方法及び装置、端末、無人航空機

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711021920.1 2017-10-26
CN201711021920.1A CN109708636B (zh) 2017-10-26 2017-10-26 导航图配置方法、避障方法以及装置、终端、无人飞行器

Publications (1)

Publication Number Publication Date
WO2019080924A1 true WO2019080924A1 (zh) 2019-05-02

Family

ID=66247759

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/112077 WO2019080924A1 (zh) 2017-10-26 2018-10-26 导航图配置方法、避障方法以及装置、终端、无人飞行器

Country Status (7)

Country Link
US (1) US20200394924A1 (zh)
EP (1) EP3702731A4 (zh)
JP (1) JP2020535545A (zh)
KR (1) KR102385820B1 (zh)
CN (1) CN109708636B (zh)
AU (1) AU2018355491B2 (zh)
WO (1) WO2019080924A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111950524A (zh) * 2020-08-28 2020-11-17 广东省现代农业装备研究所 一种基于双目视觉和rtk的果园局部稀疏建图方法和系统
CN113012479A (zh) * 2021-02-23 2021-06-22 欧阳嘉兰 一种基于障碍物分析的飞行限重测量方法、装置及系统
CN113077551A (zh) * 2021-03-30 2021-07-06 苏州臻迪智能科技有限公司 占据栅格地图构建方法、装置、电子设备和存储介质
CN113448326A (zh) * 2020-03-25 2021-09-28 北京京东乾石科技有限公司 机器人定位方法及装置、计算机存储介质、电子设备
CN113485359A (zh) * 2021-07-29 2021-10-08 北京超维世纪科技有限公司 一种工业类巡检机器人多传感器融合避障系统

Families Citing this family (30)

Publication number Priority date Publication date Assignee Title
CN110262556A (zh) * 2019-07-12 2019-09-20 黑梭智慧技术(北京)有限公司 快递物流无人飞行器航线设计方法和装置
CN110471421B (zh) * 2019-08-27 2022-03-18 广州小鹏汽车科技有限公司 一种车辆安全行驶的路径规划方法及路径规划系统
CN112313476A (zh) * 2019-11-05 2021-02-02 深圳市大疆创新科技有限公司 无人飞行器的航线规划方法和装置
US11244164B2 (en) * 2020-02-03 2022-02-08 Honeywell International Inc. Augmentation of unmanned-vehicle line-of-sight
JP7412037B2 (ja) * 2020-02-20 2024-01-12 株式会社ナイルワークス ドローンシステム、操作器および作業エリアの定義方法
US20210300551A1 (en) * 2020-03-25 2021-09-30 Tencent America LLC Systems and methods for unmanned aerial system communication
CN113465614B (zh) * 2020-03-31 2023-04-18 北京三快在线科技有限公司 无人机及其导航地图的生成方法和装置
CN111854754B (zh) * 2020-06-19 2023-01-24 北京三快在线科技有限公司 无人机航线规划方法、装置、无人机及存储介质
CN112033413B (zh) * 2020-09-07 2023-06-16 北京信息科技大学 一种基于结合环境信息的改进a*算法的路径规划方法
CN112066976B (zh) * 2020-09-07 2023-06-16 北京信息科技大学 一种自适应膨胀处理方法、系统、机器人及存储介质
CN112116643A (zh) * 2020-09-14 2020-12-22 哈工大机器人(合肥)国际创新研究院 一种基于tof相机深度图和点云图的避障处理方法及系统
CN112416018B (zh) * 2020-11-24 2021-07-09 广东技术师范大学 基于多信号采集与路径规划模型的无人机避障方法和装置
CN112859893B (zh) * 2021-01-08 2024-07-26 中国商用飞机有限责任公司北京民用飞机技术研究中心 一种飞行器避障方法、装置
CN113086227A (zh) * 2021-03-30 2021-07-09 武汉学院 矢量共轴手持云台一体无人机及其智能系统
CN113310493B (zh) * 2021-05-28 2022-08-05 广东工业大学 一种基于事件触发机制的无人机实时导航方法
CN113465606A (zh) * 2021-06-30 2021-10-01 三一机器人科技有限公司 末端工位定位方法、装置及电子设备
CN115222808B (zh) * 2021-06-30 2023-10-20 达闼机器人股份有限公司 基于无人机的定位方法、装置、存储介质和电子设备
CN113532471B (zh) * 2021-07-15 2024-07-19 浙江东进航科信息技术有限公司 一种多源飞行轨迹数据融合的处理方法、设备及介质
CN114088094A (zh) * 2021-09-27 2022-02-25 华中光电技术研究所(中国船舶重工集团公司第七一七研究所) 一种无人艇的智能航路规划方法及系统
CN113867349B (zh) * 2021-09-28 2024-04-09 浙江大华技术股份有限公司 一种机器人的避障方法、系统及智能机器人
CN113642092B (zh) * 2021-10-18 2022-01-04 西南交通大学 一种建筑空间路径捕获方法
WO2023070667A1 (zh) * 2021-11-01 2023-05-04 深圳市大疆创新科技有限公司 可移动平台及用于处理其数据的方法和装置、终端设备
CN114313243B (zh) * 2021-12-19 2023-06-02 四川省天域航通科技有限公司 一种植保用避障无人机
KR102622623B1 (ko) * 2022-05-12 2024-01-10 한국광기술원 3차원 영상 정보를 제공하기 위한 이동형 영상 촬영 장치, 이에 대한 방법 및 이를 포함하는 시스템
CN114879704B (zh) * 2022-07-11 2022-11-25 山东大学 一种机器人绕障控制方法及系统
CN115150784B (zh) * 2022-09-02 2022-12-06 汕头大学 基于基因调控网络的无人机集群区域覆盖方法及设备
WO2024195211A1 (ja) * 2023-03-17 2024-09-26 日本電気株式会社 制御装置、制御方法およびプログラム
CN116757582B (zh) * 2023-08-18 2023-11-17 山西汇能科技有限公司 基于无人机的物流配送系统及方法
CN116907511B (zh) * 2023-09-12 2023-12-05 北京宝隆泓瑞科技有限公司 一种将管道坐标转换为图像坐标的方法
CN118470237B (zh) * 2024-07-12 2024-09-17 中航材导航技术(北京)有限公司 自动生成飞行程序标准仪表图的方法

Citations (5)

Publication number Priority date Publication date Assignee Title
EP1975646A2 (en) * 2007-03-28 2008-10-01 Honeywell International Inc. Lader-based motion estimation for navigation
CN105571588A (zh) * 2016-03-10 2016-05-11 赛度科技(北京)有限责任公司 一种无人机三维空中航路地图构建及其航路显示方法
CN105910604A (zh) * 2016-05-25 2016-08-31 武汉卓拔科技有限公司 一种基于多传感器的自主避障导航系统
CN106595659A (zh) * 2016-11-03 2017-04-26 南京航空航天大学 城市复杂环境下多无人机视觉slam的地图融合方法
CN106931961A (zh) * 2017-03-20 2017-07-07 成都通甲优博科技有限责任公司 一种自动导航方法及装置

Family Cites Families (21)

Publication number Priority date Publication date Assignee Title
JP4061596B2 (ja) * 2004-05-20 2008-03-19 学校法人早稲田大学 移動制御装置、環境認識装置及び移動体制御用プログラム
JP5233432B2 (ja) * 2008-06-16 2013-07-10 アイシン・エィ・ダブリュ株式会社 運転支援システム、運転支援方法及び運転支援プログラム
JP5093020B2 (ja) * 2008-09-18 2012-12-05 トヨタ自動車株式会社 レーダ装置
CN102359784B (zh) * 2011-08-01 2013-07-24 东北大学 一种室内移动机器人自主导航避障系统及方法
CN103576686B (zh) * 2013-11-21 2017-01-18 中国科学技术大学 一种机器人自主导引及避障的方法
US9772712B2 (en) * 2014-03-11 2017-09-26 Textron Innovations, Inc. Touch screen instrument panel
WO2016015251A1 (en) * 2014-07-30 2016-02-04 SZ DJI Technology Co., Ltd. Systems and methods for target tracking
JP6278539B2 (ja) * 2014-09-05 2018-02-14 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd 状況に基づく飛行モード選択
CN104236548B (zh) * 2014-09-12 2017-04-05 清华大学 一种微型无人机室内自主导航方法
JP6387782B2 (ja) * 2014-10-17 2018-09-12 ソニー株式会社 制御装置、制御方法及びコンピュータプログラム
US9399524B2 (en) * 2014-10-21 2016-07-26 Honeywell International Inc. System and method for displaying runway landing information
KR101736089B1 (ko) * 2015-01-08 2017-05-30 서울대학교산학협력단 깊이 지도를 이용한 사물 형상 맵핑 및 실시간 유도를 위한 무인기 비행 제어 장치 및 방법
US9470528B1 (en) * 2015-03-26 2016-10-18 Honeywell International Inc. Aircraft synthetic vision systems utilizing data from local area augmentation systems, and methods for operating such aircraft synthetic vision systems
WO2017071143A1 (en) * 2015-10-30 2017-05-04 SZ DJI Technology Co., Ltd. Systems and methods for uav path planning and control
CN105678754B (zh) * 2015-12-31 2018-08-07 西北工业大学 一种无人机实时地图重建方法
CN105761265A (zh) * 2016-02-23 2016-07-13 英华达(上海)科技有限公司 利用影像深度信息提供避障的方法及无人飞行载具
CN114610049A (zh) * 2016-02-26 2022-06-10 深圳市大疆创新科技有限公司 用于修改无人飞行器自主飞行的系统和方法
CN105955258B (zh) * 2016-04-01 2018-10-30 沈阳工业大学 基于Kinect传感器信息融合的机器人全局栅格地图构建方法
JP6327283B2 (ja) * 2016-04-06 2018-05-23 トヨタ自動車株式会社 車両用情報提供装置
CN106780592B (zh) * 2016-06-30 2020-05-22 华南理工大学 基于相机运动和图像明暗的Kinect深度重建方法
CN106127788B (zh) * 2016-07-04 2019-10-25 触景无限科技(北京)有限公司 一种视觉避障方法和装置


Non-Patent Citations (3)

Title
CHEN, BAO-HUA ET AL.: "Instant Dense 3D Reconstruction-Based UAV Vision Localization", ACTA ELECTRONICA SINICA, vol. 45, no. 6, 30 June 2017 (2017-06-30), pages 1294 - 1300, XP055683181, ISSN: 0372-2112, DOI: 10.3969/j.issn.0372-2112.2017.06.003 *
See also references of EP3702731A4 *
YANG, WEI ET AL.: "A Fast Autonomous Obstacle Avoidance Algorithm Based on RGB-D Camera", JOURNAL OF HUNNAN UNIVERSITY OF TECHNOLOGY, vol. 29, no. 6, 30 November 2015 (2015-11-30), pages 74 - 79, XP009519863, ISSN: 1673-9833, DOI: 10.3969/j.issn.1673-9833.2015.06.015 *

Cited By (6)

Publication number Priority date Publication date Assignee Title
CN113448326A (zh) * 2020-03-25 2021-09-28 北京京东乾石科技有限公司 机器人定位方法及装置、计算机存储介质、电子设备
CN111950524A (zh) * 2020-08-28 2020-11-17 广东省现代农业装备研究所 一种基于双目视觉和rtk的果园局部稀疏建图方法和系统
CN111950524B (zh) * 2020-08-28 2024-03-29 广东省现代农业装备研究所 一种基于双目视觉和rtk的果园局部稀疏建图方法和系统
CN113012479A (zh) * 2021-02-23 2021-06-22 欧阳嘉兰 一种基于障碍物分析的飞行限重测量方法、装置及系统
CN113077551A (zh) * 2021-03-30 2021-07-06 苏州臻迪智能科技有限公司 占据栅格地图构建方法、装置、电子设备和存储介质
CN113485359A (zh) * 2021-07-29 2021-10-08 北京超维世纪科技有限公司 一种工业类巡检机器人多传感器融合避障系统

Also Published As

Publication number Publication date
US20200394924A1 (en) 2020-12-17
JP2020535545A (ja) 2020-12-03
AU2018355491B2 (en) 2022-03-17
KR20200031165A (ko) 2020-03-23
CN109708636B (zh) 2021-05-14
AU2018355491A1 (en) 2020-06-11
CN109708636A (zh) 2019-05-03
EP3702731A4 (en) 2021-07-28
KR102385820B1 (ko) 2022-04-12
EP3702731A1 (en) 2020-09-02

Similar Documents

Publication Publication Date Title
WO2019080924A1 (zh) 导航图配置方法、避障方法以及装置、终端、无人飞行器
US10914590B2 (en) Methods and systems for determining a state of an unmanned aerial vehicle
US11237572B2 (en) Collision avoidance system, depth imaging system, vehicle, map generator and methods thereof
US20210247764A1 (en) Multi-sensor environmental mapping
Meyer et al. Comprehensive simulation of quadrotor uavs using ros and gazebo
CN109219785B (zh) 一种多传感器校准方法与系统
US20200007746A1 (en) Systems, methods, and devices for setting camera parameters
US10459445B2 (en) Unmanned aerial vehicle and method for operating an unmanned aerial vehicle
US10240930B2 (en) Sensor fusion
US20190346562A1 (en) Systems and methods for radar control on unmanned movable platforms
ES2889000T3 (es) Métodos y sistema para controlar un objeto móvil
WO2010137596A1 (ja) 移動体制御装置及び移動体制御装置を搭載した移動体
WO2016187758A1 (en) Sensor fusion using inertial and image sensors
US10983535B2 (en) System and method for positioning a movable object
WO2016023224A1 (en) System and method for automatic sensor calibration
US10937325B2 (en) Collision avoidance system, depth imaging system, vehicle, obstacle map generator, and methods thereof
WO2021199449A1 (ja) 位置算出方法及び情報処理システム
US20210229810A1 (en) Information processing device, flight control method, and flight control system
JP2024021143A (ja) 3次元データ生成システム、及び3次元データ生成方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18871786

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20207005722

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2020517902

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018871786

Country of ref document: EP

Effective date: 20200526

ENP Entry into the national phase

Ref document number: 2018355491

Country of ref document: AU

Date of ref document: 20181026

Kind code of ref document: A