WO2019080924A1 - Navigation map configuration method, obstacle avoidance method and device, terminal, and unmanned aerial vehicle - Google Patents
- Publication number: WO2019080924A1
- Application: PCT/CN2018/112077
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
- G08G5/006—Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0004—Transmission of traffic-related information to or from an aircraft
- G08G5/0013—Transmission of traffic-related information to or from an aircraft with a ground station
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
- G08G5/0069—Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/04—Anti-collision systems
- G08G5/045—Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/40—UAVs specially adapted for particular uses or applications for agriculture or forestry operations
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
- B64U2201/104—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
Definitions
- the present application relates to the field of aircraft technology, and in particular to a navigation map configuration method, an obstacle avoidance method, and a corresponding device, terminal, and unmanned aerial vehicle.
- aircraft such as drones have been widely used in aerial photography, agriculture, plant protection, self-portrait photography, express delivery, disaster relief, wildlife observation, infectious disease surveillance, mapping, news reporting, power inspection, and film and television shooting.
- in the prior art, the operating route of the aircraft is a scanning route automatically generated in advance by the ground control device based on the parcel information. Because the scanning route, and even the return path after the operation, is generated before takeoff, there is no way to cope with temporary task changes. For example, while the aircraft is operating along the pre-generated scanning route, the user may need it to also work on an area whose route was not planned in advance, or to automatically avoid unknown obstacles that were not planned for.
- the present application addresses these shortcomings of the prior art and provides a navigation map configuration method, an obstacle avoidance method, and a corresponding device and terminal, which dynamically generate operation routes so as to effectively respond to the temporary task changes that prior-art route planning cannot handle.
- an embodiment of the present application provides a navigation map configuration method, including the steps of:
- the three-dimensional position information of each point is projected, according to its respective set weight, into a local navigation map centered on the current flight position.
- an embodiment of the present application further provides an obstacle avoidance method, including the steps of:
- the sub-area is set as an obstacle area to instruct the aircraft to perform obstacle avoidance on the obstacle area;
- an obstacle area and a work boundary area are set in a preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
- Embodiments of the present application further provide an aircraft navigation map configuration apparatus, including:
- an information acquisition module configured to acquire the current flight position and attitude information of the aircraft, and a depth map detected at the current flight position;
- a three-dimensional position information obtaining module configured to obtain three-dimensional position information of each point according to the current flight position, the posture information, and the depth map;
- a projection module configured to project the three-dimensional position information of each point, according to its respective set weight, into a local navigation map centered on the current flight position.
- an embodiment of the present application further provides an obstacle avoidance device, including:
- a first information acquiring module configured to acquire the current flight position and attitude information of the aircraft, and a depth map detected at the current flight position;
- a three-dimensional position information obtaining module configured to obtain three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map;
- a projection module configured to project the three-dimensional position information of each point, according to its respective set weight, into a local navigation map centered on the current flight position, wherein the local navigation map includes a plurality of sub-regions;
- a first area setting module configured to set a sub-area as an obstacle area when the total weight of all points in the sub-area is greater than a preset threshold, to instruct the aircraft to perform obstacle avoidance on the obstacle area;
- a second information acquiring module configured to acquire mapping data set by the user for indicating an obstacle area and a work boundary area, and three-dimensional position information for indicating an obstacle area in the local navigation map;
- a second area setting module configured to set an obstacle area and a work boundary area in the preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
- Embodiments of the present application further provide a terminal, comprising a memory, a processor, and a computer program stored on the memory and operable on the processor, wherein the processor, when executing the program, performs the steps of the navigation map configuration method described in any of the above embodiments.
- Embodiments of the present application further provide a terminal, comprising a memory, a processor, and a computer program stored on the memory and operable on the processor, wherein the processor, when executing the program, performs the steps of the obstacle avoidance method described in any of the above embodiments.
- According to a seventh aspect, embodiments of the present application further provide a storage medium comprising a stored program, wherein, when the program runs, it controls the device in which the storage medium is located to perform the navigation map configuration method of the first aspect described above.
- Embodiments of the present application further provide a storage medium comprising a stored program, wherein, when the program runs, it controls the device in which the storage medium is located to perform the obstacle avoidance method of the second aspect described above.
- an embodiment of the present application further provides an unmanned aerial vehicle including a communication module, a sensor, a controller, and a storage medium; the sensor includes an image sensor, a GPS receiver, an RTK positioning sensor, and an inertial sensor.
- the communication module is configured to communicate with a ground control device
- the GPS receiver and the RTK positioning sensor are configured to determine the current flight position of the unmanned aerial vehicle
- the inertial sensor is configured to determine attitude information of the unmanned aerial vehicle
- the image sensor is configured to detect a depth map at the current flight position
- the controller is coupled to the storage medium, the storage medium being configured to store a program, the program being operative to perform the steps of the method of the first aspect described above.
- an embodiment of the present application further provides an unmanned aerial vehicle including a communication module, a sensor, a controller, and a storage medium; the sensor includes an image sensor, a GPS receiver, an RTK positioning sensor, and an inertial sensor.
- the communication module is configured to communicate with a ground control device
- the GPS receiver and the RTK positioning sensor are configured to determine the current flight position of the unmanned aerial vehicle
- the inertial sensor is configured to determine attitude information of the unmanned aerial vehicle
- the image sensor is configured to detect a depth map at the current flight position
- the controller is coupled to the storage medium, the storage medium being configured to store a program, the program being operative to perform the steps of the method of the second aspect described above.
- with the navigation map configuration method, the obstacle avoidance method, the device, and the terminal described above, a local navigation map centered on the current flight position of the aircraft is generated dynamically: the three-dimensional position information of each point is computed from the position information, attitude information, and depth map acquired by the aircraft during flight. These three-dimensional points may describe unknown obstacles encountered during flight, or other unplanned objects; they are projected into the local navigation map, and real-time operation route planning can then be performed on that map.
- because the operation route is generated dynamically from information acquired during flight, temporary task changes can be handled effectively, for example working on an area whose route was not planned in advance, or automatically avoiding the area where an unknown obstacle is located.
- FIG. 1 is a schematic flowchart of an embodiment of a navigation map configuration method according to the present application.
- FIG. 2 is a schematic diagram of a specific embodiment of a method for determining a preset area according to the present application.
- FIG. 3 is a schematic structural diagram of an embodiment of an aircraft navigation map configuration apparatus according to the present application.
- FIG. 4 is a schematic flow chart of an embodiment of an obstacle avoidance method according to the present application.
- FIG. 5 is a schematic diagram of a specific embodiment of a method for acquiring a global navigation map boundary according to the present application.
- FIG. 6 is a schematic diagram of a specific embodiment of setting an obstacle area and a work boundary area in a global navigation map according to the present application.
- FIG. 7 is a schematic structural view of an embodiment of an obstacle avoidance device according to the present application.
- FIG. 8 is a schematic structural diagram of an unmanned aerial vehicle 800 according to an embodiment of the present application.
- the aircraft obstacle avoidance system designed by the present application is divided into two parts: a global obstacle avoidance planning part based mainly on a global navigation map, and a local obstacle avoidance planning part based mainly on a local navigation map.
- Both the global navigation map and the local navigation map are used to indicate the flight of the aircraft.
- the global navigation map and the local navigation map are independent of each other; they address different problems, and their mapping strategies also differ (more on this later). The purpose is to suit agricultural applications while reducing resource consumption.
- the global obstacle avoidance plan is used for returning or pointing flight, mainly using global navigation maps, and facing known obstacles.
- in the prior art, the work route is a scanning route automatically generated in advance by the ground station based on the parcel information. Because the scanning route, and even the post-operation return route, is generated before takeoff, there is no way to cope with temporary task changes, such as the pesticide suddenly running out, the battery being almost exhausted, or the user suddenly wanting to fly the aircraft back.
- the global navigation map handles this kind of scene: obstacle-free path planning can be carried out over the whole map at any time. Such planning is long-distance and need not consider spraying. In this scenario the required map area is large, the map granularity need not be very fine, and the map area can be determined before takeoff.
- the local obstacle avoidance plan is used when flying along the work route, or when unknown obstacles are encountered while flying along the globally planned route.
- the local navigation map is mainly used to deal with unknown obstacles encountered during the operation.
- the required map size is small: the aircraft must stick to the original route as closely as possible to minimize missed coverage, so the corresponding planning is generally short-distance, the map can be relatively small, and the map center moves with the aircraft.
- besides obstacle avoidance, the local navigation map designed by the present application can be applied to other purposes, for example working on a fruit-tree area that was not planned in advance; the present application does not limit the scenarios in which the local navigation map is applied.
- a navigation map configuration method includes the steps of:
- the aircraft can be a plant protection drone, etc.
- the current flight position refers to the geographic location of the aircraft at the current time, such as the latitude and longitude information of the aircraft.
- the attitude information refers to the flight attitude of the aircraft at the current time, such as the pitch angle, the roll angle, and the yaw angle.
- the current flight position and attitude information of the aircraft can be obtained by the flight control of the aircraft.
- the depth map is a two-dimensional image of the captured target in which the gray value of each pixel represents the distance between the captured object and the aircraft's current position. In practice, the depth map can be detected by the aircraft's binocular (stereo) system.
- the depth map used may be a plurality of depth maps or a single depth map, which is not limited in this application.
- the size of the depth map can be set according to needs, and the present application does not limit this.
- in one embodiment, the detected depth map is a 640×480 image.
- the obtained current flight position, attitude information, and depth map are used for the construction of the local navigation map.
- since each pixel in the depth map carries the distance between the captured object and the aircraft, the three-dimensional position corresponding to each pixel can be calculated from its pixel coordinates and gray value together with the current flight position and attitude information; that is, each two-dimensional pixel in the depth map corresponds to a three-dimensional point.
- the navigation coordinate system, that is, the local horizontal coordinate system, is a coordinate system selected as the navigation reference according to the needs of the navigation system, and is used for navigation calculation. Since these three-dimensional points are used for navigation of the aircraft, the calculated points generally refer to points in the navigation coordinate system.
- in one embodiment, the navigation coordinate system is a north-east-down (NED) coordinate system.
- the obtained point cloud may include points far away, so only the points that affect the flight of the aircraft need be retained, for example only the points within a certain range of the aircraft.
- the local navigation map is used for local navigation of the aircraft. It is dynamic: the center of the map is the current position of the aircraft, and the size of the map can be preset manually or by program.
- as the aircraft moves, the center of the local navigation map moves with it, and the content of the entire map is translated accordingly. After the translation, content that falls outside the map range is deleted, and the newly added cells are set to zero.
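The map translation described above can be sketched as follows (a minimal illustration; the function name `recenter_local_map` and the NumPy grid representation are assumptions, not from the patent):

```python
import numpy as np

def recenter_local_map(grid, shift_rc):
    """Translate the local navigation map when its center (the aircraft) moves.

    grid: 2-D array of sub-area weights; shift_rc: (rows, cols) the center moved.
    Content shifted beyond the map range is deleted; newly added cells are zero.
    """
    h, w = grid.shape
    dr, dc = shift_rc
    new = np.zeros_like(grid)
    # copy the overlapping region of the old map into the shifted map
    new[max(0, -dr):min(h, h - dr), max(0, -dc):min(w, w - dc)] = \
        grid[max(0, dr):min(h, h + dr), max(0, dc):min(w, w + dc)]
    return new
```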
- the local navigation map is a two-dimensional map. After the point cloud in the navigation coordinate system is acquired, the points need to be superimposed into the local navigation map with certain weights for dynamically planning the operation route. There are many ways to calculate the weight of each point. For example, in one embodiment, the weight of each point is the product of a preset weight and a distance factor, where the distance factor is proportional to the distance information:
- point_weight = point_weight_com * distance_factor
- point_weight is the weight of a point;
- point_weight_com is the common weight of the points, that is, the preset weight, which can be set empirically and is the same for all points;
- distance_factor is the distance-related factor; it is proportional to the distance, that is, it increases linearly as the distance information increases and decreases linearly as it decreases.
- the distance information is distance information represented by a gray value of each pixel in the aforementioned depth map.
- the distance factor is proportional to the distance information because a distant object yields only a small number of point-cloud points, so each of its points should carry a greater weight than a nearby point.
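The per-point weight computation can be illustrated as follows (a hedged sketch; the function name and the values of `common_weight` and `k` are assumed example values, and a strictly proportional distance factor is used):

```python
def point_weight(distance_m, common_weight=0.1, k=0.05):
    """point_weight = point_weight_com * distance_factor.

    common_weight plays the role of point_weight_com (set empirically, the
    same for all points); the distance factor is proportional to the
    distance, since a distant object yields fewer point-cloud points.
    """
    distance_factor = k * distance_m  # grows linearly with distance
    return common_weight * distance_factor
```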
- the operation route planning can then be performed according to the local navigation map; for example, obstacle avoidance is performed, a newly detected area is operated on, and the like.
- in one embodiment, the local navigation map includes a plurality of sub-areas. After the three-dimensional position information of each point is projected, according to its respective set weight, into the local navigation map centered on the current flight position, the method further comprises: if the total weight of all points in a sub-area is greater than a preset threshold, setting the sub-area as an obstacle area to instruct the aircraft to perform obstacle avoidance.
- the local navigation map is divided into sub-areas, and the specific division rules can be set according to actual conditions, which is not limited in this application.
- the local navigation map can be a raster map, and each grid is a sub-region.
- each sub-area in the local navigation map may contain multiple points; the total weight of each sub-area is accumulated according to the following formula:
- map_value += point_weight
- where map_value represents the total weight of a sub-area.
- the preset threshold can be set according to experience, for example 1; the specific form indicating an obstacle area can also be set according to actual needs.
- a grid weight of 0 indicates that the position is free and the aircraft can pass, while a grid weight of 1 indicates that there is an obstacle at the position and the aircraft needs to go around it. The sub-area can then be set according to the following rule: if the accumulated total weight of a grid exceeds the preset threshold, its weight is set to 1.
- in one embodiment, after the three-dimensional position information of each point is projected, according to its respective set weight, into the local navigation map centered on the current flight position, the method further includes: if the total weight of all points in a sub-area is less than or equal to the preset threshold, setting the sub-area as a passable area to allow the aircraft to pass. A total weight at or below the threshold means the sub-area is not a real obstacle area, so it can be marked as passable. For example, for a raster map, if the sum of the weights of all points in a grid is less than 1, the weight of the grid can be set to 0 to indicate that the aircraft can pass.
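The accumulation and thresholding described above might look like this for a raster map (an illustrative sketch; the helper name, grid representation, and cell indexing are assumptions):

```python
import numpy as np

def update_grid(points_xy, weights, grid_shape, cell_size, origin_xy, threshold=1.0):
    """Accumulate point weights per grid cell (map_value += point_weight),
    then mark each cell: 1 (obstacle, aircraft must bypass) if the total
    exceeds the threshold, else 0 (free, aircraft may pass)."""
    acc = np.zeros(grid_shape)
    for (x, y), w in zip(points_xy, weights):
        r = int((y - origin_xy[1]) / cell_size)
        c = int((x - origin_xy[0]) / cell_size)
        if 0 <= r < grid_shape[0] and 0 <= c < grid_shape[1]:
            acc[r, c] += w  # map_value += point_weight
    return (acc > threshold).astype(float)
```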
- the depth map needs to be preprocessed.
- the method further includes: performing sparse processing on the depth map.
- distant obstacles are concentrated at the center of the image, while near obstacles appear larger because of their proximity. Therefore, in one embodiment, the depth map is thinned using a changing step size, where the step size controls the sampled pixels so that they become gradually denser from the edge toward the center of the image. In other words, unequally spaced sparse processing is performed starting from the image boundary, so that pixels near the center of the image are dense and pixels at the image edge are sparse.
- the pseudo code for sparse processing is as follows:
- img_height and img_width are the height and width of the image, respectively;
- i_step and j_step are the step sizes for traversing the image, with initial values of 1;
- height_step and width_step are the sparse factors of the image in the vertical and horizontal directions, respectively;
- HandleImage() represents the subsequent processing of the depth map.
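Since the pseudocode itself is not reproduced here, the following is a hedged sketch of the variable-step thinning idea using the variables described above; the exact step-update rule is an assumption:

```python
def sparsify(depth, height_step=0.1, width_step=0.1):
    """Sample the depth map with a stride that grows with the distance from
    the image center, so sampled pixels are dense near the center and sparse
    at the edges. height_step / width_step act as the sparse factors."""
    img_height, img_width = len(depth), len(depth[0])
    kept = []
    i = 0
    while i < img_height:
        j = 0
        while j < img_width:
            kept.append((i, j))  # HandleImage() would process depth[i][j] here
            j += 1 + int(width_step * abs(j - img_width // 2))
        i += 1 + int(height_step * abs(i - img_height // 2))
    return kept
```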
- the original depth map, or the sparse depth map after thinning, can be converted into the navigation coordinate system by coordinate transformation, and the obstacle information can then be updated into the local navigation map. Therefore, in one embodiment, obtaining the three-dimensional position information of each point from the current flight position, attitude information, and depth map includes: performing coordinate conversion on the depth map to obtain the points in the navigation coordinate system; and obtaining the three-dimensional position information of each point according to the points in the navigation coordinate system, the current flight position, and the attitude information.
- in one embodiment, performing coordinate conversion on the depth map to obtain the points in the navigation coordinate system includes: converting each point in the depth map into a point in the camera coordinate system according to the camera intrinsic matrix; converting each point in the camera coordinate system into a point in the body coordinate system according to the transformation matrix from the camera coordinate system to the body coordinate system; and converting each point in the body coordinate system into a point in the navigation coordinate system according to the transformation matrix from the body coordinate system to the navigation coordinate system.
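The three-step conversion chain can be sketched as follows (assuming a pinhole camera model; the function name and parameter names are illustrative, not from the patent):

```python
import numpy as np

def depth_pixel_to_nav(u, v, depth_m, K, R_cam2body, R_body2nav, t_nav):
    """Pixel -> camera frame (via the camera intrinsic matrix K), then
    camera -> body frame, then body -> navigation frame (adding the
    aircraft position t_nav)."""
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    p_cam = np.array([(u - cx) * depth_m / fx,   # camera coordinate system
                      (v - cy) * depth_m / fy,
                      depth_m])
    p_body = R_cam2body @ p_cam                  # body coordinate system
    return R_body2nav @ p_body + t_nav           # navigation coordinate system
```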
- the weight of the points in a preset area of the local navigation map is attenuated before the total weight of each sub-area is calculated to determine whether it is an obstacle area; this reduces the influence of noise on the obstacle judgment and improves the accuracy of identifying obstacle areas.
- in one embodiment, attenuating the weight of each point in the preset area of the local navigation map comprises multiplying the weight of each point in the preset area by a preset attenuation factor.
- the attenuation factor can be set empirically. If the preset area covers exactly N sub-areas, the attenuation operation can be performed for each of them according to the following formula:
- map_value *= damping_factor
- where map_value represents the total weight of a sub-area within the preset area and damping_factor represents the attenuation factor.
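The attenuation step can be illustrated as follows (a minimal sketch; the data structure and function name are assumptions):

```python
def attenuate(map_values, region_cells, damping_factor=0.5):
    """Apply map_value *= damping_factor to every sub-area in the preset
    (attenuation) region; cells that keep receiving points recover their
    weight on the next update, while noise fades away."""
    for cell in region_cells:
        map_values[cell] *= damping_factor
    return map_values
```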
- in one embodiment, the preset area is determined by the center of the local navigation map, the horizontal field of view of the binocular system in the aircraft used to acquire the depth map, and a preset attenuation distance.
- O represents the map center of the local navigation map, that is, the current flight position of the aircraft;
- the field-of-view angle represents the size of the horizontal field of view of the binocular system;
- d denotes the attenuation distance, a fixed value set according to experience. The sector area determined by these three parameters is the attenuation area: the weights of the points inside it are attenuated, while the weights of points outside it need not be attenuated.
- FIG. 2 shows only the attenuation area for a binocular system installed at the front of the aircraft. If a binocular system is also installed at the rear or on a side of the aircraft, a symmetric or side-facing attenuation area is set there as well and the weights of points inside it are likewise attenuated; that is, the attenuation area depends on how the binocular system is installed. If the depth map is acquired by another device, the preset area is determined following the same idea as for the binocular system.
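A membership test for the sector-shaped attenuation area might look like this (a sketch; the `heading_rad` parameter for the sensor's facing direction is an assumed addition):

```python
import math

def in_attenuation_sector(point_xy, center_xy, heading_rad, fov_rad, atten_dist):
    """True if a map point lies in the sector defined by the map center O,
    the binocular horizontal field-of-view angle and the attenuation
    distance d (cf. FIG. 2)."""
    dx = point_xy[0] - center_xy[0]
    dy = point_xy[1] - center_xy[1]
    if math.hypot(dx, dy) > atten_dist:
        return False
    ang = math.atan2(dy, dx) - heading_rad
    ang = (ang + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    return abs(ang) <= fov_rad / 2
```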
- an aircraft navigation map configuration apparatus includes:
- the information acquisition module 110 is configured to acquire a current flight position, attitude information of the aircraft, and a depth map detected at the current flight position.
- the aircraft can be a plant protection drone and so on.
- the current flight position refers to the geographic location of the aircraft at the current time, such as the latitude and longitude information of the aircraft.
- the attitude information refers to the flight attitude of the aircraft at the current time, such as the pitch angle, the roll angle, and the yaw angle.
- the current flight position and attitude information of the aircraft can be obtained by the flight control of the aircraft.
- the depth map is a two-dimensional image of the captured target that includes the distance of each point from the current flight position; that is, the gray value of each pixel in the depth map represents the distance between the captured object and the aircraft's current position. In practice, the depth map can be detected by the aircraft's binocular system.
- the depth map used may be a plurality of depth maps or a single depth map, which is not limited in this application.
- the size of the depth map can be set as needed, and this application does not limit this.
- the obtained current flight position, attitude information, and depth map are used for the construction of the local navigation map.
- the three-dimensional position information obtaining module 120 is configured to obtain three-dimensional position information of each point according to the current flight position, the posture information, and the depth map.
- since each pixel in the depth map carries the distance between the captured object and the aircraft, the three-dimensional position information corresponding to each pixel can be calculated from the pixel coordinates and gray value of each pixel together with the current flight position and attitude information; that is, each two-dimensional pixel in the depth map corresponds to a three-dimensional point.
- the navigation coordinate system that is, the local horizontal coordinate system, is a coordinate system selected as a navigation reference according to the needs of the navigation system during navigation, and is used for navigation calculation. Considering that these three-dimensional points are used for navigation of the aircraft, the calculated three-dimensional points generally refer to points in the navigation coordinate system.
- the navigation coordinate system is a northeast coordinate system.
- the obtained point cloud may include points that are very far away, so only the points affecting the flight of the aircraft may be retained, for example, only points within a certain distance from the aircraft.
- the projection module 130 is configured to project the three-dimensional position information of each point into the partial navigation map centered on the current flight position according to the respective set weights.
- the local navigation map is used for local navigation of the aircraft and is dynamic: the center of the map is the current position of the aircraft, and the size of the map can be preset manually or by a program.
- each time the center of the local navigation map moves, the content of the entire map is translated once; after the translation, the original content beyond the map range is deleted, and the newly added content is set to zero.
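The translate-and-clear behavior described above can be sketched as follows, assuming the local map is a NumPy grid and the center displacement is expressed in grid cells (a hypothetical convention, not fixed by the text):

```python
import numpy as np

def scroll_local_map(grid, dx, dy):
    """Translate the local navigation map when the map center moves by
    (dx, dy) cells. Content shifted beyond the map range is discarded;
    newly exposed cells are set to zero."""
    h, w = grid.shape
    out = np.zeros_like(grid)
    # region of the old map that remains visible after the shift
    src_r = slice(max(0, dy), min(h, h + dy))
    src_c = slice(max(0, dx), min(w, w + dx))
    dst_r = slice(max(0, -dy), min(h, h - dy))
    dst_c = slice(max(0, -dx), min(w, w - dx))
    out[dst_r, dst_c] = grid[src_r, src_c]
    return out
```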
- the local navigation map is a two-dimensional map. After the point cloud, that is, the point cloud in the navigation coordinate system, is acquired, the points need to be superimposed onto the local navigation map with certain weights for dynamically planning the operation route. There are many ways to calculate the weight of each point. For example, in one embodiment, the weight of each point is obtained as the product of a preset weight and a distance factor, where the distance factor is proportional to the distance information.
- the operation route planning can be performed according to the local navigation map, for example, obstacle avoidance is performed, a detected new area is operated, and the like.
- the following is a detailed description of the obstacle avoidance based on the local navigation map.
- the local navigation map includes a plurality of sub-areas; the apparatus further includes an obstacle area setting module connected to the projection module 130, configured to set a sub-area as an obstacle area when the sum of the weights of all points in the sub-area is greater than a preset threshold, so as to instruct the aircraft to implement obstacle avoidance.
- the local navigation map is divided into sub-areas, and the specific division rules can be set according to actual conditions, which is not limited in this application.
- the local navigation map can be a raster map, and each grid is a sub-region.
- each sub-area in the local navigation map may contain multiple points, and the total weight calculation is performed for each sub-area to obtain the total weight of each sub-area. If the total weight is greater than the preset threshold, the sub-area is set as the obstacle area.
- the apparatus further includes a transit area setting module connected to the projection module 130, configured to set a sub-area as a pass area to allow the aircraft to pass when the sum of the weights of all points in the sub-area is less than or equal to a preset threshold. If the total weight of the sub-area is less than or equal to the preset threshold, the sub-area is not a real obstacle area, so it can be identified as a pass area.
- the apparatus may further include a sparse processing module connected between the information acquisition module 110 and the three-dimensional position information obtaining module 120, the sparse processing module being configured to perform sparse processing on the depth map.
- distant obstacles are concentrated at the center of the image, while near obstacles appear larger because they are close; therefore, in one embodiment, the sparse processing module performs sparse processing on the depth map with a varying step size, where the varying step size is used to control the sampled pixel points in the depth map to become gradually denser from the edge toward the center.
- the original depth map, or the sparse depth map after sparse processing, can be converted to the navigation coordinate system by coordinate transformation, and the obstacle information can then be updated to the local navigation map. Therefore, in one embodiment, the three-dimensional position information obtaining module 120 performs coordinate transformation on the depth map to obtain each point in the navigation coordinate system, and obtains the three-dimensional position information of each point according to each point in the navigation coordinate system, the current flight position, and the attitude information.
- the three-dimensional position information obtaining module 120 converts each point in the depth map into a point in the camera coordinate system according to the camera intrinsic matrix; converts each point in the camera coordinate system into a point in the body coordinate system according to the transformation matrix from the camera coordinate system to the body coordinate system; and converts each point in the body coordinate system into a point in the navigation coordinate system according to the transformation matrix from the body coordinate system to the navigation coordinate system.
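A minimal sketch of this transform chain, assuming a standard pinhole intrinsic matrix K and 4x4 homogeneous transformation matrices (the matrix values and function name are illustrative assumptions; the patent only specifies the chain camera → body → navigation):

```python
import numpy as np

def pixel_to_nav(u, v, depth, K, T_cam_to_body, T_body_to_nav):
    """Back-project pixel (u, v) with its depth into the camera frame via
    the intrinsic matrix K, then transform camera -> body -> navigation."""
    x = (u - K[0, 2]) * depth / K[0, 0]
    y = (v - K[1, 2]) * depth / K[1, 1]
    p_cam = np.array([x, y, depth, 1.0])   # homogeneous camera-frame point
    p_body = T_cam_to_body @ p_cam          # camera frame -> body frame
    p_nav = T_body_to_nav @ p_body          # body frame -> navigation frame
    return p_nav[:3]
```

With identity transforms, a pixel at the principal point maps straight onto the camera's optical axis at the given depth.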
- the apparatus further includes an attenuation module coupled between the projection module 130 and the obstacle area setting module (and/or the transit area setting module), the attenuation module being configured to A weight of each point in the preset area in the partial navigation map is attenuated; and a sum of weights of all points in each sub-area after attenuation is obtained.
- the attenuation module first attenuates the weights of the points in the preset area in the local navigation map, and then calculates the total weight of each sub-area. Based on the total weight value, it is judged whether each sub-area is an obstacle area, thereby reducing the influence of noise on the obstacle judgment and improving the accuracy of the obstacle area judgment.
- the attenuation module multiplies the weight of each point within the predetermined area by a predetermined attenuation factor.
- the attenuation factor can be set empirically. If the preset area includes exactly N sub-areas, the total weight of each sub-area can be multiplied by the attenuation factor.
- the preset area is determined based on the center of the partial navigation map, the horizontal field of view of the binocular system used by the aircraft to acquire the depth map, and a preset attenuation distance.
- the attenuation area is related to the way the binocular system is installed. If the depth map is acquired by other devices, the manner of determining the preset area is the same as the concept of determining the preset area based on the binocular system.
- the present application also provides a terminal, which may be an aircraft or another device, including a memory, a processor, and a computer program stored on the memory and operable on the processor, wherein the processor, when executing the program, implements the steps of any of the methods described above.
- an obstacle avoidance method includes the steps of:
- the aircraft can be a plant protection drone and so on.
- the current flight position refers to the geographic location of the aircraft at the current time, such as the latitude and longitude information of the aircraft.
- the attitude information refers to the flight attitude of the aircraft at the current time, such as the pitch angle, the roll angle, and the yaw angle.
- the current flight position and attitude information of the aircraft can be obtained by the flight control of the aircraft.
- the depth map is a two-dimensional image of the captured target that includes the distance of each point from the current flight position; that is, the gray value of each pixel in the depth map represents the distance between the captured object and the aircraft's current position. In practice, the depth map can be detected by the aircraft's binocular system.
- the depth map used may be a plurality of depth maps or a single depth map, which is not limited in this application.
- the size of the depth map can be set according to needs, and the present application does not limit this.
- the detected depth map is a 640*480 size image.
- the obtained current flight position, attitude information, and depth map are used for the construction of the local navigation map.
- since each pixel in the depth map carries the distance between the captured object and the aircraft, the three-dimensional position information corresponding to each pixel can be calculated from the pixel coordinates and gray value of each pixel together with the current flight position and attitude information; that is, each two-dimensional pixel in the depth map corresponds to a three-dimensional point.
- the navigation coordinate system that is, the local horizontal coordinate system, is a coordinate system selected as a navigation reference according to the needs of the navigation system during navigation, and is used for navigation calculation. Considering that these three-dimensional points are used for navigation of the aircraft, the calculated three-dimensional points generally refer to points in the navigation coordinate system.
- the navigation coordinate system is a northeast coordinate system.
- the obtained point cloud may include points that are very far away, so only the points affecting the flight of the aircraft may be retained, for example, only points within a certain distance from the aircraft.
- the local navigation map is used for local navigation of the aircraft and is dynamic: the center of the map is the current position of the aircraft, and the size of the map can be preset manually or by a program.
- each time the center of the local navigation map moves, the content of the entire map is translated once; after the translation, the original content beyond the map range is deleted, and the newly added content is set to zero.
- the local navigation map is divided into sub-areas, and the specific division rules can be set according to actual conditions, which is not limited in this application.
- the local navigation map can be a raster map, and each grid is a sub-region.
- the local navigation map is a two-dimensional map. After the point cloud, that is, the point cloud in the navigation coordinate system, is acquired, the points need to be superimposed onto the local navigation map with certain weights in order to determine which areas are obstacle areas. There are many ways to calculate the weight of each point. For example, in one embodiment, the weight of each point is obtained as the product of a preset weight and a distance factor, where the distance factor is proportional to the distance information, as shown in the following formula:
- point_weight = point_weight_com * distance_factor
- point_weight is the weight of a point
- point_weight_com is the common weight of the point, that is, the preset weight, which can be set empirically; the common weight is the same for all points
- distance_factor is a distance-related factor proportional to the distance, that is, its value increases linearly as the distance information increases and decreases linearly as the distance information decreases.
- the distance information is distance information represented by a gray value of each pixel in the aforementioned depth map.
- the distance factor is proportional to the distance information because a distant object produces only a small number of point-cloud points, so the weight of each of its points should be greater than that of a close-range point.
- each sub-area in the local navigation map may contain multiple points, and a total weight calculation is performed on each sub-area according to the following formula to obtain the total weight of each sub-area:
- map_value = map_value + point_weight
- map_value represents the total weight of a sub-area.
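The two formulas above can be sketched together as follows; the linear slope `k` of the distance factor and the dictionary keyed by grid cell are assumptions for illustration:

```python
def point_weight(point_weight_com, distance, k=1.0):
    """point_weight = point_weight_com * distance_factor, where the
    distance factor grows linearly with distance (slope k is assumed)."""
    distance_factor = k * distance
    return point_weight_com * distance_factor

def accumulate_subareas(points, cell):
    """Sum point weights per grid cell of size `cell`:
    map_value = map_value + point_weight for each point in the cell."""
    map_value = {}
    for x, y, w in points:
        key = (int(x // cell), int(y // cell))
        map_value[key] = map_value.get(key, 0.0) + w
    return map_value
```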
- the preset threshold can be set empirically, for example to 1, and the specific form used to indicate an obstacle area can also be set according to actual needs.
- a grid weight of 0 indicates that the position is free and the aircraft may pass freely, while a grid weight of 1 indicates that there is an obstacle at the position and the aircraft needs to bypass it; each sub-area can then be set accordingly.
- after the three-dimensional position information of each point is projected, according to the respectively set weights, into the partial navigation map centered on the current flight position, the method further includes: if the sum of the weights of all points in a sub-area is less than or equal to a preset threshold, setting the sub-area as a pass area to allow the aircraft to pass. If the total weight of the sub-area is less than or equal to the preset threshold, the sub-area is not a real obstacle area, so it can be identified as a pass area. For example, for a raster map, if the sum of the weights of all the points in a grid is less than 1, the weight of the grid can be set to 0 to indicate that the aircraft can pass.
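The thresholding step can be sketched in a few lines, assuming the per-sub-area totals are held in a dictionary keyed by grid cell and using the example threshold of 1:

```python
def classify_subareas(map_value, threshold=1.0):
    """Mark each sub-area: 1 = obstacle area (total weight above the
    preset threshold, aircraft must bypass), 0 = pass area."""
    return {cell: (1 if total > threshold else 0)
            for cell, total in map_value.items()}
```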
- the data used to construct the global navigation map includes survey (mapping) data containing boundaries and obstacles, and data of the local navigation map; the data of the local navigation map is updated to the global navigation map at a certain period.
- the survey data can be data manually tested by the user, or each data selected by the user through the map interface.
- the survey data contains the boundary points of the obstacle area and the boundary points of the job boundary area.
- the user-mapped map data can be uploaded to the aircraft through the data link, for example, uploaded to the aircraft's binocular system for mapping operations of the global navigation map.
- the three-dimensional position information is data that has been set as an obstacle area in the partial navigation map.
- the local navigation map will be updated in a certain period.
- the position of the obstacle area can be selected and placed in an obstacle queue.
- the obstacle queue supports deletion and addition: when an obstacle is determined in the local navigation map, the information of the obstacle area is added to the queue; when the obstacle moves or disappears, the information of the obstacle area is deleted from the queue.
- the information of the obstacle area in the obstacle queue is updated to the global navigation map in a certain cycle, so that the global navigation map also contains the information detected by the binocular system. It should be noted that the present application is not limited to updating the data of the global navigation map in the form of a queue, and the user may also update the information of the obstacle area in the partial navigation map to the global navigation map by other forms as needed.
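The add/delete maintenance of the queue and its periodic write-back to the global map can be sketched as follows, representing obstacle areas as sets of grid cells (an assumed representation; the patent does not fix the data structure):

```python
def sync_obstacle_queue(queue, detected):
    """Add newly detected obstacle cells to the queue and delete cells
    whose obstacles have moved or disappeared."""
    for cell in detected - queue:   # new obstacle -> add to queue
        queue.add(cell)
    for cell in queue - detected:   # obstacle moved/disappeared -> delete
        queue.discard(cell)
    return queue

def update_global_map(global_map, queue):
    """Periodically write queued obstacle cells into the global grid map
    (weight 1 marks a prohibited cell)."""
    for cell in queue:
        global_map[cell] = 1
    return global_map
```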
- by updating the information of the obstacle areas of the local navigation map to the global navigation map, obstacles can be avoided directly during global planning and the shortest path can be found over the global scope, avoiding the situation in which a path found using only the survey data cannot avoid unmapped obstacles.
- S260 Set an obstacle area and a job boundary area in a preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
- the global navigation map can be a raster map.
- the global navigation map is much larger than the local navigation map.
- the local navigation map can be small.
- the global navigation map needs to include the flight range of the aircraft.
- the corresponding area can be identified in the global navigation map according to the position information of the job boundary area and the position information of the obstacle area included in the survey data. For example, set the weight of some of the grids in the global navigation map to 1 based on the survey data.
- the global navigation map is updated, and the corresponding position in the global navigation map is updated to the obstacle area.
- the aircraft can perform obstacle avoidance on the obstacle area and the work boundary area according to the global navigation map.
- the global navigation map has to be initialized before the aircraft takes off.
- the content of the initialization is to determine the size of the global navigation map and the location of the center point.
- the center and size of the preset global navigation map are obtained based on the position of the aircraft prior to takeoff and the survey data.
- the information for initializing the global navigation map comes from the mapping data set by the user.
- the geographic location represented by the map center and the size of the map are determined at the time of initialization. After determining this information, it is possible to allocate storage space for the global navigation map, and determine the storage location of the obstacle information according to the geographical location of the obstacle, which is convenient for storing and accessing obstacle information.
- the horizontal boundaries of the global navigation map are determined by expanding the maximum and minimum values, on the Y-axis, of the pre-takeoff position and the survey data, and the vertical boundaries of the global navigation map are determined by expanding the maximum and minimum values, on the X-axis, of the pre-takeoff position and the survey data.
- for example, the maximum value on the Y-axis is found from the survey data and the pre-takeoff position information of the aircraft and expanded by a certain distance to obtain the upper horizontal boundary of the global navigation map; the remaining boundaries are obtained from the survey data and the pre-takeoff position in the same way.
- FIG. 5 it is a schematic diagram of a global navigation map boundary acquisition of a specific embodiment.
- d is the expansion distance.
- the global navigation map uses polygons to represent the work area boundary B and the obstacle area O, and the map uploaded to the binocular system contains the position information of the vertices of these polygons.
- from the vertex positions of the work area boundary B and the vertex positions of the obstacle area O, it is calculated that the maximum value on the X-axis in the navigation coordinate system is the maximum X value of the rightmost obstacle area, the minimum value on the X-axis is the X value of the pre-takeoff position of the aircraft, the maximum value on the Y-axis is the uppermost Y value of the work area boundary B, and the minimum value on the Y-axis is the Y value of the pre-takeoff position of the aircraft. The four values calculated above are each expanded by the expansion distance d, giving the boundary of the global navigation map shown in FIG. 5. At this point, the size, boundary, and center point position of the global navigation map are obtained, completing the global map initialization.
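The boundary computation above reduces to a min/max over the pre-takeoff position and all survey vertices, each extreme pushed outward by d. A minimal sketch (the generic min/max over all points is an assumption; FIG. 5 happens to have the takeoff position at the lower-left extreme):

```python
def global_map_bounds(takeoff, survey_points, d):
    """Compute the global navigation map boundary: take the min/max X and
    Y over the pre-takeoff position and all survey vertices, then expand
    each extreme by distance d. Returns (xmin, xmax, ymin, ymax)."""
    xs = [takeoff[0]] + [p[0] for p in survey_points]
    ys = [takeoff[1]] + [p[1] for p in survey_points]
    return (min(xs) - d, max(xs) + d, min(ys) - d, max(ys) + d)
```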
- the setting of the obstacle area and the job boundary area in the preset global navigation map includes: obtaining a first obstacle area and a first job boundary area according to the acquired survey data and the three-dimensional position information; expanding the first obstacle area and the first job boundary area to obtain a second obstacle area and a second job boundary area; and setting the second obstacle area and the second job boundary area as the areas for instructing the aircraft to implement obstacle avoidance.
- the distance by which the first obstacle area and the first job boundary area are expanded may be set according to actual needs. The expanded areas are also dangerous areas in which the passage of the aircraft is prohibited, so they too are set as areas for aircraft obstacle avoidance, keeping the aircraft a safe distance from the obstacle and from the boundary of the work area.
- the present application does not limit the manner of setting the obstacle area and the job boundary area in the global navigation map; the user may also directly set prohibited areas in the global navigation map according to the survey data and the obstacle information in the local navigation map.
- the distances in which the respective directions are expanded may be set to be the same, or different expansion distances may be set for the respective directions.
- FIG. 6 is a schematic diagram of setting an obstacle area and a job boundary area in a global navigation map according to a specific embodiment, wherein the global navigation map is a grid map; when the weight of a grid in the grid map is 1, the grid is an area in which passage is prohibited, and when the weight of the grid is 0, the grid is an area in which passage is allowed. As shown in FIG. 6,
- the weights of the original job boundary area and the original obstacle area in the global navigation map are all set to 1, indicating that those areas are fully occupied by obstacles and the passage of the aircraft is prohibited; a depth-first algorithm or another algorithm is then used to expand the original boundary area and the original obstacle area, and the weight of the expansion area is also set to 1, indicating that the expansion area is likewise a dangerous area that the aircraft is not allowed to approach, which serves to maintain a safe distance from the obstacle.
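The expansion step can be sketched as a simple grid dilation, with a brute-force neighborhood sweep standing in for the depth-first expansion mentioned above (the radius parameter `r`, in cells, is an assumed way to express the safety distance):

```python
def expand_areas(grid, r):
    """Dilate every weight-1 cell by r cells in each direction so the
    expansion area is also marked 1 (prohibited). Returns a new grid."""
    h, w = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for i in range(h):
        for j in range(w):
            if grid[i][j] == 1:
                for di in range(-r, r + 1):
                    for dj in range(-r, r + 1):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w:
                            out[ni][nj] = 1
    return out
```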
- the depth map needs to be preprocessed.
- the method further includes: performing sparse processing on the depth map.
- distant obstacles are concentrated at the center of the image, while near obstacles appear larger because they are close; therefore, in one embodiment, the sparse processing of the depth map is performed with a varying step size, where the varying step size is used to control the sampled pixel points in the depth map to become gradually denser from the edge toward the center.
- the unequal spacing sparse processing can be performed from the image boundary, so that the pixel points near the center of the image are dense, and the pixel points at the edge of the image are sparse.
- the pseudo code for sparse processing is as follows:
- img_height and img_width are the height and width of the image, respectively;
- i_step and j_step are the step sizes for traversing the image, both with initial value 1;
- Height_step and width_step are the sparse factors of the image in the vertical and horizontal directions, respectively;
- HandleImage() represents the subsequent processing of the depth map.
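The body of the pseudocode is not reproduced in the text; the following is a hedged reconstruction of an unequal-spacing traversal consistent with the variable descriptions above (the exact step-update rule is an assumption): the sampling step shrinks toward the image center, so pixels near the center are dense and pixels at the edge are sparse.

```python
def sparse_indices(img_height, img_width, height_step, width_step):
    """Rows/columns to keep: larger steps far from the center (sparse
    edges), step 1 at the center (dense), scaled by the sparse factors."""
    rows, r = [], 0
    while r < img_height:
        rows.append(r)
        r += max(1, abs(r - img_height // 2) // height_step)
    cols, c = [], 0
    while c < img_width:
        cols.append(c)
        c += max(1, abs(c - img_width // 2) // width_step)
    return rows, cols

def handle_sparse(depth, height_step=4, width_step=4):
    """Sample the depth map at the sparse indices; the returned tuples
    stand in for the subsequent HandleImage() processing."""
    rows, cols = sparse_indices(len(depth), len(depth[0]),
                                height_step, width_step)
    return [(i, j, depth[i][j]) for i in rows for j in cols]
```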
- the original depth map, or the sparse depth map after sparse processing, can be converted to the navigation coordinate system by coordinate transformation, and the obstacle information can then be updated to the local navigation map. Therefore, in one embodiment, obtaining the three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map includes: performing coordinate conversion on the depth map to obtain each point in the navigation coordinate system; and obtaining the three-dimensional position information of each point according to each point in the navigation coordinate system, the current flight position, and the attitude information.
- performing the coordinate transformation of the depth map to obtain each point in the navigation coordinate system includes: converting each point in the depth map into a point in the camera coordinate system according to the camera intrinsic matrix; converting each point in the camera coordinate system into a point in the body coordinate system according to the transformation matrix from the camera coordinate system to the body coordinate system; and converting each point in the body coordinate system into a point in the navigation coordinate system according to the transformation matrix from the body coordinate system to the navigation coordinate system.
- the point cloud calculated from the depth map also contains noise points, which accumulate in the local navigation map as the above steps loop, leading to erroneous detection of obstacles, referred to as false detection.
- after the three-dimensional position information of each point is projected into the partial navigation map according to the weight set for each point, the method further includes: attenuating the weight of each point in the preset area of the partial navigation map; and obtaining the sum of the weights of all points in each sub-area after attenuation.
- the weights of the points in the preset area in the local navigation map are attenuated, and then the total weight of each sub-area is calculated to determine whether each sub-area is an obstacle area, thereby reducing the influence of noise on the obstacle judgment and improving the accuracy of the obstacle area judgment.
- the attenuating of the weight of each point in the preset area in the partial navigation map comprises: multiplying the weight of each point in the preset area by a preset attenuation factor.
- the attenuation factor can be set empirically. If the preset area exactly includes N sub-areas, the attenuation operation can be performed according to the following formula:
- map_value = map_value * damping_factor
- map_value represents the total weight of a sub-area within the preset area
- damping_factor represents the attenuation factor
- the preset area is determined based on the center of the partial navigation map, the horizontal field of view of the binocular system used by the aircraft to acquire the depth map, and a preset attenuation distance.
- O represents the map center of the local navigation map, that is, the current flight position of the aircraft
- ⁇ represents the size of the field of view of the binocular system.
- d denotes the attenuation distance, which is a fixed value set according to experience
- the sector area determined by the above three parameters is the attenuation area; the weights of points inside the attenuation area are attenuated, while the weights of points outside the attenuation area need not be attenuated.
- FIG. 2 only shows the attenuation area of a binocular system installed at the front of the aircraft. If a binocular system is also installed at the rear or on the side of the aircraft, a symmetric attenuation area is also set at the rear, or a side attenuation area is set, and the weights of points in those attenuation areas are attenuated as well; that is, the attenuation area is related to the way the binocular system is installed. If the depth map is acquired by another device, the preset area is determined in the same way as the concept of determining the preset area based on the binocular system.
- the present application further provides an obstacle avoidance device, and a specific implementation manner of the device of the present application is described in detail below with reference to the accompanying drawings.
- an obstacle avoidance device includes:
- the first information acquiring module 210 is configured to acquire a current flight position, posture information of the aircraft, and a depth map detected at the current flight position.
- the aircraft can be a plant protection drone and so on.
- the current flight position refers to the geographic location of the aircraft at the current time, such as the latitude and longitude information of the aircraft.
- the attitude information refers to the flight attitude of the aircraft at the current time, such as the pitch angle, the roll angle, and the yaw angle.
- the current flight position and attitude information of the aircraft can be obtained by the flight control of the aircraft.
- the depth map is a two-dimensional image of the captured target that includes the distance of each point from the current flight position; that is, the gray value of each pixel in the depth map represents the distance between the captured object and the aircraft's current position. In practice, the depth map can be detected by the aircraft's binocular system.
- the depth map used may be a plurality of depth maps or a single depth map, which is not limited in this application.
- the size of the depth map can be set as needed, and this application does not limit this.
- the obtained current flight position, attitude information, and depth map are used for the construction of the local navigation map.
- the three-dimensional position information obtaining module 220 is configured to obtain three-dimensional position information of each point according to the current flight position, the posture information, and the depth map.
- since each pixel in the depth map encodes the distance between the captured object and the aircraft, the three-dimensional position corresponding to each pixel can be calculated from its pixel coordinates and gray value together with the current flight position and attitude information; that is, each two-dimensional pixel in the depth map corresponds to a three-dimensional point.
- the navigation coordinate system that is, the local horizontal coordinate system, is a coordinate system selected as a navigation reference according to the needs of the navigation system during navigation, and is used for navigation calculation. Considering that these three-dimensional points are used for navigation of the aircraft, the calculated three-dimensional points generally refer to points in the navigation coordinate system.
- the navigation coordinate system is a northeast coordinate system.
- the obtained point cloud may include points that are far away, so only the points that affect the flight of the aircraft need to be retained, for example, only points within a certain range of the aircraft.
- the projection module 230 is configured to project the three-dimensional position information of each point, according to the respectively set weights, into the local navigation map centered on the current flight position, where the local navigation map includes several sub-areas.
- the local navigation map is used for local navigation of the aircraft and is dynamic: the center of the map is the current position of the aircraft, and the size of the map can be preset manually or by program.
- when the center of the local navigation map moves, the content of the entire map is translated once; after the translation, the original content that falls outside the map range is deleted, and the newly added cells are set to zero.
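- the translation step above can be sketched as follows (a minimal pure-Python illustration; the function name and the cell-shift convention are assumptions, not the application's implementation):

```python
def recenter_local_map(grid, shift_rows, shift_cols):
    """Translate the local navigation map content when the map center
    (the aircraft) moves by (shift_rows, shift_cols) cells.

    Content shifted out of the map is discarded, and newly exposed
    cells are zeroed, matching the behavior described above."""
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            sr, sc = r + shift_rows, c + shift_cols
            if 0 <= sr < rows and 0 <= sc < cols:
                out[r][c] = grid[sr][sc]
    return out
```

A shift of one cell keeps the overlapping region of the old map and zero-fills the strip that newly entered the map range.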
- the local navigation map is divided into sub-areas, and the specific division rules can be set according to actual conditions, which is not limited in this application.
- the local navigation map can be a raster map, and each grid is a sub-region.
- the local navigation map is a two-dimensional map. After the point cloud in the navigation coordinate system is acquired, the points need to be superimposed into the local navigation map with certain weights in order to determine which areas are obstacle areas. There are many ways to calculate the weight of each point; for example, in one embodiment, the weight of each point is obtained as the product of a preset weight and a distance factor, where the distance factor is proportional to the distance information.
- the first area setting module 240 is configured to set the sub-area as an obstacle area when the weight sum of all points in the sub-area is greater than a preset threshold, to instruct the aircraft to implement obstacle avoidance on the obstacle area.
- each sub-area in the local navigation map may contain multiple points, and the weights of the points are summed per sub-area. If the total weight of a sub-area is greater than the preset threshold, the sub-area is set as an obstacle area.
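- the weight projection and thresholding steps can be sketched as follows (illustrative only: the grid size, cell size, base weight, and the linear distance factor are assumptions; the application only specifies that each weight is the product of a preset weight and a distance factor proportional to the distance):

```python
def project_points(points_xy, distances, center_xy, grid_size=8,
                   cell=1.0, base_weight=1.0):
    """Accumulate each point's weight into the grid cell (sub-area)
    it falls in, on a map centered at center_xy."""
    grid = [[0.0] * grid_size for _ in range(grid_size)]
    half = grid_size // 2
    for (x, y), d in zip(points_xy, distances):
        col = int((x - center_xy[0]) / cell) + half
        row = int((y - center_xy[1]) / cell) + half
        if 0 <= row < grid_size and 0 <= col < grid_size:
            grid[row][col] += base_weight * d  # distance factor ∝ distance
    return grid

def classify_obstacles(grid, threshold):
    """Sub-areas whose accumulated weight exceeds the threshold become
    obstacle areas; the rest are passable (transit) areas."""
    return [[w > threshold for w in row] for row in grid]
```

Two nearby points landing in the same cell add up, which is what lets repeated detections push a sub-area over the obstacle threshold.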
- the obstacle avoidance device further includes a transit area setting module connected to the projection module 230, configured to set a sub-area as a transit area when the weight sum of all points in the sub-area is less than or equal to the preset threshold, so as to allow the aircraft to pass. If the total weight of the sub-area is less than or equal to the preset threshold, the sub-area is not a real obstacle area, so it can be marked as a pass-through area.
- the second information acquiring module 250 is configured to acquire mapping data set by the user for indicating the obstacle area and the job boundary area, and three-dimensional position information for indicating the obstacle area in the partial navigation map.
- the mapping data includes the boundaries of the work area and the obstacles within it; the data of the local navigation map is updated into the global navigation map with a certain period.
- the survey data can be data manually tested by the user, or each data selected by the user through the map interface.
- the survey data contains the boundary points of the obstacle area and the boundary points of the job boundary area.
- the user-mapped map data can be uploaded to the aircraft through the data link, for example, uploaded to the aircraft's binocular system for mapping operations of the global navigation map.
- the three-dimensional position information is data that has been set as an obstacle area in the partial navigation map.
- the local navigation map will be updated in a certain period.
- the position determined as the obstacle area can be selected and placed in an obstacle queue.
- the obstacle queue can be deleted or added: when it is determined as an obstacle in the local navigation map, the information of the obstacle area is added to the queue; when the obstacle moves or disappears, the information of the obstacle area is taken from the queue. Delete it inside.
- the information of the obstacle area in the obstacle queue is updated to the global navigation map in a certain cycle, so that the global navigation map also contains the information detected by the binocular system. It should be noted that the present application is not limited to updating the data of the global navigation map in the form of a queue, and the user may also update the information of the obstacle area in the partial navigation map to the global navigation map by other forms as needed.
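- the queue-based update described above might look like the following sketch (the class and method names are assumptions; as the text notes, a queue is only one possible form):

```python
class ObstacleQueue:
    """Holds obstacle cells detected in the local navigation map and
    periodically flushes them into the global navigation map."""

    def __init__(self):
        self._cells = set()

    def add(self, cell):
        # An obstacle area was determined in the local navigation map.
        self._cells.add(cell)

    def remove(self, cell):
        # The obstacle moved or disappeared.
        self._cells.discard(cell)

    def flush_to_global(self, global_map):
        # Run with a certain period: mark every queued cell as an obstacle.
        for row, col in self._cells:
            global_map[row][col] = 1
```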
- by updating the information of the obstacle areas of the local navigation map to the global navigation map, obstacles can be avoided directly during global planning and the shortest path can be found within the global scope, which avoids the situation where a path found only from mapping data cannot avoid unmapped obstacles.
- the second area setting module 260 is configured to set an obstacle area and a job boundary area in the preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
- the global navigation map can be a raster map.
- the global navigation map is much larger than the local navigation map.
- the local navigation map can be small.
- the global navigation map needs to include the flight range of the aircraft.
- the corresponding area can be identified in the global navigation map according to the position information of the job boundary area and the position information of the obstacle area included in the survey data. For example, set the weight of some of the grids in the global navigation map to 1 based on the survey data.
- the global navigation map is updated, and the corresponding position in the global navigation map is updated to the obstacle area.
- the aircraft can perform obstacle avoidance on the obstacle area and the work boundary area according to the global navigation map.
- the global navigation map has to be initialized before the aircraft takes off.
- the content of the initialization is to determine the size of the global navigation map and the location of the center point.
- the center and size of the preset global navigation map are obtained based on the position of the aircraft prior to takeoff and the survey data.
- the information for initializing the global navigation map comes from the mapping data set by the user.
- the geographic location represented by the map center and the size of the map are determined at the time of initialization. After determining the information, it is possible to allocate storage space for the global navigation map, determine the storage location of the obstacle information according to the geographical location of the obstacle, and conveniently store and access the obstacle information.
- the horizontal boundaries of the global navigation map are determined by expanding the maximum and minimum values, taken over the pre-takeoff position and the survey data, on the Y-axis; the vertical boundaries are determined by expanding the maximum and minimum values on the X-axis.
- for example, the maximum Y-axis value among the survey data and the aircraft's pre-takeoff position is expanded outward by a certain distance to obtain the upper horizontal boundary of the global navigation map; the remaining boundaries are obtained in the same way from the corresponding minimum and X-axis values.
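- the initialization of the map extent can be sketched as follows (the `margin` parameter is an assumption; the text only says the extremes are "expanded by a certain distance"):

```python
def global_map_bounds(takeoff_xy, survey_points_xy, margin):
    """Boundaries of the global navigation map from the pre-takeoff
    position plus all surveyed points, expanded by a safety margin."""
    xs = [takeoff_xy[0]] + [p[0] for p in survey_points_xy]
    ys = [takeoff_xy[1]] + [p[1] for p in survey_points_xy]
    return (min(xs) - margin, max(xs) + margin,   # vertical boundaries (X-axis)
            min(ys) - margin, max(ys) + margin)   # horizontal boundaries (Y-axis)
```

With the extent fixed, storage for the grid can be allocated and each obstacle's storage location derived from its geographic position, as described above.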
- based on the acquired survey data and the three-dimensional position information, the second area setting module 260 obtains a first obstacle area and a first work boundary area; expands the first obstacle area and the first work boundary area to obtain a second obstacle area and a second work boundary area; and sets the second obstacle area and the second work boundary area as areas indicating that the aircraft should implement obstacle avoidance.
- the distance by which the first obstacle area and the first work boundary area are expanded may be set according to actual needs. The expanded areas are also dangerous areas in which passage of the aircraft is prohibited; setting them as obstacle-avoidance areas keeps the aircraft a safe distance from obstacles and from the boundary of the work area.
- the present application does not limit the manner of setting the obstacle area and the work boundary area in the global navigation map; the user may directly set the areas where the aircraft is prohibited from passing in the global navigation map according to the survey data and the obstacle information in the local navigation map, without any expansion, or may expand only the obstacle area or only the work boundary area.
- the distances in which the respective directions are expanded may be set to be the same, or different expansion distances may be set for the respective directions.
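- the expansion of obstacle and boundary cells can be sketched as a binary dilation (a pure-Python illustration; the per-direction margins are assumptions):

```python
def expand_obstacles(grid, margin_rows=1, margin_cols=1):
    """Dilate every marked cell by a per-direction safety margin so the
    aircraft keeps a safe distance from obstacles and boundaries."""
    rows, cols = len(grid), len(grid[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                for dr in range(-margin_rows, margin_rows + 1):
                    for dc in range(-margin_cols, margin_cols + 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            out[rr][cc] = 1
    return out
```

Passing different `margin_rows` / `margin_cols` values realizes the different-expansion-distance-per-direction variant mentioned above.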
- the obstacle avoidance device may further include a sparse processing module connected between the first information acquiring module 210 and the three-dimensional position information obtaining module 220, the sparse processing module being configured to perform sparse processing on the depth map.
- distant obstacles are concentrated at the center of the image, while near obstacles occupy a larger area because of their distance. Therefore, in one embodiment, the sparse processing module performs sparse processing on the depth map with a changing step size, where the changing step size controls the sampling density of pixels in the depth map so that it gradually increases from the edge toward the center.
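- one way to realize such a changing step size is sketched below (the particular step schedule, `1 + distance_from_center // 4`, is an assumption; the application only requires density to grow toward the center):

```python
def sparse_indices(width):
    """Column indices kept along one image row: the sampling step shrinks
    toward the image center, so the center (where distant obstacles
    concentrate) is sampled densely and the edges sparsely."""
    center = width // 2
    idx, i = [], 0
    while i < width:
        idx.append(i)
        i += 1 + abs(i - center) // 4  # larger steps far from the center
    return idx
```

Applying the same schedule to rows yields a 2-D sparse depth map whose retained pixels cluster around the image center.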
- the original depth map, or the sparse depth map after sparse processing, can be converted to the navigation coordinate system by coordinate transformation, and the obstacle information can then be updated into the local navigation map. Therefore, in one embodiment, the three-dimensional position information obtaining module 220 performs coordinate transformation on the depth map to obtain points in the navigation coordinate system, and obtains the three-dimensional position information of each point from these points together with the current flight position and the attitude information.
- specifically, the three-dimensional position information obtaining module 220 converts each point in the depth map into a point in the camera coordinate system according to the camera intrinsic matrix; converts each point in the camera coordinate system into a point in the body coordinate system according to the camera-to-body transformation matrix; and converts each point in the body coordinate system into a point in the navigation coordinate system according to the body-to-navigation transformation matrix.
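- the transform chain can be sketched for a single pixel as follows (a hedged illustration using a standard pinhole back-projection; `fx, fy, cx, cy` are the camera intrinsics and `R_cb/t_cb`, `R_bn/t_bn` are placeholder camera-to-body and body-to-navigation transforms, not values from the application):

```python
def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def pixel_to_navigation(u, v, depth, fx, fy, cx, cy, R_cb, t_cb, R_bn, t_bn):
    """pixel (u, v) + depth -> camera frame -> body frame -> navigation frame."""
    # Back-project through the pinhole model (camera intrinsic matrix).
    p_cam = [depth * (u - cx) / fx, depth * (v - cy) / fy, depth]
    # Camera -> body, then body -> navigation (t_bn is the current position).
    p_body = [a + b for a, b in zip(mat_vec(R_cb, p_cam), t_cb)]
    return [a + b for a, b in zip(mat_vec(R_bn, p_body), t_bn)]
```

In practice `R_bn` would be built from the attitude (pitch, roll, yaw) reported by the flight controller, and `t_bn` from the current flight position.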
- the obstacle avoidance device further includes an attenuation module coupled between the projection module 230 and the first area setting module 240 (and/or the transit area setting module), the attenuation module being configured to attenuate the weight of each point in a preset area in the local navigation map and to obtain the weight sum of all points in each sub-area after attenuation.
- the attenuation module first attenuates the weights of the points in the preset area in the local navigation map, and then calculates the total weight of each sub-area. Based on the total weight value, it is judged whether each sub-area is an obstacle area, thereby reducing the influence of noise on the obstacle judgment and improving the accuracy of the obstacle area judgment.
- the attenuation module multiplies the weight of each point within the predetermined area by a predetermined attenuation factor.
- the attenuation factor can be set empirically. If the preset area includes exactly N sub-areas, the total weight of each sub-area can be multiplied by the attenuation factor.
- the preset area is determined based on the center of the local navigation map, the horizontal field of view of the binocular system used by the aircraft to acquire the depth map, and a preset attenuation distance.
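- a possible reading of this preset area is a wedge of cells inside the binocular horizontal field of view and within the attenuation distance of the map center; the sketch below (shape, parameter names, and the factor value are all assumptions) builds such a mask and applies the multiplicative attenuation:

```python
import math

def fov_mask(size, half_fov_deg, max_dist, heading_deg=0.0):
    """Cells inside the horizontal FOV wedge and within max_dist cells of
    the map center, i.e., the assumed preset attenuation area."""
    c = size // 2
    mask = [[False] * size for _ in range(size)]
    for r in range(size):
        for col in range(size):
            dx, dy = col - c, r - c
            dist = math.hypot(dx, dy)
            ang = math.degrees(math.atan2(dy, dx)) - heading_deg
            ang = (ang + 180.0) % 360.0 - 180.0  # wrap to (-180, 180]
            if dist <= max_dist and abs(ang) <= half_fov_deg:
                mask[r][col] = True
    return mask

def attenuate(grid, mask, factor=0.5):
    """Multiply each point weight inside the preset area by the attenuation
    factor before the per-sub-area weight sums are computed."""
    return [[w * factor if m else w for w, m in zip(row, mrow)]
            for row, mrow in zip(grid, mask)]
```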
- the attenuation area is related to the way the binocular system is installed. If the depth map is acquired by another device, the preset area is determined following the same principle as with the binocular system.
- the present application also provides a terminal, which may be an aircraft or another device, including a memory, a processor, and a computer program stored on the memory and operable on the processor, where the processor, when executing the program, implements the steps of the navigation map configuration method and the automatic obstacle avoidance method described above.
- the present application also provides a storage medium, the storage medium including a stored program, wherein the device in which the storage medium is located is controlled to execute the navigation map configuration method according to the first aspect described above when the program is running.
- the application further provides a storage medium, the storage medium comprising a stored program, wherein the device in which the storage medium is located is controlled to perform the obstacle avoidance method according to the second aspect described above when the program is running.
- FIG. 8 is a schematic structural view of an unmanned aerial vehicle 800 according to an embodiment of the present application.
- unmanned aerial vehicle 800 includes a controller 810 that is coupled to one or more sensors or sensing systems 801a-c in a wired or wireless manner.
- the sensor can be connected to the controller via a controller area network (CAN).
- the controller 810 can also be coupled to one or more actuators 820 to control the state of the UAV.
- the sensor may include any of the sensors described herein, such as an inertial sensor, a GPS receiver, a compass, an RTK positioning sensor, a magnetometer, an altimeter, a distance sensor (e.g., an infrared sensor or a lidar sensor), a visual or image sensor (e.g., a camera or video camera), a photoelectric sensor, a motion sensor, a touch sensor, a pressure sensor, a temperature sensor, a magnetic sensor, etc.
- the inertial sensor, also called an IMU (inertial measurement unit), can be configured to determine the attitude information of the aircraft and includes a three-axis gyroscope, a three-axis acceleration sensor, a three-axis geomagnetic sensor, and a barometer. The three axes refer to the left-right, front-rear, and vertical directions of the UAV. The three-axis gyroscope measures the inclination about the X, Y, and Z axes; the three-axis acceleration sensor measures the acceleration along the X, Y, and Z axes; the geomagnetic sensor senses the geomagnetic field, allowing the drone to determine its nose heading and flight direction in order to find the mission position; and the barometer obtains the current altitude from the pressure difference measured at different positions. The IMU thus senses changes in the attitude of the aircraft.
- some sensors (e.g., vision sensors) may be coupled to a field programmable gate array (FPGA, not shown).
- the field programmable gate array can be coupled to the controller, e.g., via a general purpose memory controller (GPMC).
- some sensors and/or the field programmable gate arrays can be coupled to the transmission module.
- the transmission module can be used to communicate data acquired by the sensor (eg, image data) to any suitable external device or system, such as a terminal or remote device as described herein.
- the controller can include one or more programmable processors (eg, a central processing unit).
- the controller can be coupled to a storage medium such as a non-transitory computer readable medium 830.
- the storage medium may include one or more storage units (e.g., removable media or external storage such as an SD card or random access memory).
- data from the sensors (e.g., a camera) can be transferred directly to the storage unit, for example over a direct memory access (DMA) connection.
- the storage unit of the storage medium may store code and/or program instructions.
- the controller executes the code and/or program instructions to perform the method embodiments described herein.
- the controller can execute instructions such that one or more processors of the controller analyze data generated by one or more sensors or sensing systems to determine the orientation and/or motion information of the UAV described in this specification, detected external contact information, and/or detected external signal information.
- the controller can execute an instruction such that one or more processors of the controller determine whether to control the UAV to take off or land autonomously.
- the storage unit of the storage medium 830 stores sensed data from the one or more sensing systems that will be processed by the controller.
- the storage unit may store the UAV azimuth and/or motion information, detected external contact information, and/or detected external signal information.
- the storage unit may store predetermined or pre-stored data to control the UAV (eg, a threshold of predetermined sensing data, parameters to control the actuator, the The intended flight path, speed, acceleration or direction of the unmanned aerial vehicle).
- the actuators can include motors, electronic governors, mechanical transmissions, hydraulic transmissions, pneumatic transmissions, and the like.
- the motor may include a magnetic motor, an electrostatic motor, or a piezoelectric motor.
- the actuator comprises a brushed or brushless DC motor.
- the controller can be coupled to the communication module 840 for transmitting and/or receiving data from one or more external devices (eg, terminals, display devices, ground controls, or other remote controls).
- the communication module can use any suitable communication means, such as wired communication or wireless communication.
- the communication module can employ one or more local area networks, wide area networks, infrared, radio waves, WiFi, point-to-point (P2P) networks, telecommunications networks, cloud communications, and the like.
- a relay station such as a tower, satellite or mobile station may be employed.
- the wireless communication may be distance-dependent or distance-independent. In some embodiments, communication may be within line of sight or beyond line of sight.
- the communication module can transmit and/or receive one or more sensed data, orientation and/or motion information from the sensing system, external contact information obtained by processing the sensed data, and/or an external signal Information, predetermined control data, user commands from the terminal or remote control, and the like.
- the components of the UAV can be configured in any suitable manner.
- one or more components of the UAV may be disposed on the UAV, carrier, load, terminal, sensing system, or any other remote device or system in communication with one or more of the devices described above.
- although FIG. 8 depicts a single controller and a single storage medium, those skilled in the art will appreciate that this is not a limitation on the UAV, which may include multiple controllers and/or storage media.
- one or more of the plurality of controllers and/or storage media may be disposed at different locations, such as in the UAV, carrier, load, terminal, sensing system, or any Other remote devices or systems in which one or more of the above devices are in communication, or a suitable combination thereof, such that the UAV facilitates performing processing and/or storage functions at one or more of the locations described above.
- the UAV includes, but is not limited to, a single rotor aircraft, a multi-rotor aircraft, and a rotorcraft.
- a rotorcraft typically uses one or more propellers rotating about a shaft to generate lift.
- rotorcraft include, for example, helicopters, cyclocopters, autogyros, and the like.
- the rotorcraft may have a plurality of rotors mounted at a plurality of locations of the aircraft.
- the UAV may include a quadrotor helicopter, a six-rotor helicopter, a ten-rotor helicopter, and the like.
- the UAV can move freely with respect to six degrees of freedom (eg, three translational degrees of freedom and three degrees of rotational freedom).
- the UAV may be limited to one or more degrees of freedom motion, such as being limited to a predetermined track or trajectory.
- the motion can be driven by any suitable drive mechanism, such as by an engine or motor.
- the UAV can be driven by a propulsion system.
- the propulsion system may include, for example, an engine, a motor, a wheel, an axle, a magnet, a rotor, a propeller, a paddle, a nozzle, or any suitable combination of the above.
- the motion of the UAV may be powered by any suitable source of energy, such as electrical energy, magnetic energy, solar energy, wind energy, gravity energy, chemical energy, nuclear energy, or any suitable combination of energy sources.
- the UAV may be of different sizes, dimensions, and/or configurations.
- the UAV may be a multi-rotor UAV, and the axial spacing of the counter-rotating rotors does not exceed a certain threshold.
- the threshold may be about 5 m, 4 m, 3 m, 2 m, 1 m, and the like.
- the value of the axial spacing of the counter-rotating rotor may be 350 mm, 450 mm, 800 mm, 900 mm, or the like.
- in some embodiments, the UAV is sized and/or dimensioned to accommodate a person in or on it.
- in other embodiments, the UAV is not sized and/or dimensioned to accommodate a person in or on it.
- the UAV's largest dimensions do not exceed 5 m, 4 m, 3 m, 2 m, 1 m, 0.5 m, or 0.1 m.
- the axial distance of the counter-rotating rotor may not exceed 5 m, 4 m, 3 m, 2 m, 1 m, 0.5 m or 0.1 m.
- the UAV may have a volume of less than 100 cm x 100 cm x 100 cm.
- the UAV may have a volume of less than 50 cm x 50 cm x 30 cm. In certain embodiments, the UAV may have a volume of less than 5 cm x 5 cm x 3 cm. In certain embodiments, the footprint of the UAV (the cross-sectional area of the UAV) may be less than approximately 32,000 cm², 20,000 cm², 10,000 cm², 1,000 cm², 500 cm², 100 cm², or smaller. In some cases, the unmanned aerial vehicle may weigh no more than 1000 kg, 500 kg, 100 kg, 10 kg, 5 kg, 1 kg, or 0.5 kg.
- the UAV can carry a load.
- the load may include one or more cargo, devices, instruments, and the like.
- the load can have a housing. Alternatively, part or all of the load may have no housing.
- the load may be rigidly fixed relative to the UAV. Alternatively, the load may be moved relative to the UAV (eg, translated or rotated relative to the UAV).
- the carrier can be moved relative to the UAV (e.g., with respect to one, two, or three translational degrees of freedom and/or one, two, or three rotational degrees of freedom) such that the load maintains its position and/or orientation relative to a suitable reference coordinate system, unaffected by the movement of the UAV.
- the reference coordinate system may be a fixed reference coordinate system (eg, a surrounding environment).
- the reference coordinate system may be a motion reference coordinate system (eg, the unmanned aerial vehicle, load).
- the carrier can move the load relative to the carrier and/or the unmanned aerial vehicle.
- the motion may include up to three translational degrees of freedom (e.g., along one, two, or three axes), up to three rotational degrees of freedom (e.g., about one, two, or three axes), or any combination thereof.
- the carrier can include a frame assembly and an actuator assembly.
- the frame assembly can provide structural support for the load.
- the frame assembly can include a plurality of separate frame members, some of which can move relative to each other.
- the frame assembly and/or the separate frame member can be coupled to a drive assembly that drives the frame assembly to move.
- the drive assembly can include one or more actuators (e.g., motors) configured to urge the separate frame members to move.
- the actuator may cause a plurality of frame members to move simultaneously or only one frame member to move at a time.
- the movement of the frame member can cause the load to move accordingly.
- the drive assembly can drive one or more frame members to rotate about one or more axes of rotation, such as a roll axis, a pitch axis, or a heading axis. Rotation of the one or more frame members may cause the load to rotate about the one or more axes of rotation relative to the UAV.
- the drive assembly can drive one or more frame members to translate along one or more translation axes to translate the load relative to the unmanned aerial vehicle along one or more corresponding translation axes .
- the load may be coupled to the UAV by the carrier either directly (e.g., in direct contact with the UAV) or indirectly (e.g., without contacting the UAV).
- the load may be mounted on the UAV without a carrier.
- the load may be integral with the carrier.
- the load can be detachably coupled to the carrier.
- the load may include one or more load elements that, as previously described, may move relative to the UAV and/or carrier.
- the load can include one or more sensors configured to measure one or more targets.
- the load may comprise any suitable sensor, such as an image acquisition device (such as a camera), a sound acquisition device (such as a parabolic microphone), an infrared imaging device, or an ultraviolet imaging device.
- the sensor can provide static sensing data (eg, photos) or dynamic sensing data (eg, video).
- in some embodiments, the sensor provides sensing data for the target sensed by the load.
- the load may include one or more transmitters arranged to provide signals to one or more sensing objects.
- the transmitter can be any suitable transmitter, such as a light source or a sound source.
- the load includes one or more transceivers, for example for communicating with a module remote from the UAV.
- by calling a program stored in the storage medium 830, the controller 810 is configured to acquire second map data and a flight position while flying along the flight route; match the work area of the first map data with the second map data to calculate a flight offset of the flight position from the flight route; and perform flight correction according to the flight offset so as to return to the flight route.
- controller 810 is further configured to:
- the three-dimensional position information of each point is projected into the local navigation map centered on the current flight position according to the respectively set weights.
- the controller 810 is further configured to:
- the sub-area is set as an obstacle area to indicate that the aircraft performs obstacle avoidance on the obstacle area;
- An obstacle area and a job boundary area are set in a preset global navigation map to instruct the aircraft to perform obstacle avoidance on the obstacle area and the work boundary area.
- with the navigation map configuration method, the automatic obstacle avoidance method, the devices, the terminal, and the unmanned aerial vehicle described above, a local navigation map centered on the current flight position of the aircraft is dynamically generated according to the position information, attitude information, and depth map acquired by the aircraft during flight.
- the three-dimensional position information of each point is analyzed; it may describe an unknown obstacle encountered by the aircraft during flight, or other objects that were not planned for in advance. The three-dimensional position information of each point is projected into the local navigation map, and real-time operation route planning can then be performed according to the local navigation map.
- since the operation route is dynamically generated from information acquired during the flight of the aircraft, temporary task changes can be handled effectively, for example, working on a work area that the previously planned route did not cover, or automatically avoiding the area where an unknown obstacle is located.
- obstacles can be avoided directly during global planning and the shortest path can be found within the global scope, avoiding the situation where a path found only from mapping data cannot avoid unmapped obstacles.
- the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
- the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
- the integrated modules, if implemented in the form of software functional modules and sold or used as stand-alone products, may also be stored in a computer readable storage medium.
- the storage medium includes, but is not limited to, any type of disk (including a floppy disk, a hard disk, an optical disk, a CD-ROM, and a magneto-optical disk), a ROM (Read-Only Memory), and a RAM (Random Access Memory).
- a storage medium includes any medium that is stored or transmitted by a device (eg, a computer) in a readable form. It can be a read only memory, a disk or a disc.
- the steps, measures, and solutions in the various operations, methods, and processes disclosed in the present application may be alternated, modified, rearranged, decomposed, combined, or deleted.
- the solution provided by the present application can be applied to the field of UAV navigation. The navigation map configuration method includes the steps of: acquiring the current flight position and attitude information of the UAV, and a depth map detected at the current flight position; obtaining three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map; and projecting the three-dimensional position information of each point, according to the weight value set for each point, into a local navigation map centered on the current flight position.
- the solution of the present application can dynamically generate an operation route and effectively cope with temporary task changes.
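The weighted projection and thresholding described above can be sketched as follows. This is a minimal illustrative sketch only, not the patented implementation: the grid size, cell size, preset weight, attenuation factor, and threshold are arbitrary placeholder values, and the attenuation is applied uniformly for simplicity rather than over a preset sub-region.

```python
import numpy as np

def update_local_grid(points_3d, distances, center, grid_size=64, cell=1.0,
                      preset_weight=1.0, attenuation=0.5, threshold=3.0):
    """Project 3D points into a local grid map centered on the current
    flight position, weight each point by its distance, attenuate the
    accumulated weights, and mark cells whose weight sum exceeds the
    threshold as obstacle areas (True); other cells are passable."""
    weight_sum = np.zeros((grid_size, grid_size))
    half = grid_size // 2
    for p, d in zip(points_3d, distances):
        # weight = preset weight * distance factor (proportional to distance)
        w = preset_weight * d
        # map the point's horizontal position into a grid cell index
        ix = int((p[0] - center[0]) / cell) + half
        iy = int((p[1] - center[1]) / cell) + half
        if 0 <= ix < grid_size and 0 <= iy < grid_size:
            weight_sum[iy, ix] += w
    # attenuate the accumulated weights (uniformly, for simplicity)
    weight_sum *= attenuation
    # cells above the threshold become obstacle areas
    return weight_sum > threshold
```

A cluster of repeatedly observed points accumulates weight in one cell until that cell crosses the threshold and is flagged as an obstacle, while cells with few or no projected points remain passable.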
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
- Instructional Devices (AREA)
Abstract
Description
Claims (34)
- A navigation map configuration method, comprising the steps of: acquiring a current flight position and attitude information of an aircraft, and a depth map detected at the current flight position; obtaining three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map; and projecting the three-dimensional position information of each point, according to a weight value set for each point, into a local navigation map centered on the current flight position.
- The navigation map configuration method according to claim 1, wherein the local navigation map comprises a plurality of sub-areas; and after the projecting of the three-dimensional position information of each point, according to the weight value set for each point, into the local navigation map centered on the current flight position, the method further comprises: if the sum of the weight values of all points in a sub-area is greater than a preset threshold, setting the sub-area as an obstacle area so as to instruct the aircraft to perform obstacle avoidance.
- The navigation map configuration method according to claim 2, wherein after the projecting of the three-dimensional position information of each point, according to the weight value set for each point, into the local navigation map centered on the current flight position, the method further comprises: if the sum of the weight values of all points in a sub-area is less than or equal to the preset threshold, setting the sub-area as a passable area so as to allow the aircraft to pass.
- The navigation map configuration method according to claim 1, wherein the depth map comprises distance information between each point and the current flight position; and the weight value of each point is obtained from the product of a preset weight value and a distance factor, wherein the distance factor is directly proportional to the distance information.
- The navigation map configuration method according to claim 2, wherein after the projecting of the three-dimensional position information of each point, according to the weight value set for each point, into the local navigation map, the method further comprises: attenuating the weight value of each point within a preset area of the local navigation map; and obtaining the sum of the attenuated weight values of all points in each sub-area.
- The navigation map configuration method according to claim 5, wherein the attenuating of the weight value of each point within the preset area of the local navigation map comprises: multiplying the weight value of each point within the preset area by a preset attenuation factor.
- The navigation map configuration method according to claim 5, wherein the preset area is determined according to the center of the local navigation map, the horizontal field-of-view angle of a binocular system used in the aircraft to acquire the depth map, and a set attenuation distance.
- The navigation map configuration method according to claim 1, wherein the obtaining of the three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map comprises: performing coordinate conversion on the depth map to obtain each point in a navigation coordinate system; and obtaining the three-dimensional position information of each point according to each point in the navigation coordinate system, the current flight position, and the attitude information.
- The navigation map configuration method according to claim 8, wherein the performing of coordinate conversion on the depth map to obtain each point in the navigation coordinate system comprises: converting each point in the depth map into a point in a camera coordinate system according to a camera intrinsic parameter matrix; converting each point in the camera coordinate system into a point in a body coordinate system according to a transformation matrix from the camera coordinate system to the body coordinate system; and converting each point in the body coordinate system into a point in the navigation coordinate system according to a transformation matrix from the body coordinate system to the navigation coordinate system.
- The navigation map configuration method according to claim 1, wherein after the acquiring of the current flight position and attitude information of the aircraft and the depth map detected at the current flight position, and before the obtaining of the three-dimensional position information of each point, the method further comprises: performing sparse processing on the depth map.
- The navigation map configuration method according to claim 10, wherein the performing of sparse processing on the depth map comprises: performing sparse processing on the depth map with a variable step size, wherein the variable step size is used to control the number of sampled pixel points in the depth map to increase gradually from the edge to the center.
- The navigation map configuration method according to any one of claims 2 to 11, wherein the local navigation map is a grid map, and each grid is one sub-area.
- An obstacle avoidance method, comprising the steps of: acquiring a current flight position and attitude information of an aircraft, and a depth map detected at the current flight position; obtaining three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map; projecting the three-dimensional position information of each point, according to a weight value set for each point, into a local navigation map centered on the current flight position, wherein the local navigation map comprises a plurality of sub-areas; if the sum of the weight values of all points in a sub-area is greater than a preset threshold, setting the sub-area as an obstacle area so as to instruct the aircraft to perform obstacle avoidance on the obstacle area; acquiring mapping data set by a user for indicating obstacle areas and an operation boundary area, and three-dimensional position information for indicating obstacle areas in the local navigation map; and setting obstacle areas and an operation boundary area in a preset global navigation map so as to instruct the aircraft to perform obstacle avoidance on the obstacle areas and the operation boundary area.
- The obstacle avoidance method according to claim 13, wherein the setting of obstacle areas and an operation boundary area in the preset global navigation map comprises: obtaining a first obstacle area and a first operation boundary area according to the acquired mapping data and three-dimensional position information; dilating the first obstacle area and the first operation boundary area to obtain a second obstacle area and a second operation boundary area; and setting the second obstacle area and the second operation boundary area as areas for instructing the aircraft to perform obstacle avoidance.
- The obstacle avoidance method according to claim 13, wherein the center and size of the preset global navigation map are obtained according to the position of the aircraft before takeoff and the mapping data.
- The obstacle avoidance method according to claim 15, wherein the horizontal boundary of the global navigation map is determined by dilating the maximum and minimum values, on the Y axis, of the position and the mapping data, and the vertical boundary of the global navigation map is determined by dilating the maximum and minimum values, on the X axis, of the position and the mapping data.
- The obstacle avoidance method according to claim 13, wherein after the projecting of the three-dimensional position information of each point, according to the weight value set for each point, into the local navigation map centered on the current flight position, the method further comprises: if the sum of the weight values of all points in a sub-area is less than or equal to the preset threshold, setting the sub-area as a passable area so as to allow the aircraft to pass.
- The obstacle avoidance method according to claim 13, wherein the depth map comprises distance information between each point and the current flight position; and the weight value of each point is obtained from the product of a preset weight value and a distance factor, wherein the distance factor is directly proportional to the distance information.
- The obstacle avoidance method according to claim 13, wherein after the projecting of the three-dimensional position information of each point, according to the weight value set for each point, into the local navigation map, the method further comprises: attenuating the weight value of each point within a preset area of the local navigation map; and obtaining the sum of the attenuated weight values of all points in each sub-area.
- The obstacle avoidance method according to claim 19, wherein the attenuating of the weight value of each point within the preset area of the local navigation map comprises: multiplying the weight value of each point within the preset area by a preset attenuation factor.
- The obstacle avoidance method according to claim 19, wherein the preset area is determined according to the center of the local navigation map, the horizontal field-of-view angle of a binocular system used in the aircraft to acquire the depth map, and a set attenuation distance.
- The obstacle avoidance method according to claim 13, wherein the obtaining of the three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map comprises: performing coordinate conversion on the depth map to obtain each point in a navigation coordinate system; and obtaining the three-dimensional position information of each point according to each point in the navigation coordinate system, the current flight position, and the attitude information.
- The obstacle avoidance method according to claim 22, wherein the performing of coordinate conversion on the depth map to obtain each point in the navigation coordinate system comprises: converting each point in the depth map into a point in a camera coordinate system according to a camera intrinsic parameter matrix; converting each point in the camera coordinate system into a point in a body coordinate system according to a transformation matrix from the camera coordinate system to the body coordinate system; and converting each point in the body coordinate system into a point in the navigation coordinate system according to a transformation matrix from the body coordinate system to the navigation coordinate system.
- The obstacle avoidance method according to claim 13, wherein after the acquiring of the current flight position and attitude information of the aircraft and the depth map detected at the current flight position, and before the obtaining of the three-dimensional position information of each point, the method further comprises: performing sparse processing on the depth map.
- The obstacle avoidance method according to claim 24, wherein the performing of sparse processing on the depth map comprises: performing sparse processing on the depth map with a variable step size, wherein the variable step size is used to control the number of sampled pixel points in the depth map to increase gradually from the edge to the center.
- The obstacle avoidance method according to any one of claims 13 to 25, wherein the local navigation map and the global navigation map are grid maps, and each grid is one sub-area.
- An aircraft navigation map configuration apparatus, comprising: an information acquisition module configured to acquire a current flight position and attitude information of the aircraft, and a depth map detected at the current flight position; a three-dimensional position information obtaining module configured to obtain three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map; and a projection module configured to project the three-dimensional position information of each point, according to a weight value set for each point, into a local navigation map centered on the current flight position.
- An obstacle avoidance apparatus, comprising: a first information acquisition module configured to acquire a current flight position and attitude information of the aircraft, and a depth map detected at the current flight position; a three-dimensional position information obtaining module configured to obtain three-dimensional position information of each point according to the current flight position, the attitude information, and the depth map; a projection module configured to project the three-dimensional position information of each point, according to a weight value set for each point, into a local navigation map centered on the current flight position, wherein the local navigation map comprises a plurality of sub-areas; a first area setting module configured to, when the sum of the weight values of all points in a sub-area is greater than a preset threshold, set the sub-area as an obstacle area so as to instruct the aircraft to perform obstacle avoidance on the obstacle area; a second information acquisition module configured to acquire mapping data set by a user for indicating obstacle areas and an operation boundary area, and three-dimensional position information for indicating obstacle areas in the local navigation map; and a second area setting module configured to set obstacle areas and an operation boundary area in a preset global navigation map so as to instruct the aircraft to perform obstacle avoidance on the obstacle areas and the operation boundary area.
- A terminal, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method according to any one of claims 1 to 12.
- A terminal, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method according to any one of claims 13 to 26.
- A storage medium, comprising a stored program, wherein when the program runs, a device on which the storage medium is located is controlled to perform the navigation map configuration method according to any one of claims 1 to 12.
- A storage medium, comprising a stored program, wherein when the program runs, a device on which the storage medium is located is controlled to perform the obstacle avoidance method according to any one of claims 13 to 26.
- An unmanned aerial vehicle, comprising a communication module, sensors, a controller, and a storage medium; the sensors comprise an image sensor, a GPS receiver, an RTK positioning sensor, and an inertial sensor; the communication module is configured to communicate with a ground control apparatus; the GPS receiver and the positioning sensor are configured to determine the current flight position of the unmanned aerial vehicle; the inertial sensor is configured to determine the attitude information of the unmanned aerial vehicle; the image sensor is configured to detect a depth map at the current flight position; and the controller is connected to the storage medium, the storage medium being configured to store a program which, when run, performs the steps of the method according to any one of claims 1 to 12.
- An unmanned aerial vehicle, comprising a communication module, sensors, a controller, and a storage medium; the sensors comprise an image sensor, a GPS receiver, an RTK positioning sensor, and an inertial sensor; the communication module is configured to communicate with a ground control apparatus; the GPS receiver and the positioning sensor are configured to determine the current flight position of the unmanned aerial vehicle; the inertial sensor is configured to determine the attitude information of the unmanned aerial vehicle; the image sensor is configured to detect a depth map at the current flight position; and the controller is connected to the storage medium, the storage medium being configured to store a program which, when run, performs the steps of the method according to any one of claims 13 to 26.
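The camera-to-body-to-navigation conversion chain recited in claims 9 and 23 can be illustrated with a short sketch. The intrinsic matrix, rotations, and offsets below are placeholder assumptions for illustration, not values from the application:

```python
import numpy as np

def depth_pixel_to_navigation(u, v, depth, K, R_cb, t_cb, R_bn, flight_pos):
    """Back-project one depth-map pixel into the navigation frame:
    pixel -> camera frame (via the camera intrinsic matrix K),
    camera -> body frame (via rotation R_cb and offset t_cb),
    body -> navigation frame (via attitude rotation R_bn plus the
    current flight position)."""
    # pixel + depth -> camera coordinates: X_c = depth * K^-1 @ [u, v, 1]^T
    p_cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    # camera coordinates -> body coordinates
    p_body = R_cb @ p_cam + t_cb
    # body coordinates -> navigation coordinates
    return R_bn @ p_body + np.asarray(flight_pos)
```

Applying this per pixel yields the three-dimensional position information that is then projected, with its weight, into the local navigation map.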
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18871786.2A EP3702731A4 (en) | 2017-10-26 | 2018-10-26 | NAVIGATION DIAGRAM CONFIGURATION METHOD, OBSTACLE AVOIDANCE PROCESS AND DEVICE, TERMINAL, UNPILOT AIR VEHICLE |
AU2018355491A AU2018355491B2 (en) | 2017-10-26 | 2018-10-26 | Method for configuring navigation chart, obstacle avoidance method and device, terminal, unmanned aerial vehicle |
US16/641,763 US20200394924A1 (en) | 2017-10-26 | 2018-10-26 | Method for Configuring Navigation Chart, Method for Avoiding Obstacle and Device, Terminal and Unmanned Aerial Vehicle |
KR1020207005722A KR102385820B1 (ko) | 2017-10-26 | 2018-10-26 | 내비게이션 차트 구성 방법, 장애물 회피 방법 및 장치, 단말기, 무인 항공기 |
JP2020517902A JP2020535545A (ja) | 2017-10-26 | 2018-10-26 | ナビゲーションチャート構成方法、障害物回避方法及び装置、端末、無人航空機 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711021920.1 | 2017-10-26 | ||
CN201711021920.1A CN109708636B (zh) | 2017-10-26 | 2017-10-26 | 导航图配置方法、避障方法以及装置、终端、无人飞行器 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019080924A1 true WO2019080924A1 (zh) | 2019-05-02 |
Family
ID=66247759
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/112077 WO2019080924A1 (zh) | 2017-10-26 | 2018-10-26 | 导航图配置方法、避障方法以及装置、终端、无人飞行器 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20200394924A1 (zh) |
EP (1) | EP3702731A4 (zh) |
JP (1) | JP2020535545A (zh) |
KR (1) | KR102385820B1 (zh) |
CN (1) | CN109708636B (zh) |
AU (1) | AU2018355491B2 (zh) |
WO (1) | WO2019080924A1 (zh) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111950524A (zh) * | 2020-08-28 | 2020-11-17 | 广东省现代农业装备研究所 | 一种基于双目视觉和rtk的果园局部稀疏建图方法和系统 |
CN113012479A (zh) * | 2021-02-23 | 2021-06-22 | 欧阳嘉兰 | 一种基于障碍物分析的飞行限重测量方法、装置及系统 |
CN113077551A (zh) * | 2021-03-30 | 2021-07-06 | 苏州臻迪智能科技有限公司 | 占据栅格地图构建方法、装置、电子设备和存储介质 |
CN113448326A (zh) * | 2020-03-25 | 2021-09-28 | 北京京东乾石科技有限公司 | 机器人定位方法及装置、计算机存储介质、电子设备 |
CN113485359A (zh) * | 2021-07-29 | 2021-10-08 | 北京超维世纪科技有限公司 | 一种工业类巡检机器人多传感器融合避障系统 |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110262556A (zh) * | 2019-07-12 | 2019-09-20 | 黑梭智慧技术(北京)有限公司 | 快递物流无人飞行器航线设计方法和装置 |
CN110471421B (zh) * | 2019-08-27 | 2022-03-18 | 广州小鹏汽车科技有限公司 | 一种车辆安全行驶的路径规划方法及路径规划系统 |
CN112313476A (zh) * | 2019-11-05 | 2021-02-02 | 深圳市大疆创新科技有限公司 | 无人飞行器的航线规划方法和装置 |
US11244164B2 (en) * | 2020-02-03 | 2022-02-08 | Honeywell International Inc. | Augmentation of unmanned-vehicle line-of-sight |
JP7412037B2 (ja) * | 2020-02-20 | 2024-01-12 | 株式会社ナイルワークス | ドローンシステム、操作器および作業エリアの定義方法 |
US20210300551A1 (en) * | 2020-03-25 | 2021-09-30 | Tencent America LLC | Systems and methods for unmanned aerial system communication |
CN113465614B (zh) * | 2020-03-31 | 2023-04-18 | 北京三快在线科技有限公司 | 无人机及其导航地图的生成方法和装置 |
CN111854754B (zh) * | 2020-06-19 | 2023-01-24 | 北京三快在线科技有限公司 | 无人机航线规划方法、装置、无人机及存储介质 |
CN112033413B (zh) * | 2020-09-07 | 2023-06-16 | 北京信息科技大学 | 一种基于结合环境信息的改进a*算法的路径规划方法 |
CN112066976B (zh) * | 2020-09-07 | 2023-06-16 | 北京信息科技大学 | 一种自适应膨胀处理方法、系统、机器人及存储介质 |
CN112116643A (zh) * | 2020-09-14 | 2020-12-22 | 哈工大机器人(合肥)国际创新研究院 | 一种基于tof相机深度图和点云图的避障处理方法及系统 |
CN112416018B (zh) * | 2020-11-24 | 2021-07-09 | 广东技术师范大学 | 基于多信号采集与路径规划模型的无人机避障方法和装置 |
CN112859893B (zh) * | 2021-01-08 | 2024-07-26 | 中国商用飞机有限责任公司北京民用飞机技术研究中心 | 一种飞行器避障方法、装置 |
CN113086227A (zh) * | 2021-03-30 | 2021-07-09 | 武汉学院 | 矢量共轴手持云台一体无人机及其智能系统 |
CN113310493B (zh) * | 2021-05-28 | 2022-08-05 | 广东工业大学 | 一种基于事件触发机制的无人机实时导航方法 |
CN113465606A (zh) * | 2021-06-30 | 2021-10-01 | 三一机器人科技有限公司 | 末端工位定位方法、装置及电子设备 |
CN115222808B (zh) * | 2021-06-30 | 2023-10-20 | 达闼机器人股份有限公司 | 基于无人机的定位方法、装置、存储介质和电子设备 |
CN113532471B (zh) * | 2021-07-15 | 2024-07-19 | 浙江东进航科信息技术有限公司 | 一种多源飞行轨迹数据融合的处理方法、设备及介质 |
CN114088094A (zh) * | 2021-09-27 | 2022-02-25 | 华中光电技术研究所(中国船舶重工集团公司第七一七研究所) | 一种无人艇的智能航路规划方法及系统 |
CN113867349B (zh) * | 2021-09-28 | 2024-04-09 | 浙江大华技术股份有限公司 | 一种机器人的避障方法、系统及智能机器人 |
CN113642092B (zh) * | 2021-10-18 | 2022-01-04 | 西南交通大学 | 一种建筑空间路径捕获方法 |
WO2023070667A1 (zh) * | 2021-11-01 | 2023-05-04 | 深圳市大疆创新科技有限公司 | 可移动平台及用于处理其数据的方法和装置、终端设备 |
CN114313243B (zh) * | 2021-12-19 | 2023-06-02 | 四川省天域航通科技有限公司 | 一种植保用避障无人机 |
KR102622623B1 (ko) * | 2022-05-12 | 2024-01-10 | 한국광기술원 | 3차원 영상 정보를 제공하기 위한 이동형 영상 촬영 장치, 이에 대한 방법 및 이를 포함하는 시스템 |
CN114879704B (zh) * | 2022-07-11 | 2022-11-25 | 山东大学 | 一种机器人绕障控制方法及系统 |
CN115150784B (zh) * | 2022-09-02 | 2022-12-06 | 汕头大学 | 基于基因调控网络的无人机集群区域覆盖方法及设备 |
WO2024195211A1 (ja) * | 2023-03-17 | 2024-09-26 | 日本電気株式会社 | 制御装置、制御方法およびプログラム |
CN116757582B (zh) * | 2023-08-18 | 2023-11-17 | 山西汇能科技有限公司 | 基于无人机的物流配送系统及方法 |
CN116907511B (zh) * | 2023-09-12 | 2023-12-05 | 北京宝隆泓瑞科技有限公司 | 一种将管道坐标转换为图像坐标的方法 |
CN118470237B (zh) * | 2024-07-12 | 2024-09-17 | 中航材导航技术(北京)有限公司 | 自动生成飞行程序标准仪表图的方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1975646A2 (en) * | 2007-03-28 | 2008-10-01 | Honeywell International Inc. | Lader-based motion estimation for navigation |
CN105571588A (zh) * | 2016-03-10 | 2016-05-11 | 赛度科技(北京)有限责任公司 | 一种无人机三维空中航路地图构建及其航路显示方法 |
CN105910604A (zh) * | 2016-05-25 | 2016-08-31 | 武汉卓拔科技有限公司 | 一种基于多传感器的自主避障导航系统 |
CN106595659A (zh) * | 2016-11-03 | 2017-04-26 | 南京航空航天大学 | 城市复杂环境下多无人机视觉slam的地图融合方法 |
CN106931961A (zh) * | 2017-03-20 | 2017-07-07 | 成都通甲优博科技有限责任公司 | 一种自动导航方法及装置 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4061596B2 (ja) * | 2004-05-20 | 2008-03-19 | 学校法人早稲田大学 | 移動制御装置、環境認識装置及び移動体制御用プログラム |
JP5233432B2 (ja) * | 2008-06-16 | 2013-07-10 | アイシン・エィ・ダブリュ株式会社 | 運転支援システム、運転支援方法及び運転支援プログラム |
JP5093020B2 (ja) * | 2008-09-18 | 2012-12-05 | トヨタ自動車株式会社 | レーダ装置 |
CN102359784B (zh) * | 2011-08-01 | 2013-07-24 | 东北大学 | 一种室内移动机器人自主导航避障系统及方法 |
CN103576686B (zh) * | 2013-11-21 | 2017-01-18 | 中国科学技术大学 | 一种机器人自主导引及避障的方法 |
US9772712B2 (en) * | 2014-03-11 | 2017-09-26 | Textron Innovations, Inc. | Touch screen instrument panel |
WO2016015251A1 (en) * | 2014-07-30 | 2016-02-04 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
JP6278539B2 (ja) * | 2014-09-05 | 2018-02-14 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 状況に基づく飛行モード選択 |
CN104236548B (zh) * | 2014-09-12 | 2017-04-05 | 清华大学 | 一种微型无人机室内自主导航方法 |
JP6387782B2 (ja) * | 2014-10-17 | 2018-09-12 | ソニー株式会社 | 制御装置、制御方法及びコンピュータプログラム |
US9399524B2 (en) * | 2014-10-21 | 2016-07-26 | Honeywell International Inc. | System and method for displaying runway landing information |
KR101736089B1 (ko) * | 2015-01-08 | 2017-05-30 | 서울대학교산학협력단 | 깊이 지도를 이용한 사물 형상 맵핑 및 실시간 유도를 위한 무인기 비행 제어 장치 및 방법 |
US9470528B1 (en) * | 2015-03-26 | 2016-10-18 | Honeywell International Inc. | Aircraft synthetic vision systems utilizing data from local area augmentation systems, and methods for operating such aircraft synthetic vision systems |
WO2017071143A1 (en) * | 2015-10-30 | 2017-05-04 | SZ DJI Technology Co., Ltd. | Systems and methods for uav path planning and control |
CN105678754B (zh) * | 2015-12-31 | 2018-08-07 | 西北工业大学 | 一种无人机实时地图重建方法 |
CN105761265A (zh) * | 2016-02-23 | 2016-07-13 | 英华达(上海)科技有限公司 | 利用影像深度信息提供避障的方法及无人飞行载具 |
CN114610049A (zh) * | 2016-02-26 | 2022-06-10 | 深圳市大疆创新科技有限公司 | 用于修改无人飞行器自主飞行的系统和方法 |
CN105955258B (zh) * | 2016-04-01 | 2018-10-30 | 沈阳工业大学 | 基于Kinect传感器信息融合的机器人全局栅格地图构建方法 |
JP6327283B2 (ja) * | 2016-04-06 | 2018-05-23 | トヨタ自動車株式会社 | 車両用情報提供装置 |
CN106780592B (zh) * | 2016-06-30 | 2020-05-22 | 华南理工大学 | 基于相机运动和图像明暗的Kinect深度重建方法 |
CN106127788B (zh) * | 2016-07-04 | 2019-10-25 | 触景无限科技(北京)有限公司 | 一种视觉避障方法和装置 |
-
2017
- 2017-10-26 CN CN201711021920.1A patent/CN109708636B/zh active Active
-
2018
- 2018-10-26 EP EP18871786.2A patent/EP3702731A4/en not_active Withdrawn
- 2018-10-26 KR KR1020207005722A patent/KR102385820B1/ko active IP Right Grant
- 2018-10-26 AU AU2018355491A patent/AU2018355491B2/en not_active Ceased
- 2018-10-26 US US16/641,763 patent/US20200394924A1/en not_active Abandoned
- 2018-10-26 WO PCT/CN2018/112077 patent/WO2019080924A1/zh unknown
- 2018-10-26 JP JP2020517902A patent/JP2020535545A/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1975646A2 (en) * | 2007-03-28 | 2008-10-01 | Honeywell International Inc. | Lader-based motion estimation for navigation |
CN105571588A (zh) * | 2016-03-10 | 2016-05-11 | 赛度科技(北京)有限责任公司 | 一种无人机三维空中航路地图构建及其航路显示方法 |
CN105910604A (zh) * | 2016-05-25 | 2016-08-31 | 武汉卓拔科技有限公司 | 一种基于多传感器的自主避障导航系统 |
CN106595659A (zh) * | 2016-11-03 | 2017-04-26 | 南京航空航天大学 | 城市复杂环境下多无人机视觉slam的地图融合方法 |
CN106931961A (zh) * | 2017-03-20 | 2017-07-07 | 成都通甲优博科技有限责任公司 | 一种自动导航方法及装置 |
Non-Patent Citations (3)
Title |
---|
CHEN, BAO-HUA ET AL.: "Instant Dense 3D Reconstruction-Based UAV Vision Localization", ACTA ELECTRONICA SINICA, vol. 45, no. 6, 30 June 2017 (2017-06-30), pages 1294 - 1300, XP055683181, ISSN: 0372-2112, DOI: 10.3969/j.issn.0372-2112.2017.06.003 * |
See also references of EP3702731A4 * |
YANG, WEI ET AL.: "A Fast Autonomous Obstacle Avoidance Algorithm Based on RGB-D Camera", JOURNAL OF HUNAN UNIVERSITY OF TECHNOLOGY, vol. 29, no. 6, 30 November 2015 (2015-11-30), pages 74 - 79, XP009519863, ISSN: 1673-9833, DOI: 10.3969/j.issn.1673-9833.2015.06.015 *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113448326A (zh) * | 2020-03-25 | 2021-09-28 | 北京京东乾石科技有限公司 | 机器人定位方法及装置、计算机存储介质、电子设备 |
CN111950524A (zh) * | 2020-08-28 | 2020-11-17 | 广东省现代农业装备研究所 | 一种基于双目视觉和rtk的果园局部稀疏建图方法和系统 |
CN111950524B (zh) * | 2020-08-28 | 2024-03-29 | 广东省现代农业装备研究所 | 一种基于双目视觉和rtk的果园局部稀疏建图方法和系统 |
CN113012479A (zh) * | 2021-02-23 | 2021-06-22 | 欧阳嘉兰 | 一种基于障碍物分析的飞行限重测量方法、装置及系统 |
CN113077551A (zh) * | 2021-03-30 | 2021-07-06 | 苏州臻迪智能科技有限公司 | 占据栅格地图构建方法、装置、电子设备和存储介质 |
CN113485359A (zh) * | 2021-07-29 | 2021-10-08 | 北京超维世纪科技有限公司 | 一种工业类巡检机器人多传感器融合避障系统 |
Also Published As
Publication number | Publication date |
---|---|
US20200394924A1 (en) | 2020-12-17 |
JP2020535545A (ja) | 2020-12-03 |
AU2018355491B2 (en) | 2022-03-17 |
KR20200031165A (ko) | 2020-03-23 |
CN109708636B (zh) | 2021-05-14 |
AU2018355491A1 (en) | 2020-06-11 |
CN109708636A (zh) | 2019-05-03 |
EP3702731A4 (en) | 2021-07-28 |
KR102385820B1 (ko) | 2022-04-12 |
EP3702731A1 (en) | 2020-09-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019080924A1 (zh) | 导航图配置方法、避障方法以及装置、终端、无人飞行器 | |
US10914590B2 (en) | Methods and systems for determining a state of an unmanned aerial vehicle | |
US11237572B2 (en) | Collision avoidance system, depth imaging system, vehicle, map generator and methods thereof | |
US20210247764A1 (en) | Multi-sensor environmental mapping | |
Meyer et al. | Comprehensive simulation of quadrotor uavs using ros and gazebo | |
CN109219785B (zh) | 一种多传感器校准方法与系统 | |
US20200007746A1 (en) | Systems, methods, and devices for setting camera parameters | |
US10459445B2 (en) | Unmanned aerial vehicle and method for operating an unmanned aerial vehicle | |
US10240930B2 (en) | Sensor fusion | |
US20190346562A1 (en) | Systems and methods for radar control on unmanned movable platforms | |
ES2889000T3 (es) | Métodos y sistema para controlar un objeto móvil | |
WO2010137596A1 (ja) | 移動体制御装置及び移動体制御装置を搭載した移動体 | |
WO2016187758A1 (en) | Sensor fusion using inertial and image sensors | |
US10983535B2 (en) | System and method for positioning a movable object | |
WO2016023224A1 (en) | System and method for automatic sensor calibration | |
US10937325B2 (en) | Collision avoidance system, depth imaging system, vehicle, obstacle map generator, and methods thereof | |
WO2021199449A1 (ja) | 位置算出方法及び情報処理システム | |
US20210229810A1 (en) | Information processing device, flight control method, and flight control system | |
JP2024021143A (ja) | 3次元データ生成システム、及び3次元データ生成方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18871786 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20207005722 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020517902 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2018871786 Country of ref document: EP Effective date: 20200526 |
|
ENP | Entry into the national phase |
Ref document number: 2018355491 Country of ref document: AU Date of ref document: 20181026 Kind code of ref document: A |