WO2019041266A1 - Path planning method, aircraft, and flight system (一种路径规划方法、飞行器、飞行系统) - Google Patents

Path planning method, aircraft, and flight system (一种路径规划方法、飞行器、飞行系统)

Info

Publication number
WO2019041266A1
Authority
WO
WIPO (PCT)
Prior art keywords
navigation path
aircraft
navigation
current
depth map
Prior art date
Application number
PCT/CN2017/100034
Other languages
English (en)
French (fr)
Inventor
周游
刘洁
熊策
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to CN201780004776.2A priority Critical patent/CN108513643A/zh
Priority to PCT/CN2017/100034 priority patent/WO2019041266A1/zh
Publication of WO2019041266A1 publication Critical patent/WO2019041266A1/zh

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • the present invention relates to the field of electronic technologies, and in particular, to a path planning method, an aircraft, and a flight system.
  • Aircraft such as drones, remote-controlled flying devices, and aerial photography devices have developed considerably with advances in electronic technology.
  • At present, aircraft usually rely on an operator to fly manually when performing task operations, or follow only simple navigation route planning, such as flying along straight lines over the area to be worked (for example, farmland, a tea garden, or terraced fields).
  • The path planning approaches above offer a low degree of automation, and the operating efficiency of the aircraft is low.
  • The embodiments of the invention disclose a path planning method, an aircraft, and a flight system, which can improve the degree of automation and the working efficiency of the aircraft.
  • A first aspect of the embodiments of the present invention discloses a path planning method, including: fitting a work edge line from the depth map, the work edge line being used to identify the edge of the area to be worked; determining a target edge line according to the current position information of the aircraft and the fitted work edge lines; planning a navigation path according to the target edge line, the navigation path being located above the area to be worked; and controlling the aircraft to fly along the navigation path to perform a task operation on the area to be worked.
  • A second aspect of the embodiments of the present invention discloses an aircraft, including a memory and a processor; the memory is configured to store program instructions; the processor is configured to execute the program instructions stored in the memory, and when the program instructions are executed, the processor is configured to: fit a work edge line from the depth map; determine a target edge line according to the current position information of the aircraft and the fitted work edge lines; plan a navigation path according to the target edge line; and control the aircraft to fly along the navigation path to perform a task operation on the area to be worked.
  • A third aspect of the embodiments discloses a flight system, including: at least one camera device; and the aircraft according to the second aspect.
  • In the embodiments of the present invention, the aircraft may fit a work edge line from the depth map, determine a target edge line according to its current position information and the work edge lines, and plan a navigation path according to the target edge line. Performing the task operation along this navigation path can effectively cover the area to be worked and improve the working efficiency of the aircraft without excessive manual intervention, which satisfies the user's demand for automation and intelligence.
  • FIG. 1 is a schematic diagram of a scenario for path planning according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of another scenario for path planning according to an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of a path planning method according to an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of another path planning method according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of an aircraft according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of a flight system according to an embodiment of the present invention.
  • Path planning for an aircraft is usually performed in one of the following two ways:
  • One way is to rely on the operator to manually control the flight operation, that is, the operator uses a remote control device, such as a remote controller, to fly the aircraft remotely. However, this method depends on the operator's skill level. For a non-professional user, simultaneously covering the area to be worked effectively and keeping the aircraft within a certain height range while performing the task is undoubtedly very difficult; it reduces the working efficiency of the aircraft, and the degree of automation is low.
  • The other way applies to an area to be worked whose shape is relatively regular and flat (usually arranged in straight lines, such as a polygonal rice field). The operator first outlines the work area on a map with a polygon and marker points, and the aircraft then automatically generates several mutually parallel straight navigation paths (for example, a zigzag back-and-forth path) from that polygon and the marker points to cover the work area.
  • For example, as shown in FIG. 1, the area to be worked is a hexagonal rice field. The operator may first draw the shape of the area to be worked on the map and then place marker points at its vertices. In FIG. 1, the shape drawn by the operator is indicated by a solid black line and the marker points are indicated by circles. The aircraft can then cover the entire area with a set of mutually parallel straight lines determined from the marker points and the shape of the area; these parallel straight lines constitute the navigation path and are indicated by dotted lines in FIG. 1.
  • However, the above methods still require human intervention and are only applicable to work areas whose shape is relatively regular and flat. For areas with irregular shapes (usually not arranged in straight lines) and with height variation (such as terraced fields or tea gardens), directly applying straight navigation paths cannot effectively cover the entire area because of the height variation and irregular shape, and the degree of automation of the aircraft remains low.
  • FIG. 2 is a schematic diagram of another scenario for path planning according to an embodiment of the present invention.
  • In FIG. 2, an imaging device may be disposed on the aircraft, and the imaging device may be used to capture a depth map of the region below the aircraft that includes the area to be worked.
  • the imaging device may be, for example, a binocular camera, an aerial camera, or the like.
  • The depth map may be a downward-looking depth map, that is, a depth map of the area to be worked captured by the aircraft shooting downward from above.
  • It should be noted that the area to be worked shown in the depth map may be only a local portion of the area to be worked, that is, the depth map may include only part of the area. The area to be worked may be arranged along curves, but within that local portion its edges can be approximated as straight lines.
  • The area to be worked in the embodiments of the present invention may be an area, such as a tea garden or terraced fields, that contains work objects such as tea trees planted in a tea garden or crops grown on terraces.
  • In one embodiment, the aircraft can use a binocular vision system, calling the imaging device to acquire the depth map, and obtain a plurality of work edge lines from the depth map, denoted in FIG. 2 as work edge line 1, work edge line 2, and work edge line 3. It should be noted that in other embodiments the number of work edge lines may be five, seven, and so on; the present invention does not impose any limitation on this. The larger the field of view (FOV) of the imaging device and the higher the flying height of the drone, the more work edge lines can be captured.
  • In one embodiment, the aircraft may search outward from the image center region of the depth map toward both sides for depth jump points, where a depth jump point is a point whose depth value differs from the depth values of one or more adjacent depth points by at least a preset threshold (for example, 2 meters or 3 meters). Depth jump points are represented by circles in FIG. 2 (the depth jump points of work edge line 3 are not shown).
  • In one embodiment, the aircraft may fit the work edge lines from the obtained depth jump points; a work edge line is used to identify an edge of the area to be worked. In FIG. 2 the work edge lines are work edge line 1, work edge line 2, and work edge line 3, and the aircraft can determine the area to be worked in the depth map from the two adjacent work edge lines closest to the aircraft's shooting position. In FIG. 2, those two lines are work edge line 1 and work edge line 2; they may serve as the target edge lines, and the area between them may be the area to be worked.
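  • As an illustration of this selection step, the following Python sketch (not taken from the application; the function names and toy coordinates are hypothetical) picks, from several fitted work edge lines expressed as ax + by + c = 0, the two lines closest to the aircraft's shooting position; the strip between them is treated as the area to be worked.

```python
import numpy as np

def point_line_distance(a, b, c, x, y):
    """Perpendicular distance from point (x, y) to the line a*x + b*y + c = 0."""
    return abs(a * x + b * y + c) / np.hypot(a, b)

def select_target_edge_lines(edge_lines, shoot_xy):
    """Pick the two fitted work edge lines closest to the shooting position.

    edge_lines: list of (a, b, c) coefficients, one per fitted work edge line.
    shoot_xy:   (x, y) of the aircraft's shooting position in the same plane.
    Returns the two closest lines; the strip between them is treated as the
    area to be worked.
    """
    dists = [point_line_distance(a, b, c, *shoot_xy) for a, b, c in edge_lines]
    order = np.argsort(dists)[:2]
    return [edge_lines[i] for i in order]

# Example: three roughly parallel edge lines, shooting position above the strip
# between the first two.
lines = [(1.0, 0.0, -1.0), (1.0, 0.0, -4.0), (1.0, 0.0, -9.0)]  # x=1, x=4, x=9
print(select_target_edge_lines(lines, shoot_xy=(2.5, 0.0)))      # -> lines x=1 and x=4
```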
  • the aircraft may plan a navigation path according to the target edge line, and control the aircraft to automatically fly according to the navigation path to perform a task operation on the to-be-worked area.
  • the navigation path may be a center line between two target edge lines, as shown by a broken line in FIG. 2 .
  • the navigation path may be any position determined according to two target edge lines, for example, a position close to the work edge line 2, and the like, which is not limited in the present invention.
  • It should be noted that if the aircraft flies at a relatively low altitude (for example, 2 or 3 meters), the area to be worked shown in the captured depth map may be only a local image of that area, and the curvature of the area may not be apparent. The work edge lines obtained in such cases are usually straight lines, and the planned navigation path may also be a straight line. Therefore, the aircraft may also adjust the navigation path it is obtaining in real time.
  • In one embodiment, the aircraft may fly according to the navigation path (the first navigation path) obtained from the depth map, correct the current first navigation path using the depth maps acquired by the imaging device during flight, and predict a second navigation path from the corrected first navigation path.
  • In one embodiment, given the first navigation path obtained from the current depth map and the N first navigation paths obtained from previously acquired depth maps (N is a positive integer greater than or equal to 1), the aircraft may first obtain the shooting position coordinates of each depth map, compute the relative position information between each depth map's shooting position coordinates and the shooting position coordinates of the earliest captured depth map (that is, the target position point), and then, according to that relative position information, map the current first navigation path and the recorded N first navigation paths into the reference coordinate system in which the target position point is located.
  • In one embodiment, in the reference coordinate system the aircraft may obtain the flight trajectory it has flown over a period of time, and it may then predict the second navigation path for the un-flown portion based on that trajectory.
  • It can be seen that, with the above path planning method, the aircraft can perform path planning automatically, can effectively cover the area to be worked, and can correct and adjust the upcoming navigation path in real time according to the currently planned navigation path, which improves the degree of automation of the aircraft.
  • FIG. 3 is a schematic flowchart of a path planning method according to an embodiment of the present invention.
  • The method shown in FIG. 3 may include the following steps.
  • S301: The work edge line is obtained by fitting from the depth map.
  • the execution body of the embodiment of the present invention may be an aircraft, and the aircraft is provided with an imaging device for capturing a depth map including an area to be worked under the aircraft.
  • the aircraft may be a drone, a remote control aircraft, an aerial vehicle, etc., which is not limited by the present invention.
  • In some feasible implementations, the drone is an agricultural drone whose work objects are mainly crops. Specifically, the agricultural drone can be equipped with spraying equipment to perform tasks such as pesticide spraying and plant irrigation, and it can also be equipped with an imaging device to photograph farmland scenes and monitor agricultural pests and diseases; the embodiments of the present invention do not impose any limitation on this.
  • the camera device may be an aerial camera, a camera, etc., and the present invention is not limited in any way.
  • the to-be-worked area may refer to an area where the aircraft performs the task operation, and the task may be, for example, a pesticide spraying, a plant irrigation, a scene shooting, or the like, which is not limited by the present invention.
  • It should also be noted that there may be height differences within the area to be worked: for example, the ground of the area may be uneven, or the plants growing in the area may differ in height, for instance reaching up to 3 meters at one edge of the area and up to 1 meter at another. In addition, the area to be worked may be arranged along a curve.
  • the to-be-worked area may be any one or more of tea gardens and terraces.
  • The depth map may be a downward-looking depth map captured by the imaging device. The work edge line can be used to identify the edge of the area to be worked; for example, there may be a gap (such as a gully) between the area to be worked and neighboring areas, and the work edge line at that gap represents the edge of the area to be worked.
  • In some feasible implementations, fitting the work edge line from the depth map includes: determining depth jump points in the depth map according to the depth map, and performing line segment fitting on the depth jump points to obtain the work edge line. The difference between the depth value of a depth jump point and the depth values of one or more adjacent depth points reaches a preset threshold.
  • It should be noted that the aircraft may first extract depth information from the depth map, for example the depth differences between depth points, how the depth changes, and the range of depth differences, and determine the depth jump points in the depth map from that information. For example, if the difference between depth point a and each of its adjacent depth points b and c is 3 meters, which reaches a preset threshold (assumed here to be 2 meters), the aircraft can determine depth point a to be a depth jump point.
  • In one embodiment, the aircraft may perform line segment fitting on the determined depth jump points to obtain a work edge line. For example, as shown in FIG. 2, work edge line 1 can be obtained by fitting the depth jump points in its vicinity. One specific way to fit a work edge line is to require that the distances from the nearby depth jump points to the fitted line all lie within a preset range (for example, within 0.1 m or 0.2 m).
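  • The following is a minimal Python sketch of this fitting step under stated assumptions: depth jump points are taken where horizontally adjacent pixels differ by at least the preset threshold, and a simple least-squares line is accepted only if every jump point lies within the preset distance of it. The helper names and the toy depth map are illustrative, not from the application.

```python
import numpy as np

def find_depth_jump_points(depth, threshold=2.0):
    """Return (row, col) pixels whose depth differs from the right-hand
    neighbour by at least `threshold` metres (the 'depth jump points')."""
    jump_right = np.abs(np.diff(depth, axis=1)) >= threshold
    rows, cols = np.nonzero(jump_right)
    return np.stack([rows, cols], axis=1)

def fit_edge_line(points, max_dist=0.2):
    """Least-squares line fit col = m*row + b through the jump points.

    The fit is only accepted if every point lies within `max_dist` pixels of
    the line, mirroring the 'distances within a preset range' criterion.
    """
    rows, cols = points[:, 0].astype(float), points[:, 1].astype(float)
    m, b = np.polyfit(rows, cols, deg=1)
    dists = np.abs(m * rows - cols + b) / np.hypot(m, 1.0)
    return (m, b) if np.all(dists <= max_dist) else None

# Toy depth map: ground at 5 m with a gully at 8 m to the right of a slanted edge.
depth = np.full((6, 10), 5.0)
for r in range(6):
    depth[r, 4 + r // 3:] = 8.0          # edge column shifts slightly with the row
pts = find_depth_jump_points(depth, threshold=2.0)
print(fit_edge_line(pts, max_dist=1.0))  # slope/intercept of the fitted work edge line
```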
  • It should also be noted that the work edge line obtained by the aircraft may be a line segment along the work edge of the area to be worked within the depth map. Alternatively, the work edge line may be extrapolated from that line segment beyond the depth map; in other words, the work edge line may also extend outside the range captured by the depth map.
  • In one embodiment, the aircraft may further determine, according to its current position information, that the work region closest to the aircraft is the area to be worked, and determine the work edge lines at the edges of that area to be the target edge lines. For example, in FIG. 2, the work region closest to the aircraft may be the largest work region in the entire depth image (the region between work edge line 1 and work edge line 2); work edge line 1 and work edge line 2 can then be the target edge lines selected from the work edge lines.
  • In one embodiment, the aircraft may also apply smoothing, noise filtering, and similar processing to the work edge lines, and obtain the target edge lines from the processed work edge lines.
  • the number of the working edge lines obtained by the fitting may be any number, for example, 2, 3, 5, etc., and the embodiment of the present invention does not impose any limitation.
  • the navigation path is located above the to-be-worked area.
  • the aircraft may select a navigation path directly above the target edge line, or the aircraft may select a navigation path directly above any position between the two target edge lines.
  • the aircraft may determine a centerline between two target edge lines and determine a navigation path based on the centerline.
  • For example, as shown in FIG. 2, the broken-line portion indicates the center line between work edge line 1 and work edge line 2, and the aircraft can take the path directly above that center line as its navigation path.
  • In some feasible implementations, the aircraft may obtain the center line using the random sample consensus (RANSAC) algorithm.
  • For example, as shown in FIG. 2, the aircraft can obtain the straight-line equation of work edge line 1 (that is, a target edge line) from RANSAC; the equation may specifically be L1: a1x + b1y + c1 = 0, where L1 denotes work edge line 1, x its abscissa, y its ordinate, a1 the coefficient of the abscissa, b1 the coefficient of the ordinate, and c1 a constant.
  • The aircraft can likewise obtain the straight-line equation of work edge line 2 (that is, the other target edge line) from RANSAC; the equation may specifically be L2: a2x + b2y + c2 = 0, where L2 denotes work edge line 2, x its abscissa, y its ordinate, a2 the coefficient of the abscissa, b2 the coefficient of the ordinate, and c2 a constant.
  • The center-line equation can then be derived from the two equations above, and may specifically be L3: (a1 + a2)x + (b1 + b2)y + (c1 + c2) = 0. The aircraft can take the path directly above the center line given by the L3 equation as its navigation path.
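  • A small Python sketch of this center-line computation is given below. It follows the L3 formula quoted above and additionally normalizes both line equations to unit normals pointing the same way, an assumption needed for the coefficient sum to yield the mid line; the function names are illustrative.

```python
import numpy as np

def normalize_line(a, b, c):
    """Scale a*x + b*y + c = 0 so that (a, b) is a unit normal."""
    n = np.hypot(a, b)
    return a / n, b / n, c / n

def center_line(l1, l2):
    """Center line L3 of two (roughly parallel) target edge lines L1 and L2,
    following L3: (a1+a2)x + (b1+b2)y + (c1+c2) = 0.

    Both inputs are normalized first and flipped to share a normal direction,
    otherwise the coefficient sum does not give the mid line.
    """
    a1, b1, c1 = normalize_line(*l1)
    a2, b2, c2 = normalize_line(*l2)
    if a1 * a2 + b1 * b2 < 0:            # normals point opposite ways -> flip L2
        a2, b2, c2 = -a2, -b2, -c2
    return a1 + a2, b1 + b2, c1 + c2

# Example: edge lines x = 0 and x = 4 give the center line x = 2.
l3 = center_line((1.0, 0.0, 0.0), (1.0, 0.0, -4.0))
print(l3)                                 # (2.0, 0.0, -4.0), i.e. 2x - 4 = 0 -> x = 2
```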
  • It should also be noted that the navigation path may be a navigation path within the portion of the area to be worked captured in the depth map. Alternatively, it may combine that path with a predicted navigation path for the portion of the area to be worked that is not within the depth map. For example, if the navigation path obtained through steps S301 to S303 within the captured portion lies directly above the dotted line segment shown in FIG. 2, the aircraft may extend that dotted line segment to obtain the navigation path for the portion of the area to be worked that is not within the depth map.
  • It should be noted that the aircraft may fly as indicated by the navigation path so as to perform the task operation on the area to be worked from above it.
  • The task operation may be, for example, plant irrigation, pesticide spraying, scene shooting, and the like.
  • the depth map includes relative height information of the aircraft and the area to be worked.
  • It should be noted that the relative height information of the aircraft and the area to be worked may be the difference between the altitude of the aircraft and the altitude of the crops (for example, rice or tea trees) in the area to be worked. For example, if the altitude of the aircraft is 2305 meters and the altitude of the crops in the area is 2300 meters, the relative height information may be that the relative height between the two is 5 meters.
  • the method further includes adjusting a height of the aircraft to a target relative height based on the relative height information to maintain the aircraft flying at the target relative height.
  • the relative height of the target may be, for example, a relative height of 3 meters, 4 meters, etc., which is not limited in the present invention.
  • In some feasible implementations, the aircraft may preset the target relative height, and when it detects that the relative height indicated by the relative height information is greater or less than the target relative height, it may adjust its own height so that the aircraft and the area to be worked are kept at the target relative height.
  • For example, the aircraft may preset the target relative height to 3 meters. It can then obtain the relative height information from the acquired depth map; if the relative height information indicates that the current relative height is 5 meters, which exceeds the target relative height by 2 meters, the aircraft can lower its flying height by 2 meters to keep flying at the target relative height.
  • In some feasible implementations, the aircraft may instead preset a range for the target relative height, for example 2 to 4 meters. If the relative height information obtained from the depth map indicates that the current relative height is 3 meters, which is within the target range, the aircraft can keep flying at its current height without adjustment.
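  • The height-keeping behaviour described above can be sketched as follows (illustrative only; the function name, target value, and band are assumptions): the correction is the difference between the target relative height, or the nearest bound of the allowed range, and the relative height read from the depth map.

```python
def height_adjustment(current_relative_height, target=3.0, band=None):
    """Vertical correction (metres, positive = climb) that keeps the aircraft
    at the target relative height above the crops.

    current_relative_height: relative height read from the depth map.
    target:                  desired relative height (e.g. 3 m).
    band:                    optional (low, high) range; inside it, no correction.
    """
    if band is not None:
        low, high = band
        if low <= current_relative_height <= high:
            return 0.0                      # already inside the allowed range
        target = low if current_relative_height < low else high
    return target - current_relative_height

print(height_adjustment(5.0, target=3.0))          # -2.0 -> descend 2 m
print(height_adjustment(3.0, band=(2.0, 4.0)))     # 0.0  -> keep current height
```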
  • It can be seen that, in the embodiments of the present invention, the aircraft can fit the work edge lines from the depth map, determine the target edge lines according to the work edge lines and the aircraft's current position information, plan a navigation path according to the target edge lines, and fly under the control of that navigation path. The aircraft can thus plan its navigation path automatically, which improves its degree of automation to a certain extent and enables it to perform the task operation along the planned navigation path without human intervention, improving the efficiency of the task operation.
  • FIG. 4 is a schematic flowchart diagram of another path planning method according to an embodiment of the present invention.
  • the method as shown in FIG. 4 may include:
  • the job edge line is used to identify an edge of the to-be-worked area.
  • the first navigation path is a navigation path within a shooting range of the depth map.
  • For example, as shown in FIG. 2, the aircraft may determine that work edge line 1 and work edge line 2 are the target edge lines, and take the line segment indicated by the dotted line, which is determined by the target edge lines, as the navigation path within the shooting range of the depth map, that is, the first navigation path.
  • Optionally, two target edge lines are selected from the fitted work edge lines, and determining the current first navigation path according to the target edge lines includes: determining the center line between the two target edge lines, and determining the current first navigation path according to that center line.
  • In some feasible implementations, the aircraft may select two work edge lines from the plurality of work edge lines as the target edge lines, such as work edge line 1 and work edge line 2 in FIG. 2; the aircraft may then determine the center line segment between the two target edge lines to be its current first navigation path, and the dotted line segment shown in FIG. 2 may be that first navigation path.
  • the second navigation path is a navigation path that is not within the shooting range of the depth map.
  • the second navigation path may be a navigation path of the to-be-worked area where the aircraft has not captured the depth map.
  • For example, suppose the aircraft captures depth map a at the current time of 15:30 and it can be predicted that the aircraft will still need to perform the task operation on the area to be worked at 15:35, but no depth map for 15:35 has been captured yet. The aircraft can therefore use the first navigation path obtained from depth map a at 15:30 to plan the navigation path it may fly at 15:35.
  • the second navigation path can be directly obtained according to the current first navigation path.
  • For example, the dotted line segment shown in FIG. 2 is the first navigation path the aircraft obtained from the depth map; the aircraft can simply extend that dotted line segment in a straight line, and the extended straight line can serve as the second navigation path.
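  • A trivial sketch of this straight-line extension, with hypothetical waypoints, might look as follows.

```python
import numpy as np

def extend_path(p_start, p_end, extra_length):
    """Extend the straight first navigation path (p_start -> p_end) by
    `extra_length` metres in the same direction; the appended segment serves
    as a simple second navigation path."""
    p_start, p_end = np.asarray(p_start, float), np.asarray(p_end, float)
    direction = (p_end - p_start) / np.linalg.norm(p_end - p_start)
    return p_end, p_end + extra_length * direction

print(extend_path((0.0, 0.0), (0.0, 5.0), extra_length=5.0))   # next segment (0,5) -> (0,10)
```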
  • Optionally, obtaining the second navigation path from the current first navigation path includes: controlling the aircraft to fly along the current first navigation path; correcting the current first navigation path according to the depth maps acquired by the imaging device during flight; and predicting the second navigation path from the corrected first navigation path. In some feasible implementations, because the first navigation path has a certain navigation distance, the drone can keep acquiring depth maps while flying along it, correct the portion of the first navigation path that has not yet been flown according to those depth maps, and predict the next segment of navigation path, that is, the second navigation path, from the currently corrected first navigation path.
  • For example, while the aircraft flies along the first navigation path, which may be a navigation segment with a certain navigation distance (for example, 5 meters), it may call the imaging device to acquire depth map b when it has flown 1 meter and depth map c when it has flown 2 meters (that is, depth map b and depth map c are depth maps acquired by the aircraft during flight along the first navigation path). The aircraft can then plan a navigation path h from depth map b and a navigation path f from depth map c (both obtained in the same way as the first navigation path, and both possibly intersecting it). Further, because depth map b and depth map c are captured at positions that are not far apart, the two depth maps may overlap, and so may the navigation paths obtained from them. The aircraft may align the coincident portions of the two planned navigation paths, smooth the two paths, and then correct the un-flown portion of the first navigation path (for example, its last 3 meters), obtaining the corrected first navigation path.
  • In one embodiment, the aircraft may predict the second navigation path from the flight trend of the corrected first navigation path, possibly also combining the portions of depth map b and depth map c that extend beyond the range of the first navigation path. For example, if the corrected first navigation path is a straight segment, the aircraft can infer that the upcoming flight over the area to be worked will also follow a straight line, and can therefore predict that the second navigation path is a straight path.
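  • One possible way to realize this merge-and-smooth step is sketched below, assuming both candidate paths are resampled at common along-track stations and expressed as cross-track offsets; these representations, the NaN padding for uncovered stations, and the moving-average window are assumptions, not details from the application.

```python
import numpy as np

def merge_and_smooth(path_h, path_f, window=3):
    """Fuse two overlapping corrected path segments into one.

    path_h, path_f: cross-track offsets planned from depth map b and depth map c,
                    sampled at the same along-track stations; np.nan marks
                    stations a path does not cover.
    The overlapping part is averaged, then a moving average smooths the result.
    """
    fused = np.nanmean(np.vstack([path_h, path_f]), axis=0)   # average where both exist
    kernel = np.ones(window) / window
    padded = np.pad(fused, window // 2, mode="edge")
    return np.convolve(padded, kernel, mode="valid")

# Stations every metre along the first navigation path; h from depth map b, f from depth map c.
h = np.array([0.0, 0.1, 0.2, 0.3, np.nan, np.nan])
f = np.array([np.nan, 0.1, 0.2, 0.4, 0.5, 0.6])
print(merge_and_smooth(h, f))   # corrected, smoothed offsets for the un-flown portion
```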
  • Optionally, obtaining the second navigation path from the current first navigation path includes: predicting the second navigation path from the current first navigation path and the N recorded first navigation paths, where N is a positive integer greater than or equal to 1. It should be noted that, when planning the second navigation path from the current first navigation path, the aircraft may also adopt this approach of predicting it from the current first navigation path together with the N recorded first navigation paths.
  • In some feasible implementations, the aircraft may join the current first navigation path and the N recorded first navigation paths together by line-segment fitting in chronological order, then smooth each first navigation path according to the flight trend of the respective paths and integrate them into a single smooth navigation trajectory, which can represent the flight trajectory the aircraft has already flown.
  • the aircraft may predict the second navigation path based on flight trends exhibited by the navigation trajectory.
  • Optionally, predicting the second navigation path from the current first navigation path and the N recorded first navigation paths includes: mapping the current first navigation path and the N recorded first navigation paths into the reference coordinate system in which the target position point is located; and predicting the second navigation path from the coordinate positions of the current first navigation path and the N recorded first navigation paths in that reference coordinate system.
  • In some feasible implementations, the target position point may be the shooting position coordinates of the depth map with the earliest shooting time among the depth maps corresponding to the first navigation paths.
  • the target position point may be the shooting position coordinate corresponding to any one of the depth maps, and the present invention does not impose any limitation.
  • the reference coordinate system may be a coordinate system established with the target position point as an origin.
  • In some feasible implementations, the aircraft may first acquire the shooting times of the depth maps corresponding to the current first navigation path and to each of the N recorded first navigation paths, then select the target depth map with the earliest shooting time, take the shooting position coordinates of that target depth map as the target position point, and establish the reference coordinate system with the target position point as the origin.
  • In one embodiment, the aircraft may then map the current first navigation path and the N recorded first navigation paths into the reference coordinate system; in that coordinate system it may first obtain the flight trajectory already flown from the flight direction of each first navigation path, and then predict the second navigation path from that trajectory.
  • Optionally, mapping the current first navigation path and the N recorded first navigation paths into the reference coordinate system in which the target position point is located includes: acquiring the relative position information between the first navigation position point on the current first navigation path and the target position point; acquiring the relative position information between the second navigation position points on the N recorded first navigation paths and the target position point; and mapping the current first navigation path and the N recorded first navigation paths into the reference coordinate system of the target position point according to the acquired relative position information.
  • The relative position information refers to the relative displacement and the attitude rotation relationship between a navigation position point and the target position point.
  • the navigation position point may be a shooting position coordinate of the depth map corresponding to the first navigation path.
  • The first navigation position point is the shooting position point of the depth map corresponding to the current first navigation path, and the second navigation position points are the shooting position points of the depth maps corresponding to each of the N recorded first navigation paths.
  • the first navigation location point and the second navigation location point may also be location coordinates corresponding to a center point of the corresponding depth map.
  • The attitude rotation relationship is obtained from the shooting attitude corresponding to the navigation position point and the shooting attitude corresponding to the target position point. The shooting attitude may be, for example, the imaging device shooting straight down at 90 degrees, or at an angle of 45 degrees, and so on. In some feasible implementations, the shooting attitude can be represented by an attitude quaternion.
  • the relative displacement may refer to a displacement difference between each navigation position point and the target position point, for example, 1 meter, 3 meters, and the like, which is not limited in this embodiment of the present invention.
  • In some feasible implementations, the aircraft may first acquire the attitude rotation relationship and the relative displacement between the navigation position point on the current first navigation path and the target position point, then acquire the attitude rotation relationships and relative displacements between the navigation position points on the N recorded first navigation paths and the target position point, and finally, from the acquired attitude rotation relationships and relative displacements, obtain the coordinate positions of the navigation position points in the reference coordinate system, map the current first navigation path and the N recorded first navigation paths into the reference coordinate system, and predict the second navigation path from the resulting coordinate positions.
  • For example, take three depth maps (depth map 1, depth map 2, and depth map 3), corresponding to three first navigation paths: depth map 1 corresponds to first navigation path 1, depth map 2 to first navigation path 2, and depth map 3 to first navigation path 3. The current first navigation path is first navigation path 1, and the recorded first navigation paths are first navigation path 2 and first navigation path 3.
  • In one embodiment, the aircraft may first acquire the shooting times and shooting coordinates of the three depth maps, select the shooting coordinates with the earliest shooting time (for example, the shooting coordinates of depth map 3) as the target position point, and compute the relative displacement of each of the remaining two shooting coordinates from the target position point. If the position coordinates of the target position point are denoted T0, the shooting coordinates of depth map 1 are denoted T1, and the shooting coordinates of depth map 2 are denoted T2, the relative displacements of depth map 1 and depth map 2 can be expressed, respectively, as t10 = T1 - T0 and t20 = T2 - T0.
  • In one embodiment, the aircraft may obtain, through an inertial measurement unit, the attitude quaternions (representing the shooting attitudes) of the three depth maps at their shooting times: the attitude quaternion of depth map 1 may be denoted Q1, that of depth map 2 denoted Q2, and that of depth map 3 denoted Q0. The attitude rotation relationships of depth map 1 and depth map 2 relative to the target position point can then be obtained from these quaternions, for example as the relative rotations Q10 = Q0^-1 ⊗ Q1 and Q20 = Q0^-1 ⊗ Q2.
  • The shooting position coordinates P1 and P2 of depth map 1 and depth map 2 can then be projected into the reference coordinate system established at the target position point, using the relative displacements and attitude rotation relationships obtained above.
  • In one embodiment, the aircraft can fit the resulting position coordinates with a smooth curve or straight line to obtain the navigation trajectory it has already flown from depth map 1 to depth map 3, and can then predict the likely navigation path that follows the flight trend of that trajectory.
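  • The mapping step can be sketched as follows, assuming the shooting positions T_i are expressed in a common frame and the attitude quaternion Q_0 of the reference depth map rotates the reference frame into that common frame, so that the reference-frame coordinates are the relative displacements t_i0 rotated by the inverse of Q_0. The exact composition used in the application is not reproduced here, and the quaternion ordering is an assumption.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def map_to_reference(shoot_positions, shoot_quats_wxyz, ref_index=0):
    """Express each shooting position in the reference frame of the earliest
    depth map (index `ref_index`), using its position T0 and attitude Q0.

    shoot_positions:  (N, 3) shooting position coordinates T_i.
    shoot_quats_wxyz: (N, 4) attitude quaternions Q_i in (w, x, y, z) order.
    """
    positions = np.asarray(shoot_positions, float)
    # SciPy expects (x, y, z, w); reorder the IMU-style (w, x, y, z) quaternions.
    quats_xyzw = np.asarray(shoot_quats_wxyz, float)[:, [1, 2, 3, 0]]
    rot = R.from_quat(quats_xyzw)
    rot_ref_inv = rot[ref_index].inv()
    t_rel = positions - positions[ref_index]          # t_i0 = T_i - T_0
    return rot_ref_inv.apply(t_rel)                   # relative displacement in the Q0 frame

# Three shooting poses: the reference one, then two more 2 m and 4 m further north.
T = [[0, 0, 10], [0, 2, 10], [0, 4, 10]]
Q = [[1, 0, 0, 0]] * 3                                # identity attitudes for simplicity
print(map_to_reference(T, Q, ref_index=0))
```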
  • For example, if the area to be worked is terraced fields, the navigation trajectory over a period of time may approximate a circular arc with a large radius, and the trend of that arc can be used to correct and adjust the next navigation path in real time.
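  • As an illustration of this arc-based prediction (not the application's own algorithm), the sketch below fits a circle to the flown trajectory with an algebraic least-squares (Kasa) fit and extrapolates the next waypoints along the same arc.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit. Returns (cx, cy, radius)."""
    pts = np.asarray(points, float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x**2 + y**2
    (cx2, cy2, c), *_ = np.linalg.lstsq(A, b, rcond=None)   # solves 2cx*x + 2cy*y + (r^2-cx^2-cy^2) = x^2+y^2
    cx, cy = cx2 / 2.0, cy2 / 2.0
    return cx, cy, np.sqrt(c + cx**2 + cy**2)

def extrapolate_arc(points, extra_angle=0.2, steps=5):
    """Continue the flown trajectory along its fitted arc by `extra_angle` rad."""
    cx, cy, r = fit_circle(points)
    pts = np.asarray(points, float)
    angles = np.unwrap(np.arctan2(pts[:, 1] - cy, pts[:, 0] - cx))
    direction = np.sign(angles[-1] - angles[0]) or 1.0
    new_angles = angles[-1] + direction * np.linspace(0, extra_angle, steps + 1)[1:]
    return np.column_stack([cx + r * np.cos(new_angles), cy + r * np.sin(new_angles)])

# Flown navigation trajectory sampled from a large-radius arc (radius 50 m).
theta = np.linspace(0.0, 0.3, 10)
flown = np.column_stack([50 * np.cos(theta), 50 * np.sin(theta)])
print(extrapolate_arc(flown, extra_angle=0.1, steps=3))   # predicted next waypoints
```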
  • It should be noted that the navigation path may include the first navigation path and the second navigation path. The aircraft may fly during the current time period according to the first navigation path while predicting the second navigation path, and then fly during the next time period according to the second navigation path.
  • It can be seen that, in the embodiments of the present invention, the aircraft can fit the work edge lines from the depth map, determine the target edge lines according to the work edge lines and the aircraft's current position information, plan the first navigation path according to the target edge lines, and plan the second navigation path according to the first navigation path. Controlling the aircraft according to these navigation paths allows its navigation path to be corrected and adjusted, and the path to be planned automatically, which improves the degree of automation of the aircraft to a certain extent and allows it to perform the task operation along the planned navigation path without human intervention, improving the efficiency of its task operations.
  • An embodiment of the present invention provides an aircraft. FIG. 5 is a schematic structural diagram of the aircraft according to an embodiment of the present invention. The aircraft described in this embodiment includes a memory 501 and a processor 502.
  • the memory 501 is configured to store program instructions
  • The processor 502 is configured to execute the program instructions stored in the memory, and when the program instructions are executed, the processor is configured to: fit a work edge line from the depth map, the work edge line being used to identify the edge of the area to be worked; determine a target edge line according to the current position information of the aircraft and the fitted work edge lines; plan a navigation path according to the target edge line, the navigation path being located above the area to be worked; and control the aircraft to fly along the navigation path to perform a task operation on the area to be worked.
  • In some embodiments, when the processor 502 fits the work edge line from the depth map, it is specifically configured to: determine depth jump points in the depth map according to the depth map; and perform line segment fitting on the depth jump points to obtain the work edge line.
  • The difference between the depth value corresponding to a depth jump point and the depth values corresponding to one or more adjacent depth points reaches a preset threshold.
  • In some embodiments, when the processor 502 plans the navigation path according to the target edge line, it is specifically configured to: determine the current first navigation path according to the target edge line, the first navigation path being a navigation path within the shooting range of the depth map; and plan the second navigation path according to the current first navigation path, the second navigation path being a navigation path outside the shooting range of the depth map.
  • In some embodiments, two target edge lines are selected from the fitted work edge lines, and when the processor 502 determines the current first navigation path according to the target edge lines, it is specifically configured to: determine the center line between the two target edge lines and determine the current first navigation path according to that center line.
  • In some embodiments, when the processor 502 plans the second navigation path according to the current first navigation path, it is specifically configured to: control the aircraft to fly along the current first navigation path; correct the current first navigation path according to the depth maps acquired by the imaging device during flight; and predict the second navigation path from the corrected first navigation path.
  • In some embodiments, when the processor 502 plans the second navigation path according to the current first navigation path, it is specifically configured to: predict the second navigation path from the current first navigation path and the N recorded first navigation paths, where N is a positive integer greater than or equal to 1.
  • In some embodiments, when the processor 502 predicts the second navigation path from the current first navigation path and the N recorded first navigation paths, it is specifically configured to: map the current first navigation path and the N recorded first navigation paths into the reference coordinate system in which the target position point is located; and predict the second navigation path from the coordinate positions of the current first navigation path and the N recorded first navigation paths in that reference coordinate system.
  • In some embodiments, when the processor 502 maps the current first navigation path and the N recorded first navigation paths into the reference coordinate system in which the target position point is located, it is specifically configured to: acquire the relative position information between the first navigation position point on the current first navigation path and the target position point; acquire the relative position information between the second navigation position points on the N recorded first navigation paths and the target position point; and map the current first navigation path and the N recorded first navigation paths into the reference coordinate system of the target position point according to the acquired relative position information.
  • the relative position information refers to a relative displacement and attitude rotation relationship between the navigation position point and the target position point.
  • the first navigation location point is a shooting location point of the depth map corresponding to the current first navigation path
  • The second navigation position points are the shooting position points of the depth maps corresponding to each of the N recorded first navigation paths.
  • the attitude rotation relationship is obtained according to a shooting attitude corresponding to the navigation position point and a shooting attitude corresponding to the target position point.
  • the depth map includes relative height information of the aircraft and the area to be worked.
  • the processor 502 is further configured to adjust the height of the aircraft to a target relative height according to the relative height information to keep the aircraft flying at the target relative height.
  • the area to be worked is any one or more of tea gardens and terraces.
  • An embodiment of the present invention provides a flight system. FIG. 6 is a schematic structural diagram of the flight system according to an embodiment of the present invention. As shown in FIG. 6, the flight system includes an aircraft 601 and at least one camera device 602.
  • The aircraft 601 is the aircraft disclosed in the foregoing embodiments of the present invention; its principle and implementation are similar to those of the above embodiments and are not described here again.
  • the camera device 602 can be disposed on the aircraft for capturing a depth map including an area to be worked under the aircraft.
  • the flight system can be applied to equipment such as drones, remote control aircraft, and the like.
  • The imaging device 602 can be mounted on the main body of the drone (that is, the aircraft 601) via a gimbal or other mounting equipment. The imaging device 602 is used to capture images or video during flight of the drone, and includes but is not limited to a multispectral imager, a hyperspectral imager, a visible-light camera, and an infrared camera; there may be one or more imaging devices 602. The aircraft 601 can control the imaging device 602 to capture a depth map, fit work edge lines from the depth map, determine the target edge lines according to the aircraft's current position information and the fitted work edge lines, plan a navigation path according to the target edge lines, and fly under the control of that navigation path to perform the task operation on the area to be worked.
  • It should be noted that the aircraft 601 can be used to perform the path planning method shown in the foregoing method embodiments; for the specific implementation process, reference may be made to those method embodiments, and details are not repeated here.
  • A person of ordinary skill in the art can understand that all or some of the steps of the methods in the above embodiments can be completed by a program instructing relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium can include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

A path planning method, an aircraft, and a flight system. The method includes: fitting a work edge line from the depth map, the work edge line being used to identify the edge of the area to be worked; determining a target edge line according to the current position information of the aircraft and the fitted work edge lines; planning a navigation path according to the target edge line, the navigation path being located above the area to be worked; and controlling the aircraft to fly along the navigation path to perform a task operation on the area to be worked, which can improve the degree of automation and the working efficiency of the aircraft.

Description

一种路径规划方法、飞行器、飞行系统 技术领域
本发明涉及电子技术领域,尤其涉及一种路径规划方法、飞行器、飞行系统。
背景技术
随着电子技术的不断发展,飞行器(例如无人机、遥控飞行装置、航拍装置等)也得到了很大发展。
目前,飞行器通常在进行任务作业时,需要依靠操作者手工飞行作业,或者执行一些较为简单的导航路径规划,例如在该待作业区域(如农田、茶园、梯田等)上沿着直线飞行等等。然而,上述飞行器的路径规划方式自动化程度较低,飞行器的作业效率低下。
发明内容
本发明实施例公开了一种路径规划方法、飞行器、飞行系统,可以提升飞行器的自动化程度,提高飞行器的作业效率。
本发明实施例第一方面公开了一种路径规范方法,包括:
根据所述深度图拟合得到作业边缘线,所述作业边缘线用于标识所述待作业区域的边缘;
根据所述飞行器当前的位置信息以及拟合得到的作业边缘线,确定出目标边缘线;
根据所述目标边缘线规划得到导航路径,其中,所述导航路径位于所述待作业区域的上方;
按照所述导航路径控制所述飞行器飞行,以对所述待作业区域执行任务作业。
本发明实施例第二方面公开了一种飞行器,包括:存储器和处理器;
所述存储器,用于存储程序指令;
所述处理器,用于执行所述存储器存储的程序指令,当程序指令被执行时, 所述处理器用于:
根据所述深度图拟合得到作业边缘线,所述作业边缘线用于标识所述待作业区域的边缘;
根据所述飞行器当前的位置信息以及拟合得到的作业边缘线,确定出目标边缘线;
根据所述目标边缘线规划得到导航路径,其中,所述导航路径位于所述待作业区域的上方;
按照所述导航路径控制所述飞行器飞行,以对所述待作业区域执行任务作业。
本实施例第三方面公开了一种飞行系统,包括:
至少一个摄像装置;
如第二方面所述的飞行器。
本发明实施例中,飞行器可以根据深度图拟合得到作业边缘线,然后根据当前的位置信息以及该作业边缘线,确定出目标边缘线,最后根据该目标边缘线规划得到导航路径,按照该导航路径执行任务作业,可以有效覆盖待作业区域范围,提升飞行器的作业效率,并且不需要过多的人工干预,满足了用户的自动化、智能化需求。
附图说明
为了更清楚地说明本发明实施例中的技术方案,下面将对实施例中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1是本发明实施例提供的一种用于路径规划的情景示意图;
图2是本发明实施例提供的另一种用于路径规划的情景示意图;
图3是本发明实施例提供的一种路径规划方法的流程示意图;
图4是本发明实施例提供的另一种路径规划方法的流程示意图;
图5是本发明实施例提供的一种飞行器的结构示意图;
图6是本发明实施例提供的一种飞行系统的结构示意图。
具体实施方式
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述。
当飞行器(例如无人机等)需要在待作业区域(例如茶园、梯田等)进行喷洒农药、灌溉植物等任务作业时,常常需要按照导航路径飞行,以高效完成任务作业。
飞行器的路径规划方式,通常具有以下两种方式:
一种方式是依靠操作者手工控制飞行作业,即操作者利用遥控器等遥控装置来远程控制飞行器飞行。然而,这种方式依赖于操作者的操作水平,对于非专业飞行器用户来说,在操作飞行器飞行,需要有效覆盖待作业区域的范围的同时,还要兼顾控制飞行器保持在一定的高度范围内执行任务作业,无疑十分困难,降低了飞行器的作业效率,且飞行器的自动化程度较低。
另一种方式是在形状较为规则且平稳的待作业区域(通常为直线排布,如多边形的稻田)等,首先由操作者在地图上通过多边形和标记点勾选出作业区域,然后飞行器再根据该过多边形和标记点自动生成一些互相平行的直线导航路径(例如Z字回环路径),来覆盖作业区域。
举例来说,如图1所示,待作业区域为一六边形的稻田。操作者可以首先在地图上将该待作业区域的形状描绘出来,然后再在待作业区域的顶点处做上标记点,图1中,操作者描绘出的该待作业区域的形状用该黑色实线表示,该标记点用圆圈表示。然后,飞行器可以根据该待作业区域的标记点以及该待作业区域的形状,用一条条互相平行的直线来覆盖整个待作业区域,该一条条互相平行的直线即为导航路径,在图1中用虚线表示。
然而,上述方式依然需要人为干预,且只适用于形状较为规则且平稳的待作业区域,对于形状不规则(通常不呈直线排布),且具有高度变化的待作业区域(例如梯田、茶园等),如果直接采用直线导航路径的方式,由于其高度变化和不规则的形状,势必不能有效覆盖整个待作业区域的范围,飞行器的自动化程度较低。
为了解决上述提出的技术问题,提高飞行器的自动化程度,本发明实施例 提供了一种路径规划方法、飞行器、飞行系统。为了更好的说明,请参阅图2,为本发明实施例提供的另一种用于路径规划的情景示意图。
在图2中,飞行器上可以设置有摄像装置,该摄像装置可以用于拍摄该飞行器下方的包括待作业区域的深度图。其中,该摄像装置例如可以是双目摄像头、航拍相机等。
其中,该深度图可以为下视深度图,即飞行器从上方拍摄下方获取到的待作业区域的深度图。
需要说明的是,该深度图中的待作业区域可以为该待作业区域的局部,也就是说,该深度图可以只包括该待作业区域的一部分。该待作业区域可以为曲线排列的区域,但在待作业区域的局部,可以近似认为该待作业区域的边缘为直线排列。
还需要说明的是,本发明实施例所述的待作业区域,可以是指茶园或者梯田等存在作业对象(例如茶园中种植的茶树、梯田中种植的农作物等等)的区域。
在一个实施例中,飞行器可以利用双目视觉系统,调用摄像装置获取深度图,并从该深度图得到多个作业边缘线,图2中以作业边缘线1、作业边缘线2、作业边缘线3来表示,应知,在其他实施例中,该作业边缘线的数量还可以为5条、7条等,本发明对此不作任何限制。
其中,摄像装置的视场角(field of view,FOV)越大,拍摄到的作业边缘线的数量就可以越多,以及该无人机的飞行高度越高,拍摄到的作业边缘线的数量也可以越多。换句话说,该深度图中的作业边缘线中的数量可以取决于摄像装置的视场角(field of view,FOV),以及该无人机的飞行高度。
在一个实施例中,该飞行器可以从深度图的图像中心区域开始,向两侧搜索深度跳变点,其中,该深度跳变点可以指该点对应的深度值与相邻的一个或者多个深度点所对应的深度值之间的差值达到预设阈值(例如2米,3米等),图2中以圆圈来表示该深度跳变点(其中,作业边缘线3的深度跳变点未示出)。
在一个实施例中,该飞行器可以根据得到的深度跳变点,拟合得到该作业边缘线,该作业边缘线可以用于标识该待作业区域的边缘。在图2中,该作业边缘线分别为作业边缘线1、作业边缘线2、作业边缘线3,飞行器可以根据距 离该飞行器拍摄位置最近的相邻两条作业边缘线,确定出深度图中的待作业区域,例如在图2中,距离该飞行器拍摄位置最近的相邻两条作业边缘线为作业边缘线1和作业边缘线2,那么该作业边缘线1和作业边缘线2则可以是目标边缘线,它们之间的区域则可以是该待作业区域。
在一个实施例中,该飞行器可以根据该目标边缘线规划得到导航路径,并按照该导航路径控制飞行器自动飞行,以对该待作业区域执行任务作业。其中,该导航路径可以是两条目标边缘线之间的中心线,如图2中的虚线所示。或者,该导航路径还可以是根据两条目标边缘线确定的任意位置,例如靠近作业边缘线2的位置等,本发明对此不作任何限制。
需要说明的是,飞行器如果是在较低的高度(例如2米,3米等)飞行,拍摄出来的深度图显示的待作业区域,可能是该待作业区域的局部图像,可能不易看出待作业区域的走势弯曲程度,通常得到的作业边缘线为直线,规划出的导航路径也可以是直线。因此,飞行器还可以实时调整并正在得到的导航路径。
在一个实施例中,飞行器可以根据从该深度图得到的导航路径(第一导航路径)控制该飞行器飞行,并在飞行过程中通过摄像装置采集到的深度图对该当前的第一导航路径进行修正,并根据修正后的第一导航路径预测得到第二导航路径。
在一个实施例中,该飞行器可以根据当前深度图得到的第一导航路径和之前获取到的深度图的N个第一导航路径(N为大于等于1的正整数),首先获取各个深度图的拍摄位置坐标,并求取各个深度图的拍摄位置坐标距离最早拍摄的深度图的拍摄位置坐标(即目标位置点)的相对位置信息,然后根据该相对位置信息将当前第一导航路径和已记录的N个第一导航路径,映射到目标位置点所在参考坐标系下。
在一个实施例中,该飞行器可以在该参考坐标系下得到该飞行器已飞行过的一段时间内的飞行轨迹,然后,飞行器可以根据该飞行轨迹,预测出未飞行部分的第二导航路径。
可见,通过上述路径规划的方法,可以使飞行器自动进行路径规划,并且能够有效覆盖待作业区域的范围,并按照当前规划的导航路径实时纠正并调整接下来的导航路径,提高了飞行器的自动化程度。
为了更清楚的描述,下面描述本发明实施例中所描述的路径规划方法。请参阅图3,为本发明实施例提供的一种路径规划方法的流程示意图,图1所示的方法可包括:
S301、根据所述深度图拟合得到作业边缘线。
需要说明的是,本发明实施例的执行主体可以是飞行器,该飞行器上设置有摄像装置,所述摄像装置用于拍摄所述飞行器下方的包括待作业区域的深度图。
其中,该飞行器可以是无人机、遥控飞行器、航拍飞行器等等,本发明对此不作限制。
在一些可行的实施方式中,该无人机为农业无人机。其中,该农业无人机的作业对象主要为农作物,具体的,该农业无人机可以搭载喷洒设备来执行农药喷洒、植物灌溉等任务作业,还可以搭载摄像装置来执行农田场景拍摄、监控农业病虫害等任务作业,本发明实施例对此不作任何限制。
其中,该摄像装置可以是航拍相机、摄像头等等,本发明对比不作任何限制。
在一些可行的实施方式中,所述待作业区域内存在高度差,且为曲线排列。
需要说明的是,该待作业区域可以指飞行器执行任务作业的区域,该任务作业例如可以是农药喷洒,植物灌溉,场景拍摄等等,本发明对此不作限制。
还需要说明的是,该待作业区域内可以存在高度差,例如待作业区域的地面为凹凸不平的地面,或者该待作业区域内生长的植物具有高度差异,举例来说,该待作业区域内生长的植物可以在区域边缘处最高为3米,在区域边缘处最高为1米等等。并且,该待作业区域可以按照曲线排列。
其中,该待作业区域可以为茶园、梯田的任意一种或多种。
还需要说明的是,该深度图可以是摄像装置拍摄得到的下视深度图。
其中,该作业边缘线可用于标识该待作业区域的边缘。举例来说,待作业区域与其他区域可以存在间隔(例如沟壑等),该作业边缘线可以在该间隔处,用来表示该待作业区域的边缘。
在一些可行的实施方式中,该根据该深度图拟合得到作业边缘线,包括: 根据该深度图确定该深度图中的深度跳变点;根据该深度跳变点进行线段拟合得到作业边缘线。
其中,该深度跳变点所对应的深度值与相邻的一个或者多个深度点所对应的深度值之间的差值达到预设阈值。
需要说明的是,飞行器可以首先根据该深度图,提取该深度图中的深度信息,例如各个深度点之间的深度差,深度变化情况,深度差异范围等等,并根据该深度信息确定出该深度图中的深度跳变点,例如深度图中的深度点a与相邻的深度点b和深度点c之间的差值均为3米,达到了预设阈值(假设此处预设阈值为2米),那么该飞行器可以确定深度点a为深度跳变点。
在一个实施例中,该飞行器可以根据确定出来的深度跳变点进行线段拟合,得到作业边缘线。举例来说,如图2所示,作业边缘线1可以根据其附近的深度跳变点拟合得到。
在一个实施例中,该飞行器拟合作业边缘线的具体方式可以是拟合得到的作业边缘线距离附近的深度跳变点的距离都在预设范围内(例如在0.1米内,0.2米内等等)。
还需要说明的是,该飞行器得到的作业边缘线可以是在该深度图范围内的待作业区域的作业边缘的线段。或者,该作业边缘线也可以是根据该待作业区域的作业边缘的线段推测的未在该深度图范围内的作业边缘,换句话说,该作业边缘线也可以超出该深度图拍摄的范围。
S302、根据所述飞行器当前的位置信息以及拟合得到的作业边缘线,确定出目标边缘线。
在一个实施例中,飞行器还可以根据该飞行器当前所在的位置信息,确定距离该飞行器最近的作业区域为待作业区域,并确定该待作业区域边缘的作业边缘线为目标边缘线。例如图2中,离该飞行器最近的待作业区域可以是整个深度图像中范围最大的作业区域(作业边缘线1以及作业边缘线2之间的作业区域),那么该作业边缘线1以及作业边缘线2可以是从作业边缘线中选择出来的目标边缘线。
在一个实施例中,该飞行器还可以根据将该作业边缘线进行平滑处理、噪点过滤处理等等,并根据处理后的作业边缘线得到目标边缘线。
其中,一张深度图中,该拟合得到的作业边缘线的数量可以为任意数量,例如为2条、3条、5条等等,本发明实施例对此不作任何限制。
S303、根据所述目标边缘线规划得到导航路径。
其中,所述导航路径位于所述待作业区域的上方。
在一个实施例中,该飞行器可以选择该目标边缘线的正上方为导航路径,或者,该飞行器也可以选择两条目标边缘线之间的任意位置的正上方为导航路径。
在一个实施例中,该飞行器可以确定两条目标边缘线间的中心线,并根据该中心线确定导航路径。举例来说,如图2所示,虚线部分表示作业边缘线1和作业边缘线2的中心线,该飞行器便可以将该中心线对应的正上方作为该飞行器的导航路径。
在一些可行的实施方式中,该飞行器可以根据随机抽样一致算法(Random sample consensus,RANSAC)得到该中心线。举例来说,如图2所示,该飞行器可以根据RANSAC得到该作业边缘线1(即目标边缘线)的直线方程,其直线方程具体可以为:
L1:a1x+b1y+c1=0
其中,L1可以表示该作业边缘线1,x可以表示该作业边缘线1的横坐标,y可以表示该作业边缘线1的纵坐标,a1可以表示横坐标对应的坐标系数,b1可以表示纵坐标对应的坐标系数,c1可以为常数。
其中,该飞行器还可以根据RANSAC得到该作业边缘线2(即目标边缘线)的直线方程,其直线方程具体可以为:
L2:a2x+b2y+c2=0
其中,L2可以表示该作业边缘线2,x可以表示该作业边缘线2的横坐标,y可以表示该作业边缘线2的纵坐标,a2可以表示横坐标对应的坐标系数,b2可以表示纵坐标对应的坐标系数,c2可以为常数。
然后,可以由上述两个方程推测出中心线方程,其中心线方程具体可以是:
L3:(a1+a2)x+(b1+b2)y+(c1+c2)=0
由此,该飞行器便可以将该L3方程得到的中心线对应的正上方作为该飞行器的导航路径。
还需要说明的是,该导航路径,可以是该深度图拍摄的待作业区域的范围内的导航路径。或者,该导航路径,还可以是根据该深度图拍摄的待作业区域的范围内的导航路径,预测得到的未在该深度图范围内的该待作业区域的导航路径,例如,飞行器根据S301至S303步骤得到在该深度图拍摄的待作业区域的范围内的导航路径为图2所示的虚线线段的正上方,那么该飞行器可以将该虚线线段进行延伸,得到未在该深度图范围内的该待作业区域的导航路径。
S304、按照所述导航路径控制所述飞行器飞行,以对所述待作业区域执行任务作业。
需要说明的是,该飞行器可以按照该导航路径的指示,控制该飞行器进行飞行,以在该待作业区域上方,对该待作业区域执行任务作业。
其中,该任务作业例如可以是植物灌溉、农药喷洒、场景拍摄等等。
在一些可行的实施方式中,所述深度图包括所述飞行器与所述待作业区域的相对高度信息。
需要说明的是,该飞行器与该待作业区域的相对高度信息可以是该飞行器的海拔高度与该待作业区域中的作物(例如水稻、茶树等)的海拔高度之间的海拔高度差。
例如,该飞行器的海拔高度为2305米,该待作业区域中的作物的海拔高度为2300米,那么该飞行器与所述待作业区域的相对高度信息可以是:二者相对高度为5米。
在一些可行的实施方式中,所述方法还包括:根据所述相对高度信息调整所述飞行器的高度为目标相对高度,以使所述飞行器保持在所述目标相对高度上飞行。
需要说明的是,该目标相对高度例如可以是相对高度为3米,4米等,本发明对此不作限制。
在一些可行的实施方式中,该飞行器可以预先设置该目标相对高度,当检测到相对高度信息中表示的相对高度大于或小于该目标相对高度时,就可以调整该飞行器的高度,以保持该飞行器和该待作业区域保持在该目标相对高度上。
举例来说,该飞行器可以预先设置该目标相对高度为3米。然后,该飞行 器可以从获取到的深度图中,得到该相对高度信息,如果该相对高度信息表示当前的相对高度为5米,大于该目标相对高度2米,那么,该飞行器可以调整自身的飞行高度降低2米,以保持在该目标相对高度上飞行。
在一些可行的实施方式中,该飞行器也可以预设设置目标相对高度的范围,例如2米-4米内。然后,该飞行器如果从深度图中得到的该相对高度信息表示当前的相对高度为3米,在该目标相对高度范围内,那么,该飞行器可以不用调整自身的飞行高度,继续保持在该当前的飞行高度上飞行。
可见,在本发明实施例中,飞行器可以根据深度图拟合得到作业边缘线,并根据该作业边缘线以及该飞行器当前的位置信息,确定出目标边缘线,最后根据该目标边缘线规划得到导航路径,并按照该导航路径控制该飞行器飞行,可以实现飞行器自动规划导航路径,一定程度上提高了飞行器的自动化程度,使飞行器按照规划得到的导航路径执行任务作业,无需人为干预,提升了飞行器的任务作业效率。
请参阅图4,是本发明实施例提供的另一种路径规划方法的流程示意图。如图4所示的方法可包括:
S401、根据所述深度图拟合得到作业边缘线。
其中,所述作业边缘线用于标识所述待作业区域的边缘。
S402、根据所述飞行器当前的位置信息以及拟合得到的作业边缘线,确定出目标边缘线。
需要说明的是,本发明实施例中的S401及S402步骤的具体实现过程可参考前述方法实施例中的S301以及S302步骤,在此不作赘述。
S403、根据所述目标边缘线确定当前的第一导航路径。
其中,所述第一导航路径为在所述深度图的拍摄范围内的导航路径。
举例来说,如图2所示,该飞行器可以确定该作业边缘线1以及该作业边缘线2为该目标边缘线,并将该目标边缘线确定的用虚线表示的线段作为在该深度图的拍摄范围内的导航路径,也就是该第一导航路径。
可选的,所述从拟合得到的作业边缘线中选择的目标边缘线包括两条,所述根据所述目标边缘线确定当前的第一导航路径,包括:确定两条目标边缘线 间的中心线,并根据所述中心线确定当前的第一导航路径。
在一些可行的实施方式中,该飞行器可以从多条作业边缘线中,选择出两条作业边缘线作为该目标边缘线,例如图2中的作业边缘线1以及该作业边缘线2,然后,该飞行器可以确定该两条目标边缘线的中心线段为该飞行器当前的第一导航路径,如图2所示的虚线段就可以为该第一导航路径。
S404、根据所述当前的第一导航路径规划得到第二导航路径。
其中,所述第二导航路径为未在所述深度图的拍摄范围内的导航路径。
需要说明的是,该第二导航路径可以是该飞行器还未拍摄得到深度图的该待作业区域的导航路径。例如,飞行器在当前时间15:30分拍摄得到深度图a,可以预测该飞行器在15:35分还需要在该待作业区域上执行任务作业,但这时并没有拍摄得到15:35时的深度图,因此,该飞行器可以根据当前时间15:30分拍摄得到深度图a得到的第一导航路径,规划得到15:35时该飞行器可能会飞行的导航路径。
还需要说明的是,该第二导航路径可以根据该当前的第一导航路径直接得到。例如,如图2所示的虚线段为该飞行器针对该深度图得到的第一导航路径,该飞行器可以直接将该虚线段进行直线延伸,延伸得到的直线便可以作为该第二导航路径。
可选的,所述根据所述当前的第一导航路径规划得到第二导航路径,包括:
按照所述当前的第一导航路径控制所述飞行器飞行;根据飞行过程中所述摄像装置采集到的深度图对所述当前的第一导航路径进行修正;根据修正后的第一导航路径预测得到第二导航路径。
在一些可行的实施方式中,该飞行器可以按照该第一导航路径飞行,由于该第一导航路径具有一定的导航距离值,因此,无人机可以在飞行中途不断获取深度图,然后根据该深度图对该第一导航路径的还未飞行的部分进行修正,并根据当前修正后的第一导航路径预测下一段的导航路径,即第二导航路径。
例如,该飞行器在按照该第一导航路径飞行的过程中,由于该第一导航路径可以为一导航路段,该导航路段具有一定的导航距离值(例如为5米),该飞行器可以在飞行到1米的位置时,调用摄像装置获取到深度图b,该飞行器可以在飞行到2米的位置时,调用摄像装置获取到深度图c(也就是说,该深度图 b和深度图c是飞行器在按照第一导航路径飞行途中获取到的深度图)。这时,该飞行器可以根据该深度图b规划得到一个导航路径h,根据该深度图c规划得到一个导航路径f(该导航路径h和导航路径f可以按照规划第一导航路径的方式得到,且该导航路径h和导航路径f可以和第一导航路径具有交叉的部分。)进一步的,由于该深度图b和深度图c为相隔不远的位置拍摄得到的图像,因此两张深度图可以具有重叠部分,那么二者得到的导航路径也可以有重叠部分。飞行器可以将规划得到的两条导航路径的重合部分重叠在一起,然后将两条导航路径进行平滑处理后,对该第一导航路径的还未飞行的部分进行修正(例如对第一导航路径后3米的部分进行修正),得到修正后的第一导航路径。
在一个实施例中,该飞行器可以根据该修正后的第一导航路径的飞行趋势,或者还可以结合该深度图b和深度图c中的超出该第一导航路径范围的部分,预测出该第二导航路径。例如,该修正后的第一导航路径为直线段,那么,飞行器可以推测接下来在该待作业区域的飞行趋势也为沿直线飞行,因此,该飞行器可以预测该第二导航路径为直线路径。
可选的,所述根据所述当前的第一导航路径规划得到第二导航路径,包括:根据所述当前的第一导航路径和已记录的N个第一导航路径,预测得到第二导航路径;其中,N为大于等于1的正整数。
需要说明的是,该飞行器在根据该当前的第一导航路径规划得到第二导航路径时,还可以采用根据当前的第一导航路径和已记录的N个第一导航路径预测得到第二导航路径的方式。
在一些可行的实施方式中,该飞行器可以将该当前的第一导航路径和已记录的N个第一导航路径通过线段拟合的方式,并按照时间顺序整合到一起,然后根据各个导航路径的飞行趋势将各个第一导航路径进行平滑处理,并整合得到一平滑的导航轨迹,该导航轨迹可以用于表示该飞行器已飞行过的飞行轨迹。
在一个实施例中,该飞行器可以根据该导航轨迹展现的飞行趋势,预测出该第二导航路径。
可选的,所述根据所述当前的第一导航路径和已记录的N个第一导航路径,预测得到第二导航路径,包括:将所述当前的第一导航路径和所述已记录 的N个第一导航路径,映射到目标位置点所在的参考坐标系下;根据所述当前第一导航路径和已记录的N个第一导航路径在所述参考坐标系下的坐标位置,预测出第二导航路径。
在一些可行的实施方式中,该目标位置点可以是各个第一导航路径对应的深度图中,拍摄时间最早的深度图所对应的拍摄位置坐标。
或者,该目标位置点也可以是任意一个深度图所对应的拍摄位置坐标,本发明对此不作任何限制。
还需要说明的是,该参考坐标系可以是以该目标位置点为原点建立的坐标系。
在一些可行的实施方式中,该飞行器可以首先获取当前的第一导航路径和所述已记录的N个第一导航路径各自对应的深度图的拍摄时间,然后选择出拍摄时间最早的目标深度图,将该目标深度图对应的拍摄位置坐标作为该目标位置点,并以该目标位置点为原点建立的坐标系。
在一个实施例中,该飞行器还可以将当前的第一导航路径和已记录的N个第一导航路径,均映射到该参考坐标系下,并可以在该参考坐标系中,根据各个第一导航路径的飞行趋势,首先得到已飞行过的飞行轨迹,然后再根据该飞行轨迹预测出第二导航路径。
可选的,所述将所述当前的第一导航路径和所述已记录的N个第一导航路径,映射到目标位置点所在的参考坐标系下,包括:获取所述当前的第一导航路径上的第一导航位置点与目标位置点间的相对位置信息;获取所述已记录的N个第一导航路径上的第二导航位置点与目标位置点的相对位置信息;根据获取的相对位置信息将所述当前第一导航路径和所述已记录的N个第一导航路径,映射到所述目标位置点所在参考坐标系下。
其中,所述相对位置信息是指:导航位置点与所述目标位置点之间的相对位移和姿态旋转关系。
需要说明的是,该导航位置点可以是该第一导航路径所对应的深度图的拍摄位置坐标。
其中,所述第一导航位置点为所述当前的第一导航路径对应的深度图的拍摄位置点、所述第二导航位置点为所述已记录的N个第一导航路径各自对应的 深度图的拍摄位置点。
在一些可行的实施方式中,该第一导航位置点以及该二导航位置点,还可以是对应深度图的中心点所对应的位置坐标。
其中,所述姿态旋转关系是根据所述导航位置点对应的拍摄姿态以及所述目标位置点对应的拍摄姿态得到的。
需要说明的是,该拍摄姿态例如可以是摄像装置以90度俯视的角度进行拍摄,或者以45度的角度进行拍摄等等。
在一些可行的实施方式中,该拍摄姿态可以用姿态四元数表示。
其中,该相对位移可以是指各个导航位置点与该目标位置点之间的位移差,例如为1米、3米等等,本发明实施例对此不作任何限制。
在一些可行的实施方式中,该飞行器可以首先获取当前的第一导航路径上的导航位置点与目标位置点间的姿态旋转关系和相对位移,然后获取已记录的N个第一导航路径上的导航位置点与目标位置点间的姿态旋转关系和相对位移,最后根据获取到的姿态旋转关系和相对位移,得到各个导航位置点在该参考坐标系下的坐标位置,并可以将该当前第一导航路径和已记录的N个第一导航路径,映射到该参考坐标系下,并根据得到的坐标位置,预测出第二导航路径。
举例来说,以3张深度图(深度图1、深度图2、深度图3)为例,分别对应3条第一导航路径,深度图1对应第一导航路径1、深度图2对应第一导航路径2深度图3对应第一导航路径3。当前的第一导航路径为第一导航路径1,已记录的第一导航路径分别为第一导航路径2和第一导航路径3。
在一个实施例中,飞行器可以首先获取3张深度图的拍摄时刻以及拍摄坐标,然后可以选取拍摄时刻最早的拍摄坐标(例如为深度图3的拍摄坐标)作为该目标位置点,并求取剩余两个拍摄坐标各自距离该目标位置点的相对位移,假设目标位置点的位置坐标记为T0,深度图1的拍摄坐标记为T1,深度图2的拍摄坐标记为T2,那么,深度图1和深度图2的相对位移可以分别由以下公式表示:
t10=T1-T0,t20=T2-T0
在一个实施例中,该飞行器可以通过惯性测量单元获取三张深度图的 拍摄时刻的姿态四元数(可以表示拍摄姿态),深度图1的姿态四元数可以用Q1表示、深度图2的姿态四元数可以用Q2表示,深度图3的姿态四元数可以用Q0表示,那么深度图1、深度图2相对于目标位置点的姿态旋转关系可以分别用以下公式得到:
Figure PCTCN2017100034-appb-000001
那么,该深度图1、深度图2中的拍摄位置坐标P1、P2,投影到以目标位置点建立的参考坐标系中的位置坐标,可以分别为:
Figure PCTCN2017100034-appb-000002
在一个实施例中,该飞行器可以根据求取出的位置坐标,将各个位置坐标用平滑的曲线或直线进行拟合,便可以得到该飞行器从深度图1到深度图3的已飞行过的导航轨迹,并可以根据该导航轨迹的飞行趋势,预测出之后可能的导航路径。例如,该待作业区域为梯田,一段时间内的导航轨迹,可以接近一个半径较大的圆弧,通过该圆弧的趋势便可以实时纠正并调整接下来的导航路径。
S405、按照所述导航路径控制所述飞行器飞行,以对所述待作业区域执行任务作业。
需要说明的是,该导航路径可以包括该第一导航路径以及第二导航路径。飞行器可以按照该第一导航路径控制该飞行器在当前时间段内飞行,并预测出该第二导航路径,然后,该飞行器可以按照该第二导航路径控制该飞行器在下一段时间段内飞行。
可见,在本发明实施例中,飞行器可以根据深度图拟合得到作业边缘线,并根据该作业边缘线以及该飞行器当前的位置信息,确定出目标边缘线,然后根据该目标边缘线规划得到第一导航路径,并根据第一导航路径规划得到第二导航路径,然后按照导航路径控制该飞行器飞行,可以实现纠正并调整飞行器的导航路径,实现了飞行器自动规划导航路径,一定程度上提高了飞行器的自动化程度,使飞行器按照规划得到的导航路径执行任务作业,无需人为干预,提升了飞行器的任务作业效率。
请参阅图5,为本发明实施例提供的一种装置的第一实施例结构示意图。本实施例中所描述的装置,包括:
本发明实施例提供一种飞行器。请参阅图5,为本发明实施例提供的一种飞行器的结构示意图,本实施例中所描述的飞行器,包括:
存储器501和处理器502;
所述存储器501,用于存储程序指令;
所述处理器502,用于执行所述存储器存储的程序指令,当程序指令被执行时,用于:
根据所述深度图拟合得到作业边缘线,所述作业边缘线用于标识所述待作业区域的边缘;
根据所述飞行器当前的位置信息以及拟合得到的作业边缘线,确定出目标边缘线;
根据所述目标边缘线规划得到导航路径,其中,所述导航路径位于所述待作业区域的上方;
按照所述导航路径控制所述飞行器飞行,以对所述待作业区域执行任务作业。
在某些实施例中,所述处理器502根据所述深度图拟合得到作业边缘线时,具体用于:
根据所述深度图确定所述深度图中的深度跳变点;
根据所述深度跳变点进行线段拟合得到作业边缘线。
所述深度跳变点所对应的深度值与相邻的一个或者多个深度点所对应的深度值之间的差值达到预设阈值。
在某些实施例中,所述处理器502根据所述目标边缘线规划得到导航路径时,具体用于:
根据所述目标边缘线确定当前的第一导航路径,所述第一导航路径为在所述深度图的拍摄范围内的导航路径;
根据所述当前的第一导航路径规划得到第二导航路径,所述第二导航路径为未在所述深度图的拍摄范围内的导航路径。
在某些实施例中,从拟合得到的作业边缘线中选择的目标边缘线包括两 条,所述处理器502根据所述目标边缘线确定当前的第一导航路径时,具体用于:
确定两条目标边缘线间的中心线,并根据所述中心线确定当前的第一导航路径。
在某些实施例中,所述处理器502根据所述当前的第一导航路径规划得到第二导航路径时,具体用于:
按照所述当前的第一导航路径控制所述飞行器飞行;
根据飞行过程中所述摄像装置采集到的深度图对所述当前的第一导航路径进行修正;
根据修正后的第一导航路径预测得到第二导航路径。
在某些实施例中,所述处理器502根据所述当前的第一导航路径规划得到第二导航路径时,具体用于:
根据所述当前的第一导航路径和已记录的N个第一导航路径,预测得到第二导航路径;
其中,N为大于等于1的正整数。
在某些实施例中,所述处理器502根据所述当前的第一导航路径和已记录的N个第一导航路径,预测得到第二导航路径时,具体用于:
将所述当前的第一导航路径和所述已记录的N个第一导航路径,映射到目标位置点所在的参考坐标系下;
根据所述当前第一导航路径和已记录的N个第一导航路径在所述参考坐标系下的坐标位置,预测出第二导航路径。
在某些实施例中,所述处理器502将所述当前的第一导航路径和所述已记录的N个第一导航路径,映射到目标位置点所在的参考坐标系下时,具体用于:
获取所述当前的第一导航路径上的第一导航位置点与目标位置点间的相对位置信息;
获取所述已记录的N个第一导航路径上的第二导航位置点与目标位置点的相对位置信息;
根据获取的相对位置信息将所述当前第一导航路径和所述已记录的N个第一导航路径,映射到所述目标位置点所在参考坐标系下。
在某些实施例中,所述相对位置信息是指:导航位置点与所述目标位置点之间的相对位移和姿态旋转关系。
在某些实施例中,所述第一导航位置点为所述当前的第一导航路径对应的深度图的拍摄位置点、所述第二导航位置点为所述已记录的N个第一导航路径各自对应的深度图的拍摄位置点。
在某些实施例中,所述姿态旋转关系是根据所述导航位置点对应的拍摄姿态以及所述目标位置点对应的拍摄姿态得到的。
在某些实施例中,所述深度图包括所述飞行器与所述待作业区域的相对高度信息。
在某些实施例中,所述处理器502还用于根据所述相对高度信息调整所述飞行器的高度为目标相对高度,以使所述飞行器保持在所述目标相对高度上飞行。
In some embodiments, there are height differences within the to-be-worked area, and the area is arranged in curves.
In some embodiments, the to-be-worked area is any one or more of a tea garden and terraced fields.
An embodiment of the present invention provides a flight system. FIG. 6 is a schematic architectural diagram of the flight system provided by an embodiment of the present invention. As shown in FIG. 6, the flight system includes an aircraft 601 and at least one camera device 602.
The aircraft 601 is the aircraft disclosed in the above embodiments of the present invention; its principles and implementations are similar to those of the above embodiments and are not repeated here.
The camera device 602 may be arranged on the aircraft and is configured to shoot a depth map, below the aircraft, that includes the to-be-worked area.
Specifically, the flight system may be applied to devices such as unmanned aerial vehicles and remote-controlled aircraft. Taking an unmanned aerial vehicle as an example, the camera device 602 may be mounted on the main body of the unmanned aerial vehicle (i.e., the aircraft 601) through a gimbal or other carrying device. The camera device 602 is used for image or video shooting during the flight of the unmanned aerial vehicle, includes but is not limited to a multispectral imager, a hyperspectral imager, a visible-light camera, an infrared camera, and the like, and there may be one or more camera devices 602. The aircraft 601 may control the camera device 602 to shoot a depth map, fit work edge lines from the depth map, determine target edge lines from the aircraft's current position information and the fitted work edge lines, plan a navigation path from the target edge lines, and control the aircraft to fly according to the navigation path so as to perform the task operation on the to-be-worked area.
It should be noted that the aircraft 601 may be used to perform the path planning method shown in the foregoing method embodiments; for its specific implementation process, reference may be made to those method embodiments, which are not repeated here.
It should be noted that, for brevity of description, each of the foregoing method embodiments is expressed as a series of action combinations; however, those skilled in the art should know that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously. Furthermore, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
A person of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments can be completed by instructing relevant hardware through a program, and the program can be stored in a computer-readable storage medium, which may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
The path planning method, aircraft and flight system provided by the embodiments of the present invention have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea; meanwhile, for a person of ordinary skill in the art, there will be changes in the specific implementation and the scope of application according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (33)

  1. A path planning method, characterized in that it is applied to an aircraft, a camera device is arranged on the aircraft, and the camera device is configured to shoot a depth map, below the aircraft, that includes a to-be-worked area, the method comprising:
    fitting work edge lines from the depth map, the work edge lines being used to identify edges of the to-be-worked area;
    determining target edge lines from current position information of the aircraft and the fitted work edge lines;
    planning a navigation path from the target edge lines, wherein the navigation path is located above the to-be-worked area; and
    controlling the aircraft to fly according to the navigation path, so as to perform a task operation on the to-be-worked area.
  2. The method according to claim 1, characterized in that fitting the work edge lines from the depth map comprises:
    determining, from the depth map, depth jump points in the depth map; and
    performing line-segment fitting on the depth jump points to obtain the work edge lines.
  3. The method according to claim 2, characterized in that the difference between the depth value corresponding to a depth jump point and the depth value corresponding to one or more adjacent depth points reaches a preset threshold.
  4. The method according to any one of claims 1 to 3, characterized in that planning the navigation path from the target edge lines comprises:
    determining a current first navigation path from the target edge lines, the first navigation path being a navigation path within the shooting range of the depth map; and
    planning a second navigation path from the current first navigation path, the second navigation path being a navigation path not within the shooting range of the depth map.
  5. The method according to claim 4, characterized in that the target edge lines selected from the fitted work edge lines comprise two lines, and determining the current first navigation path from the target edge lines comprises:
    determining a center line between the two target edge lines, and determining the current first navigation path from the center line.
  6. The method according to claim 4 or 5, characterized in that planning the second navigation path from the current first navigation path comprises:
    controlling the aircraft to fly according to the current first navigation path;
    correcting the current first navigation path from the depth maps captured by the camera device during flight; and
    predicting the second navigation path from the corrected first navigation path.
  7. The method according to claim 4 or 5, characterized in that planning the second navigation path from the current first navigation path comprises:
    predicting the second navigation path from the current first navigation path and N recorded first navigation paths,
    where N is a positive integer greater than or equal to 1.
  8. The method according to claim 7, characterized in that predicting the second navigation path from the current first navigation path and the N recorded first navigation paths comprises:
    mapping the current first navigation path and the N recorded first navigation paths into a reference coordinate system in which a target position point is located; and
    predicting the second navigation path from the coordinate positions of the current first navigation path and the N recorded first navigation paths in the reference coordinate system.
  9. The method according to claim 8, characterized in that mapping the current first navigation path and the N recorded first navigation paths into the reference coordinate system in which the target position point is located comprises:
    acquiring relative position information between a first navigation position point on the current first navigation path and the target position point;
    acquiring relative position information between second navigation position points on the N recorded first navigation paths and the target position point; and
    mapping, according to the acquired relative position information, the current first navigation path and the N recorded first navigation paths into the reference coordinate system in which the target position point is located.
  10. The method according to claim 9, characterized in that the relative position information refers to: the relative displacement and the attitude rotation relationship between a navigation position point and the target position point.
  11. The method according to claim 9 or 10, characterized in that the first navigation position point is the shooting position point of the depth map corresponding to the current first navigation path, and the second navigation position points are the shooting position points of the depth maps respectively corresponding to the N recorded first navigation paths.
  12. The method according to claim 10 or 11, characterized in that the attitude rotation relationship is obtained from the shooting attitude corresponding to the navigation position point and the shooting attitude corresponding to the target position point.
  13. The method according to any one of claims 1 to 12, characterized in that the depth map includes relative height information between the aircraft and the to-be-worked area.
  14. The method according to claim 13, characterized in that the method further comprises:
    adjusting the height of the aircraft to a target relative height according to the relative height information, so that the aircraft keeps flying at the target relative height.
  15. The method according to claims 1 to 14, characterized in that there are height differences within the to-be-worked area, and the area is arranged in curves.
  16. The method according to claim 15, characterized in that the to-be-worked area is any one or more of a tea garden and terraced fields.
  17. An aircraft, characterized in that a camera device is arranged on the aircraft, the camera device is configured to shoot a depth map, below the aircraft, that includes a to-be-worked area, and the aircraft comprises: a memory and a processor;
    the memory is configured to store program instructions;
    the processor is configured to execute the program instructions stored in the memory, and when the program instructions are executed, the processor is configured to:
    fit work edge lines from the depth map, the work edge lines being used to identify edges of the to-be-worked area;
    determine target edge lines from current position information of the aircraft and the fitted work edge lines;
    plan a navigation path from the target edge lines, wherein the navigation path is located above the to-be-worked area; and
    control the aircraft to fly according to the navigation path, so as to perform a task operation on the to-be-worked area.
  18. The aircraft according to claim 17, characterized in that, when fitting the work edge lines from the depth map, the processor is specifically configured to:
    determine, from the depth map, depth jump points in the depth map; and
    perform line-segment fitting on the depth jump points to obtain the work edge lines.
  19. The aircraft according to claim 18, characterized in that the difference between the depth value corresponding to a depth jump point and the depth value corresponding to one or more adjacent depth points reaches a preset threshold.
  20. The aircraft according to any one of claims 17 to 19, characterized in that, when planning the navigation path from the target edge lines, the processor is specifically configured to:
    determine a current first navigation path from the target edge lines, the first navigation path being a navigation path within the shooting range of the depth map; and
    plan a second navigation path from the current first navigation path, the second navigation path being a navigation path not within the shooting range of the depth map.
  21. The aircraft according to claim 20, characterized in that the target edge lines selected by the processor from the fitted work edge lines comprise two lines, and, when determining the current first navigation path from the target edge lines, the processor is specifically configured to:
    determine a center line between the two target edge lines, and determine the current first navigation path from the center line.
  22. The aircraft according to claim 20 or 21, characterized in that, when planning the second navigation path from the current first navigation path, the processor is specifically configured to:
    control the aircraft to fly according to the current first navigation path;
    correct the current first navigation path from the depth maps captured by the camera device during flight; and
    predict the second navigation path from the corrected first navigation path.
  23. The aircraft according to claim 20 or 21, characterized in that, when planning the second navigation path from the current first navigation path, the processor is specifically configured to:
    predict the second navigation path from the current first navigation path and N recorded first navigation paths,
    where N is a positive integer greater than or equal to 1.
  24. The aircraft according to claim 23, characterized in that, when predicting the second navigation path from the current first navigation path and the N recorded first navigation paths, the processor is specifically configured to:
    map the current first navigation path and the N recorded first navigation paths into a reference coordinate system in which a target position point is located; and
    predict the second navigation path from the coordinate positions of the current first navigation path and the N recorded first navigation paths in the reference coordinate system.
  25. The aircraft according to claim 24, characterized in that, when mapping the current first navigation path and the N recorded first navigation paths into the reference coordinate system in which the target position point is located, the processor is specifically configured to:
    acquire relative position information between a first navigation position point on the current first navigation path and the target position point;
    acquire relative position information between second navigation position points on the N recorded first navigation paths and the target position point; and
    map, according to the acquired relative position information, the current first navigation path and the N recorded first navigation paths into the reference coordinate system in which the target position point is located.
  26. The aircraft according to claim 25, characterized in that the relative position information refers to: the relative displacement and the attitude rotation relationship between a navigation position point and the target position point.
  27. The aircraft according to claim 25 or 26, characterized in that the first navigation position point is the shooting position point of the depth map corresponding to the current first navigation path, and the second navigation position points are the shooting position points of the depth maps respectively corresponding to the N recorded first navigation paths.
  28. The aircraft according to claim 26 or 27, characterized in that the attitude rotation relationship is obtained from the shooting attitude corresponding to the navigation position point and the shooting attitude corresponding to the target position point.
  29. The aircraft according to any one of claims 17 to 28, characterized in that the depth map includes relative height information between the aircraft and the to-be-worked area.
  30. The aircraft according to claim 29, characterized in that the aircraft is further configured to adjust the height of the aircraft to a target relative height according to the relative height information, so that the aircraft keeps flying at the target relative height.
  31. The aircraft according to claims 17 to 30, characterized in that there are height differences within the to-be-worked area, and the area is arranged in curves.
  32. The aircraft according to claim 31, characterized in that the to-be-worked area is any one or more of a tea garden and terraced fields.
  33. A flight system, characterized by comprising:
    at least one camera device; and
    the aircraft according to any one of claims 1 to 16.