CN113791641A - Aircraft-based facility detection method and control equipment - Google Patents

Aircraft-based facility detection method and control equipment

Info

Publication number
CN113791641A
Authority
CN
China
Prior art keywords
detection
aircraft
image
flight
target facility
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111071109.0A
Other languages
Chinese (zh)
Inventor
李思晋
赵丛
封旭阳
张李亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Priority to CN202111071109.0A
Publication of CN113791641A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0094 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106 Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Abstract

The present disclosure provides a facility detection method and a control device that can automatically inspect certain facilities, saving labor cost and improving inspection efficiency. The facility detection method includes: acquiring an environment image including a target facility when an aircraft is located at a detection position for the target facility (S201, S301); determining an image area to which the target facility belongs from the environment image, and performing image segmentation on the image area to obtain a detection object related to the target facility (S202, S302); acquiring a flight rule about the detection object according to the image position of the detection object in the environment image and the detection position (S203, S303); and controlling the aircraft to fly according to the flight rule (S304) so as to complete the detection of the target facility (S204).

Description

Aircraft-based facility detection method and control equipment
Technical Field
The invention relates to the technical field of computer control, in particular to a facility detection method and control equipment based on an aircraft.
Background
Some facilities require users to periodically perform inspection and maintenance on the facilities so as to confirm the safety state of the facilities. For example, for facilities such as electric towers, bridges, and high buildings, regular inspection is required to ensure the safe and proper operation of the facilities.
However, for facilities at special locations, especially power towers, bridges, and the like at critical locations, periodic inspection becomes difficult. Moreover, such facilities usually exist in large numbers, so periodic inspection requires a great deal of manpower.
Disclosure of Invention
The embodiment of the invention provides a facility detection method and control equipment based on an aircraft, which are used for realizing inspection of target facilities by controlling the aircraft.
In one aspect, an embodiment of the present invention provides an aircraft-based facility detection method, including:
acquiring an environmental image including a target facility based on a sensor carried by an aircraft when the aircraft is located at a detection position for the target facility;
determining an image area to which the target facility belongs from the environment image, identifying one or more detection objects of the target facility in the image area, and determining the image positions of the one or more detection objects in the environment image; wherein the detection object is a part of the target facility;
acquiring a routing inspection strategy associated with the one or more detection objects;
generating a flight track meeting the inspection strategy according to the image position of the one or more detection objects in the environment image and the detection position;
and controlling the aircraft to fly according to the flight track so as to complete the detection of the target facility.
On the other hand, an embodiment of the present invention further provides a control device, including: a processor, a memory, and a data interface;
the data interface is used for interacting data with the aircraft;
the memory is used for storing program instructions, and the processor is configured for calling the program instructions to execute the aircraft-based facility detection method provided by the embodiment of the invention.
Drawings
FIG. 1 is a schematic flow chart of a method for detecting a target facility according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a method for aircraft-based facility detection in accordance with an embodiment of the present invention;
FIG. 3 is a schematic flow chart diagram of another aircraft-based facility detection method in accordance with an embodiment of the present invention;
FIG. 4 is a flow chart illustrating a method for determining flight rules in accordance with an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a facility detection apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a control device according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention collects images using vision technology, can detect and identify a target facility to be detected from a distance, and automatically flies to the vicinity of that target facility. By means of image segmentation and recognition, the parts of the target facility are segmented and distinguished to obtain one or more detection objects of the target facility, and each identified detection object is detected and recorded using a sensor carried on board (such as a camera or a thermal imager). After the detection and recording are finished, an aircraft such as an unmanned aerial vehicle automatically returns or, if the remaining battery level is sufficient, proceeds to the next target facility to be detected.
The embodiment of the invention provides an aircraft-based facility detection method, which can be executed by control equipment of an aircraft, and comprises the following steps:
acquiring an environmental image including a target facility based on a sensor carried by an aircraft when the aircraft is located at a detection position for the target facility; determining an image area to which the target facility belongs from the environment image, identifying one or more detection objects of the target facility in the image area, and determining the image positions of the one or more detection objects in the environment image; wherein the detection object is a part of the target facility; acquiring a routing inspection strategy associated with the one or more detection objects; generating a flight track meeting the inspection strategy according to the image position of the one or more detection objects in the environment image and the detection position; and controlling the aircraft to fly according to the flight track so as to complete the detection of the target facility.
In one embodiment, the detection objects comprise a plurality of detection objects, and each detection object is associated with a patrol inspection strategy; the manner of generating the flight trajectory meeting the inspection strategy by the control device according to the image position of the one or more detection objects in the environment image and the detection position may be:
acquiring a routing inspection strategy of each detection object; and generating a flight track according to the image position of each detection object in the environment image, the detection position and the inspection strategy of each detection object.
In an embodiment, the generated flight trajectory further satisfies a preset limiting parameter; the limiting parameters include one or more of the following: flight distance parameters, flight duration parameters, flight safety parameters and energy loss parameters.
In an embodiment, the control device may further generate detection parameters including sensing parameters for instructing the aircraft to patrol the detection object during the flight of the aircraft, the sensing parameters including: shooting parameters of a sensor for detecting a detection object.
In an embodiment, the control device may further obtain a detection image obtained by detection in a process of controlling the aircraft to fly according to the flight trajectory; updating the flight track according to the position of the detection object in the detection image and a routing inspection strategy set for the detection object; and controlling the aircraft to fly according to the updated flight track so as to complete the detection of the target facility.
In an embodiment, the manner of generating, by the control device, a flight trajectory that satisfies the inspection policy according to the image position of the one or more detection objects in the environment image and the detection position may be:
obtaining a residual energy value of the aircraft; and generating a flight track meeting the inspection strategy according to the residual energy value, the image position of the one or more detection objects in the environment image and the detection position.
In an embodiment, the way in which the control device identifies one or more detection objects of the target facility in the image area may be:
acquiring an object model preset for the target facility; and carrying out image segmentation on the image area according to the object model to obtain one or more detection objects with the similarity meeting the similarity condition with the object model.
In an embodiment, the control device may further configure one or more to-be-detected facility location points on the interactive interface displaying the map; determining facilities corresponding to the selected one or more facility position points as target facilities; and controlling the aircraft to fly towards the target facility according to the selected facility position point so as to fly to the detection position for the target facility.
In an embodiment, the control device may further control the aircraft to fly to the target facility according to a specified rule during the flight of the aircraft to the target facility, where the specified rule is used to instruct the aircraft to move to at least two target shooting positions; respectively acquiring depth maps of the aircraft in the traveling direction at the at least two shooting positions, wherein the depth maps are used for representing the scene depth obtained by shooting the target facility from different angles; and adjusting the flight track of the aircraft according to the acquired depth map so that the aircraft can avoid the obstacle aiming at the target facility.
In an embodiment, the control device may further receive position information returned by the aircraft, the position information including: distance information and direction information of the aircraft relative to a target object generated by the aircraft, or position coordinate information of the aircraft returned by the aircraft; displaying, on an interactive interface, a relative position between the aircraft and the target facility based on the received position information and the position of the target facility.
Fig. 1 is a schematic flow chart of a method for detecting a target facility according to an embodiment of the present invention. The facility detection method of embodiments of the present invention may be performed by a control device, which may be deployed on an aircraft. In the embodiment of the present invention, the method is described with an unmanned aerial vehicle as an aircraft, and the unmanned aerial vehicle has a sensor mounted thereon for detecting a target facility. The main steps performed by the control apparatus are as follows.
S101: the location of the target facility is determined. The position of the target facility can be roughly defined by using position information such as GPS (Global Positioning System) information, and the unmanned aerial vehicle is controlled to autonomously fly to the target facility so that the target facility appears within an observation range of the unmanned aerial vehicle, which mainly refers to a detection range of a sensor mounted on the unmanned aerial vehicle, such as a shooting range of a camera. The user may input location information of one or more target facilities in advance on a user interface configuring the control device, for example, in an interface on which a map is displayed, one or more location points may be specified on the interface including the map by touch-clicking, and the control device may record the location points as the location points of the target facilities. The control device can control the unmanned aerial vehicle to fly based on the position points of the target facilities when the unmanned aerial vehicle opens the patrol mode, so as to fly to the corresponding one or more target facilities, so as to monitor one target facility, or monitor a plurality of target facilities on a certain route, for example, a plurality of electric towers which are connected.
S102: and carrying out obstacle avoidance processing based on image segmentation identification and the depth map. In the process of flying to the target facility, active obstacle avoidance can be carried out on the basis of image segmentation identification and the acquired depth map, so that the target facility can be detected in a safe flying mode.
Various obstacles in the flight path need to be detected during the autonomous flight of the unmanned aerial vehicle. The first task of obstacle avoidance is to detect obstacles in the direction of flight. A depth map can be computed based on binocular vision detection so that obstacles on the flight path can be located. The depth map can be obtained by binocular matching, or computed based on structured-light or infrared equipment; structured-light or infrared devices can yield relatively higher-quality depth maps.
To further improve the accuracy of the depth map and avoid missed or false detections when the texture is not rich or the target object is too small, the embodiment of the invention can additionally use an image segmentation technique to determine obstacles and avoid them in flight. Because image segmentation does not rely on stereo matching, regions that are not rich in texture can still be recognized well. It can therefore be used together with the depth map: on one hand a better depth map can be obtained, and on the other hand each point in the depth map can be given a semantic meaning, which helps the unmanned aerial vehicle plan its route and determine the flight rule it follows.
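As an illustrative, non-limiting sketch (in Python) of the combination just described, the snippet below flags pixels that are both close according to the depth map and semantically solid according to the segmentation map. The class labels, clearance threshold, and corridor fraction are assumptions made for the example, not values taken from this disclosure.

```python
import numpy as np

# Assumed labels and thresholds for the example only.
OBSTACLE_CLASSES = [1, 2, 3]   # e.g. tower, power line, building
SAFE_DEPTH_M = 8.0             # minimum clearance in metres

def obstacle_mask(depth_map: np.ndarray, seg_map: np.ndarray) -> np.ndarray:
    """Pixels that are both closer than the clearance and semantically solid."""
    close = depth_map < SAFE_DEPTH_M
    solid = np.isin(seg_map, OBSTACLE_CLASSES)
    return close & solid

def obstacle_in_flight_corridor(mask: np.ndarray, corridor_frac: float = 0.3) -> bool:
    """True if any obstacle pixel falls inside a central corridor of the image,
    roughly the region the aircraft is about to fly through."""
    h, w = mask.shape
    ch, cw = int(h * corridor_frac), int(w * corridor_frac)
    top, left = (h - ch) // 2, (w - cw) // 2
    return bool(mask[top:top + ch, left:left + cw].any())
```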
In the flight process, the unmanned aerial vehicle continuously estimates and corrects its position relative to the target facility. In one embodiment, the drone may use a visual tracking algorithm to lock the target facility within the range observable in the image, estimating the relative distance to the target facility from the change in the target facility's size in the image and the current flight speed. The unmanned aerial vehicle can also roughly acquire the depth information of objects in the scene by flying along a specific trajectory, providing a reference for distance-based obstacle avoidance.
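A minimal sketch of that size-change distance estimate, under a pinhole-camera assumption; the function and parameter names are illustrative only.

```python
def estimate_distance_m(size_prev_px: float, size_now_px: float,
                        speed_mps: float, dt_s: float) -> float:
    """Pinhole model: apparent size is inversely proportional to distance, so
    size_prev / size_now = d_now / d_prev. Flying straight at the target gives
    d_prev - d_now = speed * dt. Solving both relations yields d_now."""
    ratio = size_prev_px / size_now_px       # = d_now / d_prev, < 1 while closing in
    if ratio >= 1.0:
        return float("inf")                  # not closing in; the estimate is undefined here
    travelled_m = speed_mps * dt_s           # = d_prev - d_now
    return travelled_m * ratio / (1.0 - ratio)
```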
In an embodiment, the inspection scene for the current detection may be determined by recognition on the captured image and classified according to the target facility being detected, for example an electric-tower inspection scene or a bridge inspection scene. The determined inspection scene can provide reference information for obstacle avoidance and flight route planning. For example, in an electric-tower inspection scene, power lines generally connect adjacent towers and the connection positions are relatively fixed, so the unmanned aerial vehicle can choose to bypass regions dense with power lines when setting its route.
For a specific inspection scene, the unmanned aerial vehicle can be equipped with a corresponding sensor to further improve the reliability of obstacle avoidance. For example, for equipment inspection in a power system, a thermal imager can be used to detect the presence of power lines, providing more robust obstacle avoidance when inspecting close to them.
S103: and after the target facility is detected to enter the observation range of the unmanned aerial vehicle, detecting the specific position of the target facility, and flying towards the target facility. The embodiment of the invention can detect the target facilities in the image range based on a visual detection method of artificial features. And the identification algorithm based on the deep neural network can learn more stable and reliable image characteristics from the data by learning mass image data of target facilities such as the power tower and the like, so that a more accurate identification result is obtained.
The detection algorithm runs on the images observed by the unmanned aerial vehicle and detects and locates the position of the target facility in the image. Once the target facility to be detected is found in the image, it is locked in the image and the unmanned aerial vehicle gradually flies toward it. In this process, a tracking algorithm may be used to lock the detected target facility and to determine the flight path of the drone using the detection result as a reference.
In order to locate the position of the target facility in the image, a plurality of feature points may be selected on the target facility. Segmentation and recognition over the whole image allow the unmanned aerial vehicle to select more stable image feature points according to the category of the target facility; such feature points should exist stably on the target facility at all times, not move, and be easy to detect. For example, feature points selected on an electric tower are more stable than those on a water surface, and based on these image feature points the accuracy of the SLAM (Simultaneous Localization and Mapping) computation for the unmanned aerial vehicle can be improved. The unmanned aerial vehicle can therefore adjust its attitude and route more flexibly. For example, to make the flight path safer, a local path may be selected such that the target facility to be detected does not appear within the field of view; after passing the obstacle, the target facility is locked in the image again according to the estimated relative position.
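A hedged illustration of selecting feature points by segmentation class, as described above: keep points that fall on static, rigid classes and discard those on unstable surfaces. The class id is an assumption for the example.

```python
import numpy as np

# Assumed label: 1 = electric tower; unstable surfaces (water, vegetation) are simply not listed.
STABLE_CLASSES = {1}

def filter_keypoints(keypoints, seg_map: np.ndarray):
    """keypoints: iterable of (u, v) pixel coordinates; seg_map: HxW label image.
    Returns only the keypoints that lie on a stable semantic class."""
    kept = []
    for u, v in keypoints:
        if int(seg_map[int(v), int(u)]) in STABLE_CLASSES:
            kept.append((u, v))
    return kept
```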
S104: identifying each component of the facility, and detecting and recording the components in a targeted manner. When the detection position of the target facility is reached, for example, a certain position in an area where the distance from the target facility is within a preset distance range, the detection object may be further identified from the target facility based on the image, for example, when the target facility is an electric tower, the detection object which needs to be detected this time may be identified as the entire tower head, or a component for fixing the power line.
The image segmentation algorithm provides pixel-level recognition and segmentation, giving class information for each pixel in the image. The class information is mainly used to determine the corresponding detection object and, further, the routing inspection strategy to be adopted to guide the flight of the unmanned aerial vehicle.
The target facility to be detected can be segmented from the image, and a local image area including only the target facility is determined. The local image area is then analyzed and recognized, the different parts of the target facility are identified at their different positions, and the key parts to be detected are obtained; these key parts are the detection objects, so the key parts of the target facility can be detected and recorded in a targeted manner. To provide more accurate information for the patrol, when the distance between the aircraft and the target facility is less than a preset distance threshold, an object model that performs segmentation and recognition for a specific part of the target facility can be used.
The object model can separate the target facility from the background and can also refine and identify the components of the target facility to obtain one or more detection objects. In the inspection process, each component of a target facility such as an electric tower can be identified, and the position of each component is marked in the image. The user can specify the patrol policy for each component in advance. In one embodiment, the patrol policies include, but are not limited to: surround shooting, near-to-far continuous video shooting, and fixed-point shooting using a high-precision camera. Similar to the image segmentation model used for remote recognition, the input of the object model is an image, and the output is a pixel-level recognition result indicating the specific category each pixel belongs to. The categories may include the background and the various components of the refined target facility; that is, each pixel may belong to the background category or to a certain detection-object category (for example, the tower head). The object model can be configured according to the target facility actually to be inspected; for example, for an electric tower, object models such as a tower head model and a tower foot model can be configured so as to identify the tower head and tower foot of the target facility in the image.
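For illustration only, the sketch below turns such a pixel-level recognition result (one class id per pixel) into per-component detection objects with their image positions. The class ids and component names are assumptions for the example, not the actual model's label map.

```python
import numpy as np

# Assumed class ids/names; a real object model would define its own label map.
CLASS_NAMES = {0: "background", 1: "tower_head", 2: "tower_body", 3: "tower_foot"}

def extract_detection_objects(seg_map: np.ndarray):
    """seg_map: HxW array of per-pixel class ids.
    Returns {component_name: (u_center, v_center, bbox)} for each visible component."""
    objects = {}
    for class_id, name in CLASS_NAMES.items():
        if class_id == 0:
            continue                                   # skip the background class
        ys, xs = np.nonzero(seg_map == class_id)
        if xs.size == 0:
            continue                                   # component not visible in this image
        bbox = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
        objects[name] = (float(xs.mean()), float(ys.mean()), bbox)
    return objects
```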
A specific routing inspection scheme is then generated according to the recognition result and the user's settings. Generating the inspection scheme includes generating trajectories, allocating time to each trajectory segment, and so on. The criteria considered when solving for the trajectory include: on the premise of inspecting all the detection objects, how to reduce flight time as far as possible; how to select a safer flight path, for example one that avoids power lines; how to keep the captured images stable and reliable; characteristics of the unmanned aerial vehicle, such as its maximum and minimum acceleration; and characteristics of the detection equipment, such as the FOV of the camera and the optimal operating distance of other detection sensors.
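A sketch, under assumed fields and weights, of how candidate trajectories might be scored against those criteria (coverage of all detection objects, flight time, obstacle clearance, and the aircraft's dynamic limits):

```python
from dataclasses import dataclass

@dataclass
class CandidateTrajectory:
    flight_time_s: float               # estimated time to fly the whole trajectory
    min_obstacle_clearance_m: float    # worst-case distance to any known obstacle
    max_acceleration_mps2: float       # peak acceleration the trajectory demands
    covers_all_objects: bool           # whether every detection object is inspected

def trajectory_cost(t: CandidateTrajectory, accel_limit_mps2: float = 4.0) -> float:
    if not t.covers_all_objects:
        return float("inf")            # inspecting all detection objects is a hard requirement
    if t.max_acceleration_mps2 > accel_limit_mps2:
        return float("inf")            # exceeds the aircraft's dynamic limits
    safety_penalty = 100.0 / max(t.min_obstacle_clearance_m, 0.1)
    return t.flight_time_s + safety_penalty

def pick_trajectory(candidates):
    """Select the lowest-cost candidate trajectory."""
    return min(candidates, key=trajectory_cost)
```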
After the trajectory is generated, the unmanned aerial vehicle executes the computed trajectory and inspects each part of the target facility to be detected (the detection objects). During this process, the unmanned aerial vehicle can also continuously update its observations and dynamically revise the trajectory in real time to ensure that the inspection is safe and effective.
S105: and after the detection is finished, automatically returning. And after the detection task is finished, the unmanned aerial vehicle can realize automatic return according to the recorded departure point information and/or the recorded flight data when the inspection task is started to be executed. In one embodiment, visual odometry (visual odometry) may be combined with GPS information to guide the drone to return. The visual odometer estimates the track of the unmanned aerial vehicle in the task execution by means of image feature matching. The visual odometer combined with the image segmentation algorithm can select better matching features so as to realize more accurate track estimation.
The optimal return route to the departure point can be calculated by analyzing the flight trajectory recorded while the task was executed. For example, trajectory segments that were exploratory, tentative, or repeated during the task may be bypassed; such segments may be the partial paths marked in the trajectory obtained by the visual odometry described above.
In one embodiment, the estimated flight path can be corrected with GPS coordinates for the return flight. Furthermore, once the unmanned aerial vehicle arrives near the departure point, a positioning sensor can be used to achieve a more accurate return.
In one embodiment, the drone may confirm the optimal return route based on the visual odometer generated from previous records, while turning on the obstacle avoidance function during the return.
In addition, when implementing flight obstacle avoidance, obstacles on the flight route, such as buildings and mountains already identified on the map, can be determined in combination with a known 2D/3D map and then bypassed when determining the flight path. In order to determine the detection objects from the target facility more quickly, marks identifying the detection objects may be placed on the target facility; based on these marks and the captured image, one or more detection objects can be quickly segmented and located within the image region where the target facility lies. The depth map may be acquired not only in a binocular-vision-based manner but also using a device such as a lidar.
When some facilities are inspected, particularly tall facilities or facilities in hard-to-reach regions, the embodiment of the invention can inspect one or more objects to be detected in the facilities based on image recognition and automatically controlled flight, thereby reducing the labor cost and safety risks of inspection and improving inspection efficiency.
Referring again to fig. 2, which is a schematic flow chart of the aircraft-based facility detection method according to an embodiment of the present invention. The method may be executed by a dedicated control device, which may be configured on an aircraft such as an unmanned aerial vehicle. The control device may also serve as a ground-side device that exchanges data wirelessly with an aircraft such as an unmanned aerial vehicle and thereby completes the inspection task for the target facility.
S201: when an aircraft is located at a detection position for a target facility, an environmental image including the target facility is acquired. The function of the detection position mainly lies in that: relevant processing of the target facility may be triggered to facilitate completion of the routing inspection task for the target facility.
The detection position may be a position point located within a position area, and the position area may be an area within a preset distance range from the target facility. The control device determines whether the aircraft has reached the detection position for the target facility based on the detected position of the aircraft (e.g., its GPS coordinates) or the position reported by the aircraft, together with the position of the target facility.
Whether the aircraft has reached the detection position can also be judged by the aircraft itself. In one embodiment, the aircraft can analyze a captured image that includes the target facility and estimate the distance between the aircraft and the target facility based on the preset actual size of the target facility, the size of the target facility in the image, and its position in the image. If the distance is within a preset distance range, the aircraft can be considered to be at the detection position for the target facility.
The detection position may also be a specific position at which the obtained image includes the target facility, or at which the target facility can be segmented from the image. For example, if the area occupied by the target facility in the image meets a preset condition (the number of pixels belonging to the target facility is greater than a preset threshold, or the area of the image region occupied by the target facility is greater than a preset threshold), the current position of the aircraft may be regarded as the detection position.
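A hedged sketch of the two checks just described for deciding whether the current position can serve as the detection position: an estimated distance derived from the target's known real size and its apparent size, and the fraction of the image the target occupies. The parameter names and thresholds are assumptions.

```python
def estimated_distance_m(real_height_m: float, pixel_height: float,
                         focal_length_px: float) -> float:
    """Pinhole camera model: distance = focal_length * real_height / apparent_height."""
    return focal_length_px * real_height_m / pixel_height

def at_detection_position(real_height_m: float, pixel_height: float, focal_length_px: float,
                          target_pixel_count: int, image_pixel_count: int,
                          max_distance_m: float = 50.0,
                          min_area_fraction: float = 0.05) -> bool:
    """Either criterion being met is treated as having reached the detection position."""
    close_enough = estimated_distance_m(real_height_m, pixel_height,
                                        focal_length_px) <= max_distance_m
    large_enough = target_pixel_count / image_pixel_count >= min_area_fraction
    return close_enough or large_enough
```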
An aircraft such as an unmanned aerial vehicle is provided with a shooting device such as a camera, which is triggered to collect environment images after the aircraft reaches the detection position. The environment image collected in S201 is mainly used to determine the detection objects and the flight rules used when detecting them.
In one embodiment, one or more facility location points to be detected can be configured on an interactive interface displaying a map; the facilities corresponding to the selected location points are determined as target facilities; and the aircraft is controlled to fly toward a target facility according to the selected location point, so as to reach the detection position for that target facility. If the user selects several target facilities on the interactive interface, the aircraft can be controlled to inspect them in sequence, from far to near or from near to far; alternatively, only one or some of the target facilities may be inspected depending on the remaining energy of the aircraft. If the control device is a smart terminal with a display, such as a smartphone or a tablet computer, the interactive interface containing the map may be displayed to the user directly. If the control device is mounted on the aircraft, it can, through its own wireless communication interface or through the one on the aircraft, send an instruction that triggers a monitoring terminal receiving the aircraft's data to display the interactive interface, and then receive the position of the target facility determined on that interface in order to control the flight of the aircraft.
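As a small illustrative sketch (data shapes assumed), the facility location points picked on the map can be visited in order of distance from the aircraft's current position, nearest first or farthest first:

```python
import math

def order_targets(current_xy, facility_points, nearest_first: bool = True):
    """facility_points: list of (x, y) map coordinates selected on the interactive interface.
    Returns the points sorted by distance from the aircraft's current position."""
    return sorted(facility_points,
                  key=lambda p: math.dist(current_xy, p),
                  reverse=not nearest_first)
```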
S202: and determining an image area to which the target facility belongs from the environment image, and performing image segmentation on the image area to obtain a detection object related to the target facility. Determining the image region to which the target facility belongs may also be performed based on image segmentation. The image segmentation can be performed based on the brightness and the color of the pixels in the environment image, so as to obtain the image area of the target facility in the environment image and each detection object of the target facility.
The detection object may be the entire target facility, or a part of the target facility, such as the tower head of an electric tower or a fixing part that fixes a power line. The detection object is specified in advance by the user. For example, if the user specifies that the entire electric tower needs to be inspected, then in S202, after the image area of the electric tower is obtained, the entire electric tower in that area may be used as the detection object. The user can also designate the tower head of the electric tower for inspection; after the image area of the electric tower is obtained, the tower head is segmented out and used as the detection object.
The image area is segmented mainly based on an object model preset for the target facility, and one or more detection objects are determined from the target facility within the image area. In an embodiment, performing image segmentation on the image area to obtain the detection object related to the target facility may specifically include: acquiring an object model preset for the target facility; and performing image segmentation on the image area according to the object model to obtain a detection object whose shape similarity with the object model meets a similarity condition. The object model is mainly used to identify a certain component of the target facility; for example, a preset object model for the tower head can assist in identifying the tower head of the electric tower. A plurality of object models at different angles can be preset for one detection object, so that the detection object can be accurately segmented and determined from images of the target facility acquired at different angles. Further, the object model is configured with a model identifier, and the routing inspection strategy associated with the corresponding detection object is obtained according to the model identifier. The model identifier may be a name; for example, the model identifier of the object model for the tower head may simply be "tower head". After a detection object is identified based on the object model, the identifier of the detection object corresponds to the object model and may be the same as the model identifier. Based on the identifier of the detection object, the routing inspection strategies related to that identifier can be determined from a preset mapping library. The routing inspection strategies mainly indicate the inspection rules for the detection object, including rules such as surrounding flight with surround shooting, far-to-near continuous video shooting, and fixed-point shooting with a high-precision camera. When performing S203 described below, a flight rule for the detection object may be further acquired based on the patrol rule.
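A minimal sketch of the mapping-library lookup described above, where the identifier of a recognized detection object selects its routing inspection strategy; the identifiers and strategy names here are illustrative assumptions.

```python
# Assumed identifiers and strategy names; a deployment would load these from its own library.
PATROL_POLICY_LIBRARY = {
    "tower_head": "surround_flight_and_shoot",   # surrounding flight with surround shooting
    "tower_body": "far_to_near_video",           # far-to-near continuous video shooting
    "power_line_fixture": "fixed_point_hd",      # fixed-point shooting with a high-precision camera
}

def patrol_policy_for(detection_object_id: str,
                      default: str = "far_to_near_video") -> str:
    """Look up the routing inspection strategy associated with a detection object's identifier."""
    return PATROL_POLICY_LIBRARY.get(detection_object_id, default)
```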
S203: acquiring a flight rule about the detection object according to the image position of the detection object in the environment image and the detection position. The image location may be a pixel location of the detection object in the image, and the orientation of the detection object relative to the aircraft may be determined based on the image location. With the detection position as a starting point, a flight rule that can detect the detection object from an upper direction, a lower direction, or the like, or a flight rule that flies around the detection object may be determined. The flight rule can be a flight trajectory, for example, a flight trajectory for controlling the unmanned aerial vehicle to separately fly, and based on the flight trajectory, the aim of aligning the unmanned aerial vehicle can be achieved
S204: and controlling the aircraft to fly according to the flight rules so as to complete the detection of the target facility. After the flight rule is determined, the aircraft is controlled to fly according to the flight rule, and then the inspection of the detection object can be completed.
During the process of flying to the target facility to reach the detection position before the step S201 or during the process of controlling the flight of the aircraft at the step S204, the position information returned by the aircraft may be received in real time or periodically, and the position information includes: distance information and direction information of the aircraft relative to a target object generated by the aircraft, or position coordinate information of the aircraft returned by the aircraft. And displaying the relative position between the aircraft and the target facility on the interactive interface in real time according to the received position information and the position of the target facility.
When some facilities are inspected, particularly tall facilities or facilities in hard-to-reach regions, the embodiment of the invention can inspect one or more objects to be detected in the facilities based on image recognition and automatically controlled flight, thereby reducing the labor cost and safety risks of inspection and improving inspection efficiency.
Referring to fig. 3, which is a schematic flow chart of another aircraft-based facility detection method according to an embodiment of the present invention. The method may be executed by a dedicated control device, which may be configured on an aircraft such as an unmanned aerial vehicle. The control device may also serve as a ground-side device that exchanges data wirelessly with an aircraft such as an unmanned aerial vehicle and thereby completes the inspection task for the target facility.
S301: when an aircraft is located at a detection position for a target facility, an environmental image including the target facility is acquired. The detection position is one of positions within a preset distance range from the target facility. And in the environment image, the environment image is acquired by sensors such as a camera carried on the aircraft.
S302: and determining an image area to which the target facility belongs from the environment image, and performing image segmentation on the image area to obtain a detection object related to the target facility. And determining the image area based on image segmentation technology segmentation, and analyzing and determining a detection object of the target facility based on a preset object model.
S303: acquiring a flight rule about the detection object according to the image position of the detection object in the environment image and the detection position. The image positions have the effect of: it is necessary to ensure that the detection object is always located in the image. The relative direction of the detection object can be determined according to the image position, and further, when the detection object is detected, if the detection object needing to be detected currently needs to be kept in the middle of the picture, the image position is considered when generating the flight rule comprising the flight track. When the detection object to be detected includes only one detection object, it is only necessary to generate a flight rule for the detection object, for example, to generate a flight trajectory around the detection object. If the detection object comprises a plurality of detection objects, a flight rule needs to be generated, and the plurality of detection objects can be detected in sequence based on the flight rule. For example, when the detection of the electric tower includes three detection objects, namely, a tower head, a tower body and a tower foot, a flight rule may be generated, where the flight rule includes, starting from a detection position, first detecting the tower head along a flight trajectory in the flight rule, then detecting the tower body along the flight trajectory, and finally detecting the tower foot along the trajectory, thereby completing the detection of the three detection objects, namely, the tower head, the tower body and the tower foot, on a plurality of flight trajectories.
The flight rule includes a flight trajectory, and S303 may specifically include: acquiring the routing inspection strategy associated with the detection object; and generating a flight trajectory meeting that strategy according to the image position of the detection object in the environment image and the detection position. When there is only one detection object, a flight trajectory can be generated directly on the basis of the routing inspection strategy corresponding to that object. For example, in a simple embodiment, the detection object is located in the middle of the image acquired by the aircraft, and when the inspection strategy is far-to-near continuous video shooting, a straight-line trajectory from the detection position to the position of the target facility may be generated, so that the aircraft can continuously shoot the detection object from far to near.
If the obtained detection objects include a plurality of detection objects, each detection object is associated with a patrol policy, and the step S303 may specifically include: acquiring a routing inspection strategy of each detection object; generating a flight rule according to the image position of each detection object in the environment image and the detection position; the flight rules comprise flight tracks meeting all inspection strategies or comprise multiple sections of flight tracks, and each section of flight track meets part of inspection strategies.
When the flight rules are generated, the generation of the flight rules is further restricted based on preset limiting conditions. The limiting conditions include: conditions set based on flight parameters and detection parameters, the flight parameters including: any one or more of a flight distance parameter, a flight duration parameter, a flight safety parameter, an energy loss parameter and a flight speed parameter.
In one embodiment, if the flight distance parameter is configured to a valid value such as 1, it indicates that when a flight trajectory satisfying one or more routing inspection strategies is generated, the total length of the trajectory is additionally required to be the shortest, so that the flight distance of the aircraft is minimized and energy consumption is saved. When the flight duration parameter is configured to a valid value such as 1, it indicates that the chosen trajectory should take the least time when flown at a preset speed, so as to improve inspection efficiency. When the flight safety parameter is configured to a valid value such as 1, safer trajectories are preferred and candidate trajectories that may contain obstacles are excluded; for example, when an electric tower is the target facility, trajectories that may pass through power lines are excluded to ensure flight safety. When the energy loss parameter is configured to 1 or the like, the trajectory with the lowest energy consumption is preferentially selected as the final flight trajectory.
S304: and controlling the aircraft to fly according to the flight rules. In one embodiment, further comprising: generating detection parameters, wherein the detection parameters are used for indicating sensing parameters of the aircraft for inspecting the detection object in the process of controlling the aircraft to fly, and the sensing parameters comprise: shooting angle parameters of a sensor used for detecting a detection object and shooting parameters of a camera used for shooting the detection object. For example, for an unmanned aerial vehicle mounting a camera through a pan-tilt, the detection parameters may specifically be parameters for controlling the pan-tilt angle, parameters for controlling the camera focal length, white balance, shutter, and the like.
S305: and acquiring a detection image obtained by detection in the process of controlling the aircraft to fly according to the flight rule. The process of controlling the aircraft to fly according to the flight rules is the process of inspecting the detection objects, the shot images or the videos generated according to the images can be immediately transmitted to an inspection user, and the inspection user determines whether one or more detection objects are normal or not by checking the images or the videos generated according to the images.
For the acquired image, the control device may further perform an analysis to determine the position of the currently detected object in the detection image. For example, when the tower head of the electric tower is currently detected, the position of the tower head as the detection object in the detection image is analyzed and determined. Similarly, the image region where the currently detected detection object is located may be determined according to an image segmentation technique, and further the pixel position where the detection object is located may be determined.
S306: and updating the flight rules according to the position of the detection object in the detection image and the routing inspection strategy set for the detection object. The flight rules are updated mainly to ensure that the detection object currently required to be detected can be detected, for example, to ensure that the position of the detection object is in the central region of the image of the detection image. In the embodiment of the present invention, the step S306 may be performed, and/or the above-mentioned sensing parameter may be updated and adjusted. Or the sensing parameters are adjusted first, if the preset detection requirement cannot be met, for example, it cannot be guaranteed that the position of the detected object is in the image center area of the detected image, the step S306 is executed to update the flight rules (which may further be combined with the update adjustment of the sensing parameters) to meet the preset detection requirement.
S307: and controlling the aircraft to fly according to the updated flying rules so as to complete the detection of the target facility. That is to say, the detection object is continuously detected to obtain a corresponding detection image or a video generated based on the image, and the detection image or the video is returned to the inspection user for viewing.
S308: after the target facility is detected, controlling the aircraft to return according to a preset return track; the preset return trajectory comprises: a recorded flight trajectory before the aircraft flies to the detection location.
When some facilities are inspected, particularly tall facilities or facilities in hard-to-reach regions, the embodiment of the invention can inspect one or more objects to be detected in the facilities based on image recognition and automatically controlled flight, thereby reducing the labor cost and safety risks of inspection and improving inspection efficiency. Moreover, automatic return and obstacle avoidance are provided, further meeting the demand for automated, intelligent inspection and improving inspection safety.
Referring to fig. 4, a flowchart of a method for determining flight rules according to an embodiment of the present invention is shown, where the method according to the embodiment of the present invention includes the following steps.
S401: acquiring an initial flight rule about a detection object according to the image position of the detection object in the environment image and the detection position. The initial flight rules include one or more flight trajectories, and in one embodiment, in addition to considering the image position of the detection object in the environment image and the detection position, reference is further made to the above-mentioned limiting parameters in generating the initial flight rules.
S402: obtaining a residual energy value of the aircraft. The residual energy value comprises data such as the residual electric quantity value of the unmanned aerial vehicle.
S403: and adjusting the initial flight rule according to the residual energy value, and taking the rule obtained after adjustment as the flight rule of the detection object. And determining the distance capable of supporting the aircraft to fly according to the residual energy value, and if the flight trajectory in the initial flight rule cannot be covered, selecting to execute a part of flight trajectories to obtain the flight rule required to be executed at this time. After the inspection of the detection object is performed by the execution of the partial flight trajectory, the initial flight rule is automatically recorded and the executed flight trajectory is recorded, so that the adjustment is performed on the initial flight rule again determined from the executed partial flight trajectory on the basis of the initial flight rule next time, and a new flight rule comprising the flight trajectory is generated.
In this way, the flight trajectory can be intelligently adjusted according to conditions of the unmanned aerial vehicle such as its battery level, further ensuring inspection safety.
The embodiment of the present invention further provides a computer storage medium, where program instructions are stored in the computer storage medium, and when the program instructions are executed, the corresponding method of the embodiment corresponding to fig. 1, fig. 2, fig. 3, or fig. 4 is implemented.
The facility detection apparatus and the control device of the embodiment of the invention are described below.
Fig. 5 is a schematic structural diagram of a facility detection apparatus according to an embodiment of the present invention. The facility detection apparatus may be installed in an aircraft capable of performing an inspection task, such as an unmanned aerial vehicle, and includes the following structure.
An obtaining module 501, configured to obtain an environment image including a target facility when an aircraft is located at a detection position for the target facility; a determining module 502, configured to determine an image region to which the target facility belongs from the environment image, and perform image segmentation on the image region to obtain a detection object related to the target facility; a processing module 503, configured to obtain a flight rule about a detection object according to an image position of the detection object in the environment image and the detection position; a control module 504, configured to control the aircraft to fly according to the flight rules, so as to complete detection of the target facility.
Further optionally, the flight rule includes a flight trajectory, and the processing module 503 is specifically configured to obtain a patrol policy associated with the detection object; and generating a flight track meeting the inspection strategy according to the image position of the detection object in the environment image and the detection position.
Further optionally, the obtained detection objects include a plurality of detection objects, each detection object is associated with an inspection policy, and the processing module 503 is specifically configured to obtain the inspection policy of each detection object; generating a flight rule according to the image position of each detection object in the environment image and the detection position; the flight rules comprise flight tracks meeting all inspection strategies or comprise multiple sections of flight tracks, and each section of flight track meets part of inspection strategies.
Further optionally, the generated flight rules also meet preset limiting parameters; the limiting parameters include: any one or more of a flight distance parameter, a flight duration parameter, a flight safety parameter, and an energy loss parameter.
Further optionally, the apparatus may further include: a generating module 505, configured to generate a detection parameter, where the detection parameter is a sensing parameter used for instructing the aircraft to inspect a detection object in a process of controlling the aircraft to fly, and the sensing parameter includes: shooting angle parameters of a sensor used for detecting a detection object and shooting parameters of a camera used for shooting the detection object.
Further optionally, the processing module 503 is further configured to obtain a detection image obtained by detection in a process of controlling the aircraft to fly according to the flight rule; updating the flight rules according to the position of the detection object in the detection image and the routing inspection strategy set for the detection object; and controlling the aircraft to fly according to the updated flying rules so as to complete the detection of the target facility.
Further optionally, the processing module 503 is specifically configured to obtain an initial flight rule about the detection object according to the image position of the detection object in the environment image and the detection position; obtaining a residual energy value of the aircraft; and adjusting the initial flight rule according to the residual energy value, and taking the rule obtained after adjustment as the flight rule of the detection object.
Further optionally, the determining module 502 is specifically configured to obtain an object model preset for the target facility; and carrying out image segmentation on the image area according to the object model to obtain a detection object with the shape similarity meeting the similarity condition with the object model.
Further optionally, the object model is configured with a model identifier, and the inspection policy associated with the detected object is obtained according to the model identifier.
Further optionally, the apparatus may further include: a setting module 506, configured to configure one or more to-be-detected facility location points on an interactive interface displaying a map; determining facilities corresponding to the selected one or more facility position points as target facilities; and controlling the aircraft to fly to the target facility according to the selected facility position point so as to fly to the detection position of the target facility.
Further optionally, the apparatus may further include: a receiving module 507, configured to receive location information returned by the aircraft, where the location information includes: distance information and direction information of the aircraft relative to a target object generated by the aircraft, or position coordinate information of the aircraft returned by the aircraft.
Further optionally, the control module 504 is further configured to control the aircraft to fly to the target facility according to a specified rule in a process that the aircraft flies to the target facility, where the specified rule is used to instruct the aircraft to fly to at least two shooting positions that can be shot at different angles for obtaining the depth map; acquiring a depth map of the aircraft in a traveling direction based on at least two shooting positions; and carrying out flight obstacle avoidance processing according to the acquired depth map.
Further optionally, the control module 504 is further configured to control the aircraft to return according to a preset return trajectory after the target facility is detected; the preset return trajectory comprises: a recorded flight trajectory before the aircraft flies to the detection location.
When inspecting certain facilities, particularly tall facilities or facilities in hard-to-reach areas, the embodiments of the present invention can inspect one or more objects to be detected in the facility based on image recognition and automatically controlled flight, thereby reducing the labor cost and safety risk of inspection and improving inspection efficiency. The aircraft can also return automatically and avoid obstacles, further meeting the demand for automated, intelligent inspection and improving the safety of inspection.
Referring again to fig. 6, which schematically illustrates the structure of the control device in an embodiment of the present invention, the control device includes a power supply circuit and may be powered by a standalone battery, or may be powered through a power supply interface by the battery of an aircraft such as an unmanned aerial vehicle. The control device may further include a processor 601, a data interface 602, and a memory 603.
The data interface 602 is mainly used for exchanging data with the aircraft; further, the data interface 602 may also exchange data with ground monitoring equipment that receives and displays data such as images captured during detection by the aircraft.
The memory 603 may include a volatile memory, such as a random-access memory (RAM); the memory 603 may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 603 may also include a combination of the above types of memory.
The processor 601 may be a central processing unit (CPU). The processor 601 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a generic array logic (GAL), or any combination thereof.
Optionally, the memory 603 is further configured to store program instructions. The processor 601 may invoke the program instructions to implement the facility detection methods shown in the embodiments corresponding to figs. 1, 2, 3, and 4 of the present application.
In one embodiment, the processor 601 is configured to: obtain an environment image including a target facility when the aircraft is located at a detection position for the target facility; determine the image area to which the target facility belongs from the environment image, and perform image segmentation on the image area to obtain a detection object related to the target facility; obtain a flight rule for the detection object according to the image position of the detection object in the environment image and the detection position; and generate a control instruction according to the flight rule and send the control instruction to the aircraft through the data interface 602 to control the aircraft to fly, so as to complete detection of the target facility.
Optionally, the flight rule includes a flight trajectory, and the processor 601 is configured to obtain the inspection policy associated with the detection object, and generate a flight trajectory that satisfies the inspection policy according to the image position of the detection object in the environment image and the detection position.
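As one possible concretisation (not the patent's own algorithm), the sketch below estimates the detection object's bearing from its pixel offset in the environment image, places the object at an assumed stand-off range from the detection position, and emits a circular orbit of waypoints around it when the inspection policy asks for an all-around view. The camera field of view, the assumed range, and the policy encoding are illustrative assumptions.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class OrbitPolicy:
    radius_m: float = 8.0        # stand-off radius around the object
    num_waypoints: int = 12      # how densely to sample the orbit

def estimate_object_position(detection_position: Tuple[float, float, float],
                             image_size: Tuple[int, int],
                             pixel_xy: Tuple[int, int],
                             horizontal_fov_deg: float = 80.0,
                             assumed_range_m: float = 15.0):
    """Place the detection object at an assumed range along the bearing
    implied by its pixel offset from the image centre (rough sketch)."""
    width, _ = image_size
    px, _ = pixel_xy
    bearing = math.radians((px - width / 2) / width * horizontal_fov_deg)
    x0, y0, z0 = detection_position
    return (x0 + assumed_range_m * math.sin(bearing),
            y0 + assumed_range_m * math.cos(bearing),
            z0)

def orbit_trajectory(object_position,
                     policy: OrbitPolicy) -> List[Tuple[float, float, float]]:
    """Waypoints on a horizontal circle around the estimated object position."""
    ox, oy, oz = object_position
    return [(ox + policy.radius_m * math.cos(2 * math.pi * i / policy.num_waypoints),
             oy + policy.radius_m * math.sin(2 * math.pi * i / policy.num_waypoints),
             oz)
            for i in range(policy.num_waypoints)]
```

Other inspection policies, such as a vertical scan along a tower or a single close-up pass, would replace the orbit generator while keeping the same bearing-estimation step.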
Optionally, the obtained detection objects include a plurality of detection objects, each associated with an inspection policy, and the processor 601 is configured to obtain the inspection policy of each detection object and generate the flight rule according to the image position of each detection object in the environment image and the detection position; the flight rule includes a flight trajectory that satisfies all of the inspection policies, or includes multiple trajectory segments, each of which satisfies some of the inspection policies.
Optionally, the generated flight rule further satisfies preset limiting parameters; the limiting parameters include any one or more of a flight distance parameter, a flight duration parameter, a flight safety parameter, and an energy loss parameter.
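These limiting parameters can be thought of as a constraint check applied to any candidate flight rule. The minimal sketch below uses illustrative units and a simple distance-proportional energy model; none of the numeric defaults come from the patent.

```python
from dataclasses import dataclass

@dataclass
class LimitingParameters:
    max_distance_m: float = 2000.0         # flight distance limit
    max_duration_s: float = 1200.0         # flight duration limit
    min_obstacle_clearance_m: float = 3.0  # simple flight-safety proxy
    max_energy_wh: float = 60.0            # allowed energy loss

def satisfies_limits(trajectory_length_m: float,
                     cruise_speed_mps: float,
                     min_clearance_m: float,
                     limits: LimitingParameters,
                     wh_per_metre: float = 0.02) -> bool:
    """Return True only if the candidate flight rule respects every
    configured limiting parameter (illustrative check, not the patent's)."""
    duration_s = trajectory_length_m / cruise_speed_mps
    energy_wh = trajectory_length_m * wh_per_metre
    return (trajectory_length_m <= limits.max_distance_m
            and duration_s <= limits.max_duration_s
            and min_clearance_m >= limits.min_obstacle_clearance_m
            and energy_wh <= limits.max_energy_wh)
```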
Optionally, the processor 601 is further configured to generate detection parameters and send them to the aircraft through the data interface 602, where the detection parameters are sensing parameters used to instruct the aircraft to inspect the detection object while the aircraft is controlled to fly; the sensing parameters include: a shooting angle parameter of a sensor used to detect the detection object, and shooting parameters of a camera used to photograph the detection object.
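For the shooting-angle part of these sensing parameters, one simple illustration is computing the gimbal yaw and pitch needed to point the camera at the detection object from the aircraft's current position; the local-frame convention and the function name below are assumptions.

```python
import math

def gimbal_angles_to_target(aircraft_xyz, target_xyz):
    """Yaw and pitch (degrees) that point a camera at the target, in a local
    frame where x is East, y is North, z is Up (assumed convention)."""
    dx = target_xyz[0] - aircraft_xyz[0]
    dy = target_xyz[1] - aircraft_xyz[1]
    dz = target_xyz[2] - aircraft_xyz[2]
    yaw_deg = math.degrees(math.atan2(dx, dy))                    # 0 deg = North
    pitch_deg = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # negative = look down
    return yaw_deg, pitch_deg
```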
Optionally, the processor 601 is further configured to obtain a detection image captured while the aircraft is controlled to fly according to the flight rule; update the flight rule according to the position of the detection object in the detection image and the inspection policy set for the detection object; and control the aircraft to fly according to the updated flight rule, so as to complete detection of the target facility.
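A very small sketch of this feedback step: compare the detection object's pixel position in the detection image with the image centre and produce a proportional lateral/vertical correction that nudges the next waypoint so the object stays in view. The gain and the image-frame convention are illustrative, not taken from the patent.

```python
def waypoint_correction(pixel_xy, image_size, metres_per_pixel: float = 0.01):
    """Proportional correction (right, up) in metres that re-centres the
    detection object; a positive x error means the object sits right of centre."""
    width, height = image_size
    err_x = pixel_xy[0] - width / 2
    err_y = pixel_xy[1] - height / 2
    return err_x * metres_per_pixel, -err_y * metres_per_pixel  # image y grows downward
```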
Optionally, the processor 601 is configured to obtain an initial flight rule for the detection object according to the image position of the detection object in the environment image and the detection position; obtain a residual energy value of the aircraft; and adjust the initial flight rule according to the residual energy value, taking the adjusted rule as the flight rule for the detection object.
Optionally, the processor 601 is configured to obtain an object model preset for the target facility, and perform image segmentation on the image area according to the object model to obtain a detection object whose shape similarity to the object model meets a similarity condition.
Optionally, the object model is configured with a model identifier, and the inspection policy associated with the detection object is obtained according to the model identifier.
Optionally, the processor 601 is further configured to set one or more location points of facilities to be detected on an interactive interface that displays a map; determine the facilities corresponding to the selected one or more facility location points as target facilities; and control the aircraft to fly toward the target facility according to the selected facility location point, so as to reach the detection position for the target facility.
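If the selected facility location point is expressed as latitude/longitude on the map (an assumption; the patent does not fix a coordinate system), flying toward it reduces to computing the great-circle distance and initial bearing from the aircraft's current coordinates, as in this standard haversine/bearing sketch. The subsequent approach to the exact detection position is omitted.

```python
import math

EARTH_RADIUS_M = 6371000.0

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    distance = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    bearing = math.degrees(math.atan2(
        math.sin(dlon) * math.cos(p2),
        math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return distance, (bearing + 360.0) % 360.0
```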
Optionally, the processor 601 is further configured to receive position information returned by the aircraft, where the position information includes: distance and direction information, generated by the aircraft, of the aircraft relative to a target object; or position coordinate information of the aircraft returned by the aircraft.
Optionally, the processor 601 is further configured to control the aircraft to fly to the target facility according to a specified rule while the aircraft is flying toward the target facility, where the specified rule instructs the aircraft to fly to at least two shooting positions from which images can be captured at different angles for obtaining a depth map; acquire a depth map in the traveling direction of the aircraft based on the at least two shooting positions; and perform flight obstacle avoidance according to the acquired depth map.
Optionally, the processor 601 is further configured to control the aircraft to return according to a preset return trajectory after detection of the target facility is completed; the preset return trajectory includes the flight trajectory recorded before the aircraft flew to the detection position.
When inspecting certain facilities, particularly tall facilities or facilities in hard-to-reach areas, the embodiments of the present invention can inspect one or more objects to be detected in the facility based on image recognition and automatically controlled flight, thereby reducing the labor cost and safety risk of inspection and improving inspection efficiency. The aircraft can also return automatically and avoid obstacles, further meeting the demand for automated, intelligent inspection and improving the safety of inspection.
The above disclosure is intended to be illustrative of only some embodiments of the invention, and is not intended to limit the scope of the invention.

Claims (11)

1. An aircraft-based facility detection method, comprising:
acquiring an environmental image including a target facility based on a sensor carried by an aircraft when the aircraft is located at a detection position for the target facility;
determining an image area to which the target facility belongs from the environment image, identifying one or more detection objects of the target facility in the image area, and determining the image positions of the one or more detection objects in the environment image; wherein each detection object is a part of the target facility;
acquiring an inspection policy associated with the one or more detection objects;
generating a flight track meeting the inspection policy according to the image positions of the one or more detection objects in the environment image and the detection position;
and controlling the aircraft to fly according to the flight track so as to complete the detection of the target facility.
2. The method of claim 1, wherein the one or more detection objects comprise a plurality of detection objects, each detection object being associated with an inspection policy;
wherein generating a flight track meeting the inspection policy according to the image positions of the one or more detection objects in the environment image and the detection position comprises:
acquiring the inspection policy of each detection object;
and generating a flight track according to the image position of each detection object in the environment image, the detection position, and the inspection policy of each detection object.
3. The method according to claim 1 or 2, wherein the generated flight track further satisfies preset limiting parameters; the limiting parameters include one or more of the following: a flight distance parameter, a flight duration parameter, a flight safety parameter, and an energy loss parameter.
4. The method of any one of claims 1-3, further comprising:
generating detection parameters including sensing parameters for instructing the aircraft to inspect the detection object during flight of the aircraft, the sensing parameters including: shooting parameters of a sensor for detecting the detection object.
5. The method of any one of claims 1-4, further comprising:
obtaining a detection image captured in the process of controlling the aircraft to fly according to the flight track;
updating the flight track according to the position of the detection object in the detection image and the inspection policy set for the detection object;
and controlling the aircraft to fly according to the updated flight track so as to complete the detection of the target facility.
6. The method according to any one of claims 1 to 5, wherein generating a flight track meeting the inspection policy according to the image positions of the one or more detection objects in the environment image and the detection position comprises:
obtaining a residual energy value of the aircraft;
and generating a flight track meeting the inspection policy according to the residual energy value, the image positions of the one or more detection objects in the environment image, and the detection position.
7. The method of any one of claims 1-6, wherein said identifying one or more detection objects of the target facility in the image region comprises:
acquiring an object model preset for the target facility;
and performing image segmentation on the image area according to the object model to obtain one or more detection objects whose similarity to the object model meets a similarity condition.
8. The method of any one of claims 1-7, further comprising:
configuring one or more position points of facilities to be detected on an interactive interface that displays a map;
determining facilities corresponding to the selected one or more facility position points as target facilities;
and controlling the aircraft to fly towards the target facility according to the selected facility position point so as to fly to the detection position for the target facility.
9. The method of claim 8, further comprising:
controlling the aircraft to fly to the target facility according to a specified rule in the process of the aircraft flying to the target facility, wherein the specified rule is used to instruct the aircraft to move to at least two target shooting positions;
respectively acquiring, at the at least two target shooting positions, depth maps in the traveling direction of the aircraft, wherein the depth maps are used to represent the scene depth obtained by photographing the target facility from different angles;
and adjusting the flight track of the aircraft according to the acquired depth maps, so that the aircraft avoids obstacles with respect to the target facility.
10. The method of claim 8, further comprising:
receiving position information returned by the aircraft, wherein the position information includes: distance and direction information, generated by the aircraft, of the aircraft relative to a target object; or position coordinate information of the aircraft returned by the aircraft;
displaying, on an interactive interface, a relative position between the aircraft and the target facility based on the received position information and the position of the target facility.
11. A control apparatus, characterized by comprising: a processor, a memory, and a data interface;
the data interface is used for interacting data with the aircraft;
the memory is configured to store program instructions, and the processor is configured to invoke the program instructions to perform the aircraft-based facility detection method of any one of claims 1-10.
CN202111071109.0A 2017-04-28 2017-04-28 Aircraft-based facility detection method and control equipment Pending CN113791641A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111071109.0A CN113791641A (en) 2017-04-28 2017-04-28 Aircraft-based facility detection method and control equipment

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/CN2017/082501 WO2018195955A1 (en) 2017-04-28 2017-04-28 Aircraft-based facility detection method and control device
CN201780004504.2A CN108496129B (en) 2017-04-28 2017-04-28 Aircraft-based facility detection method and control equipment
CN202111071109.0A CN113791641A (en) 2017-04-28 2017-04-28 Aircraft-based facility detection method and control equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201780004504.2A Division CN108496129B (en) 2017-04-28 2017-04-28 Aircraft-based facility detection method and control equipment

Publications (1)

Publication Number Publication Date
CN113791641A true CN113791641A (en) 2021-12-14

Family

ID=63344773

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201780004504.2A Active CN108496129B (en) 2017-04-28 2017-04-28 Aircraft-based facility detection method and control equipment
CN202111071109.0A Pending CN113791641A (en) 2017-04-28 2017-04-28 Aircraft-based facility detection method and control equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201780004504.2A Active CN108496129B (en) 2017-04-28 2017-04-28 Aircraft-based facility detection method and control equipment

Country Status (2)

Country Link
CN (2) CN108496129B (en)
WO (1) WO2018195955A1 (en)

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109358650B (en) * 2018-12-14 2022-11-18 国网冀北电力有限公司检修分公司 Routing inspection path planning method and device, unmanned aerial vehicle and computer readable storage medium
CN109471447A (en) * 2018-12-14 2019-03-15 国网冀北电力有限公司检修分公司 Navigation of Pilotless Aircraft method, apparatus, unmanned plane and data readable storage devices
CN110068332B (en) * 2019-02-21 2022-06-24 国网浙江平湖市供电有限公司 Transformer substation inspection path planning device and method based on wearable equipment
CN109885083A (en) * 2019-03-06 2019-06-14 国网陕西省电力公司检修公司 Transmission line of electricity fining inspection flying platform and method for inspecting based on laser radar
CN109885098B (en) * 2019-04-11 2022-02-11 株洲时代电子技术有限公司 Method for planning inspection route of bridge side fence
CN110307837B (en) * 2019-07-22 2023-04-18 湖南中图通无人机技术有限责任公司 Unmanned aerial vehicle navigation system and method based on image recognition
WO2021016880A1 (en) * 2019-07-30 2021-02-04 深圳市大疆创新科技有限公司 Flight simulation method and device for unmanned aerial vehicle, and recording medium
CN112262319A (en) * 2019-08-30 2021-01-22 深圳市大疆创新科技有限公司 Detection method of power line, millimeter wave radar, system, and storage medium
CN112119361A (en) * 2019-09-23 2020-12-22 深圳市大疆创新科技有限公司 Returning control method and device for movable platform and movable platform
CN111401146A (en) * 2020-02-26 2020-07-10 长江大学 Unmanned aerial vehicle power inspection method, device and storage medium
CN111582117A (en) * 2020-04-29 2020-08-25 长江大学 Unmanned aerial vehicle illegal building inspection method, equipment and storage medium
CN112823323A (en) * 2020-05-06 2021-05-18 深圳市大疆创新科技有限公司 Inspection method, unmanned aerial vehicle, ground control platform, system and storage medium
CN113741413B (en) * 2020-05-29 2022-11-08 广州极飞科技股份有限公司 Operation method of unmanned equipment, unmanned equipment and storage medium
CN112068591A (en) * 2020-08-25 2020-12-11 中国南方电网有限责任公司超高压输电公司天生桥局 Unmanned aerial vehicle for automatic inspection of power transmission line, control method and device and storage medium
CN112180955B (en) * 2020-08-26 2024-02-20 国网安徽省电力有限公司淮南供电公司 Visual feedback-based secondary review method and system for automatic inspection unmanned aerial vehicle
CN112229845A (en) * 2020-10-12 2021-01-15 国网河南省电力公司濮阳供电公司 Unmanned aerial vehicle high-precision winding tower intelligent inspection method based on visual navigation technology
CN112233270A (en) * 2020-10-30 2021-01-15 国家电网有限公司 Unmanned aerial vehicle is intelligence around tower system of patrolling and examining independently
CN112327920B (en) * 2020-11-16 2023-07-14 国网新疆电力有限公司检修公司 Unmanned aerial vehicle autonomous obstacle avoidance routing inspection path planning method and device
CN112788292B (en) * 2020-12-28 2023-05-26 深圳市朗驰欣创科技股份有限公司 Method and device for determining inspection observation points, inspection robot and storage medium
CN113014904A (en) * 2021-02-24 2021-06-22 苏州臻迪智能科技有限公司 Method, device and system for processing inspection image of unmanned aerial vehicle and storage medium
CN113112098A (en) * 2021-05-12 2021-07-13 上海野戈智能科技发展有限公司 Building defect detection method and device
CN113625730B (en) * 2021-06-30 2023-07-14 南京邮电大学 Four-rotor self-adaptive fault-tolerant control method based on ultra-torsion sliding mode
DE102021123124A1 (en) * 2021-09-07 2023-03-09 Spleenlab GmbH Method for controlling an unmanned aerial vehicle for an inspection flight to inspect an object and inspection unmanned aerial vehicle
CN113938609B (en) * 2021-11-04 2023-08-22 中国联合网络通信集团有限公司 Regional monitoring method, device and equipment
CN113872680B (en) * 2021-12-03 2022-04-26 特金智能科技(上海)有限公司 TDOA (time difference of arrival) auxiliary RID (Rich Internet protocol) signal receiving control method, device and system
CN114326794A (en) * 2021-12-13 2022-04-12 广东省建设工程质量安全检测总站有限公司 Curtain wall defect identification method, control terminal, server and readable storage medium
CN115052133B (en) * 2022-07-06 2023-09-12 国网江苏省电力有限公司南通市通州区供电分公司 Unmanned aerial vehicle-based power distribution rack acceptance method
CN115275870B (en) * 2022-09-28 2022-12-06 合肥优晟电力科技有限公司 Inspection system based on high-altitude line maintenance
CN116225062B (en) * 2023-03-14 2024-01-16 广州天勤数字科技有限公司 Unmanned aerial vehicle navigation method applied to bridge inspection and unmanned aerial vehicle
CN116878518B (en) * 2023-09-06 2023-11-21 滨州市华亿电器设备有限公司 Unmanned aerial vehicle inspection path planning method for urban power transmission line maintenance
CN116909318B (en) * 2023-09-14 2023-11-24 众芯汉创(江苏)科技有限公司 Unmanned aerial vehicle autonomous routing inspection route planning system based on high-precision three-dimensional point cloud

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101477169A (en) * 2009-01-16 2009-07-08 华北电力大学 Electric power circuit detection method by polling flying robot
CN103196430A (en) * 2013-04-27 2013-07-10 清华大学 Mapping navigation method and system based on flight path and visual information of unmanned aerial vehicle
CN104049641A (en) * 2014-05-29 2014-09-17 深圳市大疆创新科技有限公司 Automatic landing method and device and air vehicle
CN105551032A (en) * 2015-12-09 2016-05-04 国网山东省电力公司电力科学研究院 Pole image collection system and method based on visual servo
CN106354156A (en) * 2016-09-29 2017-01-25 腾讯科技(深圳)有限公司 Method and device for tracking target object, and air vehicle
CN106504362A (en) * 2016-10-18 2017-03-15 国网湖北省电力公司检修公司 Power transmission and transformation system method for inspecting based on unmanned plane

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102589524B (en) * 2011-01-13 2014-01-08 国家电网公司 Power line patrolling method
KR20130127822A (en) * 2012-05-15 2013-11-25 한국전자통신연구원 Apparatus and method of processing heterogeneous sensor fusion for classifying and positioning object on road
CN102941920A (en) * 2012-12-05 2013-02-27 南京理工大学 High-tension transmission line inspection robot based on multi-rotor aircraft and method using robot
US9459889B2 (en) * 2014-05-19 2016-10-04 Qualcomm Incorporated Systems and methods for context-aware application control
CN104035446B (en) * 2014-05-30 2017-08-25 深圳市大疆创新科技有限公司 The course generation method and system of unmanned plane
CN104298248B (en) * 2014-10-08 2018-02-13 南京航空航天大学 Rotor wing unmanned aerial vehicle accurate vision positioning and orienting method
CN106468918B (en) * 2015-08-18 2020-03-20 航天图景(北京)科技有限公司 Standardized data acquisition method and system for line inspection
CN105023014B (en) * 2015-08-21 2018-11-23 马鞍山市安工大工业技术研究院有限公司 A kind of shaft tower target extraction method in unmanned plane inspection transmission line of electricity image
CN205920057U (en) * 2015-09-29 2017-02-01 柳州欧维姆机械股份有限公司 Detect fissured many rotor unmanned aerial vehicle testing platform system in structure surface
CN105787447A (en) * 2016-02-26 2016-07-20 深圳市道通智能航空技术有限公司 Method and system of unmanned plane omnibearing obstacle avoidance based on binocular vision
CN105678289A (en) * 2016-03-07 2016-06-15 谭圆圆 Control method and device of unmanned aerial vehicle
CN105955308B (en) * 2016-05-20 2018-06-29 腾讯科技(深圳)有限公司 The control method and device of a kind of aircraft
CN106127788B (en) * 2016-07-04 2019-10-25 触景无限科技(北京)有限公司 A kind of vision barrier-avoiding method and device
CN106054924B (en) * 2016-07-06 2019-08-30 北京大为远达科技发展有限公司 A kind of unmanned plane accompanying flying method, accompanying flying device and accompanying flying system
CN106054931B (en) * 2016-07-29 2019-11-05 北方工业大学 A kind of unmanned plane fixed point flight control system of view-based access control model positioning
CN106228862A (en) * 2016-09-28 2016-12-14 国家电网公司 Emulation training method patrolled and examined by a kind of power transmission line unmanned machine
CN106595631B (en) * 2016-10-25 2019-08-23 纳恩博(北京)科技有限公司 A kind of method and electronic equipment of avoiding barrier
CN106477038B (en) * 2016-12-20 2018-12-25 北京小米移动软件有限公司 Image capturing method and device, unmanned plane

Also Published As

Publication number Publication date
WO2018195955A1 (en) 2018-11-01
CN108496129B (en) 2021-10-01
CN108496129A (en) 2018-09-04

Similar Documents

Publication Publication Date Title
CN108496129B (en) Aircraft-based facility detection method and control equipment
CN109765930B (en) Unmanned aerial vehicle vision navigation
CN106873627B (en) Multi-rotor unmanned aerial vehicle and method for automatically inspecting power transmission line
JP7274674B1 (en) Performing 3D reconstruction with unmanned aerial vehicle
CN111512256B (en) Automated and adaptive three-dimensional robotic site survey
CN108508916B (en) Control method, device and equipment for unmanned aerial vehicle formation and storage medium
CN102707724B (en) Visual localization and obstacle avoidance method and system for unmanned plane
CN111958592B (en) Image semantic analysis system and method for transformer substation inspection robot
CN103941746A (en) System and method for processing unmanned aerial vehicle polling image
CN112379681A (en) Unmanned aerial vehicle obstacle avoidance flight method and device and unmanned aerial vehicle
CN111244822B (en) Fixed-wing unmanned aerial vehicle line patrol method, system and device in complex geographic environment
CN114115289A (en) Autonomous unmanned cluster reconnaissance system
CN112542800A (en) Method and system for identifying transmission line fault
CN112380933A (en) Method and device for identifying target by unmanned aerial vehicle and unmanned aerial vehicle
CN115019216B (en) Real-time ground object detection and positioning counting method, system and computer
US20220221857A1 (en) Information processing apparatus, information processing method, program, and information processing system
WO2022004333A1 (en) Information processing device, information processing system, information processing method, and program
JP7437930B2 (en) Mobile objects and imaging systems
CN113759944A (en) Automatic inspection method, system and equipment based on designated altitude flight
CN113126649A (en) Control system for intelligent patrol inspection unmanned aerial vehicle of power transmission line
Liu et al. Visualization of Power Corridor Based on UAV Line Inspection Data
CN109144098A (en) A kind of unmanned plane stair automatic detecting method
CN114740878B (en) Unmanned aerial vehicle flight obstacle detection method based on computer image recognition
CN116168370B (en) Automatic driving data identification method and system
CN117873158A (en) Unmanned aerial vehicle routing inspection complex route optimization method based on live-action three-dimensional model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination