CN116506471A - Unmanned aerial vehicle inspection system - Google Patents


Info

Publication number
CN116506471A
CN116506471A
Authority
CN
China
Prior art keywords
target
inspection
unmanned aerial
aerial vehicle
patrol
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310756054.XA
Other languages
Chinese (zh)
Other versions
CN116506471B (en)
Inventor
王继军
黄国胜
荣正官
周明
张平
司福强
伍平
伏松平
詹秀峰
罗颖欣
赵灵燕
解智
韩超
王超
刘玖林
车颜泽
马浩
付先武
杨晓燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Railway Construction Electrification Bureau Group Co Ltd
Science and Technology Co Ltd of China Railway Construction Electrification Bureau Group Co Ltd
Beijing China Railway Construction Electrification Design and Research Institute Co Ltd
Original Assignee
China Railway Construction Electrification Bureau Group Co Ltd
Science and Technology Co Ltd of China Railway Construction Electrification Bureau Group Co Ltd
Beijing China Railway Construction Electrification Design and Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Railway Construction Electrification Bureau Group Co Ltd, Science and Technology Co Ltd of China Railway Construction Electrification Bureau Group Co Ltd, Beijing China Railway Construction Electrification Design and Research Institute Co Ltd filed Critical China Railway Construction Electrification Bureau Group Co Ltd
Priority to CN202310756054.XA priority Critical patent/CN116506471B/en
Publication of CN116506471A publication Critical patent/CN116506471A/en
Application granted granted Critical
Publication of CN116506471B publication Critical patent/CN116506471B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/94Hardware or software architectures specially adapted for image or video understanding
    • G06V10/95Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C1/00Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
    • G07C1/20Checking timed patrols, e.g. of watchman
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Signal Processing (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Selective Calling Equipment (AREA)
  • Train Traffic Observation, Control, And Security (AREA)

Abstract

The invention relates to an unmanned aerial vehicle inspection system. The unmanned aerial vehicle inspection system includes an unmanned aerial vehicle, a control terminal and a server. The control terminal is configured to send a patrol task to the unmanned aerial vehicle; the unmanned aerial vehicle is configured to collect first patrol data after receiving the patrol task and send the first patrol data to the server; the server is configured to receive the first patrol data, analyze the first patrol data to obtain a first analysis result indicating whether floatable objects exist along the railway, and send the first analysis result to the control terminal; and the control terminal is configured to generate the target inspection task according to the type of the floatable object and/or the patrol data corresponding to the floatable object when the first analysis result indicates that a floatable object exists.

Description

Unmanned aerial vehicle inspection system
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle inspection system.
Background
In general, there are many construction units and residents along railways, which readily give rise to floatable objects such as dust screens, sunshade nets, advertising banners, vegetable greenhouses and plastic bags. These objects pose a great hidden danger to railway traffic safety, which makes daily inspection work along railways particularly important.
Conventionally, daily inspection along railways, such as regular inspection, irregular inspection and emergency inspection, is performed by dedicated inspection personnel, who remove floatable objects found along the railway during inspection, thereby eliminating the hidden danger that such objects pose to railway traffic safety.
However, with the rapid development of railways, the number of railway lines keeps growing and the daily inspection workload grows with it, occupying a large amount of human resources and driving up labor costs. Meanwhile, inspection personnel are prone to blind spots during daily inspection, so some floatable objects cannot be removed in time.
Disclosure of Invention
In order to solve the technical problems, the invention provides an unmanned aerial vehicle inspection system.
The embodiment of the invention provides an unmanned aerial vehicle inspection system, which comprises an unmanned aerial vehicle, a control terminal and a server;
the control terminal is used for sending a patrol task to the unmanned aerial vehicle, wherein the patrol task includes a patrol route for inspecting the railway line and a patrol control instruction;
the unmanned aerial vehicle is used for collecting first patrol data based on a patrol route and a patrol control instruction after receiving a patrol task and sending the first patrol data to the server;
the server is used for receiving the first patrol data, analyzing the first patrol data to obtain a first analysis result indicating whether floatable objects exist along the railway, and sending the first analysis result to the control terminal, wherein, when floatable objects exist along the railway, the first analysis result further includes the type of the floatable object and/or the patrol data corresponding to the floatable object;
and the control terminal is used for generating a target inspection task according to the type of the floatable object and/or the patrol data corresponding to the floatable object when the first analysis result indicates that a floatable object exists, wherein the types of floatable objects are target floatable objects and suspected floatable objects.
In some embodiments of the invention, the server includes an image processing module;
the image processing module is used for performing frame-division processing on the first inspection data to obtain at least one piece of target image data, and then analyzing the at least one piece of target image data to obtain the first analysis result.
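As a non-authoritative sketch of the frame-division step, the following Python snippet samples target images from the collected patrol data at a fixed frame interval; the function name, the interval, and the assumption that the data arrives as an iterable of decoded frames are illustrative, not taken from the patent.

```python
def split_into_frames(patrol_stream, frame_interval=30):
    """Sample one target image every `frame_interval` frames of patrol video.

    `patrol_stream` is any iterable of decoded frames; with 30 fps footage,
    the default keeps roughly one target image per second of flight.
    """
    return [frame for idx, frame in enumerate(patrol_stream)
            if idx % frame_interval == 0]
```

In practice the interval would be tuned to the patrol speed so that consecutive target images still overlap along the line.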
In some embodiments of the present invention, the image processing module includes a first computing unit, a second computing unit, a filtering processing unit, and an image recognition unit;
the first calculation unit is used for extracting pixel coordinates corresponding to at least one piece of target image data respectively, and calculating the connected domain area corresponding to the at least one piece of target image data respectively based on the pixel coordinates;
The second calculation unit is used for comparing the area of the connected domain with a preset area threshold value and determining a target connected domain with the area of the connected domain being larger than or equal to the preset area threshold value;
the filtering processing unit is used for performing filtering processing on the target image data based on the target connected domain;
the image recognition unit is used for carrying out image recognition on the target image data after the filtering processing to obtain a recognition result and obtaining a first analysis result based on the recognition result.
In some embodiments of the invention, the server includes a pending database, a floatable object database, and an interfering object database;
the image recognition unit is specifically configured to match, after the recognition result is obtained, the target object corresponding to the recognition result against the objects in the pending database, the floatable object database, and the interfering object database, respectively, to obtain the first analysis result.
In some embodiments of the present invention, the control terminal includes a hidden trouble follow-up module and a patrol task generating module;
the hidden danger follow-up module is used for obtaining target features of the target floatable object when the first analysis result indicates that a floatable object exists and the type of the floatable object is a target floatable object, and for determining a hidden danger elimination scheme based on the target features, wherein the target features include at least one of the region feature of the target floatable object, the type of the target floatable object, the size of the target floatable object, and the color of the target floatable object;
the inspection task generating module is used for determining a first target area corresponding to the floatable object when the first analysis result indicates that a floatable object exists and the type of the floatable object is a target floatable object, generating a first target control instruction for the first target area, and generating a first target inspection task based on the first target area and the first target control instruction, so that the unmanned aerial vehicle executes the first target inspection task;
the inspection task generating module is further configured to determine a second target area corresponding to the suspected floatable object when the first analysis result indicates that a floatable object exists and the type of the floatable object is a suspected floatable object, generate a second target control instruction for the second target area, and generate a second target inspection task based on the second target area and the second target control instruction, so that the unmanned aerial vehicle executes the second target inspection task, wherein the second target control instruction includes at least one of a hover instruction, a steering control instruction, and a descent instruction;
the unmanned aerial vehicle is further used for executing the first target inspection task to obtain second patrol data and sending the second patrol data to the server, and/or executing the second target inspection task to obtain third patrol data and sending the third patrol data to the server;
the server is further configured to receive the second patrol data and analyze the second patrol data to obtain a second analysis result, and/or receive the third patrol data and analyze the third patrol data to obtain a third analysis result.
In some embodiments of the present invention, the hidden danger follow-up module includes a hidden danger labeling unit;
the hidden danger labeling unit is used for generating a first labeling result based on the first analysis result, wherein the first labeling result includes the BeiDou coordinates of the target floatable object, the coordinates of the target floatable object relative to the railway, the target features of the target floatable object, and the discovery time of the target floatable object; generating a second labeling result based on the second analysis result; and generating a third labeling result based on the third analysis result.
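A minimal sketch of what one labeling result might hold, as a Python dataclass. The field names and types are assumptions made for illustration; the fields themselves (BeiDou coordinates, coordinates relative to the railway, target features, discovery time) come from the description above.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class LabelResult:
    """One labeling result for a target floatable object (illustrative)."""
    beidou_coord: tuple        # (longitude, latitude) from BeiDou positioning
    track_offset: tuple        # coordinates of the object relative to the railway
    object_type: str           # e.g. "dust screen", "plastic bag"
    size_m2: float             # estimated size of the target floatable object
    color: str                 # color of the target floatable object
    discovered_at: datetime = field(default_factory=datetime.now)
```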
In some embodiments of the present invention, the control terminal further includes a tag statistics module;
the tag statistics module is used for accumulating the labeling results generated by the hidden danger labeling unit over multiple inspection tasks within a preset time period to generate a first statistical result;
the tag statistics module is further used for updating the statistics of the first labeling result according to the second labeling result and/or the third labeling result to generate a second statistical result.
In some embodiments of the present invention, when the third analysis result determines that the suspected floatable object is a target floatable object or an interfering object, the server is further configured to generate a tag corresponding to the suspected floatable object, and to store the tag and the inspection data corresponding to the suspected floatable object in the floatable object database or the interfering object database of the server.
In some embodiments of the present invention, the control terminal includes a display module;
the display module is used for generating a corresponding planar map based on the first inspection data, displaying the planar map in real time, and displaying the position of the unmanned aerial vehicle on the planar map in real time;
the display module is further used for displaying, when the first analysis result indicates that a floatable object exists, the hidden danger position of the floatable object and the type of the floatable object on the planar map.
In some embodiments of the invention, the unmanned aerial vehicle comprises a task judgment module;
the task judging module is used for determining the priority of each patrol task when multiple patrol tasks exist, and for executing the patrol tasks in order of priority.
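One plausible way to realize priority-based execution is a priority queue; the sketch below assumes each pending task is tagged with a numeric priority (lower number = more urgent), which is an illustrative convention, not something the patent specifies.

```python
import heapq

def execute_by_priority(pending_tasks):
    """Return patrol task identifiers in execution order.

    `pending_tasks` is a list of (priority, task_id) pairs; an emergency
    follow-up task would carry a lower priority number than a routine
    timed patrol, so it is popped from the heap first.
    """
    heap = list(pending_tasks)
    heapq.heapify(heap)
    order = []
    while heap:
        _, task_id = heapq.heappop(heap)
        order.append(task_id)
    return order
```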
Compared with the prior art, the technical scheme provided by the embodiment of the invention has the following advantages:
the unmanned aerial vehicle inspection system includes an unmanned aerial vehicle, a control terminal and a server. The control terminal is used for sending an inspection task to the unmanned aerial vehicle, the inspection task including an inspection route for inspecting the railway line and an inspection control instruction. The unmanned aerial vehicle is used for collecting first inspection data based on the inspection route and the inspection control instruction after receiving the inspection task, and for sending the first inspection data to the server. The server is used for receiving the first inspection data, analyzing the first inspection data to obtain a first analysis result indicating whether floatable objects exist along the railway, and sending the first analysis result to the control terminal; when floatable objects exist along the railway, the first analysis result further includes the type of the floatable object and/or the inspection data corresponding to the floatable object. The control terminal is used for generating a target inspection task according to the type of the floatable object and/or the inspection data corresponding to the floatable object when the first analysis result indicates that a floatable object exists, wherein the types of floatable objects are target floatable objects and suspected floatable objects. In this way, the labor cost of the inspection process can be reduced, inspection blind spots are avoided, floatable objects along the railway are found and removed in time, and safety along the railway is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle inspection system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of another unmanned aerial vehicle inspection system according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of still another unmanned aerial vehicle inspection system according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of still another unmanned aerial vehicle inspection system according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of still another unmanned aerial vehicle inspection system according to an embodiment of the present invention.
Detailed Description
In order that the above objects, features and advantages of the invention will be more clearly understood, a further description of the invention will be made. It should be noted that, without conflict, the embodiments of the present invention and features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced otherwise than as described herein; it will be apparent that the embodiments in the specification are only some, but not all, embodiments of the invention.
It should be understood that the various steps recited in the method embodiments of the present invention may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the invention is not limited in this respect.
It should be noted that in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
It should be noted that references to "a," "an," and "a plurality" in this disclosure are illustrative rather than limiting, and those skilled in the art will appreciate that "a" and "an" should be understood as "one or more" unless the context clearly indicates otherwise.
Under normal conditions, conventional daily inspection along railways, such as regular inspection, irregular inspection and emergency inspection, is performed by dedicated inspection personnel, who remove floatable objects found along the railway during inspection, thereby eliminating the hidden danger that such objects pose to railway traffic safety. However, with the rapid development of railways, the number of railway lines keeps growing and the daily inspection workload grows with it, occupying a large amount of human resources and driving up labor costs; meanwhile, inspection personnel are prone to blind spots during daily inspection, so some floatable objects cannot be removed in time. To solve these problems, an embodiment of the invention provides an unmanned aerial vehicle inspection system, which is described below with reference to specific embodiments.
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle inspection system according to an embodiment of the present invention.
As shown in fig. 1, an unmanned aerial vehicle inspection system 10 provided by an embodiment of the present invention includes an unmanned aerial vehicle 11, a control terminal 12, and a server 13.
In the embodiment of the present invention, the control terminal 12 is configured to send a patrol task to the unmanned aerial vehicle 11, where the patrol task includes a patrol route for patrol of the railway line and a patrol control instruction.
The patrol task can be a preset timing patrol task or a temporarily established emergency patrol task.
In the embodiment of the invention, the patrol range corresponding to the patrol task may be the area extending a preset distance to the left and a preset distance to the right of the inspected railway line. For example, if the preset distance is 500 meters, the patrol range is the area within 500 meters on each side of the line.
Further, the inspection route includes an inspection start point and an inspection end point corresponding to the inspection task, and the inspection control instruction is used for controlling the unmanned aerial vehicle 11 to complete the instruction of the inspection task based on the inspection route, which may specifically include an inspection speed, an inspection direction, an inspection height, and the like.
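A hedged sketch of how such a task message might be assembled, bundling the inspection route (start and end points) with the inspection control instruction (speed, direction, height); every field name here is an assumption for illustration, not the patent's actual format.

```python
def build_patrol_task(start_point, end_point, speed_mps, heading_deg, altitude_m):
    """Bundle an inspection route with its inspection control instruction."""
    return {
        "route": {"start": start_point, "end": end_point},
        "control": {
            "speed_mps": speed_mps,      # inspection speed
            "heading_deg": heading_deg,  # inspection direction
            "altitude_m": altitude_m,    # inspection height
        },
    }
```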
The first patrol data may include one or more pieces of sub-patrol data, and the size of each piece of sub-patrol data may be determined according to the memory size of the unmanned aerial vehicle or a preset duration.
The unmanned aerial vehicle 11 is configured to collect first inspection data based on an inspection route and an inspection control command after receiving the inspection task, and send the first inspection data to the server 13.
Specifically, after receiving the inspection task, the unmanned aerial vehicle 11 analyzes the inspection task, acquires an inspection route and an inspection control instruction, flies according to the inspection route and the inspection control instruction, and acquires railway line image data through at least one image acquisition device such as a camera and the like installed on the unmanned aerial vehicle 11 in the flying process, so as to acquire first inspection data.
In some embodiments of the present invention, after collecting the first inspection data, the unmanned aerial vehicle 11 packages and marks the first inspection data before sending it to the server 13, where packaging and marking include compressing the first inspection data and marking information such as the corresponding inspection task, the inspection route, the inspection start time, and the number of the unmanned aerial vehicle performing the inspection.
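The packaging-and-marking step could look like the following sketch, which compresses the raw data and attaches the metadata the paragraph lists; zlib compression and the dictionary layout are illustrative choices, not the patent's actual format.

```python
import zlib

def package_patrol_data(raw_bytes, task_id, route_id, start_time, drone_id):
    """Compress collected inspection data and attach patrol metadata."""
    return {
        "meta": {
            "task_id": task_id,        # inspection task the data belongs to
            "route_id": route_id,      # inspection route flown
            "start_time": start_time,  # inspection start time
            "drone_id": drone_id,      # number of the UAV that flew it
        },
        "payload": zlib.compress(raw_bytes),
    }
```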
The server 13 is configured to receive the first inspection data, analyze the first inspection data to obtain a first analysis result indicating whether floatable objects exist along the railway, and send the first analysis result to the control terminal 12, where the first analysis result further includes the type of the floatable object and/or the inspection data corresponding to the floatable object when floatable objects exist along the railway.
In the embodiment of the invention, the first analysis result may be one of two cases: floatable objects exist along the railway, or no floatable objects exist along the railway.
A floatable object can be understood as an object that may float and thereby pose a hidden danger to traffic safety along the railway, that is, a floatable-object hidden danger.
In the embodiment of the invention, floatable objects may include dust screens, sunshade nets, advertising banners, vegetable greenhouses, floating plastic bags, and other objects along the railway that affect railway traffic safety.
The control terminal 12 is configured to generate a target inspection task according to the type of the floatable object and/or the inspection data corresponding to the floatable object when the first analysis result indicates that a floatable object exists, where the types of floatable objects are target floatable objects and suspected floatable objects.
In the embodiment of the present invention, a suspected floatable object can be understood as an object between a floatable object and an interfering object, for which it has not yet been determined which of the two it is.
The target inspection task may be an inspection task for following up a target floatable object, or an inspection task for further determining a suspected floatable object.
In the embodiment of the invention, the unmanned aerial vehicle inspection system includes an unmanned aerial vehicle, a control terminal and a server. The control terminal is used for sending an inspection task to the unmanned aerial vehicle, the inspection task including an inspection route for inspecting the railway line and an inspection control instruction. After receiving the inspection task, the unmanned aerial vehicle collects first inspection data based on the inspection route and the inspection control instruction and sends the first inspection data to the server. The server receives the first inspection data, analyzes the first inspection data to obtain a first analysis result indicating whether floatable objects exist along the railway, and sends the first analysis result to the control terminal; when floatable objects exist along the railway, the first analysis result further includes the type of the floatable object and/or the inspection data corresponding to the floatable object. When the first analysis result indicates that a floatable object exists, the control terminal generates a target inspection task according to the type of the floatable object and/or the inspection data corresponding to the floatable object, where the types of floatable objects are target floatable objects and suspected floatable objects. In this way, the labor cost of the inspection process can be reduced, inspection blind spots are avoided, floatable objects along the railway are found and removed in time, and safety along the railway is improved.
In some embodiments of the present invention, the number of unmanned aerial vehicles 11 is at least one. The control terminal 12 may send inspection tasks to a plurality of unmanned aerial vehicles 11 at the same time, with the plurality of unmanned aerial vehicles 11 performing the inspection tasks simultaneously so as to inspect floatable objects along one or more railway lines; the control terminal 12 may also send inspection tasks to the plurality of unmanned aerial vehicles 11 in batches or one by one in sequence, which is not limited herein.
In some embodiments of the present invention, the unmanned aerial vehicle inspection system 10 may further include at least one automatic airport, with adjacent automatic airports spaced a preset distance apart, for example 10 km. Each automatic airport may provide charging, parking, microwave communication and other services to one or more unmanned aerial vehicles 11, and each automatic airport further includes a communication module for communication between the unmanned aerial vehicle 11 parked at the automatic airport and the control terminal 12 and the server 13.
Fig. 2 is a schematic structural diagram of another unmanned aerial vehicle inspection system according to an embodiment of the present invention.
As shown in fig. 2, the unmanned aerial vehicle inspection system 20 includes an unmanned aerial vehicle 21, a control terminal 22, and a server 23, and in the embodiment of the present invention, the server 23 includes an image processing module 231.
The image processing module 231 is configured to perform frame-division processing on the first inspection data to obtain at least one target image data, and then analyze the at least one target image data to obtain a first analysis result.
Further, the image processing module 231 may include a first computing unit 2301, a second computing unit 2302, a filter processing unit 2303, and an image recognition unit 2304.
The first calculating unit 2301 is configured to extract pixel coordinates corresponding to at least one target image data, and calculate connected domain areas corresponding to the at least one target image data based on the pixel coordinates.
Here, a connected domain may be understood as a region of the target image data in which adjacent pixels have the same pixel value.
Specifically, the calculation formula of the connected domain area of the target image data is as follows:

S_i = c · | Σ_{j=1}^{n_i} ( x_{i,j} · y_{i,j+1} - x_{i,j+1} · y_{i,j} ) |

wherein S_i represents the area of the i-th connected domain; i represents the serial number value of a connected domain of the target image data; c represents a constant; j represents the serial number value of a boundary pixel point of the i-th connected domain; n_i represents the total number of boundary pixel points of the i-th connected domain; x_{i,j} and y_{i,j} represent the abscissa and the ordinate of the j-th pixel point of the i-th connected domain; x_{i,j+1} and y_{i,j+1} represent the abscissa and the ordinate of the (j+1)-th pixel point of the i-th connected domain, with indices taken cyclically so that the (n_i+1)-th point is the 1st point.

Further, c may take the value 0.5, in which case the formula is the shoelace (surveyor's) area formula.
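A minimal sketch of this area computation, assuming the formula amounts to the shoelace rule applied to the ordered boundary pixels of one connected domain (the ordering of the boundary points is an assumption of the sketch, not fixed by the embodiment):

```python
def connected_domain_area(boundary, c=0.5):
    """Shoelace (surveyor's) area of one connected domain, computed from
    its ordered boundary pixel coordinates (x_{i,j}, y_{i,j}); c = 0.5
    matches the constant in the formula above."""
    n = len(boundary)
    acc = 0.0
    for j in range(n):
        x_j, y_j = boundary[j]
        x_next, y_next = boundary[(j + 1) % n]  # indices wrap around
        acc += x_j * y_next - x_next * y_j
    return c * abs(acc)

# A 4x3 axis-aligned rectangle has area 12.
rect = [(0, 0), (4, 0), (4, 3), (0, 3)]
area = connected_domain_area(rect)
```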
The second calculation unit 2302 is configured to compare the connected domain area with a preset area threshold value, and determine a target connected domain with the connected domain area greater than or equal to the preset area threshold value.
The preset area threshold is an area threshold which is preset and stored and used for screening the connected domain.
Specifically, the second calculation unit 2302 compares the connected domain area with the preset area threshold, and then removes connected domains having the connected domain area smaller than the preset area threshold according to the comparison result.
The specific formula for removing the connected domains whose area is smaller than the preset area threshold is as follows:

F(x, y) = Σ_{i=1}^{N} f_i(x, y) · 1{ S_i ≥ S_0 }

wherein F represents the target coordinate function after removing the connected domains whose area is smaller than the preset area threshold; f_i represents the coordinate function of the i-th connected domain; S_0 represents the preset area threshold; N represents the total number of connected domains; and 1{·} is an indicator function that equals 1 when the condition in braces holds and 0 otherwise, so that only connected domains with S_i ≥ S_0 are retained.
The filter processing unit 2303 is configured to perform filter processing on the target image data based on the target connected domain.
In the embodiment of the invention, the filtering process may be understood as removing factors that degrade the image quality of the target image data, such as bright spots and dark spots. These factors are noise in the image, which often appears as isolated pixel points or small pixel blocks that produce a strong visual effect. A connected domain whose area is smaller than the preset area threshold is judged to be noise and is removed; this removal filters the noise out of the target image.
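The noise-filtering rule can be sketched on a label image, i.e. an integer image where 0 is background and each positive value marks one connected domain; how the labels are produced is left open here and could be done with, for example, scipy.ndimage.label:

```python
import numpy as np

def remove_small_domains(labels: np.ndarray, min_area: int) -> np.ndarray:
    """Erase every connected domain whose pixel count falls below the
    preset area threshold: such domains are treated as noise (isolated
    specks) and set to background."""
    filtered = labels.copy()
    for domain_id in np.unique(labels):
        if domain_id == 0:
            continue                       # skip background
        mask = labels == domain_id
        if mask.sum() < min_area:          # area below threshold -> noise
            filtered[mask] = 0             # remove the whole domain
    return filtered

img = np.array([[1, 1, 0, 2],
                [1, 1, 0, 0],
                [0, 0, 3, 3]])
# Domain 2 has only 1 pixel and is removed; domains 1 and 3 survive.
clean = remove_small_domains(img, min_area=2)
```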
In this embodiment of the present invention, the second calculating unit 2302 may be further configured to remove connected domains whose area is greater than a target area threshold, or, equivalently, to mark connected domains whose area is greater than the target area threshold, where the target area threshold is greater than the preset area threshold, so as to reduce the computing load of the server 23.
The image recognition unit 2304 is configured to perform image recognition on the target image data after the filtering processing, obtain a recognition result, and obtain a first analysis result based on the recognition result.
In the embodiment of the invention, image recognition can be understood as recognizing a target object in target image data.
In some embodiments of the present invention, the image recognition unit 2304 may compare the target image data with a preset image set in a database to efficiently and accurately recognize the target object.
In other embodiments of the present invention, the image recognition unit 2304 may input the target image data into a pre-trained machine learning model, identify the target image data by the pre-trained machine learning model, and output a recognition result, where the recognition result includes the target object and a location in the image where the target object is located.
In the embodiment of the invention, by determining the connected domains whose area is greater than or equal to the preset area threshold and determining the area and position of each such connected domain, the total area and contour of the target image corresponding to the target image data are obtained by accumulation, which improves the quality of the target image. The filtered target image data is then compared with the preset image set in the database, so that image recognition of the filtered target image data identifies flyable objects more accurately and efficiently, effectively guaranteeing the recognition capability of the system and improving the inspection efficiency of the unmanned aerial vehicle.
Fig. 3 is a schematic structural diagram of still another unmanned aerial vehicle inspection system according to an embodiment of the present invention.
As shown in fig. 3, the unmanned aerial vehicle inspection system 30 includes an unmanned aerial vehicle 31, a control terminal 32 and a server 33, and in the embodiment of the present invention, the server 33 includes an image processing module 331, a database to be determined 332, a drift object database 333 and an interference object database 334, where the image processing module 331 includes a first computing unit 3301, a second computing unit 3302, a filtering processing unit 3303 and an image recognition unit 3304.
Further, the image recognition unit 3304 is specifically configured to match, after obtaining the recognition result, the target object corresponding to the recognition result with the objects in the database to be determined 332, the volatile object database 333, and the interfering object database 334, respectively, so as to obtain a first analysis result.
The objects in the to-be-determined database 332 are objects that are stored in neither the flyable object database 333 nor the interferent database 334 and that require manual assistance, or re-collection of the inspection data corresponding to the target object, before a judgment can be made. After an object in the to-be-determined database 332 has been further judged in this way, it is moved into the corresponding database, i.e. the flyable object database 333 or the interferent database 334, according to the judgment result.
The flyable objects stored in the flyable object database 333 may include vegetable greenhouses along the railway, agricultural mulch films, colored steel plate houses, dust screens, garbage stacking points with the number of plastic bags exceeding a preset threshold, banners, kites below the flying route, and the like.
The interferents stored in the interferent database 334 may include objects having similar colors and shapes to the flyable objects and easily interfering with the judgment of the flyable objects, such as buildings painted with a predetermined color along the railway, sun-dried clothes, etc.
Specifically, after obtaining the recognition result, the image recognition unit 3304 matches the recognition result with the objects in the database to be determined 332, the volatile object database 333, and the interfering object database 334, respectively, to obtain a first analysis result.
The matching method may be performed by a similarity calculation method, which is the same as the prior art and is not described herein.
In the embodiment of the present invention, when the to-be-determined database 332 contains no stored objects, the recognition result is matched only against the objects in the flyable object database 333 and the interferent database 334.
In some embodiments of the present invention, the image recognition unit 3304 determines the type of the database where any object is located as the type of the target object when the similarity value between the target object and any object in the database 332 to be determined, the volatile object database 333, and the interfering object database 334 reaches a preset threshold.
For example, the preset threshold is 80%, and when the similarity value between the target object and any object in the flyable object database 333 reaches 80% or more, the type of the target object is determined as a flyable object; when the similarity value between the target object and any object in the interferent database 334 reaches 80% or more, the type of the target object is determined as the interferent.
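The threshold rule above can be sketched as follows; the database names, the score layout, and the rule of preferring the highest qualifying score when both databases match are illustrative assumptions:

```python
def classify_target_object(similarities, threshold=0.8):
    """Decide the type of a recognized target object from its similarity
    scores against the flyable-object and interferent databases, as in
    the 80% example above. Scores are assumed to lie in [0, 1]; objects
    matching neither database are routed to the to-be-determined
    database."""
    best_type, best_score = "to_be_determined", 0.0
    for db_type, scores in similarities.items():
        top = max(scores, default=0.0)
        if top >= threshold and top > best_score:
            best_type, best_score = db_type, top
    return best_type

kind = classify_target_object({
    "flyable_object": [0.55, 0.83],  # 0.83 >= 0.8: matches a stored object
    "interferent": [0.40],
})
```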
In other embodiments of the present invention, when the similarity values between the target object and all objects in the flyable object database 333 and the interferent database 334 are smaller than the preset threshold, the image recognition unit 3304 stores the target object in the to-be-determined database 332 and at the same time reports the target object to the server 33. The relevant manager then further confirms the type of the target object based on the feedback of the server 33, or the server 33 further identifies and judges the target object based on a pre-trained machine learning model.
In the embodiment of the invention, after the type of the target object is determined, the data of the flyable object database and the interferent database are further supplemented and updated based on that type, implementing a positive-gain feedback loop so that the unmanned aerial vehicle inspection system continuously optimizes and improves itself during use. The target object in the target image data is thus identified quickly, the accuracy of target object recognition is improved, and the recognition result for flyable objects becomes more accurate, which further improves the driving safety of the railway. Meanwhile, target objects whose similarity values to every object in the to-be-determined database, the flyable object database and the interferent database are smaller than the preset threshold can be quickly distinguished and processed with manual assistance, which avoids inaccurate recognition of flyable objects caused by judgment errors and improves the flexibility of the unmanned aerial vehicle inspection system.
Based on the above-described embodiments of the present invention, the server may further include a database creation module.
The database creation module is used to create the to-be-determined database 332, the flyable object database 333 and the interferent database 334.
Specifically, after obtaining a target object, the database creation module inputs the target object into a data classification model to classify it and obtain a classification result, extracts the features of the target object, and establishes an association relationship between the target object and the extracted features. Based on the classification result, it establishes the flyable object database 333 and the interferent database 334 and stores the association relationship between each target object and its extracted features in the corresponding database. It also establishes the to-be-determined database 332, into which a target object is placed when its type cannot be determined from the analysis result of the corresponding inspection data.
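The routing of classified objects into the three databases can be sketched as below; the classification and feature-extraction steps are stubbed out, and the category names and example objects are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ObjectDatabases:
    """Minimal sketch of the database creation module: store each target
    object's association (object -> extracted features) in the flyable
    object, interferent, or to-be-determined database."""
    flyable: Dict[str, List[str]] = field(default_factory=dict)
    interferent: Dict[str, List[str]] = field(default_factory=dict)
    to_be_determined: Dict[str, List[str]] = field(default_factory=dict)

    def store(self, name: str, features: List[str], category: str) -> None:
        # Unknown categories fall through to the to-be-determined database.
        table = {"flyable": self.flyable,
                 "interferent": self.interferent}.get(category,
                                                      self.to_be_determined)
        table[name] = features  # association: object -> features

db = ObjectDatabases()
db.store("agricultural mulch film", ["white", "thin", "reflective"], "flyable")
db.store("sun-dried clothes", ["colored", "rectangular"], "interferent")
db.store("unknown tarp", ["blue"], "undetermined")
```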
Fig. 4 is a schematic structural diagram of still another unmanned aerial vehicle inspection system according to an embodiment of the present invention.
As shown in fig. 4, the unmanned aerial vehicle inspection system 40 includes an unmanned aerial vehicle 41, a control terminal 42 and a server 43. In the embodiment of the present invention, the server 43 includes an image processing module 431, a to-be-determined database 432, a flyable object database 433 and an interferent database 434, and the control terminal 42 includes a hidden danger follow-up module 421 and an inspection task generating module 422.
The hidden danger follow-up module 421 is configured to obtain a target feature of the target flyable object when the first analysis result indicates that the flyable object is a target flyable object, and determine a hidden danger elimination scheme based on the target feature, where the target feature includes at least one of a region feature of the target flyable object, a type of the target flyable object, a size of the target flyable object, and a color of the target flyable object.
In the embodiment of the present invention, the hidden danger follow-up module 421 may obtain the target features corresponding to the target flyable object from the flyable object database based on the type of the target flyable object; alternatively, the target image data corresponding to the target flyable object may be input into a pre-trained machine learning model, which outputs the target features corresponding to the target flyable object.
The hidden danger follow-up module 421 may match the target feature with a preset scheme library after the target feature is acquired, and determine a hidden danger elimination scheme corresponding to the target flyable object based on the matching result, thereby determining an elimination scheme corresponding to the target flyable object according to the target feature of the target flyable object.
Wherein, the scheme library is a scheme database which is pre-established and is used for removing the drift matters.
The hidden danger eliminating scheme comprises the type of the easy-to-float object, the number of hidden danger eliminating personnel corresponding to the easy-to-float object, hidden danger eliminating time corresponding to the easy-to-float object, hidden danger eliminating required matched tools and the like, so that the hidden danger can be quickly and effectively eliminated, and meanwhile, the maintenance cost is reduced.
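The matching of target features against the scheme library can be sketched as follows; the feature-overlap score, the library entries and their fields (crew size, tools) are illustrative assumptions, since the embodiment does not specify the matching method:

```python
def match_elimination_scheme(target_features, scheme_library):
    """Pick the hidden danger elimination scheme whose key features
    overlap most with the target flyable object's features (a simple
    stand-in for the unspecified matching step)."""
    def overlap(scheme):
        return len(set(scheme["features"]) & set(target_features))
    return max(scheme_library, key=overlap)

library = [
    {"name": "mulch-film removal", "features": ["film", "white"],
     "crew": 2, "tools": ["hook pole"]},
    {"name": "banner removal", "features": ["banner", "fabric"],
     "crew": 1, "tools": ["ladder", "shears"]},
]
scheme = match_elimination_scheme(["white", "film", "large"], library)
```

The selected scheme then carries the personnel count, elimination time and matched tools described above.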
The hidden danger follow-up module 421 may also be configured to generate a flyable object processing task based on the target flyable object and its corresponding target image data, add the task to a flyable object processing task list, and sort the tasks in the list based on their corresponding hidden danger elimination schemes, so that the hidden danger areas corresponding to the target flyable objects can be conveniently managed and maintained.
The hidden danger follow-up module 421 can also be used for acquiring the implementation progress of the hidden danger elimination scheme in real time, and updating the hidden danger elimination situation in real time based on the implementation progress of the hidden danger elimination scheme.
Optionally, the hidden danger follow-up module 421 includes a hidden danger tagging unit 4201.
The hidden danger marking unit 4201 is configured to generate a first marking result based on the first analysis result, where the first marking result includes a beidou coordinate corresponding to the target easily-floating object, a relative coordinate of the target easily-floating object and the railway, a target feature of the target easily-floating object, and a discovery time of the target easily-floating object; generating a second labeling result based on the second analysis result; and generating a third labeling result based on the third analysis result.
In the embodiment of the invention, the second labeling result can be used for representing the elimination condition of the target drift easily.
The third labeling result is used for updating the first labeling result, namely updating the target flyable object in the first labeling result when the suspected flyable object is determined to be the target flyable object according to the third analysis result.
The inspection task generating module 422 is configured to determine a first target area corresponding to the flyable object when the first analysis result indicates that the flyable object is present and the type of the flyable object is a target flyable object, generate a first target control instruction for the first target area, and generate a first target inspection task based on the first target area and the first target control instruction, so that the unmanned aerial vehicle 41 performs the first target inspection task.
In some embodiments of the present invention, the inspection task generating module 422 may generate a first target inspection task according to a first target area where the target flyable object is located, where the first target inspection task includes the first target area, an inspection route corresponding to the first target area, a first target control instruction, and a time for executing the first target inspection task, so that the unmanned aerial vehicle 41 executes the first target inspection task, and then follows the removal situation of the target flyable object based on the first target inspection task.
In other embodiments of the present invention, the inspection task generating module 422 may instead search the upcoming preset inspection tasks for one that covers the first target area where the target flyable object is located and determine that preset inspection task as the first target inspection task, so that the unmanned aerial vehicle 41 follows up on the removal of the target flyable object while executing the preset inspection task.
The inspection task generating module 422 is further configured to, when the first analysis result indicates that a flyable object is present and its type is a suspected flyable object, determine a second target area corresponding to the suspected flyable object, generate a second target control instruction for the second target area, and generate a second target inspection task based on the second target area and the second target control instruction, so that the unmanned aerial vehicle 41 executes the second target inspection task, where the second target control instruction includes at least one of a hover instruction, a steering control instruction and a descent instruction.
In the embodiment of the present invention, when the first analysis result indicates that the type of the flyable object is a suspected flyable object, the inspection task generating module 422 creates the second target inspection task in order to further determine whether the suspected flyable object is a target flyable object, which in turn facilitates following up on the removal of the flyable object.
Specifically, when the first analysis result indicates that the type of the flyable object is a suspected flyable object, the inspection task generating module 422 directly generates the second target inspection task. When the second target inspection task is executed, the area where the suspected flyable object is located is closely observed and data is collected through at least one of a hover instruction, a steering control instruction and a descent instruction, so as to determine whether the suspected flyable object is a target flyable object.
In the embodiment of the invention, the patrol task may further include a patrol duration. When the second target patrol task is executed during execution of the patrol task, timing of the patrol duration is suspended, and after execution of the second target patrol task finishes, timing of the patrol duration resumes, so that the integrity of the patrol task is guaranteed even when a second target patrol task is executed.
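The pause-and-resume of the patrol duration can be sketched with a tick-based timer; the tick granularity and method names are illustrative assumptions, and a real implementation would read a monotonic clock:

```python
class PatrolTimer:
    """Patrol-duration timer that is suspended while a second target
    inspection task runs and resumed afterwards, so the interrupting
    task does not count against the original patrol duration."""
    def __init__(self):
        self.elapsed = 0
        self.paused = False

    def tick(self, seconds: int) -> None:
        if not self.paused:
            self.elapsed += seconds

    def pause(self) -> None:   # second target patrol task starts
        self.paused = True

    def resume(self) -> None:  # second target patrol task finished
        self.paused = False

timer = PatrolTimer()
timer.tick(60)    # normal patrol
timer.pause()
timer.tick(120)   # time spent on the second target task, not counted
timer.resume()
timer.tick(30)    # patrol continues
```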
The unmanned aerial vehicle 41 is further configured to perform a first target inspection task to obtain second inspection data, send the second inspection data to the server 43, and/or perform a second target inspection task to obtain third inspection data, and send the third inspection data to the server 43.
In the embodiment of the present invention, the specific implementation of the unmanned aerial vehicle 41 to execute the first target inspection task and/or the second target inspection task is similar to the specific implementation of the inspection task described above, and will not be described herein.
The server 43 is further configured to receive the second inspection data, analyze the second inspection data to obtain a second analysis result, and/or receive the third inspection data, and analyze the third inspection data to obtain a third analysis result.
In the embodiment of the present invention, the specific implementation of the server 43 for performing the second inspection data analysis and/or the third inspection data analysis is similar to the specific implementation of the first inspection data analysis described above, and will not be described herein.
Optionally, after the third analysis result is obtained, the server 43 is further configured to, when the third analysis result determines that the suspected flyable object is a target flyable object or an interferent, generate a tag corresponding to the suspected flyable object and store the tag and the inspection data corresponding to the suspected flyable object in the flyable object database 433 or the interferent database 434 of the server 43, so that the databases in the server 43 are updated in real time.
In the above embodiment of the present invention, the control terminal 42 further includes a flag statistics module 423.
The marking statistics module 423 is configured to accumulate marking results of the multiple inspection tasks in a preset time period by the hidden danger marking unit 4201, and generate a first statistics result; the marking statistics module 423 is further configured to update and count the first marking result according to the second marking result and/or the third marking result, and generate a second statistical result.
In some embodiments of the present invention, the marking statistics module 423 is configured to accumulate marking results of the multiple inspection tasks in the preset time period by the hidden danger marking unit 4201, and generate a first statistics result, so that the control terminal 42 can quickly obtain information such as the number of times of hidden danger occurring along the railway line, the number of hidden danger occurring, and the time of hidden danger occurring in the preset time period, which is beneficial to further specifying the correction measure, and further improving the driving safety of the railway.
In other embodiments, the marking statistics module 423 is configured to update and count the first marking result according to the second marking result and/or the third marking result and generate a second statistical result, and to determine, according to the second statistical result, the removal situation of the target flyable objects corresponding to the current inspection task, so as to judge whether those target flyable objects have been completely removed and then determine the next execution scheme.
Optionally, the control terminal 42 includes a display module 424.
The display module 424 is configured to generate a corresponding planar map based on the first inspection data, display the planar map in real time, and display position information of the unmanned aerial vehicle on the planar map in real time.
In the embodiment of the invention, the first patrol data can be displayed in real time through the display module to generate the corresponding planar map and the position information of the unmanned aerial vehicle, so that the execution condition of the patrol task and the condition of the unmanned aerial vehicle can be further known.
The display module 424 is further configured to display the hidden danger location of the drift object and the type of the drift object on the planar map when the first analysis result indicates that the drift object exists.
In the embodiment of the invention, the hidden danger positions of the flyable objects and the types of the flyable objects can be displayed on the planar map through the display module so as to further know the distribution condition of the flyable objects.
In the embodiment of the invention, the plane map generated by the railway line monitoring image data is displayed in real time through the display module, so that the unmanned aerial vehicle and the area where the easily-floating object are located can be marked and positioned conveniently and rapidly, the management efficiency is improved, the timely and effective data intercommunication between the control terminal and the unmanned aerial vehicle is realized, and a large amount of manpower and material resources are saved.
Fig. 5 is a schematic structural diagram of still another unmanned aerial vehicle inspection system according to an embodiment of the present invention.
As shown in fig. 5, the unmanned aerial vehicle inspection system 50 includes an unmanned aerial vehicle 51, a control terminal 52, and a server 53.
In an embodiment of the present invention, the unmanned aerial vehicle 51 may include a task determination module 511.
The task determination module 511 is configured to determine a priority of the patrol task when there are a plurality of patrol tasks, and execute the patrol task based on the priority.
In the embodiment of the invention, the priority of the second target patrol task is higher than the priority of the patrol task, and the priority of the patrol task is higher than the priority of the first target patrol task.
In the embodiment of the invention, the control terminal 52 may also synchronize time with the unmanned aerial vehicle automatically in real time, periodically or aperiodically, acquire the time nodes of the unmanned aerial vehicle's patrol data, and judge whether the return time of the patrol data reaches a preset duration. This ensures the accuracy of the unmanned aerial vehicle 51's clock, and avoids problems such as positioning errors caused by a time difference between the control terminal 52 and the unmanned aerial vehicle 51, or failure to obtain the patrol data collected by the unmanned aerial vehicle 51 in time.
In the embodiment of the invention, the priority of the inspection task can be judged through the task judging module, so that the execution effectiveness and timeliness of the inspection task are ensured.
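The priority ordering of the embodiment (second target patrol task > ordinary patrol task > first target patrol task) can be sketched with a priority queue; the numeric priority values and the insertion-order tie-break are illustrative assumptions:

```python
import heapq

# Lower number = more urgent; values chosen to encode the ordering above.
PRIORITY = {"second_target": 0, "patrol": 1, "first_target": 2}

def execution_order(tasks):
    """Return task names in the order the task determination module
    would execute them, given (kind, name) pairs."""
    heap = [(PRIORITY[kind], idx, name)
            for idx, (kind, name) in enumerate(tasks)]
    heapq.heapify(heap)
    return [name for _, _, name in
            (heapq.heappop(heap) for _ in range(len(heap)))]

order = execution_order([
    ("first_target", "re-check removed film"),
    ("patrol", "daily route K120-K140"),
    ("second_target", "confirm suspected object"),
])
```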
The foregoing is only a specific embodiment of the invention to enable those skilled in the art to understand or practice the invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown and described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. The unmanned aerial vehicle inspection system is characterized by comprising an unmanned aerial vehicle, a control terminal and a server;
the control terminal is used for sending a patrol task to the unmanned aerial vehicle, wherein the patrol task comprises a patrol route used for patrol of the railway line and a patrol control instruction;
the unmanned aerial vehicle is used for collecting first patrol data based on the patrol route and the patrol control instruction after receiving the patrol task, and sending the first patrol data to the server;
the server is used for receiving the first inspection data, analyzing the first inspection data to obtain a first analysis result of whether the easy-to-float objects exist along the railway, sending the first analysis result to the control terminal, and when the easy-to-float objects exist along the railway, the first analysis result also comprises the type of the easy-to-float objects and/or the inspection data corresponding to the easy-to-float objects;
And the control terminal is used for generating a target inspection task according to the type of the easy-to-float object and/or inspection data corresponding to the easy-to-float object when the first analysis result is that the easy-to-float object exists, wherein the type of the easy-to-float object is the target easy-to-float object and the suspected easy-to-float object.
2. The unmanned aerial vehicle inspection system of claim 1, wherein the server comprises an image processing module;
the image processing module is used for carrying out framing processing on the first inspection data to obtain at least one target image data, and then analyzing the at least one target image data to obtain the first analysis result.
3. The unmanned aerial vehicle inspection system of claim 2, wherein the image processing module comprises a first computing unit, a second computing unit, a filtering processing unit, and an image recognition unit;
the first calculating unit is used for extracting pixel coordinates corresponding to the at least one target image data respectively, and calculating the connected domain areas corresponding to the at least one target image data respectively based on the pixel coordinates;
the second calculation unit is used for comparing the area of the communicating domain with a preset area threshold value and determining a target communicating domain with the area of the communicating domain being larger than or equal to the preset area threshold value;
The filtering processing unit is used for performing filtering processing on the target image data based on the target connected domain;
the image recognition unit is used for carrying out image recognition on the target image data after the filtering processing to obtain a recognition result, and obtaining the first analysis result based on the recognition result.
4. The unmanned aerial vehicle inspection system according to claim 3, wherein the server comprises a database to be determined, an easy-to-float object database and an interferent database;
the image recognition unit is specifically configured to match, after the recognition result is obtained, a target object corresponding to the recognition result with objects in the database to be determined, the easy-to-float object database and the interferent database, respectively, so as to obtain the first analysis result.
5. The unmanned aerial vehicle inspection system of claim 1, wherein the control terminal comprises a hidden danger follow-up module and an inspection task generation module;
the hidden danger follow-up module is used for acquiring target characteristics of a target drift object when the first analysis result indicates that a drift object exists and the type of the drift object is a target drift object, and determining a hidden danger elimination scheme based on the target characteristics, wherein the target characteristics comprise at least one of the area characteristics of the target drift object, the type of the target drift object, the size of the target drift object and the color of the target drift object;
the inspection task generation module is used for determining a first target area corresponding to the drift object when the first analysis result indicates that a drift object exists and the type of the drift object is a target drift object, generating a first target control instruction for the first target area, and generating a first target inspection task based on the first target area and the first target control instruction, so that the unmanned aerial vehicle executes the first target inspection task;
the inspection task generation module is further used for determining a second target area corresponding to a suspected drift object when the first analysis result indicates that a drift object exists and the type of the drift object is a suspected drift object, generating a second target control instruction for the second target area, and generating a second target inspection task based on the second target area and the second target control instruction, so that the unmanned aerial vehicle executes the second target inspection task, wherein the second target control instruction comprises at least one of a hover instruction, a steering control instruction and a descent instruction;
the unmanned aerial vehicle is further used for executing the first target inspection task to obtain second inspection data and sending the second inspection data to the server, and/or executing the second target inspection task to obtain third inspection data and sending the third inspection data to the server;
the server is further configured to receive the second inspection data and analyze it to obtain a second analysis result, and/or receive the third inspection data and analyze it to obtain a third analysis result.
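A minimal sketch of how the inspection task generation module might map the first analysis result to a follow-up task. The dataclass fields, the priority values and the instruction names are illustrative assumptions; the claim only fixes that a suspected drift object's task may include hover, steering and descent instructions.

```python
from dataclasses import dataclass

@dataclass
class InspectionTask:
    target_area: tuple          # bounding region to re-inspect
    control_instructions: list  # e.g. ["hover", "steer", "descend"]
    priority: int = 0

def generate_target_task(analysis_result, target_area):
    """Build a follow-up inspection task from the first analysis result:
    a confirmed target drift object gets a direct re-inspection task,
    while a suspected drift object gets a closer-look task."""
    if analysis_result == "target drift object":
        return InspectionTask(target_area, ["navigate"], priority=2)
    if analysis_result == "suspected drift object":
        return InspectionTask(target_area, ["hover", "steer", "descend"],
                              priority=1)
    return None  # no drift object found: no follow-up task
```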
6. The unmanned aerial vehicle inspection system of claim 5, wherein the hidden danger follow-up module comprises a hidden danger marking unit;
the hidden danger marking unit is used for generating a first marking result based on the first analysis result, wherein the first marking result comprises the BeiDou coordinates of the target drift object, the coordinates of the target drift object relative to the railway, the target characteristics of the target drift object and the discovery time of the target drift object; generating a second marking result based on the second analysis result; and generating a third marking result based on the third analysis result.
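The four fields of the first marking result can be modelled as a small record. The field names and the shape of the input dict are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MarkingResult:
    beidou_coords: tuple    # BeiDou (longitude, latitude) of the drift object
    railway_offset: tuple   # position relative to the railway, e.g. (mileage_m, lateral_m)
    target_features: dict   # area characteristics, type, size, colour
    discovery_time: datetime

def make_marking_result(analysis):
    """Assemble a first marking result from an analysis-result dict
    (hypothetical keys, for illustration only)."""
    return MarkingResult(
        beidou_coords=analysis["coords"],
        railway_offset=analysis["offset"],
        target_features=analysis["features"],
        discovery_time=analysis["time"],
    )
```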
7. The unmanned aerial vehicle inspection system of claim 6, wherein the control terminal further comprises a marking statistics module;
the marking statistics module is used for accumulating the marking results generated by the hidden danger marking unit over multiple inspection tasks within a preset time period, to generate a first statistical result;
the marking statistics module is further used for updating the statistics of the first marking result according to the second marking result and/or the third marking result, to generate a second statistical result.
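The accumulation behind the first statistical result can be sketched as a per-type count over a time window. The dict keys and the per-type grouping are assumptions for illustration.

```python
from collections import Counter
from datetime import datetime

def accumulate_marking_results(results, window_start, window_end):
    """First statistical result (sketch): count marking results per drift
    object type across the inspection tasks whose discovery time falls
    inside the preset time period."""
    stats = Counter()
    for r in results:
        if window_start <= r["time"] <= window_end:
            stats[r["type"]] += 1
    return stats
```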
8. The unmanned aerial vehicle inspection system of claim 5, wherein the server is further configured to generate a tag corresponding to the suspected drift object when the third analysis result determines that the suspected drift object is a target drift object or an interfering object, and to store the tag and the inspection data corresponding to the suspected drift object in the drift object database or the interfering object database of the server.
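The feedback step of claim 8, filing a confirmed label into the matching database so that later recognitions can match it directly, can be sketched with plain lists standing in for the databases. The tag structure is an assumption.

```python
def store_confirmed_label(third_result, suspected_label, inspection_data,
                          drift_db, interferent_db):
    """After the third analysis confirms what a suspected drift object is,
    generate a tag and store it, with its inspection data, in the drift
    object database or the interfering object database (sketch)."""
    tag = {"label": suspected_label, "data": inspection_data}
    if third_result == "target drift object":
        drift_db.append(tag)
    elif third_result == "interfering object":
        interferent_db.append(tag)
    return tag
```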
9. The unmanned aerial vehicle inspection system of claim 1, wherein the control terminal comprises a display module;
the display module is used for generating a corresponding plane map based on the first inspection data, displaying the plane map in real time, and displaying the position information of the unmanned aerial vehicle on the plane map in real time;
the display module is further used for displaying, on the plane map, the hidden danger positions and the types of the drift objects when the first analysis result indicates that drift objects exist.
10. The unmanned aerial vehicle inspection system of claim 1, wherein the unmanned aerial vehicle comprises a task determination module;
the task determination module is used for determining the priorities of the inspection tasks when a plurality of inspection tasks exist, and executing the inspection tasks based on their priorities.
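The priority-based execution of claim 10 can be sketched with a max-priority queue. The FIFO tie-break for equal priorities is an assumption; the claim does not specify one.

```python
import heapq

class TaskQueue:
    """Executes pending inspection tasks highest-priority first,
    mirroring the task determination module's behaviour (sketch)."""

    def __init__(self):
        self._heap = []
        self._order = 0  # tie-breaker so equal priorities run FIFO

    def add(self, priority, task):
        # heapq is a min-heap, so negate priority for highest-first.
        heapq.heappush(self._heap, (-priority, self._order, task))
        self._order += 1

    def next_task(self):
        return heapq.heappop(self._heap)[2] if self._heap else None
```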
CN202310756054.XA 2023-06-26 2023-06-26 Unmanned aerial vehicle inspection system Active CN116506471B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310756054.XA CN116506471B (en) 2023-06-26 2023-06-26 Unmanned aerial vehicle inspection system

Publications (2)

Publication Number Publication Date
CN116506471A true CN116506471A (en) 2023-07-28
CN116506471B CN116506471B (en) 2023-09-12

Family

ID=87318717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310756054.XA Active CN116506471B (en) 2023-06-26 2023-06-26 Unmanned aerial vehicle inspection system

Country Status (1)

Country Link
CN (1) CN116506471B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106741008A * 2016-12-29 2017-05-31 Beijing Jiaotong University Rail track impurity recognition method and system
CN107097810A * 2017-04-30 2017-08-29 Central South University UAV intelligent identification and early-warning method and system for foreign body intrusion along railways
US20180012323A1 * 2016-07-07 2018-01-11 Motorola Solutions, Inc Method and apparatus for improving dispatch of different types of law enforcement patrols for achieving a desired deterrent effect
CN114063638A * 2021-10-15 2022-02-18 CETC Yizhihang (Ningxia) Technology Co Ltd UAV inspection system and smart-city-enabled emergency equipment
CN115311584A * 2022-08-15 2022-11-08 Guizhou Power Grid Co Ltd Deep-learning-based method for inspecting floating and hanging objects in UAV video inspection of high-voltage power grids
CN115311354A * 2022-09-20 2022-11-08 China Railway Construction Electrification Bureau Group Co Ltd Foreign matter risk area identification method, device, equipment and storage medium
CN218038059U * 2022-09-20 2022-12-13 China Railway Construction Electrification Bureau Group Co Ltd Overhead contact system foreign matter early-warning and monitoring system
CN116246181A * 2023-01-17 2023-06-09 Huzhou Power Supply Co of State Grid Zhejiang Electric Power Drift monitoring method for ultra-high-voltage dense transmission channels based on satellite remote sensing

Also Published As

Publication number Publication date
CN116506471B (en) 2023-09-12

Similar Documents

Publication Publication Date Title
CN111310645B (en) Method, device, equipment and storage medium for warning overflow bin of goods accumulation
Varadharajan et al. Vision for road inspection
CN101587622B (en) Forest rocket detecting and identifying method and apparatus based on video image intelligent analysis
CN110414320B (en) Method and system for safety production supervision
CN102073846B (en) Method for acquiring traffic information based on aerial images
CN111753612B (en) Method and device for detecting casting object and storage medium
CN109815912B (en) Highway safety inspection system based on artificial intelligence
CN111814835A (en) Training method and device of computer vision model, electronic equipment and storage medium
CN109782364B (en) Traffic sign board missing detection method based on machine vision
CN108038424A (en) A kind of vision automated detection method suitable for working at height
CN116168356B (en) Vehicle damage judging method based on computer vision
CN113095441A (en) Pig herd bundling detection method, device, equipment and readable storage medium
CN109977862B (en) Recognition method of parking space limiter
CN115083212B (en) Unmanned aerial vehicle location intelligent management system based on three-dimensional modeling
CN116506471B (en) Unmanned aerial vehicle inspection system
Borkar et al. An efficient method to generate ground truth for evaluating lane detection systems
CN113963331B (en) Signal lamp identification model training method, signal lamp identification method and related device
CN112861701B (en) Illegal parking identification method, device, electronic equipment and computer readable medium
CN113129590A (en) Traffic facility information intelligent analysis method based on vehicle-mounted radar and graphic measurement
CN114155421A (en) A deep learning algorithm model automatic iteration method
CN116403162A (en) A method, system, and electronic device for recognizing target behavior on an airport scene
CN110378403B (en) Wire spool classification and identification method and system
CN115620047A (en) Target object attribute information determination method and device, electronic equipment and storage medium
CN113096395A (en) Road traffic safety evaluation system based on positioning and artificial intelligence recognition
CN116543327A (en) Method, device, computer equipment and storage medium for identifying work types of operators

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant