CN109379564A - Gas pipeline unmanned aerial vehicle inspection device and inspection method - Google Patents
Gas pipeline unmanned aerial vehicle inspection device and inspection method
- Publication number: CN109379564A
- Application number: CN201811276910.7A
- Authority: CN (China)
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
Abstract
A gas pipeline unmanned aerial vehicle (UAV) inspection device and inspection method, relating to the field of gas pipeline inspection, which solve the problems of existing natural gas pipeline inspection methods: low efficiency, difficulty of inspection, high cost and susceptibility to weather. In the present invention, the flight control module sends control instructions to the UAV module, and the multi-rotor UAV flies according to the control instructions. The video recording module passes the acquired image signals through the wireless image transmission module to the data receiving module, which forwards them to the image processing module and the flight control module. The image processing module stitches and fuses the received images and passes the processed images over the data bus to the image analysis module, the display module and the flight control module. The image analysis module analyzes suspected incident points and marks them on the image; the confirmed incident points are broadcast through the alarm module, and the processed images are passed to the display module for display. The flight control module transmits the received image data to the data service center.
Description
Technical field
The present invention relates to the technical field of gas pipeline inspection, and in particular to a gas pipeline UAV inspection device and inspection method.
Background art
Natural gas is the cleanest of the fossil fuels and also offers high heating value and high energy efficiency; it is currently regarded as the most promising high-efficiency energy source for the future. Pipeline transportation has become the preferred way of transporting natural gas because of its low cost, closed and secure operation, large capacity, guaranteed quality and ease of control and management. However, natural gas is flammable and explosive: once a pipeline leak occurs, it may cause a series of disasters such as explosion, fire, poisoning and environmental pollution, and the harm is even more serious if the leak occurs in a densely populated area. The main causes of natural gas pipeline leaks include external damage, material and installation corrosion and aging, improper operation, and natural disasters.
At present, natural gas pipelines are inspected manually: within cities, inspectors patrol by battery-powered electric bike, while inter-city lines are patrolled by inspectors equipped with cars. Manual inspection is slow, the inspection frequency is low, and some dangerous places or places without roads are difficult to inspect. Patrolling by bike or by car also consumes manpower and material resources, increases the inspection cost, is easily affected by the weather, and is inconvenient.
Summary of the invention
In order to solve the problems of existing natural gas pipeline inspection methods, namely low efficiency, difficulty of inspection, high cost and susceptibility to weather, the present invention provides a gas pipeline UAV inspection device and inspection method.
The technical solution adopted by the present invention to solve the above technical problems is as follows:
A gas pipeline UAV inspection device of the present invention comprises: a UAV module, a video recording module, a wireless image transmission module, a data receiving module, a flight control module, an image processing module, an image analysis module, an alarm module, a display module and a data service center.
The flight control module sends control instructions to the UAV module, and the multi-rotor UAV in the UAV module flies according to the control instructions.
The UAV module carries the video recording module and the wireless image transmission module and performs the flight of the UAV, the acquisition of ground images and the image transmission.
The video recording module passes the acquired image signals through the wireless image transmission module to the data receiving module.
The data receiving module passes the received image signals over the data bus to the image processing module and the flight control module.
The image processing module stitches and fuses the received images, and passes the stitched and fused images over the data bus to the image analysis module, the display module and the flight control module.
The image analysis module analyzes suspected incident points by artificial-intelligence image recognition and marks them on the image; the confirmed incident points are broadcast through the alarm module, and the processed images are passed to the display module for display.
The flight control module transmits the received image data to the data service center.
Further, the UAV module consists of a multi-rotor UAV, an onboard flight control module and a flight controller. The onboard flight control module receives the control instructions of the flight control module, and the multi-rotor UAV flies according to these instructions; a hand-held flight controller is provided for manual flight control.
Further, the video recording module comprises two optical cameras and one thermal imaging camera mounted on the multi-rotor UAV. The two optical cameras and the thermal imaging camera lie in the same plane; the two optical cameras are laid out symmetrically on either side of the flight direction of the multi-rotor aircraft, and the angle α between the lens axis of each optical camera and the flight direction is 60 degrees. The thermal imaging camera is placed between the two optical cameras, with its lens axis aligned with the flight direction. Ground image data are acquired by the two optical cameras and the thermal imaging camera and passed to the data receiving module through the wireless image transmission module.
Further, the data receiving module consists of a data receiver and a first video server. The data receiver receives the ground image data passed over by the wireless image transmission module and passes them by video cable to the first video server, which passes them on to the flight control module over the network.
Further, the flight control module consists of three notebooks, a maintenance-free battery and a wireless router; the maintenance-free battery powers the wireless router. Automatic flight control software is installed on notebook A and used for flight path planning, comparison of the actual trajectory with the preset trajectory, and display of flight control information. Notebook C is used to display the images passed back by the thermal imaging camera, and notebook B is used to display the fused optical video images. The first video server passes the ground image data over the network to the wireless router, which passes them on by WiFi to the corresponding notebooks and to the data service center; the onboard flight control module receives the control signals of the automatic flight control software through the wireless router and sends control instructions to the multi-rotor UAV, which flies accordingly.
Further, the data service center consists of a second video server, an image fitting server, an image analysis server, a fiber-optic switch, a storage array, a core switch, a control computer, a firewall and a leased line. The second video server, the image fitting server, the image analysis server, the firewall and the control computer are connected to the core switch by cable; the second video server, the image fitting server, the image analysis server and the storage array are connected to the fiber-optic switch by optical fiber; the leased line is connected to the firewall by cable; the alarm module is connected to the core switch through the leased line and the firewall, and the wireless router in the flight control module is connected to the core switch through the leased line and the firewall.
Further, the image processing module comprises a first image processing module and a second image processing module. The first image processing module is installed on notebook B: the optical video images acquired by the two optical cameras are transferred through the wireless image transmission module and the data receiver to the first video server, and the video signal output by the first video server is transmitted through the wireless router and the WiFi wireless network to the first image processing module on notebook B, which stitches the two video streams in real time and transmits the stitched image to the data service center through the wireless router and the WiFi wireless network. The second image processing module is installed on the image fitting server: the optical video images acquired by the two optical cameras are transferred through the wireless image transmission module and the data receiver to the first video server, the video signal output by the first video server is transmitted through the wireless router and WiFi to the second image processing module on the image fitting server, which performs time-shared fusion of the video images; the fused optical video images are transmitted to notebook B and displayed there.
Further, the image analysis module is installed on the image analysis server. The image analysis module receives the stitched images processed by the first image processing module and the fused images processed by the second image processing module; an artificial-intelligence image analysis algorithm based on a deep neural network is embedded in the image analysis module, which analyzes suspected incident points and marks them on the image; the confirmed incident points are broadcast through the alarm module, and the processed images are passed to the display module for display.
Further, the display module consists of a video distributor, a display, a video wall and a video wall controller. The video signal output by the control computer is transferred by video cable to the video distributor; one output of the video distributor is passed by video cable to the display for viewing, and the other output is passed by video cable to the video wall controller, which passes the video signal by video cable to the corresponding video wall screens for display.
Further, the alarm module consists of a cruise alarm module, cruise alarm software, a cruise alarm field control board, a smartphone, a microphone and a loudspeaker. The cruise alarm module is installed on the image analysis server, the cruise alarm field control board is installed on notebook C, and the cruise alarm software is installed on the smartphone; the cruise alarm module and the cruise alarm software are connected wirelessly; the microphone is connected by audio cable to the control computer of the data service center, and the loudspeaker is mounted on the multi-rotor UAV. The image analysis module passes alarm information to the cruise alarm module by message transfer, the cruise alarm module sends the alarm information to the cruise alarm software on the smartphone, and the cruise alarm software manages each inspection alarm function. The control computer uses the client of the cruise alarm module, namely the cruise alarm software, through the internal network, and makes voice announcements to the designated multi-rotor UAV through the microphone connected to the control computer: the voice signal of the microphone is transferred through the control computer and the core switch to the cruise alarm module, which converts it into a voice digital signal; the cruise alarm module then passes the voice digital signal through the core switch, the firewall, the leased line and the wireless network to the cruise alarm field control board on notebook C; the field control board passes the voice digital signal through the data receiver to the wireless image transmission module, whose audio port is connected by audio cable to the loudspeaker mounted on the multi-rotor UAV; the wireless image transmission module thus passes the voice digital signal through the audio cable to the loudspeaker.
An inspection method using the gas pipeline UAV inspection device of the present invention comprises the following steps:
Step 1: flight planning
Plan and set the flight path of the multi-rotor UAV, control the multi-rotor UAV through the onboard flight control module so that it flies along the planned flight path, and acquire ground images with the video recording module.
Step 2: image extraction
Collect sample images, crop and classify them, adjust the brightness and contrast of the images, and down-sample the images using bi-cubic interpolation.
Step 3: machine learning
Learn from the extracted images with a deep neural network, and update the picture feature file formed after learning into the image analysis module.
Step 4: image stitching
Automatically adjust the brightness of the video images acquired by the two optical cameras through the image processing module, perform feature point extraction and vocabulary tree determination by building a vocabulary tree, then complete the image stitching through image pre-matching and geometric deformation processing to obtain a single aerial image of the whole flight trajectory.
Step 5: target recognition
Perform color feature recognition on the stitched image, obtain the image texture feature set from the color histogram, compute the color-texture feature histogram of the hollow background region, judge target images by probability calculation, form the target image set, and pass the target image set to the alarm module through the image analysis module.
Step 6: feature map dynamic point analysis
The image analysis module analyzes the feature map of the stitched images and checks the state of the feature objects in images taken at different times; if the state of an object has changed, an object-movement alarm message is given.
Step 7: hot spot analysis
Calibrate the thermal imaging video image onto the visible-light image by manual calibration. Acquire the thermal image of the flight area with the thermal imaging camera and pass it through the wireless image transmission module to the data receiving module, which passes it to the image processing module. Merge the thermal image as a layer onto the whole aerial image synthesized in step 4. Extract, through the image analysis module, the coordinate regions whose temperature exceeds the set value and pass them to the alarm module.
Step 8: alarm
According to the information obtained in steps 5, 6 and 7, display flashing alarm information at the corresponding coordinate points of the designated pictures, and pass the alarm coordinate points to the flight control module.
Step 9: fixed-point cruise
After receiving the alarm coordinate points, the flight control module plans the flight path of the multi-rotor UAV with the automatic flight control software on notebook A according to the alarm coordinates, and at the same time sends a fixed-point cruise route setting instruction to the onboard flight control module, so that the UAV flies above the alarm coordinate points and performs a fixed-point cruise.
Further, the detailed process of step 1 is as follows:
Step S101: flight path confirmation
Plan the flight path of the multi-rotor UAV according to the distribution of the gas pipeline and the height and characteristics of the surrounding obstacles.
Step S102: flight path setting
On notebook A in the flight control module, plan the flight path with the automatic flight control software, which displays the flight path, GPS position, altitude, speed and battery level information.
Step S103: UAV flight
Control the multi-rotor UAV through the onboard flight control module so that it flies along the planned flight path.
Step S104: flight path adjustment
Adjust the flight path according to the clarity of the ground images of the surroundings of the gas pipeline passed back by the multi-rotor UAV.
Further, the detailed process of step 2 is as follows:
Step S201: sample image collection
Find images of excavators, bulldozers, backhoes and drilling platforms in the ground images by manual selection, with more than 1000 pictures for each kind of image to be recognized.
Step S202: target cropping and classification
Crop the manually selected pictures, separate the images and save them in the data service center.
Step S203: image preprocessing
Check the brightness and clarity of the images manually, and adjust the brightness and contrast of poorly shot images until the kind of object in the image can be recognized by the naked eye.
Step S204: image down-sampling
Using bi-cubic interpolation, filter out part of the pixels from the neighborhood of the adjusted image and keep the remaining pixels.
Further, the detailed process of step 3 is as follows:
Step S301: pre-training
Initialize the deep learning network parameters; the initialized parts include the number of nodes in each layer and the other parts of the whole deep neural network. After initialization, train the neuron nodes of each layer individually, with the output of the first layer as the input of the second layer, and so on, and retain the weights of each layer.
Step S302: fine-tuning
After training is completed, adjust the whole deep neural network with the gradient descent algorithm according to the label values of the samples, so as to optimize the performance of the deep neural network.
Step S303: picture verification
Read unclassified pictures with a program and calculate the recognition probability; if the recognition rate is higher than 85%, the verification is passed and step S304 is executed; otherwise increase the number of pictures for recognition, re-execute S202 and retrain the algorithm until the recognition rate is higher than 85%.
Step S304: update the picture feature file formed after learning into the image analysis module.
Further, the detailed process of step 4 is as follows:
Step S401: image transmission
The video signals acquired by the two optical cameras are passed by video cable to the wireless image transmission module, which passes them wirelessly to the data receiving module; the data receiving module passes the two video signals to the image processing module.
Step S402: brightness adjustment
The image processing module adjusts the image brightness automatically so that the brightness of the two video signals agrees within 1%.
Step S403: feature point extraction
The image processing module searches for the feature points of each image and builds a vocabulary tree: all feature descriptors in the images are layered and classified by mean clustering, finally forming a vocabulary tree of height L with K nodes per branch, and each node of the vocabulary tree is described by a model.
Step S404: vocabulary tree determination
Using the weight of each node in the vocabulary tree, compute all visual vocabulary vectors of the image set to be matched, and evaluate the similarity between one image and the other images with a similarity function.
Step S405: image pre-matching
Use the Euclidean distance between descriptors as the similarity measure of feature points in two images; for each feature point in the image to be matched, find its match in the feature point set of the reference image by the minimum-Euclidean-distance criterion using a BBF (Best Bin First) query.
Step S406: geometric deformation processing
Let the image coordinates be (u, v) and the ground coordinates of the corresponding image pixel be (x, y); the polynomial correction formulas between the two are:
u = a00 + a10·x + a01·y + a20·x² + a11·x·y + a02·y² + a30·x³ + a21·x²·y + a12·x·y² + a03·y³ + … (1)
v = b00 + b10·x + b01·y + b20·x² + b11·x·y + b02·y² + b30·x³ + b21·x²·y + b12·x·y² + b03·y³ + … (2)
where the polynomial order is n and aij, bij (i, j = 0 … n) are the polynomial coefficients.
If the number of polynomial terms is N, the relationship between N and the order n is:
N = (n + 1)(n + 2) / 2 (3)
From the image and ground coordinate values of N control points, the polynomial coefficients are solved by the least squares principle.
Step S407: image stitching
Match the images according to the correspondence of feature points, merge the images, synthesize a single aerial image of the whole flight trajectory, and pass it to the image analysis module.
Further, the detailed process of step 5 is as follows:
Step S501: color feature recognition
Divide the R, G and B color channels of RGB space into several small intervals; in the initialization phase, count the probability that each pixel in the target region falls into each color interval to obtain the color histogram, and use the color histogram as the target feature template.
Step S502: image texture features
Choose the image within an adjacent 5×5 pixel window from the color histogram region, take the gray value of the central point as the threshold and compare it with the neighboring pixels around it; points greater than or equal to the threshold are labeled 1 and points less than the threshold are labeled 0, forming a group of binary codes that is defined as the image texture feature value.
The feature set of the image texture is defined as Si; to simplify the calculation, it is identified by the top-left and bottom-right corner coordinates, i.e.:
Si = {(Xi0, Yi0), (Xi1, Yi1)} (4)
where Xi0, Yi0 are the top-left corner coordinates of the target, Xi1, Yi1 are the bottom-right corner coordinates of the target, and Si is the image texture feature set of the determined target.
Step S503: background weighting
Compute the feature template of the background region; the background region is a hollow structure close to the target region, forming the histogram Mu.
Step S504: histogram
Compute the color-texture feature histogram of the hollow background region.
Step S505: probability calculation
Use the histogram Mu as the denominator of the background weighting coefficient: the larger Mu is, the smaller the weight, and the weight is multiplied with the corresponding feature components of the target feature template.
Step S506: target discrimination
The higher the probability value in the target feature template, the higher the possibility that the candidate is the object.
Step S507: target transmission
Extract the target image set S, S = {s1, s2, … sn1}, where n1 is the number of image texture feature groups of the target object set si, in groups; S is the target image set, and the image analysis module passes the target image set S to the alarm module.
Further, the detailed process of step 6 is as follows:
Step S601: target set determination
The different times ti generate a time-series coordinate set ST:
ST = {s1, s2, …, sn1}{t1, t2, …, tm} (5)
where m is the number of time series.
Step S602: Euclidean distance calculation
For any two time points, the Euclidean distance between the time series S and S′ is defined by equation (6),
where Si is the coordinate set of the image texture features in the target set at time ti, Si′ is the coordinate set of the image texture features in the target set at time tj, and d(S, S′) is the distance of the target set between the two moments, in pixels.
If the following condition is met:
d(S, S′) ≥ ε (7)
the target is considered a dynamic point, where ε is the preset point-drift threshold, and its coordinate range is output to the dynamic point analysis file.
Step S603: dynamic point alarm transmission
The image analysis module passes the dynamic point analysis file to the alarm module.
Further, the detailed process of step 7 is as follows:
Step S701: image calibration
Determine the coordinate origin (a, b), then determine the tilt angle θ of the thermal imaging video image, and then determine the scale; the coordinates s′ = (x′, y′) of an image pixel are mapped to the coordinates s = (x, y) in the new coordinate system,
where a, b are the coordinate translation values, in pixels; θ is the image tilt angle, in radians; x, y are the coordinate values after transformation, in pixels; x′, y′ are the coordinate values before transformation, in pixels; and Kx, Ky are the scale ratios.
Step S702: thermal image transmission
Acquire the thermal image of the flight area with the thermal imaging camera and pass it through the wireless image transmission module to the data receiving module, which passes it to the image processing module.
Step S703: coordinate transformation
Using the coordinate transformation parameters (a, b), merge the thermal image as a layer onto the whole aerial image synthesized in step 4.
Step S704: hot spot alarm
Extract, through the image analysis module, the coordinate regions whose temperature exceeds the set value and pass them to the alarm module.
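The exact transformation formula is not reproduced in this text; the following is a minimal sketch of the calibration mapping of step S701, assuming a standard rotation-scale-translation form built from the tilt angle θ, the per-axis scale ratios Kx, Ky and the translation (a, b). The function and variable names are illustrative.

```python
import numpy as np

def thermal_to_visible(pts, theta, kx, ky, a, b):
    """Map thermal-image pixel coordinates s' = (x', y') into the visible /
    stitched image frame s = (x, y) using rotation theta (radians), per-axis
    scale ratios kx, ky and translation (a, b). `pts` is an (N, 2) array."""
    pts = np.asarray(pts, dtype=float)
    c, s = np.cos(theta), np.sin(theta)
    x = kx * (c * pts[:, 0] - s * pts[:, 1]) + a
    y = ky * (s * pts[:, 0] + c * pts[:, 1]) + b
    return np.stack([x, y], axis=1)

# Example: map the corners of a hot-spot region found in the thermal image.
hot_region = [(120, 80), (140, 95)]
print(thermal_to_visible(hot_region, theta=0.03, kx=4.2, ky=4.2, a=512, b=300))
```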
The beneficial effects of the present invention are as follows: the present invention adopts UAV inspection and uses thermal imaging and color imaging fusion, automatic image stitching, hot spot recognition and excavation recognition to realize automatic inspection, automatic image checking and automatic alarm when a problem is found. The present invention can automatically and quickly find targets such as construction machinery and personnel near the gas pipeline network and give early warning, reducing the occurrence of hidden dangers; the inspected area can exceed one square kilometer per minute, which solves the problem that manual inspection is slow.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the gas pipeline UAV inspection device of the present invention.
Fig. 2 is an installation schematic diagram of the UAV module and the video recording module in the gas pipeline UAV inspection device of the present invention.
Fig. 3 is a connection schematic diagram of the UAV module, the video recording module and the flight control module in the gas pipeline UAV inspection device of the present invention.
Fig. 4 is a structural schematic diagram of the flight control module in the gas pipeline UAV inspection device of the present invention.
Fig. 5 is a structural schematic diagram of the data service center and the display module in the gas pipeline UAV inspection device of the present invention.
Fig. 6 is a structural schematic diagram of the alarm module in the gas pipeline UAV inspection device of the present invention.
Fig. 7 is a flow chart of the gas pipeline UAV inspection method of the present invention.
Specific embodiment
The present invention is further described in detail below with reference to the accompanying drawings and embodiments.
As shown in Fig. 1, a gas pipeline UAV inspection device of the present invention mainly comprises: a UAV module, a video recording module, a wireless image transmission module, a data receiving module, a flight control module, an image processing module, an image analysis module, an alarm module, a display module and a data service center.
The UAV module carries the video recording module and the wireless image transmission module and performs the flight of the UAV, the acquisition of ground images and the image transmission. The video recording module passes several channels of recorded image signals through the wireless image transmission module to the data receiving module, which passes them over the data bus to the image processing module and the flight control module. The image processing module stitches the several video images into one large ground image and fuses a hot spot image layer with the thermal imaging video data, and passes the processed images (the stitched image and the fused optical video image) over the data bus to the image analysis module, the display module and the flight control module. The image analysis module analyzes suspected incident points by artificial-intelligence image recognition and marks them on the image; the confirmed incident points are broadcast through the alarm module, and the processed images are passed to the display module for display. The flight control module transmits the received image data to the data service center and sends control instructions to the UAV module; the multi-rotor UAV in the UAV module flies according to the control instructions.
As shown in Fig. 3, the UAV module consists of a multi-rotor UAV, an onboard flight control module and a flight controller. The flight controller is connected wirelessly to the onboard flight control module; the onboard flight control module is connected to the multi-rotor UAV by control lines and connected wirelessly to the flight control module. The onboard flight control module mainly receives the flight instructions sent by the flight control module to the multi-rotor UAV and controls the flight state of the multi-rotor UAV accordingly. Manual flight control is provided by the hand-held flight controller.
As shown in Figs. 2 and 3, the video recording module is mounted on the multi-rotor UAV of the UAV module and comprises two optical cameras 1 and one thermal imaging camera 2. The two optical cameras 1 and the thermal imaging camera 2 lie in the same plane; the two optical cameras 1 are laid out symmetrically on either side of the flight direction of the multi-rotor aircraft, and the angle α between the lens axis of each optical camera 1 and the flight direction is 60 degrees. The thermal imaging camera 2 is placed between the two optical cameras 1, with its lens axis aligned with the flight direction. Ground image data (image signals) are acquired by the two optical cameras 1 and the thermal imaging camera 2 and passed to the data receiving module through the wireless image transmission module.
The wireless image transmission module is mounted on the multi-rotor UAV of the UAV module; it is connected by video cable to the three cameras of the video recording module (the two optical cameras 1 and the thermal imaging camera 2). The wireless image transmission module passes the ground image data acquired by the three cameras to the data receiving module on the ground by wireless data transmission.
The data receiving module consists of a data receiver and a first video server. The data receiver is connected wirelessly to the wireless image transmission module, the first video server is connected to the data receiver by video cable, and the first video server is connected to the flight control module over the network. The data receiver receives the ground image data passed over by the wireless image transmission module and passes them by video cable to the first video server, which passes them on to the flight control module over the network.
As shown in Fig. 4, the flight control module is mainly used to control the flight path of the multi-rotor UAV, receive the image information transmitted by the multi-rotor UAV, and send voice to the multi-rotor UAV. The flight control module consists of three notebooks (Lenovo Xiaoxin 5000), a maintenance-free battery (Watson, 100 AH 12 V × 2) and a wireless router (H3C 4G + WiFi). The onboard flight control module is connected to the wireless router, the three notebooks are connected to the wireless router by WiFi, the wireless router is connected by cable to the first video server in the data receiving module and connected to the data service center over the wireless network, and the maintenance-free battery is connected to the wireless router. The maintenance-free battery powers the wireless router on the one hand, and on the other hand has an inverter output that can charge the notebooks and the multi-rotor UAV. Automatic flight control software is installed on notebook A; it performs flight path planning and comparison of the actual trajectory with the preset trajectory, and displays flight control information, including the flight path, GPS position, altitude, speed and battery level. Notebook C displays the images passed back by the thermal imaging camera, and notebook B displays the optical video images of the two optical cameras fused by the second image processing module. The first video server in the data receiving module passes the ground image data over the network to the wireless router in the flight control module, which passes them on by WiFi to the corresponding notebooks and to the data service center; the onboard flight control module receives the control signals transmitted by the automatic flight control software on notebook A through the wireless router and sends control instructions to the multi-rotor UAV, which flies accordingly.
As shown in Fig. 5, the data service center consists of a second video server (Hikvision DS-VR2208C-BBC), an image fitting server (Lenovo X3750), an image analysis server (Lenovo X3750), a fiber-optic switch, a storage array (Hikvision DS-AT1000S), a core switch (H3C S7506), a control computer, a firewall and a leased line. The second video server, the image fitting server, the image analysis server, the firewall and the control computer are connected to the core switch by cable. The storage array is a storage device used to store the video images, which occupy much space. The control computer is used to run software such as the cruise alarm module and the image processing module. The second video server, the image fitting server, the image analysis server and the storage array are connected to the fiber-optic switch by optical fiber. The leased line is connected to the firewall by cable. The alarm module is connected to the core switch through the leased line and the firewall, and the wireless router in the flight control module is connected to the core switch through the leased line and the firewall.
The image processing module is a set of software comprising a first image processing module and a second image processing module. The first image processing module is installed on notebook B of the flight control module: the optical video images acquired by the two optical cameras are transferred through the wireless image transmission module and the data receiver to the first video server, and the video signal output by the first video server is transmitted through the wireless router and the WiFi wireless network to the first image processing module on notebook B, which stitches the two video streams in real time and transmits the stitched image to the data service center through the wireless router and the WiFi wireless network. The second image processing module is installed on the image fitting server of the data service center: the optical video images acquired by the two optical cameras are transferred through the wireless image transmission module and the data receiver to the first video server, the video signal output by the first video server is transmitted through the wireless router and WiFi to the second image processing module on the image fitting server, which performs time-shared fusion of the video images; the fused optical video images are transmitted to notebook B and displayed there.
The image analysis module is installed on the image analysis server of the data service center. The image analysis module receives the stitched images processed by the first image processing module and the fused images processed by the second image processing module. An artificial-intelligence image analysis algorithm based on a deep neural network is embedded in the image analysis module; it can recognize images of excavators, bulldozers, backhoes, drilling platforms and the like, marks the designated positions in the image with a red box after a target is found, and at the same time passes the confirmed incident points to the alarm module by message transfer (through the link of the core switch, the firewall and the leased line); the alarm module gives an alarm for the corresponding target, and the recognition of additional kinds of equipment can be added and the recognition accuracy improved by algorithm upgrades. The image analysis module analyzes suspected incident points by artificial-intelligence image recognition and marks them on the image; the alarm module broadcasts them, and the processed images are passed to the display module for display.
As shown in Fig. 5, the display module consists of a video distributor, a display (TPV E2270), a 4×3 video wall (Hikvision DS-D2055NL-B/G) and a video wall controller (Hikvision DS-C10S-S11/E). The video distributor is connected to the control computer, the video distributor is connected to the display and to the video wall controller, and the video wall controller is connected to the video wall. The video signal output by the control computer is transferred by video cable to the video distributor; one output of the video distributor is passed by video cable to the display for viewing, and the other output is passed by video cable to the video wall controller, which passes the video signal by video cable to the corresponding video wall screens for display.
As shown in Fig. 6, the alarm module consists of a cruise alarm module, cruise alarm software, a cruise alarm field control board, a smartphone, a microphone and a loudspeaker, and realizes remote alarm and voice warning. The cruise alarm module is installed on the image analysis server of the data service center; the cruise alarm field control board is installed on notebook C in the flight control module; the cruise alarm software is installed on the smartphone; the cruise alarm module and the cruise alarm software are connected wirelessly; the microphone is connected by audio cable to the control computer of the data service center; and the loudspeaker is mounted on the multi-rotor UAV of the UAV module. The image analysis module on the image analysis server passes alarm information to the cruise alarm module by message transfer, the cruise alarm module sends the alarm information to the cruise alarm software on the smartphone, and the cruise alarm software manages each inspection alarm function. The control computer can use the client of the cruise alarm module, namely the cruise alarm software installed on the smartphone, through the internal network (leased line, firewall, core switch), and make voice announcements to the designated multi-rotor UAV through the microphone connected to the control computer. The transmission path is as follows: the voice signal of the microphone is transferred through the control computer and the core switch to the cruise alarm module on the image analysis server and converted by the cruise alarm module into a voice digital signal; the cruise alarm module then passes the voice digital signal through the core switch, the firewall, the leased line and the wireless network to the cruise alarm field control board on notebook C in the flight control module; the field control board passes the voice digital signal through the data receiver of the data receiving module to the wireless image transmission module, whose audio port is connected by audio cable to the loudspeaker mounted on the multi-rotor UAV; the wireless image transmission module passes the voice digital signal through the audio cable to the loudspeaker.
In the present embodiment, the multi-rotor UAV is the free butterfly-type quadrotor UAV produced by the Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, together with the matching onboard flight control module and flight controller. The loaded flight time of the multi-rotor UAV is more than 40 minutes, its flight speed is more than 12 m/s, and its wind resistance is above level 6.
In the present embodiment, the optical cameras 1 are Phase One iXU150 aerial cameras from Denmark, the thermal imaging camera 2 is a KELUSI "Flying Fish" D101 HD thermal imager, and the camera resolution is above 50 megapixels.
In the present embodiment, the wireless image transmission module is a Sa Jia/SAGA-RCXY, with a transmission bandwidth above 10 Mbit/s, a transmission distance above 500 m, 4-channel forward video transmission and 1-channel reverse audio transmission.
In the present embodiment, the data receiver in the data receiving module is a Sa Jia/SAGA-RCXY, and the video server is a Hikvision 4-channel NVR.
As shown in Fig. 7, the specific steps of a gas pipeline UAV inspection method of the present invention are as follows:
Step 1: flight planning
Step S101: flight path confirmation
Plan the flight path of the multi-rotor UAV according to the distribution of the gas pipeline and the height and characteristics of the surrounding obstacles, and apply to the aviation management department for an air route.
Step S102: flight path setting
On notebook A in the flight control module, plan the flight path with the automatic flight control software. Automatic flight control software is installed on notebook A; it performs flight path planning and comparison of the actual trajectory with the preset trajectory, and displays flight control information, including the flight path, GPS position, altitude, speed and battery level.
Step S103: UAV flight
Control the multi-rotor UAV through the onboard flight control module so that it flies along the planned flight path.
Step S104: flight path adjustment
Adjust the flight path according to the clarity of the ground images of the surroundings of the gas pipeline passed back by the multi-rotor UAV.
Step 2: image extraction
Step S201: sample image collection
Find images of excavators, bulldozers, backhoes, drilling platforms and the like in the ground images by manual selection, with more than 1000 pictures for each kind of image to be recognized.
Step S202: target cropping and classification
Crop the manually selected images, separate them and save them in the specified path of the data service center.
Step S203: image preprocessing
Check the brightness and clarity of the images manually, and adjust the brightness and contrast of poorly shot images until the kind of object in the image can be recognized by the naked eye.
Step S204: image down-sampling
Using bi-cubic interpolation, filter out part of the pixels from the neighborhood of the adjusted image and keep the remaining pixels.
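A minimal sketch of the down-sampling in step S204, assuming OpenCV's bi-cubic interpolation; the 0.5 scale factor and the sample file path are illustrative choices, not values given in the patent.

```python
import cv2

def downsample_bicubic(image, factor=0.5):
    """Down-sample an image with bi-cubic interpolation, discarding part of
    the pixels while smoothly resampling the remainder."""
    h, w = image.shape[:2]
    return cv2.resize(image, (int(w * factor), int(h * factor)),
                      interpolation=cv2.INTER_CUBIC)

sample = cv2.imread("excavator_sample.jpg")   # hypothetical sample picture
small = downsample_bicubic(sample, factor=0.5)
cv2.imwrite("excavator_sample_small.jpg", small)
```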
Step 3: machine learning
Step S301: pre-training
First initialize the deep learning network parameters; the initialized parts include the number of nodes in each layer and the other parts of the whole deep neural network, such as the block size, whether sparsity is added, and whether noise is added. After initialization, the neuron nodes of each layer can be trained individually, with the output of the first layer serving as the input of the second layer, and so on; the weights of each layer are retained.
Step S302: fine-tuning
After training is completed, in order to make the performance of the deep neural network better, the whole deep neural network is adjusted with the gradient descent algorithm according to the label values of the samples.
Step S303: picture verification
Read unclassified pictures with a program and calculate the recognition probability; if the recognition rate is higher than 85%, the verification is passed and step S304 is executed; otherwise increase the number of pictures for recognition, re-execute S202 and retrain the algorithm until the recognition rate is higher than 85%.
Step S304: update the picture feature file formed after learning into the image analysis module.
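A minimal sketch of the acceptance check in step S303, assuming a trained classifier exposed as a `classify(image) -> label` function and a labelled hold-out set; both interfaces are assumptions for illustration, not part of the patent.

```python
def verify_model(classify, samples, threshold=0.85):
    """Run the trained classifier over hold-out pictures and check whether
    the recognition rate reaches the 85% acceptance threshold of step S303.
    `samples` is a list of (image, true_label) pairs."""
    correct = sum(1 for image, label in samples if classify(image) == label)
    rate = correct / len(samples)
    if rate >= threshold:
        return True   # step S304: export the feature file to the image analysis module
    return False      # otherwise collect more pictures and retrain (back to S202)
```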
Step 4: image stitching
Step S401: image transmission
The video signals acquired by the two optical cameras are passed by video cable to the wireless image transmission module, which passes them wirelessly to the data receiving module; the data receiving module passes the two video signals to the image processing module, completing the image transmission.
Step S402: brightness adjustment
The image processing module adjusts the image brightness automatically so that the brightness of the two video signals agrees within 1%.
Step S403: feature point extraction
The image processing module searches for the feature points of each image and builds a vocabulary tree: all feature descriptors in the images are layered and classified by mean clustering, finally forming a vocabulary tree of height L with K nodes per branch, and each node of the vocabulary tree is described by a model.
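A minimal sketch of the vocabulary tree construction in step S403, built by recursive k-means over the feature descriptors (branch factor K, height L). Using scikit-learn's KMeans is an assumed tool choice, and the node weighting used later in step S404 is not included.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_vocab_tree(descriptors, k=10, depth=3):
    """Recursively cluster an (M, D) array of feature descriptors into a
    vocabulary tree with branch factor k and height `depth` (the patent's
    K and L); each node stores the mean descriptor of its subset."""
    node = {"center": descriptors.mean(axis=0), "children": []}
    if depth == 0 or len(descriptors) < k:
        return node
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(descriptors)
    for i in range(k):
        subset = descriptors[labels == i]
        if len(subset):
            node["children"].append(build_vocab_tree(subset, k, depth - 1))
    return node
```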
Step S404: vocabulary tree determination
Using the weight of each node in the vocabulary tree, compute all visual vocabulary vectors of the image set to be matched, and evaluate the similarity between one image and the other images with a similarity function.
Step S405: image pre-matching
After the potentially matching images have been determined, what is of interest is the feature point matching between images. The Euclidean distance between descriptors is used as the similarity measure of feature points in two images; each feature point in the image to be matched is matched in the feature point set of the reference image by the minimum-Euclidean-distance criterion using a BBF (Best Bin First) query.
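A minimal sketch of the pre-matching in step S405, using a KD-tree nearest-neighbour search (scipy's cKDTree) as a stand-in for the BBF query named in the patent; the minimum Euclidean distance decides the match, and `max_dist` is an illustrative rejection threshold not taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def prematch(desc_query, desc_ref, max_dist=0.6):
    """For every feature descriptor of the image to be matched, find the
    reference-image descriptor with minimum Euclidean distance and return
    the accepted (query_index, reference_index) pairs."""
    tree = cKDTree(desc_ref)
    dists, idx = tree.query(desc_query, k=1)
    return [(q, r) for q, (d, r) in enumerate(zip(dists, idx)) if d < max_dist]
```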
Step S406: geometric deformation processing
The coordinates (x, y) of an image pixel are first mapped to coordinates (x′, y′) in a new coordinate system, and its pixels are then resampled; the mapping function is obtained by computing the established feature correspondences between the image to be registered and the reference image.
Let the image coordinates be (u, v) and the ground coordinates of the corresponding image pixel be (x, y); the polynomial correction formulas between the two can be expressed as:
u = a00 + a10·x + a01·y + a20·x² + a11·x·y + a02·y² + a30·x³ + a21·x²·y + a12·x·y² + a03·y³ + … (1)
v = b00 + b10·x + b01·y + b20·x² + b11·x·y + b02·y² + b30·x³ + b21·x²·y + b12·x·y² + b03·y³ + … (2)
where the polynomial order is n and aij, bij (i, j = 0 … n) are the polynomial coefficients. By choosing N control points, the polynomial coefficients are obtained by least squares.
The number of polynomial terms (i.e. the number of polynomial coefficients) is denoted N; N and the order n satisfy the fixed relationship:
N = (n + 1)(n + 2) / 2 (3)
From the image and ground coordinate values of the N control points, the polynomial coefficients are solved by the least squares principle.
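A minimal numpy sketch of the coefficient solution in step S406: build the monomial design matrix for the N control points and solve equations (1) and (2) by least squares. The use of numpy and the helper names are illustrative assumptions.

```python
import numpy as np

def monomials(x, y, order):
    """All monomial terms x**i * y**j with i + j <= order; for order n there
    are N = (n + 1)(n + 2) / 2 of them, matching equation (3)."""
    return np.array([x**i * y**j
                     for i in range(order + 1)
                     for j in range(order + 1 - i)])

def fit_polynomial(ground_xy, image_uv, order=3):
    """Solve the coefficients aij, bij of equations (1) and (2) from control
    points: ground coordinates (x, y) -> image coordinates (u, v)."""
    A = np.array([monomials(x, y, order) for x, y in ground_xy])
    uv = np.asarray(image_uv, dtype=float)
    a_coef, *_ = np.linalg.lstsq(A, uv[:, 0], rcond=None)
    b_coef, *_ = np.linalg.lstsq(A, uv[:, 1], rcond=None)
    return a_coef, b_coef
```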
Step S407: image stitching
Match the images according to the correspondence of feature points, merge the images, synthesize a single aerial image of the whole flight trajectory, and pass it to the image analysis module.
Step 5: target recognition
Step S501: color feature recognition
The image analysis module analyzes the synthesized image and extracts the color histogram as the target feature template. The R, G and B color channels of RGB space are divided into several small intervals; in the initialization phase, the probability that each pixel in a target region such as an excavator, bulldozer, backhoe or drilling platform falls into each color interval is counted to obtain the color histogram.
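A minimal sketch of the histogram construction in step S501; the patent only says the channels are divided into "several small intervals", so the choice of 8 bins per channel here is illustrative.

```python
import numpy as np

def color_histogram(region_rgb, bins=8):
    """Normalised R/G/B colour histogram of a target region (an H x W x 3
    uint8 array); each channel is split into `bins` intervals and the counts
    are converted to per-interval probabilities."""
    hist, _ = np.histogramdd(region_rgb.reshape(-1, 3).astype(float),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist / hist.sum()
```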
Step S502: image texture characteristic (LBP)
From the image chosen in adjacent 5*5 pixel in color histogram, set the gray value of central point as threshold value, by its with
The adjacent threshold pixel of surrounding is compared, and the label more than or equal to threshold point is that the point less than threshold value is labeled as 0, thus shape
At one group of binary code, LBP texture eigenvalue is defined as with this.
The image texture (LBP) feature set is defined as Si; to simplify the calculation it is identified by its upper-left and lower-right corner coordinates, that is:
Si = {(Xi0, Yi0), (Xi1, Yi1)}  (4)
where Xi0, Yi0 are the upper-left corner coordinates of targets such as excavators and bulldozers; Xi1, Yi1 are the lower-right corner coordinates of such targets; and Si is the determined image texture (LBP) feature set within targets such as excavators and bulldozers.
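The LBP-style code described above, thresholding a small neighbourhood against its center gray value, might be sketched as follows; the 5×5 window comes from the description, while the bit ordering and the random test patch are assumptions.

```python
import numpy as np

def lbp_codes(gray, radius=2):
    """For each pixel, compare the surrounding (2*radius+1)^2 - 1 neighbours
    with the center gray value (>= center -> 1, < center -> 0) and pack the
    bits into one integer code, sketching the LBP texture feature value."""
    h, w = gray.shape
    codes = np.zeros((h, w), dtype=np.uint64)
    offsets = [(dy, dx) for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1) if (dy, dx) != (0, 0)]
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            center = gray[y, x]
            bits = 0
            for dy, dx in offsets:
                bits = (bits << 1) | int(gray[y + dy, x + dx] >= center)
            codes[y, x] = bits
    return codes

# Example on a random grayscale patch
gray_patch = np.random.randint(0, 256, size=(32, 32), dtype=np.uint8)
texture = lbp_codes(gray_patch, radius=2)
```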
Step S503: background weighting
First the feature template of the background area needs to be calculated; the background area is a hollow structure adjoining the target area, and forms the histogram Mu.
Step S504: histogram
The color-texture feature histogram within the hollow background region is calculated with the same method used for the target feature histograms of excavators, bulldozers and similar targets.
Step S505: probability calculation
When a feature component of the target feature template of an excavator, bulldozer or similar target has a relatively high probability in the hollow background region, i.e. when the histogram Mu is larger, the histogram Mu is used as the denominator of the background weighting coefficient, so that the coefficient takes a very small value; this weight is multiplied with the corresponding feature component in the target feature template.
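The background weighting of steps S503 to S505 can be illustrated as follows: the background histogram Mu is built from the hollow region around the target, and each target-template component is down-weighted by a coefficient with Mu in the denominator, so that components common in the background receive small weights. Using the minimum nonzero value of Mu as the numerator and the example histograms are assumptions for the sketch.

```python
import numpy as np

def background_weighted_template(target_hist, background_hist):
    """Down-weight target-template components that are frequent in the
    background: weight = min(nonzero Mu) / Mu, clipped to at most 1."""
    mu = background_hist
    nonzero = mu[mu > 0]
    floor = nonzero.min() if nonzero.size else 1.0
    weights = np.minimum(floor / np.maximum(mu, 1e-12), 1.0)  # large Mu -> small weight
    weighted = target_hist * weights
    return weighted / (weighted.sum() + 1e-12)

# target_hist / background_hist: normalized color-texture histograms (assumed given)
target_hist = np.random.dirichlet(np.ones(64))
background_hist = np.random.dirichlet(np.ones(64))
template = background_weighted_template(target_hist, background_hist)
```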
Step S506: target discrimination
The higher the probability value in the target feature template of an excavator, bulldozer or similar target, the more likely the region is that object.
Step S507: object transmission
The image analysis module extracts the image texture (LBP) feature sets of the candidate target image set S of excavators, bulldozers and the like, S = {s1, s2, …, sn1}, where n1 is the number of image texture (LBP) feature groups of the target object set si, in units of groups, and S is the target image set of excavators, bulldozers and the like. The image analysis module passes the target image set S to the alarm module.
Step 6: feature-spectrum dynamic point analysis
The image analysis module analyzes the feature spectrum of the stitched image and checks the state of the feature objects in the images from different times; if the state of an object is found to have changed, an object-movement warning message is issued.
Step S601: determination of target sets such as excavators and bulldozers
From the different times ti a time-series coordinate set ST is generated, which may be expressed as:
ST = {s1, s2, …, sn1}{t1, t2, …, tm}  (5)
where m is the number of time series.
Step S602: Euclidean distance calculation
For any two time points, the Euclidean distance between the time series S and S' is defined as:
where Si is the coordinate set of the image texture (LBP) features of the target set (excavators, bulldozers, etc.) at time ti, Si' is the coordinate set of the image texture (LBP) features of the target set at time tj, and d(S, S') is the distance, in pixels, of the target set between the two moments.
If the following condition is satisfied:
d(S, S') ≥ ε  (7)
then the excavator, bulldozer or similar target is considered a dynamic point, where ε is the preset point-drift threshold; its coordinate range is then output to a dynamic point analysis file.
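The movement check of step 6 amounts to comparing the Euclidean distance between a target's coordinate sets at two times against the threshold ε. Since the distance formula (6) is not reproduced in the text, the sketch below simply averages the point-wise Euclidean distances, which is an assumption, as are the example coordinates and the default ε.

```python
import numpy as np

def is_dynamic_point(coords_t1, coords_t2, epsilon=5.0):
    """Flag a target as a dynamic point when the mean Euclidean distance
    between its coordinates at two times is at least epsilon (in pixels).
    Assumes both coordinate sets contain the same points in the same order."""
    d = np.linalg.norm(coords_t1 - coords_t2, axis=1).mean()
    return d >= epsilon, d

# Example: bounding-box corners of a target at times ti and tj
s_ti = np.array([[120.0, 80.0], [180.0, 140.0]])   # upper-left, lower-right corners
s_tj = np.array([[131.0, 86.0], [191.0, 147.0]])
dynamic, distance = is_dynamic_point(s_ti, s_tj, epsilon=5.0)
```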
Step S603: dynamic point alarm transmission
Dynamic point analysis file is passed to alarm module by image analysis module.
Step 7: hot spot analysis
Step S701: image calibration
By means of manual calibration, the thermal imaging video image is calibrated onto the visible-light image. First the coordinate origin (a, b) is determined, then the tilt angle θ of the thermal imaging video image, and then the scale ratios (Kx, Ky). The coordinates s' = (x', y') of an image pixel are mapped to the coordinates s = (x, y) in a new coordinate system.
In the formula, a, b are the coordinate translation values, in pixels; θ is the image tilt angle, in radians; x, y are the coordinate values after transformation, in pixels; x', y' are the coordinate values before transformation, in pixels; and Kx, Ky are the scale ratios.
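With the calibration parameters (a, b), θ and (Kx, Ky), the mapping from thermal-image pixel coordinates (x', y') to coordinates (x, y) on the visible image is a scaled rotation plus a translation. The composition order below (scale, then rotate, then translate) and the example parameter values are assumptions, since the original formula is not reproduced in the text.

```python
import numpy as np

def thermal_to_optical(xy_prime, a, b, theta, kx, ky):
    """Map thermal-image pixel coordinates (x', y') onto the visible image:
    scale by (Kx, Ky), rotate by theta (radians), translate by (a, b)."""
    x_p, y_p = xy_prime[..., 0], xy_prime[..., 1]
    xs, ys = kx * x_p, ky * y_p                      # apply scale ratios
    x = xs * np.cos(theta) - ys * np.sin(theta) + a  # rotate, then translate
    y = xs * np.sin(theta) + ys * np.cos(theta) + b
    return np.stack([x, y], axis=-1)

# Example: map two thermal pixels with assumed calibration parameters
pts = np.array([[10.0, 20.0], [200.0, 150.0]])
mapped = thermal_to_optical(pts, a=35.0, b=12.0, theta=0.02, kx=1.8, ky=1.8)
```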
Step S702: thermal image transmission
The thermal image of the flight area is acquired by the thermal imaging camera and passed through the wireless image transmission module to the data reception module, which passes the video image to the image processing module.
Step S703: coordinate transform
Using the coordinate conversion parameters (a, b), the thermal image is merged, as an image layer, onto the whole aerial photograph synthesized in step 4.
Step S704: hot spot alarm
The coordinate regions in which the temperature is higher than the set value are extracted by the image analysis module and passed to the alarm module.
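Extracting the coordinate regions whose temperature exceeds the set value can be sketched as thresholding the calibrated thermal array and reporting the bounding coordinates of the hot pixels. The single bounding box, threshold value and synthetic temperature field below are simplifying assumptions; the actual region analysis performed by the image analysis module may differ.

```python
import numpy as np

def hot_regions(temperature, threshold=60.0):
    """Return the bounding coordinates (x0, y0, x1, y1) of all pixels whose
    temperature exceeds the threshold, or None if there are none."""
    ys, xs = np.nonzero(temperature > threshold)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Example: synthetic temperature field with one warm patch
temp = np.full((240, 320), 25.0)
temp[100:120, 200:230] = 75.0                  # assumed hot spot
alarm_box = hot_regions(temp, threshold=60.0)  # region passed on to the alarm module
```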
Step 8: alarm
From the target coordinate regions obtained in step 5, the moving coordinate regions of step 6 and the hot spot coordinate regions of step 7, the alarm module displays flashing warning information at the corresponding coordinate points of the designated picture and passes the alarm coordinate points to the flight control module.
Step 9: fixed-point cruise
After receiving the alarm coordinate points, the flight control module plans the flight path of the multi-rotor unmanned aerial vehicle according to the alarm coordinate points using the automatic flight control software on notebook A, and at the same time sends a fixed-point cruise route setting instruction to the onboard flight control module, so that the vehicle flies above the alarm coordinate points and performs fixed-point cruising.
The inspection personnel examine in detail the aerial images of the alarm coordinate points. Through the microphone connected to the control computer of the data service center, the voice signal of the microphone is transferred by the control computer and the core switch to the cruise alarm module in the image analysis server and is converted by the cruise alarm module into a digital voice signal; the cruise alarm module then passes the digital voice signal through the core switch, firewall, dedicated line and wireless network to the cruise alarm module field control board in the flight control module on notebook C; the field control board passes the digital voice signal through the data receiver in the data reception module to the wireless image transmission module, whose reverse voice port passes the voice information to the speaker carried on the multi-rotor unmanned aerial vehicle, which plays back the voice warning and thereby achieves the warning effect.
Although the embodiments of the present invention have been disclosed above, they are not limited to the applications listed in the description and the embodiments; they can be applied to various fields suitable for the present invention, and further modifications can easily be realized by those skilled in the art. Therefore, without departing from the general concept defined by the claims and their equivalent scope, the present invention is not limited to the specific details and illustrations shown and described herein.
Claims (10)
1. A gas pipeline unmanned plane inspection device, characterized by comprising: an unmanned plane module, a picture recording module, a wireless image transmission module, a data reception module, a flight control module, an image processing module, an image analysis module, an alarm module, a display module and a data service center;
the flight control module sends control instructions to the unmanned plane module, and the multi-rotor unmanned aerial vehicle in the unmanned plane module flies according to the control instructions;
the unmanned plane module carries the picture recording module and the wireless image transmission module, and performs the flight of the unmanned plane, ground image acquisition and image transmission;
the picture recording module passes the acquired digital image signals to the data reception module through the wireless image transmission module;
the data reception module passes the received digital image signals through a data bus to the image processing module and the flight control module;
the image processing module stitches and fuses the received images, and passes the stitched images and the fused images through the data bus to the image analysis module, the display module and the flight control module;
the image analysis module analyzes suspicious accident points by artificial intelligence image recognition technology and marks them on the image, the confirmed accident points are broadcast by the alarm module, and the processed images are passed to the display module for display;
the flight control module transfers the received image data to the data service center.
2. The gas pipeline unmanned plane inspection device according to claim 1, characterized in that the unmanned plane module consists of a multi-rotor unmanned aerial vehicle, an onboard flight control module and a flight controller; the onboard flight control module receives the control instructions of the flight control module, the multi-rotor unmanned aerial vehicle flies according to the control instructions, and the hand-held flight controller provides manual flight control;
the picture recording module comprises two optical cameras and one thermal imaging camera carried on the multi-rotor unmanned aerial vehicle; the two optical cameras and the thermal imaging camera lie in approximately the same plane, the two optical cameras are symmetrically arranged on the two sides of the flight direction of the multi-rotor aircraft, the angle α between the axis of each optical camera and the flight direction of the multi-rotor aircraft is 60 degrees, the thermal imaging camera is placed between the two optical cameras with its axis aligned with the flight direction, ground image data are acquired by the two optical cameras and the thermal imaging camera and passed to the data reception module through the wireless image transmission module;
the data reception module consists of a data receiver and a first video server; the data receiver receives the ground image data passed over by the wireless image transmission module and passes them to the first video server through a video cable, and the first video server passes them to the flight control module through the network;
the flight control module consists of three notebooks, a maintenance-free battery and a wireless router; the maintenance-free battery supplies power to the wireless router; automatic flight control software is installed on notebook A and is used for flight trajectory planning, comparison of the actual trajectory with the desired trajectory, and display of flight control information; notebook C is used to display the images passed back by the thermal imaging camera; notebook B is used to display the fused optical video images; the first video server passes the ground image data to the wireless router through the network, the wireless router passes them via WiFi to the corresponding notebooks and to the data service center, the onboard flight control module receives through the wireless router the control signals transmitted by the automatic flight control software and sends control instructions to the multi-rotor unmanned aerial vehicle, and the multi-rotor unmanned aerial vehicle flies according to the control instructions.
3. The gas pipeline unmanned plane inspection device according to claim 2, characterized in that the data service center consists of a second video server, an image fitting server, an image analysis server, a fiber-optic switch, a storage array, a core switch, a control computer, a firewall and a dedicated line; the second video server, the image fitting server, the image analysis server, the firewall and the control computer are connected to the core switch through cables, the video servers, the image fitting server, the image analysis server and the storage array are connected to the fiber-optic switch through optical fibers, the dedicated line is connected to the firewall through a cable, the alarm module is connected to the core switch through the dedicated line and the firewall, and the wireless router in the flight control module is connected to the core switch through the dedicated line and the firewall;
the image processing module comprises a first image processing module and a second image processing module; the first image processing module is installed on notebook B, the optical video images acquired by the two optical cameras are transferred through the wireless image transmission module and the data receiver to the first video server, the video signals output by the first video server are transmitted through the wireless router and the WiFi wireless network to the first image processing module on notebook B, the first image processing module stitches the two video streams in real time and transmits the stitched images to the data service center through the wireless router and the WiFi wireless network; the second image processing module is installed on the image fitting server, the optical video images acquired by the two optical cameras are transferred through the wireless image transmission module and the data receiver to the first video server, the video signals output by the first video server are transmitted through the wireless router and WiFi to the second image processing module on the image fitting server, the second image processing module performs time-shared video image fusion, and the fused optical video images are transmitted to notebook B and displayed on notebook B;
the image analysis module is installed on the image analysis server; the image analysis module receives the stitched images processed by the first image processing module and the fused images processed by the second image processing module, analyzes suspicious accident points by the image analysis algorithm of the artificial intelligence deep neural network embedded in the image analysis module and marks them on the image, the confirmed accident points are broadcast by the alarm module, and the processed images are passed to the display module for display;
the display module consists of a video distributor, a display, a mosaic screen and a mosaic screen controller; the video signal output by the control computer is transferred to the video distributor through a video cable, one channel of the video signal output by the video distributor is passed to the display through a video cable for display, the other channel is passed to the mosaic screen controller through a video cable, and the mosaic screen controller passes the video signal to the corresponding mosaic screen through video cables for display.
4. The gas pipeline unmanned plane inspection device according to claim 3, characterized in that the alarm module consists of a cruise alarm module, cruise alarm software, a cruise alarm module field control board, a smart phone, a microphone and a speaker; the cruise alarm module is installed on the image analysis server, the cruise alarm module field control board is installed on notebook C, the cruise alarm software is installed on the smart phone, the cruise alarm module is wirelessly connected with the cruise alarm software, the microphone is connected through an audio cable to the control computer of the data service center, and the speaker is installed on the multi-rotor unmanned aerial vehicle; the image analysis module passes warning messages to the cruise alarm module by message transmission, the cruise alarm module sends the warning messages to the cruise alarm software on the smart phone, every inspection alarm function is managed by the cruise alarm software, and the control computer uses the cruise alarm module client, i.e. the cruise alarm software, through the internal network and broadcasts to the designated multi-rotor unmanned aerial vehicle through the microphone connected to the control computer; the voice signal of the microphone is transferred by the control computer and the core switch to the cruise alarm module and converted by the cruise alarm module into a digital voice signal, the cruise alarm module passes the digital voice signal through the core switch, firewall, dedicated line and wireless network to the cruise alarm module field control board on notebook C, the cruise alarm module field control board passes the digital voice signal through the data receiver to the wireless image transmission module, the audio port of the wireless image transmission module is connected to the speaker through an audio cable, the speaker is carried on the multi-rotor unmanned aerial vehicle, and the wireless image transmission module passes the digital voice signal to the speaker through the audio cable.
5. An inspection method using the gas pipeline unmanned plane inspection device according to any one of claims 1 to 4, characterized by comprising the following steps:
Step 1: flight planning
The flight trajectory of the multi-rotor unmanned aerial vehicle is planned and set, the onboard flight control module controls the multi-rotor unmanned aerial vehicle to fly according to the planned flight trajectory, and the picture recording module acquires ground images;
Step 2: image extraction
Sample images are collected, cropped and classified, the brightness and contrast of the images are adjusted, and image down-sampling is performed by bi-cubic interpolation;
Step 3: machine learning
The extracted images are learned using a deep neural network, and the picture feature file formed after learning is updated into the image analysis module;
Step 4: image mosaic
The image processing module automatically adjusts the brightness of the video images acquired by the two optical cameras, feature point extraction is performed by constructing a vocabulary tree, the vocabulary tree is determined, and the image mosaic is then completed by image pre-matching and geometric deformation processing, obtaining the whole aerial photograph of the entire flight path;
Step 5: target identification
Color feature identification is performed on the mosaicked image, the feature set of the image texture is obtained from the color histogram, the color-texture feature histogram of the hollow background region is calculated, the target images are determined by probability calculation to form a target image set, and the image analysis module passes the target image set to the alarm module;
Step 6: feature-spectrum dynamic point analysis
The image analysis module analyzes the feature spectrum of the stitched image and checks the state of the feature objects in the images from different times; if the state of an object is found to have changed, an object-movement warning message is issued;
Step 7: hot spot analysis
By means of manual calibration, the thermal imaging video image is calibrated onto the visible-light image; the thermal image of the flight area is acquired by the thermal imaging camera and passed through the wireless image transmission module to the data reception module, which passes the video image to the image processing module; the thermal image is merged, as an image layer, onto the whole aerial photograph synthesized in step 4; the coordinate regions in which the temperature is higher than the set value are extracted by the image analysis module and passed to the alarm module;
Step 8: alarm
According to the information obtained in step 5, step 6 and step 7, flashing warning information is displayed at the coordinate points of the designated picture, and the alarm coordinate points are passed to the flight control module;
Step 9: fixed-point cruise
After receiving the alarm coordinate points, the flight control module plans the flight trajectory of the multi-rotor unmanned aerial vehicle according to the alarm coordinate points using the automatic flight control software on notebook A, and at the same time sends a fixed-point cruise route setting instruction to the onboard flight control module, so that the vehicle flies above the alarm coordinate points and performs fixed-point cruising.
6. The inspection method according to claim 5, characterized in that the detailed process of step 2 is as follows:
Step S201: sample image collection
Images of excavators, bulldozers, hook machines and drilling platforms are found from the ground images by manual selection, with at least 1000 identified pictures of each kind of image;
Step S202: cropping and classification
The manually selected pictures are cropped, the images are separated, and they are saved in the data service center;
Step S203: image preprocessing
The brightness and clarity of the images are identified manually, and the brightness and contrast of images with poor shooting quality are adjusted until it can be recognized by eye which kind of object the image shows;
Step S204: image down-sampling
Using the bi-cubic interpolation method, part of the pixels are filtered out from the neighborhoods of the adjusted images and the remaining pixels are retained.
Step S304: the picture feature file formed after learning is updated into the image analysis module.
7. The inspection method according to claim 5, characterized in that the detailed process of step 4 is as follows:
Step S401: image transmission
The video signals acquired by the two optical cameras are passed through video cables to the wireless image transmission module, the wireless image transmission module passes them wirelessly to the data reception module, and the data reception module passes the two video signals to the image processing module;
Step S402: brightness adjustment
The image processing module automatically adjusts the image brightness so that the brightness of the two video signals is within a 1% range;
Step S403: feature point extraction
The image processing module searches for the feature points of each image and constructs a vocabulary tree; all feature descriptors in the images are layered and classified by mean clustering, finally forming a vocabulary tree with height L and K branch nodes, and each node of the vocabulary tree is described by a model;
Step S404: vocabulary tree determination
Using the weight of each node in the vocabulary tree, the visual vocabulary vectors of all images in the set to be matched are calculated, and a similarity function is used to evaluate the similarity between one image and the other images;
Step S405: image pre-matching
The Euclidean distance between descriptors is used as the similarity measure for feature points in two images; each feature point in the image to be matched is matched within the feature point set of the reference image using the BBF query mechanism and the minimum-Euclidean-distance criterion on the feature points;
Step S406: geometric deformation processing
Let the image coordinates be (u, v) and the ground coordinates of the corresponding image pixel be (x, y); the polynomial correction formulas between the two are as follows:
u = a00 + a10x + a01y + a20x² + a11xy + a02y² + a30x³ + a21x²y + a12xy² + a03y³ + …  (1)
v = b00 + b10x + b01y + b20x² + b11xy + b02y² + b30x³ + b21x²y + b12xy² + b03y³ + …  (2)
where the order of the polynomial is denoted n, and aij, bij (i = 0, …, n) are the polynomial coefficients;
if the number of polynomial terms is N, then the relationship between N and the order n is as follows:
N = (n + 1)(n + 2)/2  (3)
From the image coordinates and ground coordinates of the N control points, the polynomial coefficients are solved by the least squares principle;
Step S407: image mosaic
According to the correspondence principle of the matched feature points, the images are merged to synthesize a whole aerial photograph of the entire flight path, which is passed to the image analysis module.
8. The inspection method according to claim 5, characterized in that the detailed process of step 5 is as follows:
Step S501: color feature identification
The R, G and B color channels of RGB space are divided into several small intervals; in the initialization phase the probability that each pixel belongs to each color interval is counted within the target area, obtaining a color histogram which is used as the target feature template;
Step S502: image texture features
In the image selected from the color histogram, adjacent 5×5 pixel neighborhoods are taken and the gray value of the center point is set as the threshold; the surrounding neighboring pixels are compared with it, points greater than or equal to the threshold are labeled 1 and points less than the threshold are labeled 0, forming a group of binary codes that are defined as the image texture feature value;
the feature set of the image texture is defined as Si; to simplify the calculation it is identified by its upper-left and lower-right corner coordinates, that is:
Si = {(Xi0, Yi0), (Xi1, Yi1)}  (4)
where Xi0, Yi0 are the upper-left corner coordinates of the target; Xi1, Yi1 are the lower-right corner coordinates of the target; and Si is the image texture feature set within the determined target;
Step S503: background weighting
The feature template of the background area is calculated; the background area is a hollow structure adjoining the target area and forms the histogram Mu;
Step S504: histogram
The color-texture feature histogram within the hollow background region is calculated;
Step S505: probability calculation
When the histogram Mu is larger, the histogram Mu is used as the denominator of the background weighting coefficient, and the weight is multiplied with the corresponding feature component in the target feature template;
Step S506: target discrimination
The higher the probability value in the target feature template, the more likely the region is the object;
Step S507: object transmission
The target image set S is extracted, S = {s1, s2, …, sn1}, where n1 is the number of image texture feature groups of the target object set si, in units of groups, and S is the target image set; the image analysis module passes the target image set S to the alarm module.
9. The inspection method according to claim 5, characterized in that the detailed process of step 6 is as follows:
Step S601: target set determination
From the different times ti a time-series coordinate set ST is generated:
ST = {s1, s2, …, sn1}{t1, t2, …, tm}  (5)
where m is the number of time series;
Step S602: Euclidean distance calculation
For any two time points, the Euclidean distance between the time series S and S' is defined as:
where Si is the coordinate set of the image texture features of the target set at time ti, Si' is the coordinate set of the image texture features of the target set at time tj, and d(S, S') is the distance, in pixels, of the target set between the two moments;
if the following condition is satisfied:
d(S, S') ≥ ε  (7)
then the target is considered a dynamic point, where ε is the preset point-drift threshold; its coordinate range is then output to a dynamic point analysis file;
Step S603: dynamic point alarm transmission
The image analysis module passes the dynamic point analysis file to the alarm module.
10. The inspection method according to claim 5, characterized in that the detailed process of step 7 is as follows:
Step S701: image calibration
The coordinate origin (a, b) is determined, then the tilt angle θ of the thermal imaging video image, and then the scale ratios; the coordinates s' = (x', y') of an image pixel are mapped to the coordinates s = (x, y) in a new coordinate system;
in the formula, a, b are the coordinate translation values, in pixels; θ is the image tilt angle, in radians; x, y are the coordinate values after transformation, in pixels; x', y' are the coordinate values before transformation, in pixels; and Kx, Ky are the scale ratios;
Step S702: thermal image transmission
The thermal image of the flight area is acquired by the thermal imaging camera and passed through the wireless image transmission module to the data reception module, which passes the video image to the image processing module;
Step S703: coordinate transform
Using the coordinate conversion parameters (a, b), the thermal image is merged, as an image layer, onto the whole aerial photograph synthesized in step 4;
Step S704: hot spot alarm
The coordinate regions in which the temperature is higher than the set value are extracted by the image analysis module and passed to the alarm module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811276910.7A CN109379564A (en) | 2018-10-30 | 2018-10-30 | A kind of gas pipeline unmanned plane inspection device and method for inspecting |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109379564A true CN109379564A (en) | 2019-02-22 |
Family
ID=65390592
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811276910.7A Pending CN109379564A (en) | 2018-10-30 | 2018-10-30 | A kind of gas pipeline unmanned plane inspection device and method for inspecting |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109379564A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014076294A1 (en) * | 2012-11-19 | 2014-05-22 | Inria Institut National De Recherche En Informatique Et En Automatique | Method for determining, in a fixed 3d frame of reference, the location of a moving craft, and associated computer program and device |
CN105468015A (en) * | 2016-01-20 | 2016-04-06 | 清华大学合肥公共安全研究院 | Oil gas pipeline inspection system of multi-rotor unmanned plane flying according to programmed course |
CN106657882A (en) * | 2016-10-18 | 2017-05-10 | 国网湖北省电力公司检修公司 | Real-time monitoring method for power transmission and transformation system based on unmanned aerial vehicle |
Non-Patent Citations (1)
Title |
---|
刘佳等: "基于改进SIFT 算法的图像匹", 《仪器仪表学报》 (Liu Jia et al., "Image matching based on an improved SIFT algorithm", Chinese Journal of Scientific Instrument) *
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110008845A (en) * | 2019-03-12 | 2019-07-12 | 北京市燃气集团有限责任公司 | The detection method and device of burning line hidden danger point |
CN110598532A (en) * | 2019-07-31 | 2019-12-20 | 长春市万易科技有限公司 | Tree pest and disease damage monitoring system and method |
CN110598532B (en) * | 2019-07-31 | 2022-09-13 | 长春市万易科技有限公司 | Tree pest and disease damage monitoring system and method |
CN110673628A (en) * | 2019-09-20 | 2020-01-10 | 北京航空航天大学 | Inspection method for oil-gas pipeline of composite wing unmanned aerial vehicle |
CN110673628B (en) * | 2019-09-20 | 2020-09-29 | 北京航空航天大学 | Inspection method for oil-gas pipeline of composite wing unmanned aerial vehicle |
CN111174822A (en) * | 2019-12-23 | 2020-05-19 | 湖南君泰勘测设计研究有限公司 | Geographic information acquisition system and method |
CN113203049A (en) * | 2020-05-28 | 2021-08-03 | 中国石油天然气股份有限公司 | Intelligent monitoring and early warning system and method for pipeline safety |
CN111521279A (en) * | 2020-06-05 | 2020-08-11 | 中国有色金属长沙勘察设计研究院有限公司 | Pipeline leakage inspection method |
CN113023293A (en) * | 2021-02-08 | 2021-06-25 | 精锐视觉智能科技(深圳)有限公司 | Inspection method, device, equipment and system for belt conveyor |
CN113077562B (en) * | 2021-04-09 | 2021-12-14 | 北京市燃气集团有限责任公司 | Intelligent inspection method and system for gas pipe network |
CN113077562A (en) * | 2021-04-09 | 2021-07-06 | 北京市燃气集团有限责任公司 | Intelligent inspection method and system for gas pipe network |
CN113421354A (en) * | 2021-05-25 | 2021-09-21 | 西安万飞控制科技有限公司 | Unmanned aerial vehicle oil and gas pipeline emergency inspection method and system |
CN113361434A (en) * | 2021-06-16 | 2021-09-07 | 广东电网有限责任公司 | Disaster exploration method and device based on unmanned aerial vehicle remote control device |
CN113780113A (en) * | 2021-08-25 | 2021-12-10 | 廊坊中油朗威工程项目管理有限公司 | Pipeline violation behavior identification method |
CN114355340A (en) * | 2021-12-10 | 2022-04-15 | 中国民用航空华东地区空中交通管理局 | Civil aviation monitoring source video playback analysis system |
CN114355340B (en) * | 2021-12-10 | 2024-04-30 | 中国民用航空华东地区空中交通管理局 | Civil aviation monitoring source video playback analysis system |
CN115909183A (en) * | 2022-09-16 | 2023-04-04 | 北京燃气平谷有限公司 | Monitoring system and monitoring method for external environment of gas delivery |
CN115909183B (en) * | 2022-09-16 | 2023-08-29 | 北京燃气平谷有限公司 | Monitoring system and monitoring method for external environment of fuel gas delivery |
CN116107325A (en) * | 2022-12-02 | 2023-05-12 | 苏州博旭数据科技有限公司 | Inspection control system for carrying cradle head camera on unmanned aerial vehicle |
CN115797811A (en) * | 2023-02-07 | 2023-03-14 | 江西农业大学 | Agricultural product detection method and system based on vision |
CN116382348A (en) * | 2023-05-11 | 2023-07-04 | 中国电建集团山东电力建设第一工程有限公司 | Unmanned aerial vehicle inspection method and system for power distribution equipment |
CN116382348B (en) * | 2023-05-11 | 2023-10-20 | 中国电建集团山东电力建设第一工程有限公司 | Unmanned aerial vehicle inspection method and system for power distribution equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109379564A (en) | A kind of gas pipeline unmanned plane inspection device and method for inspecting | |
US11017228B2 (en) | Method and arrangement for condition monitoring of an installation with operating means | |
CN112101088A (en) | Automatic unmanned aerial vehicle power inspection method, device and system | |
KR20200017601A (en) | Analysis of illegal activities and monitoring based on recognition using unmanned aerial vehicle and artificial intelligence deep running that can monitor illegal activities in the field farm | |
US20180144644A1 (en) | Method and system for managing flight plan for unmanned aerial vehicle | |
CN110276254B (en) | Unmanned aerial vehicle-based automatic recognition and early warning method for bootlegged area bootlegged | |
CN102654940A (en) | Traffic information acquisition system based on unmanned aerial vehicle and processing method of traffic information acquisition system | |
CN113408510B (en) | Transmission line target deviation rectifying method and system based on deep learning and one-hot coding | |
RU2755603C2 (en) | System and method for detecting and countering unmanned aerial vehicles | |
CN112764427B (en) | Relay unmanned aerial vehicle inspection system | |
CN109889793A (en) | Cloud computing platform and can recognition of face video monitoring intelligence Skynet system | |
CN105810023A (en) | Automatic airport undercarriage retraction and extension monitoring system and method | |
CN114529767A (en) | Small sample SAR target identification method based on two-stage comparison learning framework | |
CN116954264B (en) | Distributed high subsonic unmanned aerial vehicle cluster control system and method thereof | |
Jin et al. | Unmanned aerial vehicle (uav) based traffic monitoring and management | |
Wu et al. | Multimodal Collaboration Networks for Geospatial Vehicle Detection in Dense, Occluded, and Large-Scale Events | |
CN111950524A (en) | Orchard local sparse mapping method and system based on binocular vision and RTK | |
CN110618424B (en) | Remote high-voltage line discovery method based on multi-sensor fusion | |
CN117148853A (en) | Unmanned aerial vehicle environment self-adaptive obstacle avoidance method and system based on 5G technology and deep learning | |
Lin et al. | A multi-target detection framework for multirotor UAV | |
CN216647401U (en) | Safety helmet recognition device | |
CN116243725A (en) | Substation unmanned aerial vehicle inspection method and system based on visual navigation | |
CN110163139A (en) | Three-dimensional digital information acquisition in city updates scanning system | |
Pavlove et al. | Efficient Deep Learning Methods for Automated Visibility Estimation at Airports | |
CN115457313A (en) | Method and system for analyzing photovoltaic equipment fault based on thermal infrared image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20190222 |