CN108415453A - UAV tunnel inspection method based on BIM technology - Google Patents
UAV tunnel inspection method based on BIM technology
- Publication number
- CN108415453A (application number CN201810065982.0A)
- Authority
- CN
- China
- Prior art keywords
- UAV
- tunnel
- 3D engine
- image
- RGB
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
Abstract
The present invention discloses a UAV tunnel inspection method based on BIM technology, with the following steps: a tunnel model is built with BIM technology and imported into a 3D engine; a coordinate-system conversion method is used to obtain the relationship between the UAV navigation coordinate system and the 3D-engine camera coordinate system, establishing the correspondence between the UAV and the 3D-engine camera. A camera is set up in the 3D engine to roam along the inspection route, and the roaming trajectory is converted into UAV motion-trajectory parameters through the coordinate-system conversion method. The UAV obtains its motion state in the tunnel through an information fusion algorithm. The video captured during inspection by the RGB-D camera is transmitted to the ground workstation through a video transmission module, where it is texture-mapped, stitched, matched and combined with the BIM-based tunnel model, finally yielding a true 3D model of the inspection route, so that the actual condition of the route and any hidden tunnel defects can be clearly grasped. The invention speeds up tunnel inspection work.
Description
Technical field
The invention belongs to the technical field of tunnel inspection and relates to a tunnel inspection method, in particular to a UAV tunnel inspection method based on BIM technology.
Background technology
The methods commonly used for tunnel inspection at home and abroad are mainly manual inspection and semi-automatic instrument inspection. Manual inspection is limited by the tunnel's daily operating schedule to only two or three hours of inspection time per day, so the routes that can be covered manually are all very short. Semi-automatic instrument inspection requires detection sensors to be embedded during tunnel construction and dedicated communication equipment to be installed during operation before it can be carried out; moreover, the survival rate of the embedded sensors is not high. As a result, current tunnel inspection is still performed mainly by manual patrol.
As far as patents searched at home and abroad are concerned, no patent has yet applied UAVs to tunnel inspection routes. On the one hand, the lines inside a tunnel are numerous, and no matter how accurately the UAV is controlled there is a probability of it touching them; on the other hand, indoor navigation has relatively low accuracy because GPS is unavailable.
In view of this, there is an urgent need to design a new inspection approach to overcome the above drawbacks of the existing inspection methods.
Invention content
The technical problem to be solved by the present invention is to provide a UAV tunnel inspection method based on BIM technology that can speed up tunnel inspection work, which greatly benefits the safety of ever-lengthening subway tunnels.
In order to solve the above technical problem, the present invention adopts the following technical scheme:
A UAV tunnel inspection method based on BIM technology, the inspection method comprising the following steps:
Step S1: a tunnel model is built with BIM technology and imported into the 3D display software. A coordinate-system conversion method is used to obtain the relationship between the UAV navigation coordinate system and the camera coordinate system of the 3D engine in the 3D display software, establishing the correspondence between the UAV and the 3D-engine camera. First, the RGB-D camera measures the UAV's left, right and upward distances from the tunnel; the UAV is then adjusted to the same position as the 3D-engine roaming start coordinates so that the two coordinate systems coincide, after which the rotation between the two coordinate systems is carried out using quaternions.
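As an illustration of this alignment step, the sketch below rotates a point from the 3D-engine camera frame into the UAV navigation frame with a unit quaternion. It is a minimal sketch only; the quaternion value, the 90-degree yaw offset and the use of NumPy are assumptions, not part of the patent.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    # Rotation matrix equivalent to q (standard quaternion-to-matrix formula)
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return R @ v

# Hypothetical example: after the origins are made to coincide, a fixed
# rotation maps 3D-engine camera coordinates to UAV navigation coordinates.
angle = np.deg2rad(90.0)                     # assumed 90-degree yaw offset
q_engine_to_uav = np.array([np.cos(angle/2), 0.0, 0.0, np.sin(angle/2)])
p_engine = np.array([1.0, 0.0, 0.0])         # a roaming waypoint in the engine frame
p_uav = quat_rotate(q_engine_to_uav, p_engine)
print(p_uav)                                 # -> approximately [0, 1, 0]
```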
Step S2: in the 3D display software of the ground workstation, a camera is set to roam along the inspection route in the 3D engine. Using the correspondence between the 3D-engine roaming trajectory parameters and the 3D-engine camera, the roaming trajectory is converted into UAV motion-trajectory parameters through the coordinate-system conversion method. At the same time, while the 3D engine roams, ray casting is used to examine the tagged objects in the current viewport, so that they can be cross-checked against the tag information acquired by the UAV, further obtaining the attitude and position parameters of the UAV along the inspection route. The tags are position markers.
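To illustrate the viewport check, a common way to test whether a roaming-camera ray hits a tagged object is a ray/axis-aligned-bounding-box slab test; the sketch below uses made-up tag and camera coordinates and is not taken from the patent.

```python
import numpy as np

def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does a ray from the roaming camera hit a tag's bounding box?"""
    inv = 1.0 / direction                       # assumes no exactly-zero components
    t1 = (box_min - origin) * inv
    t2 = (box_max - origin) * inv
    t_near = np.max(np.minimum(t1, t2))         # entry distance along the ray
    t_far = np.min(np.maximum(t1, t2))          # exit distance along the ray
    return t_far >= max(t_near, 0.0)

# Hypothetical tag bounding box on a segment ring and a camera ray down the tunnel axis
tag_min = np.array([4.5, -0.2, 2.0])
tag_max = np.array([5.0, 0.2, 2.4])
camera_pos = np.array([0.0, 0.0, 2.2])
view_dir = np.array([1.0, 1e-9, 1e-9])          # tiny components avoid division by zero
view_dir /= np.linalg.norm(view_dir)
print(ray_hits_aabb(camera_pos, view_dir, tag_min, tag_max))   # -> True, the tag is in view
```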
Step S3: the UAV obtains its motion state in the tunnel through an information fusion algorithm that combines the 3D-engine camera, the RGB-D camera, the MEMS inertial measurement unit (IMU) and the tags installed on the tunnel segments. The RGB-D camera acquires two-dimensional color images and range data of the tunnel interior, from which the attitude and position parameter estimates of the UAV are obtained with an extreme learning machine algorithm. The MEMS IMU provides inertial measurements, and the attitude and position estimates from the RGB-D camera are fused with the IMU measurements through a filtering algorithm. The trajectory parameters obtained from the 3D-engine camera are then used for a first correction of the UAV motion state, and the tag information obtained by the tag recognizer is used for a second correction, yielding a more accurate UAV motion state with which the master controller flies the UAV autonomously along the inspection route.
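The patent names an extreme learning machine for estimating attitude and position from the RGB-D data but gives no architecture. The sketch below shows only the generic extreme learning machine recipe (fixed random hidden layer, least-squares output weights); the input/output dimensions and training data are made up for illustration.

```python
import numpy as np

def elm_fit(X, Y, hidden=64, seed=0):
    """Train an extreme learning machine: random hidden layer, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))     # fixed random input weights
    b = rng.normal(size=hidden)                   # fixed random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)  # output weights by least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical training data: feature descriptors -> 6-DOF attitude/position parameters
X_train = np.random.rand(200, 32)
Y_train = np.random.rand(200, 6)
W, b, beta = elm_fit(X_train, Y_train)
pose_estimate = elm_predict(np.random.rand(1, 32), W, b, beta)
print(pose_estimate.shape)                        # -> (1, 6)
```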
Step S4: the video captured during inspection by the RGB-D camera is transmitted to the ground workstation through the video transmission module. Using the tunnel model built with BIM technology, the ground workstation texture-maps, stitches, matches and combines the video images with the model, finally obtaining a true 3D model of the inspection route, so that the actual condition of the route and any hidden tunnel defects can be clearly grasped. The video images include depth images and color images.
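One building block of this texture-mapping step is back-projecting each RGB-D pixel into a 3D point before matching it against the model. The minimal sketch below uses the standard pinhole model; the intrinsic values are assumptions, not parameters given in the patent.

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project an RGB-D pixel (u, v) with metric depth into camera coordinates."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical pinhole intrinsics for the RGB-D camera
point_3d = backproject(u=320, v=240, depth=3.5, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(point_3d)   # 3D point that would be matched/textured onto the BIM tunnel model
```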
In step S3, the information fusion algorithm is as follows:
The UAV processes the information from the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments, and uses a trust-degree-based Dempster-Shafer (DS) evidence theory algorithm to calculate the accuracy of the fused motion state in the tunnel. The calculation process is as follows:
S31. Set the trust degrees of the information from the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments to W = {ω1, ω2, ω3, ω4}, where ωi (i = 1, 2, 3, 4) is the absolute attribute difference of the individual data;
S32. Calculate the conflict between any two groups of data, where m(F) is the confidence of a single group of data;
S33. Calculate the similarity between any two groups of data;
S34. Calculate the conflict ratio factor;
S35. Calculate the average conflict coefficient;
S36. Calculate the total weight factor ω* = m × (k*)^α × min(ωi | i = 1, 2, 3, 4), where α is an adjustment factor;
S37. Apply a first adjustment to the trust degrees of all data;
S38. Repeat operations S32 to S37; when the total trust degree W′ exceeds the set threshold, the motion state of the UAV in the tunnel is a safe operating state.
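To make the evidence-fusion idea concrete, here is a minimal sketch of trust-weighted DS combination for two sources. The pairwise-conflict and combination formulas are the textbook DS definitions; the patent's own conflict ratio factor, average conflict coefficient and threshold are not reproduced in this text, so the weighting shown (discounting each mass by its trust degree before combining), the "safe"/"unsafe" frame, and all numeric values are assumptions for illustration only.

```python
from itertools import product

def discount(mass, trust):
    """Shafer discounting: scale each focal element by the trust degree and
    move the remaining mass to the full frame (total ignorance)."""
    frame = frozenset().union(*mass)
    out = {A: trust * m for A, m in mass.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - trust)
    return out

def combine(m1, m2):
    """Dempster's rule for two mass functions over the same frame."""
    k = sum(m1[A] * m2[B] for A, B in product(m1, m2) if not (A & B))  # conflict mass
    fused = {}
    for A, B in product(m1, m2):
        C = A & B
        if C:
            fused[C] = fused.get(C, 0.0) + m1[A] * m2[B] / (1.0 - k)
    return fused, k

# Hypothetical frame of discernment: the UAV motion state is "safe" or "unsafe".
S, U = frozenset({"safe"}), frozenset({"unsafe"})
rgbd = {S: 0.7, U: 0.2, S | U: 0.1}   # assumed mass from the RGB-D pose estimate
imu  = {S: 0.6, U: 0.3, S | U: 0.1}   # assumed mass from the MEMS IMU

fused, conflict = combine(discount(rgbd, 0.9), discount(imu, 0.8))
print(conflict, fused[S])             # conflict coefficient and fused belief in "safe"
```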
In step S3, a state-space model is built from the dynamics of the UAV, together with the observation models of the RGB-D camera and the MEMS IMU. Median filtering, connected-component filtering, dilation and image thinning are applied to the color images captured by the RGB-D camera to obtain groups of feature points in the tunnel; the depth data corresponding to these feature points then yield the 3D point clouds of two adjacent frames with matching relationships, and the extreme learning machine algorithm produces the attitude and position parameter estimates of the UAV, which serve as observations of the UAV state-space model. These observations are fused with the observations provided by the MEMS inertial sensor; the fused motion-state parameters are compared with the motion-state parameters obtained from the 3D engine for a first correction of the UAV's flight state and flight path, and during flight the fixed tag positions on each ring of tunnel segments are used for a second correction of the motion state and flight path, guiding the UAV to perform the inspection along the 3D-engine roaming route.
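The patent does not specify the filtering algorithm used to fuse the RGB-D pose estimate with the IMU measurements; one common choice is a Kalman-style predict/update cycle in which the IMU acceleration drives the prediction and the RGB-D position estimate is the measurement. The per-axis sketch below is an illustration under that assumption; the matrices, noise values and time step are made up.

```python
import numpy as np

def fuse_step(x, P, accel, z_pos, dt, q=0.05, r=0.02):
    """One predict/update cycle per axis: IMU acceleration drives the
    prediction, the RGB-D position estimate is the measurement."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion model
    B = np.array([0.5 * dt * dt, dt])          # acceleration input
    H = np.array([[1.0, 0.0]])                 # we observe position only
    Q = q * np.eye(2)                          # assumed process noise
    R = np.array([[r]])                        # assumed measurement noise

    # Predict with the IMU measurement
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Update with the RGB-D position estimate
    y = np.array([z_pos]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)                  # initial state and covariance
x, P = fuse_step(x, P, accel=0.3, z_pos=0.01, dt=0.02)
print(x)                                       # fused position/velocity estimate
```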
In step S3, the algorithm by which median filtering, connected-component filtering, dilation and image thinning obtain the groups of feature points in the tunnel is as follows.
The color images captured by the RGB-D camera are processed as follows:
Step 1. A median filter is used to denoise the image.
Let the filter window be W; the gray values {x(i, j), (i, j) ∈ I²} of the image are filtered as
y(i, j) = Med_W{x(i, j)} = Med{x(i+r, j+s) : (r, s) ∈ W}, (i, j) ∈ I²,
where y(i, j) is the filtered value.
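A minimal sketch of this denoising step using SciPy's median filter; the 3×3 window size and the random stand-in frame are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

frame = np.random.randint(0, 256, (480, 640)).astype(np.uint8)  # stand-in grayscale frame
denoised = median_filter(frame, size=3)  # y(i, j) = median of the 3x3 window W around (i, j)
```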
Step 2. A Butterworth high-pass filter is used to enhance the image in the frequency domain.
The transfer function of an n-th order Butterworth high-pass filter is
H(u, v) = 1 / (1 + [D0 / D(u, v)]^(2n)),
where D0 is the cutoff frequency and D(u, v) is the distance from point P(u, v) to the origin of the frequency plane; D0 is the distance at which H(u, v) falls to a set fraction of its maximum value.
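A sketch of this frequency-domain enhancement using the transfer function above with NumPy's FFT; the cutoff D0 = 30, the order n = 2 and the random input image are assumed values.

```python
import numpy as np

def butterworth_highpass(img, d0=30.0, n=2):
    """Apply an n-th order Butterworth high-pass filter in the frequency domain."""
    rows, cols = img.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    V, U = np.meshgrid(v, u)
    D = np.sqrt(U**2 + V**2)                   # distance from the frequency-plane origin
    H = 1.0 / (1.0 + (d0 / np.maximum(D, 1e-6)) ** (2 * n))
    F = np.fft.fftshift(np.fft.fft2(img))      # centered spectrum
    out = np.fft.ifft2(np.fft.ifftshift(F * H))
    return np.real(out)

enhanced = butterworth_highpass(np.random.rand(480, 640))  # stand-in input image
```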
Step 3. Motion blur present in the image obtained in Step 2 is removed by inverse filtering.
Step 4. The Hilditch thinning algorithm is used to thin the image obtained in Step 3.
The method is as follows: let the background value be 0 and the foreground value be 1, and use the 8-neighborhood around the central point P0:
P8 | P1 | P2 |
P7 | P0 | P3 |
P6 | P5 | P4 |
B(P0) denotes the number of non-zero pixels in the 8-neighborhood of P0;
A(P0) denotes the number of 0-to-1 transitions in the clockwise sequence P1-P8 of the 8-neighborhood of P0.
The thinning conditions are:
2 <= B(P0) <= 6
A(P0) = 1
P2·P4·P8 = 0 or A(P2) != 1
P2·P4·P6 = 0 or A(P4) != 1
When the conditions are met, P0 is set to 0. The pixels are traversed from left to right and from top to bottom until no pixel satisfies the above thinning conditions.
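For illustration, here is one Hilditch-style thinning pass implementing the conditions above. It is a sketch, not the patent's implementation; it assumes a binary 0/1 image with a background border of at least two pixels, and the toy input is made up.

```python
import numpy as np

def neighbours(img, r, c):
    """Return [P1..P8] clockwise, starting from the pixel directly above P0 = (r, c)."""
    return [img[r-1, c], img[r-1, c+1], img[r, c+1], img[r+1, c+1],
            img[r+1, c], img[r+1, c-1], img[r, c-1], img[r-1, c-1]]

def transitions(p):
    """A(P0): number of 0-to-1 transitions in the circular sequence P1..P8."""
    return sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))

def hilditch_pass(img):
    """One left-to-right, top-to-bottom thinning pass over a 0/1 image with a
    two-pixel background border; returns the thinned copy and a change flag."""
    out = img.copy()
    changed = False
    for r in range(2, img.shape[0] - 2):
        for c in range(2, img.shape[1] - 2):
            if out[r, c] != 1:
                continue
            p = neighbours(out, r, c)
            P2, P4, P6, P8 = p[1], p[3], p[5], p[7]
            if (2 <= sum(p) <= 6 and transitions(p) == 1
                    and (P2 * P4 * P8 == 0 or transitions(neighbours(out, r - 1, c + 1)) != 1)
                    and (P2 * P4 * P6 == 0 or transitions(neighbours(out, r + 1, c + 1)) != 1)):
                out[r, c] = 0
                changed = True
    return out, changed

# Toy example: thin a thick vertical bar down to a line
binary = np.zeros((20, 20), dtype=np.uint8)
binary[5:15, 8:12] = 1
while True:
    binary, changed = hilditch_pass(binary)
    if not changed:
        break
```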
A UAV tunnel inspection method based on BIM technology, the inspection method comprising the following steps:
Step S1: a tunnel model is built with BIM technology and imported into the 3D display software; a coordinate-system conversion method is used to obtain the relationship between the UAV navigation coordinate system and the 3D-engine camera coordinate system, establishing the correspondence between the UAV and the 3D-engine camera;
Step S2: in the 3D display software of the ground workstation, a camera is set to roam along the inspection route in the 3D engine; using the correspondence between the 3D-engine roaming trajectory parameters and the 3D-engine camera, the roaming trajectory is converted into UAV motion-trajectory parameters through the coordinate-system conversion method, and at the same time ray casting is used during the 3D-engine roaming to examine the tagged objects in the current viewport, so that they can be cross-checked against the tag information acquired by the UAV to further obtain the attitude and position parameters of the UAV along the inspection route;
Step S3: the UAV obtains its motion state in the tunnel through the information fusion algorithm combining the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments;
Step S4: the video captured during inspection by the RGB-D camera is transmitted to the ground workstation through the video transmission module; using the tunnel model built with BIM technology, the ground workstation texture-maps, stitches, matches and combines the video images with the model, finally obtaining a true 3D model of the inspection route, so that the actual condition of the route and any hidden tunnel defects can be clearly grasped; the video images include depth images and color images.
As a preferred embodiment of the present invention, in step S1 the RGB-D camera measures the UAV's left, right and upward distances from the tunnel; the UAV is adjusted to the same position as the 3D-engine roaming start coordinates so that the two coordinate systems coincide, and the rotation between the two coordinate systems is then carried out using quaternions.
As a preferred embodiment of the present invention, in step S2 the tags are position markers.
As a preferred embodiment of the present invention, in step S3 the information fusion algorithm is as follows:
The UAV processes the information from the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments, and uses the trust-degree-based DS evidence theory algorithm to calculate the accuracy of the fused motion state in the tunnel. The calculation process is as follows:
S31. Set the trust degrees of the information from the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments to W = {ω1, ω2, ω3, ω4}, where ωi (i = 1, 2, 3, 4) is the absolute attribute difference of the individual data;
S32. Calculate the conflict between any two groups of data, where m(F) is the confidence of a single group of data;
S33. Calculate the similarity between any two groups of data;
S34. Calculate the conflict ratio factor;
S35. Calculate the average conflict coefficient;
S36. Calculate the total weight factor ω* = m × (k*)^α × min(ωi | i = 1, 2, 3, 4), where α is an adjustment factor;
S37. Apply a first adjustment to the trust degrees of all data;
S38. Repeat operations S32 to S37; when the total trust degree W′ exceeds the set threshold, the motion state of the UAV in the tunnel is a safe operating state.
As a preferred embodiment of the present invention, in step S3 the RGB-D camera acquires two-dimensional color images and range data of the tunnel interior, from which the attitude and position parameter estimates of the UAV are obtained with the extreme learning machine algorithm; the MEMS IMU provides inertial measurements, and the attitude and position estimates from the RGB-D camera are fused with the IMU measurements through the filtering algorithm; the trajectory parameters obtained from the 3D-engine camera are then used for a first correction of the UAV motion state, and the tag information obtained by the tag recognizer is used for a second correction, yielding a more accurate UAV motion state with which the UAV is autonomously controlled to fly along the inspection route.
As a preferred embodiment of the present invention, in step S3 a state-space model is built from the dynamics of the UAV, together with the observation models of the RGB-D camera and the MEMS IMU. Median filtering, connected-component filtering, dilation and image thinning are applied to the color images captured by the RGB-D camera to obtain groups of feature points in the tunnel; the depth data corresponding to these feature points yield the 3D point clouds of two adjacent frames with matching relationships, and the extreme learning machine algorithm produces the attitude and position parameter estimates of the UAV as observations of the UAV state-space model. These observations are fused with the observations provided by the MEMS inertial sensor; the fused motion-state parameters are compared with the motion-state parameters obtained from the 3D engine for a first correction of the UAV's flight state and flight path, and during flight the fixed tag positions on each ring of tunnel segments are used for a second correction of the motion state and flight path, guiding the UAV to perform the inspection along the 3D-engine roaming route.
As a preferred embodiment of the present invention, in step S3 the algorithm by which median filtering, connected-component filtering, dilation and image thinning obtain the groups of feature points in the tunnel is as follows.
The color images captured by the RGB-D camera are processed as follows:
Step 1. A median filter is used to denoise the image.
Let the filter window be W; the gray values {x(i, j), (i, j) ∈ I²} of the image are filtered as
y(i, j) = Med_W{x(i, j)} = Med{x(i+r, j+s) : (r, s) ∈ W}, (i, j) ∈ I²,
where y(i, j) is the filtered value.
Step 2. A Butterworth high-pass filter is used to enhance the image in the frequency domain.
The transfer function of an n-th order Butterworth high-pass filter is
H(u, v) = 1 / (1 + [D0 / D(u, v)]^(2n)),
where D0 is the cutoff frequency and D(u, v) is the distance from point P(u, v) to the origin of the frequency plane; D0 is the distance at which H(u, v) falls to a set fraction of its maximum value.
Step 3. Motion blur present in the image obtained in Step 2 is removed by inverse filtering.
Step 4. The Hilditch thinning algorithm is used to thin the image obtained in Step 3.
The method is as follows: let the background value be 0 and the foreground value be 1, and use the 8-neighborhood around the central point P0:
P8 | P1 | P2 |
P7 | P0 | P3 |
P6 | P5 | P4 |
B(P0) denotes the number of non-zero pixels in the 8-neighborhood of P0;
A(P0) denotes the number of 0-to-1 transitions in the clockwise sequence P1-P8 of the 8-neighborhood of P0.
The thinning conditions are:
2 <= B(P0) <= 6
A(P0) = 1
P2·P4·P8 = 0 or A(P2) != 1
P2·P4·P6 = 0 or A(P4) != 1
When the conditions are met, P0 is set to 0. The pixels are traversed from left to right and from top to bottom until no pixel satisfies the above thinning conditions.
As a preferred embodiment of the present invention, the method further includes: the video captured during inspection by the RGB-D camera is transmitted to the ground workstation through the video transmission module; using the tunnel model built with BIM technology, the ground workstation texture-maps, stitches, matches and combines the video images with the model, finally obtaining a true 3D model of the inspection route, so that the actual condition of the route and any hidden tunnel defects can be clearly grasped; the video images include depth images and color images.
The beneficial effects of the present invention are as follows: the BIM-based UAV tunnel inspection method proposed by the present invention can speed up tunnel inspection work, which greatly benefits the safety of ever-lengthening subway tunnels. The invention uses a UAV for inspection; operation is simple, data acquisition is fast, the information is rich, the cost is relatively low, and the monitoring time is greatly shortened.
Description of the drawings
Fig. 1 is a flowchart of the BIM-based UAV tunnel inspection method of the present invention.
Fig. 2 is a schematic diagram of the composition of the BIM-based UAV tunnel inspection system of the present invention.
Specific implementation mode
The preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Embodiment one
Referring to Fig. 1, a UAV tunnel inspection method based on BIM technology includes the following steps:
【Step S1】A tunnel model is built with BIM technology and imported into the 3D display software; a coordinate-system conversion method is used to obtain the relationship between the UAV navigation coordinate system and the 3D-engine camera coordinate system, establishing the correspondence between the UAV and the 3D-engine camera.
【Step S2】In the 3D display software of the ground workstation, a camera is set to roam along the inspection route in the 3D engine; using the correspondence between the 3D-engine roaming trajectory parameters and the 3D-engine camera, the roaming trajectory is converted into UAV motion-trajectory parameters through the coordinate-system conversion method, and at the same time ray casting is used during the roaming to examine the tagged objects in the current viewport, so that they can be cross-checked against the tag information acquired by the UAV to further obtain the attitude and position parameters of the UAV along the inspection route.
【Step S3】The UAV obtains its motion state in the tunnel through the information fusion algorithm combining the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments.
【Step S4】The video captured during inspection by the RGB-D camera is transmitted to the ground workstation through the video transmission module; using the tunnel model built with BIM technology, the ground workstation texture-maps, stitches, matches and combines the video images with the model, finally obtaining a true 3D model of the inspection route, so that the actual condition of the route and any hidden tunnel defects can be clearly grasped.
Embodiment two
Referring to Fig. 1 and Fig. 2, a UAV tunnel inspection method based on BIM technology includes the following steps:
【Step 1】A tunnel model is built with BIM technology and imported into the 3D display software; a coordinate-system conversion method is used to obtain the relationship between the UAV navigation coordinate system and the 3D-engine camera coordinate system, establishing the correspondence between the UAV and the 3D-engine camera.
The coordinate-system conversion method obtains the relationship between the UAV navigation coordinate system and the 3D-engine camera coordinate system as follows: the RGB-D camera first measures the UAV's left, right and upward distances from the tunnel, the UAV is adjusted to the same position as the 3D-engine roaming start coordinates so that the two coordinate systems coincide, and quaternions are then used to carry out the rotation between the two coordinate systems.
【Step 2】In the 3D display software of the ground workstation, a camera is set to roam along the inspection route in the 3D engine; the UAV motion-trajectory parameters are obtained using the correspondence between the 3D-engine roaming and the 3D-engine camera, and at the same time ray casting is used during the roaming to examine the tagged objects in the current viewport, so that they can be cross-checked against the tag information acquired by the UAV.
【Step 3】The UAV obtains its motion state in the tunnel through the information fusion algorithm combining the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments: the RGB-D camera acquires two-dimensional color images and range data of the tunnel interior, from which the attitude and position parameter estimates of the UAV are obtained with the extreme learning machine algorithm; the MEMS IMU provides inertial measurements, and the attitude and position estimates from the RGB-D camera are fused with the IMU measurements through a filtering algorithm; the trajectory parameters obtained from the 3D-engine camera are then used for a first correction of the UAV motion state, and the tag information obtained by the tag recognizer is used for a second correction, yielding a more accurate UAV motion state with which the master controller flies the UAV autonomously along the inspection route.
In step S3, the information fusion algorithm is as follows:
The UAV processes the information from the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments, and uses the trust-degree-based DS evidence theory algorithm to calculate the accuracy of the fused motion state in the tunnel. The calculation process is as follows:
S31. Set the trust degrees of the information from the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments to W = {ω1, ω2, ω3, ω4}, where ωi (i = 1, 2, 3, 4) is the absolute attribute difference of the individual data;
S32. Calculate the conflict between any two groups of data, where m(F) is the confidence of a single group of data;
S33. Calculate the similarity between any two groups of data;
S34. Calculate the conflict ratio factor;
S35. Calculate the average conflict coefficient;
S36. Calculate the total weight factor ω* = m × (k*)^α × min(ωi | i = 1, 2, 3, 4), where α is an adjustment factor;
S37. Apply a first adjustment to the trust degrees of all data;
S38. Repeat operations S32 to S37; when the total trust degree W′ exceeds the set threshold, the motion state of the UAV in the tunnel is a safe operating state.
Specifically, a state-space model is built from the dynamics of the UAV, together with the observation models of the RGB-D camera and the MEMS IMU. Median filtering, connected-component filtering, dilation and image thinning are applied to the color images captured by the RGB-D camera to obtain groups of feature points in the tunnel; the depth data corresponding to these feature points yield the 3D point clouds of two adjacent frames with matching relationships, and the extreme learning machine algorithm produces the attitude and position parameter estimates of the UAV as observations of the UAV state-space model. These observations are fused with the observations provided by the MEMS inertial sensor; the fused motion-state parameters are compared with the motion-state parameters obtained from the 3D engine for a first correction of the UAV's flight state and flight path, and during flight the fixed tag positions on each ring of tunnel segments are used for a second correction of the motion state and flight path, guiding the UAV to perform the inspection along the 3D-engine roaming route.
In the above steps, the algorithm by which median filtering, connected-component filtering, dilation and image thinning obtain the groups of feature points in the tunnel is as follows:
The color images captured by the RGB-D camera are processed as follows:
S301. A median filter is used to denoise the image.
Let the filter window be W; the gray values {x(i, j), (i, j) ∈ I²} of the image are filtered as
y(i, j) = Med_W{x(i, j)} = Med{x(i+r, j+s) : (r, s) ∈ W}, (i, j) ∈ I²,
where y(i, j) is the filtered value.
S302. A Butterworth high-pass filter is used to enhance the image in the frequency domain.
The transfer function of an n-th order Butterworth high-pass filter is
H(u, v) = 1 / (1 + [D0 / D(u, v)]^(2n)),
where D0 is the cutoff frequency and D(u, v) is the distance from point P(u, v) to the origin of the frequency plane; D0 is the distance at which H(u, v) falls to a set fraction of its maximum value.
S303. Motion blur present in the image obtained in S302 is removed by inverse filtering.
S304. The Hilditch thinning algorithm is used to thin the image obtained in S303.
The method is as follows: let the background value be 0 and the foreground value be 1, and use the 8-neighborhood around the central point P0:
P8 | P1 | P2 |
P7 | P0 | P3 |
P6 | P5 | P4 |
B(P0) denotes the number of non-zero pixels in the 8-neighborhood of P0;
A(P0) denotes the number of 0-to-1 transitions in the clockwise sequence P1-P8 of the 8-neighborhood of P0.
The thinning conditions are:
2 <= B(P0) <= 6
A(P0) = 1
P2·P4·P8 = 0 or A(P2) != 1
P2·P4·P6 = 0 or A(P4) != 1
When the conditions are met, P0 is set to 0. The pixels are traversed from left to right and from top to bottom until no pixel satisfies the above thinning conditions.
【Step S4】The video captured during inspection by the RGB-D camera is transmitted to the ground workstation through the video transmission module; using the tunnel model built with BIM technology, the ground workstation texture-maps, stitches, matches and combines the video images (depth images and color images) with the model, finally obtaining a true 3D model of the inspection route, so that the actual condition of the route and any hidden tunnel defects can be clearly grasped.
In conclusion the unmanned plane tunnel autonomous method for inspecting proposed by the present invention based on BIM technology, can accelerate tunnel
The working efficiency of inspection, this has safely greatly synergism for growing subway tunnel.The present invention utilizes nobody
Machine carries out inspection, and the acquisition of easy to operate, data is quick, abundant information, cost are relatively low, and monitoring time substantially reduces.
The description and application of the invention herein are illustrative and are not intended to limit the scope of the invention to the above embodiments. Variations and changes of the embodiments disclosed herein are possible, and the replacement of parts of the embodiments with equivalent components is well known to those skilled in the art. Those skilled in the art will appreciate that, without departing from the spirit or essential characteristics of the present invention, the invention can be realized in other forms, structures, arrangements and proportions, and with other components, materials and parts. Other variations and changes may be made to the embodiments disclosed herein without departing from the scope and spirit of the present invention.
Claims (8)
1. An operating system for a UAV tunnel inspection method based on BIM technology, in which a ground workstation (1) is connected to an onboard processing unit (2), the onboard processing unit (2) is connected to an onboard sensor system (3) and an onboard control unit (4), and the onboard sensor system (3) recognizes the tags on the tunnel segments (5), characterized in that the concrete operating steps are as follows:
Step S1: a tunnel model is built with BIM technology and imported into the 3D display software of the ground workstation (1); a coordinate-system conversion method is used to obtain the relationship between the UAV navigation coordinate system and the camera coordinate system of the 3D engine in the 3D display software, establishing the correspondence between the UAV and the 3D-engine camera;
Step S2: in the 3D display software of the ground workstation, a camera is set to roam along the inspection route in the 3D engine; using the correspondence between the 3D-engine roaming trajectory parameters and the 3D-engine camera, the roaming trajectory is converted into UAV motion-trajectory parameters through the coordinate-system conversion method, and at the same time ray casting is used during the 3D-engine roaming to examine the tagged objects in the current viewport, so that they can be cross-checked against the tag information acquired by the UAV to further obtain the attitude and position parameters of the UAV along the inspection route; the tags are position markers;
Step S3: the UAV obtains its motion state in the tunnel through the segment-tag information fusion algorithm combining the 3D-engine camera of the ground workstation (1), the RGB-D camera and MEMS IMU of the onboard sensor system (3), and the tags installed on the tunnel segments (5);
Step S4: the video captured during inspection by the RGB-D camera of the onboard sensor system (3) is transmitted to the ground workstation (1) through the video transmission module; using the tunnel model built with BIM technology, the ground workstation (1) texture-maps, stitches, matches and combines the video images with the model, finally obtaining a true 3D model of the inspection route, so that the actual condition of the route and any hidden tunnel defects can be clearly grasped; the video images include depth images and color images.
2. The UAV tunnel inspection method based on BIM technology according to claim 1, characterized in that: in step S1, the RGB-D camera measures the UAV's left, right and upward distances from the tunnel; the UAV is adjusted to the same position as the 3D-engine roaming start coordinates so that the two coordinate systems coincide, and the rotation between the two coordinate systems is then carried out using quaternions.
3. The UAV tunnel inspection method based on BIM technology according to claim 1, characterized in that: in step S2, the tags are position markers.
4. The UAV tunnel inspection method based on BIM technology according to claim 1, characterized in that: in step S3, the RGB-D camera acquires two-dimensional color images and range data of the tunnel interior, from which the attitude and position parameter estimates of the UAV are obtained with the extreme learning machine algorithm; the MEMS IMU provides inertial measurements, and the attitude and position estimates from the RGB-D camera are fused with the IMU measurements through the filtering algorithm; the trajectory parameters obtained from the 3D-engine camera are then used for a first correction of the UAV motion state, and the tag information obtained by the tag recognizer is used for a second correction, yielding a more accurate UAV motion state with which the UAV is autonomously controlled to fly along the inspection route.
5. The UAV tunnel inspection method based on BIM technology according to claim 1, characterized in that: in step S3, the information fusion algorithm is as follows:
The UAV processes the information from the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tags installed on the tunnel segments, and uses the trust-degree-based DS evidence theory algorithm to calculate the accuracy of the fused motion state in the tunnel. The calculation process is as follows:
S31. Set the trust degrees of the segment-tag information from the 3D-engine camera, the RGB-D camera, the MEMS IMU and the tunnel segments to W = {ω1, ω2, ω3, ω4}, where ωi (i = 1, 2, 3, 4) is the absolute attribute difference of the individual data;
S32. Calculate the conflict between any two groups of data, where m(F) is the confidence of a single group of data;
S33. Calculate the similarity between any two groups of data;
S34. Calculate the conflict ratio factor;
S35. Calculate the average conflict coefficient;
S36. Calculate the total weight factor ω* = m × (k*)^α × min(ωi | i = 1, 2, 3, 4), where α is an adjustment factor;
S37. Update and adjust the trust degrees of all data;
S38. Repeat steps S32 to S37; when the total trust degree W′ exceeds the set threshold, the motion state of the UAV in the tunnel is a safe operating state.
6. The UAV tunnel inspection method based on BIM technology according to claim 1, characterized in that: in step S3, a state-space model is built from the dynamics of the UAV, together with the observation models of the RGB-D camera and the MEMS IMU; median filtering, connected-component filtering, dilation and image thinning are applied to the color images captured by the RGB-D camera to obtain groups of feature points in the tunnel; the depth data corresponding to these feature points yield the 3D point clouds of two adjacent frames with matching relationships, and the extreme learning machine algorithm produces the attitude and position parameter estimates of the UAV as observations of the UAV state-space model; these observations are fused with the observations provided by the MEMS inertial sensor, the fused motion-state parameters are compared with the motion-state parameters obtained from the 3D engine for a first correction of the UAV's flight state and flight path, and during flight the fixed tag positions on each ring of tunnel segments are used for a second correction of the motion state and flight path, guiding the UAV to perform the inspection along the 3D-engine roaming route.
7. The UAV tunnel inspection method based on BIM technology according to claim 5, characterized in that: in step S3, the algorithm by which median filtering, connected-component filtering, dilation and image thinning obtain the groups of feature points in the tunnel is as follows:
The color images captured by the RGB-D camera are processed as follows:
S301. A median filter is used to denoise the image;
Let the filter window be W; the gray values {x(i, j), (i, j) ∈ I²} of the image are filtered as
y(i, j) = Med_W{x(i, j)} = Med{x(i+r, j+s) : (r, s) ∈ W}, (i, j) ∈ I²,
where y(i, j) is the filtered value;
S302. A Butterworth high-pass filter is used to enhance the image in the frequency domain;
The transfer function of an n-th order Butterworth high-pass filter is
H(u, v) = 1 / (1 + [D0 / D(u, v)]^(2n)),
where D0 is the cutoff frequency and D(u, v) is the distance from point P(u, v) to the origin of the frequency plane; D0 is the distance at which H(u, v) falls to a set fraction of its maximum value;
S303. Motion blur present in the image obtained in step S302 is removed by inverse filtering;
S304. The Hilditch thinning algorithm is used to thin the image obtained in step S303;
The method is as follows: let the background value be 0 and the foreground value be 1, and use the 8-neighborhood around the central point P0;
B(P0) denotes the number of non-zero pixels in the 8-neighborhood of P0;
A(P0) denotes the number of 0-to-1 transitions in the clockwise sequence P1-P8 of the 8-neighborhood of P0;
The thinning conditions are:
2 <= B(P0) <= 6
A(P0) = 1
P2·P4·P8 = 0 or A(P2) != 1
P2·P4·P6 = 0 or A(P4) != 1
When the conditions are met, P0 is set to 0; the pixels are traversed from left to right and from top to bottom until no pixel satisfies the above thinning conditions.
8. The UAV tunnel inspection method based on BIM technology according to claim 1, characterized in that: in step S4, the video captured during inspection by the RGB-D camera is transmitted to the ground workstation (1) through the video transmission module; using the tunnel model built with BIM technology, the ground workstation (1) texture-maps, stitches, matches and combines the video images with the model, finally obtaining a true 3D model of the inspection route, so that the actual condition of the route and any hidden tunnel defects can be clearly grasped; the video images include depth images and color images.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810065982.0A CN108415453A (en) | 2018-01-24 | 2018-01-24 | Unmanned plane tunnel method for inspecting based on BIM technology |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810065982.0A CN108415453A (en) | 2018-01-24 | 2018-01-24 | Unmanned plane tunnel method for inspecting based on BIM technology |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108415453A true CN108415453A (en) | 2018-08-17 |
Family
ID=63126237
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810065982.0A Pending CN108415453A (en) | 2018-01-24 | 2018-01-24 | Unmanned plane tunnel method for inspecting based on BIM technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108415453A (en) |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication |
 | SE01 | Entry into force of request for substantive examination |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20180817