NL2030333A - Lidar-based unmanned aerial vehicle bridge bottom detection system - Google Patents


Info

Publication number
NL2030333A
NL2030333A
Authority
NL
Netherlands
Prior art keywords
bridge
unmanned aerial
aerial vehicle
lidar
information
Prior art date
Application number
NL2030333A
Other languages
Dutch (nl)
Other versions
NL2030333B1 (en)
Inventor
Zhang Xiaoming
Zhong Sheng
Jiang Shengchuan
Original Assignee
Shanghai Tonglu Cloud Transp Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Tonglu Cloud Transp Technology Co Ltd filed Critical Shanghai Tonglu Cloud Transp Technology Co Ltd
Publication of NL2030333A publication Critical patent/NL2030333A/en
Application granted granted Critical
Publication of NL2030333B1 publication Critical patent/NL2030333B1/en


Classifications

    • G01S17/933 — Lidar systems specially adapted for anti-collision purposes of aircraft or spacecraft
    • G01S17/06 — Systems determining position data of a target
    • G01S17/08 — Systems determining position data of a target for measuring distance only
    • G01B11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01C21/165 — Navigation by dead reckoning (inertial navigation) combined with non-inertial navigation instruments
    • G01M5/0008 — Investigating the elasticity of structures, e.g. deflection, of bridges
    • G01M5/005 — Investigating elasticity by determining deflection or stress by means of external apparatus, e.g. test benches or portable test systems
    • G01N21/8851 — Scan or image signal processing specially adapted for detecting flaws or contamination
    • G01N21/9515 — Investigating the presence of flaws in objects of complex shape, e.g. examined with use of a surface follower device
    • G01S17/894 — 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S17/90 — Lidar systems for mapping or imaging using synthetic aperture techniques
    • G01S19/42 — Determining position using signals of a satellite radio beacon positioning system, e.g. GPS, GLONASS or GALILEO
    • G08C17/02 — Arrangements for transmitting signals by a wireless electrical link using a radio link

Abstract

The present invention relates to a LiDAR-based unmanned aerial vehicle bridge bottom detection system, including an unmanned aerial vehicle, a LiDAR acquisition and navigation module, an image acquisition module, a transmission module, and a ground control center. The unmanned aerial vehicle acquires on-site information through the carried LiDAR acquisition and navigation module and the image acquisition module, and transmits the acquired information to the ground control center through the transmission module. The bridge bottom detection system of the present invention uses the unmanned aerial vehicle as a platform; it has low cost, high efficiency, high reliability, and great flexibility; the detection time is not limited; and the flight safety of the unmanned aerial vehicle is improved through the detection of the obstacle information. Compared with conventional detection, the device of the present invention can acquire more comprehensive bridge information.

Description

LIDAR-BASED UNMANNED AERIAL VEHICLE BRIDGE BOTTOM DETECTION SYSTEM
TECHNICAL FIELD
The present invention relates to the technical field of rapid bridge inspection, and particularly relates to a LiDAR-based unmanned aerial vehicle bridge bottom detection system.
BACKGROUND ART
The monitoring and analysis of bridge bottom conditions is a basic task of bridge health monitoring. Existing bridge bottom detection technologies are mainly divided into manual detection and unmanned aerial vehicle detection. Manual detection relies on bridge inspection experts who are transported to the bridge bottom by large machinery such as a bridge inspection vehicle, find cracks with the naked eye, and measure the crack size with a crack observation instrument or ruler. Manual detection is inefficient and costly, and has potential safety hazards. Unmanned aerial vehicle detection usually uses a video image method, which is greatly affected by positioning accuracy and can hardly identify microscopic damage. Unmanned aerial vehicles flying under bridges also face many problems, such as GPS signal loss, inaccurate positioning, remote control signal loss due to occlusion, and obstacle avoidance; thus, the unmanned aerial vehicles need other sensors to position themselves, and meanwhile need improved autonomous flight ability to avoid hitting obstacles or crashing when flying under bridges.
Therefore, how to detect and position a bridge bottom safely, accurately, and efficiently is a problem to be solved urgently.
SUMMARY
An objective of the present invention is to provide an improved LiDAR-based unmanned aerial vehicle bridge bottom detection system to realize intelligent, convenient, and rapid bridge bottom inspection.
In order to achieve the above objective, the present invention provides the following solutions.
A LiDAR-based unmanned aerial vehicle bridge bottom detection system includes:
an unmanned aerial vehicle, configured to complete a flight function and carry other modules;
a LiDAR acquisition and navigation module, including a GPS unit and an inertia measurement unit, and configured to position the unmanned aerial vehicle and acquire obstacle information and three-dimensional bridge information;
an image acquisition module, including a wide-angle camera and a memory, and configured to photograph the condition of a detection area and store the photos;
a transmission module, configured to transmit the obstacle information, a bridge dynamic variation, and the image information acquired by the image acquisition module; and
a ground control center, configured to process the received information, plan a flight route, and detect and assess an abnormal area,
wherein the unmanned aerial vehicle acquires on-site information through the carried LiDAR acquisition and navigation module and the image acquisition module, and transmits the acquired information to the ground control center through the transmission module.
Preferably, the inertia measurement unit is configured to scan three-dimensional point cloud information of areas including a bridge, a bridge bottom surface, and a bearing, and generate three-dimensional point cloud data.
Preferably, the LiDAR acquisition and navigation module further includes a data processing unit configured to detect the three-dimensional point cloud data obtained through scanning by the inertia measurement unit.
Preferably, the data processing unit uses a noise reduction method to detect the three-dimensional point cloud data, and is configured to position length, width, depth, height, and location information of the bridge to obtain the obstacle information and the bridge dynamic variation.
Preferably, the ground control center includes a processing unit, a detection positioning unit, and an abnormality assessment unit, wherein the processing unit is configured to correct the received image, and plan a flight route for the unmanned aerial vehicle according to the received obstacle information; the detection positioning unit is configured to position the detected abnormal area; and the abnormality assessment unit is configured to assess the abnormal condition of the abnormal area.
Preferably, the processing unit corrects the received image, removes motion blur and enhances the image, and splices the acquired images of areas between two bridge piers together to form a large-scale high-definition image.
Preferably, the detection positioning unit is configured to detect problems such as pier column defects, column tilt, cracks on the bridge bottom surface, and an uneven expansion surface in the large-scale high-definition image, and position the problems.
Preferably, the abnormality assessment unit measures the cracks and analyzes the density according to the problems of the bridge to finally obtain the type of failure of the bridge.
The present invention has the following beneficial effects.
(1) The bridge bottom detection system of the present invention uses the unmanned aerial vehicle as a platform, and has low cost, high efficiency, high reliability, and great flexibility; the detection time is not limited; and meanwhile, the flight safety of the unmanned aerial vehicle is improved through the detection of the obstacle information.
(2) The system of the present invention uses laser to scan three-dimensional information of the bridge to obtain three-dimensional point cloud data, can realize multidimensional detection of the bridge by processing the three-dimensional point cloud data, and can accurately position specific defects of the bridge.
(3) The system of the present invention can dynamically detect a bridge variation by comparing previous data with present data so as to effectively guide workers to perform maintenance operations.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to describe the technical solutions in the embodiments of the present invention or the prior art more clearly, the drawings required to be used in the embodiments will be briefly introduced below. Obviously, the drawings in the following description are only some of the embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a schematic diagram of the modules of a system according to the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some but not all of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the scope of protection of the present invention.
In order to make the above objective, features, and advantages of the present invention more obvious and understandable, the present invention will further be described below in detail with reference to the drawings and specific implementation modes.
As shown in FIG. 1, a LiDAR-based unmanned aerial vehicle bridge bottom detection system includes:
an unmanned aerial vehicle, configured to complete a flight function and carry other modules;
a LiDAR acquisition and navigation module, including a GPS unit and an inertia measurement unit, and configured to position the unmanned aerial vehicle and acquire obstacle information and three-dimensional bridge information, the LiDAR acquisition and navigation module being a solid-state LIDAR device;
an image acquisition module, including a wide-angle camera and a memory, and configured to photograph the condition of a detection area and store the photos;
a transmission module, configured to transmit the obstacle information, a bridge dynamic variation, and the image information acquired by the image acquisition module; and
a ground control center, configured to process the received information, plan a flight route, and detect and assess an abnormal area,
wherein the unmanned aerial vehicle acquires on-site information through the carried solid-state LIDAR device and the wide-angle camera, and transmits the acquired information to the ground control center through the transmission module.
First, the wide-angle camera is mounted on the top of the unmanned aerial vehicle by using a dedicated fixing support and a bolt, and then the solid-state LIDAR device is fixed to the support. It is necessary to check whether the scanning range of the solid-state LIDAR device is obscured by the body of the unmanned aerial vehicle; if so, the angle of inclination of the solid-state LIDAR device is appropriately adjusted, or the entire device is moved outwards along the fixing support.
As a further optimized solution, the inertia measurement unit in the solid-state LIDAR device is configured to scan three-dimensional point cloud information of areas including a bridge, a bridge bottom surface, and a bearing, and generate three-dimensional point cloud data.
As a further optimized solution, the LiDAR acquisition and navigation module further includes a data processing unit configured to detect the three-dimensional point cloud data obtained through scanning by the inertia measurement unit. The data processing unit uses a noise reduction method to detect the three-dimensional point cloud data, and is configured to position length, width, depth, height, and location information of the bridge to obtain the obstacle information and the bridge dynamic variation.
There are two methods for checking whether the point cloud data obtained through scanning by the LIDAR device meet the detection requirements. One method includes the following steps. First, it is necessary to check the self-vibration condition of the LIDAR device when the unmanned aerial vehicle is stationary; self-vibration should be avoided when the device is running, because it is sufficient to interfere with subsequent experimental data. If the vibration does not meet the requirements, a cushioning material such as a rubber gasket is added between the LIDAR device and the fixing support to reduce the impact of the vibration. Then, it is necessary to verify the visible range and the distances between points of the three-dimensional point cloud data. The detection method is to move the unmanned aerial vehicle forwards for a short distance to obtain plane scanning data so as to determine the visible range of the LIDAR device. The other method includes the following steps: a user holds a sign made of a high-reflection material by hand, moves it back and forth along a scanning line, and manually observes the locations of points with high reflection values to determine the visible range of the point cloud data. Finally, it is necessary to adjust the angle of the laser scanner in the LIDAR device to make the scanning line as perpendicular as possible to the central axis of the body of the unmanned aerial vehicle, and to calibrate the rolling angle of the laser scanner to zero as far as possible.
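The high-reflection sign check above can be sketched as a simple intensity filter. The sketch below assumes each lidar return is a (distance, intensity) pair and that the sign's returns stand out as intensity outliers; the data layout and the 200-count intensity threshold are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the visible-range check: returns reflected by the
# hand-held high-reflectance sign appear as intensity outliers, and the
# farthest such return bounds the usable scanning range. The threshold is
# an assumed, sensor-specific value.
def visible_range(returns, intensity_threshold=200):
    """returns: list of (distance_m, intensity) pairs for one scan line."""
    bright = [d for d, i in returns if i >= intensity_threshold]
    return max(bright) if bright else 0.0

scan = [(1.2, 80), (2.5, 95), (3.1, 240), (4.8, 230), (5.6, 60)]
print(visible_range(scan))  # farthest bright return: 4.8
```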
The total amount of the point cloud data obtained by the above method is too large; the point cloud data for about 10 meters generally exceed 500,000 points, and therefore, it is necessary to use an efficient noise reduction method to process the point cloud data. The adopted method is to use a statistical filter to process each frame of point cloud data. By comparison, a single frame of point cloud data generally does not exceed 4,500 points, and all the points lie in the same plane; thus, the data can be processed in order of location to avoid calculation difficulties caused by the disorder of the point cloud data. The principle of the noise reduction method adopted by the present invention is as follows: a specified number of neighborhood points are searched for each point, the average value of the distances from each point to its neighborhood points is calculated, and the mean value and the standard deviation of these average distances are calculated. Because the distance between two points in the point cloud data generally conforms to the Gaussian distribution, if the average distance from a certain point to its neighborhood points is greater than the maximum distance, the point is considered a noise point and is removed from the data. The calculation formula of the maximum distance is:
L = M + k·σ
where L is the maximum allowable distance between two points, M is the mean value of the average distances between points, k is a standard deviation amplification coefficient, and σ is the standard deviation of the average distances between points. In order to preserve the real data as far as possible, in the present embodiment, the value of k is 3.
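A minimal pure-Python sketch of this statistical filter, with k = 3 as in the embodiment: for each point the average distance to its nearest neighbours is computed, and points whose average exceeds L = M + k·σ are dropped. The brute-force neighbour search and the 2-D points are simplifications for illustration; a real pipeline would use a KD-tree and 3-D coordinates.

```python
import math

def statistical_filter(points, n_neighbors=8, k=3.0):
    """Remove points whose mean neighbour distance exceeds L = M + k*sigma."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Average distance from each point to its n nearest neighbours.
    avg = []
    for i, p in enumerate(points):
        d = sorted(dist(p, q) for j, q in enumerate(points) if j != i)
        avg.append(sum(d[:n_neighbors]) / n_neighbors)

    m = sum(avg) / len(avg)                                   # M
    sigma = math.sqrt(sum((a - m) ** 2 for a in avg) / len(avg))
    limit = m + k * sigma                                     # L = M + k*sigma
    return [p for p, a in zip(points, avg) if a <= limit]

# A dense 10x10 grid plus one far-away stray point: the stray is filtered out.
cloud = [(x * 0.1, y * 0.1) for x in range(10) for y in range(10)] + [(50.0, 50.0)]
print(len(statistical_filter(cloud)))  # 100
```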
The data processing unit uses the three-dimensional point cloud data to perform spatial measurement, positions the length, width, depth, height, and location information of the bridge, and compares the previous three-dimensional laser scanning data with the present three-dimensional laser scanning data to obtain the bridge dynamic variation.
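One plausible way to quantify such a dynamic variation, sketched here under the assumption that both scans are already registered in the same coordinate frame: for each point of the present scan, take the distance to the nearest point of the previous scan, and report the largest residual. The names and the brute-force search are illustrative only.

```python
import math

def scan_deviation(previous, present):
    """Largest nearest-neighbour residual of the present scan vs. the previous one."""
    def nearest(p, cloud):
        return min(math.dist(p, q) for q in cloud)
    return max(nearest(p, previous) for p in present)

# Toy example: the bottom surface line has sagged by 2 cm between surveys.
old = [(float(x), 0.00) for x in range(10)]
new = [(float(x), -0.02) for x in range(10)]
print(round(scan_deviation(old, new), 3))  # 0.02
```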
The transmission module transmits the obstacle information, the bridge dynamic variation, and the image information acquired by the image acquisition module to the ground control center by means of wireless communication such as 4G, 5G, and Wi-Fi networks.
As a further optimized solution, the ground control center includes a processing unit, a detection positioning unit, and an abnormality assessment unit, wherein the processing unit is configured to correct the received image, and plan a flight route for the unmanned aerial vehicle according to the received obstacle information; the detection positioning unit is configured to position the detected abnormal area; and the abnormality assessment unit is configured to assess the abnormal condition of the abnormal area.
As a further optimized solution, the processing unit corrects the received image, removes motion blur and enhances the image, and splices the acquired images of areas between two bridge piers together to form a large-scale high-definition image.
As a further optimized solution, the detection positioning unit is configured to detect problems such as pier column defects, column tilt, cracks on the bridge bottom surface, and an uneven expansion surface in the large-scale high-definition image, and position the problems.
As a further optimized solution, the abnormality assessment unit measures the cracks and analyzes the density according to the problems of the bridge to finally obtain the type of failure of the bridge.
First, the ground control center plans a flight route for the unmanned aerial vehicle according to the received obstacle information, and performs obstacle avoidance and route planning according to the acquired location and present posture of the unmanned aerial vehicle and the point cloud data of the obstacles 360° around the unmanned aerial vehicle that are acquired by the LIDAR device. The unmanned aerial vehicle will inspect the obstacle-free areas between the two bridge piers according to the planned route.
If there is a complex obstacle, the ground control center uses the LIDAR device to calculate the mileage and position the unmanned aerial vehicle, achieves obstacle avoidance under the condition of accurate positioning of the unmanned aerial vehicle, and plans an optimal flight route.
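The route check behind this obstacle avoidance step can be illustrated as follows: a planned route is acceptable only if no lidar obstacle point lies within a safety radius of any waypoint. The 1 m radius, the 2-D points, and all names are assumptions for the example, not parameters from the patent.

```python
import math

def route_is_safe(waypoints, obstacle_points, safety_radius=1.0):
    """True if no obstacle point lies within the safety radius of any waypoint."""
    return all(
        math.dist(w, o) >= safety_radius
        for w in waypoints
        for o in obstacle_points
    )

obstacles = [(5.0, 0.0)]                                     # one obstacle ahead
print(route_is_safe([(0.0, 0.0), (2.0, 0.0)], obstacles))    # True
print(route_is_safe([(0.0, 0.0), (4.5, 0.0)], obstacles))    # False
```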
Then, in order to obtain high-resolution images of the bridge bottom, a shooting distance needs to be maintained during the use of the wide-angle camera. Therefore, when flying, the unmanned aerial vehicle flies in a horizontal plane at a constant altitude to avoid unclear captured images caused by the unmanned aerial vehicle approaching the bridge bottom too closely or becoming unstable due to wind changes; the shooting distance is maintained at about 3 meters.
After receiving the captured images, the ground control center pre-processes the images: it mainly corrects the images, enhances the images, removes motion blur, and eliminates uneven lighting to obtain high-definition, distortion-free images of the bridge bottom. Because the visual range of each image captured by the unmanned aerial vehicle is limited, it is necessary to splice the images to obtain a wide-view image of the bridge bottom surface for failure detection.
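As a toy illustration of the lighting-equalization step, the sketch below rescales each grayscale tile to a common mean brightness before splicing. Real pre-processing would also undistort, deblur, and stitch the images (typically with a library such as OpenCV), so treat this purely as a minimal stand-in; the tile format and target mean are assumptions.

```python
def normalize_brightness(tiles, target_mean=128.0):
    """Rescale each flat grayscale tile so all tiles share one mean brightness."""
    out = []
    for tile in tiles:
        scale = target_mean / (sum(tile) / len(tile))
        out.append([min(255.0, px * scale) for px in tile])
    return out

dark, bright = [40.0] * 4, [200.0] * 4
for tile in normalize_brightness([dark, bright]):
    print(sum(tile) / len(tile))  # both tiles now average 128.0
```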
The relationship between the point cloud data and the image data at the time of shooting is used to acquire all image data of the areas between the two bridge piers. This step mainly relies on the synchronous scanning relationship between image shooting and the laser point cloud. Each captured image can be matched to a series of laser point clouds, and the distance between a photographed area and the bridge pier can be obtained through the point clouds. Based on a set safe distance between the unmanned aerial vehicle and the bridge pier, all images captured between two successive moments at which the unmanned aerial vehicle reaches the safe distance are the images of the areas between the two bridge piers, and a collection of many images is obtained.
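The grouping rule described above can be sketched as follows, assuming every captured frame is already paired with its lidar-derived distance to the nearest pier: frames recorded between two close approaches to a pier form one span. The frame format and the 2 m safe distance are illustrative assumptions.

```python
def frames_between_piers(frames, safe_distance=2.0):
    """frames: list of (image_id, distance_to_pier_m). Returns one list of
    image ids per bridge span, split wherever the UAV reaches the safe distance."""
    spans, current = [], []
    for image_id, d in frames:
        if d <= safe_distance:            # at a pier: close the current span
            if current:
                spans.append(current)
                current = []
        else:
            current.append(image_id)
    if current:
        spans.append(current)
    return spans

flight = [("img1", 1.0), ("img2", 5.0), ("img3", 6.0), ("img4", 1.5), ("img5", 4.0)]
print(frames_between_piers(flight))  # [['img2', 'img3'], ['img5']]
```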
After a complete image of the bridge bottom is obtained, the detection positioning unit detects the image thoroughly to find problems such as pier column defects, column tilt, cracks on the bridge bottom surface, and an uneven expansion surface. Three-dimensional reconstruction can be performed on the scanned bridge by using the acquired point cloud data, and the detection positioning unit performs measurement according to the reconstructed model to detect defects on the bridge and the locations corresponding to the defects.
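As a placeholder for the defect search, the toy sketch below flags unusually dark pixels in a grayscale patch as candidate crack pixels and reports their ratio. A production detector would use edge analysis, comparison against the reconstructed 3-D model, or a learned model; the threshold and patch values are invented for the example.

```python
def crack_pixel_ratio(image, dark_threshold=60):
    """Fraction of pixels darker than the threshold in a 2-D grayscale patch."""
    pixels = [px for row in image for px in row]
    return sum(1 for px in pixels if px < dark_threshold) / len(pixels)

patch = [
    [200, 200, 200, 200],
    [200,  15,  20, 200],   # a dark streak: candidate crack pixels
    [200, 200, 200, 200],
]
print(crack_pixel_ratio(patch))  # 2 dark pixels out of 12
```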
Finally, the abnormality assessment unit determines the type of failure of the bridge according to the found defects and the positioning information, classifies potential safety hazards, and notifies management personnel to carry out repair work in time.
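The final classification step could be expressed as a rule table like the sketch below, mapping measured crack width and density onto a coarse maintenance priority. The thresholds and category names are invented for illustration and have no basis in the patent.

```python
def classify_hazard(max_crack_width_mm, crack_density_per_m2):
    """Map measured crack width and density to a coarse maintenance priority.
    All thresholds are hypothetical example values."""
    if max_crack_width_mm >= 0.3 or crack_density_per_m2 >= 5:
        return "repair immediately"
    if max_crack_width_mm >= 0.15 or crack_density_per_m2 >= 2:
        return "monitor closely"
    return "routine maintenance"

print(classify_hazard(0.4, 1.0))   # repair immediately
print(classify_hazard(0.1, 0.5))   # routine maintenance
```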
The present invention achieves the following beneficial effects.
(1) The bridge bottom detection system of the present invention uses the unmanned aerial vehicle as a platform, and has low cost, high efficiency, high reliability, and great flexibility; the detection time is not limited; and meanwhile, the flight safety of the unmanned aerial vehicle is improved through the detection of the obstacle information.
(2) The system of the present invention uses laser to scan three-dimensional information of the bridge to obtain three-dimensional point cloud data, can realize multidimensional detection of the bridge by processing the three-dimensional point cloud data, and can accurately position specific defects of the bridge. Compared with conventional detection, the device of the present invention can acquire more comprehensive bridge information.
(3) The system of the present invention can dynamically detect a bridge variation by comparing previous data with present data so as to effectively guide workers to perform maintenance operations.
The above embodiments are only descriptions of the preferred modes of the present invention, and are not intended to limit the scope of the present invention. Various modifications and improvements made by those of ordinary skill in the art to the technical solutions of the present invention without departing from the design spirit of the present invention shall fall within the scope of protection determined by the claims of the present invention.

Claims (8)

1. A LiDAR-based unmanned aerial vehicle bridge bottom detection system, characterized in that it comprises: an unmanned aerial vehicle, configured to perform a flight function and to carry the other modules; a LiDAR acquisition and navigation module, comprising a GPS unit and an inertial measurement unit, and configured to position the unmanned aerial vehicle and to acquire obstacle information and three-dimensional bridge information; an image acquisition module, comprising a wide-angle camera and a memory, and configured to photograph the state of a detection area and to store the photographs; a transmission module, configured to transmit the obstacle information, a dynamic bridge variation and the image information acquired by the image acquisition module; and a ground control center, configured to process the received information, plan a flight route, and detect and assess an abnormal area, wherein the unmanned aerial vehicle acquires information on site through the on-board LiDAR acquisition and navigation module and the image acquisition module, and transmits the acquired information to the ground control center through the transmission module.

2. The LiDAR-based unmanned aerial vehicle bridge bottom detection system according to claim 1, characterized in that the inertial measurement unit is configured to scan three-dimensional point cloud information of areas comprising a bridge, a bridge bottom surface and a carrier, and to generate three-dimensional point cloud data.

3. The LiDAR-based unmanned aerial vehicle bridge bottom detection system according to claim 2, characterized in that the LiDAR acquisition and navigation module further comprises a data processing unit configured to detect the three-dimensional point cloud data obtained by scanning with the inertial measurement unit.
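The data processing unit of claim 3 inspects the scanned point cloud before any defect analysis. The patent does not name a specific noise suppression method, so the sketch below assumes a statistical outlier removal filter, a common way to denoise LiDAR returns: points whose mean distance to their nearest neighbours is unusually large are discarded.

```python
import numpy as np

def statistical_outlier_removal(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours
    is unusually large (a common LiDAR denoising step).
    points: (N, 3) array of x, y, z coordinates."""
    # Full pairwise distances are fine for small demo clouds;
    # a k-d tree would be used for real scans.
    diff = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    # Mean distance to the k nearest neighbours (index 0 is self, distance 0).
    knn = np.sort(dists, axis=1)[:, 1:k + 1]
    mean_knn = knn.mean(axis=1)
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= threshold]

# Dense cluster of returns plus one far-away noise point.
rng = np.random.default_rng(0)
cloud = rng.normal(0.0, 0.05, size=(100, 3))
cloud = np.vstack([cloud, [[5.0, 5.0, 5.0]]])
clean = statistical_outlier_removal(cloud, k=8)
```

The `k` and `std_ratio` parameters are illustrative defaults, not values from the patent; in practice they would be tuned to the scanner's point density.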
4. The LiDAR-based unmanned aerial vehicle bridge bottom detection system according to claim 3, characterized in that the data processing unit uses a noise suppression method to detect the three-dimensional point cloud data, and is configured to determine length, width, depth, height and location information of the bridge so as to obtain the obstacle information and the dynamic variation of the bridge.

5. The LiDAR-based unmanned aerial vehicle bridge bottom detection system according to claim 1, characterized in that the ground control center comprises a processing unit, a detection positioning unit and an anomaly assessment unit, wherein: the processing unit is configured to correct the received image and to plan a flight route for the unmanned aerial vehicle in accordance with the received obstacle information; the detection positioning unit is configured to locate the detected abnormal area; and the anomaly assessment unit is configured to assess the abnormal condition of the abnormal area.

6. The LiDAR-based unmanned aerial vehicle bridge bottom detection system according to claim 5, characterized in that the processing unit corrects the received image, removes motion blur, enhances the image, and merges the acquired images of areas between two bridge piers to obtain a large-scale high-definition image.

7. The LiDAR-based unmanned aerial vehicle bridge bottom detection system according to claim 5, characterized in that the detection positioning unit is configured to detect, in the large-scale high-definition image, problems such as bridge pier defects, column tilt, cracks in the bottom surface of the bridge and uneven expansion of the surface, and to locate the problems.
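The positioning step of claim 4, deriving length, width, height and location information of the bridge from the point cloud, can be approximated in the simplest case by axis-aligned extents and a centroid. The sketch below is an assumed illustration only; a real pipeline would work in a georeferenced frame built from the GPS and inertial measurement data.

```python
import numpy as np

def bridge_extents(points):
    """Axis-aligned extents and centre of a scanned point cloud:
    a crude stand-in for the positioning step of claim 4.
    Returns (spans, centre) where spans = (x span, y span, z span)."""
    mins = points.min(axis=0)
    maxs = points.max(axis=0)
    spans = maxs - mins            # x span ~ length, y span ~ width, z span ~ height
    centre = points.mean(axis=0)   # location estimate
    return spans, centre

# Four corner returns of a toy bridge deck (coordinates in metres).
deck = np.array([[0.0, 0.0, 10.0],
                 [30.0, 0.0, 10.0],
                 [30.0, 8.0, 12.0],
                 [0.0, 8.0, 12.0]])
spans, centre = bridge_extents(deck)
```

Comparing extents between successive scans would give the "dynamic variation of the bridge" that claim 4 mentions, though the patent does not specify how that variation is computed.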
8. The LiDAR-based unmanned aerial vehicle bridge bottom detection system according to claim 5, characterized in that the anomaly assessment unit measures the cracks and analyzes their density in accordance with the problems of the bridge, so as to finally determine the type of defect of the bridge.
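The density analysis of claim 8 can be illustrated with a toy binary crack mask: density is simply the fraction of pixels flagged as crack, and a severity grade follows from thresholds. Both the function names and the thresholds below are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def crack_density(mask):
    """Fraction of pixels flagged as crack in a binary defect mask,
    a simple density measure for grading an abnormal area."""
    return mask.sum() / mask.size

def classify_defect(density, minor=0.01, severe=0.05):
    """Illustrative severity thresholds (assumed, not from the patent)."""
    if density >= severe:
        return "severe"
    if density >= minor:
        return "minor"
    return "none"

# 100 x 100 mask with a 10-pixel-wide vertical crack: density 0.10.
mask = np.zeros((100, 100), dtype=bool)
mask[:, 45:55] = True
density = crack_density(mask)
```

In a full system the mask would come from crack detection on the stitched high-definition image of claim 6, and crack width and length measurements would feed the final defect-type decision.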
NL2030333A 2021-08-18 2021-12-29 Lidar-based unmanned aerial vehicle bridge bottom detection system NL2030333B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110949834.7A CN113640829A (en) 2021-08-18 2021-08-18 Unmanned aerial vehicle bridge bottom detection system based on LiDAR

Publications (2)

Publication Number Publication Date
NL2030333A true NL2030333A (en) 2023-02-27
NL2030333B1 NL2030333B1 (en) 2023-04-17

Family

ID=78422725

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2030333A NL2030333B1 (en) 2021-08-18 2021-12-29 Lidar-based unmanned aerial vehicle bridge bottom detection system

Country Status (2)

Country Link
CN (1) CN113640829A (en)
NL (1) NL2030333B1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108332926A (en) * 2018-01-05 2018-07-27 株洲时代电子技术有限公司 A kind of bridge cruising inspection system
CN109612427A (en) * 2019-01-16 2019-04-12 兰州交通大学 A kind of the unmanned plane highway bridge deformation detecting method and system of multi-sensor cooperation
CN109990778A (en) * 2019-04-11 2019-07-09 株洲时代电子技术有限公司 A kind of bridge pedestal inspection flight course planning method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106645205A (en) * 2017-02-24 2017-05-10 武汉大学 Unmanned aerial vehicle bridge bottom surface crack detection method and system
CN108629835B (en) * 2017-03-20 2021-10-01 哈尔滨工业大学 Indoor reconstruction method and system based on hyperspectral, true color image and point cloud complementation
CN108318499A (en) * 2018-01-05 2018-07-24 株洲时代电子技术有限公司 A kind of bridge method for inspecting
CN109492563A (en) * 2018-10-30 2019-03-19 深圳大学 A kind of tree species classification method based on unmanned plane Hyperspectral imaging and LiDAR point cloud
CN109541997B (en) * 2018-11-08 2020-06-02 东南大学 Spraying robot rapid intelligent programming method for plane/approximate plane workpiece
CN110176061A (en) * 2019-04-30 2019-08-27 中科恒运股份有限公司 Human body surface reconstructing method in a kind of three-dimensional reconstruction
CN110517193B (en) * 2019-06-28 2022-04-12 西安理工大学 Seabed sonar point cloud data processing method
KR102325501B1 (en) * 2019-07-11 2021-11-12 주식회사 아소아 Unmanned aerial vehicles and method for sensing and aboiding of obstacles and weather change thereof

Also Published As

Publication number Publication date
CN113640829A (en) 2021-11-12
NL2030333B1 (en) 2023-04-17
