CN113012292B - AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography - Google Patents


Info

Publication number
CN113012292B
CN113012292B (application CN202110467979.3A)
Authority
CN
China
Prior art keywords
model
target area
point cloud
construction
aerial vehicle
Prior art date
Legal status
Active
Application number
CN202110467979.3A
Other languages
Chinese (zh)
Other versions
CN113012292A (en)
Inventor
李俭楠 (Li Jiannan)
王迅 (Wang Xun)
吴斌 (Wu Bin)
张涛 (Zhang Tao)
Current Assignee
Zhaotong Liangfengtai Information Technology Co ltd
Original Assignee
Zhaotong Liangfengtai Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhaotong Liangfengtai Information Technology Co ltd
Priority to CN202110467979.3A
Publication of CN113012292A
Application granted
Publication of CN113012292B

Classifications

    • G06T17/10 Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes (under G06T17/00 Three dimensional [3D] modelling)
    • G06Q50/08 Construction (under G06Q50/00 Systems or methods specially adapted for specific business sectors)
    • G06T19/006 Mixed reality (under G06T19/00 Manipulating 3D models or images for computer graphics)
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image

Abstract

The invention provides an AR remote construction monitoring method and system based on unmanned aerial vehicle (UAV) aerial photography. A BIM model of the outdoor structure is generated from four data sources: a UAV oblique-photography model, UAV lidar mapping, on-site lidar mapping, and on-site high-definition video monitoring. The as-measured BIM model is then compared with the design-stage BIM model to calculate and analyse construction deviations and to generate an engineering project report. The method can also compute terrain data and a three-dimensional model of the site; combined with a GIS map, an engineer or supervisor wearing an AR device can identify and retrieve the relevant data of an area from its geographic position and building features, so that information is acquired and problems are found in real time. Areas with large deviations trigger a prompt for manual inspection, ensuring that highway construction quality is upheld.

Description

AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography
Technical Field
The invention relates to the technical field of building model measurement and calculation, in particular to an AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography.
Background
At present, most expressway construction sites are in the field, and existing information systems for expressway construction and supervision are weak. Work mostly depends on manual inspection or measurement: data synchronization is slow, problems cannot be found in time, site conditions cannot be obtained quickly, and quantity take-off is inefficient. Because measurement relies on manual comparison, data cannot be updated without delay and information acquisition is obstructed. Project management systems are filled in manually, so progress reports from the construction site cannot be synchronized immediately, which is inconvenient for supervision. Meanwhile, project managers and supervisors cannot track and monitor construction conditions in real time by information-technology means; site details can only be obtained by travelling to the outdoor construction site, and no adequate analysis system provides site information.
Disclosure of Invention
In order to overcome these technical defects, the invention aims to provide an AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography that improves the speed of acquiring remote construction information.
The invention discloses an AR remote construction monitoring method based on unmanned aerial vehicle aerial photography, comprising the following steps: calibrating a construction target area; photographing a region of interest within the construction target area with the unmanned aerial vehicle according to first flight parameters to obtain aerial oblique-model images; performing on-site lidar mapping of the construction target area to obtain an on-site lidar point cloud; fusing the aerial oblique-model images with the on-site lidar point cloud to obtain a first model point cloud; performing UAV lidar mapping of the region of interest with the unmanned aerial vehicle according to second flight parameters to obtain a UAV lidar point cloud; video-recording the construction target area to obtain on-site high-definition video; and fusing the UAV lidar point cloud with the on-site high-definition video to obtain a second model point cloud;
registering the first model point cloud and the second model point cloud to obtain an initial fusion model, the registration accuracy being higher than a preset registration accuracy; performing aerial triangulation on the initial fusion model and then importing it into a BIM system to obtain an actually measured BIM model;
calculating progress data of the actually measured BIM model to generate a construction progress report of the construction target area; comparing the actually measured BIM model with a design BIM model to obtain a model deviation value, and importing the actually measured BIM model and the model deviation into a GIS (geographic information system) to obtain a geographic live-action map of the construction target area; and acquiring the construction progress report and the geographic live-action map through an AR device.
Preferably, photographing the region of interest within the construction target area with the first flight parameters to obtain the aerial oblique-model images includes: ensuring that the position of each fixed-point exposure is accurate during shooting. Performing UAV lidar mapping of the region of interest with the second flight parameters to obtain the UAV lidar point cloud includes: erecting a ground GPS base station and calibrating the UAV compass against the GPS signal.
Preferably, performing on-site lidar mapping of the construction target area to obtain the on-site lidar point cloud comprises: setting up a plurality of survey stations at intervals of a first preset distance within the construction target area, each station deploying a lidar for mapping; the first preset distance between adjacent stations is smaller than the stations' scanning range, so that the areas scanned by neighbouring stations overlap.
Preferably, fusing the aerial oblique-model images with the on-site lidar point cloud to obtain the first model point cloud comprises: adjusting the aerial images, including brightness, saturation, and contrast adjustment; performing aerial triangulation on the adjusted images to determine the position and orientation at the time of shooting; extracting feature points from the images and matching them; connecting the feature points; bringing all aerial oblique-model images of the construction target area into a unified object coordinate system; completing the aerial triangulation computation to generate a sparse three-dimensional point cloud; and fusing the sparse three-dimensional point cloud with the on-site lidar point cloud to obtain the first model point cloud.
Preferably, performing UAV lidar mapping of the region of interest within the construction target area with the second flight parameters to obtain the UAV lidar point cloud includes:
acquiring radar mapping data with the UAV lidar; jointly solving the radar mapping data with the GPS data of a ground base station to obtain, for each event point, UAV position and attitude information accurate enough for point-cloud processing; solving an attitude-referenced point cloud with a position and orientation system; filtering noise points from the point cloud with a preset elevation-limit threshold; and converting the point cloud into the same coordinate system as the aerial oblique-model images.
Preferably, calculating progress data of the actually measured BIM model to generate a construction progress report of the construction target area, comparing the actually measured BIM model with the design BIM model to obtain a model deviation value, and importing the actually measured BIM model and the model deviation into a GIS system to obtain a geographic live-action map of the construction target area further comprises:
comparing the model deviation value with a first preset deviation value; only when the model deviation value is smaller than the first preset deviation value is the progress data of the actually measured BIM model calculated to generate the construction progress report, and the actually measured BIM model and the model deviation imported into the GIS system to obtain the geographic live-action map of the construction target area.
Preferably, comparing the model deviation value with the first preset deviation value further comprises: comparing the model deviation value with a second preset deviation value, and, when the model deviation value is greater than the second preset deviation value, highlighting the area corresponding to that deviation.
Preferably, acquiring the construction progress report and the geographic live-action map through the AR device further comprises: acquiring the construction parameters, environmental parameters, and on-site high-definition video of the construction target area through the AR device.
Preferably, the first flight parameters comprise the number of flight paths, flight-path direction, course (along-track) overlap, side (cross-track) overlap, flight speed, flight altitude, gimbal resolution, and gimbal lens angle; the second flight parameters comprise the number of flight paths, flight-path direction, area overlap, flight speed, and flight altitude. The course overlap is 75%, the side overlap is 65%, and the flight altitude is 110 m.
The invention also discloses an AR remote construction monitoring system based on unmanned aerial vehicle aerial photography, comprising an unmanned aerial vehicle, a model fusion module, a model registration module, a model preprocessing module, a model analysis module, and an AR device.
After the construction target area is calibrated, the unmanned aerial vehicle photographs the region of interest within the construction target area according to the first flight parameters to obtain aerial oblique-model images; on-site lidar mapping of the construction target area yields an on-site lidar point cloud; the model fusion module fuses the aerial oblique-model images with the on-site lidar point cloud to obtain a first model point cloud; the unmanned aerial vehicle performs UAV lidar mapping of the region of interest according to the second flight parameters to obtain a UAV lidar point cloud; video of the construction target area is captured to obtain on-site high-definition video; and the model fusion module fuses the UAV lidar point cloud with the on-site high-definition video to obtain a second model point cloud.
The model registration module registers the first and second model point clouds to obtain an initial fusion model, with a registration accuracy higher than a preset registration accuracy; the initial fusion model is imported into the model preprocessing module for aerial triangulation and then into a BIM system to obtain an actually measured BIM model.
The model analysis module calculates progress data of the actually measured BIM model to generate a construction progress report of the construction target area, and compares the actually measured BIM model with the design BIM model to obtain a model deviation value; the actually measured BIM model and the model deviation are imported into a GIS (geographic information system) to obtain a geographic live-action map of the construction target area, and the construction progress report and the geographic live-action map are acquired through the AR device.
After adopting the above technical scheme, compared with the prior art the invention has the following beneficial effects:
1. An outdoor building BIM model is generated from four means: the UAV oblique model, UAV lidar mapping, on-site lidar mapping, and on-site high-definition video monitoring. This model is compared with the design-stage BIM model to calculate and analyse construction deviations and generate an engineering project report. The method also computes terrain data and a three-dimensional model of the site; combined with a GIS map, an engineer or supervisor wearing an AR device can identify and retrieve the relevant data of an area from its geographic position and building features, acquiring information and finding problems in real time. Areas with large deviation trigger a prompt for manual inspection, ensuring that highway construction quality is upheld;
2. The invention scans dense point clouds with high-speed three-dimensional lasers and is suitable for harsh environments. Modelling produces a true-to-life copy of the construction site, so communication between managers and on-site workers is smooth. With the UAV, the site does not need to be cleared during measurement, and the construction schedule is not disturbed.
Drawings
FIG. 1 is a flow chart of an AR remote construction monitoring method based on unmanned aerial vehicle aerial photography provided by the invention;
FIG. 2 is a block diagram of a specific implementation flow of the AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography provided by the present invention.
Detailed Description
The advantages of the invention are further illustrated by the following detailed description of the preferred embodiments in conjunction with the drawings.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
In the description of the present invention, it is to be understood that the terms "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on those shown in the drawings, and are merely for convenience of description and simplicity of description, but do not indicate or imply that the device or element referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, are not to be construed as limiting the present invention.
In the description of the present invention, unless otherwise specified and limited, it should be noted that the terms "mounted," "connected," and "connected" are to be interpreted broadly, and may be, for example, a mechanical connection or an electrical connection, a communication between two elements, a direct connection, or an indirect connection through an intermediate medium, and those skilled in the art will understand the specific meaning of the terms as they are used in the specific case.
In the following description, suffixes such as "module", "component", or "unit" are used to denote elements only to facilitate the description of the invention, and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
Referring to FIGS. 1-2, the invention discloses an AR remote construction monitoring method based on unmanned aerial vehicle aerial photography, preferably applied to a highway construction site. First, a construction target area is calibrated, and several radar survey stations are set up inside it for radar mapping. A UAV oblique model, a UAV lidar mapping point cloud, an on-site lidar mapping point cloud, and on-site high-definition monitoring video are then acquired by UAV aerial photography and field measurement.
The unmanned aerial vehicle photographs the region of interest within the construction target area according to the first flight parameters to obtain aerial oblique-model images. First the UAV compass is calibrated; then the flight path and aircraft parameters are set for the target survey area according to terrain conditions, including the number of flight lines, line direction, area overlap, flight speed, flight altitude, gimbal resolution, and gimbal lens angle. The course overlap is usually 75%, the side overlap usually 65%, and the flight altitude usually 110 m, all adjusted to actual conditions. The UAV's flight altitude and the camera's shooting angle are then adjusted, and the region of interest is photographed manually. The first flight parameters are the UAV's flight parameters while capturing the oblique-model images. During shooting, the position of each fixed-point exposure must be accurate.
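As a rough illustration of how the overlap and altitude figures above translate into an exposure plan, the sketch below computes photo and flight-line spacing for a nadir pinhole camera. The sensor and focal-length figures are assumptions; only the 75%/65% overlaps and the 110 m altitude come from the text.

```python
# Hypothetical camera figures; only the overlaps (75% / 65%) and the
# 110 m altitude come from the patent text.
ALTITUDE_M = 110.0
FOCAL_MM = 8.8          # assumed focal length
SENSOR_W_MM = 13.2      # assumed sensor width (along-track)
SENSOR_H_MM = 8.8       # assumed sensor height (cross-track)
FORWARD_OVERLAP = 0.75  # course (along-track) overlap
SIDE_OVERLAP = 0.65     # side (cross-track) overlap

def ground_footprint(altitude_m, focal_mm, sensor_mm):
    """Ground coverage of one image dimension for a nadir pinhole camera."""
    return altitude_m * sensor_mm / focal_mm

def spacing(footprint_m, overlap):
    """Distance between exposures (or between flight lines) for a given overlap."""
    return footprint_m * (1.0 - overlap)

photo_spacing = spacing(ground_footprint(ALTITUDE_M, FOCAL_MM, SENSOR_W_MM),
                        FORWARD_OVERLAP)   # along-track exposure spacing
line_spacing = spacing(ground_footprint(ALTITUDE_M, FOCAL_MM, SENSOR_H_MM),
                       SIDE_OVERLAP)       # cross-track flight-line spacing
```

With these assumed figures, exposures fall roughly 41 m apart along each line and the lines sit roughly 38 m apart; any real plan would substitute the actual camera's values.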
Because aerial photographs are affected by weather, illumination, and other factors, the images may suffer from inconsistent brightness and lack of sharpness, which leads to subsequent image-matching errors. The aerial images therefore need preprocessing; in one embodiment, Photoshop software is used to adjust brightness, saturation, contrast, and similar properties so that the images are sharp and consistent in brightness.
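A minimal stand-in for the manual adjustment described above (the patent uses Photoshop; this numpy sketch applies the same kinds of linear brightness, contrast, and saturation adjustments, with parameter names that are purely illustrative):

```python
import numpy as np

def adjust_image(img, brightness=0.0, contrast=1.0, saturation=1.0):
    """Linear brightness/contrast/saturation adjustment on a float RGB
    image with values in [0, 1]. A simplified stand-in for the manual
    Photoshop adjustments the patent describes."""
    img = np.asarray(img, dtype=np.float64)
    # contrast about mid-grey, then a brightness offset
    out = (img - 0.5) * contrast + 0.5 + brightness
    # saturation: blend each pixel with its luminance
    luma = out @ np.array([0.299, 0.587, 0.114])
    out = luma[..., None] + (out - luma[..., None]) * saturation
    return np.clip(out, 0.0, 1.0)
```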
After preprocessing, aerial triangulation is performed on the image data to determine the position and orientation of each image at the time of shooting, i.e. the interior and exterior orientation elements of the photos. Feature points are extracted and matched, the feature points are connected, and all images of the surveyed area are brought into a unified object coordinate system. When the aerial triangulation computation is complete, a sparse three-dimensional point cloud is generated.
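The feature-matching step before triangulation can be sketched as nearest-neighbour descriptor matching with a ratio test; the descriptors and threshold below are illustrative assumptions, not the patent's actual matcher:

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbour descriptor matching with a ratio test: keep a
    match only when the best candidate is clearly closer than the
    second-best. A minimal stand-in for the tie-point matching step."""
    # pairwise Euclidean distances between descriptor sets
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    best, second = order[:, 0], order[:, 1]
    rows = np.arange(len(desc_a))
    keep = d[rows, best] < ratio * d[rows, second]
    return [(int(i), int(best[i])) for i in np.where(keep)[0]]
```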
Because the aerial imagery suffers from distortion, holes, and similar problems in some detail areas and ground-object models, lidar data must be added for the region of interest to compensate for these model defects.
On-site lidar mapping of the construction target area is performed to obtain on-site lidar point cloud data. Survey stations are set up at intervals of a first preset distance within the construction target area, each deploying a lidar for scanning and mapping. Station placement depends on the lidar's scanning range and the required precision: the farther a point is from the scanner, the lower the point-cloud and ranging precision, so to acquire high-precision data the actual scanning distance of each station must be kept within the equipment's optimal range. To splice the point clouds conveniently and automatically, adjacent stations need a certain degree of overlap, which guarantees their connection. The first preset distance between adjacent stations is therefore smaller than the stations' scanning range, so that the areas scanned by neighbouring stations overlap.
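The station-spacing rule (spacing below the scan range so adjacent scans overlap) can be sketched as follows; the corridor length, scan range, and overlap margin are hypothetical values, not from the patent:

```python
def plan_stations(corridor_length_m, scan_range_m, overlap_m):
    """Place lidar survey stations along a corridor so that the spacing
    (the patent's "first preset distance") stays below the scan range
    and adjacent scans overlap by at least `overlap_m`. All numeric
    inputs are illustrative assumptions."""
    station_spacing = scan_range_m - overlap_m  # spacing < scan range => overlap
    assert 0 < station_spacing < scan_range_m
    n = int(corridor_length_m // station_spacing) + 1
    return [i * station_spacing for i in range(n)]
```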
The oblique model acquired by UAV aerial photography is fused with the on-site lidar point cloud data to obtain the first model point cloud: after the point cloud generated by the lidar is exported, its coordinate system is converted into that of the UAV oblique model, and the two data sets are then fused and modelled together.
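The coordinate conversion into the oblique-model frame is an ordinary rigid transform; the sketch below assumes the rotation and translation are already known, e.g. recovered from surveyed control points:

```python
import numpy as np

def transform_points(points, R, t):
    """Apply a rigid transform to an N x 3 point cloud, e.g. converting
    lidar coordinates into the oblique-model frame. R (3x3 rotation)
    and t (3-vector) are assumed known from control points."""
    points = np.asarray(points, dtype=np.float64)
    return points @ R.T + t
```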
The UAV performs UAV lidar mapping of the region of interest within the construction target area according to the second flight parameters to obtain the UAV lidar point cloud. For lidar precision, a ground GPS base station must be erected and the UAV compass calibrated first. The flight path and aircraft parameters are set for the target survey area according to terrain conditions, including the number of flight lines, line direction, area overlap, flight speed, flight altitude, gimbal resolution, and gimbal lens angle; the course overlap is usually 75%, the side overlap usually 65%, and the flight altitude usually 110 m, all adjusted to actual conditions. The flight altitude and camera angle are adjusted and the region of interest is captured manually. The second flight parameters are the UAV's flight parameters during UAV lidar mapping; during shooting, the position of each fixed-point exposure must be accurate. After scanning, the lidar scan control box is connected, the raw lidar data is exported with software, and the base-station data is read from the base-station control box. Once exported, the data is opened in software and checked for integrity.
The exported UAV POS data and the GPS data acquired by the base station are then jointly solved in software to obtain, for each event point, UAV position and attitude information accurate enough for point-cloud processing.
And carrying out video shooting on the construction target area to obtain a site high-definition video. And fusing the unmanned aerial vehicle laser radar point cloud and the field high-definition video to obtain a second model point cloud.
The UAV lidar point cloud is processed in lidar data-processing software, where a reasonable elevation limit is set to filter noise points. The filtered lidar point cloud is in a relative coordinate system, which must be converted into the same coordinate system as the UAV oblique model before the data can be fused and modelled.
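In the simplest reading, the elevation-limit noise filter reduces to dropping returns outside a preset height band; the threshold values here are project-specific assumptions:

```python
import numpy as np

def filter_by_elevation(points, z_min, z_max):
    """Drop lidar returns whose elevation falls outside a preset band,
    a simple reading of the patent's elevation-limit noise filtering.
    z_min and z_max are assumed, project-specific thresholds."""
    points = np.asarray(points, dtype=np.float64)
    z = points[:, 2]
    return points[(z >= z_min) & (z <= z_max)]
```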
The first and second model point clouds, built from the UAV oblique model, the UAV lidar mapping point cloud, the on-site lidar mapping point cloud, and the on-site high-definition monitoring video, are registered with a matching algorithm. The registration accuracy must exceed a preset registration accuracy, which is set high in order to guarantee registration quality.
First, the first and second model point clouds are coarsely registered manually in software; then the ICP algorithm performs fine registration, keeping the registration accuracy within the required range. The registered laser point cloud data and the UAV image data are then imported into software, where aerial triangulation is carried out to finally obtain the actually measured BIM model. This model resolves the distortion, deformation, and ground-object holes of the three-dimensional model and greatly improves its texture precision.
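The fine-registration step names ICP. One ICP iteration (nearest-neighbour correspondences followed by the SVD/Kabsch best-fit rigid transform) can be sketched as below; iterating until the residual falls below the required accuracy mirrors the described workflow. This is a generic textbook sketch, not the patent's actual implementation:

```python
import numpy as np

def icp_step(src, dst):
    """One ICP iteration: pair each source point with its nearest
    destination point, then solve the best-fit rigid transform via SVD
    (Kabsch). Returns the transformed source cloud and (R, t)."""
    d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
    nn = dst[np.argmin(d, axis=1)]          # nearest-neighbour correspondences
    mu_s, mu_d = src.mean(axis=0), nn.mean(axis=0)
    H = (src - mu_s).T @ (nn - mu_d)        # cross-covariance of centred clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return src @ R.T + t, R, t
```

In practice this loop runs inside point-cloud software after the manual coarse alignment, so the nearest-neighbour pairing starts from a reasonable initial guess.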
The actually measured BIM model is loaded into a BIM system, where the deviation between the design BIM model and the actually measured BIM model can be seen clearly. The deviation is calculated by model comparison; the deviation of the whole model can be computed, or a partial model surface can be specified manually. A deviation report is finally generated from the deviation data. The model can also be used to calculate construction progress and workload data, generating a construction progress report, a workload report, and a construction progress chart.
Importing the actual project-management progress report, workload report, and construction progress automatically generates an engineering management report.
The actually measured BIM model and the model deviation are imported into a GIS system to obtain a geographic live-action map of the construction target area. The project management report and the geographic live-action map are pushed to the AR device, so a supervisor can remotely view the construction progress report and the live-action map, acquiring information in a timely and accurate way.
After the deviation between the actually measured BIM model and the designed BIM model is calculated, the model deviation value is compared with a first preset deviation value, which is the allowable deviation threshold. When the model deviation value is smaller than the first preset deviation value, the progress data of the actually measured BIM model are calculated, data such as the construction progress report of the construction target area are generated, and the actually measured BIM model and the model deviation are imported into the GIS (geographic information system) to obtain the geographical live-action map of the construction target area. If the model deviation value is larger than the first preset deviation value, the cause needs to be analyzed, i.e., whether it results from a measurement error or a construction error.
Preferably, places with larger deviation values require focused attention: the model deviation value is further compared with a second preset deviation value, and when the model deviation value is larger than the second preset deviation value, the area corresponding to that deviation is highlighted to alert the supervisor.
The highlighting manner includes a color indication or a text indication.
It should be noted that the second preset deviation value is usually uniform across the whole model, i.e., the deviation standard is consistent over the whole construction area. For some special scenes, however, there are detail areas that need special attention; the second preset deviation value of such an area should differ from that of other areas and should be smaller, reflecting a stricter requirement.
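The two-threshold logic above can be summarized in a short sketch. The function name and the region-specific threshold table are illustrative assumptions, not terms used in the patent:

```python
def assess_deviation(dev, first_preset, second_preset_default,
                     region_presets=None, region=None):
    """Decide how to handle a model deviation value.

    first_preset          : allowable overall deviation threshold
    second_preset_default : highlight threshold used over the whole model
    region_presets        : optional {region_name: stricter_threshold} for
                            detail areas that need special attention
    Returns (proceed, highlight): proceed -> generate the progress report and
    GIS map; highlight -> mark the area for the supervisor.
    """
    second = second_preset_default
    if region_presets and region in region_presets:
        second = region_presets[region]   # smaller value = stricter requirement
    proceed = dev < first_preset          # otherwise analyse measurement vs construction error
    highlight = dev > second
    return proceed, highlight
```

For example, a deviation that is acceptable overall may still trigger highlighting inside a detail area whose second preset value is smaller.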
Preferably, the AR device can also be used to acquire the construction parameters, the environmental parameters, and the site high-definition video of the construction target area, including concrete temperature monitoring, environmental monitoring, and the like.
For a construction site, attention must be paid not only to the field earthwork and the mountain-development earthwork, but also to the quality of the finished project.
The invention also discloses an AR remote construction monitoring system based on unmanned aerial vehicle aerial photography, which comprises:
an unmanned aerial vehicle with a camera and a radar scanning assembly, for performing aerial photography and laser radar scanning of the construction site;
a model fusion module, for fusing the point cloud data of the laser radar with the video data to obtain a fusion model;
a model registration module, for algorithmically registering the two fusion models to ensure the accuracy of the subsequent live-action modeling;
a model preprocessing module, for performing preprocessing, such as aerial triangulation, on the registered model to meet subsequent analysis requirements;
a model analysis module, for acquiring data through the model and analyzing the data to obtain a model deviation value and a construction progress report;
an AR device, for presenting the construction progress report and the geographical live-action map.
After the construction target area is calibrated, the unmanned aerial vehicle photographs the region of interest within the construction target area according to the first flight parameters to obtain an aerial photography inclination model image; site laser radar mapping is performed on the construction target area to obtain a site laser radar point cloud; and the aerial inclination model image and the site laser radar point cloud are fused through the model fusion module to obtain a first model point cloud.
Performing unmanned aerial vehicle laser radar mapping on the region of interest in the construction target area by the unmanned aerial vehicle in the construction target area according to the second flight parameters to obtain unmanned aerial vehicle laser radar point cloud; carrying out video shooting on a construction target area to obtain a site high-definition video; and fusing the unmanned aerial vehicle laser radar point cloud and the field high-definition video through a model fusion module to obtain a second model point cloud.
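One common way to fuse a lidar point cloud with a synchronized video frame, shown here only as a hedged sketch (the patent does not specify the fusion algorithm; the pinhole camera model, the intrinsic matrix `K`, and the extrinsics `R`, `t` are assumptions), is to project each point into the image and sample its pixel color:

```python
import numpy as np

def colorize_point_cloud(points, image, K, R, t):
    """Attach RGB from one video frame to lidar points (pinhole projection sketch).

    points : (N,3) lidar points in world coordinates
    image  : (H,W,3) frame from the site high-definition video
    K      : (3,3) camera intrinsic matrix
    R, t   : camera extrinsics (world -> camera)
    Returns the subset of points visible in the frame and their pixel colors.
    """
    cam = points @ R.T + t                    # world -> camera coordinates
    in_front = cam[:, 2] > 0                  # keep points in front of the camera
    cam = cam[in_front]
    uv = cam @ K.T
    uv = uv[:, :2] / uv[:, 2:3]               # perspective division -> pixel coords
    h, w = image.shape[:2]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    visible = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors = image[v[visible], u[visible]]    # sample pixel colors
    return points[in_front][visible], colors
```

Repeating this over frames with known camera poses yields a colored point cloud such as the second model point cloud described above.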
Registering the first model point cloud and the second model point cloud through a model registration module to obtain an initial fusion model, wherein the registration accuracy is higher than the preset registration accuracy; and importing the initial fusion model into a model preprocessing module to perform aerial triangulation, and then importing the initial fusion model into a BIM system to obtain an actually measured BIM.
Calculating and acquiring progress data of the actually measured BIM through a model analysis module, and generating a construction progress report of a construction target area; comparing the actually measured BIM model with the designed BIM model through a model analysis module to obtain a model deviation value, and introducing the actually measured BIM model and the model deviation into a GIS (geographic information system) to obtain a geographic live-action map of a construction target area; and acquiring the construction progress report and the geographical live-action map through the AR equipment.
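The progress-data calculation in the model analysis module is not spelled out in the patent; one plausible sketch (the voxel-occupancy approach and the `voxel` size are assumptions) estimates progress as the fraction of the design model's volume already occupied by as-built points:

```python
import numpy as np

def progress_ratio(measured, design, voxel=0.5):
    """Rough construction-progress estimate: fraction of design-model voxels
    already occupied by as-built (measured) points."""
    def voxelize(pts):
        # map each point to the integer index of the voxel containing it
        return {tuple(v) for v in np.floor(pts / voxel).astype(int)}
    design_vox = voxelize(design)
    built_vox = voxelize(measured) & design_vox
    return len(built_vox) / len(design_vox)
```

The resulting ratio, tracked over successive flights, would feed the construction progress report and progress chart.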
It should be noted that the present invention has been described in terms of preferred embodiments and is not limited to those embodiments; those skilled in the art may modify and adapt the embodiments disclosed above into equivalent embodiments without departing from the scope of the present invention.

Claims (10)

1. An AR remote construction monitoring method based on unmanned aerial vehicle aerial photography is characterized by comprising the following steps:
calibrating a construction target area;
shooting an interested area in the construction target area by an unmanned aerial vehicle in the construction target area according to first flight parameters to obtain an aerial photography inclination model image; performing site laser radar mapping on the construction target area to obtain site laser radar point cloud; fusing the aerial photography inclination model image with the field laser radar point cloud to obtain a first model point cloud;
performing unmanned aerial vehicle laser radar mapping on the region of interest in the construction target area through an unmanned aerial vehicle according to second flight parameters to obtain unmanned aerial vehicle laser radar point cloud; carrying out video shooting on the construction target area to obtain a site high-definition video; fusing the unmanned aerial vehicle laser radar point cloud with the field high-definition video to obtain a second model point cloud;
registering the first model point cloud and the second model point cloud to obtain an initial fusion model, wherein the registration accuracy is higher than the preset registration accuracy;
conducting aerial triangulation on the initial fusion model, and then leading the initial fusion model into a BIM system to obtain an actually measured BIM model;
calculating and obtaining progress data of the actually measured BIM model, and generating a construction progress report of the construction target area; comparing the actual measurement BIM model with a design BIM model to obtain a model deviation value, and guiding the actual measurement BIM model and the model deviation into a GIS (geographic information system) to obtain a geographic live-action map of the construction target area;
and acquiring the construction progress report and the geographical live-action map through AR equipment.
2. The AR remote construction monitoring method according to claim 1, wherein the capturing of the region of interest within the construction target area with the first flight parameters by the unmanned aerial vehicle to obtain the aerial tilt model image comprises: during shooting, the position of fixed-point exposure is ensured to be accurate;
the performing unmanned aerial vehicle laser radar mapping on the region of interest in the construction target area by the unmanned aerial vehicle with the second flight parameters to obtain an unmanned aerial vehicle laser radar point cloud comprises: erecting a ground GPS base station, and calibrating the compass of the unmanned aerial vehicle according to the GPS signal.
3. The AR remote construction monitoring method of claim 1, wherein the performing site lidar mapping on the construction target area to obtain a site lidar point cloud comprises:
setting a plurality of survey stations at intervals of a first preset distance in the construction target area, each survey station being provided with a laser radar for surveying and mapping; and the first preset distance between adjacent survey stations is smaller than the scanning distance of the survey stations, so that the areas scanned by adjacent survey stations overlap.
4. The AR remote construction monitoring method according to claim 1, wherein the fusing the aerial inclination model image with the field lidar point cloud to obtain a first model point cloud comprises:
carrying out image adjustment on the aerial photography inclination model image, wherein the image adjustment comprises brightness adjustment, saturation adjustment and contrast adjustment;
performing aerial triangulation on the adjusted aerial tilt model image to determine the position and direction during shooting;
extracting characteristic points of the aerial photography inclination model image and matching the characteristic points; then connecting the characteristic points; bringing all the aerial oblique model images of the construction target area into a uniform object coordinate system;
performing aerial triangulation calculation on the aerial inclination model image to generate a three-dimensional sparse point cloud; and fusing the three-dimensional sparse point cloud and the field laser radar point cloud to obtain a first model point cloud.
5. The AR remote construction monitoring method according to claim 1, wherein the performing unmanned aerial vehicle laser radar mapping on the region of interest within the construction target area by the unmanned aerial vehicle with second flight parameters to obtain the unmanned aerial vehicle laser radar point cloud comprises:
performing unmanned aerial vehicle laser radar mapping on the region of interest in the construction target area by the unmanned aerial vehicle according to the second flight parameters to acquire radar mapping data, and performing joint solution processing on the radar mapping data and the GPS data of a ground radar base station to acquire accurate position information and attitude information of the unmanned aerial vehicle at each event point that meet the point cloud processing requirements;
solving a point cloud with an attitude position by using a positioning and attitude determination system;
filtering the noise points of the point cloud by adopting a preset elevation limit interpolation value;
and converting the coordinate system of the point cloud into a coordinate system consistent with the aerial tilt model image.
6. The AR remote construction monitoring method according to claim 1, wherein the calculating and obtaining progress data of the actually measured BIM model to generate a construction progress report of the construction target area, comparing the actually measured BIM model with the designed BIM model to obtain a model deviation value, and importing the actually measured BIM model and the model deviation into the GIS system to obtain the geographical live-action map of the construction target area further comprises:
comparing the model deviation value with a first preset deviation value, and calculating to obtain progress data of the actually-measured BIM only when the model deviation value is smaller than the first preset deviation value, so as to generate a construction progress report of the construction target area; and guiding the actually measured BIM model and the model deviation into a GIS system to obtain a geographical live-action map of the construction target area.
7. The AR remote construction monitoring method according to claim 6, wherein the comparing the model deviation value to a first preset deviation value further comprises:
and comparing the model deviation value with a second preset deviation value, and when the model deviation value is greater than the second preset deviation value, highlighting the area corresponding to the model deviation value.
8. The AR remote construction monitoring method according to claim 1, wherein the obtaining of the construction progress report and the geographic map by the AR device further comprises:
and acquiring the construction parameters, the environmental parameters and the site high-definition video of the construction target area through AR equipment.
9. The AR remote construction monitoring method according to claim 1, wherein the first flight parameters include course number, course direction, course overlap, side overlap, flight speed, flight altitude, pan-tilt resolution, pan-tilt lens angle;
the second flight parameters comprise the number of flight paths, the direction of the flight paths, the area overlapping degree, the flight speed and the flight height;
the course direction overlapping degree is 75%, the side direction overlapping degree is 65%, and the flying height is 110m.
10. An AR remote construction monitoring system based on unmanned aerial vehicle aerial photography is characterized by comprising an unmanned aerial vehicle, a model fusion module, a model registration module, a model preprocessing module, a model analysis module and AR equipment;
after a construction target area is calibrated, shooting an interested area in the construction target area by an unmanned aerial vehicle in the construction target area according to first flight parameters to obtain an aerial photography inclination model image; performing site laser radar mapping on the construction target area to obtain site laser radar point cloud; fusing the aerial inclination model image with the field laser radar point cloud through the model fusion module to obtain a first model point cloud;
performing unmanned aerial vehicle laser radar mapping on the region of interest in the construction target area through the unmanned aerial vehicle according to the second flight parameters in the construction target area to obtain unmanned aerial vehicle laser radar point cloud; carrying out video shooting on the construction target area to obtain a site high-definition video; fusing the unmanned aerial vehicle laser radar point cloud with the field high-definition video through the model fusion module to obtain a second model point cloud;
registering the first model point cloud and the second model point cloud through the model registration module to obtain an initial fusion model, wherein the registration precision is higher than the preset registration precision;
importing the initial fusion model into the model preprocessing module to perform aerial triangulation, and then importing the initial fusion model into a BIM system to obtain an actually measured BIM model;
calculating and acquiring progress data of the actually-measured BIM through the model analysis module so as to generate a construction progress report of the construction target area; comparing the actual measurement BIM model with the design BIM model through the model analysis module to obtain a model deviation value, and guiding the actual measurement BIM model and the model deviation into a GIS (geographic information system) to obtain a geographic live-action map of the construction target area;
and acquiring the construction progress report and the geographical live-action map through the AR equipment.
CN202110467979.3A 2021-04-28 2021-04-28 AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography Active CN113012292B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110467979.3A CN113012292B (en) 2021-04-28 2021-04-28 AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography

Publications (2)

Publication Number Publication Date
CN113012292A CN113012292A (en) 2021-06-22
CN113012292B true CN113012292B (en) 2023-02-24

Family

ID=76380860

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110467979.3A Active CN113012292B (en) 2021-04-28 2021-04-28 AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography

Country Status (1)

Country Link
CN (1) CN113012292B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022239315A1 (en) * 2021-05-12 2022-11-17 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Information processing method, information processing device, and program
CN114489143B (en) * 2022-04-02 2022-07-15 清华大学 Unmanned aerial vehicle management system, method and device for construction safety risk monitoring
CN115526739B (en) * 2022-09-16 2023-06-30 杭州天界数字科技有限公司 Building engineering progress monitoring method based on BIM and machine vision
CN116448080B (en) * 2023-06-16 2023-09-05 西安玖安科技有限公司 Unmanned aerial vehicle-based oblique photography-assisted earth excavation construction method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662179A (en) * 2012-05-18 2012-09-12 四川省科学城久利科技实业有限责任公司 Three-dimensional optimizing route selection method based on airborne laser radar
CN108375367A (en) * 2018-01-25 2018-08-07 中铁第四勘察设计院集团有限公司 Combined ground laser radar and the work of oblique photograph point surveying method and system
CN109558622A (en) * 2018-09-19 2019-04-02 中建科技有限公司深圳分公司 A kind of execution management method therefor and device scanned based on cloud
CN110285792A (en) * 2019-07-02 2019-09-27 山东省交通规划设计院 A kind of fine grid earthwork metering method of unmanned plane oblique photograph
CN111737790A (en) * 2020-05-12 2020-10-02 中国兵器科学研究院 Method and equipment for constructing simulated city model
CN111951398A (en) * 2020-07-27 2020-11-17 中建三局第二建设工程有限责任公司 Intelligent lofting construction method based on unmanned aerial vehicle oblique image technology

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107291223A (en) * 2017-06-07 2017-10-24 武汉大学 A kind of super large data volume virtual reality space Information Visualization System and method
CN109520479A (en) * 2019-01-15 2019-03-26 成都建工集团有限公司 Method based on unmanned plane oblique photograph auxiliary earth excavation construction

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Three-dimensional reconstruction of buildings based on airborne LiDAR point clouds; Zheng Chao; Geomatics Technology and Equipment; 2019-06-25; Vol. 21, No. 2; pp. 31-34 *
BIM modeling method and practice for tunnel reconstruction projects based on point clouds; Zhang Wensheng et al.; Journal of Chang'an University (Natural Science Edition); 2021-01-15; Vol. 41, No. 1; pp. 59-68 *


Similar Documents

Publication Publication Date Title
CN113012292B (en) AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography
CN111597666B (en) Method for applying BIM to transformer substation construction process
CN112113542A (en) Method for checking and accepting land special data for aerial photography construction of unmanned aerial vehicle
CN112147633A (en) Power line safety distance detection method
WO2022078240A1 (en) Camera precise positioning method applied to electronic map, and processing terminal
CN111006646B (en) Method for monitoring construction progress based on unmanned aerial vehicle oblique photography measurement technology
JPH0554128A (en) Formation of automatic video image database using photograph ic measurement
KR101214081B1 (en) Image expression mapping system using space image and numeric information
CN109859269B (en) Shore-based video auxiliary positioning unmanned aerial vehicle large-range flow field measuring method and device
CN113643254A (en) Efficient collection and processing method for laser point cloud of unmanned aerial vehicle
Guo et al. Accurate calibration of a self-developed vehicle-borne LiDAR scanning system
CN116883604A (en) Three-dimensional modeling technical method based on space, air and ground images
CN110780313A (en) Unmanned aerial vehicle visible light stereo measurement acquisition modeling method
CN108050995B (en) Oblique photography non-image control point aerial photography measurement area merging method based on DEM
CN115937446A (en) Terrain mapping device and method based on AR technology
KR102262120B1 (en) Method of providing drone route
Han et al. On-site vs. laboratorial implementation of camera self-calibration for UAV photogrammetry
CN108195359A (en) The acquisition method and system of spatial data
CN210027896U (en) Fine inclined aerial photography device for vertical face of inverted cliff
CN116824079A (en) Three-dimensional entity model construction method and device based on full-information photogrammetry
CN215767057U (en) Dynamic adjusting device for improving precision of rock mass of complex slope investigated by unmanned aerial vehicle
CN115965743A (en) Three-dimensional modeling system and method based on VR and oblique photography collected data
Pagliari et al. Use of fisheye parrot bebop 2 images for 3d modelling using commercial photogrammetric software
CN114943890A (en) Transformer substation field flatness identification method adopting unmanned aerial vehicle-mounted laser point cloud
Calantropio et al. 360 images for UAV multisensor data fusion: First tests and results

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant