CN113012292A - AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography - Google Patents
- Publication number
- CN113012292A (application number CN202110467979.3A)
- Authority
- CN
- China
- Prior art keywords
- model
- target area
- point cloud
- construction
- construction target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06T17/10 — Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
- G06Q50/08 — Construction
- G06T19/006 — Mixed reality
- G06T7/30 — Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/70 — Determining position or orientation of objects or cameras
- G06T2207/10032 — Satellite or aerial image; Remote sensing
- G06T2207/10044 — Radar image
Abstract
The invention provides an AR remote construction monitoring method and system based on unmanned aerial vehicle (UAV) aerial photography. A BIM model of the outdoor works is generated from four data sources: a UAV oblique-photography model, UAV lidar mapping, on-site lidar mapping and on-site high-definition video monitoring. This as-measured model is then compared with the design-time BIM model to calculate and analyse the construction deviation and to generate an engineering project report. The method also computes site terrain data and a three-dimensional site model; combined with a GIS map, an engineer or supervisor wearing an AR device can identify and retrieve the data of an area from its geographical position and building features, so that site information is acquired and problems are found in real time. Areas with large deviation are flagged for manual checking to ensure the construction quality of the highway.
Description
Technical Field
The invention relates to the technical field of building model measurement and calculation, in particular to an AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography.
Background
At present, most expressway construction sites are in the field, and informatization of expressway construction and supervision remains weak. Detection and measurement mostly rely on manual work, so data synchronization is slow, problems cannot be found in time, site conditions cannot be acquired quickly, and engineering-quantity statistics are inefficient. Because measurement is manual and comparative, data cannot be updated in time and information acquisition is obstructed. Project-management systems are filled in manually, so progress reports from the construction site cannot be synchronized immediately, which is inconvenient for supervision. Meanwhile, project managers and supervisors cannot track construction conditions in real time through informatized means: site details can only be obtained by travelling to the outdoor construction site, and no adequate analysis system provides site information.
Disclosure of Invention
In order to overcome the above technical defects, the invention aims to provide an AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography that improve the speed of remote construction-information acquisition.
The invention discloses an AR remote construction monitoring method based on unmanned aerial vehicle aerial photography, comprising the following steps: calibrating a construction target area; photographing a region of interest within the construction target area by an unmanned aerial vehicle flying with first flight parameters, to obtain aerial oblique-model images; performing on-site lidar mapping of the construction target area to obtain a site lidar point cloud; fusing the aerial oblique-model images with the site lidar point cloud to obtain a first model point cloud; performing lidar mapping of the region of interest by the unmanned aerial vehicle flying with second flight parameters, to obtain a UAV lidar point cloud; video-recording the construction target area to obtain site high-definition video; and fusing the UAV lidar point cloud with the site high-definition video to obtain a second model point cloud;
registering the first model point cloud with the second model point cloud to obtain an initial fusion model, the registration precision being higher than a preset registration precision; performing aerial triangulation on the initial fusion model and then importing it into a BIM system to obtain an actually measured BIM model;
calculating progress data from the actually measured BIM model to generate a construction progress report for the construction target area; comparing the actually measured BIM model with the design BIM model to obtain a model deviation value, and importing the actually measured BIM model and the model deviation into a GIS (geographic information system) to obtain a geographical live-action map of the construction target area; and presenting the construction progress report and the geographical live-action map through AR equipment.
Preferably, photographing the region of interest within the construction target area by the unmanned aerial vehicle with the first flight parameters to obtain the aerial oblique-model images includes: ensuring during shooting that the position of each fixed-point exposure is accurate. Performing lidar mapping of the region of interest by the unmanned aerial vehicle with the second flight parameters to obtain the UAV lidar point cloud includes: erecting a ground GPS base station and calibrating the UAV compass against GPS signals.
Preferably, performing on-site lidar mapping of the construction target area to obtain the site lidar point cloud comprises: setting up a plurality of survey stations at intervals of a first preset distance within the construction target area, each station deploying a lidar for mapping; the first preset distance between adjacent stations is smaller than the stations' scanning range, so that the areas scanned by adjacent stations overlap.
Preferably, fusing the aerial oblique-model images with the site lidar point cloud to obtain the first model point cloud comprises: adjusting the aerial oblique-model images, including brightness, saturation and contrast adjustment; performing aerial triangulation on the adjusted images to determine the position and orientation at the moment of exposure; extracting and matching feature points of the images, then connecting the feature points; bringing all aerial oblique-model images of the construction target area into a unified object coordinate system; computing the aerial triangulation to generate a three-dimensional sparse point cloud; and fusing the three-dimensional sparse point cloud with the site lidar point cloud to obtain the first model point cloud.
Preferably, performing lidar mapping of the region of interest within the construction target area by the unmanned aerial vehicle with the second flight parameters to obtain the UAV lidar point cloud comprises:
performing the lidar mapping flight to obtain radar mapping data; jointly solving the radar mapping data with GPS data from a ground base station to obtain, for every event point, position and attitude information of the UAV accurate enough for point-cloud processing; solving an attitude-attached point cloud with a positioning-and-orientation system; filtering noise points from the point cloud with a preset elevation threshold; and converting the point cloud's coordinate system into one consistent with the aerial oblique-model images.
Preferably, calculating progress data of the actually measured BIM model to generate the construction progress report, comparing the actually measured BIM model with the design BIM model to obtain the model deviation value, and importing the actually measured BIM model and the model deviation into the GIS system to obtain the geographical live-action map further comprises:
comparing the model deviation value with a first preset deviation value, and only when the model deviation value is smaller than the first preset deviation value, calculating the progress data of the actually measured BIM model to generate the construction progress report of the construction target area, and importing the actually measured BIM model and the model deviation into the GIS system to obtain the geographical live-action map of the construction target area.
Preferably, comparing the model deviation value with the first preset deviation value further comprises: comparing the model deviation value with a second preset deviation value, and when the model deviation value is greater than the second preset deviation value, highlighting the area corresponding to that deviation.
Preferably, presenting the construction progress report and the geographical live-action map through the AR equipment further includes: presenting the construction parameters, environmental parameters and site high-definition video of the construction target area through the AR equipment.
Preferably, the first flight parameters comprise the number of flight paths, flight-path direction, course (forward) overlap, side overlap, flight speed, flight height, gimbal resolution and gimbal lens angle; the second flight parameters comprise the number of flight paths, flight-path direction, area overlap, flight speed and flight height; the course overlap is 75%, the side overlap is 65%, and the flight height is 110 m.
The invention also discloses an AR remote construction monitoring system based on unmanned aerial vehicle aerial photography, which comprises an unmanned aerial vehicle, a model fusion module, a model registration module, a model preprocessing module, a model analysis module and AR equipment;
after the construction target area is calibrated, the unmanned aerial vehicle photographs the region of interest within the area according to the first flight parameters to obtain aerial oblique-model images; on-site lidar mapping of the construction target area yields a site lidar point cloud; the model fusion module fuses the aerial oblique-model images with the site lidar point cloud to obtain a first model point cloud; the unmanned aerial vehicle performs lidar mapping of the region of interest according to the second flight parameters to obtain a UAV lidar point cloud; video shooting of the construction target area yields site high-definition video; and the model fusion module fuses the UAV lidar point cloud with the site high-definition video to obtain a second model point cloud;
the model registration module registers the first model point cloud with the second model point cloud to obtain an initial fusion model, the registration precision being higher than a preset registration precision; the initial fusion model is imported into the model preprocessing module for aerial triangulation and then into a BIM system to obtain an actually measured BIM model;
the model analysis module calculates progress data of the actually measured BIM model to generate a construction progress report for the construction target area; the model analysis module compares the actually measured BIM model with the design BIM model to obtain a model deviation value, and the actually measured BIM model and the model deviation are imported into a GIS (geographic information system) to obtain a geographical live-action map of the construction target area; and the construction progress report and the geographical live-action map are presented through the AR equipment.
With the above technical scheme, compared with the prior art, the invention has the following beneficial effects:
1. A BIM model of the outdoor works is generated from four data sources — the UAV oblique model, UAV lidar mapping, on-site lidar mapping and on-site high-definition video monitoring — and then compared with the design-time BIM model to calculate and analyse the construction deviation and generate an engineering project report. The method also computes site terrain data and a three-dimensional site model; combined with a GIS map, an engineer or supervisor wearing an AR device can identify and retrieve the data of an area from its geographical position and building features, acquiring information and finding problems in real time. Areas with large deviation prompt a manual check to ensure the construction quality of the highway;
2. The invention scans dense point clouds with high-speed three-dimensional lasers and is suitable for harsh environments; modelling reproduces the real scene of the construction site, so communication between managers and on-site constructors is smooth; and measurement requires no clearing of the site, so construction progress is not disturbed.
Drawings
FIG. 1 is a flow chart of an AR remote construction monitoring method based on unmanned aerial vehicle aerial photography provided by the invention;
fig. 2 is a block diagram of a specific implementation flow of the AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography provided by the present invention.
Detailed Description
The advantages of the invention are further illustrated in the following description of specific embodiments in conjunction with the accompanying drawings.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
In the description of the present invention, it is to be understood that the terms "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on those shown in the drawings, and are used merely for convenience of description and for simplicity of description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, are not to be construed as limiting the present invention.
In the description of the present invention, unless otherwise specified and limited, it is to be noted that the terms "mounted," "connected," and "connected" are to be interpreted broadly, and may be, for example, a mechanical connection or an electrical connection, a communication between two elements, a direct connection, or an indirect connection via an intermediate medium, and specific meanings of the terms may be understood by those skilled in the art according to specific situations.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention, and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
Referring to the attached drawings 1-2, the invention discloses an AR remote construction monitoring method based on unmanned aerial vehicle aerial photography, which is preferably applied to a construction site of a highway. Firstly, a construction target area is calibrated, and a plurality of radar mapping stations are set in the construction target area for radar mapping. And then, acquiring an unmanned aerial vehicle inclination model, an unmanned aerial vehicle laser radar surveying point cloud, a field laser radar surveying point cloud and a field high-definition video monitoring video through unmanned aerial vehicle aerial photography and field measurement.
The unmanned aerial vehicle photographs the region of interest within the construction target area according to the first flight parameters to obtain aerial oblique-model images. First the UAV compass is calibrated, and the flight path and aircraft parameters over the target survey area are set according to the terrain, including the number of flight paths, flight-path direction, overlap, flight speed, flight height, gimbal resolution, gimbal lens angle and so on. The course overlap is usually 75%, the side overlap usually 65% and the flight height usually 110 m, all adjusted to actual conditions. The flight height and camera angle are then adjusted, and the region of interest is photographed manually. The first flight parameters are the UAV's flight parameters during oblique-model image capture; during shooting, the position of each fixed-point exposure must be accurate.
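The flight-planning relationship above can be sketched numerically: the overlap ratios (75% forward, 65% side) and the 110 m flight height from the text determine the spacing between exposures and between flight lines once the camera's ground footprint is known. The field-of-view angles below are hypothetical stand-ins for the gimbal camera, which the patent does not specify.

```python
import math

def ground_footprint(flight_height_m, fov_deg):
    """Ground width covered by one image axis, from a pinhole-camera FOV."""
    return 2.0 * flight_height_m * math.tan(math.radians(fov_deg) / 2.0)

def waypoint_spacing(footprint_m, overlap):
    """Distance between exposures so consecutive images overlap by `overlap`."""
    return footprint_m * (1.0 - overlap)

H = 110.0                            # flight height from the text (m)
fov_along, fov_across = 60.0, 75.0   # hypothetical camera FOVs (degrees)

L = ground_footprint(H, fov_along)   # along-track footprint
W = ground_footprint(H, fov_across)  # across-track footprint
shutter_spacing = waypoint_spacing(L, 0.75)  # 75% course overlap
line_spacing = waypoint_spacing(W, 0.65)     # 65% side overlap
```

Higher overlap shrinks the spacing, which is why the 75%/65% figures trade flight time for denser, more matchable imagery.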
Because aerial photos and video are affected by weather, illumination and other factors, the aerial images may suffer from inconsistent brightness, blur and similar problems, which cause subsequent image-matching errors. The aerial images therefore need pre-processing; in one embodiment, Photoshop software is used to adjust the brightness, saturation, contrast and other aspects of the aerial images so that they are clear and consistent in brightness.
After the images are pre-processed, aerial triangulation is performed on the image data to determine the position and orientation of each image at the moment of exposure, i.e. the interior and exterior orientation elements of the photos. Feature points are extracted and matched, the feature points are connected, and all images of the surveyed area are brought into a unified object coordinate system. When the aerial-triangulation computation is complete, a three-dimensional sparse point cloud is generated.
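Once exterior orientations are solved, each matched feature defines one ray per image, and intersecting those rays recovers the object-space point that seeds the sparse cloud. The patent does not give the computation; below is a minimal midpoint-triangulation sketch with hypothetical camera centres and ray directions standing in for the solved orientation elements.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate_midpoint(c1, d1, c2, d2):
    """Closest point between two rays c + t*d (midpoint of the common
    perpendicular); for noise-free matches this is the exact intersection."""
    w0 = [x - y for x, y in zip(c1, c2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # zero only for parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = [x + t1 * y for x, y in zip(c1, d1)]
    p2 = [x + t2 * y for x, y in zip(c2, d2)]
    return [(u + v) / 2.0 for u, v in zip(p1, p2)]

# Two hypothetical cameras looking at the same ground feature:
point = triangulate_midpoint([0, 0, 0], [0.5, 0, 2], [1, 0, 0], [-0.5, 0, 2])
```

Repeating this over every matched feature pair yields the three-dimensional sparse point cloud the text describes.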
Because some detail areas and ground-object models in the aerial imagery suffer from distortion, holes and similar problems, lidar data for the region of interest is needed to supplement the model and make up for these defects.
On-site lidar mapping of the construction target area is performed to obtain site lidar point-cloud data. A plurality of survey stations are set up at intervals of a first preset distance within the area, each deploying a lidar for scanning and mapping. Station placement depends on the lidar's scanning range and the required precision: the farther a point is from the scanner, the lower the point-cloud and ranging precision, so to acquire high-precision data the actual scanning distance of each station must be kept within the equipment's optimal range. To splice the point clouds conveniently and reliably, adjacent stations need a certain degree of coincidence, which guarantees their connection. The first preset distance between adjacent stations is therefore smaller than the stations' scanning range, so that the areas scanned by adjacent stations overlap.
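The station-spacing rule above can be sketched as a small planning check: spacing must stay below the usable scan radius so that adjacent scanned discs overlap. The corridor layout and the numeric values are illustrative assumptions.

```python
def station_overlap_width(spacing_m, scan_radius_m):
    """Width of the strip covered by both of two adjacent stations;
    positive whenever the two scan circles intersect."""
    return 2.0 * scan_radius_m - spacing_m

def plan_stations(corridor_length_m, spacing_m, scan_radius_m):
    """Place stations along a corridor at a fixed spacing, enforcing the
    rule from the text: spacing smaller than the scan radius."""
    if spacing_m >= scan_radius_m:
        raise ValueError("station spacing must be smaller than the scan radius")
    n = int(corridor_length_m // spacing_m) + 1
    return [i * spacing_m for i in range(n)]

stations = plan_stations(300.0, 50.0, 80.0)  # hypothetical 300 m corridor
```

With 50 m spacing and an 80 m radius, each pair of neighbours shares a 110 m-wide strip, which is what makes automatic splicing of adjacent scans reliable.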
The oblique model acquired by UAV aerial photography is fused with the site lidar point-cloud data to obtain the first model point cloud: after the point-cloud data generated by the lidar is exported, its coordinate system is converted into the same coordinate system as the UAV oblique model, and the two data sets are then fused and modelled together.
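The coordinate conversion just mentioned is a rigid transform: each exported lidar point is rotated and translated into the oblique-model frame. In practice the rotation and translation would be solved from surveyed control points; the transform below is a purely illustrative sketch.

```python
import math

def rot_z(theta):
    """Rotation about the vertical axis — the dominant term when aligning
    two locally levelled survey frames (illustrative)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply_rigid_transform(points, rot, t):
    """Map each point p -> R @ p + t (rot is a 3x3 row-major matrix)."""
    out = []
    for p in points:
        q = [sum(rot[r][k] * p[k] for k in range(3)) + t[r] for r in range(3)]
        out.append(q)
    return out

# Hypothetical frame offset: 90-degree heading difference plus a shift.
converted = apply_rigid_transform([[1.0, 0.0, 0.0]],
                                  rot_z(math.pi / 2), [10.0, 20.0, 5.0])
```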
The unmanned aerial vehicle performs lidar mapping of the region of interest within the construction target area according to the second flight parameters to obtain the UAV lidar point cloud. For lidar precision, a ground GPS base station must be erected and the UAV compass calibrated first; the flight path and aircraft parameters over the target survey area are then set according to the terrain, including the number of flight paths, flight-path direction, area overlap, flight speed and flight height, with the course overlap usually 75%, the side overlap usually 65% and the flight height usually 110 m, all adjusted to actual conditions. The second flight parameters are the UAV's flight parameters during the lidar mapping flight.
After scanning is finished, the lidar scanning control box is connected and the raw lidar data is exported through software; the base-station data is likewise read out from the base-station control box. The exported data is opened in software to verify its integrity. The exported UAV POS data and the GPS data collected by the base station are then jointly solved in software to obtain, for every event point, position and attitude information of the UAV accurate enough for point-cloud processing.
Video shooting of the construction target area is carried out to obtain site high-definition video, and the UAV lidar point cloud is fused with the site high-definition video to obtain the second model point cloud.
The UAV lidar point cloud requires lidar data-processing software: a reasonable elevation threshold is set to filter out noise points. The filtered lidar point cloud is in a relative coordinate system, which must be converted into the same coordinate system as the UAV oblique model before the data can be fused and modelled.
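The elevation-threshold filtering described above amounts to dropping returns whose height falls outside a plausible band around the terrain (stray returns from birds, multipath, etc.). The band limits below are illustrative assumptions, not values from the patent.

```python
def filter_by_elevation(points, z_min, z_max):
    """Keep only points whose z coordinate lies inside [z_min, z_max];
    everything outside the band is treated as noise."""
    return [p for p in points if z_min <= p[2] <= z_max]

# Hypothetical returns: two valid ground/structure points, one below-ground
# artefact and one high outlier.
raw = [[0.0, 0.0, -5.0], [1.0, 1.0, 2.0], [2.0, 2.0, 50.0], [3.0, 3.0, 8.0]]
cleaned = filter_by_elevation(raw, 0.0, 10.0)
```

Real pipelines usually make the band terrain-relative (interpolated from neighbouring ground points) rather than a single global interval, which is what the text's "elevation-limit interpolation" suggests.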
Model registration is carried out with a matching algorithm on the first and second model point clouds, which were obtained from the UAV oblique model, the UAV lidar mapping, the site lidar mapping and the site high-definition video monitoring. The registration precision must be higher than a preset registration precision, which is set strictly in order to guarantee registration quality.
First the two point clouds are coarsely registered manually in software, then the ICP algorithm performs fine registration, with the registration accuracy controlled within the required precision range. The registered laser point-cloud data and UAV image data are then imported into software, aerial triangulation is carried out, and the actually measured BIM model is finally obtained. The actually measured BIM model resolves the distortion, deformation and ground-object holes of the three-dimensional model and greatly improves its texture precision.
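The patent names ICP but gives no implementation. Below is a deliberately small, translation-only sketch of the iterative idea: match each source point to its nearest target point, then shift the source by the mean residual and repeat. A production fine-registration step would also solve for rotation (e.g. SVD-based point-to-point ICP) and use a k-d tree for the nearest-neighbour search.

```python
def nearest(p, cloud):
    """Brute-force nearest neighbour (fine for tiny illustrative clouds)."""
    return min(cloud, key=lambda q: sum((a - b) ** 2 for a, b in zip(p, q)))

def icp_translation(source, target, iters=20):
    """Translation-only ICP: returns the aligned copy of `source` and the
    accumulated shift that best overlays it on `target`."""
    src = [list(p) for p in source]
    shift = [0.0, 0.0, 0.0]
    for _ in range(iters):
        pairs = [(p, nearest(p, target)) for p in src]
        step = [sum(q[k] - p[k] for p, q in pairs) / len(pairs)
                for k in range(3)]
        for p in src:
            for k in range(3):
                p[k] += step[k]
        for k in range(3):
            shift[k] += step[k]
        if max(abs(s) for s in step) < 1e-9:   # converged
            break
    return src, shift
```

Given a coarse manual alignment first (as the text prescribes), the nearest-neighbour correspondences are mostly correct and the iteration converges quickly.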
The actually measured BIM model is loaded into the BIM system, where the deviation between the design BIM model and the actually measured BIM model can be seen clearly. Model comparison calculates this deviation: the deviation of the whole model can be computed, or the deviation of a manually specified part of the model surface. A deviation report is finally generated from the deviation data. Construction progress data and workload data can also be obtained from the computed model, generating a construction progress report, a workload report and a construction progress chart.
The actual project management progress report, workload report, and construction progress are then imported to automatically generate a project management report.
The actually measured BIM model and the model deviation are imported into a GIS system to obtain a geographical live-action map of the construction target area. The project management report and the geographical live-action map of the construction target area are pushed to the AR equipment, so that a supervisor can remotely obtain the construction progress report and the geographical live-action map through the AR equipment and thus acquire information in a timely and accurate manner.
After the deviation between the actually measured BIM model and the designed BIM model is calculated, the model deviation value is compared with a first preset deviation value, which is the allowable deviation threshold. When the model deviation value is smaller than the first preset deviation value, progress data of the actually measured BIM model are calculated so as to generate the construction progress report and related data of the construction target area, and the actually measured BIM model and the model deviation are imported into a GIS (geographic information system) to obtain the geographical live-action map of the construction target area. If the model deviation value is larger than the first preset deviation value, the cause of the deviation must be analysed, i.e. whether it stems from a measurement error or a construction error.
Preferably, areas with larger deviation values deserve special attention: the model deviation value is compared with a second preset deviation value, and when the model deviation value is larger than the second preset deviation value, the area corresponding to that deviation value is highlighted to alert the supervisor.
The highlighting manner includes a color indication or a text indication.
It should be noted that the second preset deviation value is usually uniform over the whole model, that is, the deviation standard is consistent across the entire construction area. For certain special scenes, however, there are detail areas that require particular attention; the second preset deviation value of such an area should differ from that of other areas and should be smaller, reflecting a stricter requirement.
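The two-threshold review flow described above, including the stricter per-area second preset values, can be sketched as follows. The region names, return strings, and the use of a lookup table for the special areas are illustrative assumptions, not the patent's implementation.

```python
def review_deviation(region, deviation, first_limit, second_limits, default_second):
    """Decide the follow-up action for one region's model deviation value.
    second_limits maps special detail areas to stricter (smaller) limits;
    all other areas use default_second."""
    if deviation >= first_limit:
        # beyond the allowable threshold: the cause must be analysed
        return "analyse cause: measurement error or construction error"
    if deviation > second_limits.get(region, default_second):
        # within tolerance overall, but large enough to flag to the supervisor
        return "highlight region for supervisor"
    return "within tolerance: include in progress report"
```

A real system would attach the highlighted regions to the geographical live-action map rather than return strings.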
Preferably, the AR equipment can also be used to acquire construction parameters, environment parameters, and the site high-definition video of the construction target area, including concrete temperature monitoring, environmental monitoring, and the like.
For a construction site, attention must be paid not only to the earthwork of the site and of hillside excavation, but also to the quality of the finished project.
The invention also discloses an AR remote construction monitoring system based on unmanned aerial vehicle aerial photography, which comprises:
- an unmanned aerial vehicle with a camera and a radar scanning assembly, used for performing aerial photography and laser radar scanning of the construction site;
- a model fusion module, used for fusing the laser radar point cloud data with the image and video data to obtain a fusion model;
- a model registration module, used for algorithmically registering the two fusion models to ensure the accuracy of subsequent live-action modeling;
- a model preprocessing module, used for performing preprocessing, such as aerial triangulation, on the registered model to meet subsequent analysis requirements;
- a model analysis module, used for acquiring data through the model and analysing the data to obtain a model deviation value and a construction progress report;
- an AR device, used for presenting the construction progress report and the geographical live-action map.
After the construction target area is calibrated, a region of interest in the construction target area is photographed by the unmanned aerial vehicle according to the first flight parameters to obtain an aerial photography inclination model image; site laser radar mapping is performed on the construction target area to obtain a site laser radar point cloud; and the aerial inclination model image and the field laser radar point cloud are fused through the model fusion module to obtain a first model point cloud.
Performing laser radar mapping on the region of interest in the construction target area by the unmanned aerial vehicle in the construction target area according to the second flight parameters to obtain unmanned aerial vehicle laser radar point cloud; carrying out video shooting on a construction target area to obtain a site high-definition video; and fusing the unmanned aerial vehicle laser radar point cloud and the field high-definition video through the model fusion module to obtain a second model point cloud.
The first model point cloud and the second model point cloud are registered through the model registration module to obtain an initial fusion model, wherein the registration accuracy is higher than the preset registration accuracy; the initial fusion model is imported into the model preprocessing module for aerial triangulation, and then imported into the BIM system to obtain the actually measured BIM model.
Progress data of the actually measured BIM model are calculated and acquired through the model analysis module so as to generate a construction progress report of the construction target area; the actually measured BIM model is compared with the designed BIM model through the model analysis module to obtain a model deviation value, and the actually measured BIM model and the model deviation are imported into a GIS (geographic information system) to obtain a geographical live-action map of the construction target area; and the construction progress report and the geographical live-action map are acquired through the AR equipment.
It should be noted that the embodiments of the present invention have been described in terms of preferred embodiments, and not by way of limitation, and that those skilled in the art can make modifications and variations of the embodiments described above without departing from the spirit of the invention.
Claims (10)
1. An AR remote construction monitoring method based on unmanned aerial vehicle aerial photography is characterized by comprising the following steps:
calibrating a construction target area;
photographing a region of interest in the construction target area by an unmanned aerial vehicle in the construction target area according to first flight parameters to obtain an aerial photography inclination model image; performing site laser radar mapping on the construction target area to obtain a site laser radar point cloud; fusing the aerial inclination model image with the field laser radar point cloud to obtain a first model point cloud;
performing laser radar mapping on the region of interest in the construction target area by the unmanned aerial vehicle according to second flight parameters to obtain unmanned aerial vehicle laser radar point cloud; carrying out video shooting on the construction target area to obtain a site high-definition video; fusing the unmanned aerial vehicle laser radar point cloud with the field high-definition video to obtain a second model point cloud;
registering the first model point cloud and the second model point cloud to obtain an initial fusion model, wherein the registration precision is higher than a preset registration precision;
performing aerial triangulation on the initial fusion model, and then importing the initial fusion model into a BIM system to obtain an actually measured BIM model;
calculating and obtaining progress data of the actually measured BIM model, and generating a construction progress report of the construction target area; comparing the actually measured BIM model with a designed BIM model to obtain a model deviation value, and importing the actually measured BIM model and the model deviation into a GIS (geographic information system) to obtain a geographical live-action map of the construction target area;
and acquiring the construction progress report and the geographical live-action map through AR equipment.
2. The AR remote construction monitoring method according to claim 1, wherein the capturing of the region of interest within the construction target area with the first flight parameters by the unmanned aerial vehicle to obtain the aerial tilt model image comprises: during shooting, the position of fixed-point exposure is ensured to be accurate;
the performing laser radar mapping on the region of interest in the construction target area by the unmanned aerial vehicle with the second flight parameters to obtain the unmanned aerial vehicle laser radar point cloud comprises: erecting a ground GPS base station, and calibrating the compass of the unmanned aerial vehicle according to GPS signals.
3. The AR remote construction monitoring method of claim 1, wherein the performing site lidar mapping on the construction target area to obtain a site lidar point cloud comprises:
setting a plurality of survey stations at intervals of a first preset distance in the construction target area, wherein each survey station is provided with a laser radar for surveying and mapping; and the first preset distance between adjacent survey stations is smaller than the scanning range of a survey station, so that the areas scanned by adjacent survey stations overlap.
4. The AR remote construction monitoring method of claim 1, wherein the fusing the aerial tilt model image with the field lidar point cloud to obtain a first model point cloud comprises:
carrying out image adjustment on the aerial photography inclination model image, wherein the image adjustment comprises brightness adjustment, saturation adjustment and contrast adjustment;
performing aerial triangulation on the adjusted aerial tilt model image to determine the position and direction during shooting;
extracting characteristic points of the aerial photography inclination model image and matching the characteristic points; then connecting the characteristic points; bringing all the aerial oblique model images of the construction target area into a uniform object coordinate system;
performing aerial triangulation calculation on the aerial inclination model image to generate a three-dimensional sparse point cloud; and fusing the three-dimensional sparse point cloud and the field laser radar point cloud to obtain a first model point cloud.
5. The AR remote construction monitoring method of claim 1, wherein the performing, by the UAV, a lidar mapping at the construction target area of interest within the construction target area with second flight parameters to obtain a UAV lidar point cloud comprises:
performing laser radar mapping on the region of interest in the construction target area by the unmanned aerial vehicle according to the second flight parameters to obtain radar mapping data, and performing combined solution processing on the radar mapping data and GPS data of a ground radar base station, so as to obtain accurate position information and attitude information of the unmanned aerial vehicle at each event point, meeting the point cloud processing requirement;
solving a point cloud with an attitude position by using a positioning and attitude determination system;
filtering the noise points of the point cloud by adopting a preset elevation limit interpolation value;
and converting the coordinate system of the point cloud into a coordinate system consistent with the aerial inclination model image.
6. The AR remote construction monitoring method according to claim 1, wherein the calculating and obtaining progress data of the actually measured BIM model to generate a construction progress report of the construction target area, comparing the actually measured BIM model with the designed BIM model to obtain a model deviation value, and importing the actually measured BIM model and the model deviation into a GIS system to obtain a geographical live-action map of the construction target area further comprises:
comparing the model deviation value with a first preset deviation value, and only when the model deviation value is smaller than the first preset deviation value, calculating and obtaining progress data of the actually measured BIM model so as to generate a construction progress report of the construction target area; and importing the actually measured BIM model and the model deviation into a GIS system to obtain a geographical live-action map of the construction target area.
7. The AR remote construction monitoring method according to claim 1, wherein the comparing the model deviation value to a first preset deviation value further comprises:
and comparing the model deviation value with a second preset deviation value, and when the model deviation value is greater than the second preset deviation value, highlighting the area corresponding to the model deviation value.
8. The AR remote construction monitoring method according to claim 1, wherein the acquiring of the construction progress report and the geographical live-action map through the AR device further comprises:
and acquiring the construction parameters, the environmental parameters and the site high-definition video of the construction target area through AR equipment.
9. The AR remote construction monitoring method according to claim 1, wherein the first flight parameters include course number, course direction, course overlap, side overlap, flight speed, flight altitude, pan-tilt resolution, pan-tilt lens angle;
the second flight parameters comprise the number of flight paths, the direction of the flight paths, the area overlapping degree, the flight speed and the flight height;
the course direction overlapping degree is 75%, the side direction overlapping degree is 65%, and the flying height is 110 m.
10. An AR remote construction monitoring system based on unmanned aerial vehicle aerial photography is characterized by comprising an unmanned aerial vehicle, a model fusion module, a model registration module, a model preprocessing module, a model analysis module and AR equipment;
after a construction target area is calibrated, photographing a region of interest in the construction target area by an unmanned aerial vehicle in the construction target area according to first flight parameters to obtain an aerial photography inclination model image; performing site laser radar mapping on the construction target area to obtain a site laser radar point cloud; fusing the aerial inclination model image with the field laser radar point cloud through the model fusion module to obtain a first model point cloud;
performing laser radar mapping on the region of interest in the construction target area by the unmanned aerial vehicle according to second flight parameters to obtain unmanned aerial vehicle laser radar point cloud; carrying out video shooting on the construction target area to obtain a site high-definition video; fusing the unmanned aerial vehicle laser radar point cloud with the field high-definition video through the model fusion module to obtain a second model point cloud;
registering the first model point cloud and the second model point cloud through the model registration module to obtain an initial fusion model, wherein the registration precision is higher than a preset registration precision;
importing the initial fusion model into the model preprocessing module to perform aerial triangulation, and then importing the initial fusion model into a BIM system to obtain an actually measured BIM model;
calculating and acquiring progress data of the actually measured BIM model through the model analysis module so as to generate a construction progress report of the construction target area; comparing the actually measured BIM model with the designed BIM model through the model analysis module to obtain a model deviation value, and importing the actually measured BIM model and the model deviation into a GIS (geographic information system) to obtain a geographical live-action map of the construction target area;
and acquiring the construction progress report and the geographical live-action map through the AR equipment.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110467979.3A CN113012292B (en) | 2021-04-28 | 2021-04-28 | AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110467979.3A CN113012292B (en) | 2021-04-28 | 2021-04-28 | AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113012292A true CN113012292A (en) | 2021-06-22 |
CN113012292B CN113012292B (en) | 2023-02-24 |
Family
ID=76380860
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110467979.3A Active CN113012292B (en) | 2021-04-28 | 2021-04-28 | AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113012292B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114489143A (en) * | 2022-04-02 | 2022-05-13 | 清华大学 | Unmanned aerial vehicle management system, method and device for construction safety risk monitoring |
WO2022239315A1 (en) * | 2021-05-12 | 2022-11-17 | Panasonic Intellectual Property Corporation of America | Information processing method, information processing device, and program
CN115526739A (en) * | 2022-09-16 | 2022-12-27 | 杭州天界数字科技有限公司 | Building engineering progress monitoring method based on BIM and machine vision |
CN116448080A (en) * | 2023-06-16 | 2023-07-18 | 西安玖安科技有限公司 | Unmanned aerial vehicle-based oblique photography-assisted earth excavation construction method |
CN116907449A (en) * | 2023-07-11 | 2023-10-20 | 中交第三公路工程局有限公司 | BIM and oblique photography based measurement method for highway engineering |
CN117236883A (en) * | 2023-09-11 | 2023-12-15 | 无锡建设监理咨询有限公司 | Engineering project supervision method, device, equipment and medium based on AR technology |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102662179A (en) * | 2012-05-18 | 2012-09-12 | 四川省科学城久利科技实业有限责任公司 | Three-dimensional optimizing route selection method based on airborne laser radar |
CN107291223A (en) * | 2017-06-07 | 2017-10-24 | 武汉大学 | A kind of super large data volume virtual reality space Information Visualization System and method |
CN108375367A (en) * | 2018-01-25 | 2018-08-07 | 中铁第四勘察设计院集团有限公司 | Combined ground laser radar and the work of oblique photograph point surveying method and system |
CN109520479A (en) * | 2019-01-15 | 2019-03-26 | 成都建工集团有限公司 | Method based on unmanned plane oblique photograph auxiliary earth excavation construction |
CN109558622A (en) * | 2018-09-19 | 2019-04-02 | 中建科技有限公司深圳分公司 | A kind of execution management method therefor and device scanned based on cloud |
CN110285792A (en) * | 2019-07-02 | 2019-09-27 | 山东省交通规划设计院 | A kind of fine grid earthwork metering method of unmanned plane oblique photograph |
CN111737790A (en) * | 2020-05-12 | 2020-10-02 | 中国兵器科学研究院 | Method and equipment for constructing simulated city model |
CN111951398A (en) * | 2020-07-27 | 2020-11-17 | 中建三局第二建设工程有限责任公司 | Intelligent lofting construction method based on unmanned aerial vehicle oblique image technology |
2021-04-28 CN CN202110467979.3A patent/CN113012292B/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102662179A (en) * | 2012-05-18 | 2012-09-12 | 四川省科学城久利科技实业有限责任公司 | Three-dimensional optimizing route selection method based on airborne laser radar |
CN107291223A (en) * | 2017-06-07 | 2017-10-24 | 武汉大学 | A kind of super large data volume virtual reality space Information Visualization System and method |
CN108375367A (en) * | 2018-01-25 | 2018-08-07 | 中铁第四勘察设计院集团有限公司 | Combined ground laser radar and the work of oblique photograph point surveying method and system |
CN109558622A (en) * | 2018-09-19 | 2019-04-02 | 中建科技有限公司深圳分公司 | A kind of execution management method therefor and device scanned based on cloud |
CN109520479A (en) * | 2019-01-15 | 2019-03-26 | 成都建工集团有限公司 | Method based on unmanned plane oblique photograph auxiliary earth excavation construction |
CN110285792A (en) * | 2019-07-02 | 2019-09-27 | 山东省交通规划设计院 | A kind of fine grid earthwork metering method of unmanned plane oblique photograph |
CN111737790A (en) * | 2020-05-12 | 2020-10-02 | 中国兵器科学研究院 | Method and equipment for constructing simulated city model |
CN111951398A (en) * | 2020-07-27 | 2020-11-17 | 中建三局第二建设工程有限责任公司 | Intelligent lofting construction method based on unmanned aerial vehicle oblique image technology |
Non-Patent Citations (2)
Title |
---|
ZHANG Wensheng et al.: "BIM modeling method and practice for tunnel reconstruction projects based on point clouds", Journal of Chang'an University (Natural Science Edition) *
ZHENG Chao: "Three-dimensional reconstruction of buildings based on airborne LiDAR point clouds", Geomatics Technology and Equipment *
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022239315A1 (en) * | 2021-05-12 | 2022-11-17 | Panasonic Intellectual Property Corporation of America | Information processing method, information processing device, and program |
CN114489143A (en) * | 2022-04-02 | 2022-05-13 | 清华大学 | Unmanned aerial vehicle management system, method and device for construction safety risk monitoring |
CN115526739A (en) * | 2022-09-16 | 2022-12-27 | 杭州天界数字科技有限公司 | Building engineering progress monitoring method based on BIM and machine vision |
CN115526739B (en) * | 2022-09-16 | 2023-06-30 | 杭州天界数字科技有限公司 | Building engineering progress monitoring method based on BIM and machine vision |
CN116448080A (en) * | 2023-06-16 | 2023-07-18 | 西安玖安科技有限公司 | Unmanned aerial vehicle-based oblique photography-assisted earth excavation construction method |
CN116448080B (en) * | 2023-06-16 | 2023-09-05 | 西安玖安科技有限公司 | Unmanned aerial vehicle-based oblique photography-assisted earth excavation construction method |
CN116907449A (en) * | 2023-07-11 | 2023-10-20 | 中交第三公路工程局有限公司 | BIM and oblique photography based measurement method for highway engineering |
CN117236883A (en) * | 2023-09-11 | 2023-12-15 | 无锡建设监理咨询有限公司 | Engineering project supervision method, device, equipment and medium based on AR technology |
Also Published As
Publication number | Publication date |
---|---|
CN113012292B (en) | 2023-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113012292B (en) | AR remote construction monitoring method and system based on unmanned aerial vehicle aerial photography | |
CN111597666B (en) | Method for applying BIM to transformer substation construction process | |
WO2022078240A1 (en) | Camera precise positioning method applied to electronic map, and processing terminal | |
CN112113542A (en) | Method for checking and accepting land special data for aerial photography construction of unmanned aerial vehicle | |
CN112147633A (en) | Power line safety distance detection method | |
CN111006646B (en) | Method for monitoring construction progress based on unmanned aerial vehicle oblique photography measurement technology | |
CN112652065A (en) | Three-dimensional community modeling method and device, computer equipment and storage medium | |
CN115937446A (en) | Terrain mapping device and method based on AR technology | |
Raczynski | Accuracy analysis of products obtained from UAV-borne photogrammetry influenced by various flight parameters | |
KR102262120B1 (en) | Method of providing drone route | |
CN113034470B (en) | Asphalt concrete thickness nondestructive testing method based on unmanned aerial vehicle oblique photography technology | |
CN109883398A (en) | The system and method that the green amount of plant based on unmanned plane oblique photograph is extracted | |
CN113643254A (en) | Efficient collection and processing method for laser point cloud of unmanned aerial vehicle | |
CN116129064A (en) | Electronic map generation method, device, equipment and storage medium | |
CN215767057U (en) | Dynamic adjusting device for improving precision of rock mass of complex slope investigated by unmanned aerial vehicle | |
CN115965743A (en) | Three-dimensional modeling system and method based on VR and oblique photography collected data | |
CN116883604A (en) | Three-dimensional modeling technical method based on space, air and ground images | |
Zhou et al. | Application of UAV oblique photography in real scene 3d modeling | |
CN110780313A (en) | Unmanned aerial vehicle visible light stereo measurement acquisition modeling method | |
CN117557931B (en) | Planning method for meter optimal inspection point based on three-dimensional scene | |
CN108050995B (en) | Oblique photography non-image control point aerial photography measurement area merging method based on DEM | |
CN114943890A (en) | Transformer substation field flatness identification method adopting unmanned aerial vehicle-mounted laser point cloud | |
CN116753962B (en) | Route planning method and device for bridge | |
CN108195359A (en) | The acquisition method and system of spatial data | |
CN117308915A (en) | Surveying and mapping system for special topography in surveying and mapping engineering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||