US11288412B2 - Computation of point clouds and joint display of point clouds and building information models with project schedules for monitoring construction progress, productivity, and risk for delays - Google Patents
- Publication number: US11288412B2 (application US15/956,266)
- Authority
- US
- United States
- Prior art keywords
- image
- bim
- point cloud
- images
- cloud model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G06Q10/06313—Resource planning in a project environment
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06Q10/06316—Sequencing of tasks or work
- G06Q10/0633—Workflow analysis
- G06Q10/0635—Risk analysis of enterprise or organisation activities
- G06Q10/06393—Score-carding, benchmarking or key performance indicator [KPI] analysis
- G06T15/20—Perspective computation
- G06T17/05—Geographic models
- G06T19/003—Navigation within 3D models or images
- G06T7/579—Depth or shape recovery from multiple images from motion
- G06T7/593—Depth or shape recovery from multiple images from stereo images
- G06Q50/08—Construction
- G06T2207/10028—Range image; Depth image; 3D point clouds
- G06T2207/30181—Earth observation
- G06T2207/30184—Infrastructure
Definitions
- the present disclosure relates to construction site progress monitoring and planning, and more particularly, to computation of point cloud, as-built models for display and facilitation of user interaction with visualizations of construction progress monitoring and risk for delay.
- FIG. 1 is a diagram of a system for executing the disclosed modeling and visualization of a construction site according to an embodiment.
- FIGS. 2A and 2B are a flow diagram illustrating the flow of the disclosed methods for modeling and visualization of a construction site.
- FIG. 3 is a graph illustrating productivity in construction in thousands of dollars per worker since 1994.
- FIG. 4 is an image of a construction site that illustrates challenges associated with structure-from-motion modeling of the construction site, according to one embodiment.
- FIGS. 5A and 5B illustrate use of a graphical user interface to match fiduciary markers used for registration of point clouds or to associate a point cloud to a building information model (BIM) according to various embodiments.
- FIG. 6 is a set of images illustrating three-dimensional (3D) point clouds generated using images captured with a drone camera on the following jobsites, taken from the left to right: $500M Sacramento Kings stadium project in Sacramento, Calif. with Turner Construction; Athlete Village in Minneapolis, Minn. with Mortenson Construction; Wajiki Dam in Koji, Japan with Taisei Corporation.
- FIG. 7 is a set of images illustrating 3D point clouds generated using images captured with a drone camera on the following job sites, taken from left to right: Little rock project in Denver, Colo. with W.E. O'Neil construction; Zurich Insurance Company's headquarters project in Chicago, Ill. with Clayco Corp.; and McCormick Place hotel and stadium project in Chicago, Ill. with Clark Construction.
- FIG. 8 is a set of images from within a web-based interface illustrating registration of BIM (or point clouds) with 3D point clouds according to an embodiment.
- FIG. 9A is an image of a back projection of aerial images (e.g., from a drone camera) onto a point cloud according to an embodiment.
- FIG. 9B is an image of a point cloud generated for the case study on the McCormick Place Hotel and Stadium project in Chicago, Ill.
- FIGS. 10A through 10C are images illustrating the impact of controlling the level of detail from low detail to high detail within a web-based interface according to various embodiments.
- FIG. 10D is an image illustrating a point cloud height map according to an embodiment.
- FIG. 10E is an image illustrating a point cloud in red green blue (RGB), with these colors arranged from top to bottom, according to one embodiment.
- FIG. 10F is an image illustrating an octree used for pre-processing and point selection according to an embodiment.
- FIG. 11 is an image of the web-based interface in which a user may measure the volume in a 3D environment and track the changes from a different time, according to an embodiment.
- FIGS. 12A through 12D are images within a web-based interface illustrating the segmentation of point clouds for generating cross sections and volumetric measurements of a construction site according to an embodiment.
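A common way to obtain such volumetric measurements from a point cloud, and to track changes between capture dates, is to grid the points into a height map and integrate heights per cell. This is offered as an illustrative assumption, not the patent's prescribed method; the function names and the flat `base_z` reference plane are hypothetical:

```python
import numpy as np

def stockpile_volume(points, cell=1.0, base_z=0.0):
    """Approximate the volume above base_z by bucketing points into a 2D grid
    and summing (max height per cell) * cell area. Assumes a roughly flat base."""
    pts = np.asarray(points, float)
    ij = np.floor(pts[:, :2] / cell).astype(int)        # grid-cell index per point
    heights = {}
    for key, z in zip(map(tuple, ij), pts[:, 2]):
        heights[key] = max(heights.get(key, base_z), z)  # top surface per cell
    return sum(max(z - base_z, 0.0) for z in heights.values()) * cell * cell

def volume_change(points_t0, points_t1, cell=1.0):
    """Track the change between two survey dates, as in the comparison view."""
    return stockpile_volume(points_t1, cell) - stockpile_volume(points_t0, cell)
```

With a 1 m grid, two points at height 2 m in adjacent cells yield 4 m³; adding a third cell at 3 m on a later date reports a +3 m³ change.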
- FIG. 13 is an image of joint modeling and representation of BIM and point clouds within a web-based interface according to an embodiment.
- FIG. 14 is an image of the web-based interface in which a user may either annotate point clouds or their corresponding images to create and visualize process information according to project assignment as per an embodiment.
- FIG. 15 is an image illustrating, within the web-based interface, four dimensional (4D) simulation of construction activities, which includes a 3D point cloud model and scheduling information, according to an embodiment.
- FIG. 16A is a set of graphs that make up a progress report illustrating a performance overview of a construction project, including root causes for a six-week delay, according to an embodiment.
- FIG. 16B is a graph illustrating a weekly work plan (WWP) and a look-ahead plan organized in work breakdown structure on which point clouds are recorded, as per various embodiments.
- FIG. 17A is a graph illustrating a productivity report with prior week point clouds, according to an embodiment.
- FIG. 17B is a graph illustrating the productivity report of FIG. 17A and further, from left to right, prior week quantities, prior week worker hours, and prior week productivity rates, according to various embodiments.
- FIG. 18A is a graph illustrating an at-risk location report organized by task based on work breakdown structure, WWP, and look-ahead schedule, according to an embodiment.
- FIGS. 18B and 18C are a graph illustrating tasks organized similar to a progress report, with additional task readiness columns, according to an embodiment.
- FIG. 19 is an image of the web-based interface illustrating a master schedule versus a weekly work plan that includes risk reporting, according to an embodiment.
- FIG. 20 is an image of a location visualization mode in which elements are grouped based on work breakdown structure (WBS), according to an embodiment.
- FIG. 21 is an image of a trade location mode within the web-based interface that illustrates project assignment.
- FIG. 22 is an image of a planned mode within the web-based interface that illustrates a 4D BIM that also reflects project schedule, according to an embodiment.
- FIGS. 23A and 23B are images of a state of progress mode within the web-based interface that illustrates actual progress status on a BIM-based model, according to an embodiment.
- FIG. 23C is an image of an image-based legend for an advanced state-of-progress mode of the web-based interface according to an embodiment.
- FIG. 25 is a flow chart of a method for computation of point cloud, as-built models for display and facilitation of user interaction with visualizations of construction progress monitoring, according to various embodiments.
- FIG. 26 is a flow chart of a method for alignment of 3D point clouds generated at different times to illustrate construction progress according to an embodiment.
- FIG. 27 is a computer system that may be used for executing the modeling and visualization techniques and embodiments disclosed herein.
- the current disclosure provides access to actionable construction performance analytics to all project participants via transparent process views, so that project managers may use the analytics to eliminate problems as soon as possible.
- the disclosed system and methods visually track and communicate project assignments, e.g., who does what work in what location, and enable efficient processes by preventing interferences among tens to hundreds of trades engaged in a construction project on a daily basis.
- the disclosed system tests the reliability and predictability of the planned execution, and offers high flexibility for changes via web-based interfaces that facilitate visualization of 3D point clouds (e.g., “as-built” models of completed portions of a construction site) as compared against scheduling constraints.
- Pilot projects illustrate that visual, at-risk location reports, together with as-planned, color-coded building information models (BIM) integrated with schedule information, empower contractors to collaboratively address problems in coordination meetings, provide a realistic view of total costs and project durations, enable better decisions, and ultimately improve productivity.
- the disclosed system and methods provide an end-to-end platform that offers actionable visual data analytics via schedule and task data, 3D BIMs, and images captured with unmanned aerial vehicles (UAVs) or ground cameras (e.g., consumer-grade cameras, fixed time-lapse cameras, and smartphone cameras).
- UAVs are also commonly referred to as drones, and both terms may be used interchangeably herein.
- a 3D BIM is a computer-prepared building model of a planned structure, such as an architect would prepare for generation of blueprints for a construction project.
- Schedule data, including the locations, start dates, and durations for a set of tasks, may be integrated with the 3D BIM by assigning elements in the BIM to tasks according to the locations of the elements and tasks.
- the BIM integrated with such scheduling data is called a “4D BIM,” wherein the fourth dimension in “4D” is the scheduling data.
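The location-based assignment described above can be sketched as a simple join between schedule tasks and BIM elements. All names here (`Task`, `BimElement`, `build_4d_bim`, the sample locations and dates) are illustrative, not drawn from the patent:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Task:
    """A schedule entry: name, site location, start date, and duration."""
    name: str
    location: str
    start: date
    duration_days: int

    @property
    def end(self) -> date:
        return self.start + timedelta(days=self.duration_days)

@dataclass
class BimElement:
    """A 3D BIM element tagged with its location on site."""
    element_id: str
    location: str

def build_4d_bim(elements, tasks):
    """Assign each BIM element to the tasks sharing its location,
    yielding the schedule-integrated ('4D') association."""
    by_location = {}
    for t in tasks:
        by_location.setdefault(t.location, []).append(t)
    return {e.element_id: by_location.get(e.location, []) for e in elements}

elements = [BimElement("wall-01", "L2-east"), BimElement("col-07", "L2-west")]
tasks = [Task("Pour walls", "L2-east", date(2018, 4, 2), 5)]
linked = build_4d_bim(elements, tasks)
```

Elements in locations with no scheduled task simply map to an empty task list, which a viewer could render in a neutral color.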
- the system may include data capture, processing, and delivery via an online interactive interface accessible through desktop computers, laptop computers, and mobile devices such as smartphones and tablets.
- the system may build 3D point cloud models reflecting the actual state of work in progress.
- the system may measure progress and analyze risk for delay at each location.
- the intuitive 3D interface facilitates communication among work crews (e.g., contractors, sub-contractors, and owners), tracks who does what work in what location (e.g., location-based work crew assignments), and visually communicates performance problems, requests for information (RFIs), and quality control reports.
- the system may also facilitate visualization of which tasks are at risk for delay by assigning colors to elements of the BIM (“color-coding”) and producing at-risk location reports to enable revision of short-term plans to head off delays at identified locations.
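As one hedged illustration of such color-coding — the patent does not specify the exact rule or thresholds used — a task's percent complete can be compared against the fraction of its scheduled duration that has elapsed:

```python
def risk_color(percent_complete: float, percent_elapsed: float) -> str:
    """Map a task's progress against elapsed schedule time to a display color.
    The -0.15 at-risk threshold is a hypothetical value for illustration."""
    slack = percent_complete - percent_elapsed
    if slack >= 0:
        return "green"    # on schedule or ahead
    if slack > -0.15:
        return "yellow"   # at risk for delay
    return "red"          # delayed
```

A renderer could then apply `risk_color` to each BIM element via its assigned task to produce the at-risk location view.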
- Disclosed image-based 3D reconstruction may be ten times faster than current state-of-the-art tracking.
- the integration of 3D point cloud building models and construction schedule with actual 3D BIM, the visual data analytics solution including identification of at-risk locations, and an online 3D interface may function at low computational overhead, and thus facilitate user interaction via mobile devices such as smartphones, tablets, and the like.
- the system includes a processing device and computer storage coupled to the processing device.
- the computer storage may store a 3D BIM of a construction site, anchor images, each depicting a viewpoint having a 3D pose and containing features with known 3D positions with respect to the 3D BIM, and target images of the construction site with undetermined 3D position and orientation. Such target images are uncalibrated in that they are not yet registered to the 3D BIM.
- the 3D pose may be defined by a 3D position and orientation relative to the 3D BIM.
- the features may include at least one of structural points, edges, objects, or textured surfaces within the 3D BIM.
- the system may also include one or more 3D point clouds that depict the as-built conditions of the construction site, which may be derived from the anchor images or other images or 3D measurement devices.
- the system may also include a graphical user interface that displays images and point clouds with known 3D pose together with the BIM, enabling the user to compare as-built conditions to plans for the purpose of assessing progress towards task completion.
- the system may also include schedule data integrated with the BIM to further facilitate the assessment of task progress.
- the processing device may initialize a set of calibrated images, which have a known 3D pose relative to the 3D BIM.
- the processing device may detect target features within the target images and determine matches between the target features and the features of the anchor images, to generate a set of matching features.
- the processing device may further determine a subset of the target images that have at least a threshold number of the matching features and select a first image from the subset of the target images having the largest number of the matching features.
- the processing device may further execute an image-based reconstruction algorithm using the first image and the anchor images to calibrate the first image to the BIM and generate an initial 3D point cloud model of the construction site.
- the processing device may further incrementally repeat the last few steps to identify a second image from the subset of the target images and perform, starting with the initial 3D point cloud model and using the second image and the anchor images as constraints to the image-based reconstruction algorithm, 3D reconstruction to generate an updated 3D point cloud model of the construction site.
- the second image is selected as the target image that is not in the calibrated images set (e.g., not an anchor image) and has the most matching features with the calibrated image set, with the number of matching features being at least some threshold number.
- the processing device may further display a visual instantiation of the updated 3D point cloud model of the construction site in a graphical user interface of a display device, wherein the visual instantiation of the updated 3D point cloud model includes 3D points derived from at least the first image and the second image aligned to the 3D BIM.
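The selection loop in the preceding paragraphs can be sketched as a greedy ordering: repeatedly pick the uncalibrated image sharing the most matched features with the calibrated set (subject to the threshold), then extend the reconstruction with the anchors as constraints. Here `match_count` and `reconstruct` are hypothetical stand-ins for the feature-matching and image-based reconstruction steps, which are not implemented:

```python
def calibrate_incrementally(target_images, anchor_images, match_count, reconstruct,
                            threshold=2):
    """Greedy incremental calibration of target images against anchor images.
    match_count(img, calibrated) -> number of shared matched features;
    reconstruct(model, img, anchors) -> updated 3D point cloud model."""
    calibrated = list(anchor_images)   # anchors have known 3D pose w.r.t. the BIM
    remaining = list(target_images)
    model = None
    while remaining:
        best = max(remaining, key=lambda img: match_count(img, calibrated))
        if match_count(best, calibrated) < threshold:
            break                      # no remaining image meets the threshold
        model = reconstruct(model, best, anchor_images)
        calibrated.append(best)
        remaining.remove(best)
    return model, calibrated

# Toy stand-ins: images carry feature-ID sets; "reconstruction" records order.
anchors = [{"id": "a1", "feats": {1, 2, 3, 4}}]
targets = [{"id": "t1", "feats": {2, 3, 4}}, {"id": "t2", "feats": {99}}]
count = lambda img, cal: len(img["feats"] & set().union(*(c["feats"] for c in cal)))
grow = lambda model, img, a: (model or []) + [img["id"]]
model, calibrated = calibrate_incrementally(targets, anchors, count, grow)
```

In the toy run, `t1` shares three features with the anchors and is calibrated; `t2` falls below the threshold and is skipped.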
- a system includes a processing device and a graphical user interface (GUI) executable by the processing device and accessible by a user device over a network.
- the processing device may be configured to retrieve a 4D BIM of a construction site, wherein the 4D BIM includes a 3D BIM that reflects which elements are to be constructed by a given date according to a construction schedule.
- the processing device may further retrieve a first 3D point cloud model of a construction site with known 3D pose relative to the 3D BIM, wherein the 3D pose is composed of a 3D position and orientation relative to the 3D BIM.
- the processing device may retrieve a second 3D point cloud model generated at a later time than the first 3D point cloud model with known 3D pose relative to the BIM.
- the processing device may further execute an alignment tool to display, in the GUI, a visual instantiation of the first 3D point cloud model and a visual instantiation of the second 3D point cloud model.
- the processing device may further receive a selection, through the GUI, of at least three points of the first 3D point cloud model and of the second 3D point cloud model that mutually correspond.
- the processing device may further align, using the alignment tool, the first 3D point cloud model with the second 3D point cloud model based on the at least three points.
- the processing device may also display, in the GUI, at least one of the first 3D point cloud model or the second 3D point cloud model superimposed on the 4D BIM to illustrate as-built construction progress over time with reference to the construction schedule.
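One standard way to compute such an alignment from three or more mutually corresponding points — the patent does not name a specific solver, so this is an assumed choice — is the SVD-based Kabsch/Horn method for the least-squares rigid rotation and translation:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with R @ src_i + t ~= dst_i,
    from >= 3 corresponding 3D points (Kabsch/Horn method)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(0), dst.mean(0)          # centroids
    H = (src - cs).T @ (dst - cd)              # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Applying the recovered `R` and `t` to the first point cloud brings it into the second cloud's frame (and hence, via the known pose, into the BIM's frame).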
- FIG. 1 is a diagram of a system 100 for executing the disclosed modeling and visualization of a construction site 10.
- the system 100 may include a plurality of cameras 102, 104, and 105, which may represent a multitude of cameras that capture images of the construction site 10 at different times. At least some of the cameras may be designated as anchor cameras from which reconstruction may be performed.
- the cameras may be standard mobile devices, UAV cameras, or digital cameras that embed metadata (such as exchangeable image file format (EXIF) data) within the digital images taken.
- the system 100 may further include a plurality of photo images 106, generally referred to herein as digital images or simply images, taken by the cameras of the construction site 10 (e.g., any kind of structure being built).
- the images 106 may be unordered and of any size, and may number from a single image up to hundreds or more.
- the system may further include one or more users 108 and a processing device 110 .
- the users 108 may facilitate gathering the images by taking the pictures with the cameras and inputting them into the processing device 110 through known storage or transfer devices.
- the cameras 102, 104, 105 may stay in a certain location to take pictures on a regular schedule, and in an automated fashion, upload the images to the processing device, without user intervention.
- the images 106 may be uploaded directly, wirelessly, remotely over a network, and/or via the cloud.
- the camera locations may be varied and provide views of the construction site 10 at different angles or perspectives. Some cameras may, therefore, capture information of the construction site that other cameras may miss. Adding images from the plurality of cameras may, however, allow for a more complete compilation of progress of construction of a building or other structure. Images may be added to the processing device 110 incrementally over time.
- the processing device 110 may include a handler 124 for simultaneous localization and mapping (SLAM) algorithms, structure-from-motion (SfM) algorithms and the like, a graphical user interface (GUI) interaction detector 128, memory 132, an image processor 136, and a graphical user interface (GUI) 138, among other components.
- the processing device may further include computer storage 139 working in conjunction with the memory 132 to store and provide to the processing device data and instructions.
- the computer storage 139 may include, but not be limited to, a building information model (BIM) 140 , images 144 , 4D visualizations, and scheduling information 152 such as work breakdown structure (WBS), weekly work plan (WWP), and the like. These may be databases that are joined into one database and/or otherwise include data that are related to each other through storage methodologies.
- the BIM 140 may be a computer aided design (CAD) or digital architectural drawings containing computer-rendering information generally referred to as a mesh that corresponds to different model elements of the construction site.
- Model elements may be 3D drawings of walls, columns, or other structures and features to be constructed, together with data to inform the construction process, e.g., pertaining to job site location, construction materials, and constraints.
- This mesh may be designed to the scale of an actual construction site, capable of correlation to portions of images of the construction site during construction, and may employ a hierarchical octree such as will be discussed in more detail.
- Such correlation makes possible registering and alignment of the images 106 against the 3D BIM and association of the registered images with the scheduling information 152 .
- the processing device 110 may form 4D visualizations for user viewing through the GUI 138 , indicating building progress for monitoring and planning purposes.
- the image processor 136 may help perform the modeling in conjunction with the handler 124 and, in some situations, the GUI interaction detector 128 .
- the GUI interaction detector 128 may detect points within the images 106 through the GUI 138 , which points may be correlated by the image processor 136 to locations within one of the BIMs 140 .
- the processing device 110 , e.g., the image processor 136 , may align the images to the BIM during point cloud generation to enable a suite of visualization tools, in which point clouds and 4D BIM may be explored in 4D, evaluated in terms of progress and building-plan deviations, and imbued with photorealistic architectural renderings.
- the benefits of the system 100 are relevant both for professional purposes and for customer interfaces.
- FIGS. 2A and 2B are a flow diagram illustrating a flow of the disclosed methods for modeling and visualization of a construction site.
- the flow diagram is disclosed in connection with an expanded description of the system 100 of FIG. 1 .
- the GUI 138 of the system 100 may include a UI front end 200 and the storage 139 may include a set of executables 280 , e.g., instructions for executing particular functions with reference to input data to generate certain output data to be stored and/or displayed.
- the blocks with a dashed-dotted pattern may be understood as database items, and therefore could be stored in one or more databases of the computer storage 139 , and may be stored in association with BIM-based and point-cloud based images or data.
- processing logic may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (such as instructions run on a processing device), firmware, or a combination thereof.
- the processing logic may render images and receive data through a user interface that may be a web-based user interface, a stand-alone user interface, or a combination thereof.
- the UI front end 200 may include a login page 202 , a company management page 204 , and a project management page 206 .
- a user may create a new user ( 210 ) and log in using new user credentials ( 212 ).
- the company management page 204 may receive company data 224 and respond to user selection and information inputs to create a company ( 220 ), edit a company ( 222 ), create a project ( 228 ), and edit a project ( 230 ).
- Information used to create a company may be saved within a database for the company data 224 and information used to create a project may be saved within a database for project data 226 .
- the project management page 206 may provide a number of different options for viewing planned construction projects in association with 3D BIM, 3D point clouds, and scheduling information, along with a number of user interaction options as will be explained in more detail.
- the system 100 may execute a set of executables 280 , which may be stored within the computer storage 139 and at least partially loaded into the memory 132 when executed by the processing device 110 .
- the project management page 206 may include multiple page sections, including a viewer 232 , a file explorer 234 , and a schedule interface 236 .
- the viewer 232 may include a set of section components for performing various functions, including but not limited to, a user interface component 238 , a measure component 240 , and an annotation component 241 .
- the user interface component 238 may facilitate rendering images and/or visualizations for annotation, including, for example, a 4D BIM 231 , a 4D point cloud 233 (including octree data that will be discussed in more detail), discrete 3D point clouds, resultant point clouds with registered images, and the like.
- the 4D point cloud 233 may be understood to be a set of 3D point clouds over a time period, and thus tracks changes to the structure of the construction site 10 .
- the 4D BIM 231 may be the original construction site plan (building information model) with integrated scheduling information, or an updated plan created after construction began.
- the measure component 240 may interact with the BIM-based and point cloud-based visualizations rendered within the GUI 138 in order to facilitate determining distance, angles, and volumes of subparts and structures of the construction site 10 .
- the annotation component 241 may facilitate the addition of annotations 247 (e.g., text boxes, comments, lines, geometric shapes, and the like) on top of the BIM-based and point cloud-based visualizations.
- annotations 247 may allow users to highlight and identify areas of the construction site that may need attention, for example, are behind schedule or carry some additional risk.
- the UI front end 200 of FIGS. 2A and 2B may further include the file explorer 234 , which is a page section of the project management page 206 .
- the file explorer 234 may enable users to upload assets 253 ( 246 ), edit the assets 253 ( 248 ), process images ( 250 ), and process video ( 252 ), as will be explained.
- the assets 253 may include images, videos, and documents.
- a time-lapse executable 292 may generate raw video 293 , which may be uploaded during image-to-BIM registration (e.g., execution of an image-to-BIM registration executable 290 ) in order to process the raw video 293 .
- the video may be processed at block 252 in order to convert the video into a series of images, which in turn may be processed as other images would be starting with block 250 , which may also be a point cloud generation option for a user in one embodiment.
- the system 100 may respond to user-initiated requests to process images (whether through upload, submission of individual images, or video processed to generate a series of images) into either an existing point cloud or to generate a new point cloud as follows ( 250 ).
- the system 100 may determine whether there exists a point cloud associated with the project, e.g., a 3D or 4D point cloud ( 254 ). If there is no point cloud, then the image may be sent to a structure-from-motion (SfM) executable 286 (or similar executable for another image-based reconstruction algorithm) for point cloud generation and optional alignment to a BIM, which will be described in more detail.
- the system 100 may further determine to which of available point clouds the image is to be aligned (assuming more than one point cloud exists for the construction site 10 ), where the most recent may be provided as default if a user does not choose otherwise ( 256 ).
- the system 100 may align the image to the selected point cloud anchor images 259 , which will be discussed in more detail later on ( 258 ).
- anchor images 259 may be initialized from calibrated images for which a 3D pose is already known with relation to the selected point cloud.
- the SfM algorithm executable 286 may include a geometry-informed SfM algorithm 287 , a multi-view stereo (MVS) code 288 , and an octree point cloud 289 .
- the geometry-informed SfM algorithm 287 may base creation of as-built, sparse point clouds on aspects of the geometry of the construction site derived from the BIM.
- the MVS code 288 may generate a dense point cloud from the sparse point clouds by grouping the calibrated images produced by the SfM algorithm into sets according to viewpoint and matching patches centered about each pixel in each image to patches in the other images, for example.
- the octree point cloud 289 may provide a nested octree structure onto which the dense point cloud may be organized.
- a modifiable octree structure may be employed for ordering and fast searching of large-scale point clouds in a convenient manner, as will be discussed in more detail.
- the registration may map the point cloud to real world site coordinates and the registration may occur before or during the 3D point cloud generation.
- the registration may map an alignment between images and a BIM in one example, and map an alignment between a point cloud and a BIM in another example. If the images are aligned to a BIM to which the point cloud is also aligned, the images are thereby also aligned to the point cloud.
- the processing logic may execute a point cloud-to-BIM registration executable 284 to generate a transformation matrix 257 that encodes the alignment of the point cloud to physical coordinates of the construction site.
- Such BIM-registered images may then act as constraints to the SfM algorithm executable 286 for use in generating the point cloud data 291 .
- the SfM algorithm executable 286 may incorporate the BIM-registered images to enable SfM-related processing to generate the point cloud data 291 , which is made up of as-built information of the construction site.
- the transformation matrix 257 is a 3 ⁇ 4 matrix that, when multiplied by a 3D point (represented as [x y z 1] coordinates), applies a translation, rotation, and scaling to move the 3D point from its original coordinates to the BIM coordinates, which represent the site coordinates.
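- As an illustration only (not taken from the patent itself), the 3×4 matrix described above can be applied to homogeneous [x y z 1] points with a few lines of NumPy; the function names here are assumptions for the sketch.

```python
import numpy as np

def make_similarity_matrix(scale, rotation, translation):
    """Build a 3x4 matrix [s*R | t] encoding scale, rotation, and translation."""
    return np.hstack([scale * rotation, translation.reshape(3, 1)])  # shape (3, 4)

def transform_points(M, points):
    """Apply the 3x4 matrix to Nx3 points given as [x y z 1] homogeneous rows."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))])  # N x 4
    return homog @ M.T  # N x 3 points in BIM/site coordinates

# Example: scale by 2 and shift by (1, 0, 0); rotation is identity here.
M = make_similarity_matrix(2.0, np.eye(3), np.array([1.0, 0.0, 0.0]))
pts = np.array([[1.0, 2.0, 3.0]])
out = transform_points(M, pts)  # -> [[3., 4., 6.]]
```

Because the point is represented homogeneously, a single matrix product applies rotation, scaling, and translation at once.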
- the system 100 may display the image and the point cloud to a user through the GUI 138 with the user interface component 238 , e.g., through execution of the image-to-BIM registration executable 290 .
- the user may then be able to select points (e.g., at least three points through a web interface or other UI) on the BIM and on the point cloud that mutually correspond (e.g., identify identical locations of these at least three points).
- the processing device 110 may then transform the point cloud such that the point cloud is aligned with the BIM based on the at least three corresponding points.
- Corresponding points may be saved to the transformation matrix 257 in one embodiment, in order to store data associated with transformation between BIM and the point cloud.
- the system 100 may execute the SfM executable 286 using the images from the existing point cloud as anchor images 259 . If the point cloud with these anchor images 259 is aligned to the BIM, then the new point cloud will also be aligned to the BIM.
- the interface component 238 may further receive, from a user, corresponding points between the image and the BIM on which to base alignment and reconstruction.
- the schedule interface 236 may facilitate the uploading of tasks ( 260 ), the creation of new tasks ( 262 ), and the editing of existing tasks ( 264 ), all of which may generate schedule data 266 .
- the schedule data 266 may be integrated with a 3D BIM to create a 4D BIM, e.g., a 3D BIM that reflects which elements are to be constructed by a given date according to a construction schedule. More specifically, the schedule data 266 may be integrated with a 3D BIM by assigning elements of the 3D BIM to corresponding tasks, thus creating the 4D BIM.
- the 4D BIM may be used to illustrate visually which tasks will be performed at a given time, who is scheduled to work in which locations, and any analytics information such as which locations are behind schedule or at risk for delay. These various features of the scheduling data will be illustrated and discussed in more detail.
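- A minimal, hypothetical data model for this schedule integration might look as follows; the class and field names are illustrative assumptions, not the system's actual schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Task:
    """A schedule task with the BIM element ids assigned to it (4D linkage)."""
    name: str
    start: date
    finish: date
    element_ids: list = field(default_factory=list)

def elements_planned_by(tasks, as_of):
    """Return the BIM element ids that should be built by `as_of` per the schedule."""
    done = set()
    for t in tasks:
        if t.finish <= as_of:
            done.update(t.element_ids)
    return done

tasks = [
    Task("pour footings", date(2018, 3, 1), date(2018, 3, 15), ["footing-A", "footing-B"]),
    Task("erect columns", date(2018, 3, 16), date(2018, 4, 10), ["col-1", "col-2"]),
]
planned = elements_planned_by(tasks, date(2018, 3, 20))  # footings done, columns not
```

Comparing this planned set against elements detected in the as-built point cloud is what allows behind-schedule locations to be flagged.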
- FIG. 4 is an image of a construction site that illustrates challenges associated with structure-from-motion modeling of the construction site, according to one embodiment.
- the disclosed system 100 and related methods differ from standard structure-from-motion (SfM) algorithms.
- experiments show that problems are observed with employing standard SfM algorithms. For example, there is no guarantee as to the completeness of the point clouds even though images may be captured from the entirety of a scene (see first image). Further, current algorithms may produce drift in long distances and may result in projective reconstructions which are not meaningful for engineering applications (see middle image). Additionally, the resulting as-built, point cloud models are often up-to-scale, and thus without units, and still have to be transformed into a site coordinate system before being useful for engineering applications.
- the user identifies a number of corresponding feature points between the point cloud and the BIM, and brings the feature points into alignment.
- There are 7 degrees of freedom for this transformation (3 rotation, 3 translation, 1 uniform scale).
- previous work employed a closed-form quaternion solution to the similarity transformation problem.
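- As a hedged sketch of such a closed-form solution, the SVD-based Umeyama method below recovers the same 7-degree-of-freedom similarity (rotation, uniform scale, translation) from corresponding point pairs; the quaternion formulation cited above yields an equivalent result, and all names here are illustrative.

```python
import numpy as np

def umeyama_similarity(src, dst):
    """Closed-form similarity (s, R, t) with dst ~= s * R @ src + t,
    via the SVD-based Umeyama method."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    A, B = src - mu_s, dst - mu_d
    cov = B.T @ A / len(src)              # cross-covariance of centered sets
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                      # guard against a reflection
    R = U @ S @ Vt
    var_s = (A ** 2).sum() / len(src)     # variance of the source points
    s = np.trace(np.diag(D) @ S) / var_s
    t = mu_d - s * R @ mu_s
    return s, R, t

# Verify on synthetic data: known scale 2, identity rotation, shift (1, 2, 3).
src = np.array([[0, 0, 0], [1, 0, 0], [0, 2, 0], [0, 0, 3]], float)
dst = 2.0 * src + np.array([1.0, 2.0, 3.0])
s, R, t = umeyama_similarity(src, dst)
```

With three or more non-collinear correspondences the solution is exact for noise-free data and least-squares optimal otherwise.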
- the challenge in this process is that the user has to select the features from whatever 3D points that are reconstructed as part of the point cloud. If a user is interested in selecting the corner of a wall or column that does not have any 3D representation (either because the surface is flat, or because the point cloud is incomplete), there is no mechanism to select the feature via the images that are registered with respect to the point cloud model.
- a previous system achieved more accurate and complete point clouds due to better camera registration and higher success in localization of images into the underlying base point cloud by leveraging the 3D BIM, which may include registered GPS locations from the construction site.
- this previous system still has not overcome several challenges.
- the previous system required the user to randomly and manually identify an anchor camera, select the correspondence between the 3D BIM and the anchor camera, and initiate the 3D reconstruction process.
- the choice of the anchor camera is left blindly to the user, who may have little guidance or intuition for which image to choose.
- the previous system identified a subset of images with high overlap with the anchor camera (via a homography transformation) and used the images that have small 3D baselines for initializing the 3D reconstruction.
- This strategy, when used for triangulation of the 3D points via the Direct Linear Transform (DLT), results in numerical instability, produces a sparse 3D model as the initial reconstruction, and typically breaks down after a few steps in the incremental bundle adjustment optimization procedure. Additionally, the user needed to supervise the entire process because the reconstruction could fail due to the challenges introduced in identifying the subset of images. Such failures required the user to again manually align a new anchor camera with the 3D BIM to resume the process. For a dataset of about 100-150 images, there could be 6-10 anchor cameras. In addition to the heavy user input, the user had to wait until the algorithm failed and then provide a new anchor camera. The selection of a new anchor camera extended the computation of the SfM algorithm by a few orders of magnitude and required constant supervision by the user.
- the computer storage 139 may store a 3D BIM of a construction site, anchor images, each depicting a viewpoint having a 3D pose and containing features with known 3D positions with respect to the 3D BIM, and target images of the construction site with undetermined 3D position and orientation.
- the 3D pose may be defined by a 3D position and orientation relative to the 3D BIM.
- the features may include at least one of structural points, edges, objects, or textured surfaces within the 3D BIM.
- the system 100 may also include one or more 3D point clouds that depict the as-built conditions of the construction site, which may be derived from the anchor images or other images or 3D measurement devices.
- the GUI 138 may display images and point clouds with known 3D pose together with the BIM, enabling the user to compare as-built conditions to plans for the purpose of assessing progress towards task completion.
- the system 100 may also include scheduling data integrated with the BIM to further facilitate the assessment of task progress.
- the processing device 110 may determine the 3D pose of yet-undetermined target images and produce a 3D point cloud from the images with a determined 3D pose.
- the processing device may determine 3D pose using a subset of anchor images.
- the subset of anchor images may be chosen from all available calibrated images with known 3D pose using the GUI 138 and which may be stored in the images database 144 .
- the subset of anchor images may be identified by the date at which the anchor images were photographed (e.g., using the most recent photos), or by matching features in the target images to the calibrated, anchor images.
- the processing device 110 may detect target features within the target images and determine matches between the target features and the features of the anchor images, to generate a set of matching features.
- the processing device 110 may further determine the subset of the target images that have at least a threshold number of the matching features, and may tag the subset of target images as such within the images database 144 .
- the threshold number of matching features may be a predetermined threshold number (e.g., 50, 100, 150, 200, or more).
- the processing device 110 may further select a first image from the subset of the target images associated with one or more of the anchor images, the first image having the largest number of matching features.
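- The selection rule above (discard images below the match threshold, then pick the image with the most matches to calibrated images) can be sketched as follows; the function and variable names are assumptions for illustration.

```python
def select_next_image(match_counts, calibrated, threshold=100):
    """Pick the uncalibrated target image with the most feature matches to
    calibrated (anchor) images, ignoring images below the match threshold.
    `match_counts` maps image id -> number of matches to calibrated images."""
    candidates = {img: n for img, n in match_counts.items()
                  if img not in calibrated and n >= threshold}
    if not candidates:
        return None  # nothing left to calibrate at this threshold
    return max(candidates, key=candidates.get)

counts = {"IMG_001": 340, "IMG_002": 85, "IMG_003": 210}
first = select_next_image(counts, calibrated=set())         # "IMG_001"
second = select_next_image(counts, calibrated={"IMG_001"})  # "IMG_003"
```

Once an image is calibrated it joins the calibrated set, so the same rule yields the "next largest" image on each incremental step.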
- the processing device 110 may further execute an image-based reconstruction algorithm using the first image and the anchor images to calibrate the first image to the BIM and generate an initial 3D point cloud model of the construction site.
- the processing device 110 may perform one or more operations, in one or more combinations of code executions. For example, the processing device 110 may triangulate 3D points from matched features within the anchor images. The processing device 110 may further obtain an initial estimate of the 3D pose of the first image by determining the 3D pose that minimizes the sum of distances between the projections of 3D points corresponding to matching features and the 2D positions of the matching features in the first image, which may also be referred to as re-sectioning of the first image. The processing device 110 may further triangulate the subset of matching features between the first image and anchor images that are not already represented in the 3D point cloud.
- the processing device 110 may further refine the estimate of the intrinsic camera parameters (e.g., focal length and radial distortion) of the first image, the 3D pose of the first image, and the 3D positions of triangulated matching features to minimize the reprojection distance of the 3D points and their corresponding matching features in respective images, which may also be referred to as bundle adjustment of the first image and the anchor images.
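- The reprojection distance that bundle adjustment minimizes can be illustrated with a small NumPy sketch (assumed names; a pinhole model without distortion, for brevity):

```python
import numpy as np

def project(K, R, t, X):
    """Project Nx3 world points X with intrinsics K and pose (R, t) to pixels."""
    cam = (R @ X.T).T + t          # world -> camera coordinates
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]  # perspective divide

def reprojection_error(K, R, t, X, observed):
    """Mean distance between projected 3D points and their observed 2D
    features -- the quantity bundle adjustment minimizes over poses,
    3D points, and intrinsics."""
    return np.linalg.norm(project(K, R, t, X) - observed, axis=1).mean()

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R, t = np.eye(3), np.zeros(3)
X = np.array([[0.0, 0.0, 4.0], [1.0, 0.0, 4.0]])
obs = project(K, R, t, X)                  # noise-free observations
err = reprojection_error(K, R, t, X, obs)  # zero for a perfect fit
```

In the full system this residual would also include radial distortion terms, and the anchor-image poses would be held fixed during the optimization.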
- the 3D poses of the anchor images may be constrained to remain unchanged by the image-based reconstruction algorithm. After reconstruction, the first image may be added to the set of calibrated images, along with the anchor images.
- the processing device 110 may further incrementally repeat the last few steps to identify a second image from the subset of the target images that has a next largest number of matching features.
- the processing device 110 may then perform, starting with the initial 3D point cloud model and using the second image and the anchor images as constraints to the image-based reconstruction algorithm, 3D reconstruction to generate an updated 3D point cloud model of the construction site.
- This processing may also be referred to as incremental 3D reconstruction.
- the second image may be selected as the target image that is not in the calibrated images set (e.g., anchor images) and has the most matching features with calibrated images, and where the first image is now considered a part of those calibrated images, and thus excluded.
- the processing device 110 may further perform the above-listed set of operations, but now including the second image and the calibrated images as constraints, to build on the initial 3D point cloud and generate the updated 3D point cloud model.
- the processing device may further display a visual instantiation of the updated 3D point cloud model of the construction site in a graphical user interface of a display device, wherein the visual instantiation of the updated 3D point cloud model includes at least the first image and the second image aligned to the 3D BIM.
- the anchor images may be removed from the calibrated set and 3D points triangulated from only the calibrated target images to create a sparse 3D point cloud, e.g., a new 3D point cloud.
- the sparse 3D point cloud may then be stored by the system and displayed by the system in the graphical user interface, e.g., the GUI 138 .
- the calibrated target images may also be displayed. Since the 3D pose of the anchor images is known (e.g., the position and orientation of anchor images relative to BIM is known), the 3D pose of the calibrated target images is also known with respect to the BIM.
- the calibrated target images may be undistorted using the estimated intrinsic camera parameters, and the undistorted images, intrinsic camera parameters, 3D poses, detected features, and feature matching data of the target images may be stored, e.g., in the computer storage 139 ( FIG. 1 ).
- the newly calibrated target images may be added to the set of all anchor images and may be used to facilitate processing of any other images with undetermined pose. Storing the features and feature matches is not necessary but may reduce the computation time for future processes.
- the calibrated target images may be further processed to produce a dense point cloud using a multi-view stereo (MVS) algorithm, e.g., via execution of the MVS executable 288 .
- the MVS algorithm may use the 3D pose of the calibrated target images, the content of the calibrated target images, and, optionally, the matching features and corresponding 3D points to generate a larger number of 3D points.
- the processing device 110 may execute the MVS executable 288 to determine, for each calibrated target image, a number of other images that have a similar viewpoint of the scene.
- the processing device 110 may further match patches centered on each pixel in the target image to patches in another image of the number of other images.
- the processing device 110 may further triangulate 3D points using the matching patches.
- the 3D points may then be accumulated across images and any of a number of processes used to remove points that have low photometric scores or are inconsistent with other points, e.g., based on occlusion.
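- The photometric score used to compare patches is commonly normalized cross-correlation; the sketch below is an illustrative assumption, since the description does not fix a particular score.

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized image patches;
    scores near 1 indicate a good photometric match, near 0 or below a poor one."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

patch = np.array([[10, 20], [30, 40]])
same = ncc(patch, patch)       # identical patches -> 1.0
inverted = ncc(patch, -patch)  # contrast-inverted patch -> -1.0
```

Points whose supporting patches score below a chosen threshold across views would be the ones filtered out in the accumulation step above.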
- a 3D mesh may be constructed from the 3D points using any of a number of processes to fit 3D surfaces to the 3D points, including but not limited to Poisson surface reconstruction.
- the accuracy and completeness of image-based reconstruction may be enhanced by using 3D BIM as a constraint to the image-based reconstruction algorithm (e.g., SLAM, SfM, and the like).
- the disclosed BIM-assisted SfM procedure together with application of the SfM algorithm improves completeness and accuracy in 3D reconstruction in ways that will be discussed.
- the new SfM algorithm may be informed by a priori knowledge of scene geometry.
- Such geometry includes, but is not limited to, existing 3D building models (e.g., 3D BIM), 2D fiduciary markers with surveyed 3D coordinates, and surveying benchmarking points.
- Two-dimensional (2D) fiduciary markers are encoded image codes with known patterns that can be recognized through image processing. While 2D fiduciary markers can be printed out on paper or another medium and easily placed around the construction site, surveying benchmark points must be identified by a professional surveyor with equipment. For example, a benchmark is generally an existing item that will likely remain positioned without shifting for many years, serving as a stable elevation point. A surveyor may calculate, using surveying equipment, a precise location of the benchmark for use in determining other locations in relation to the benchmark.
- the processing device 110 may display a target image to the user that cannot be calibrated through feature matches together with the BIM.
- the user may then indicate three or more 3D coordinates on the BIM and corresponding 2D coordinates on the target image.
- the processor may then determine the 3D pose of the target image using the corresponding points and one of a number of Perspective-n-Point (PnP) algorithms, e.g., the Efficient PnP (EPnP) algorithm.
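- The EPnP algorithm itself is involved; as a hedged stand-in, the sketch below solves the same 2D-3D resection problem with the simpler Direct Linear Transform (DLT), recovering a 3×4 projection matrix from six or more correspondences. All names and values here are illustrative assumptions.

```python
import numpy as np

def dlt_resection(points_3d, points_2d):
    """Recover the 3x4 projection matrix P (with P ~ K[R|t]) from >= 6
    non-coplanar 2D-3D correspondences via the Direct Linear Transform."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    return Vt[-1].reshape(3, 4)  # null-space vector, up to scale

def project(P, X):
    h = P @ np.append(X, 1.0)
    return h[:2] / h[2]

# Synthetic check: project known 3D points with a known P, then re-estimate it.
P_true = np.array([[800.0, 0, 320, 0], [0, 800.0, 240, 0], [0, 0, 1, 0]])
pts3d = [(0, 0, 4), (1, 0, 5), (0, 1, 6), (1, 1, 4), (-1, 0, 5), (0, -1, 6), (2, 1, 7)]
pts2d = [project(P_true, np.array(p, float)) for p in pts3d]
P_est = dlt_resection(pts3d, pts2d)
residual = max(np.linalg.norm(project(P_est, np.array(p, float)) - q)
               for p, q in zip(pts3d, pts2d))  # near zero for noise-free data
```

EPnP improves on this by needing only four correspondences and handling known intrinsics directly, but the role in the pipeline is the same: turning BIM-to-image point correspondences into a camera pose.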
- the system 100 may transform the SfM algorithm into a constraint-based SfM algorithm by using the images associated with other anchor cameras as constraints.
- the disclosed methods do not necessarily need to rely on supervised 3D to 2D registration on anchor cameras.
- the disclosed methods may detect site-registered fiduciary markers and surveying benchmarks, automatically solve for registration, and automatically execute the entire 3D reconstruction.
- the fiduciary markers may be automatically detected during the point-cloud-generating process and the registration may be performed with at least three or more correspondences with the BIM to solve the similarity transformation between two coordinate systems as illustrated in FIGS. 5A and 5B .
- FIG. 6 is a set of images illustrating three-dimensional (3D) point clouds generated using images captured with a drone camera on the following jobsites, taken from left to right: $500M Sacramento Kings stadium project in Sacramento, Calif. with Turner Construction; Athlete Village in Minneapolis, Minn. with Mortenson Construction; Wajiki Dam in Kochi, Japan with Taisei Corporation.
- FIG. 7 is a set of images illustrating 3D point clouds generated using images captured with a drone camera on the following job sites, taken from left to right: Little Rock project in Denver, Colo. with W.E. O'Neil Construction; Zurich Insurance Company's headquarters project in Chicago, Ill. with Clayco Corp.; and McCormick Place hotel and stadium project in Chicago, Ill. with Clark Construction.
- an offline desktop interface may be provided for users, with access to the system 100 , to bring a point cloud and the images that generated the point cloud into alignment with the BIM.
- This alignment may be performed by selecting pairs of corresponding points from the point cloud and BIM.
- a point on the point cloud may be selected with the aid of an image that views the point (e.g., an anchor image if this point cloud is used for alignment later).
- the processing device 110 may execute instructions to, for each second image of the plurality of second images, display, in the web interface, the 3D BIM and the second image.
- the processing device may further receive a selection, through the web interface, of at least three points of the 3D BIM and the second image that mutually correspond, and solve a PnP algorithm between the second image and the at least three points of the 3D BIM to determine the 3D pose of the second camera.
- drawing the correspondence is performed by the system 100 without user interaction, e.g., with processing logic performing incremental 3D reconstruction.
- the processing logic may execute instructions to, for each second image of multiple second images corresponding to additional target images, detect a set of locations corresponding to site-registered fiduciary markers and surveying benchmarks in the second image.
- the processing logic may further register the second image using at least three locations of the set of locations corresponding to BIM locations within the 3D BIM, and solve a PnP algorithm between the second image and the at least three locations in the 3D BIM to determine the 3D pose of the second image.
- one or more of the executables 280 may be the source of the instructions executed by the processing logic.
- FIG. 8 is a set of images from within a web-based interface illustrating registration of BIM (or point clouds) with 3D point clouds according to an embodiment.
- an already reconstructed 3D point cloud might be available to the user or laser scanners may be used to generate the models.
- the system 100 may provide a web-based interface that allows the point cloud and BIM (or new point cloud) to be brought into alignment after receiving indications through the web-based interface of corresponding points.
- a user may indicate certain points on the point cloud corresponding to certain points on another point cloud or on a BIM.
- Point cloud and new point cloud alignment may facilitate the alignment of point clouds over time.
- An alignment tool (e.g., with execution of the point cloud-to-BIM registration executable 284 ) may be used in association with the web-based interface when newer point clouds are not automatically aligned to the old point cloud and when newer point clouds are generated from other software or from a laser scan.
- the interface illustrated in FIG. 8 is designed to assist the user in interacting with the BIM and point cloud (or with the point cloud independently) and manually provide the matching correspondences across the two models.
- the processing logic may execute instructions from the point cloud-to-BIM registration executable 284 to perform registration of BIM (or point clouds) with a 3D point cloud.
- the interface of the system may interactively display the BIM, project schedule, point cloud data, images, other project information (e.g., specifications) and the produced analytics on actual and potential performance deviations while accounting for computational power and connectivity bandwidth by using a nested octree data structure for point clouds on mobile devices such as smartphones and tablets.
- the main challenges that are addressed in visualizing integrated BIM and as-built point clouds in a web-based environment with limited memory and bandwidth are (1) techniques for handling, interacting with, presenting, and manipulating large-scale point clouds; (2) techniques for displaying polygonal meshes, e.g., the BIM, together with textured point clouds and high-resolution images; and (3) the deployment of fast and intuitive analytical tools.
- a modifiable octree data structure may be employed for ordering and fast searching of large-scale point clouds in a convenient manner.
- An octree is a general tree data structure that may be used in computer graphics to represent 3D space by recursively dividing the space into eight octants, forming a tree-like hierarchy.
- the octree may subsample the point cloud and store sets of points at octree nodes such that the union of all point sets stored in the octree forms the original point cloud.
- the point sets of a level down the hierarchy enhance the point cloud represented by the union of the point sets of higher levels of the tree-like hierarchy.
- This nested octree representation facilitates the rendering process, as the number of points projected to the same pixel on the screen is reduced by using the different hierarchical levels of detail of the point cloud; from those points, only one representative point may be needed to fill the pixel on the screen. Furthermore, the other octree points may only be displayed when the field of view corresponds to the particular level. To facilitate interaction and manipulation of the point cloud, after a user selection, the selection point is inserted into the octree and tracked down the hierarchy to find the positions of the closest points. This strategy takes only about two seconds for a point cloud containing 10 million points.
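The nested-octree scheme described above—each node keeps a small set of representative points and pushes the rest down the hierarchy, so the union over all levels reconstructs the full cloud—can be sketched as follows. The class name, node capacity, and depth limit are illustrative choices, not the patented implementation:

```python
import numpy as np

class OctreeNode:
    """Nested octree: each node stores a subsample of points; children refine it.

    The union of the point sets over all nodes equals the original cloud,
    mirroring the level-of-detail scheme described in the text."""
    def __init__(self, center, half_size, capacity=4, max_depth=8, depth=0):
        self.center = np.asarray(center, dtype=float)
        self.half_size = float(half_size)
        self.capacity = capacity          # representative points kept at this level
        self.max_depth = max_depth
        self.depth = depth
        self.points = []                  # points stored at this LOD level
        self.children = [None] * 8        # one child per octant

    def _octant(self, p):
        # Octant index 0-7 from the sign of (p - center) on each axis.
        idx = 0
        for axis in range(3):
            if p[axis] >= self.center[axis]:
                idx |= 1 << axis
        return idx

    def insert(self, p):
        p = np.asarray(p, dtype=float)
        if len(self.points) < self.capacity or self.depth == self.max_depth:
            self.points.append(p)         # keep as a representative at this level
            return
        i = self._octant(p)
        if self.children[i] is None:
            offset = np.array([(1 if (i >> a) & 1 else -1) for a in range(3)])
            child_center = self.center + offset * self.half_size / 2
            self.children[i] = OctreeNode(child_center, self.half_size / 2,
                                          self.capacity, self.max_depth, self.depth + 1)
        self.children[i].insert(p)

    def collect(self, max_depth=None):
        """Gather points down to a given level of detail (None = all levels)."""
        pts = list(self.points)
        if max_depth is None or self.depth < max_depth:
            for c in self.children:
                if c is not None:
                    pts.extend(c.collect(max_depth))
        return pts
```

Rendering a coarse view then amounts to calling `collect(max_depth=k)` for a small `k`, which is why only a fraction of the points need to be loaded for distant viewpoints.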
- the disclosed system may employ a web graphic library (e.g., WebGL) viewer with functionalities so that users can browse a point cloud through a first-person view.
- the user may pan, rotate, move, and fly through the point cloud to explore the scene in detail.
- the system 100 may automatically load only the points inside the view, reducing loading time by not loading all points for detailed views initially.
- the functionalities of this viewer are listed below. This may significantly reduce drawing on processing resources, e.g., processing bandwidth, power, and the like, and facilitate loading such scenes into lower bandwidth-capable and lower processing-capable devices such as mobile, hand-held devices, and other such computing devices.
- the system 100 may provide a user the ability to view the images together with the point cloud from any camera viewpoint, as illustrated in FIGS. 9A and 9B .
- the camera calibration information may be derived through the image-based 3D reconstruction procedure discussed earlier and be used to create a frustum in the scene.
- the frustum may be depicted as a wireframe connecting the corners of the image surface to each other (to form a rectangle) and connecting the corners of the image surface to its camera center.
- the frustum has the shape of a pyramid, with the base representing the bounded image plane and the vertex representing the position of the camera.
- the base of the frustum is texture-mapped with the image that was captured from that camera location.
- the texture-mapped image surface may be made semi-transparent so that the features of the 3D point cloud and/or BIM are visible together with the features of the overlaid image.
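The frustum construction described above—four base corners from the image rectangle plus the camera center as apex—can be derived from the calibration data under a standard pinhole model. The function names and the use of intrinsics `K` and world-to-camera rotation `R` are assumptions for illustration; the `depth` parameter only scales the rendered base:

```python
import numpy as np

def camera_frustum(K, R, C, width, height, depth=1.0):
    """Return the apex and four base corners (world coords) of a camera frustum.

    K: 3x3 intrinsics; R: 3x3 world-to-camera rotation; C: camera center (world).
    The base is the image rectangle back-projected to the given depth."""
    K_inv = np.linalg.inv(K)
    corners_px = np.array([[0, 0, 1], [width, 0, 1],
                           [width, height, 1], [0, height, 1]], dtype=float)
    corners_world = []
    for c in corners_px:
        ray_cam = K_inv @ c                      # ray direction in the camera frame
        ray_cam = ray_cam / ray_cam[2] * depth   # point on the image plane at `depth`
        corners_world.append(R.T @ ray_cam + C)  # camera frame -> world frame
    return np.asarray(C, dtype=float), np.asarray(corners_world)

def frustum_edges(apex, corners):
    """Wireframe: rectangle around the base plus lines from each corner to the apex."""
    edges = [(corners[i], corners[(i + 1) % 4]) for i in range(4)]
    edges += [(apex, corners[i]) for i in range(4)]
    return edges
```

The eight returned edges are exactly the wireframe described in the text: four forming the image rectangle and four connecting its corners to the camera center.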
- FIGS. 10A through 10C are images illustrating the impact of controlling the level of detail from low detail to high detail within a web-based interface according to various embodiments.
- FIG. 10D is an image illustrating a point cloud height map according to an embodiment.
- FIG. 10E is an image illustrating a point cloud colored in red, green, and blue (RGB), with these colors arranged from top to bottom, according to one embodiment.
- a number of controls are also introduced for better visualization and manipulation of the point clouds. For example, the user can decide the level of detail, point size, and the opacity of the points to better manage memory limitations on mobile devices such as smartphones or tablets. Increasing the level of detail using the controls on the viewer 232 and its impact on the visualization of the point cloud is shown in FIGS. 10A-10C .
- Color coding of the point cloud can also be done based on the average color values captured from all images that see the points, or based on a color spectrum to show the relative height, as illustrated in FIGS. 10D and 10E .
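Both color-coding schemes just described—averaging the color samples from all images that see a point, and mapping relative height onto a color spectrum—are straightforward to sketch. The blue-to-green-to-red spectrum below is an assumed choice; the text only specifies "a color spectrum to show the relative height":

```python
import numpy as np

def height_to_rgb(points):
    """Color points by relative height with a blue (low) -> green -> red (high) spectrum.

    points: (N, 3) array; returns (N, 3) RGB values in [0, 1]."""
    z = points[:, 2]
    t = (z - z.min()) / max(z.max() - z.min(), 1e-9)  # normalize height to [0, 1]
    r = np.clip(2 * t - 1, 0, 1)
    g = 1 - np.abs(2 * t - 1)
    b = np.clip(1 - 2 * t, 0, 1)
    return np.stack([r, g, b], axis=1)

def average_image_color(colors_per_image):
    """Average the RGB samples of one point as seen from multiple images."""
    return np.mean(np.asarray(colors_per_image, dtype=float), axis=0)
```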
- FIG. 10F illustrates the octree used for pre-processing and query of the points in the cloud.
- FIG. 11 is an image of the web-based interface in which a user may measure the volume in a 3D environment and track the changes from a different time, according to an embodiment.
- the changes in volume may be an indication of construction progress and be provided as a scheduling metric. This allows the user to have a better understanding of the current state of excavation, for example.
- a clipped region may be set to different widths to show the terrain difference and height information, as illustrated in FIGS. 12A-12D , which are images within a web-based interface illustrating the segmentation of point clouds for generating cross sections and volumetric measurements of a construction site according to an embodiment. Note that a first clipped region 1202 in FIG. 12A may be isolated for measurement ( FIG. 12B ) and that a second clipped region 1204 may also be isolated for measurement ( FIG. 12D ).
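One simple way to turn point clouds from two survey dates into the volume-change metric described above is voxel occupancy: count the voxels a cloud occupies and multiply by the voxel volume. The patent does not specify its volume computation, so this voxel approximation is an illustrative assumption:

```python
import numpy as np

def occupied_volume(points, voxel=0.5):
    """Approximate the volume occupied by a point cloud as
    (number of occupied voxels) x (voxel volume)."""
    keys = np.unique(np.floor(np.asarray(points, dtype=float) / voxel).astype(int), axis=0)
    return len(keys) * voxel ** 3

def volume_change(cloud_t0, cloud_t1, voxel=0.5):
    """Progress metric: change in occupied volume between two survey dates,
    e.g., excavated or placed material between site captures."""
    return occupied_volume(cloud_t1, voxel) - occupied_volume(cloud_t0, voxel)
```

A positive change between two excavation-site captures would indicate added material (or scanned structure); a negative change, removal.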
- the disclosed system 100 may be built on top of a cloud server for hosting the BIM.
- the cloud server may be loaded with Autodesk's Forge, although other web-based platforms are envisioned.
- the cloud server can support over 60 different construction-related file types and translate the files into a format suitable for the web platform.
- the BIM model that is stored in the cloud may contain the geometry information and the semantic information such as element quantity, element color, element material, and element structural analytics.
- the geometry and other semantic information may be queried from the cloud-based server for visualization and information-retrieval purposes.
- FIG. 13 is an image of joint modeling and representation of BIM and point clouds within a web-based interface according to an embodiment.
- FIG. 13 illustrates different snapshots of the BIM model hosted on the server that is superimposed with the point cloud.
- colors may represent the original color of the BIM elements.
- a user selection may trigger a query to the server to extract the interrelated semantic information embedded in BIM elements.
- the semantic information used for construction progress monitoring includes, e.g., expected construction materials and element inter-dependency information.
- FIG. 14 is an image of the web-based interface in which a user may either annotate point clouds or their corresponding images to create and visualize process information according to project assignment, according to an embodiment.
- FIG. 15 is an image illustrating, within the web-based interface, four dimensional (4D) simulation of construction activities, which includes a 3D point cloud model and scheduling information, according to an embodiment.
- the system 100 may also be adapted to capture and document work crew assignments (e.g., who does what in what location) to measure deviations between expected work-in-progress on a construction site (via 4D BIM that includes schedule data) and actual state of work in progress (via point clouds and images).
- the foreman or the superintendent of the crew logs into the system 100 (e.g., at block 212 ), chooses the personnel who are going to be specifically focusing on the assigned task from a list, and associates the personnel with a work task and a location (e.g., a group of BIM elements). Then, the system 100 may take the submitted information and assign a color to the location.
- the personnel associated with a construction crew may be highlighted in a particular color on the point cloud (end of arrow) and the relevant information about the work process may be provided in the control box on the top left.
- the disclosed web-based interfaces allow the teams on a jobsite to log into the system 100 every day, commit their crew to specific locations, track work in progress, and get real-time feedback on locations available for them. This strategy provides actionable information to project management, as they can mobilize teams into work areas that can maximize productivity, minimize interferences among teams, and allow the measurement of actual productivity of each team and the labor stability index, which will be discussed in more detail below.
- This process also streamlines in-process quality inspection.
- the last planner may log into the system and document the task completion.
- Task completion information may then be pushed into the system, which informs the inspector that a location is ready for inspection. If the inspector needs certain information from the work crew, the system 100 allows the inspector to directly extract that information without the need for an additional round of communication with the work crew or other personnel.
- the inspector can also use the system 100 for documentation purposes and inform all parties engaged in the task including the work crew or contractor, project management on site, and the owner about the full completion and approval of that task for billing purposes.
- disclosed processing logic may display, in the GUI, scheduling data associated with the 4D BIM within at least one of a bar chart, a spreadsheet, or a calendar.
- the processing logic may further detect, through the GUI, a user selection of a unit of measure for benchmarking and tracking progress associated with a work task.
- the processing logic may further, in response to detection of the user selection, query one or more databases for quantities associated with the selection, and return the quantities and, optionally, other data pertaining to the query.
- the system may compute task readiness metrics such as "task readiness index," "task readiness reliability index," and "location risk index" that together measure potential performance problems at the scheduled-task and work-location levels, using actual progress deviations in both preceding construction tasks and their logistical and contractual constraints.
- disclosed methods facilitate mapping and visualizing the location and activity of construction crews in addition to observed physical changes to the scene of a construction site and proposed alternatives in a web-based environment where point cloud and BIM may be jointly visualized.
- progress deviations and risk for potential delays may be color-coded over the 3D BIM using metaphors such as traffic light colors.
- Polygon representations may be used to locate work crews in an interactive 3D viewer to provide an intuitive assessment of current state of progress, their activities, or the risk associated with potential schedule delays.
- RI: Readiness Index
- RR: Readiness Reliability
- LRI: Location Risk Index
- RI and RR metrics may provide the construction team the opportunity to examine the readiness of upcoming tasks and thereby proactively monitor the project progress. The construction team may then have a better understanding of which tasks can start on time and which ones require revisions in the upcoming coordination meetings. Instead of retroactive metrics, such as Percent Plan Complete (PPC) or Earned Value Analysis (EVA) metrics such as Schedule Performance Index (SPI) and Cost Performance Index (CPI), RI and RR may help the team proactively adjust the flow of people, material, work, and information according to schedule. To predict the readiness of an upcoming task, the system 100 may acquire the statuses of the predecessor tasks and task constraints, the expected status of the task at the time of calculation, and the reliability of the task's on-time completion.
- the Readiness Index measures the readiness of the task to be executed, which depends on whether all predecessors and task constraints are completed or released.
- RI may receive, as an input, a measure of “Percent Complete” (PC) for each predecessor task and task constraint.
- Percent Complete may be defined as the percent of physical progress towards completion of the task or the task constraint.
- Task constraints may be organized as approvals, information, labor, equipment, prerequisite work, directives, spaces, and safety.
- the Percent Complete of tasks and task constraints may be reported by engineers or subcontractors in the field, or by comparing Reality models (3D point clouds and 2D images) with the 4D BIM (model and schedule). The following process is recommended to objectively measure Percent Complete for a task. First, identify the unit of measure for tracking task progress.
- units of measure examples include cubic yard (C.Y.) for concrete placement, linear foot (L.F.) for drywall installation, and square foot contact area (S.F.C.A) for formwork.
- a total quantity of work for the task may be computed by conducting quantity takeoff among associated elements in the 4D BIM, e.g., 100 cubic yards of concrete.
- Percent Complete per scheduled task may be calculated as the actual physical progress, ideally computed based on the completed elements and unit of measure, divided by the total quantity of work.
- Onsite project engineers and subcontractors may also provide direct input for the Percent Complete of task and task constraints. Other sources such as daily construction reports, photo logs, billing documentation, and the like provide equivalent information when the previous two are not available.
- ω_j is a weight determined by the importance of the predecessor tasks and the task constraints. The weight may be defined from the project engineers' experience, or as a function of the total float from the master schedule as in Equation 2.
- the weights may also be determined based on the total float (TF) as in Equation 2.
- Total float is a basic metric from the Critical Path Method (CPM) and is defined as the difference between the early start date and the late start date of the current task.
- Total Float indicates how much buffer a task may have before the task delays the whole schedule. The smaller the total float, the more critical the task. When the total float is zero (“0”), the task is on the critical path and cannot be delayed without impacting the entire project schedule and its project completion date.
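The RI computation described above can be sketched as a weighted sum of Percent Complete values over predecessor tasks and task constraints. Equations 1 and 2 are not reproduced in this excerpt, so two assumptions are made: RI is the weight-normalized sum of PC values, and the Equation 2 weights are inversely related to total float (smaller total float means a more critical task and a larger weight):

```python
def total_float_weights(total_floats):
    """Weights in the spirit of Equation 2: smaller total float => larger weight.

    The exact functional form of Equation 2 is not given in this excerpt; an
    inverse relation normalized to sum to one is assumed here."""
    raw = [1.0 / (1.0 + tf) for tf in total_floats]
    s = sum(raw)
    return [w / s for w in raw]

def readiness_index(percent_complete, weights):
    """RI: weighted Percent Complete over predecessor tasks and task constraints."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(w * pc for w, pc in zip(percent_complete, weights))
```

A task whose predecessors are all 100% complete would thus have RI = 1.0, i.e., fully ready to start.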
- tasks on the Weekly Work Plan (WWP) may be mapped to tasks from the master schedule to get the total float.
- the Readiness Reliability (RR) metric may be calculated as a composite of the current status RI and the reliability of the work crew to finish the remaining task according to Equation 3.
- RR_i = RI_i + Σ_j ω_j · (1 − PC_j) · ρ_mj (3)
- ρ_mj represents the probability of work crew (m) finishing the remaining task (j)
- PC j is the Percent Complete on task j.
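Equation 3 is garbled in this extraction; reading it together with the surrounding definitions, RR appears to add to RI one correction term per remaining task j, weighted by the task's incompleteness (1 − PC_j) and the crew's finish probability ρ_mj. A sketch under that reading (the summation structure is a reconstruction, not quoted from the patent):

```python
def readiness_reliability(ri, weights, percent_complete, crew_prob):
    """RR_i = RI_i + sum_j w_j * (1 - PC_j) * rho_mj (reconstructed Equation 3).

    crew_prob[j] (rho_mj) is the probability that crew m finishes the remaining
    portion of task j on time."""
    return ri + sum(w * (1.0 - pc) * rho
                    for w, pc, rho in zip(weights, percent_complete, crew_prob))
```

Plugging in numbers like the steel-sequence example below (RI of 0.8, one predecessor 80% complete, a 46.5% on-time finish probability) gives RR = 0.8 + 1.0 × 0.2 × 0.465 = 0.893.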
- the system 100 may compute a goodness of fit to determine the best statistical distribution (e.g., Poisson, Exponential, Weibull, Johnson, or the like) to estimate the expected duration via prior tasks executed by the same work crews.
- productivity is calculated by dividing installed quantities by man-hours.
- the quantities may be automatically extracted from the production level 4D BIM, and the man hours may be input by an on-site project engineer, for example.
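The productivity computation just described is a single division of quantity by man-hours; using the planned figures from the steel example below, 50,000 lbs over 80 expected man-hours gives 625 lbs per man-hour:

```python
def productivity_rate(quantity_installed, man_hours):
    """Productivity: installed quantity (e.g., lbs of steel, C.Y. of concrete)
    per man-hour, with quantities taken from the 4D BIM and man-hours reported
    by on-site engineers."""
    if man_hours <= 0:
        raise ValueError("man_hours must be positive")
    return quantity_installed / man_hours
```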
- a productivity data example of one steel subcontractor performing steel erection is shown in Table 1.
- steel sequence 1 has a total duration of 6 days with one day left.
- the total quantity of the task is 242,000 lbs.
- 50,000 lbs steel is planned to be erected with 80 expected man hours.
- Steel sequence 2 is the successor, and we want to calculate its RR. Based on the quantities, it is known that 80% of the work is done, and with only one predecessor the RI is 0.8.
- the system 100 may compute a probability of 46.5% of finishing the task on 23 April.
- the best statistical model for representing ρ_mj may be chosen by employing a goodness-of-fit test or by visually inspecting plots of the probability density function (PDF), cumulative distribution function (CDF), quantiles, and survivor function.
- the system 100 may process the following statistical models: Johnson distribution, exponential distribution, Poisson distribution, and Weibull Distribution, among others.
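Choosing a distribution by goodness of fit can be illustrated with one candidate model. The sketch below fits an exponential model to historical task durations by maximum likelihood and computes a Kolmogorov–Smirnov distance; in practice a statistics library (e.g., scipy) would fit all the candidate families named above, and the exponential choice here is only an example:

```python
import math

def fit_exponential(durations):
    """MLE for an exponential model of task durations: rate = 1 / mean."""
    mean = sum(durations) / len(durations)
    return 1.0 / mean

def ks_statistic_exponential(durations, rate):
    """Kolmogorov-Smirnov distance between the empirical CDF and Exp(rate).

    A smaller statistic indicates a better fit; comparing this value across
    candidate distributions is one way to pick the best model."""
    xs = sorted(durations)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-rate * x)                    # model CDF at x
        d = max(d, abs(cdf - (i + 1) / n), abs(cdf - i / n))  # vs. empirical CDF
    return d
```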
- the system 100 may highlight on the 4D BIM, through the user interface, top at-risk locations to communicate the most important work locations and their corresponding schedule tasks that need attention in coordination meetings and also support collaborative planning, one of the pillars of lean construction.
- Visual production control reports: Based on predictive visual data analytics, a method is developed to produce novel "visual production control" reports, including "reports that highlight actual progress deviations," "productivity reports," "master plan versus WWP reports," and "top at-risk locations" on a construction project. These reports contain both visual and non-visual analytics and provide insight on the "reliability" of weekly work plans and look-ahead schedules on construction projects.
- Progress Reports are organized around locations within the work breakdown structure (WBS) and may be offered to project teams on a weekly basis or on a daily basis.
- the data is usually compiled one day prior to weekly contractor coordination meetings to provide project teams with the most up-to-date state of work-in-progress for their coordination purposes.
- These reports can be customized to include status of work-in-progress on the prior week's activities. They can also be integrated with the look-ahead schedules to provide an outlook to the tasks that are planned to be executed within the next three, four, or six weeks.
- the progress reports empower project teams to conduct root-cause analysis on schedule performance, re-organize their weekly work plan and look-ahead schedule around WBS locations, and analyze the impact of risk for delays and plan changes on the overall project schedule and cost.
- the metrics that are used in the progress report are listed as below in Table 2.
- Percent Plan Complete (PPC): Measure of how well the project plan is being executed. May be calculated as the number of tasks completed on the day stated divided by the total number of tasks planned for the week, which measures the percentage of tasks 100% complete as planned.
- No. of Tasks Delayed: Number of scheduled tasks delayed from the prior week.
- No. of Tasks Repeated: Number of tasks scheduled to start in a week, but whose start dates are pushed to the subsequent week due to delays in their preceding tasks or their task constraints not being met.
- Tasks Not Anticipated: In the event a new task is added to the weekly work plan that was not initially scheduled, the new task will be flagged as "task NOT anticipated."
- Task Repeated: Tasks are flagged under this column when their start date is pushed to the subsequent week due to delays in their preceding tasks or their task constraints not being met.
- Root Cause Delay: This column in progress reports offers an opportunity for project teams to document root causes of task delays in their weekly coordination meetings.
- the recommended list of delay root-causes includes, for example: RFI not Answered, Information Not Available, Submittal Not Approved, Staff Not Available, Labor Not Available, Materials Not Available, Equipment Not Available, Poor Task Description, Prior Work Not Complete, Under-Estimated Effort, Task Sequence Change, Change in Work Plan, Contracts/Revisions, Conditions Unsatisfactory, Off Project Demands, Weather Conditions, Design Change.
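The PPC and task-count metrics from the report table above reduce to simple counts over the week's tasks. The dictionary schema below (`completed_on_time`, `start_pushed` fields) is hypothetical, chosen only to make the arithmetic concrete:

```python
def percent_plan_complete(tasks):
    """PPC: tasks 100% complete as planned / total tasks planned for the week.

    tasks: list of dicts with a 'completed_on_time' boolean (illustrative schema)."""
    if not tasks:
        return 0.0
    done = sum(1 for t in tasks if t["completed_on_time"])
    return done / len(tasks)

def count_repeated(tasks):
    """No. of Tasks Repeated: tasks whose start date was pushed to the next week."""
    return sum(1 for t in tasks if t.get("start_pushed", False))
```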
- FIG. 16A is a set of graphs that make up a progress report illustrating a performance overview of a construction project, including root causes for six-week delay, according to an embodiment.
- FIG. 16B is a graph illustrating a weekly work plan (WWP) and a look-ahead plan organized in work breakdown structure on which point clouds are recorded, as per various embodiments.
- the weekly Percent Plan Complete (PPC) provides an overview of how well the project plan has been executed over the past six weeks.
- the schedule task volume indicates how many tasks are committed by the subcontractors and whether the tasks are executed as planned.
- the prior week's PPC snapshots may provide performance broken down per subcontractor.
- the list of six-week delay root causes offers the project team the ability to quickly understand, communicate and analyze the root causes of delays and make plans to avoid them.
- FIG. 17A is a graph illustrating a productivity report with prior week point clouds, according to an embodiment.
- FIG. 17B is a graph illustrating the productivity report of FIG. 17A and further, from left to right, prior week quantities, prior week worker hours, and prior week productivity rates, according to various embodiments.
- the graphs of FIGS. 17A and 17B may be generated as a single report.
- Productivity reports offer detailed information of physical progress on construction sites that can be presented in various forms including percent plan complete per trade or per work package. Provided with the daily total number of man-hours committed to each task in the scheduling interface, e.g., the schedule interface 236 of the UI front end 200 , the reports can be presented at the trade or work package levels.
- Manpower and actual productivity rates are measured at the trade and work package levels. These reports empower project teams to stabilize personnel planning across their work locations and offer them actionable information that can improve reliability of their look-ahead schedules.
- the productivity reports may include a Productivity Rate metric, which may be physical progress, in the form of units of completed work, per worker-hour (e.g., for concrete placement: C.Y./worker-hour; for concrete formwork: S.F.C.A./worker-hour, where S.F.C.A. stands for square footage of contact area). These productivity rates may be presented at the specific trade or work package level.
- FIG. 18A is a graph illustrating an at-risk location report organized by locations within the work break down structure of the schedule, WWP, and look-ahead schedule, according to an embodiment.
- FIGS. 18B and 18C are graphs illustrating tasks organized similarly to a progress report, with additional task readiness columns, according to an embodiment.
- At Risk Location Reports may be organized around work breakdown structure (WBS) locations, which highlight and communicate potential delays in the project look ahead schedule (e.g., three, four, or six week look-ahead schedules).
- WBS work breakdown structure
- FIG. 19 is an image of the web-based interface illustrating a master schedule versus a weekly work plan (WWP) that includes risk reporting, according to an embodiment.
- This report may highlight risk between the 4D BIM that ties tasks in the weekly work plans to BIM elements and the 4D BIM that ties tasks in the master schedule to BIM elements.
- This report may visually highlight those BIM elements whose execution varies between the 4D BIM that ties tasks in the weekly work plans to BIM elements and the 4D BIM that ties tasks in the master schedule to BIM elements.
- These master schedule-versus-weekly-work plan-risk reports may be brought into pull planning meetings, contractor (e.g., work crew) coordination meetings, or can be directly used for reporting schedule performance to owners.
- the application of these reports may help bring immediate attention to the risk between how a project is being managed at the jobsite level and how progress was initially planned and communicated with the owner, and in turn may minimize time necessary for updating progress in a project's master schedule.
- system 100 may employ different visualization modes with different color coding on the disclosed models to clarify and set out the construction schedule, communicate the work crew location, state of progress, and at-risk locations.
- the system 100 may integrate the color-coded model with the reality point cloud models to more effectively communicate deviation between the as-built and as-planned models.
- a visualization mode may be built using 4D visualization (3D model+time) models including 3D point cloud models and/or a 4D BIM.
- the 4D BIM may include a 3D BIM that reflects which elements are to be constructed by a given date according to a construction schedule. As the user sets different dates, the 3D point cloud models and/or 4D BIM illustrate the corresponding progress on different sections in ways that will be explained in detail.
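Setting a date and asking which elements should already be built, or are in progress, as described above, amounts to filtering BIM elements by their scheduled dates. The element schema below (`id`, `scheduled_start`, `scheduled_finish`) is illustrative, not the patent's data model:

```python
from datetime import date

def elements_built_by(elements, as_of):
    """IDs of 4D BIM elements scheduled to be constructed by a given date."""
    return [e["id"] for e in elements if e["scheduled_finish"] <= as_of]

def elements_in_progress(elements, as_of):
    """IDs of elements whose scheduled window contains the chosen date."""
    return [e["id"] for e in elements
            if e["scheduled_start"] <= as_of < e["scheduled_finish"]]
```

Comparing `elements_built_by` against what the as-built point cloud actually shows is the essence of the progress-deviation display.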
- the processing logic retrieves the 4D BIM of the construction site from memory (or from a location elsewhere across a network).
- the processing logic may further align the 4D BIM to one or more 3D point cloud models, and display the 4D BIM, as aligned, superimposed on the one or more 3D point cloud models to illustrate as-built construction progress compared to the construction schedule.
- the one or more 3D point cloud models may be selectively removed to focus in on the 4D BIM and associated color coding as will be explained in more detail.
- a user may reselect the point clouds (as seen in the legends of many of the following Figures) to see what updated 3D point clouds indicate to be as-built progress.
- FIG. 20 is an image of a location visualization mode in which elements are grouped based on work breakdown structure (WBS), according to an embodiment.
- a location mode as illustrated in FIG. 20 offers a color-coded model based on the WBS (work breakdown structure) locations. Colors may be automatically assigned or picked for each location under the WBS.
- the color-coded model helps the user easily understand what tasks are grouped into a WBS location, extract the quantities of their corresponding BIM elements, and communicate the schedule and logistics among different activities.
- the hierarchical WBS tree menu on the left of FIG. 20 may have squares that are color coded in a way that matches the coloring on the BIM elements of the construction site.
- processing logic may retrieve a WBS set of data that categorizes sections of the elements of the 4D BIM according to WBS locations each of which is assignable to a different work crew.
- the processing logic may further display, in the GUI, the 4D BIM with color-coded sections of the elements according to respective WBS locations and display, in the GUI, a list of the WBS locations written in text and that includes a color legend corresponding to the color-coded sections.
- FIG. 21 is an image of a trade location mode within the web-based interface illustrating project assignment, according to an embodiment.
- the trade location mode may enable users to easily communicate work crew assignments to particular locations of the construction site.
- the trade location mode may be used during the coordination meeting and foremen meeting to clarify the daily and weekly plans.
- This trade location mode can also be used to communicate which crew is conducting what task at what locations, and which work locations are occupied or unavailable. This mode is also helpful in analyzing space utilization and crew mobilization to avoid site congestion.
- the processing logic may color code sections of elements of the 4D BIM according to work crew assignment, to generate color-coded sections viewable on the 4D BIM.
- the processing logic may display, in the GUI, a legend indicative of which work crew is handling which color-coded section according to color code.
- the processing logic may further display, in the GUI, a list of activities performed by respective work crews in a form of one or more of a bar chart, a spreadsheet, or a calendar, wherein the activities are color-coded the same as are the respective work crews in the 4D BIM and the legend.
- FIG. 22 is an image of a planned mode within the web-based interface that illustrates a 4D BIM that also reflects project schedule, according to an embodiment.
- the planned mode may provide the basic 4D BIM visualization. Highlighted elements (e.g., with a bold or fluorescent color) are those portions of the construction site that, according to the schedule, are currently in progress, and the elements without color or highlighting have been completed.
- the planned mode visualization may communicate the schedule visually with color coding of the 4D BIM. This visualization may help planners, engineers, and the like understand what tasks need to be led and conducted on the jobsite in the coming weeks, analyze and communicate where these tasks are located and what resources are necessary, and in turn avoid potential delays.
- FIGS. 23A and 23B are images of a state of progress mode within the web-based interface that illustrates actual progress status on a BIM-based model, according to an embodiment.
- the state-of-progress mode illustrates the color-coded model based on the physical progress of the activities.
- a basic state-of-progress mode communicates if the task is delayed (red) or on schedule (green).
- An advanced state-of-progress mode has a more-detailed task status listed in FIG. 23C . Note that the colors of the icons are, as listed from top to bottom in FIG. 23C , green, yellow, yellow, red, red, green, and red.
- the processing logic may color code sections of elements of the 4D BIM according to work progress defined by a group of work statuses, to generate color-coded sections viewable on the 4D BIM.
- the processing logic may further display, in the GUI, a legend illustrating which colors designate which work status of the group of work statuses, and display, in the GUI, a list of activities with reference to the color-coded sections in a form of one or more of a bar chart, a spreadsheet, or a calendar.
- the work statuses in the work progress include a combination of at least: started on time; late start; has not started; delayed; critical delay; and finished late.
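The status-to-color legend can be represented as a small lookup table. The exact assignment below is inferred from the icon colors listed for FIG. 23C (green, yellow, yellow, red, red, green, red) paired with the status list above plus an assumed "finished on time" status; it may differ from the patented embodiment:

```python
# Traffic-light-style mapping of work statuses to section colors
# (assumed assignment based on the FIG. 23C description).
STATUS_COLORS = {
    "started_on_time": "green",
    "late_start": "yellow",
    "has_not_started": "yellow",
    "delayed": "red",
    "critical_delay": "red",
    "finished_on_time": "green",
    "finished_late": "red",
}

def color_for(status):
    """Color used to shade a 4D BIM section for a given work status."""
    return STATUS_COLORS.get(status, "gray")
```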
- FIG. 24 is an image within the web-based interface in which locations most at risk for falling behind schedule are highlighted prior to a construction coordination meeting, according to an embodiment.
- a risk location mode illustrated in FIG. 24 may offer a color-coded model that indicates the at-risk location based on the LRI.
- the color coding follows the traffic light metaphor: high risk locations may be color coded in red, low risk locations in green, and medium risk locations in yellow, or according to some other color scheme to identify at-risk locations of different levels.
- the processing logic may color code sections of the elements of the 4D BIM according to the level of risk for potential delay in the construction schedule and display, in the GUI, a legend illustrating which colors designate which level of risk for potential delay.
- FIG. 25 is a flow chart of a method 2500 for computation of point cloud, as-built models for display and facilitation of user interaction with visualizations of construction progress monitoring, according to various embodiments.
- the method 2500 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (such as instructions running on the processor), firmware or a combination thereof.
- a processor of the system 100 or the processing device 110 performs the method 2500 .
- other components of a computing device or cloud server may perform some or all of the operations of the method 2500 .
- the method 2500 may begin with the processing logic storing, in the computer storage, a 3D BIM of a construction site, anchor images each depicting a viewpoint having a 3D pose and containing features with known 3D positions with respect to the 3D BIM, and target images of the construction site with unknown 3D position and orientation ( 2505 ).
- the 3D pose may include a 3D position and orientation relative to the 3D BIM, and the features may include at least one of structural points, edges, objects, or textured surfaces within the 3D BIM.
- the 3D pose of an anchor image may be determined by detecting selection, through the GUI, of corresponding points between the anchor image and the 3D BIM, or detecting selection, through the GUI, of corresponding points between an anchor 3D point cloud model, generated from the anchor images, and the 3D BIM.
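One common way to recover an anchor image's pose from user-selected 2D-3D point correspondences is the Direct Linear Transform. The sketch below is a generic computer-vision routine, not the patent's specific algorithm; the BIM/image point pairs are assumed to arrive as matched arrays, and the function name is hypothetical.

```python
import numpy as np

def dlt_projection(points_3d, points_2d):
    """Estimate a 3x4 camera projection matrix from >= 6 point
    correspondences using the Direct Linear Transform (DLT).
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        # Each correspondence contributes two linear equations in the
        # 12 entries of the projection matrix.
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    A = np.asarray(rows, dtype=float)
    # The projection matrix is the right singular vector associated with
    # the smallest singular value, reshaped to 3x4 (defined up to scale).
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)
```

The rotation and translation (the 3D pose relative to the BIM) can then be factored out of the recovered projection matrix.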
- the method 2500 may continue with the processing logic initializing a set of calibrated images, which have a known 3D pose relative to the 3D BIM, to be the anchor images ( 2507 ).
- the method 2500 may continue with the processing logic detecting target features within the target images ( 2510 ).
- the method 2500 may continue with the processing logic determining matches between the target features and the features of the set of calibrated images, to generate a plurality of matching features ( 2515 ).
- the method 2500 may continue with the processing logic determining a subset of the target images that have at least a threshold number of the matching features ( 2520 ).
- the method 2500 may continue with the processing logic selecting a first image from the subset of the target images that has the largest number of matching features ( 2525 ).
- the method 2500 may continue with the processing logic executing an image-based reconstruction algorithm using the first image and the anchor images to calibrate the first image to the BIM and generate an initial 3D point cloud model of the construction site ( 2530 ).
- the method 2500 may continue with the processing logic incrementally repeating the steps at blocks 2520 and 2525 to identify a second image from the subset of images with a next largest number of matching features ( 2535 ).
- the method 2500 may continue with the processing logic incrementally repeating the step at block 2530 to perform, starting with the initial 3D point cloud model and using the second image and the anchor images as constraints to the image-based reconstruction algorithm, 3D reconstruction to generate an updated 3D point cloud model of the construction site ( 2540 ).
- the initial 3D point cloud model may also be a constraint to the image-based reconstruction algorithm, and each iteration of reconstruction may work on a previous updated 3D point cloud model to generate a newly updated 3D point cloud model.
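The incremental loop of blocks 2520-2540 can be sketched as a greedy ordering by match count. The `count_matches` and `reconstruct` callables below are hypothetical stand-ins for the feature-matching and image-based reconstruction steps, which are not specified here.

```python
def incremental_calibration(anchor_images, target_images, count_matches,
                            reconstruct, threshold=50):
    """Greedy incremental reconstruction sketch (blocks 2505-2540).

    count_matches(image, calibrated) -> number of matching features
    reconstruct(image, calibrated, model) -> updated point cloud model
    """
    calibrated = list(anchor_images)   # block 2507: start from the anchors
    model = None
    remaining = list(target_images)
    while remaining:
        # Blocks 2515-2520: keep only images with enough matches.
        scored = [(count_matches(img, calibrated), img) for img in remaining]
        scored = [(n, img) for n, img in scored if n >= threshold]
        if not scored:
            break                      # nothing left that can be calibrated
        # Blocks 2525/2535: take the image with the most matches next.
        _, best = max(scored, key=lambda s: s[0])
        # Blocks 2530/2540: reconstruct with the anchors (and the current
        # model) as constraints, yielding an updated point cloud model.
        model = reconstruct(best, calibrated, model)
        calibrated.append(best)
        remaining.remove(best)
    return calibrated, model
```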
- the method may continue with the processing logic displaying a visual instantiation of the completed 3D point cloud model of the construction site in a graphical user interface of a display device, wherein the visual instantiation includes at least the first image and the second image aligned to the 3D BIM ( 2550 ).
- FIG. 26 is a flow chart of a method for alignment of 3D point clouds generated at different times to illustrate construction progress according to an embodiment.
- the method 2600 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (such as instructions running on the processor), firmware or a combination thereof.
- a processor of the system 100 or the processing device 200 performs the method 2600 .
- other components of a computing device or cloud server may perform some or all of the operations of the method 2600 .
- the method 2600 may begin with the processing logic retrieving a four-dimensional (4D) building information model (BIM) of a construction site, wherein the 4D BIM includes a three-dimensional (3D) BIM that reflects which elements are to be constructed by a given date according to a construction schedule ( 2605 ).
- the method 2600 may continue with the processing logic retrieving a first three-dimensional (3D) point cloud model of the construction site with a known 3D pose relative to the 3D BIM ( 2610 ).
- the method 2600 may continue with the processing logic retrieving a second 3D point cloud model that is generated at a later time than the first 3D point cloud model ( 2615 ).
- the method 2600 may continue with the processing logic executing an alignment tool to display, in the GUI 138 , a visual instantiation of the first 3D point cloud model and a visual instantiation of the second 3D point cloud model ( 2620 ).
- the method 2600 may continue with the processing logic receiving a selection, through the GUI, of at least three points of the first 3D point cloud model and of the second 3D point cloud model that mutually correspond ( 2625 ).
- the method 2600 may continue with the processing logic aligning, using the alignment tool, the first 3D point cloud model with the second 3D point cloud model based on the at least three points ( 2630 ).
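With at least three mutually corresponding points selected, the rigid transform between the two clouds can be recovered in closed form with the standard Kabsch/Horn method. The disclosure does not name a specific solver, so this is an illustrative sketch assuming a uniform rigid motion (no scaling).

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping `src` points onto
    `dst` points, given corresponding point pairs (Kabsch/Horn method).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

The first cloud is then brought into the second cloud's frame with `aligned = (R @ first_cloud.T).T + t`, after which both can be superimposed on the 4D BIM.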
- the method 2600 may continue with the processing logic displaying, in the GUI, at least one of the first 3D point cloud model or the second 3D point cloud model superimposed on the 4D BIM to illustrate as-built construction progress over time with reference to the construction schedule ( 2635 ).
- Preliminary results show that the metrics and the system 100 improve communication, coordination, and planning for construction projects.
- the project was delayed by two months before the team started to use the system 100 with the web-based and interactive interface.
- Plan Percent Complete (PPC) was employed, with respect to delayed tasks, to measure how the system 100 improved the communication of tasks after the system was introduced.
- PPC increased in a favorable trend, and the PPC of almost all the weeks after the system was introduced was above the national average. Specifically, the PPC increased 24% from a 46% baseline in six weeks and remained above the national average.
- the number of delayed tasks decreased from six to two per week, and the average number of repeated tasks was reduced from 12 to four per week, indicating that subcontractors became more committed to finishing tasks according to the plan.
- the system 100 also measured the reliability of predicting at-risk locations.
- the reported at-risk location tasks were delayed in seven of eight total tasks, and those tasks were also repeated multiple times over the duration. This shows that the disclosed methods reliably predicted the locations that were at risk for potential delay, since they in fact exhibited actual delays, mainly because appropriate control interventions were not proposed.
- the disclosed system 100 facilitates the interactive creation of 4D BIM within a web-based environment. This includes interactive association of BIM elements with scheduled tasks, via BIM selection tree or the visual 3D interface, and animating construction sequences via the web-based system.
- FIG. 15 shows the newly created interfaces. As shown in FIG. 15 , the project schedule—with Gantt chart, calendar, and timeline views—may be visualized at the bottom of the screen. A user can select any of the tasks listed in the project schedule and visually review the corresponding BIM elements and work location associated to that task in the 3D viewer. Alternatively, this interface can be used to associate BIM elements and various work locations to their corresponding tasks in the schedule.
- the project schedule interface allows the start and end dates of tasks and their durations to be changed interactively, giving project participants on the site and the last planners flexibility in making changes to the schedule, and in particular to weekly work plans, while the project participants are on site.
- the developed method also allows the integration of BIM elements and scheduled activities to be visualized and simulated in 4D through the GUI 138 on a display device.
- the user can drag each of the components of the interface (the model breakdown structure or the scheduler), float them on the screen, or drag and push them to one of the sides (top, bottom, left, or right).
- FIG. 27 illustrates a general computer system 2700 , which may represent the processing device 110 or any other device or system to which is referred or which is capable of executing the embodiment as disclosed herein.
- the computer system 2700 may include an ordered listing of a set of instructions 2702 that may be executed to cause the computer system 2700 to perform any one or more of the methods or computer-based functions disclosed herein.
- the computer system 2700 may operate as a stand-alone device or may be connected to other computer systems or peripheral devices, e.g., by using a network 2750 .
- the computer system 2700 may operate in the capacity of a server or as a client-user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment.
- the computer system 2700 may also be implemented as or incorporated into various devices, such as a personal computer or a mobile computing device capable of executing a set of instructions 2702 that specify actions to be taken by that machine, including, but not limited to, accessing the internet or web through any form of browser.
- each of the systems described may include any collection of sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.
- the computer system 2700 may include a memory 2704 on a bus 2720 for communicating information. Code operable to cause the computer system to perform any of the acts or operations described herein may be stored in the memory 2704 .
- the memory 2704 may be a random-access memory, read-only memory, programmable memory, hard disk drive or any other type of volatile or non-volatile memory or storage device.
- the computer system 2700 may include a processor 2708 , such as a central processing unit (CPU) and/or a graphics processing unit (GPU).
- the processor 2708 may include one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, digital circuits, optical circuits, analog circuits, combinations thereof, or other now known or later-developed devices for analyzing and processing data.
- the processor 2708 may implement the set of instructions 2702 or other software program, such as manually-programmed or computer-generated code for implementing logical functions.
- the logical function or system element described may, among other functions, process and/or convert an analog data source such as an analog electrical, audio, or video signal, or a combination thereof, to a digital data source for audio-visual purposes or other digital processing purposes such as for compatibility for computer processing.
- the processor 2708 may include a transform modeler 2706 or contain instructions for execution by a transform modeler 2706 provided apart from the processor 2708.
- the transform modeler 2706 may include logic for executing the instructions to perform the transform modeling and image reconstruction as discussed in the present disclosure.
- the computer system 2700 may also include a disk (or optical) drive unit 2715 .
- the disk drive unit 2715 may include a non-transitory computer-readable medium 2740 in which one or more sets of instructions 2702 , e.g., software, can be embedded. Further, the instructions 2702 may perform one or more of the operations as described herein.
- the instructions 2702 may reside completely, or at least partially, within the memory 2704 and/or within the processor 2708 during execution by the computer system 2700 . Accordingly, the databases displayed and described above with reference to FIGS. 2A and 2B may be stored in the memory 2704 and/or the disk unit 2715 .
- the memory 2704 and the processor 2708 also may include non-transitory computer-readable media as discussed above.
- a “computer-readable medium,” “computer-readable storage medium,” “machine readable medium,” “propagated-signal medium,” and/or “signal-bearing medium” may include any device that includes, stores, communicates, propagates, or transports software for use by or in connection with an instruction executable system, apparatus, or device.
- the machine-readable medium may selectively be, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
- the computer system 2700 may include an input device 2725 , such as a keyboard or mouse, configured for a user to interact with any of the components of system 2700 . It may further include a display 2730 , such as a liquid crystal display (LCD), a cathode ray tube (CRT), or any other display suitable for conveying information.
- the display 2730 may act as an interface for the user to see the functioning of the processor 2708 , or specifically as an interface with the software stored in the memory 2704 or the drive unit 2715 .
- the computer system 2700 may include a communication interface 2736 that enables communications via the communications network 2710 .
- the network 2710 may include wired networks, wireless networks, or combinations thereof.
- the communication interface 2736 may enable communications via a number of communication standards, such as 802.11, 802.17, 802.20, WiMax, cellular telephone standards, or other communication standards.
- the method and system may be realized in hardware, software, or a combination of hardware and software.
- the method and system may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems.
- a computer system or other apparatus adapted for carrying out the methods described herein is suited to the present disclosure.
- a typical combination of hardware and software may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
- Such a programmed computer may be considered a special-purpose computer.
- the method and system may also be embedded in a computer program product, which includes all the features enabling the implementation of the operations described herein and which, when loaded in a computer system, is able to carry out these operations.
- Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function, either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
Description
RI_i = Σ_j φ_j · PC_j, with Σ_j φ_j = 1 (1)

RR_i = RI_i + Σ_j φ_j · (1 − PC_j) · Ω_mj (3)
TABLE 1

Sequence | Date | Productivity (lbs/mhr)
---|---|---
Steel Seq. 1 | 18 April | 712.18
Steel Seq. 1 | 19 April | 302.23
Steel Seq. 1 | 20 April | 1046.38
Steel Seq. 1 | 21 April | 321.27
Steel Seq. 1 | 22 April | 546.82
Steel Seq. 1 | 23 April | —
Steel Seq. 2 | 24 April | —
LRI_i = Π_j RR_j (4)
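Equations (1), (3), and (4) can be evaluated directly once the task weights φ_j, the percent-complete values PC_j, and the factors Ω_mj are known. The sketch below treats Ω_mj as a per-task delay-impact factor and interprets the product in (4) as running over the risk ratings feeding a location; both readings, and the function names, are interpretive assumptions.

```python
def location_risk(pc, phi, omega):
    """Evaluate equations (1) and (3) for one location i.

    pc[j]    : percent complete of task j (0..1)
    phi[j]   : task weight; the weights must sum to 1 (eq. 1)
    omega[j] : delay-impact factor Omega_mj from eq. (3)
    """
    assert abs(sum(phi) - 1.0) < 1e-9, "weights must sum to 1"
    ri = sum(f * c for f, c in zip(phi, pc))                           # eq. (1)
    rr = ri + sum(f * (1 - c) * w for f, c, w in zip(phi, pc, omega))  # eq. (3)
    return ri, rr

def location_risk_index(rr_values):
    """Eq. (4): LRI_i as the product of the relevant risk ratings."""
    lri = 1.0
    for r in rr_values:
        lri *= r
    return lri
```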
TABLE 2

Metric | Definition
---|---
Percent Complete (PC) | The percent complete of physical progress per scheduled task.
Percent Plan Complete (PPC) | Measure of how well the project plan is being executed. May be calculated as the number of tasks completed on the day stated divided by the total number of tasks planned for the week, which measures the percentage of tasks complete as planned.
No. of Tasks Delayed | Number of scheduled tasks delayed from the prior week.
No. of Tasks Repeated | Number of tasks scheduled to start in a week, but their start dates are pushed to the subsequent week due to delays in their preceding tasks or their task constraints not being met.
Tasks Not Anticipated | In the event a new task is added to the weekly work plan that was not initially scheduled, the new task will be flagged as "task NOT anticipated."
Task Repeated | Tasks are flagged under this column when their start date is pushed to the subsequent week due to delays in their preceding tasks or their task constraints not being met.
Root Cause Delay | This column in progress reports offers an opportunity for project teams to document root-causes of task delays in their weekly coordination meetings. The recommended list of delay root-causes includes, for example: RFI Not Answered, Information Not Available, Submittal Not Approved, Staff Not Available, Labor Not Available, Materials Not Available, Equipment Not Available, Poor Task Description, Prior Work Not Complete, Under-Estimated Effort, Task Sequence Change, Change in Work Plan, Contracts/Revisions, Conditions Unsatisfactory, Off Project Demands, Weather Conditions, Design Change.
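The Table 2 metrics for a weekly work plan reduce to simple counts and ratios. The field names in this sketch are illustrative, not the system's actual data model.

```python
def weekly_metrics(tasks):
    """Compute the Table 2 counts and the PPC ratio for one weekly plan.

    Each task is a dict with boolean flags; missing flags default to
    False (and `initially_scheduled` defaults to True).
    """
    planned = [t for t in tasks if t.get("planned_this_week")]
    done_on_day = sum(1 for t in planned if t.get("completed_on_stated_day"))
    return {
        # PPC: tasks completed on the day stated / tasks planned this week.
        "PPC": done_on_day / len(planned) if planned else None,
        "tasks_delayed": sum(1 for t in tasks if t.get("delayed_from_prior_week")),
        "tasks_repeated": sum(1 for t in tasks if t.get("start_pushed_to_next_week")),
        "tasks_not_anticipated": sum(1 for t in tasks if not t.get("initially_scheduled", True)),
    }
```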
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/956,266 US11288412B2 (en) | 2018-04-18 | 2018-04-18 | Computation of point clouds and joint display of point clouds and building information models with project schedules for monitoring construction progress, productivity, and risk for delays |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190325089A1 (en) | 2019-10-24
US11288412B2 (en) | 2022-03-29
Family
ID=68237891
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/956,266 Active 2039-11-20 US11288412B2 (en) | 2018-04-18 | 2018-04-18 | Computation of point clouds and joint display of point clouds and building information models with project schedules for monitoring construction progress, productivity, and risk for delays |
Country Status (1)
Country | Link |
---|---|
US (1) | US11288412B2 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210158546A1 (en) * | 2019-11-22 | 2021-05-27 | Baidu Usa Llc | Updated point cloud registration pipeline based on admm algorithm for autonomous vehicles |
US20210200174A1 (en) * | 2019-12-31 | 2021-07-01 | Johnson Controls Technology Company | Building information model management system with hierarchy generation |
US20210233274A1 (en) * | 2018-04-26 | 2021-07-29 | Continental Automotive Gmbh | Online Evaluation for Camera Intrinsic Parameters |
US11512955B1 (en) * | 2019-01-10 | 2022-11-29 | State Farm Mutual Automobile Insurance Company | Systems and methods for enhanced base map generation |
US11900670B2 (en) * | 2022-06-30 | 2024-02-13 | Metrostudy, Inc. | Construction stage detection using satellite or aerial imagery |
US11908185B2 (en) * | 2022-06-30 | 2024-02-20 | Metrostudy, Inc. | Roads and grading detection using satellite or aerial imagery |
Families Citing this family (105)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10462156B2 (en) * | 2014-09-24 | 2019-10-29 | Mcafee, Llc | Determining a reputation of data using a data visa |
EP3398164B1 (en) * | 2015-12-30 | 2020-04-01 | Telecom Italia S.p.A. | System for generating 3d images for image recognition based positioning |
US10635841B2 (en) | 2017-02-23 | 2020-04-28 | OPTO Interactive, LLC | Method of managing proxy objects |
US11080890B2 (en) * | 2017-07-28 | 2021-08-03 | Qualcomm Incorporated | Image sensor initialization in a robotic vehicle |
WO2019222255A1 (en) * | 2018-05-14 | 2019-11-21 | Sri International | Computer aided inspection system and methods |
US10810734B2 (en) * | 2018-07-02 | 2020-10-20 | Sri International | Computer aided rebar measurement and inspection system |
CN109241844B (en) * | 2018-08-03 | 2020-11-17 | 百度在线网络技术(北京)有限公司 | Attitude estimation method, device and equipment for three-dimensional object and storage medium |
CN112771850B (en) * | 2018-10-02 | 2022-05-24 | 华为技术有限公司 | Motion compensation method, system and storage medium using 3D auxiliary data |
US11301683B2 (en) * | 2018-10-10 | 2022-04-12 | Autodesk, Inc. | Architecture, engineering and construction (AEC) construction safety risk analysis system and method for interactive visualization and capture |
US20200210938A1 (en) | 2018-12-27 | 2020-07-02 | Clicksoftware, Inc. | Systems and methods for fixing schedule using a remote optimization engine |
US11182513B2 (en) | 2019-02-15 | 2021-11-23 | Procore Technologies, Inc. | Generating technical drawings from building information models |
US10733775B1 (en) * | 2019-02-15 | 2020-08-04 | Procore Technologies, Inc. | Generating technical drawings from building information models |
US11574086B2 (en) | 2019-02-15 | 2023-02-07 | Procore Technologies, Inc. | Generating technical drawings from building information models |
US10878628B2 (en) * | 2019-03-22 | 2020-12-29 | Cesium GS, Inc. | System and method for converting massive, single-material mesh datasets to a hierarchical format |
US20200311681A1 (en) * | 2019-03-26 | 2020-10-01 | Construction Materials Technologies, Inc. | Linked workflow structure systems and methods |
US11591757B2 (en) * | 2019-04-17 | 2023-02-28 | Caterpillar Paving Products Inc. | System and method for machine control |
US11917487B2 (en) | 2019-06-14 | 2024-02-27 | 3990591 Canada Inc. | System and method of geo-location for building sites |
US20210004948A1 (en) * | 2019-09-14 | 2021-01-07 | Ron Zass | Verifying purported capturing parameters of images of construction sites |
WO2021064876A1 (en) * | 2019-10-01 | 2021-04-08 | 日揮グローバル株式会社 | Project management device, project management method, and recording medium |
US20210103864A1 (en) * | 2019-10-04 | 2021-04-08 | Procore Technologies, Inc. | Computer System and Method for Facilitating Creation and Management of an Inspection and Test Plan for a Construction Project |
JP2021071747A (en) * | 2019-10-29 | 2021-05-06 | 日本電気株式会社 | Information processing system, information processing method, and program |
CN111022066B (en) * | 2019-11-11 | 2021-10-01 | 北京住总集团有限责任公司 | Shield machine risk source crossing three-dimensional simulation and monitoring system based on BIM and GIS |
CN110929322A (en) * | 2019-11-19 | 2020-03-27 | 广东博智林机器人有限公司 | Method and system for establishing mapping between BIM model and three-dimensional cloud model |
CN110889901B (en) * | 2019-11-19 | 2023-08-08 | 北京航空航天大学青岛研究院 | Large-scene sparse point cloud BA optimization method based on distributed system |
CN111080491A (en) * | 2019-12-12 | 2020-04-28 | 成都阳帆网络科技有限公司 | Construction site inspection system and method based on video identification |
US11074701B2 (en) | 2019-12-13 | 2021-07-27 | Reconstruct Inc. | Interior photographic documentation of architectural and industrial environments using 360 panoramic videos |
CN111079826B (en) * | 2019-12-13 | 2023-09-29 | 武汉科技大学 | Construction progress real-time identification method integrating SLAM and image processing |
JP7383470B2 (en) * | 2019-12-20 | 2023-11-20 | エヌ・ティ・ティ・コミュニケーションズ株式会社 | Information processing device, information processing method and program |
WO2021140514A1 (en) | 2020-01-07 | 2021-07-15 | Datumate Ltd. | Building information modeling (bim) data model for construction infrastructure |
CN111429097B (en) * | 2020-03-24 | 2024-03-05 | 上海捷规建筑工程咨询有限公司 | BIM-based automatic field progress and model matching method and system |
CN111339601A (en) * | 2020-03-25 | 2020-06-26 | 中国十七冶集团有限公司 | BIM + VR construction site arrangement method |
CN113450415B (en) * | 2020-03-26 | 2024-07-30 | 阿里巴巴集团控股有限公司 | Imaging equipment calibration method and device |
US11620599B2 (en) * | 2020-04-13 | 2023-04-04 | Armon, Inc. | Real-time labor tracking and validation on a construction project using computer aided design |
CN111539568B (en) * | 2020-04-22 | 2021-11-05 | 深圳市地质局 | Safety monitoring system and method based on unmanned aerial vehicle and three-dimensional modeling technology |
JP7516851B2 (en) | 2020-05-15 | 2024-07-17 | 株式会社大林組 | Process management support system, process management support method, and process management support program |
US11263565B2 (en) * | 2020-05-19 | 2022-03-01 | Procore Technologies, Inc. | Systems and methods for creating and managing a lookahead schedule |
US11250361B2 (en) * | 2020-05-22 | 2022-02-15 | Hitachi, Ltd. | Efficient management method of storage area in hybrid cloud |
CN111680348A (en) * | 2020-05-22 | 2020-09-18 | 中国路桥工程有限责任公司 | Single-line tunnel construction risk control method |
CN111860225B (en) * | 2020-06-30 | 2023-12-12 | 阿波罗智能技术(北京)有限公司 | Image processing method and device, electronic equipment and storage medium |
WO2022040919A1 (en) * | 2020-08-25 | 2022-03-03 | 南京翱翔信息物理融合创新研究院有限公司 | Internet-based method for displaying three-dimensional road bridge progress |
CN111814245B (en) * | 2020-09-02 | 2020-12-15 | 广东博智林机器人有限公司 | Ceiling joint layout acquisition method and device, electronic equipment and storage medium |
CN118379683A (en) * | 2020-09-03 | 2024-07-23 | 金钱猫科技股份有限公司 | Method and system for monitoring engineering safety project through AI intelligent image analysis |
US20220083940A1 (en) * | 2020-09-15 | 2022-03-17 | Accenture Global Solutions Limited | Intelligent and Automatic Generation of Construction Rule Set |
ES2902098B2 (en) * | 2020-09-24 | 2022-09-06 | Check To Build S L | METHOD TO ANALYZE THE EXECUTION OF A CONSTRUCTION PROJECT |
CN112150629A (en) * | 2020-09-25 | 2020-12-29 | 福建华电可门发电有限公司 | Vision-based coal inventory system and method |
US20220114763A1 (en) * | 2020-10-09 | 2022-04-14 | Qualcomm Incorporated | High level syntax refinements for geometry point cloud compression (g-pcc) |
WO2022077296A1 (en) * | 2020-10-14 | 2022-04-21 | 深圳市大疆创新科技有限公司 | Three-dimensional reconstruction method, gimbal load, removable platform and computer-readable storage medium |
CA3198669A1 (en) | 2020-10-19 | 2022-04-28 | OnsiteIQ Inc. | Risk assessment techniques based on images |
WO2022089714A1 (en) * | 2020-10-26 | 2022-05-05 | Swiss Reinsurance Company Ltd. | Digital platform for automated assessing and rating of construction and erection risks, and method thereof |
US20230385770A1 (en) * | 2020-10-28 | 2023-11-30 | Jgc Corporation | Project management device, project management method, and recording medium |
US20220138365A1 (en) * | 2020-10-30 | 2022-05-05 | Crown Equipment Corporation | Systems and methods for point cloud site commissioning |
CN112459136A (en) * | 2020-11-13 | 2021-03-09 | 中铁广州工程局集团深圳工程有限公司 | Deep foundation pit construction safety risk monitoring method, device, equipment and medium |
US20220156418A1 (en) * | 2020-11-17 | 2022-05-19 | Autodesk, Inc. | Progress tracking with automatic symbol detection |
CN112652015B (en) * | 2020-11-30 | 2023-05-09 | 中国公路工程咨询集团有限公司 | BIM-based pavement disease marking method and device |
CN112669440A (en) * | 2020-12-03 | 2021-04-16 | 悉地(苏州)勘察设计顾问有限公司 | Traffic planning engineering project achievement display method, device and storage medium |
CN112446114B (en) * | 2020-12-08 | 2023-09-05 | 国网江苏省电力工程咨询有限公司 | Three-dimensional model comparison-based power transmission line engineering construction progress monitoring method |
US20220215596A1 (en) * | 2021-01-04 | 2022-07-07 | Qualcomm Incorporated | Model-based prediction for geometry point cloud compression |
CN112734254B (en) * | 2021-01-15 | 2024-08-02 | 南方电网大数据服务有限公司 | Risk early warning method, device and computer equipment for transformer substation construction |
CN113010935A (en) * | 2021-02-19 | 2021-06-22 | 龙元建设集团股份有限公司 | Management method and system for BIM application based on Revit monitoring platform |
CN113011773A (en) * | 2021-04-02 | 2021-06-22 | 云河(河南)信息科技有限公司 | Universal BIM display platform construction method suitable for hydraulic engineering industry |
JP7125524B1 (en) * | 2021-04-09 | 2022-08-24 | 株式会社イクシス | Management support system for buildings or civil engineering structures |
US20220383231A1 (en) * | 2021-05-28 | 2022-12-01 | Doxel, Inc. | Generating status of construction site based on hierarchical modeling that standardizes physical relationships of elements of a structure |
CN113343016B (en) * | 2021-06-01 | 2023-06-16 | 中国计量大学 | System and method for supervising building materials |
CN113449360B (en) * | 2021-06-10 | 2023-12-26 | 上海建工四建集团有限公司 | Construction planning method based on building information model |
US12026834B2 (en) | 2021-06-15 | 2024-07-02 | Reconstruct Inc. | Method to determine from photographs the placement and progress of building elements in comparison with a building plan |
CN115600270A (en) * | 2021-07-08 | 2023-01-13 | 汽车成型工程有限公司(Ch) | Method for user interaction for data manipulation in a CAE/CAD system |
CN114065336B (en) * | 2021-09-28 | 2022-07-22 | 广州优比建筑咨询有限公司 | Revit-based high formwork region inspection method, device, medium and equipment |
US20230119214A1 (en) * | 2021-10-18 | 2023-04-20 | Faro Technologies, Inc. | Four-dimensional data platform using automatic registration for different data sources |
CN113989340A (en) * | 2021-10-29 | 2022-01-28 | 天津大学 | Point cloud registration method based on distribution |
CN113969589B (en) * | 2021-11-01 | 2023-08-25 | 山西建筑工程集团有限公司 | Construction method for anchor rod to penetrate pile group foundation |
CN118251696A (en) * | 2021-11-03 | 2024-06-25 | 瑞典爱立信有限公司 | Alignment of point clouds representing physical objects |
WO2023096588A1 (en) | 2021-11-25 | 2023-06-01 | Univerza V Mariboru | A system, a method and a computer program for construction progress monitoring |
CN114065366B (en) * | 2022-01-17 | 2022-04-29 | 四川省交通勘察设计研究院有限公司 | BIM-based construction map engine visual interaction method |
CN114494274A (en) * | 2022-03-31 | 2022-05-13 | 清华大学 | Building construction evaluation method, building construction evaluation device, electronic equipment and storage medium |
US12094014B2 (en) * | 2022-04-27 | 2024-09-17 | Procore Technologies, Inc. | Computer systems and methods for dynamic pull planning |
CN115169818B (en) * | 2022-06-14 | 2023-06-09 | 武汉建科科技有限公司 | Technical engineering measuring and calculating method and system based on digital modeling |
CN114925228B (en) * | 2022-06-20 | 2024-06-07 | 广东电网有限责任公司 | Visual monitoring method and device for point cloud calculation and computer equipment |
CN115063702B (en) * | 2022-06-21 | 2022-11-29 | 中国铁道科学研究院集团有限公司电子计算技术研究所 | Point cloud sampling-based high-speed rail continuous beam construction progress detection method |
CN114898235B (en) * | 2022-07-13 | 2022-10-28 | 集展通航(北京)科技有限公司 | Unmanned aerial vehicle point cloud-based high-speed railway pier construction progress detection method |
CN116091721A (en) * | 2022-09-01 | 2023-05-09 | 武汉天际航信息科技股份有限公司 | Building construction monitoring method, equipment and storage medium based on three-dimensional point cloud |
CN115290097B (en) * | 2022-09-30 | 2022-12-30 | 安徽建筑大学 | BIM-based real-time accurate map construction method, terminal and storage medium |
CN115293667B (en) * | 2022-10-10 | 2023-01-03 | 深圳市睿拓新科技有限公司 | Management method of project progress and cost management system |
CN115620278B (en) * | 2022-11-15 | 2023-03-10 | 广州奇志信息科技有限公司 | Method for identifying and measuring materials |
US20240265319A1 (en) * | 2023-02-03 | 2024-08-08 | Joshua MAY | Automatically Updating Scheduling Software with Notification & Alerts |
CN116168116B (en) * | 2023-04-19 | 2023-07-21 | 巴斯夫一体化基地(广东)有限公司 | Method and device for visually displaying test execution plan |
CN116227009B (en) * | 2023-05-10 | 2023-07-21 | 长江三峡集团实业发展(北京)有限公司 | Method, device and equipment for estimating bias of BIM model and point cloud model of tunnel |
CN116634101A (en) * | 2023-05-24 | 2023-08-22 | 深圳市联深科技发展有限公司 | BIM-based video monitoring method and system |
CN116384756B (en) * | 2023-06-05 | 2023-08-15 | 中铁四局集团有限公司 | Deep learning-based construction engineering progress risk prediction evaluation method |
CN116703127B (en) * | 2023-08-03 | 2024-05-14 | 山东青建智慧建筑科技有限公司 | Building construction supervision method and system based on BIM |
CN116757556B (en) * | 2023-08-14 | 2023-10-31 | 成都建工雅安建设有限责任公司 | Waterproof construction management method and system based on image processing |
CN117474321B (en) * | 2023-10-30 | 2024-04-19 | 郑州宝冶钢结构有限公司 | BIM model-based construction site risk intelligent identification method and system |
CN117315160B (en) * | 2023-10-31 | 2024-05-14 | 重庆市规划和自然资源信息中心 | Building three-dimensional live-action modeling working method |
CN117217715B (en) * | 2023-11-03 | 2024-02-06 | 中招国际招标有限公司 | Construction progress monitoring method and system based on engineering electronic map |
CN117391544B (en) * | 2023-12-11 | 2024-04-19 | 深圳朗生整装科技有限公司 | Decoration project management method and system based on BIM |
CN117440023B (en) * | 2023-12-19 | 2024-02-23 | 南京昊天路桥工程有限公司 | Roadbed project construction data processing method and system |
CN117998032B (en) * | 2024-01-04 | 2024-08-30 | 北京漫漫星图科技有限公司 | Design drawing and aerial video superposition display method and system based on real-time SLAM technology |
CN117541023B (en) * | 2024-01-05 | 2024-04-05 | 山东金呈阳建设工程有限公司 | BIM-based bridge construction progress management method and system |
CN118094914B (en) * | 2024-02-28 | 2024-08-09 | 江苏嘉耐高温材料股份有限公司 | Slag wall stability analysis system and method |
CN117830676B (en) * | 2024-03-06 | 2024-06-04 | 国网湖北省电力有限公司 | Unmanned aerial vehicle-based power transmission line construction risk identification method and system |
CN118036899B (en) * | 2024-04-10 | 2024-06-14 | 山东亿昌装配式建筑科技有限公司 | Building decoration intelligent management system based on BIM |
CN118071115B (en) * | 2024-04-18 | 2024-06-21 | 广州天奕技术股份有限公司 | Multi-source heterogeneous monitoring system, method, device and equipment |
CN118229244B (en) * | 2024-05-27 | 2024-08-16 | 山东商业职业技术学院 | Digital technology production service production and teaching integration practical training base construction project management method and system |
CN118246251B (en) * | 2024-05-28 | 2024-07-26 | 中铁建工集团第二建设有限公司 | BIM-based intelligent management method for hospital building construction |
CN118365821B (en) * | 2024-06-19 | 2024-09-10 | 江西省送变电工程有限公司 | Project progress visual management method and system |
CN118411001B (en) * | 2024-07-03 | 2024-09-10 | 青建集团股份公司 | BIM-based intelligent engineering period management method and system |
2018
- 2018-04-18 US US15/956,266 patent/US11288412B2/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8249909B2 (en) | 2008-10-31 | 2012-08-21 | Hitachi-Ge Nuclear Energy, Ltd. | System and method for visualizing the construction progress of scaffolding utilizing 3D CAD models |
US20110054652A1 (en) | 2009-08-27 | 2011-03-03 | Heil Duane A | Building Construction Software and System |
US20130147798A1 (en) | 2011-12-08 | 2013-06-13 | The Board Of Trustees Of The University Of Illinois | Inserting objects into content |
US20130155058A1 (en) * | 2011-12-14 | 2013-06-20 | The Board Of Trustees Of The University Of Illinois | Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring |
US9070216B2 (en) | 2011-12-14 | 2015-06-30 | The Board Of Trustees Of The University Of Illinois | Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring |
US9355495B2 (en) | 2013-10-09 | 2016-05-31 | Trimble Navigation Limited | Method and system for 3D modeling using feature detection |
US20150310135A1 (en) * | 2014-04-24 | 2015-10-29 | The Board Of Trustees Of The University Of Illinois | 4d vizualization of building design and construction modeling with photographs |
US9852238B2 (en) | 2014-04-24 | 2017-12-26 | The Board Of Trustees Of The University Of Illinois | 4D vizualization of building design and construction modeling with photographs |
US20170193693A1 (en) * | 2015-12-31 | 2017-07-06 | Autodesk, Inc. | Systems and methods for generating time discrete 3d scenes |
US20180315232A1 (en) * | 2017-05-01 | 2018-11-01 | Lockheed Martin Corporation | Real-time incremental 3d reconstruction of sensor data |
Non-Patent Citations (100)
Title |
---|
"Advancing the Competitiveness and Efficiency of the US Construction Industry," The National Academy of Sciences Engineering Medicine, The National Academies Press, http://nap.edu/12717, Washington, DC, 123 pages, 2009. |
Abeid, J., et al., "Photo-Net II: A Computer-Based Monitoring System Applied to Project Management," Automation in Construction, 12 (2003), 603-616, accepted May 13, 2003. |
Agarwal, Sameer, et. al., "Building Rome in a Day," Communications of the ACM 54, No. 10, pp. 105-112 (Oct. 2011). |
Alarcon, L. F., et al., "Assessing the Impacts of Implementing Lean Construction", Revista ingenieria de Construccion, Chile, vol. 23, No. 1, pp. 26-33, Apr. 2008. |
Aritua, B., et al., "Construction Client Multi-Projects—A Complex Adaptive System Perspective", International Journal of Project Management, Elsevier, 27, pp. 72-79, 2009. |
Atasoy, G. "Visualizing and Interacting with Construction Project Performance Information" [thesis] Carnegie Mellon University, Chap 3 and 4 [retrieved on Aug. 27, 2020]. (Year: 2013). * |
Aubry, Mathieu, Bryan C. Russell, and Josef Sivic, "Painting-to-3D Model Alignment via Discriminative Visual Elements," ACM Transactions on Graphics (TOG) 33, No. 2, pp. 1-14 (2014). |
Bae, H, et al., "High-Precision Vision-Based Mobile Augmented Reality System for Context-Aware Architectural, Engineering, Construction and Facility Management (AEC/FM) Applications", Visualization in Engineering, Springer, 1 (1), pp. 1-13, 2013. |
Bae, H. "Fast and Scalable Structure-from-Motion for High-precision Mobile Augmented Reality Systems" [thesis] Virginia Polytechnic Institute and State University [retrieved on Aug. 29, 2020]. (Year: 2014). * |
Bae, H., et al., "Image-Based Localization and Content Authoring in Structure-from-Motion Point Models for Real-Time Field Reporting Applications", J. Comput. Civ. Eng., DOI: 10.1061/(ASCE)CP.1943-5487.0000392, 637-644, 2014. |
Bae, Soonmin, Aseem Agarwala, and Frédo Durand, "Computational Rephotography," ACM Trans Graph 29, No. 3, pp. 1-15 (Jun. 2010). |
Ballard, G. "The Last Planner System of Production Control," PhD at the University of Birmingham, 193 pages, May 2000. |
Ballard, G., et al., "Implementing Lean Construction: Stabilizing Work Flow," Lean Construction, 10 pages, 1994. |
Bartoli, Adrien, "Towards Gauge Invariant Bundle Adjustment: A Solution Based on Gauge Dependent Damping," In Computer Vision Proceedings., Ninth IEEE International Conference on Computer Vision, pp. 760-765 (Oct. 2003). |
Beetz, Jakob, et al., "BIMserver.org—An Open Source IFC Model Server," In Proceedings of the CIB W78 2010: 27th International Conference—Cairo, Egypt, 8 pages, Nov. 16-18, 2010. |
Behzadan, Amir H., and Kamat, Vineet R., "Visualization of Construction Graphics in Outdoor Augmented Reality," In Proceedings of the 37th conference on Winter Simulation, pp. 1914-1920 (2005). |
Besl, P.J., et al., "A Method for Registration of 3-D Shapes", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, No. 2, pp. 239-256, Feb. 1992. |
Bortolazza, R.C., et al., "A Quantitative Analysis of the Implementation of the Last Planner System in Brazil", 13th Conference of the International Group on Lean Construction, 413-420, 2005. |
Bosche, F., et al., "Tracking the Built Status of MEP Works: Assessing the Value of a Scan-vs-BIM System", Journal of Computing in Civil Engineering Article, 28(4); 05014004, 13 pages, 2014. |
Brodetskaia, I., "Stabilizing Production Flow of Interior and Finishing Works with Reentrant Flow in Building Construction," Journal of Construction Engineering and Management, article, 139(6), 665-674, Jun. 2013. |
Côté, Stephane, et.al., "Live Mobile Panoramic High Accuracy Augmented Reality for Engineering and Construction," Proceedings of the Construction Applications of Virtual Reality (CONVR), London, England, pp. 1-10 (Oct. 2013). |
Crandall, David, et.al., "Discrete-Continuous Optimization for Large-Scale Structure from Motion," In Computer Vision and Pattern Recognition (CVPR), IEEE Conference, pp. 3001-3008 (2011). |
Dave, B., et al., "Addressing Information Flow in Lean Production Management and Control in Construction," Proceeding of the 22nd International Group for Lean Construction (IGLC22), article, 2, 581-592, Jun. 2014. |
Dave, B., et al., "Exploring the Recurrent Problems in the Last Planner Implementation on Construction Projects," Lean Construction Conference (ILCC 2015), 1-10. |
Dellepiane, Matteo, et.al., "Assisted Multi-View Stereo Reconstruction," In 3D Vision—3DV International Conference, pp. 318-325 (2013). |
Dunston, Phillip S., et. al., "Mixed Reality Benefits for Design Perception," NIST Special Publication SP, pp. 191-196 (2003). |
Echeverry, D. et al., "Sequencing Knowledge for Construction Scheduling", Journal of Construction Engineering and Management, 117(1):118-130, 1991. |
Fathi et al., "Automated as-built 3D reconstruction of civil infrastructure using computer vision: Achievements, opportunities, and challenges," Advanced Engineering Informatics, vol. 29, pp. 149-161 <https://www.sciencedirect.com/science/article/pii/S1474034615000245> (Year: 2015). * |
Fitzgibbon, Andrew W., "Robust Registration of 2D and 3D Point Sets", 10 pages, 2001. |
Franken, Thomas, et al., "Minimizing User Intervention in Registering 2D Images to 3D Models," The Visual Computer 21, No. 8-10, pp. 619-628 (2005). |
Furukawa, Yasutaka, and Ponce, Jean, "Accurate, Dense, and Robust Multiview Stereopsis," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, No. 8, pp. 1362-1376 (2010). |
Garcia-Lopez, et al. "A System to Track Work Progress at Construction Jobsites" Proceedings of the 2014 Industrial and Systems Engineering Research Conf. (Year: 2015). * |
Garcia-Lopez, N.P., et al., "A System to Track Work Progress at Construction Jobsites," Industrial and Systems Engineering Research Conference, 10 pages, 2014. |
Golparvar Fard, M. "D4AR—4 Dimensional Augmented Reality-Models for Automation and Interactive Visualization of Construction Progress Monitoring" [thesis] University of Illinois at Urbana-Champaign [retrieved on Aug. 26, 2020]. (Year: 2010). * |
Golparvar-Fard et al. "D4AR—A 4-Dimensional Augmented Reality Model for Automatic Construction Progress Monitoring Data Collection, Processing and Communication" Journal of Information Technology in Construction. (Year: 2009). * |
Golparvar-Fard, et al. "Monitoring Changes of 3D Building Elements from Unordered Photo Collections" 2011 IEEE Int. Conf. on Computer Vision Workshops [retrieved on Aug. 21, 2020]. (Year: 2011). * |
Golparvar-Fard, M., et al., "D4AR—A 4-Dimensional Augmented Reality Model for Automating Construction Progress Data Collection, Processing and Communication", Journal of ITCON—vol. 14, published at http://www.itcon.org/2009/13, pp. 129-153, Jun. 2009. |
Golparvar-Fard, Mani, et al. "Automated Progress Monitoring Using Unordered Daily Construction Photographs and IFC-Based Building Information Models," Journal of Computing in Civil Engineering, pp. 04014025-1-04014025-19, 2015. |
Golparvar-Fard, Mani, et al., "Integrated Sequential As-Built and As-Planned Representation with D4AR Tools in Support of Decision-Making Tasks in the AEC/FM Industry," Journal of Construction Engineering and Management, pp. 1099-1116 (Dec. 2011). |
Golparvar-Fard, Mani, et al., "Visualization of Construction Progress Monitoring with 4D Simulation Model Overlaid on Time-Lapsed Photographs," Journal of Computing in Civil Engineering, vol. 23, No. 6, pp. 391-404 (2009). |
Golparvar-Fard, Mani, et. al., "Visual Representation of Construction Progress Monitoring Metrics on Time-Lapse Photographs," In Proc. Construction Management and Economics Conference, pp. 1-10 (Retrieved on Aug. 18, 2015). |
Gurevich, U., et al., "Examination of the Effects of KanBIM Production Control System on Subcontractors' Task Selection in Interior Works," Automation in Construction, 37, pp. 81-87, 2014. |
Hakkarainen, M. et al., "Software Architecture for Mobile Mixed Reality and 4D BIM Interaction," In Proc. 25th CIB W78 Conference, pp. 1-8 (2009). |
Hamledari et al. "IFC-Based Development of As-Built and As-Is BIMs Using Construction and Facility Inspection Data: Site-to-BIM Data Transfer Automation" Journal of Computing in Civil Engineering, vol. 32, Iss. 2 (Mar. 2018) [retrieved on Aug. 26, 2020] (Year: 2018). * |
Ham, Y., et al., "Visual Monitoring of Civil Infrastructure Systems via Camera-Equipped Unmanned Aerial Vehicles: A Review of Related Works," Visualization in Engineering, 4(1), pp. 1-8, 2016. |
Hammad, Amin, Wang, Hui, and Mudur, Sudhir P., "Distributed Augmented Reality for Visualizing Collaborative Construction Tasks," Journal of Computing in Civil Engineering, 10 pages (Nov./Dec. 2009). |
Hamzeh, F., et al., "Rethinking Lookahead Planning to Optimize Construction Workflow," Lean Construction Journal, article, pp. 15-34, 2012. |
Hamzeh, F.R. et al., "Understanding the Role of ‘Tasks Anticipated’ in Lookahead Planning Through Simulation," Automation in Construction., 49, 18-26, 2015. |
Han et al. "Potential of big visual data and building information modeling for construction performance analytics: An exploratory study" Automation in Construction, vol. 73 [retrieved on Aug. 25, 2020] (Year: 2017). * |
Han, K., et al., "Appearance-Based Material Classification for Monitoring of Operation-Level Construction Progress Using 4D BIM and Site Photologs," Automation in Construction, 53, 44-57, 2015. |
Han, K. K., et al., "Formalized Knowledge of Construction Sequencing for Visual Monitoring of Work-In-Progress via Incomplete Point Clouds and Low-LoD 4D BIMs," Advanced Engineering Informatics, 29(4), 889-901, 2015. |
Hartley, Richard, Gupta, Rajiv, and Chang, Tom, "Stereo from Uncalibrated Cameras," In Computer Vision and Pattern Recognition, Proceedings CVPR '92., IEEE Computer Society Conference on Computer Vision, pp. 761-764 (1992). |
Hewage, K. N., et al., "A Novel Solution for Construction On-Site Communication—The Information Booth," Canadian Journal of Civil Engineering,vol. 36, pp. 659-671, 2009. |
Horn, Berthold K. P., "Closed-Form Solution of Absolute Orientation Using Unit Quaternions," Journal of the Optical Society of America A, vol. 4, No. 4, pp. 629-642, Apr. 1987. |
Huttenlocher, Daniel P. and Ullman, Shimon, "Object Recognition using Alignment," In Proc. ICCV, vol. 87, pp. 102-111. (1987). |
Kahkonen, Kalle, et. al., "Integrating Building Product Models with Live Video Stream," pp. 176-188 (Oct. 22-23, 2007). |
Kamat, Vineet R., et al., "Integration of Global Positioning System and Inertial Navigation for Ubiquitous Context-Aware Engineering Applications," Proc., National Science Foundation Grantee Conference, Atlanta, GA, inproceedings, 1-10, 6 pages, 2010. |
Karsch, Kevin, et al., "ConstructAide: Analyzing and Visualizing Construction Sites Through Photographs and Building Models," ACM Transactions on Graphics (TOG), 33(6): 176, 11 pages, 2014. |
Karsch, Kevin, et. al., "Rendering Synthetic Objects into Legacy Photographs," In ACM Transactions on Graphics (TOG), vol. 30, No. 6, 12 pages (2011). |
Koo, B. et al., "Formalization of Construction Sequencing Rationale and Classification Mechanism to Support Rapid Generation of Sequencing Alternatives," Journal of Computing in Civil Engineering, 21 (6):423-433, Nov./Dec. 2007. |
Koskela, L., et al., "The Underlying Theory of Project Management is Obsolete," Engineering Management Review, IEEE, article 36(2), 22-34, 16 pages, 2008. |
Kropp et al. "Interior construction state recognition with 4D BIM registered image sequences" Automation in Construction, vol. 86 (Published Nov. 2017), pp. 11-32 [retrieved on Jun. 4, 2021] (Year: 2018). * |
Lee, Sanghoon, and Akin, Omer,"Augmented Reality-Based Computational Fieldwork Support for Equipment Operations and Maintenance," Automation in Construction, vol. 4, No. 20, pp. 338-352 (2011). |
Leigard, A., et al., "Defining the Path—A Case Study of Large Scale Implementation of Last Planner," Proceedings IGLC-18, pp. 396-405, Jul. 2010. |
Li et al. "Location Recognition Using Prioritized Feature Matching" K. Daniilidis, P. Maragos, N. Paragios (Eds.): ECCV 2010, Part II, LNCS 6312, pp. 791-804 [retrieved on Aug. 25, 2020] (Year: 2010). * |
Lin et al. "Web-Based 4D Visual Production Models for Decentralized Work Tracking and Information Communication on Construction Sites" Construction Research Congress 2016 [retrieved on Aug. 27, 2020]. (Year: 2016). * |
Lin, J., et al., "Model-Based Monitoring of Work-In-Progress via Images Taken by Camera-Equipped UAV and BIM," 2nd ICCCBEI, Tokyo, Japan, Mossman, A., 8 pages, (2013), (2015). |
Lindhard, S., et al., "Improving Onsite Scheduling: Looking Into the Limits of Last Planner System," The Built & Human Environment Review, vol. 6, pp. 46-60, 2013. |
Lowe, David G. "The Viewpoint Consistency Constraint," International Journal of Computer Vision, vol. 1, No. 1, pp. 57-72 (1987). |
Mossman, A. "Last Planner®: 5+1 Crucial & Collaborative Conversations for Predictable Design & Construction Delivery," Lean Construction Journal, 2013, 37 pages, Dec. 2015. |
Muthukumar, B. "Appearance-Based Material Classification After Occlusion Removal for Operation-Level Construction Progress Monitoring" [Master's Thesis] Graduate College of Civil Engineering, University of Illinois at Urbana-Champaign [retrieved on Aug. 25, 2020] (Year: 2015). * |
Muzzupappa et al. "3D reconstruction of an outdoor archaeological site through a multi-view stereo technique" 2013 Digital Heritage International Congress (DigitalHeritage), 2013, pp. 169-176, doi: 10.1109/DigitalHeritage.2013.6743727 [retrieved on Nov. 23, 2021] (Year: 2013). * |
Paris, Sylvain and Durand, Fredo, "A Fast Approximation of the Bilateral Filter Using a Signal Processing Approach," In Computer Vision—ECCV 2006, pp. 568-580 (2006). |
Photosynth, Wikipedia, the free encyclopedia, download available via web page at https://en.wikipedia.org/wiki/Photosynth, 3 pages, downloaded (Jul. 17, 2015), last modified on Apr. 13, 2015. |
Russell, Bryan C., et. al., "Automatic Alignment of Paintings and Photographs Depicting a 3D Scene," 3rd International IEEE Workshop on 3D Representation for Recognition (3dRR-11), associated with IEEE International Conference on Computer Vision Workshops (ICCV) 2011, pp. 545-552 (2011). |
Sacks et al. "Requirements for Building Information Modeling based Lean Production Management Systems for Construction" Automation in Construction. (Year: 2010). * |
Sacks, R., "Requirements for Building Information Modeling Based Lean Production Management Systems for Construction," Automation in Construction, 19(5), pp. 641-655, 2010. |
Sacks, R., et al., "Interaction of Lean and Building Information Modeling in Construction," Journal of construction Engineering and Management, article, American Society of Civil Engineers, 136(9), 968-980, Sep. 2010. |
Sacks, R., et al., "KanBIM Workflow Management System: Prototype Implementation and Field Testing," Lean Construction Journal, LCI, 9(1), pp. 19-35, 2013. |
Scheiblauer, Claus, et al., "Graph-Based Guidance in Huge Point Clouds," In Proceedings of the 17th International Conference on Cultural Heritage and New Technologies, 8 pages, 2012. |
Seitz, Steven M., et.al., "A Comparison and Evaluation of Multi-View Stereo Reconstruction Algorithms," In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'2006), vol. 1, pp. 519-526, (Jun. 2006). |
Shin, Do Hyoung and Dunston, Phillip S., "Technology Development Needs for Advancing Augmented Reality-Based Inspection," Automation in Construction, vol. 19, No. 2, pp. 169-182 (2010). |
Snavely, Noah, et al., "Finding Paths through the World's Photos," In ACM Transactions on Graphics (TOG), vol. 27, No. 3, 11 pages (2008). |
Snavely, Noah, et. al., "Photo Tourism: Exploring Photo Collections in 3D," In ACM transactions on graphics (TOG), vol. 25, No. 3, pp. 835-846 (2006). |
Snavely, Noah, et. al.,"Modeling the World from Internet Photo Collections," International Journal of Computer Vision 80, No. 2. pp. 189-210 (Oct. 31, 2007). |
Son, H., "As-Built Data Acquisition and its Use in Production Monitoring and Automated Layout of Civil Infrastructure: A Survey," Advanced Engineering Informatics, 29(2), pp. 172-183, 2015. |
Song et al. "Project Dashboard: Concurrent Visual Representation Method of Project Metrics on 3D Building Models" Computing in Civil Engineering [retrieved on Aug. 28, 2020] (Year: 2005). * |
Turkan, Y., et al., "Toward Automated Earned Value Tracking Using 3D Imaging Tools," Journal of Construction Engineering and Management, pp. 423-433, Apr. 2013. |
Turkan, Yelda et. al., "Automated Progress Tracking using 4D Schedule and 3D Sensing Technologies," Automation in Construction 22, pp. 414-421 (2012). |
Verhoeven et al. "Mapping by matching: a computer vision-based approach to fast and accurate georeferencing of archaeological aerial photographs" Journal of Archaeological Science 39 (2012) 2060-2070 [retrieved on Nov. 23, 2021] (Year: 2012). * |
Verykokou et al. "3D reconstruction of disaster scenes for urban search and rescue" Multimed Tools Appl (2018) 77:9691-9717; https://doi.org/10.1007/s11042-017-5450-y [retrieved on Nov. 23, 2021] (Year: 2017). * |
Werner, Tomas and Zisserman, Andrew, "New Techniques for Automated Architectural Reconstruction from Photographs," In Computer Vision—ECCV 2002, pp. 541-555 (2002). |
Woodward, Charles, et. al., "Mixed Reality for Mobile Construction Site Visualization and Communication," In Proc. 10th International Conference on Construction Applications of Virtual Reality (CONVR), pp. 4-5, (2010). |
Wu, Changchang, "Towards Linear-Time Incremental Structure from Motion," In International Conference on 3D Vision—3DV, pp. 127-134 (2013). |
Wu, Changchang, et. al., "Multicore Bundle Adjustment," In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 3057-3064 (2011). |
Yabuki, Nobuyoshi, et. al., "Collaborative and Visualized Safety Planning for Construction Performed at High Elevation," In Cooperative Design, Visualization, and Engineering, pp. 282-285 (2010). |
Yang, J., et al., "Construction Performance Monitoring via Still Images, Time-Lapse Photos, and Video Streams: Now, Tomorrow, and the Future," Advanced Engineering Informatics, article, Elsevier, pp. 211-224, 2015. |
Yu, H., et al., "Development of Lean Model for House Construction Using Value Stream Mapping," Journal of Construction Engineering and Management, 135(8), pp. 782-790, Aug. 2009. |
Zollmann, Stefanie, "Interactive 4D Overview and Detail Visualization in Augmented Reality," In IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 167-176 (2012). |
Zollmann, Stefanie, et. al., "Image-Based Ghostings for Single Layer Occlusions in Augmented Reality," In 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 19-26 (2010). |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210233274A1 (en) * | 2018-04-26 | 2021-07-29 | Continental Automotive Gmbh | Online Evaluation for Camera Intrinsic Parameters |
US11562503B2 (en) * | 2018-04-26 | 2023-01-24 | Continental Automotive Gmbh | Online evaluation for camera intrinsic parameters |
US11512955B1 (en) * | 2019-01-10 | 2022-11-29 | State Farm Mutual Automobile Insurance Company | Systems and methods for enhanced base map generation |
US11954797B2 (en) | 2019-01-10 | 2024-04-09 | State Farm Mutual Automobile Insurance Company | Systems and methods for enhanced base map generation |
US20210158546A1 (en) * | 2019-11-22 | 2021-05-27 | Baidu Usa Llc | Updated point cloud registration pipeline based on admm algorithm for autonomous vehicles |
US11521329B2 (en) * | 2019-11-22 | 2022-12-06 | Baidu Usa Llc | Updated point cloud registration pipeline based on ADMM algorithm for autonomous vehicles |
US20210200174A1 (en) * | 2019-12-31 | 2021-07-01 | Johnson Controls Technology Company | Building information model management system with hierarchy generation |
US12099334B2 (en) | 2019-12-31 | 2024-09-24 | Tyco Fire & Security Gmbh | Systems and methods for presenting multiple BIM files in a single interface |
US11900670B2 (en) * | 2022-06-30 | 2024-02-13 | Metrostudy, Inc. | Construction stage detection using satellite or aerial imagery |
US11908185B2 (en) * | 2022-06-30 | 2024-02-20 | Metrostudy, Inc. | Roads and grading detection using satellite or aerial imagery |
Also Published As
Publication number | Publication date |
---|---|
US20190325089A1 (en) | 2019-10-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11288412B2 (en) | Computation of point clouds and joint display of point clouds and building information models with project schedules for monitoring construction progress, productivity, and risk for delays | |
US11557092B2 (en) | Methods and systems for wireframes of a structure or element of interest and wireframes generated therefrom | |
Han et al. | Potential of big visual data and building information modeling for construction performance analytics: An exploratory study | |
Braun et al. | Improving progress monitoring by fusing point clouds, semantic data and computer vision | |
Golparvar-Fard et al. | D4AR–a 4-dimensional augmented reality model for automating construction progress monitoring data collection, processing and communication | |
US9070216B2 (en) | Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring | |
Lin et al. | Bridge inspection with aerial robots: Automating the entire pipeline of visual data capture, 3D mapping, defect detection, analysis, and reporting | |
US20190138667A1 (en) | Systems and methods for the digital verification of industrial construction execution | |
Mahdjoubi et al. | Providing real-estate services through the integration of 3D laser scanning and building information modelling | |
Turkan et al. | Toward automated earned value tracking using 3D imaging tools | |
US9852238B2 (en) | 4D vizualization of building design and construction modeling with photographs | |
Bastem et al. | Development of historic building information modelling: A systematic literature review | |
Karsch et al. | ConstructAide: analyzing and visualizing construction sites through photographs and building models | |
Lin et al. | Construction progress monitoring using cyber-physical systems | |
Lin et al. | Visual and virtual production management system for proactive project controls | |
US10861247B2 (en) | Roof report generation | |
Lin et al. | Visual data and predictive analytics for proactive project controls on construction sites | |
Memon et al. | An automatic project progress monitoring model by integrating auto CAD and digital photos | |
Fernandes | Advantages and disadvantages of BIM platforms on construction site | |
Golparvar-Fard et al. | Grand challenges in data and information visualization for the architecture, engineering, construction, and facility management industries | |
WO2020241331A1 (en) | Three-dimensional data management method for building and mobile terminal for implementing same | |
Hullo et al. | Advances in multi-sensor scanning and visualization of complex plants: The utmost case of a reactor building | |
J Skibniewski | Construction project monitoring with site photographs and 4D project models | |
Alizadehsalehi | BIM/Digital Twin-Based Construction Progress Monitoring through Reality Capture to Extended Reality (DRX) | |
US20230034273A1 (en) | Building information modeling (bim) data model for construction infrastructure |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
AS | Assignment |
Owner name: THE BOARD OF TRUSTEES OF THE UNIVERSITY OF ILLINOIS, ILLINOIS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAN, KOOK IN;REEL/FRAME:045691/0389
Effective date: 20180425

Owner name: THE BOARD OF TRUSTEES OF THE UNIVERSITY OF ILLINOIS, ILLINOIS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOLPARVAR-FARD, MANI;HOIEM, DEREK;LIN, JACOB JE-CHIAN;AND OTHERS;SIGNING DATES FROM 20180424 TO 20180426;REEL/FRAME:045691/0745

Owner name: NATIONAL SCIENCE FOUNDATION, VIRGINIA
Free format text: CONFIRMATORY LICENSE;ASSIGNOR:UNIVERSITY OF ILLINOIS, URBANA-CHAMPAIGN;REEL/FRAME:046054/0057
Effective date: 20180420

Owner name: RECONSTRUCT INC., ILLINOIS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOLPARVAR-FARD, MANI;HOIEM, DEREK;LIN, JACOB JE-CHIAN;AND OTHERS;SIGNING DATES FROM 20180424 TO 20180426;REEL/FRAME:045691/0745 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |