CN114727064A - Construction safety macro monitoring system and method - Google Patents


Info

Publication number
CN114727064A
Authority
CN
China
Prior art keywords
image data
construction
camera device
value
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210339974.7A
Other languages
Chinese (zh)
Other versions
CN114727064B (en)
Inventor
方东平
古博韬
岳清瑞
李建华
黄玥诚
郭红领
王尧
曹思涵
刘云飞
曹海涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Beijing Urban Construction Group Co Ltd
Original Assignee
Tsinghua University
Beijing Urban Construction Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University and Beijing Urban Construction Group Co Ltd
Priority to CN202210339974.7A
Publication of CN114727064A
Application granted
Publication of CN114727064B
Active (current legal status)
Anticipated expiration


Classifications

    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • G06F18/23213 Clustering techniques with a fixed number of clusters, e.g. K-means clustering
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06N3/045 Neural network architectures; combinations of networks
    • G06N3/08 Neural network learning methods
    • G06T7/20 Image analysis; analysis of motion
    • G06T7/70 Determining position or orientation of objects or cameras
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras based on recognised objects, including parts of the human body
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or tracking of objects
    • G06T2207/10016 Video; image sequence
    • G06T2207/20081 Training; learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30181 Earth observation
    • G06T2207/30184 Infrastructure

Abstract

The invention relates to a construction safety macro monitoring system and method. The system comprises: a data acquisition module for acquiring image data of a construction site through a camera device, the image data comprising first image data captured before the construction machinery enters the site and second image data captured after it enters; a calibration module for calibrating the camera device and the image data it acquires, and aligning the image data with the data in a preset construction floor plan to obtain processed image data; an identification module for identifying the processed image data to determine the construction elements and construction operations of the site; and a visualization module for displaying the construction elements and construction operations.

Description

Construction safety macro monitoring system and method
Technical Field
The invention relates to the technical field of construction-site data processing, and in particular to a construction safety macro monitoring system and method.
Background
Construction site monitoring, as a tool that directly reflects site conditions through cameras, is widely applied in construction projects. However, owing to installation and power-supply constraints, a camera installed at a fixed position can typically monitor only part of the working conditions on a construction site, and because many camera positions exist, a manager cannot view all site conditions at once. For this reason, researchers have tried to solve the problem with drones, tower-crane cameras and similar approaches. But a conventional drone cannot monitor a construction site for long periods because its endurance is within 30 minutes, and a tower-crane camera depends on the tower crane being on site, so it solves the installation and power-supply problems only in some phases. Therefore, no suitable device or system currently exists that can effectively meet the need for holistic monitoring of a construction site and display effective information about the whole site to managers in real time.
Disclosure of Invention
In order to overcome the problems in the related art, the invention provides a construction safety macro monitoring system and method.
According to a first aspect of embodiments of the present invention, there is provided a construction safety macro monitoring system, including:
the data acquisition module is used for acquiring image data of a construction site through a camera device, wherein the image data comprises first image data captured before the construction machinery enters the site and second image data captured after it enters;
the calibration module is used for calibrating the camera device and the image data it acquires, and aligning the image data with the data in a preset construction floor plan to obtain processed image data;
the identification module is used for identifying the processed image data to determine the construction elements and construction operations of the construction site;
and the visualization module is used for displaying the construction elements and the construction operations.
In one embodiment, preferably, the data acquisition module includes a first data acquisition submodule and a second data acquisition submodule, wherein the first data acquisition submodule is used for acquiring the first image data before the construction machinery enters the site, and the second data acquisition submodule is used for acquiring the second image data after it enters.
In one embodiment, preferably, the first data acquisition submodule includes a camera device, a pan-tilt head, a tethered airship, a remote control unit and an image transmission unit;
the second data acquisition submodule comprises a camera device, a pan-tilt head, a tower crane power supply unit, a remote control unit and an image transmission unit;
the camera device is used for monitoring the construction site in real time;
the pan-tilt head is used for calibrating the attitude of the camera device's lens to ensure the lens angle;
the tethered airship is used for providing an installation position for the camera device before the construction machinery enters the site;
the remote control unit is used for remotely controlling the steering of the camera device's lens;
the image transmission unit is used for transmitting the image data acquired by the camera device to the ground industrial personal computer in real time;
and the tower crane power supply unit is used for providing installation positions for the camera device, the pan-tilt head, the remote control unit and the image transmission unit, and for supplying power, after the construction machinery enters the site.
In one embodiment, preferably, the calibration module includes a color calibration module, a camera calibration module and a BIM system alignment module;
the color calibration module is used for performing color calibration on the image data by adopting a perfect reflection algorithm;
the camera device calibration module is used for calibrating the position of the camera device by adopting a plurality of positioning color blocks;
and the BIM system alignment module is used for inserting the calibrated image data into a preset construction floor plan in the BIM system and performing data alignment on the image data and the preset construction floor plan.
In one embodiment, preferably, the color calibration module is specifically configured to:
traversing each pixel point (X_i, Y_j) of each image in the image data in RGB space, and calculating:
C_ij = R_ij + G_ij + B_ij
finding the point with the maximum C_ij in the image, C_m, and obtaining its RGB values R_m, G_m, B_m; ranking the pixel points by C value and calculating the RGB mean of the top preset number N of pixel points:
R̄ = (1/N) Σ R_k,  Ḡ = (1/N) Σ G_k,  B̄ = (1/N) Σ B_k
and calculating the gain coefficient of each channel of the image (255 being the maximum value for 8-bit images):
k_R = 255/R̄,  k_G = 255/Ḡ,  k_B = 255/B̄
thus obtaining the final RGB value of each pixel:
R′_ij = min(k_R·R_ij, 255),  G′_ij = min(k_G·G_ij, 255),  B′_ij = min(k_B·B_ij, 255)
in one embodiment, preferably, the camera calibration module is configured to:
setting three positioning color blocks in the construction floor plan in the BIM system, wherein a mirror is arranged at the center of each color block and each positioning color block comprises a square plastic plate;
after the tethered airship lifts off, adjusting the camera device through the remote control unit so that the three positioning color blocks fall within the picture and are distributed among the four quadrants [(0,0), (2871,1536)], [(2871,0), (5742,1536)], [(0,1536), (2871,3072)] and [(2871,1536), (5742,3072)], keeping the lines connecting the color blocks as parallel to the picture frame as possible;
converting the RGB picture into an HSV picture, searching each of the four quadrants for pixel blocks meeting a preset requirement, recording the pixel-block coordinates corresponding to the maximum and minimum values in the horizontal direction, and averaging the two coordinates to obtain the center-point coordinates of the three positioning color blocks, which are recorded as the original coordinates of the positioning color blocks; wherein the preset requirement is H∈[125, 155], S∈[43, 255], V∈[46, 255];
recalculating the center-point coordinates of the three positioning color blocks at preset time intervals, and calculating the difference from the original coordinates;
and when the difference exceeds a preset value, or the center-point coordinate of any positioning color block is missing, outputting an alarm prompting that calibration needs to be performed again.
In one embodiment, preferably, the identification module is configured to:
when identifying the construction elements, models for workers, mobile construction machinery and tower cranes are trained, based on the Mask R-CNN and DeepSORT algorithms, on construction images collected and annotated in advance; minute by minute, the contour coordinates of workers, mobile construction machinery and tower cranes are identified in the image data of the construction site, and each identified object is tracked; the total numbers of identified workers, mobile construction machinery and tower cranes are counted in the database minute by minute; when worker contours are identified, whether each worker wears a safety helmet, and the helmet's color, are identified as well; each identified worker, mobile construction machine and tower crane is numbered, and the corresponding contour coordinates and contour centroid coordinates are stored in the data sets of workers, machinery and tower cranes for that moment.
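The numbering and per-minute storage described above can be sketched as follows (a hypothetical bookkeeping structure for illustration only; the Mask R-CNN detector and DeepSORT tracker themselves are not reproduced, and all names here are assumptions):

```python
from collections import defaultdict

class ElementStore:
    """Per-minute store of identified construction elements (illustrative sketch).

    Each tracked object receives a natural number per class (1, 2, ...) the
    first time it is seen, and its contour coordinates and contour centroid
    are stored under the minute of observation, mirroring the bookkeeping
    described in the text.
    """

    def __init__(self):
        self.numbers = {}                  # (class, track_id) -> assigned number
        self.counters = defaultdict(int)   # class -> last number handed out
        self.records = defaultdict(list)   # (minute, class) -> list of entries

    @staticmethod
    def centroid(contour):
        xs = [p[0] for p in contour]
        ys = [p[1] for p in contour]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    def add(self, minute, cls, track_id, contour, **attrs):
        key = (cls, track_id)
        if key not in self.numbers:        # first time this track is seen
            self.counters[cls] += 1
            self.numbers[key] = self.counters[cls]
        self.records[(minute, cls)].append({
            "number": self.numbers[key],
            "contour": contour,
            "centroid": self.centroid(contour),
            **attrs,                       # e.g. helmet=True, helmet_color="red"
        })

    def count(self, minute, cls):
        return len(self.records[(minute, cls)])
```

For example, `store.add(0, "worker", "t1", contour, helmet=True, helmet_color="red")` records a helmeted worker for minute 0; re-observing track `"t1"` in a later minute keeps its original number, which supports the per-minute counting and backtracking described later.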
In one embodiment, preferably, the identification module is configured to:
traversing all pixel points of the image data when identifying construction operations;
taking the maximum of the three RGB components of each pixel point as its gray value;
storing one frame shot by the camera device every second, and with a sliding window of 600 seconds calculating the mean of the corresponding pixels across the grayscale frames, recorded as BG_i, i = 0, 1, …, n, where the first 600 seconds correspond to BG_0;
with a sliding window of 60 seconds, calculating the mean of the corresponding pixels across the grayscale frames, recorded as TS_i, i = 0, 1, …, n;
subtracting BG_i from TS_i pixel by pixel and taking the absolute value to obtain the gray difference of each pixel point;
when the gray difference of a pixel point is greater than the gray judgment threshold, recording the pixel point as a change point and storing its coordinates;
and performing K-means unsupervised clustering on all stored coordinates, choosing the number of clusters according to the n value corresponding to the maximum silhouette coefficient, calculating the center-point coordinates of each cluster, and finding the corresponding points and scene information in the BIM system.
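A minimal Python sketch of the two-timescale differencing above (the window lengths, threshold and frame shapes here are illustrative assumptions; the K-means clustering and BIM lookup stages are omitted):

```python
import numpy as np

def gray_max(frame_rgb):
    """Gray value of each pixel = maximum of its three RGB components."""
    return frame_rgb.max(axis=-1).astype(np.float64)

def change_points(frames_rgb, bg_window=600, ts_window=60, threshold=30.0):
    """Return (x, y) coordinates of change points from 1-fps RGB frames.

    BG: mean over a long sliding window (600 s), a slowly varying background.
    TS: mean over a short sliding window (60 s), the recent activity.
    Pixels where |TS - BG| exceeds the gray judgment threshold are recorded
    as change points.
    """
    gray = np.stack([gray_max(f) for f in frames_rgb])
    bg = gray[-bg_window:].mean(axis=0)   # BG over the last bg_window frames
    ts = gray[-ts_window:].mean(axis=0)   # TS over the last ts_window frames
    diff = np.abs(ts - bg)                # per-pixel gray difference
    ys, xs = np.nonzero(diff > threshold)
    return list(zip(xs.tolist(), ys.tolist()))
```

The stored coordinates would then be clustered (e.g. K-means with the cluster count chosen by the silhouette coefficient) to locate the active work areas.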
According to a second aspect of the embodiments of the present invention, there is provided a construction safety macro monitoring method for a construction safety macro monitoring system, the method including:
acquiring image data of a construction site through a camera device, wherein the image data comprises first image data captured before the construction machinery enters the site and second image data captured after it enters;
calibrating the camera device and the image data it acquires, and aligning the image data with the data in a preset construction floor plan to obtain processed image data;
identifying the processed image data to determine the construction elements and construction operations of the construction site;
and displaying the construction elements and the construction operations.
According to a third aspect of the embodiments of the present invention, there is provided a construction safety macro monitoring apparatus, the apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring image data of a construction site through a camera device, wherein the image data comprises first image data captured before the construction machinery enters the site and second image data captured after it enters;
calibrating the camera device and the image data it acquires, and aligning the image data with the data in a preset construction floor plan to obtain processed image data;
identifying the processed image data to determine the construction elements and construction operations of the construction site;
and displaying the construction elements and the construction operations.
The technical scheme provided by the embodiment of the invention can have the following beneficial effects:
the construction method is based on machine vision technology, target detection, target tracking, image white balance algorithm, BIM technology and combination of hardware such as a captive airship, a high-definition camera and an industrial personal computer, realizes a macroscopic monitoring system of a construction site, completes real-time monitoring of workers, engineering machinery and an operation area, and has the core characteristics of manageable process, traceable problems, retainable data and the like.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a block diagram illustrating a construction safety macro-monitoring system in accordance with an exemplary embodiment.
FIG. 2A is a schematic diagram illustrating a 2D interface in accordance with an exemplary embodiment.
FIG. 2B is a schematic diagram illustrating an RGB interface, according to an example embodiment.
Fig. 3 is a block diagram illustrating a data collection module in a construction safety macro monitoring system according to an exemplary embodiment.
Fig. 4 is a block diagram illustrating a first data collection submodule in a construction safety macro monitoring system according to an exemplary embodiment.
Fig. 5 is a block diagram illustrating a second data collection submodule in a construction safety macro monitoring system according to an exemplary embodiment.
FIG. 6 is a schematic diagram of a positioning color block, shown in accordance with an exemplary embodiment.
FIG. 7 is a flow diagram illustrating a construction safety macro-monitoring method in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
FIG. 1 is a block diagram illustrating a construction safety macro-monitoring system in accordance with an exemplary embodiment.
As shown in fig. 1, according to a first aspect of the embodiment of the present invention, there is provided a construction safety macro monitoring system, including:
the data acquisition module 11 is configured to acquire image data of a construction site through a camera device, where the image data includes first image data captured before the construction machinery enters the site and second image data captured after it enters;
the calibration module 12 is configured to calibrate the camera device and the image data it acquires, and to align the image data with the data in a preset construction floor plan to obtain processed image data;
the identification module 13 is used for identifying the processed image data to determine the construction elements and construction operations of the construction site;
and the visualization module 14 is used for displaying the construction elements and the construction operation.
The visualization module displays the construction elements and construction operations obtained from the identification module on the screen of the ground industrial personal computer in an intuitive, simple way, helping managers quickly grasp the current distribution of workers on site, the positions of construction machinery and the locations of construction operations. The visualization module comprises two interfaces: a 2D interface based on the construction floor plan and an RGB interface based on the high-definition camera picture. The 2D interface is shown schematically in fig. 2A and the RGB interface in fig. 2B. The identification module transmits the worker position information, worker identity information, construction machinery information and the centroid coordinates of the corresponding pixels to the BIM; for each building entity in the model, such as construction area 1 or a temporary road, selecting the entity with the mouse displays the number of workers of each identity in that area and the types and numbers of construction machinery. Because the data is stored every minute, the invention supports backtracking: by dragging the time axis, the distribution of personnel and equipment on the construction site at a past moment can be recalled.
The specific algorithm determines, from the coordinates obtained by clustering in the operation identification module, the extreme points of the coordinates within each category, and draws a transparent gray frame using them as start and end points. The work content of the corresponding operation area is obtained by reading, from the BIM model, the work content at the center-point coordinates of that category. When the mouse moves over the gray frame line, the work content of the area is displayed. The characters "under construction" appear above the gray frame, anchored to it and fixed at its upper-right corner. Workers and construction machinery identified by the identification module are shown in different colors, and machines of the same type are numbered with natural numbers from 1, in the order they are identified, in combination with the object tracking algorithm. When the mouse moves over a worker or a construction machine, information such as the machine name, machine number and worker type is displayed. A time axis can be displayed below the page; dragging along it, in minutes, shows the changes in workers, construction machinery and operation areas, completing the backtracking of the actual changes on the construction site.
As shown in fig. 3, in an embodiment, preferably, the data acquisition module 11 includes a first data acquisition submodule 31 and a second data acquisition submodule 32, where the first data acquisition submodule 31 is configured to acquire the first image data before the construction machinery enters the site, and the second data acquisition submodule 32 is configured to acquire the second image data after it enters.
As shown in fig. 4, the first data acquisition submodule 31 includes a camera device, a pan-tilt head, a tethered airship, a remote control unit and an image transmission unit;
as shown in fig. 5, the second data acquisition submodule 32 includes a camera device, a pan-tilt head, a tower crane power supply unit, a remote control unit and an image transmission unit;
the camera device is used for monitoring the construction site in real time;
the pan-tilt head is used for calibrating the attitude of the camera device's lens to ensure the lens angle;
the tethered airship is used for providing an installation position for the camera device before the construction machinery enters the site;
the remote control unit is used for remotely controlling the steering of the camera device's lens;
the image transmission unit is used for transmitting the image data acquired by the camera device to the ground industrial personal computer in real time;
and the tower crane power supply unit is used for providing installation positions for the camera device, the pan-tilt head, the remote control unit and the image transmission unit, and for supplying power, after the construction machinery enters the site.
To ensure that the picture covers the construction site, the high-definition camera is installed at a height of at least 40 meters, with its inclination angle set so that the picture covers the whole site. To ensure picture definition, the invention selects a high-definition camera of 6K or above for real-time monitoring of the construction site. The pan-tilt head can be a three-axis self-stabilizing unit.
Before the tower crane enters the site, the construction site lacks a high-altitude position for installing the camera, so at this stage a tethered airship is chosen to provide the installation position. The tethered airship is chosen mainly because it can be powered without interruption, allowing long-term work at high altitude, and it is more economical than a drone (a drone, an industrial balloon or similar equipment combined with a tethering system could also meet the requirement here). In a specific implementation, a suitable position is first selected on site, based on the construction drawings, to lift the airship (its vertical ground position may coincide with where the tower crane will later be erected), and the airship is raised to a height of 40 meters through the tethering system, on the premise of a stable power supply. The orientation of the high-definition camera is then adjusted from the ground through the industrial personal computer, connected via the remote control module and the image transmission module. The image transmission module transmits the pictures shot by the high-definition camera to the ground industrial personal computer in real time. In addition, to ensure long-term operation of the data acquisition module, the tethered airship is connected to a construction-site power supply (for example, the living-area supply), which powers the whole data acquisition module. After the tower crane enters the site, the tethered airship in the data acquisition module is replaced by the tower crane power supply module, and the whole set of equipment (high-definition camera, pan-tilt head, image transmission module and remote control module) is installed on the tower crane and powered by the tower crane power supply module.
Of course, the data acquisition module can also continue to be used as-is, in which case it needs to be raised above the height of the tower crane.
In one embodiment, preferably, the calibration module includes a color calibration module, a camera calibration module and a BIM system alignment module;
the color calibration module is used for performing color calibration on the image data by adopting a perfect reflection algorithm;
the camera device calibration module is used for calibrating the position of the camera device by adopting a plurality of positioning color blocks;
and the BIM system alignment module is used for inserting the calibrated image data into a preset construction floor plan in the BIM system and performing data alignment on the image data and the preset construction floor plan.
In one embodiment, preferably, the color calibration module is specifically configured to:
traversing each pixel point (X_i, Y_j) of each image in the image data in RGB space, and calculating:
C_ij = R_ij + G_ij + B_ij
finding the point with the maximum C_ij in the image, C_m, and obtaining its RGB values R_m, G_m, B_m; ranking the pixel points by C value and calculating the RGB mean of the top preset number N of pixel points:
R̄ = (1/N) Σ R_k,  Ḡ = (1/N) Σ G_k,  B̄ = (1/N) Σ B_k
and calculating the gain coefficient of each channel of the image (255 being the maximum value for 8-bit images):
k_R = 255/R̄,  k_G = 255/Ḡ,  k_B = 255/B̄
thus obtaining the final RGB value of each pixel:
R′_ij = min(k_R·R_ij, 255),  G′_ij = min(k_G·G_ij, 255),  B′_ij = min(k_B·B_ij, 255)
in one embodiment, preferably, the camera calibration module is configured to:
setting three positioning color blocks in the construction floor plan in the BIM system, wherein a mirror is arranged at the center of each color block and each positioning color block comprises a square plastic plate;
At the beginning of construction there is no obvious building structure on site by which to identify the construction work area, so three 1000 mm × 1000 mm positioning color blocks (each with a 100 mm × 100 mm mirror at its center) must be added to the construction floor plan in the BIM system; as shown in fig. 6, the positioning color blocks are purple square plastic plates placed, by total-station lofting, at three corners of the construction area on site.
After the captive airship is lifted off, the camera device is adjusted through the remote control unit so that the three positioning color blocks appear in the picture, distributed among the four quadrants [ (0,0), (2871,1536) ], [ (2871,0), (5742, 1536) ], [ (0, 1536), (2871, 3072) ] and [ (2871,1536), (5742, 3072) ], with the connecting lines between the color blocks kept as flush with the picture frame as possible;
converting the RGB picture into an HSV picture, searching for pixel blocks meeting the preset requirement in each of the four quadrants, recording the pixel-block coordinates corresponding to the maximum value and the minimum value in the horizontal direction, and adding the two pixel-block coordinates and averaging them to obtain the coordinates of the center points of the three positioning color blocks, recorded as the original coordinates of the positioning color blocks; wherein the preset requirement is: H∈[125, 155], S∈[43, 255], V∈[46, 255];
recalculating the coordinates of the central points of the three positioning color blocks at preset time intervals, and calculating the difference value between the coordinates and the original coordinates;
and when the difference exceeds a preset value or the coordinate of the central point of any positioning color block is lacked, outputting an alarm prompt to prompt that the calibration needs to be carried out again.
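Assuming the frame has already been converted to OpenCV-style HSV (H in [0, 180], S and V in [0, 255]) and cropped to one quadrant, the center-point rule above — average the in-range pixels with the smallest and largest horizontal coordinate — might be sketched as follows; the function name is illustrative.

```python
import numpy as np

# Purple positioning-block thresholds from the patent (OpenCV HSV scale)
H_RANGE = (125, 155)
S_RANGE = (43, 255)
V_RANGE = (46, 255)

def locate_color_block(hsv):
    """Return the (x, y) center of a purple positioning color block.

    hsv : uint8 array (H, W, 3) in OpenCV HSV scaling, one quadrant.
    Center = average of the in-range pixel coordinates with the smallest
    and largest horizontal coordinate; None if no pixel is in range.
    """
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    mask = ((h >= H_RANGE[0]) & (h <= H_RANGE[1]) &
            (s >= S_RANGE[0]) & (s <= S_RANGE[1]) &
            (v >= V_RANGE[0]) & (v <= V_RANGE[1]))
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                    # block missing -> recalibrate
    i_min, i_max = xs.argmin(), xs.argmax()
    cx = (xs[i_min] + xs[i_max]) / 2.0
    cy = (ys[i_min] + ys[i_max]) / 2.0
    return cx, cy
```

Running this on each quadrant at the preset interval and comparing the returned centers with the stored original coordinates implements the drift check; a `None` return corresponds to the missing-color-block alarm.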
The corrected images captured by the high-definition camera are transmitted in real time to an industrial personal computer on the ground, and one image per day is inserted into the preset construction plane layout diagram in the BIM system. For insertion, the center point of the upper-left positioning color block in the captured image is selected as the alignment point and aligned with the center-point coordinates of the upper-left positioning color block pre-designed in the BIM system, so that the actual data are aligned with the data in the preset construction plane layout diagram. Through this data alignment, the range coordinates of the construction area, the transportation roads (approach channels for large engineering machinery), the construction roads (approach roads for medium and small engineering machinery) and the temporary roads (temporarily arranged roads that can carry medium and small engineering machinery) are obtained in the space captured by the high-definition camera. A manager on the construction site can confirm and adjust the relation between the arranged positions in the drawing and the actual positions by monitoring the camera picture, so that the picture shot by the camera in real time is aligned with the design drawing.
In one embodiment, preferably, the identification module is configured to:
when the construction elements are identified, training recognition models for workers, mobile engineering machinery and tower cranes based on the Mask-RCNN and Deepsort algorithms using construction images collected and labeled in advance; identifying the contour coordinates of workers, mobile engineering machinery and tower cranes in the image data of the construction site by taking minutes as a unit, and tracking each identified object; counting the total numbers of identified workers, mobile engineering machinery and tower cranes in a database by taking minutes as a unit; when a worker's contour is identified, identifying whether the worker wears a safety helmet and the helmet's color; numbering each identified worker, mobile engineering machine and tower crane, and storing the corresponding contour coordinates and center-point coordinates into the worker, mobile engineering machinery and tower crane data sets corresponding to that moment.
The identification module mainly comprises two parts: key construction element identification and an operation identification algorithm. The key construction elements comprise two parts, constructors and engineering machinery, and the aim is to pick the key construction elements out of the complex construction environment. Construction site data are collected and labeled in advance, and the Mask R-CNN algorithm is used to train on constructors and typical engineering machinery (excavators, mobile cranes, forklifts, pile drivers and the like); the positions of constructors and engineering machinery are segmented in the image by taking minutes as a unit, with constructors shown in light green and engineering machinery in light red. Because the high-definition camera shoots the site downward from a height, it can capture the constructors' safety helmets and thereby determine their worker type: yellow is typically a general worker, blue and red are typically professional workers, and white is typically a supervisor or proctor. When Mask R-CNN judges that an identified person is a worker, the helmet in the segmented area is identified by Mask R-CNN and its color is judged. The identified engineering machines need to be tracked in order to be labeled. The method applies the Deepsort multi-target tracking algorithm, whose specific flow is as follows: the prediction sequence based on the Kalman filtering algorithm is recorded as T_i; at each 10-second interval, the engineering machinery is identified by a target identification algorithm (such as Mask R-CNN) to obtain detections D_j, likewise at 10-second intervals, where T_0 = D_0; D_n is matched against the prediction T_n obtained by applying the Kalman filtering algorithm, the successfully matched part of D_n is recorded as D_nt, and T_n = D_nt is set.
The specific matching process calculates the cost between T_n and D_n based on appearance information and Mahalanobis distance, then performs cascade matching and IoU matching to obtain D_nt.
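Deepsort's full pipeline (Kalman prediction, appearance embeddings, Mahalanobis gating, cascade matching) is beyond a short sketch, but the final IoU-matching stage that yields D_nt can be illustrated with a greedy assignment. This is a simplification: Deepsort itself uses Hungarian (optimal) assignment, and the threshold value here is an assumed example.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def greedy_iou_match(tracks, dets, thresh=0.3):
    """Greedily pair predicted track boxes T_n with detections D_n.

    Returns a list of (track_index, det_index) pairs; the matched
    detections form D_nt, and unmatched detections spawn new tracks.
    """
    pairs, used = [], set()
    for ti, t in enumerate(tracks):
        scores = [(iou(t, d), di) for di, d in enumerate(dets) if di not in used]
        if not scores:
            continue
        best, di = max(scores)          # highest-IoU unused detection
        if best >= thresh:
            pairs.append((ti, di))
            used.add(di)
    return pairs
```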
In one embodiment, preferably, the identification module is configured to:
traversing all pixel points of the image data when the construction operation is identified;
calculating, for each pixel point, the maximum value among its three RGB components, and recording this maximum as the pixel's gray value;
storing one frame of image shot by the camera device every second, calculating the average value of corresponding pixels over each 600-second sliding window of gray-scale frames, and recording the average value as BG_i, i = 0, 1, …, n, wherein the first 600 seconds corresponds to BG_0;
calculating the average value of corresponding pixels over each 60-second sliding window of gray-scale frames, and recording the average value as TS_i, i = 0, 1, …, n;
subtracting BG_i from TS_i and taking the absolute value to obtain the gray difference value of each pixel point;
when the gray difference value of a pixel point is greater than the gray judgment threshold, recording the pixel point as a change point, and storing a corresponding coordinate;
and performing K-means unsupervised clustering on all stored coordinates, determining the number of clusters from the cluster count n corresponding to the maximum value of the silhouette coefficient, calculating the coordinates of each cluster's center point, and finding the corresponding points and scene information in the BIM system.
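The change-point extraction above can be sketched with NumPy. For brevity this computes only the most recent background window BG and short-term window TS rather than the full sliding sequence, and omits the K-means clustering step (which a library such as scikit-learn provides); the function name and default values are illustrative.

```python
import numpy as np

def change_points(frames, bg_win=600, ts_win=60, thresh=30):
    """Detect changed pixels per the sliding-window scheme.

    frames : uint8 array (T, H, W, 3), one RGB frame per second.
    Gray value = max(R, G, B) per pixel, as in the patent.
    BG = mean gray over the last `bg_win` frames (background estimate);
    TS = mean gray over the last `ts_win` frames (short-term estimate);
    a pixel whose |TS - BG| exceeds `thresh` is a change point.
    Returns an (N, 2) array of (row, col) coordinates.
    """
    gray = frames.max(axis=3).astype(np.float64)   # (T, H, W) gray frames
    bg = gray[-bg_win:].mean(axis=0)               # background window
    ts = gray[-ts_win:].mean(axis=0)               # short-term window
    mask = np.abs(ts - bg) > thresh
    return np.argwhere(mask)
```

The returned coordinates are what the module would then cluster with K-means, choosing the cluster count by silhouette coefficient and mapping each cluster center back into the BIM system.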
FIG. 7 is a flow diagram illustrating a construction safety macro-monitoring method in accordance with an exemplary embodiment.
As shown in fig. 7, according to a second aspect of the embodiment of the present invention, there is provided a construction safety macro monitoring method for a construction safety macro monitoring system, the method including:
step S701, acquiring image data of a construction site through a camera device, wherein the image data comprises first image data before the entering of the engineering machinery and second image data after the entering of the engineering machinery;
step S702, calibrating the camera device and the image data collected by the camera device, and aligning the image data with data in a preset construction plane layout diagram to obtain processed image data;
step S703, identifying the processed image data to determine construction elements and construction operation of a construction site;
and step S704, displaying the construction elements and the construction operation.
According to a third aspect of the embodiments of the present invention, there is provided a construction safety macro monitoring apparatus, the apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring image data of a construction site through a camera device, wherein the image data comprises first image data before the entering of the engineering machinery and second image data after the entering of the engineering machinery;
calibrating the camera device and image data acquired by the camera device, and aligning the image data with data in a preset construction floor plan to obtain processed image data;
identifying the processed image data to determine construction elements and construction operation of a construction site;
and displaying the construction elements and the construction operation.
According to a fourth aspect of the embodiments of the present invention, there is provided a construction safety macro evaluation apparatus, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring a data image of a construction site, and carrying out labeling and identification processing to determine each worker and the engineering machine of the construction site and a central point coordinate and a contour coordinate corresponding to each worker and each engineering machine;
determining an unsafe behavior value, a worker risk consciousness value and a management agility value of each worker without wearing a safety helmet according to each worker and the engineering machine on a construction site and a central point coordinate and a contour coordinate corresponding to each worker and the engineering machine;
training by adopting a fuzzy neural network algorithm according to the unsafe behavior value, the worker risk consciousness value and the management agility value in the preset time period and the corresponding preset construction safety macroscopic evaluation value to obtain a construction safety macroscopic evaluation model;
and performing construction safety macroscopic evaluation on the target construction site by using the construction safety macroscopic evaluation model.
It is further understood that the term "plurality" means two or more, and other terms are analogous. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms "first," "second," and the like are used to describe various information and that such information should not be limited by these terms. These terms are only used to distinguish one type of information from another, and do not indicate a particular order or degree of importance. Indeed, the terms "first," "second," and the like are fully interchangeable. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present invention.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (10)

1. A construction safety macro monitoring system, comprising:
the data acquisition module is used for acquiring image data of a construction site through a camera device, wherein the image data comprises first image data before the entering of the engineering machinery and second image data after the entering of the engineering machinery;
the calibration module is used for calibrating the camera device and the image data acquired by the camera device, and aligning the image data with data in a preset construction plane layout diagram to obtain processed image data;
the identification module is used for identifying the processed image data so as to determine construction elements and construction operation of a construction site;
and the visualization module is used for displaying the construction elements and the construction operation.
2. The construction safety macroscopic monitoring system according to claim 1, wherein the data acquisition module comprises a first data acquisition submodule and a second data acquisition submodule, wherein the first data acquisition submodule is used for acquiring first image data before an approach of the engineering machinery, and the second data acquisition submodule is used for acquiring second image data after the approach of the engineering machinery.
3. The construction safety macroscopic monitoring system according to claim 2, wherein the first data acquisition submodule comprises a camera device, a cradle head, a mooring airship, a remote control unit and a graph transmission unit;
the second data acquisition submodule comprises a camera device, a holder, a tower crane power supply unit, a remote control unit and a picture transmission unit;
the camera device is used for monitoring the construction site in real time;
the holder is used for carrying out attitude calibration on the lens of the camera device to ensure the angle of the lens;
the mooring airship is used for providing a mounting position for the camera device before the engineering machinery enters a field;
the remote control unit is used for remotely controlling the steering of the lens of the camera device;
the image transmission unit is used for transmitting image data acquired by the camera device to a ground industrial personal computer in real time;
and the tower crane power supply unit is used for providing installation positions for the camera device, the holder, the remote control unit and the image transmission unit and supplying power after the engineering machinery enters the field.
4. The construction safety macro monitoring system according to claim 3, wherein the calibration module includes a color calibration module, a camera calibration module and a BIM system alignment module;
the color calibration module is used for performing color calibration on the image data by adopting a perfect reflection algorithm;
the camera device calibration module is used for calibrating the position of the camera device by adopting a plurality of positioning color blocks;
and the BIM system alignment module is used for inserting the calibrated image data into a preset construction floor plan in the BIM system and performing data alignment on the image data and the preset construction floor plan.
5. The construction safety macro monitoring system according to claim 4, wherein
the color calibration module is specifically configured to:
traversing each pixel point (X_i, Y_j) of each image in the image data in RGB space, and calculating for each point the brightness sum:
C_ij = R_ij + G_ij + B_ij;
finding the point C_m with the maximum C_ij in the image to obtain its RGB values R_m, G_m, B_m, and calculating the RGB mean values of the preset number N of pixel points ranked highest by C value:
R_avg = (1/N)·ΣR_ij, G_avg = (1/N)·ΣG_ij, B_avg = (1/N)·ΣB_ij;
and calculating the gain coefficient of each channel of the image:
k_R = 255/R_avg, k_G = 255/G_avg, k_B = 255/B_avg;
thus obtaining the final RGB value of each pixel:
R'_ij = min(255, k_R·R_ij), G'_ij = min(255, k_G·G_ij), B'_ij = min(255, k_B·B_ij).
6. the construction safety macro monitoring system of claim 4, wherein the camera calibration module is configured to:
arranging three positioning color blocks in a construction plane layout picture in a BIM system, wherein the center of each color block is provided with a mirror, and each positioning color block comprises a square plastic plate;
after the captive airship is lifted off, the camera device is adjusted through a remote control unit, so that three positioning color blocks are positioned in a picture and distributed in four parts of [ (0,0), (2871,1536) ], [ (2871,0), (5742, 1536) ], [ (0, 1536), (2871, 3072) ], [ (2871,1536), (5742, 3072) ], and connecting lines among the color blocks are kept to be flush with the picture frame as much as possible;
converting the RGB picture into an HSV picture, searching pixel blocks meeting preset requirements in the four parts of pictures, recording pixel block coordinates corresponding to the maximum value and the minimum value of the four parts of pictures in the horizontal direction, adding the two pixel block coordinates for averaging to obtain coordinates of center points of three positioning color blocks, and recording the coordinates as original coordinates of the positioning color blocks; wherein the preset requirements include:H∈[125, 155], S∈[43, 255], V∈[46, 255]
recalculating the coordinates of the central points of the three positioning color blocks at preset time intervals, and calculating the difference between the coordinates and the original coordinates;
and when the difference exceeds a preset value or the coordinate of the central point of any positioning color block is lacked, outputting an alarm prompt to prompt that the calibration needs to be carried out again.
7. The construction safety macro monitoring system of claim 4, wherein the identification module is configured to:
when the construction elements are identified, training workers, mobile engineering machinery and a tower crane based on Mask-RCNN and Deepsort algorithms through construction images collected and marked in advance, respectively identifying contour coordinates of the workers, the mobile engineering machinery and the tower crane in image data of a construction site by taking minutes as a unit, and tracking each identified object; respectively counting the total number of the identified workers, the mobile engineering machinery and the tower crane in a database by taking minutes as a unit; when the outlines of workers are identified, whether the workers wear safety helmets or not and the colors of the safety helmets are identified, each identified worker, the mobile engineering machine and the tower crane are numbered, and the corresponding outline coordinates and the outline centroids of the workers, the mobile engineering machine and the tower crane are stored into the data set corresponding to the moment.
8. The construction safety macro monitoring system of claim 4, wherein the identification module is configured to:
traversing all pixel points of the image data when the construction operation is identified;
calculating the maximum value in the RGB three-component brightness of the full pixel point, and recording the maximum value as a gray value;
storing one frame of image shot by the camera device every second, calculating the average value of corresponding pixels over each 600-second sliding window of gray-scale frames, and recording the average value as BG_i, i = 0, 1, …, n, wherein the first 600 seconds corresponds to BG_0;
calculating the average value of corresponding pixels over each 60-second sliding window of gray-scale frames, and recording the average value as TS_i, i = 0, 1, …, n;
subtracting BG_i from TS_i and taking the absolute value to obtain the gray difference value of each pixel point;
when the gray difference value of a pixel point is greater than the gray judgment threshold, recording the pixel point as a change point, and storing a corresponding coordinate;
and performing K-means unsupervised clustering on all stored coordinates, determining the number of clusters from the cluster count n corresponding to the maximum value of the silhouette coefficient, calculating the coordinates of each cluster's center point, and finding the corresponding points and scene information in the BIM system.
9. A construction safety macroscopic monitoring method is characterized by comprising the following steps:
acquiring image data of a construction site through a camera device, wherein the image data comprises first image data before the entering of the engineering machinery and second image data after the entering of the engineering machinery;
calibrating the camera device and image data acquired by the camera device, and aligning the image data with data in a preset construction floor plan to obtain processed image data;
identifying the processed image data to determine construction elements and ongoing construction work of a construction site;
and displaying the construction elements and the construction operation.
10. A construction safety macro monitoring apparatus, characterized in that the apparatus comprises:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring image data of a construction site through a camera device, wherein the image data comprises first image data before an engineering machine enters the field and second image data after the engineering machine enters the field;
calibrating the camera device and image data acquired by the camera device, and aligning the image data with data in a preset construction floor plan to obtain processed image data;
identifying the processed image data to determine construction elements and construction operation of a construction site;
and displaying the construction elements and the construction operation.
CN202210339974.7A 2022-04-02 2022-04-02 Construction safety macroscopic monitoring system and method Active CN114727064B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210339974.7A CN114727064B (en) 2022-04-02 2022-04-02 Construction safety macroscopic monitoring system and method


Publications (2)

Publication Number Publication Date
CN114727064A true CN114727064A (en) 2022-07-08
CN114727064B CN114727064B (en) 2022-11-25

Family

ID=82241373

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210339974.7A Active CN114727064B (en) 2022-04-02 2022-04-02 Construction safety macroscopic monitoring system and method

Country Status (1)

Country Link
CN (1) CN114727064B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436665A (en) * 2011-08-25 2012-05-02 清华大学 Two-dimensional plane representation method for images of alimentary tract
CN103295084A (en) * 2012-03-02 2013-09-11 姜海西 Highway engineering two-dimensional office system building method based on Google earth
US20150260541A1 (en) * 2014-03-13 2015-09-17 Christopher Lacy Smith Map content management
CN107667366A (en) * 2015-03-24 2018-02-06 开利公司 System and method for capturing and analyzing multidimensional building information
KR101873839B1 (en) * 2017-09-22 2018-07-04 한국해양과학기술원 System and method for displaying augmented reality of view of underwater construction robot
CN109564651A (en) * 2016-05-19 2019-04-02 思比机器人公司 Method for automatically generating the planogram for the shelf structure being assigned to product in shop
CN111210442A (en) * 2020-01-02 2020-05-29 广东博智林机器人有限公司 Drawing image positioning and correcting method and device and electronic equipment
CN111611937A (en) * 2020-05-22 2020-09-01 陈金山 Prison personnel abnormal behavior monitoring method based on BIM and neural network
WO2021044422A1 (en) * 2019-09-04 2021-03-11 Lightyx Systems Ltd. System and method for controlling a light projector in a construction site
CN112508417A (en) * 2020-12-11 2021-03-16 民航中南机场设计研究院(广州)有限公司 Civil aviation professional engineering general contract project management system based on BIM technology


Non-Patent Citations (2)

Title
LI Guanjie: "Research on construction planning and key construction techniques of a high-rise public housing project in Macau", Master's thesis, Huaqiao University *
LI Jianhua, ZHANG Rui: "Research and application of key technologies of reverse engineering", Manufacturing Automation *

Also Published As

Publication number Publication date
CN114727064B (en) 2022-11-25

Similar Documents

Publication Publication Date Title
CN111275759B (en) Transformer substation disconnecting link temperature detection method based on unmanned aerial vehicle double-light image fusion
CN114727063B (en) Path safety monitoring system, method and device for construction site
US20230186680A1 (en) Information processing device and recognition support method
CN102867417B (en) Taxi anti-forgery system and taxi anti-forgery method
CN111080679A (en) Method for dynamically tracking and positioning indoor personnel in large-scale place
CN109255568A (en) A kind of intelligent warehousing system based on image recognition
CN111275923B (en) Man-machine collision early warning method and system for construction site
CN115311592B (en) Construction site material safety evaluation system based on computer vision technology
CN114727064B (en) Construction safety macroscopic monitoring system and method
CN114511592A (en) Personnel trajectory tracking method and system based on RGBD camera and BIM system
CN112101260B (en) Method, device, equipment and storage medium for identifying safety belt of operator
CN114489143B (en) Unmanned aerial vehicle management system, method and device for construction safety risk monitoring
CN112800918A (en) Identity recognition method and device for illegal moving target
CN115373416B (en) Intelligent inspection method for railway electric power through line
CN115994953A (en) Power field security monitoring and tracking method and system
CN115049975A (en) Method and system for dynamically displaying safety activity factors of construction site
CN115909094A (en) Underground pile foundation construction progress identification method based on 2D image and video fusion
CN115083212A (en) Unmanned aerial vehicle location intelligent management system based on three-dimensional modeling
CN111343431B (en) Airport target detection system based on image rectification
CN114067365A (en) Safety helmet wearing detection method and system based on central attention centripetal network
CN106248058A (en) A kind of for the storage localization method of means of transport, Apparatus and system
CN110909606A (en) Transformer substation personnel behavior detection method based on deep learning
CN117424988B (en) Image processing system and processing method for intelligently managing welding machine
CN114783000B (en) Method and device for detecting dressing standard of worker in bright kitchen range scene
CN113091627B (en) Method for measuring vehicle height in dark environment based on active binocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant