CN115294533A - Building construction state monitoring method based on data processing - Google Patents

Building construction state monitoring method based on data processing

Info

Publication number
CN115294533A
Authority
CN
China
Prior art keywords
construction
worker
state
area
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211205586.6A
Other languages
Chinese (zh)
Other versions
CN115294533B (en)
Inventor
张杰
邱凯铤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Yiyun Zhilian Information Technology Co ltd
Original Assignee
Nantong Yiyun Zhilian Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Yiyun Zhilian Information Technology Co ltd filed Critical Nantong Yiyun Zhilian Information Technology Co ltd
Priority to CN202211205586.6A priority Critical patent/CN115294533B/en
Publication of CN115294533A publication Critical patent/CN115294533A/en
Application granted granted Critical
Publication of CN115294533B publication Critical patent/CN115294533B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/08 Construction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Abstract

The invention relates to the technical field of data processing, and in particular to a building construction state monitoring method based on data processing. The method collects multiple frames of panoramic images of a building construction site and divides each frame into a worker area and building equipment areas; detects the human body key points of each worker in the worker area, constructs an action database, compares the key point coordinates of the dangerous construction actions contained in the database with the key point coordinates of each worker, and analyzes the construction behavior state of each worker; and detects each worker's safety helmet wearing condition and the equipment construction state in each building equipment area, so that danger early warning can be performed according to the construction behavior state, the helmet wearing condition and the equipment construction state. Detection by region improves detection precision and avoids the influence of the complex environment on a whole-image analysis; at the same time, construction tools and equipment with problems can be overhauled in time, which improves construction efficiency and safeguards safety during the construction process.

Description

Building construction state monitoring method based on data processing
Technical Field
The invention relates to the technical field of data processing, in particular to a building construction state monitoring method based on data processing.
Background
As a high-risk industry, the construction industry faces a severe safety situation, and supervision and monitoring of the construction site are the core of safety management. Construction state monitoring mainly checks the safety state of the project structure during construction and detects the construction state of workers on site, so that managers can efficiently manage and guide the construction site.
At present, most supervision relies on on-site inspection to find and record illegal operations and to prevent workers' dangerous actions from turning into accidents. However, the number and energy of managers are limited, so real-time monitoring of workers' construction state is difficult to achieve: the approach is not timely, the inspection workload is large, and supervision efficiency is low. Efficient and accurate detection and analysis of the construction site state is hard to achieve by manually and subjectively watching videos: abnormal conditions are easily missed, and human negligence or subjective bias can lead to wrong judgments and therefore to wrong analysis reports.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a building construction state monitoring method based on data processing, which adopts the following technical scheme:
collecting a multi-frame panorama of a building construction site; detecting key points of a human body on the panoramic image to divide the panoramic image into an unmanned area and a worker area; dividing the unmanned area into a plurality of construction equipment areas based on the positions of the construction equipment;
detecting human body key points of each worker in the worker area to obtain two-dimensional coordinates of the human body key points of each worker; forming the two-dimensional coordinates of all human body key points of the current worker into a two-dimensional sequence, obtaining a three-dimensional sequence corresponding to the two-dimensional sequence by utilizing a TCN (temporal convolutional network), and forming the three-dimensional sequences of the current worker in the multi-frame panorama into a three-dimensional action sequence; constructing an action database of various dangerous construction actions based on a simulator, and acquiring a construction behavior state index of each worker based on the action database; carrying out safety helmet wearing detection on each worker to obtain a safety helmet wearing index of each worker, and constructing a 2 x N-dimensional worker area monitoring matrix by using the construction behavior state index and the safety helmet wearing index of each worker, wherein N refers to the number of workers, and N is a positive integer;
obtaining equipment construction state indexes of each building equipment area according to state similarity between real-time states of the building equipment in the building equipment area and standard states in normal work to form a 1 x M-dimensional construction equipment monitoring matrix, wherein M is the number of the building equipment and is a positive integer;
and respectively carrying out danger early warning on workers and construction equipment according to the worker area monitoring matrix and the construction equipment monitoring matrix.
Further, the method for constructing the action database of the plurality of dangerous construction actions based on the simulator comprises the following steps:
obtaining standard three-dimensional coordinates of each human body key point under the current dangerous construction action by using a simulator to form a standard three-dimensional coordinate sequence; and forming an action database by using various dangerous construction actions and the corresponding standard three-dimensional coordinate sequences.
Further, the method for obtaining the construction behavior state index of each worker based on the action database comprises the following steps:
respectively calculating the construction behavior state index of each dangerous construction action corresponding to the current worker according to the three-dimensional action sequence of the current worker and each standard three-dimensional coordinate sequence in the action database;
and taking the maximum construction behavior state index as the final construction behavior state index of the current worker.
Further, the calculation formula of the construction behavior state index is as follows:
$$G = \frac{1}{T \cdot K}\sum_{t=1}^{T}\sum_{i=1}^{K}\left\| x_{i,t} - s_i \right\|_2$$

wherein $G$ is the construction behavior state index; $T$ is the sequence length of the worker's three-dimensional action sequence; $K$ is the number of the worker's human body key points; $x_{i,t}$ is the three-dimensional coordinate of the worker's human body key point $i$ corresponding to the $t$-th frame panorama; and $s_i$ is the standard three-dimensional coordinate of human body key point $i$ in the standard three-dimensional coordinate sequence.
Further, the method for detecting the wearing of the safety helmet of each worker to obtain the wearing index of the safety helmet of each worker comprises the following steps:
acquiring a safety helmet surrounding frame and a human body surrounding frame of a worker by using a target detection network, calculating the intersection area between the safety helmet surrounding frame and the human body surrounding frame, and taking the ratio of the intersection area to the area of the safety helmet surrounding frame as an initial wearing detection index of each worker;
based on the initial wearing detection index, confirming the safety helmet wearing index of each worker according to the worker position and the safety helmet position, wherein the calculation formula of the safety helmet wearing index is as follows:
$$d_1 = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$$

$$d_2 = \sqrt{(x_c - x_f)^2 + (y_c - y_f)^2}$$

$$W = \begin{cases} 1, & Q_0 \geq Th \ \text{and} \ d_1 \geq d_2 \\ 0, & \text{otherwise} \end{cases}$$

in the formula, $W$ is the safety helmet wearing index; $(x_1, y_1)$ is the coordinate of the upper left corner of the safety helmet surrounding frame, $(x_2, y_2)$ is the coordinate of the lower right corner of the safety helmet surrounding frame, and $d_1$ is the first distance; $(x_c, y_c)$ is the coordinate of the central point of the safety helmet surrounding frame, $(x_f, y_f)$ is the coordinate of the center point of the face of the worker, and $d_2$ is the second distance; $Q_0$ is the initial wearing detection index; and $Th$ is the wearing detection index threshold.
Further, the method for acquiring the equipment construction state index of each building equipment area includes:
acquiring a standard state set formed by images of all building equipment areas and images of standard states of all building construction equipment during normal work; extracting the same characteristic points of each building equipment area image and each standard state image respectively, and performing characteristic point matching on each building equipment area image and each standard state image in a standard state set respectively to obtain a plurality of characteristic point pairs;
calculating, according to the matched feature point pairs, the similarity between the $i$-th building equipment area image and the $j$-th standard state image in the standard state set, obtaining the similarity set of the $i$-th building equipment area image, and taking the maximum similarity in the similarity set as the equipment construction state index of the $i$-th building equipment area, wherein the calculation formula of the similarity is as follows:

$$S_{i,j} = \sum_{t=1}^{n} \frac{1}{1 + d_t}$$

in the formula, $n$ is the number of matched feature point pairs, $d_t$ is the Euclidean distance between the $t$-th pair of feature points, and $S_{i,j}$ is the similarity between the $i$-th building equipment area image and the $j$-th standard state image.
The embodiment of the invention at least has the following beneficial effects: construction safety state detection is performed on the construction equipment in the unmanned area, and construction behavior analysis and safety helmet wearing detection are performed on the workers in the worker area, so that safety detection of the whole construction scene is realized. Detection by region improves detection precision and avoids the influence of the complex environment on a whole-image analysis; at the same time, construction tools and equipment with problems can be overhauled in time, which improves construction efficiency and safeguards safety during the construction process.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative effort.
Fig. 1 is a flowchart illustrating steps of a building construction state monitoring method based on data processing according to an embodiment of the present invention.
Detailed Description
To further illustrate the technical means and effects adopted by the present invention to achieve the predetermined objects, the building construction state monitoring method based on data processing, its specific implementation, structure, features and effects are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of the building construction state monitoring method based on data processing in detail with reference to the accompanying drawings.
Referring to fig. 1, a flow chart illustrating steps of a building construction state monitoring method based on data processing according to an embodiment of the present invention is shown, where the method includes the following steps:
s001, collecting a multi-frame panorama of a building construction site; detecting key points of a human body on the panoramic image to divide the panoramic image into an unmanned area and a worker area; the unmanned area is divided into a plurality of construction equipment areas based on the locations of the construction equipment.
Specifically, a camera carried by an unmanned aerial vehicle collects images of the construction site, and the acquired images are spliced and fused to obtain multiple frames of panoramas of the construction site, so that the safety condition of each construction area can be monitored. Parameters such as the flying height and speed of the unmanned aerial vehicle and the shooting frame rate of the camera are set by the implementer according to the actual situation.
Considering that the building construction environment is relatively complex and the areas of a construction site differ greatly from one another, the construction site is processed by partition in order to monitor the building construction state accurately. The area partition process is as follows:
(1) Detecting the human body key points of the panorama through a key point detection network to obtain the corresponding key point thermodynamic diagram.
Specifically, the training process of the key point detection network is as follows: firstly, label data are made by marking key points on the head of the human body, namely their X and Y coordinates, and the marked head scatter diagram is convolved with a Gaussian kernel to form the key point thermodynamic diagram corresponding to the panorama; then the panorama and the label image are used as training data of the key point detection network: the key point detection encoder extracts features from the image to obtain a corresponding feature map, the key point detection decoder up-samples the feature map and performs further feature extraction, and finally the key point thermodynamic diagram corresponding to the panorama is generated; the key point detection network is supervised and trained with a mean square error loss function.
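A minimal sketch of the label-making step, assuming an image size, head coordinates and Gaussian sigma chosen purely for illustration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def make_keypoint_heatmap(head_points, height, width, sigma=3.0):
    """Convolve a head-coordinate scatter map with a Gaussian kernel to obtain
    the key point thermodynamic diagram used as label data."""
    scatter = np.zeros((height, width), dtype=np.float32)
    for x, y in head_points:                      # (X, Y) annotations of head key points
        scatter[int(round(y)), int(round(x))] = 1.0
    heatmap = gaussian_filter(scatter, sigma=sigma)
    if heatmap.max() > 0:                         # normalize peaks for a stable target
        heatmap /= heatmap.max()
    return heatmap

def mse_loss(pred, target):
    """Mean square error loss supervising the key point detection network."""
    return float(np.mean((pred - target) ** 2))

# Example: two annotated heads in a 120 x 160 panorama crop (coordinates are illustrative).
label = make_keypoint_heatmap([(40, 30), (100, 80)], height=120, width=160)
print(label.shape, mse_loss(np.zeros_like(label), label))
```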
(2) Based on the human body key points in the key point thermodynamic diagram, a convex hull algorithm is utilized to divide the building construction site into a worker area and an unmanned area.
Specifically, the minimum convex hull of the region where the human body key points are located in the key point thermodynamic diagram is obtained through a convex hull algorithm, each human body key point region is obtained and used as a worker region, and then other regions in the key point thermodynamic diagram are used as unmanned regions, so that the worker region and the unmanned regions in the panoramic diagram are obtained.
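A minimal sketch of the convex hull division, assuming OpenCV is available and, for simplicity, that all detected key points belong to one worker cluster; the key point coordinates and image size are illustrative:

```python
import numpy as np
import cv2

def split_worker_and_unmanned(keypoints_xy, height, width):
    """Take the minimal convex hull of detected human key points as the worker
    area mask; everything outside the hull is treated as the unmanned area."""
    pts = np.asarray(keypoints_xy, dtype=np.int32).reshape(-1, 1, 2)
    hull = cv2.convexHull(pts)                        # minimal convex hull of key points
    worker_mask = np.zeros((height, width), dtype=np.uint8)
    cv2.fillConvexPoly(worker_mask, hull, 1)          # 1 inside the hull = worker area
    unmanned_mask = 1 - worker_mask                   # complement = unmanned area
    return worker_mask, unmanned_mask

# Illustrative key point coordinates in a 200 x 300 panorama.
worker, unmanned = split_worker_and_unmanned([(50, 60), (120, 40), (90, 150), (60, 130)], 200, 300)
print(worker.sum(), unmanned.sum())
```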
(3) The unmanned area is divided into a plurality of construction equipment areas according to the positions of the construction equipment.
Specifically, the unmanned area is cut out from the panorama to obtain an RGB image of the unmanned area, and each building equipment area in the unmanned area is obtained by using a target detection network. The training process of the target detection network is as follows: label data are made for the RGB image by annotating the bounding box of each piece of building equipment, namely the center point coordinates $(x, y)$ and the width and height information $(w, h)$ of the bounding box; bounding box detection of the building equipment in the unmanned area is then performed by the target detection network, and the network is supervised and trained with a mean square error loss function.
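A minimal sketch of the bounding-box label format and mean square error supervision described above; the box coordinates are illustrative:

```python
import numpy as np

def box_to_label(x_min, y_min, x_max, y_max):
    """Convert an annotated building-equipment bounding box into the
    (center_x, center_y, width, height) regression target described above."""
    cx, cy = (x_min + x_max) / 2.0, (y_min + y_max) / 2.0
    w, h = x_max - x_min, y_max - y_min
    return np.array([cx, cy, w, h], dtype=np.float32)

def bbox_mse_loss(pred_boxes, target_boxes):
    """Mean square error between predicted and annotated box parameters."""
    pred, target = np.asarray(pred_boxes), np.asarray(target_boxes)
    return float(np.mean((pred - target) ** 2))

label = box_to_label(10, 20, 110, 220)            # illustrative equipment box
print(label, bbox_mse_loss(label + 2.0, label))
```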
Step S002, detecting the human body key points of each worker in the worker area to obtain the two-dimensional coordinates of the human body key points of each worker, forming the two-dimensional coordinates of all human body key points of the current worker into a two-dimensional sequence, obtaining a three-dimensional sequence corresponding to the two-dimensional sequence by utilizing a TCN (temporal convolutional network), forming the three-dimensional sequences of the current worker in the multi-frame panorama into a three-dimensional action sequence, and obtaining the construction behavior state index of each worker based on the standard three-dimensional sequence of each dangerous construction action in the action database; carrying out safety helmet wearing detection on each worker to obtain the safety helmet wearing index of each worker, and constructing a 2 x N-dimensional worker area monitoring matrix by using the construction behavior state index and the safety helmet wearing index of each worker, wherein N refers to the number of workers, and N is a positive integer.
Specifically, in order to detect the construction state of each area more accurately, improve the detection precision of the construction state and avoid the influence of the complex environment on a whole-image analysis, worker state analysis is performed on the worker area of the construction site. The specific process is:
(1) Cutting the worker area out of the panorama to obtain an RGB image of the worker area, and extracting the human body key points of each worker in the RGB image by using a key point detection network.
Specifically, a key point detection network is used for extracting human body key points of workers, and the human body key points in the embodiment of the invention mainly comprise the following components: left and right eyes, left and right ears, nose, mouth, neck, left and right shoulders, left and right elbow joints, left and right wrist joints, hip joints, left and right knee joints, and left and right ankle joints. The method comprises the steps of making label data based on set human key points, training a key point detection network through the label data and RGB images of a worker area, performing network supervision by adopting a mean square error loss function, and continuously updating network parameters.
(2) The two-dimensional coordinates of the human body key points of each worker in each frame of panorama are obtained in step (1), and the two-dimensional coordinates of all human body key points of one worker form a two-dimensional sequence. In order to analyze the construction actions of the workers, a three-dimensional sequence corresponding to the two-dimensional sequence is obtained with a TCN (temporal convolutional network); the process of obtaining the three-dimensional sequence through the TCN is a known technique and is not described in this invention. The three-dimensional sequences of the same worker in consecutive multi-frame panoramas form a three-dimensional action sequence, and thus the three-dimensional action sequences of all workers are obtained.
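A minimal sketch of the 2D-to-3D lifting step, assuming PyTorch and 17 human body key points; a production TCN would use deeper dilated residual blocks, so this is only a structural illustration:

```python
import torch
import torch.nn as nn

class Lift2Dto3DTCN(nn.Module):
    """Minimal temporal convolutional network lifting a 2D key point sequence
    (T frames x K key points x 2) to a 3D sequence (T x K x 3)."""
    def __init__(self, num_keypoints=17, channels=128):
        super().__init__()
        self.k = num_keypoints
        self.net = nn.Sequential(
            nn.Conv1d(num_keypoints * 2, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=3, padding=2, dilation=2),
            nn.ReLU(),
            nn.Conv1d(channels, num_keypoints * 3, kernel_size=1),
        )

    def forward(self, seq_2d):                            # seq_2d: (batch, T, K, 2)
        b, t, k, _ = seq_2d.shape
        x = seq_2d.reshape(b, t, k * 2).transpose(1, 2)   # (batch, 2K, T)
        y = self.net(x).transpose(1, 2)                   # (batch, T, 3K)
        return y.reshape(b, t, k, 3)

model = Lift2Dto3DTCN(num_keypoints=17)
print(model(torch.randn(1, 27, 17, 2)).shape)             # torch.Size([1, 27, 17, 3])
```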
(3) Establishing an action database of various dangerous construction actions based on the simulator.
Specifically, the standard three-dimensional coordinates of each human body key point corresponding to a dangerous construction action are obtained in a simulation mode to form the standard three-dimensional coordinate sequence of that dangerous construction action. Different dangerous construction actions and the establishment of their standard three-dimensional coordinate sequences are prior art, and the implementer can select the dangerous construction action types and obtain the corresponding standard three-dimensional coordinate sequences; a large number of dangerous construction actions need to be acquired to guarantee the detection precision of the workers' construction state. Each dangerous construction action and its corresponding standard three-dimensional coordinate sequence form the action database, which therefore contains a large number of dangerous construction actions and is used for detecting the construction state of workers.
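A minimal sketch of how such an action database can be organized, assuming each dangerous construction action is stored as the simulator's standard three-dimensional key point coordinates; the action names, key point count and zero values are placeholders:

```python
import numpy as np

# Each dangerous construction action maps to its standard three-dimensional
# coordinate sequence: one (num_keypoints, 3) array produced by the simulator.
action_database = {
    "leaning_over_edge": np.zeros((17, 3), dtype=np.float32),        # placeholder values
    "climbing_without_support": np.zeros((17, 3), dtype=np.float32), # placeholder values
}
```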
(4) Acquiring the construction behavior state index of each worker based on the action database.
Specifically, after the action database is obtained, matching is performed according to the three-dimensional action sequence of each worker and each dangerous construction action in the action database, so as to analyze a construction behavior state index of each worker corresponding to each dangerous construction action, wherein the construction behavior state index represents a normative degree of the construction action and is used for explaining the danger of the construction behavior state of the worker, and a calculation formula of the construction behavior state index is as follows:
$$G = \frac{1}{T \cdot K}\sum_{t=1}^{T}\sum_{i=1}^{K}\left\| x_{i,t} - s_i \right\|_2$$

wherein $G$ is the construction behavior state index; $T$ is the sequence length of the worker's three-dimensional action sequence; $K$ is the number of the worker's human body key points; $x_{i,t}$ is the three-dimensional coordinate of the worker's human body key point $i$ at moment $t$, i.e. the three-dimensional coordinate corresponding to the $t$-th frame panorama; and $s_i$ is the standard three-dimensional coordinate of human body key point $i$ in the standard three-dimensional coordinate sequence.
It should be noted that, the greater the difference between the three-dimensional coordinates of the worker and the standard three-dimensional coordinates, the greater the value of the construction behavior state index, and the higher the corresponding construction action risk degree.
The construction behavior state index between the three-dimensional action sequence of the current worker and the standard three-dimensional sequence of each dangerous construction action in the action database is obtained with the above calculation formula, and the maximum construction behavior state index $G_{max}$ is taken as the final construction behavior state index of the current worker; the construction behavior state index $G_{max}$ of each worker can be obtained in the same way.
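A minimal sketch of the construction behavior state index, assuming the reconstructed average-Euclidean-distance form given above; the sequence length, key point count and random values are illustrative:

```python
import numpy as np

def behavior_state_index(action_seq, standard_pose):
    """Construction behavior state index as reconstructed above: the average
    Euclidean distance between the worker's three-dimensional action sequence
    (T x K x 3) and the standard key point coordinates (K x 3) of one
    dangerous construction action."""
    diff = np.asarray(action_seq) - np.asarray(standard_pose)[np.newaxis, :, :]
    return float(np.mean(np.linalg.norm(diff, axis=-1)))

def final_behavior_index(action_seq, action_database):
    """Evaluate the index against every dangerous action in the database and
    keep the maximum as the final construction behavior state index."""
    return max(behavior_state_index(action_seq, std) for std in action_database.values())

# Illustrative 27-frame sequence with 17 key points and a two-action database.
rng = np.random.default_rng(0)
seq = rng.normal(size=(27, 17, 3))
db = {"action_a": rng.normal(size=(17, 3)), "action_b": rng.normal(size=(17, 3))}
print(final_behavior_index(seq, db))
```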
(5) To comprehensively detect the workers' construction state, whether a worker wears a safety helmet during construction is also detected. The detection process is as follows: the RGB image of the worker area is used as the input of a target detection network; the label data are the safety helmet surrounding frame and the human body surrounding frame of each worker; the target detection network is supervised and trained with a mean square error loss function; the trained target detection network can then detect the safety helmet surrounding frame and the human body surrounding frame of each worker. Further, preliminary detection of helmet wearing is performed for each worker from the area of the safety helmet surrounding frame and the area of the human body surrounding frame, so as to obtain the initial wearing detection index of each worker. The calculation formula of the initial wearing detection index is:
$$Q_0 = \frac{S_{helmet} \cap S_{body}}{S_{helmet}}$$

in the formula, $Q_0$ is the initial wearing detection index; $S_{helmet}$ is the area of the safety helmet surrounding frame; $S_{body}$ is the area of the human body surrounding frame; and $S_{helmet} \cap S_{body}$ denotes the intersection area between the safety helmet surrounding frame and the human body surrounding frame.
The larger the intersection between the safety helmet surrounding frame and the human body surrounding frame, the more likely the worker is wearing a safety helmet, and the larger the corresponding initial wearing detection index $Q_0$.
A wearing detection index threshold $Th$ is set. When the initial wearing detection index $Q_0$ is greater than or equal to the wearing detection index threshold $Th$, the worker is considered suspected of wearing a safety helmet; otherwise the worker is considered not to be wearing a safety helmet. Preferably, in the embodiment of the present invention, the wearing detection index threshold is set to $Th = 0.35$.
Considering that during construction a safety helmet is often held in a worker's hand, and that workers gathered together for construction may overlap one another, the embodiment of the present invention further confirms, based on the initial wearing detection index, the safety helmet wearing index of each worker according to the worker position and the safety helmet position, so that the workers' condition during construction can be detected accurately. The calculation formula of the safety helmet wearing index is as follows:
$$d_1 = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$$

$$d_2 = \sqrt{(x_c - x_f)^2 + (y_c - y_f)^2}$$

$$W = \begin{cases} 1, & Q_0 \geq Th \ \text{and} \ d_1 \geq d_2 \\ 0, & \text{otherwise} \end{cases}$$

in the formula, $W$ is the safety helmet wearing index: a value of 1 represents that the worker wears a safety helmet, and a value of 0 represents that the worker does not wear a safety helmet; $(x_1, y_1)$ is the coordinate of the upper left corner of the safety helmet surrounding frame, $(x_2, y_2)$ is the coordinate of the lower right corner of the safety helmet surrounding frame, and $d_1$ is the first distance; $(x_c, y_c)$ is the coordinate of the central point of the safety helmet surrounding frame, $(x_f, y_f)$ is the coordinate of the center point of the face of the worker, where $x_f$ and $y_f$ are the means of the center coordinates of the eyes, ears, nose and mouth key points, and $d_2$ is the second distance; $Q_0$ is the initial wearing detection index; and $Th$ is the wearing detection index threshold.
The second distance between the coordinate of the center point of the worker's face and the coordinate of the center point of the safety helmet surrounding frame, and the first distance between the coordinate of the upper left corner and the coordinate of the lower right corner of the safety helmet surrounding frame, are calculated; the larger the positive difference between the first distance and the second distance, the more likely the worker is wearing the safety helmet, and otherwise the worker is not wearing it.
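A minimal sketch of the two-stage helmet check, assuming the reconstructed formulas above (intersection-over-helmet-area for the initial index, then the threshold plus first-distance/second-distance comparison for the final index); the boxes, face key points and the threshold 0.35 follow the description, but the exact combination of conditions is an assumption:

```python
import numpy as np

def initial_wear_index(helmet_box, body_box):
    """Ratio of the helmet/body box intersection area to the helmet box area."""
    hx1, hy1, hx2, hy2 = helmet_box
    bx1, by1, bx2, by2 = body_box
    iw = max(0.0, min(hx2, bx2) - max(hx1, bx1))
    ih = max(0.0, min(hy2, by2) - max(hy1, by1))
    helmet_area = max(1e-6, (hx2 - hx1) * (hy2 - hy1))
    return (iw * ih) / helmet_area

def helmet_wearing_index(helmet_box, face_keypoints, initial_index, threshold=0.35):
    """1 if the worker is judged to wear the helmet, 0 otherwise: the initial index
    must reach the threshold and the helmet-box diagonal (first distance) must not
    be smaller than the helmet-center-to-face-center distance (second distance)."""
    hx1, hy1, hx2, hy2 = helmet_box
    d1 = np.hypot(hx2 - hx1, hy2 - hy1)                        # first distance
    helmet_center = np.array([(hx1 + hx2) / 2.0, (hy1 + hy2) / 2.0])
    face_center = np.mean(np.asarray(face_keypoints, dtype=np.float32), axis=0)
    d2 = np.linalg.norm(helmet_center - face_center)           # second distance
    return 1 if (initial_index >= threshold and d1 >= d2) else 0

helmet = (48, 20, 88, 52)                                      # illustrative boxes / key points
body = (40, 18, 110, 200)
face = [(60, 60), (76, 60), (55, 66), (81, 66), (68, 70), (68, 78)]  # eyes, ears, nose, mouth
q0 = initial_wear_index(helmet, body)
print(q0, helmet_wearing_index(helmet, face, q0))
```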
(6) Based on steps (1) to (5), the construction behavior state index $G_{max}$ and the safety helmet wearing index $W$ of each worker in the worker area can be obtained, and the construction behavior state index and the safety helmet wearing index of each worker then construct a 2 x N-dimensional worker area monitoring matrix, where N refers to the number of workers and is a positive integer greater than 0, and each column corresponds to one worker.
And S003, obtaining equipment construction state indexes of each building equipment area according to the state similarity between the real-time state of the building equipment in the building equipment area and the standard state in normal work to form a 1 x M-dimensional construction equipment monitoring matrix, wherein M is the number of the building equipment, and M is a positive integer.
Specifically, the RGB image corresponding to each building equipment area is cut out to obtain the building equipment area image, so as to avoid the influence of other unrelated areas. Standard state images of each piece of building construction equipment during normal work are acquired through unmanned aerial vehicle aerial photography to constitute the standard state set of the building construction equipment, and then the similarity between each building equipment area image and each standard state image in the standard state set is calculated: feature points in the building equipment area image are extracted with the SIFT feature point detection algorithm, the same feature points are likewise extracted in the standard state image, the feature points are then matched to obtain a plurality of feature point pairs, and the similarity between the building equipment area image and the standard state image is calculated according to the matched feature point pairs. The calculation formula of the similarity is as follows:
$$S_{i,j} = \sum_{t=1}^{n} \frac{1}{1 + d_t}$$

in the formula, $n$ is the number of matched feature point pairs, $d_t$ is the Euclidean distance between the $t$-th pair of feature points, and $S_{i,j}$ is the similarity between the $i$-th building equipment area image and the $j$-th standard state image.
The smaller the Euclidean distance between the feature point pairs, the greater the similarity, and the greater the number of matched feature point pairs, the greater the similarity.
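A minimal sketch of the similarity computation, assuming OpenCV's SIFT implementation and using the descriptor distance of matched pairs as the Euclidean distance (the description does not specify whether the distance is taken between descriptors or between matched point coordinates); the scoring follows the reconstructed sum-of-1/(1+d) form above:

```python
import cv2
import numpy as np

def equipment_similarity(area_img, standard_img):
    """Match SIFT feature points between a building equipment area image and one
    standard state image; more matched pairs and smaller distances give a larger
    similarity, mirroring the reconstructed formula above."""
    sift = cv2.SIFT_create()
    _, desc_a = sift.detectAndCompute(area_img, None)
    _, desc_s = sift.detectAndCompute(standard_img, None)
    if desc_a is None or desc_s is None:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)     # one-to-one matching
    matches = matcher.match(desc_a, desc_s)
    if not matches:
        return 0.0
    d = np.array([m.distance for m in matches], dtype=np.float32)
    return float(np.sum(1.0 / (1.0 + d)))                     # grows with n, shrinks with d

def equipment_state_index(area_img, standard_set):
    """Maximum similarity over the standard state set is the equipment
    construction state index of this building equipment area."""
    return max(equipment_similarity(area_img, std) for std in standard_set)

# Illustrative call on a synthetic grayscale image (real use: cropped area images).
img = (np.arange(120 * 160, dtype=np.uint8) % 251).reshape(120, 160)
print(equipment_state_index(img, [img.copy()]))
```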
Based on the above similarity calculation, the similarity between the $i$-th building equipment area image and each standard state image in the standard state set can be obtained, forming the similarity set of the $i$-th building equipment area image; the maximum similarity in the similarity set is then taken as the equipment construction state index $S_i$ of the $i$-th building equipment area.
It should be noted that the equipment construction state index is normalized, so that the function value is 0 to 1, and the working state of the building construction equipment in the construction process is conveniently monitored in real time.
Further, a 1 x M-dimensional construction equipment monitoring matrix is formed from the equipment construction state indexes of the building equipment areas, where M is the number of pieces of construction equipment and M is a positive integer greater than 0.
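A minimal sketch of assembling the 1 x M construction equipment monitoring matrix; the normalization to the 0-1 range is assumed here to be min-max normalization over the observed indexes, since the description only states that the index is normalized:

```python
import numpy as np

def construction_equipment_matrix(state_indices):
    """Arrange the equipment construction state indexes into the 1 x M monitoring
    matrix, normalizing them to [0, 1] (min-max normalization assumed here)."""
    v = np.asarray(state_indices, dtype=np.float32)
    span = float(v.max() - v.min())
    normalized = (v - v.min()) / span if span > 0 else np.ones_like(v)
    return normalized.reshape(1, -1)                  # shape (1, M)

print(construction_equipment_matrix([3.2, 7.5, 5.1]))
```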
And step S004, respectively carrying out danger early warning on workers and construction equipment according to the worker area monitoring matrix and the construction equipment monitoring matrix.
Specifically, safety monitoring of the construction site not only monitors the construction state of each worker in real time, but also monitors the working state of each piece of building equipment during construction, and gives corresponding early warning prompts according to the monitoring results so that the relevant managers are reminded in time to take corresponding measures as early as possible. The early warning prompt process is as follows: a behavior state index threshold is set; when the behavior state index is higher than the behavior state index threshold, the construction behavior is considered dangerous and a safety risk exists, the corresponding worker number is broadcast, and managers are promptly reminded to take corresponding measures; the numbers of workers whose safety helmet wearing index is 0 are also broadcast, and the relevant workers are promptly reminded to wear their safety helmets so as to ensure construction safety. Similarly, an equipment construction state index threshold is set; when the equipment construction state index is lower than the equipment construction state index threshold, an early warning is issued for that building equipment area, and maintenance personnel are reminded to overhaul the building construction equipment of the corresponding number, which avoids the lack of real-time response of traditional manual inspection of construction equipment and improves construction efficiency.
Preferably, in the embodiment of the present invention, the behavior state index threshold and the equipment construction state index threshold are both empirical values: the equipment construction state index threshold is 0.5, and the behavior state index threshold is 0.6.
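A minimal sketch of the early warning logic, using the behavior state index threshold 0.6 and the equipment construction state index threshold 0.5 given above; the matrix values and message wording are illustrative:

```python
def construction_warnings(worker_matrix, equipment_matrix,
                          behavior_threshold=0.6, equipment_threshold=0.5):
    """Scan the 2 x N worker matrix (row 0: behavior state index, row 1: helmet
    wearing index) and the 1 x M equipment matrix and collect warning messages."""
    warnings = []
    for worker_id, (behavior, helmet) in enumerate(zip(*worker_matrix)):
        if behavior > behavior_threshold:
            warnings.append(f"worker {worker_id}: dangerous construction behavior")
        if helmet == 0:
            warnings.append(f"worker {worker_id}: safety helmet not worn")
    for equipment_id, state in enumerate(equipment_matrix[0]):
        if state < equipment_threshold:
            warnings.append(f"equipment {equipment_id}: inspection required")
    return warnings

# Illustrative monitoring matrices for 3 workers and 2 pieces of equipment.
print(construction_warnings([[0.7, 0.2, 0.5], [1, 0, 1]], [[0.9, 0.3]]))
```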
In summary, the embodiment of the present invention provides a building construction state monitoring method based on data processing, which collects multiple frames of panoramic images of a building construction site and divides each frame into a worker area and building equipment areas; detects the human body key points of each worker in the worker area, constructs an action database, compares the key point coordinates of the dangerous construction actions in the database with the key point coordinates of each worker, and analyzes the construction behavior state of each worker; and detects each worker's safety helmet wearing condition and the equipment construction state in each building equipment area, so that danger early warning can be performed according to the construction behavior state, the helmet wearing condition and the equipment construction state. Detection by region improves detection precision and avoids the influence of the complex environment on a whole-image analysis; at the same time, construction tools and equipment with problems can be overhauled in time, which improves construction efficiency and safeguards safety during the construction process.
It should be noted that: the precedence order of the above embodiments of the present invention is only for description, and does not represent the merits of the embodiments. And specific embodiments thereof have been described above. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and should not be taken as limiting the scope of the present invention, which is intended to cover any modifications, equivalents, improvements, etc. within the spirit of the present invention.

Claims (6)

1. A building construction state monitoring method based on data processing is characterized by comprising the following specific steps:
collecting a multi-frame panorama of a building construction site; detecting key points of a human body on the panoramic image to divide the panoramic image into an unmanned area and a worker area; dividing the unmanned area into a plurality of construction equipment areas based on the positions of the construction equipment;
detecting human body key points of each worker in the worker area to obtain two-dimensional coordinates of the human body key points of each worker; forming the two-dimensional coordinates of all human body key points of the current worker into a two-dimensional sequence, obtaining a three-dimensional sequence corresponding to the two-dimensional sequence by utilizing a TCN (temporal convolutional network), and forming the three-dimensional sequences of the current worker in the multi-frame panorama into a three-dimensional action sequence; constructing an action database of various dangerous construction actions based on a simulator, and acquiring a construction behavior state index of each worker based on the action database; carrying out safety helmet wearing detection on each worker to obtain a safety helmet wearing index of each worker, and constructing a 2 x N-dimensional worker area monitoring matrix by using the construction behavior state index and the safety helmet wearing index of each worker, wherein N refers to the number of workers, and N is a positive integer;
obtaining equipment construction state indexes of each building equipment area according to state similarity between a real-time state of building equipment in the building equipment area and a standard state in normal work, and forming a 1-by-M-dimensional construction equipment monitoring matrix, wherein M is the number of the building equipment and is a positive integer;
and respectively carrying out danger early warning on workers and construction equipment according to the worker area monitoring matrix and the construction equipment monitoring matrix.
2. The method for monitoring construction state based on data processing as claimed in claim 1, wherein the method for constructing the action database of the plurality of dangerous construction actions based on the simulator comprises:
obtaining standard three-dimensional coordinates of each human body key point under the current dangerous construction action by using a simulator to form a standard three-dimensional coordinate sequence; and forming an action database by using various dangerous construction actions and the corresponding standard three-dimensional coordinate sequences.
3. The method for monitoring the construction state based on the data processing as claimed in claim 2, wherein the method for obtaining the construction behavior state index of each worker based on the action database comprises:
respectively calculating the construction behavior state index of each dangerous construction action corresponding to the current worker according to the three-dimensional action sequence of the current worker and each standard three-dimensional coordinate sequence in the action database;
and taking the maximum construction behavior state index as the final construction behavior state index of the current worker.
4. The building construction state monitoring method based on data processing as claimed in claim 3, wherein the calculation formula of the construction behavior state index is:
$$G = \frac{1}{T \cdot K}\sum_{t=1}^{T}\sum_{i=1}^{K}\left\| x_{i,t} - s_i \right\|_2$$

wherein $G$ is the construction behavior state index; $T$ is the sequence length of the worker's three-dimensional action sequence; $K$ is the number of the worker's human body key points; $x_{i,t}$ is the three-dimensional coordinate of the worker's human body key point $i$ corresponding to the $t$-th frame panorama; and $s_i$ is the standard three-dimensional coordinate of human body key point $i$ in the standard three-dimensional coordinate sequence.
5. The building construction state monitoring method based on data processing as claimed in claim 1, wherein the method for detecting the wearing of the safety helmet of each worker to obtain the wearing index of the safety helmet of each worker comprises:
acquiring a safety helmet surrounding frame and a human body surrounding frame of a worker by using a target detection network, calculating the intersection area between the safety helmet surrounding frame and the human body surrounding frame, and taking the ratio of the intersection area to the area of the safety helmet surrounding frame as an initial wearing detection index of each worker;
based on the initial wearing detection index, confirming the safety helmet wearing index of each worker according to the worker position and the safety helmet position, wherein the calculation formula of the safety helmet wearing index is as follows:
$$d_1 = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$$

$$d_2 = \sqrt{(x_c - x_f)^2 + (y_c - y_f)^2}$$

$$W = \begin{cases} 1, & Q_0 \geq Th \ \text{and} \ d_1 \geq d_2 \\ 0, & \text{otherwise} \end{cases}$$

in the formula, $W$ is the safety helmet wearing index; $(x_1, y_1)$ is the coordinate of the upper left corner of the safety helmet surrounding frame, $(x_2, y_2)$ is the coordinate of the lower right corner of the safety helmet surrounding frame, and $d_1$ is the first distance; $(x_c, y_c)$ is the coordinate of the central point of the safety helmet surrounding frame, $(x_f, y_f)$ is the coordinate of the center point of the face of the worker, and $d_2$ is the second distance; $Q_0$ is the initial wearing detection index; and $Th$ is the wearing detection index threshold.
6. The method for monitoring construction state based on data processing as claimed in claim 1, wherein the method for obtaining the equipment construction state index of each building equipment area comprises:
acquiring a standard state set consisting of the images of all the building equipment areas and the images of the standard states of all the building construction equipment during normal work; extracting the same characteristic points of each building equipment area image and each standard state image respectively, and performing characteristic point matching on each building equipment area image and each standard state image in a standard state set respectively to obtain a plurality of characteristic point pairs;
calculating, according to the matched feature point pairs, the similarity between the $i$-th building equipment area image and the $j$-th standard state image in the standard state set, obtaining the similarity set of the $i$-th building equipment area image, and taking the maximum similarity in the similarity set as the equipment construction state index of the $i$-th building equipment area, wherein the calculation formula of the similarity is as follows:

$$S_{i,j} = \sum_{t=1}^{n} \frac{1}{1 + d_t}$$

in the formula, $n$ is the number of matched feature point pairs, $d_t$ is the Euclidean distance between the $t$-th pair of feature points, and $S_{i,j}$ is the similarity between the $i$-th building equipment area image and the $j$-th standard state image.
CN202211205586.6A 2022-09-30 2022-09-30 Building construction state monitoring method based on data processing Active CN115294533B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211205586.6A CN115294533B (en) 2022-09-30 2022-09-30 Building construction state monitoring method based on data processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211205586.6A CN115294533B (en) 2022-09-30 2022-09-30 Building construction state monitoring method based on data processing

Publications (2)

Publication Number Publication Date
CN115294533A true CN115294533A (en) 2022-11-04
CN115294533B CN115294533B (en) 2022-12-20

Family

ID=83834863

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211205586.6A Active CN115294533B (en) 2022-09-30 2022-09-30 Building construction state monitoring method based on data processing

Country Status (1)

Country Link
CN (1) CN115294533B (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160140113A (en) * 2015-05-29 2016-12-07 주식회사비엠테크 Safety System and Method for Monitoring of Construction Equipment
CN111144263A (en) * 2019-12-20 2020-05-12 山东大学 Construction worker high-fall accident early warning method and device
AU2020100711A4 (en) * 2020-05-05 2020-06-11 Chang, Cheng Mr The retrieval system of wearing safety helmet based on deep learning
CN112396652A (en) * 2020-11-25 2021-02-23 创新奇智(西安)科技有限公司 Construction site safety monitoring method and device, electronic equipment and storage medium
CN113724105A (en) * 2021-09-08 2021-11-30 姚成龙 Building construction site monitoring system and monitoring method thereof

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116227803A (en) * 2022-12-01 2023-06-06 中国建筑第四工程局有限公司 Intelligent building construction data processing method
CN116227803B (en) * 2022-12-01 2024-02-09 中国建筑第四工程局有限公司 Intelligent building construction data processing method
CN115830719A (en) * 2023-02-16 2023-03-21 青岛旭华建设集团有限公司 Construction site dangerous behavior identification method based on image processing
CN116311082A (en) * 2023-05-15 2023-06-23 广东电网有限责任公司湛江供电局 Wearing detection method and system based on matching of key parts and images
CN117068976A (en) * 2023-08-04 2023-11-17 山东高速建设管理集团有限公司 Crane construction standard safety detection method
CN117068976B (en) * 2023-08-04 2024-05-03 山东高速建设管理集团有限公司 Crane construction standard safety detection method

Also Published As

Publication number Publication date
CN115294533B (en) 2022-12-20

Similar Documents

Publication Publication Date Title
CN115294533B (en) Building construction state monitoring method based on data processing
CN110502965B (en) Construction safety helmet wearing monitoring method based on computer vision human body posture estimation
CN111191586B (en) Method and system for inspecting wearing condition of safety helmet of personnel in construction site
KR101715001B1 (en) Display system for safety evaluation in construction sites using of wearable device, and thereof method
CN109034215A (en) A kind of safety cap wearing detection method based on depth convolutional neural networks
CN110414400B (en) Automatic detection method and system for wearing of safety helmet on construction site
CN112396658A (en) Indoor personnel positioning method and positioning system based on video
CN115035088A (en) Helmet wearing detection method based on yolov5 and posture estimation
CN112036327A (en) SSD-based lightweight safety helmet detection method
CN115223249A (en) Quick analysis and identification method for unsafe behaviors of underground personnel based on machine vision
Wang et al. A safety helmet and protective clothing detection method based on improved-yolo v 3
CN115797864A (en) Safety management system applied to smart community
CN113807240A (en) Intelligent transformer substation personnel dressing monitoring method based on uncooperative face recognition
CN111177468A (en) Laboratory personnel unsafe behavior safety inspection method based on machine vision
CN116682034A (en) Dangerous behavior detection method under complex production operation scene
CN113536842A (en) Electric power operator safety dressing identification method and device
CN111274888B (en) Helmet and work clothes intelligent identification method based on wearable mobile glasses
CN206948499U (en) The monitoring of student's real training video frequency tracking, evaluation system
CN114926778A (en) Safety helmet and personnel identity recognition system under production environment
CN115100495A (en) Lightweight safety helmet detection method based on sub-feature fusion
CN113221640A (en) Live working initiative early warning and safety monitoring system based on accurate location of artificial intelligence
CN110598569A (en) Action recognition method based on human body posture data
CN116958883B (en) Safety helmet detection method, system, storage medium and electronic equipment
CN116311082B (en) Wearing detection method and system based on matching of key parts and images
CN116052223B (en) Method, system, equipment and medium for identifying people in operation area based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant