CN116697922B - Big data-based monitoring system for engineering measurement and application method thereof - Google Patents


Info

Publication number
CN116697922B
Authority
CN
China
Prior art keywords
building
monitoring
deformation
box
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310973146.3A
Other languages
Chinese (zh)
Other versions
CN116697922A (en)
Inventor
李力
刘聪聪
王宝坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yishui Huachen Real Estate Surveying And Mapping Co ltd
Original Assignee
Yishui Huachen Real Estate Surveying And Mapping Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yishui Huachen Real Estate Surveying And Mapping Co ltd filed Critical Yishui Huachen Real Estate Surveying And Mapping Co ltd
Priority to CN202310973146.3A priority Critical patent/CN116697922B/en
Publication of CN116697922A publication Critical patent/CN116697922A/en
Application granted granted Critical
Publication of CN116697922B publication Critical patent/CN116697922B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/16 Measuring arrangements characterised by the use of optical techniques for measuring the deformation in a solid, e.g. optical strain gauge
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00 Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C15/002 Active optical surveying means
    • G01C15/004 Reference lines, planes or sectors
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C5/00 Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00 Measuring inclination, e.g. by clinometers, by levels
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/176 Urban or other man-made structures
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05K PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K5/00 Casings, cabinets or drawers for electric apparatus
    • H05K5/02 Details
    • H05K5/0217 Mechanical details of casings

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Testing Of Devices, Machine Parts, Or Other Structures Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a monitoring system for engineering measurement based on big data and a use method thereof.

Description

Big data-based monitoring system for engineering measurement and application method thereof
Technical Field
The application relates to the technical field of engineering measurement, in particular to a monitoring system for engineering measurement based on big data and a use method thereof.
Background
With the progress of technology and the development of big data technology, a monitoring system for engineering measurement based on big data plays an important role in the field of modern engineering. The system can monitor and analyze various parameters and indexes in engineering projects in real time by utilizing the sensor, the data acquisition equipment and an advanced data analysis algorithm, and provides comprehensive data support and decision basis for engineers and decision makers.
In the aspect of building settlement observation, the traditional manual measurement method requires installing observation points in advance, measuring at regular or irregular intervals, and recording the results. This method is inconvenient and its monitoring results are not sufficiently clear. To solve this problem, the prior art document with publication number CN115790525A provides a monitoring system based on engineering measurement, in which a settlement observation target is set on a building to mark the observation point, the settlement observation target is photographed by a camera in a first monitoring device, and the photo is then transmitted to a remote terminal for comparison, thereby judging whether the building has settled. Meanwhile, a second monitoring device monitors deviations in the position of the first monitoring device to ensure the accuracy of the data. However, this prior art solution has problems: the first monitoring device lacks a protective structure when used outdoors, is vulnerable to dust pollution and damage, and has a short service life.
In addition, during construction and operation of a building, a certain degree of deformation may occur for various reasons, and when the deformation exceeds an allowable limit, disasters may be caused. In order to solve the problem, the publication of the prior art with publication number CN111457854A provides a deformation monitoring method and device based on a building, and the method can monitor the deformation condition of the building through analysis of a building information model and wall surface crack data, and establish a building monitoring model for deformation analysis. By implementing the method, the manpower resource consumption can be effectively reduced, and the measurement efficiency is improved, so that the requirement of efficiently checking the potential safety hazard of the building is met. However, this prior art solution is limited to monitoring wall cracks and does not cover other parts of the building and other deformations such as bending, torsion, stretching, expansion and contraction of the building.
Therefore, in order to more comprehensively monitor the deformation of a building, a monitoring system for engineering measurement based on big data and a method of using it are provided. By adopting big data technology, the deformation of the various parts of the building can be comprehensively monitored, and other deformation problems, such as bending, torsion, stretching, expansion and contraction of the building, can be accurately identified, so that a more comprehensive monitoring result is provided.
Disclosure of Invention
In order to overcome the drawbacks of the prior art, an object of the present application is to provide a monitoring system for engineering measurement based on big data, comprising:
The monitoring box 1, the inside of the monitoring box 1 is fixedly provided with a collecting box 2, and the collecting box 2 and the same side of the monitoring box 1 are provided with collecting ports; the outside of the monitoring box 1 is fixedly provided with a sliding table 11, and one side and the top of the collecting box 2 are respectively fixedly provided with a camera A24 and a camera B25; a camera C26 is fixedly arranged on one side, far away from the acquisition port, of the acquisition box 2, and a storage battery 17 is fixedly arranged at the bottom of the monitoring box 1; a data processing module 13, a data transmission module 14, a control module 15, a wireless communication module 16 and a deformation monitoring module are fixedly arranged on one side of the inside of the monitoring box 1, wherein the data processing module 13 is used for processing coordinate information acquired by the camera A24, the camera B25 and the camera C26; the data transmission module 14 is configured to send the collected coordinate information to a monitoring terminal.
The deformation monitoring module comprises a receiver, a processor, a building component identification model, a building deformation identification model and a signal transmitter.
The receiver is used for receiving signals sent by the monitoring terminal and building external images shot by the camera A24, the camera B25 and the camera C26.
The processor is used for preprocessing the shot building exterior image.
The building component recognition model is used for carrying out image recognition on the preprocessed image so as to recognize each component of the building in the image.
The building deformation recognition model is used for recognizing each part of the building so as to recognize deformation conditions of each part of the building.
The signal transmitter is used for transmitting the deformation information of each part of the identified building to the monitoring terminal.
The control module 15 is used for controlling the collection work of the collection box 2 and controlling the opening and closing of the camera A24, the camera B25 and the camera C26, and simultaneously remotely controlling the laser instrument on the outer side of the building through the wireless communication module 16.
The protection plate 3 is rotatably arranged at the collection port of the monitoring box 1.
The protective shell 4 is slidably arranged on the outside of the monitoring box 1 along the direction of the collection port, and the protective shell 4 drives the protection plate 3 to open automatically during sliding.
The support assembly 5 is arranged at the bottom of the monitoring box 1.
The motor 6 is fixedly arranged inside the support assembly 5 and is used to drive the protective shell 4 to slide.
Further, the light supplementing lamp 21 is fixedly arranged inside the collection box 2, a plane coordinate system A22 is arranged on the same plane on one side and the top of the collection box 2, a plane coordinate system B23 is arranged on one side, far away from the collection port, of the collection box 2 in parallel to the plane coordinate system A22, and the centers of the plane coordinate system A22 and the plane coordinate system B23 are all collinear with the initial laser line 7 emitted by the laser instrument.
Further, two slide bars 42 are fixedly connected to the inner side of the protective housing 4 symmetrically, the slide bars 42 are slidably arranged at the top of the corresponding sliding table 11, one end, away from the collection port, of each slide bar 42 is fixedly provided with a tooth segment 43, the top of the monitoring box 1 is rotatably provided with two gears A31, the tooth segments 43 are movably meshed with the corresponding gears A31, the gears A31 are coaxially and fixedly arranged at the rotating ends of the protective plate 3, racks 44 are fixedly arranged on the inner side of the protective housing 4, the racks 44 drive the protective housing 4 to slide under the transmission of the motor 6, a solar panel 45 is fixedly arranged at the top of the protective housing 4, the solar panel 45 is electrically connected with the storage battery 17, a notch 41 is formed in the bottom of the protective housing 4, and the notch 41 is located on the outer side of the storage battery 17.
Further, one of the output ends of the motor 6 is driven by the transmission component 62 to slide on the protective shell 4, the transmission component 62 comprises a sliding shaft 621, a shaft sleeve 622 is arranged on the outer side of the sliding shaft 621 in a sliding manner along the axial direction, one end of the sliding shaft 621 is in transmission arrangement with the output end of the motor 6, a gear D623 is fixedly arranged on the other end of the shaft sleeve 622, the shaft sleeve 622 is rotatably arranged in the support component 5, the gear D623 is rotatably arranged in the monitoring box 1, an interlayer 12 is arranged in the monitoring box 1 corresponding to the gear D623, a gear E624 is arranged on the outer side of the gear D623 in a meshed manner, the gear E624 is arranged on the outer side of the rack 44 in a meshed manner, and the gear E624 is rotatably arranged in the interlayer 12.
Further, the support assembly 5 includes a support 51, a support post 52 is slidably disposed on the inner side of the support 51, the support post 52 is fixedly disposed at the bottom of the monitoring box 1 and is rotatably disposed on the outer side of the shaft sleeve 622, a lifting assembly 53 for driving the support post 52 to lift is disposed inside the support 51, a fixing seat 511 for fixing the motor 6 is fixedly disposed inside the support 51, the sliding shaft 621 is rotatably disposed inside the support 51 through the fixing seat 511, a control switch 512 for controlling the operation of the motor 6 is fixedly disposed on the outer side of the support 51, and the two output ends of the motor 6 are respectively in transmission with the sliding shaft 621 and the lifting assembly 53 through the clutch assembly 61.
Further, the clutch assembly 61 includes spline shafts 611 fixedly disposed at the two output ends of the motor 6; a tooth sleeve A612 and a tooth sleeve B613 are respectively slidably disposed on the outer sides of the two spline shafts 611, the tooth sleeve A612 and the tooth sleeve B613 are rotatably disposed at the two ends of a U-shaped plate 614, a spring A615 and a spring B616 are respectively disposed on the outer sides of the tooth sleeve A612 and the tooth sleeve B613, the tooth sleeve A612 is disposed under the tooth sleeve C617, the tooth sleeve C617 is fixedly disposed at the bottom of the sliding shaft 621, the tooth sleeve B613 is disposed over the tooth sleeve D618, the tooth sleeve D618 is rotatably disposed inside the support 51 and drives the lifting assembly 53 to operate when rotated, a limiting block 619 is fixedly disposed on the outer side of the U-shaped plate 614, and a plunger 513 is inserted on the outer side of the support 51.
Further, the lifting assembly 53 includes a gear B531, the gear B531 and the gear sleeve D618 are coaxially and fixedly arranged, a gear C532 is meshed and arranged on the outer side of the gear B531, the gear C532 is rotatably arranged in the support 51, a screw 533 is fixedly arranged on the inner side of the gear C532, a slide plate 534 is arranged on the outer side of the screw 533 in a threaded manner, the slide plate 534 is fixedly arranged at the bottom of the support 52, and the slide plate 534 is slidably arranged on the inner side of the support 51.
The application provides a use method of the engineering measurement monitoring system based on big data, which comprises the following steps.
In step S1, the laser instrument is fixed on the outside of the building, and the monitoring box 1 is placed in alignment with the laser line so that the laser line can be projected into the interior of the collection box 2 to be collected.
Step S2, when monitoring, the protective shell 4 is driven by the motor 6 in the support assembly 5 to slide toward the front of the collection port, and when the protective shell 4 stops sliding, the protection plate 3 is driven to open automatically, so that the collection ports of the monitoring box 1 and the collection box 2 are in an open state.
And S3, collecting laser emitted by the laser instrument through the collecting box 2, and more intuitively observing the lifting and tilting conditions of the building by monitoring the change of the laser line.
And S4, when the lifting or tilting condition of the building is monitored, shooting the external condition of the building through a shooting unit on the monitoring box, and transmitting the shot image into a deformation monitoring module in the monitoring box for image recognition so as to monitor the external deformation condition of the building.
Further, step S3 includes the following steps.
In step S3.1, coordinate values on the plane coordinate system a22 and the plane coordinate system B23 are read based on machine vision by the camera a24, the camera B25, and the camera C26.
In step S3.2, a spatial line segment, i.e. the straight line on which the measured laser line 71 lies, is obtained from the coordinate points on the two planes; the monitored laser line is thus determined from two coordinate points in space.
And S3.3, simulating and generating corresponding straight lines at the monitoring terminal according to the acquired coordinate points, and marking each straight line according to the measurement sequence.
And S3.4, monitoring the change condition of the building according to the change condition of the plurality of line segments.
Further, step S4 includes the following steps.
And S4.1, shooting an external image of the building through the imaging unit and transmitting the shot image to the deformation monitoring module in the monitoring box.
And S4.2, the processor of the deformation monitoring module performs image preprocessing on the input image.
And S4.3, performing image recognition on the processed image by using a building component recognition model of the deformation monitoring module, and recognizing each part of the building in the image.
And S4.4, the building deformation recognition model of the deformation monitoring module carries out image recognition on each part of the building, and recognizes the deformation condition of each part of the building.
And S4.5, the signal transmitter of the deformation monitoring module transmits the deformation condition of each part of the building to the monitoring terminal for display so that a user can check the deformation condition of the building.
Further, step 4.3 includes the following steps.
And S4.3.1, collecting building component images at various angles, including doors, windows and walls, preprocessing the images, taking the processed images as features and the names of the building components as labels, forming a dataset from the features and labels, and dividing it into a training set and a test set in a certain proportion.
In step S4.3.2, building a building component recognition model based on machine learning, wherein the building component recognition model based on machine learning adopts a YOLO algorithm.
Step S4.3.3, training the built machine learning based building component identification model using the training set.
And S4.3.4, evaluating the trained building component recognition model based on machine learning by using the test set, and evaluating the accuracy, precision, recall and F1 value of the building component recognition model.
And S4.3.5, inputting the preprocessed image into the building component recognition model, recognizing the various components in the image, and inputting the recognition result into the building deformation recognition model.
Further, step 4.4 includes the following steps.
S4.4.1 collecting deformation images of a building at various angles including cracking, bending, torsion, stretching, expansion and contraction, preprocessing the collected images, taking the preprocessed images as characteristics, taking deformation conditions as labels, forming a dataset by the characteristics and the labels, and dividing the dataset into a training set and a testing set according to a certain proportion.
S4.4.2, building a building deformation identification model based on machine learning, wherein the building deformation identification model based on machine learning adopts a VGG-16 neural network algorithm.
S4.4.3, training the built building deformation recognition model based on machine learning by using the training set.
S4.4.4, evaluating the trained machine-learning-based building deformation recognition model by using the test set, calculating the accuracy, precision, recall and F1 value of the model's recognition.
S4.4.5, building deformation recognition is performed for each building component based on the incoming recognition result of the building component recognition model, and the recognition result is transmitted to the monitoring terminal.
Compared with the prior art, the application has at least the following technical effects or advantages.
1) The application effectively solves the problem that the monitoring device lacks protection: the protection plate protects the monitoring box and the collection box before and after measurement and opens automatically during measurement, ensuring that the collection box works normally when collecting light and making the system convenient to use.
2) According to the application, the protective shell is arranged on the outer side of the monitoring box in a sliding manner, so that the monitoring box is comprehensively protected, the length of the collecting opening is prolonged by sliding the protective shell to the front of the collecting opening, the influence of external light on the collecting box is reduced, and meanwhile, the protective shell drives the protection plate to automatically open when the sliding of the protective shell is about to be stopped, so that the protection plate is more convenient to open.
3) The application is provided with the collecting box in the monitoring box for collecting a plurality of straight lines for comparison so as to measure the settlement or inclination of the building, thereby leading the measuring result to be more visual.
4) The plane coordinate system A and the plane coordinate system B are arranged on the inner side of the collecting box, so that two-point coordinates of the incident light can be conveniently measured, and a measurement result is automatically read through the camera A, the camera B and the camera C and sent to the monitoring terminal, so that data collection is more convenient.
5) The application sets up the video camera outside the monitoring box, is used for shooting the external situation of the building. The monitoring box is internally provided with a deformation detection module, and the building component recognition model and the building deformation recognition model are utilized for carrying out twice image recognition, and recognition results are sent to the monitoring terminal so as to directly observe the deformation condition of the building. Such a design makes the monitoring process more intuitive and real-time.
Drawings
Fig. 1 is a schematic view of a monitoring end structure of a monitoring box according to an embodiment of the present application.
Fig. 2 is a schematic structural view of a side of the monitoring box away from the monitoring end in an embodiment of the present application.
Fig. 3 is a schematic view illustrating an internal structure of a support assembly according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a transmission assembly according to an embodiment of the present application.
Fig. 5 is a schematic diagram of an assembly structure of a monitoring box and a collecting box in an embodiment of the present application.
Fig. 6 is a schematic diagram of an internal structure of a monitoring box according to an embodiment of the application.
Fig. 7 is a schematic structural diagram of a collection box according to an embodiment of the present application.
Fig. 8 is a schematic diagram of the internal structure of the collection box according to the embodiment of the present application.
Fig. 9 is a schematic view showing an internal structure of a protective case according to an embodiment of the present application.
Fig. 10 is a schematic diagram of laser line acquisition in an embodiment of the application.
Reference numerals in the figures:
1. A monitoring box; 2. a collection box; 3. a protection plate; 4. a protective shell; 5. a support assembly; 6. a motor; 7. an initial laser line.
11. A sliding table; 12. an interlayer; 13. a data processing module; 14. a data transmission module; 15. a control module; 16. a wireless communication module; 17. a storage battery.
21. A light supplementing lamp; 22. a plane coordinate system A; 23. a plane coordinate system B; 24. a camera A; 25. a camera B; 26. a camera C.
31. Gear A.
41. A notch; 42. a slide bar; 43. a tooth segment; 44. a rack; 45. a solar panel.
51. A support; 52. a support post; 53. a lifting assembly.
61. A clutch assembly; 62. a transmission assembly.
71. The measured laser line.
511. A fixing seat; 512. a control switch; 513. a plunger.
531. A gear B; 532. a gear C; 533. a screw; 534. a slide plate.
611. A spline shaft; 612. a tooth sleeve A; 613. a tooth sleeve B; 614. a U-shaped plate; 615. a spring A; 616. a spring B; 617. a tooth sleeve C; 618. a tooth sleeve D; 619. a limiting block.
621. A sliding shaft; 622. a shaft sleeve; 623. a gear D; 624. a gear E.
Detailed Description
The application is described in further detail below with reference to the drawings.
Referring to fig. 1, fig. 2, fig. 3 and fig. 4, the embodiment of the application discloses a monitoring system for engineering measurement based on big data, which comprises a monitoring box 1, wherein a collecting box 2 is fixedly arranged in the monitoring box 1 and is used for receiving and collecting laser lines emitted by a laser instrument on the outer side of a building, collecting ports are formed in one sides of the monitoring box 1 and the collecting box 2, a protection plate 3 is rotatably arranged at the collecting port of the monitoring box 1, a protection shell 4 is slidably arranged on the outer side of the monitoring box 1, a supporting component 5 is arranged at the bottom of the monitoring box 1, a motor 6 for driving the protection shell 4 to slide towards the front of the collecting port is fixedly arranged in the supporting component 5, and the protection plate 3 is driven to be automatically opened in the sliding process of the protection shell 4.
In this embodiment, during installation, the laser instrument is fixed on the outer side of the building, the monitoring box 1 is placed to be aligned with the laser line, so that the laser line can be injected into the collecting box 2 to be collected, the whole of the monitoring box 1 arranged outdoors can be protected through the protective shell 4, the collecting port of the monitoring box 1 can be protected through the protective plate 3, so that the outside dust is prevented from accumulating in the monitoring box 1, and the interference of the outside environment to the collecting box 2 is reduced; when monitoring, the motor 6 in the support assembly 5 drives the protective shell 4 to slide towards the front of the acquisition port, so that the length of the acquisition port is prolonged; when the protective shell 4 stops sliding, the protective plate 3 is driven to automatically open, so that the acquisition ports of the monitoring box 1 and the acquisition box 2 are in an open state, and the acquisition box 2 is allowed to acquire laser emitted by the laser instrument; by monitoring the change of the laser line, the lifting and tilting conditions of the building can be observed more intuitively. When the monitoring terminal monitors that the building is lifted or inclined, a signal is sent to the deformation monitoring module and the control module 15, and the control module 15 starts a camera unit on the monitoring box to shoot the external condition of the building; the processor of the deformation monitoring module can preprocess the shot external image of the building, including light compensation, histogram equalization, image normalization, geometric transformation, filtering denoising and the like, the building component recognition model of the deformation monitoring module carries out first image recognition on the preprocessed image so as to recognize each component of the building, the building deformation recognition model of the deformation monitoring module carries out second image recognition on the image so as to recognize the deformation condition of each component of the building, including the problems of bending, torsion, stretching, expansion, contraction and the like of the building, and the signal transmitter of the deformation monitoring module sends the recognition result of the building deformation recognition model to the monitoring terminal to be displayed so that a user can check the deformation condition of the building.
Referring to fig. 7, 8 and 10, a light supplementing lamp 21 is fixedly arranged inside the collection box 2, a plane coordinate system a22 is arranged on the same plane on one side and the top of the collection box 2, a plane coordinate system B23 is arranged on one side, far away from the collection port, of the collection box 2 parallel to the plane coordinate system a22, the centers of the plane coordinate system a22 and the plane coordinate system B23 are all collinear with an initial laser line 7 emitted by the laser instrument, a camera a24 and a camera B25 are respectively fixedly arranged on one side and the top of the collection box 2, and are used for reading incident coordinates of a measured laser line 71 emitted by the laser instrument in a later stage based on machine vision, and a camera C26 is fixedly arranged on one side, far away from the collection port, of the collection box 2, and is used for reading end point coordinates of the measured laser line 71 based on machine vision.
In this embodiment, after the laser is installed, the center of the plane coordinate system a22 and the plane coordinate system B23 needs to be adjusted to be collinear with the initial laser line 7, and in the adjustment process, according to the pictures shot by the camera a24 and the camera B25, the two coordinates axially coincide with the initial laser line 7 at the same time; when monitoring is carried out in the later stage, if the building subsides or inclines, the laser instrument is driven to move, so that the tested laser line 71 emitted by the laser instrument is emitted into the collecting box 2, at the moment, the light supplementing lamp 21 in the collecting box 2 is started, proper illumination is provided for the collecting box 2, coordinate values on the plane coordinate system A22 and the plane coordinate system B23 can be clearly read by the camera A24, the camera B25 and the camera C26, a space line segment can be obtained according to coordinate points on two planes, namely, a straight line where the tested laser line 71 is located, the monitored laser line can be determined according to two coordinate points in the space, and the change condition of the building can be more intuitively monitored according to the change condition of a plurality of collected line segments.
Referring to fig. 5, 6 and 9, the outside of the monitoring box 1 is fixedly provided with a sliding table 11, the sliding table 11 is slidably arranged on the inner side of the protective shell 4, one side inside the monitoring box 1 is fixedly provided with a data processing module 13, a data transmission module 14, a control module 15 and a wireless communication module 16, the bottom of the monitoring box 1 is fixedly provided with a storage battery 17, the data processing module 13 is used for processing coordinate information collected by a camera a24, a camera B25 and a camera C26, the data transmission module 14 is used for sending the collected coordinate information to a monitoring terminal, the control module 15 is used for controlling the collection work of the collection box 2, and the laser instrument is remotely controlled to work through the wireless communication module 16.
In this embodiment, when measurement is started, the control module 15 remotely controls the laser instrument to be turned on through the wireless communication module 16 so as to emit laser to the inside of the acquisition box 2, and meanwhile, the data processing module 13 processes information acquired by the camera a24, the camera B25 and the camera C26, and transmits the processed data to the monitoring terminal through the data transmission module 14; after the data acquisition is completed, the control module 15 turns off the laser instrument again through the wireless communication module 16 so as to prevent the laser instrument from being continuously turned on. The flow can ensure that the opening and the closing of the laser instrument are controlled remotely in the measuring process so as to improve the accuracy and the safety of the measurement. Meanwhile, through data processing and transmission, the monitoring terminal can acquire the acquired data in time, so that follow-up analysis and checking of monitoring results are facilitated.
Referring to fig. 2, 5 and 9, slide bars 42 are symmetrically and fixedly connected to the inner side of the protective shell 4 and are slidably arranged on top of the corresponding sliding tables 11; one end of each slide bar 42 away from the collection port is fixedly provided with a tooth segment 43; two gears A31 are rotatably arranged on the top of the monitoring box 1, the tooth segments 43 are movably meshed with the corresponding gears A31, and each gear A31 is coaxially and fixedly arranged with the rotating end of the protection plate 3; a rack 44 is fixedly arranged on the inner side of the protective shell 4, and the rack 44 drives the protective shell 4 to slide under the transmission of the motor 6; a solar panel 45 is fixedly arranged on the top of the protective shell 4 and is electrically connected with the storage battery 17; a notch 41 is formed in the bottom of the protective shell 4, and the notch 41 is located on the outer side of the storage battery 17.
In this embodiment, when the motor 6 drives the protective housing 4 to slide forward of the collection port, the protective housing 4 drives the slide bar 42 to slide on top of the sliding table 11, and when the slide bar 42 drives the tooth segment 43 to slide to the bottom of the gear a31, the gear a31 drives the protection plate 3 to rotate upward under the drive of the tooth segment 43, so that the protection plate 3 can be automatically opened before the sliding of the protective housing 4 is stopped; in this way, the laser can inject laser light into the interior of the collection box 2. Simultaneously, through the slip of protective housing 4, increased the length distance of collection mouth with the external world, reduced the influence of external light to the inside light of collection box 2 to can gather more clear light signal. After data acquisition is completed, the protection shell 4 is driven by the motor 6 to reset to the outer side of the monitoring box 1, and meanwhile, the protection shell 4 can drive the protection plate 3 to automatically close the acquisition ports of the monitoring box 1 and the acquisition box 2, so that the acquisition box 2 can be effectively protected before and after use.
Referring to fig. 4, fig. 5, fig. 6 and fig. 9, one of the output ends of the motor 6 drives the protective housing 4 to slide through the transmission assembly 62, the transmission assembly 62 includes a sliding shaft 621, a shaft sleeve 622 is provided on the outer side of the sliding shaft 621 in a sliding manner along the axial direction, one end of the sliding shaft 621 is in transmission arrangement with the output end of the motor 6, a gear D623 is fixedly provided on the other end of the shaft sleeve 622, the shaft sleeve 622 is rotatably arranged in the support assembly 5, the gear D623 is rotatably arranged in the monitoring box 1, an interlayer 12 is arranged in the monitoring box 1 corresponding to the gear D623, a gear E624 is arranged on the outer side of the gear D623 in a meshing manner, the gear E624 is arranged on the outer side of the rack 44 in a meshing manner, and the gear E624 is rotatably arranged in the interlayer 12.
In this embodiment, when the motor 6 is running, one of the output ends drives the sliding shaft 621 and the shaft sleeve 622 to rotate synchronously, so as to drive the gear D623 inside the interlayer 12 to rotate synchronously, so as to drive the gear E624 to rotate, and under the meshing effect of the gear E624 and the rack 44, the protective housing 4 is driven to slide, and then the protective housing 4 is driven to slide and reset before and after collection through the motor 6.
Referring to fig. 1, 2 and 4, the supporting assembly 5 includes a support 51, a strut 52 is slidably disposed on the inner side of the support 51, the strut 52 is fixedly disposed at the bottom of the monitoring box 1, the strut 52 is rotatably disposed on the outer side of a shaft sleeve 622, a lifting assembly 53 for driving the strut 52 to lift is disposed inside the support 51, a fixing seat 511 for fixing a motor 6 is fixedly disposed inside the support 51, a sliding shaft 621 is rotatably disposed inside the support 51 through the fixing seat 511, a control switch 512 for controlling the motor 6 to operate is fixedly disposed on the outer side of the support 51, and two output ends of the motor 6 are respectively disposed in a transmission manner with the sliding shaft 621 and the lifting assembly 53 through a clutch assembly 61.
In this embodiment, when the motor 6 drives the sliding shaft 621 to rotate, the output end of the motor 6 is connected with the sliding shaft 621 through the clutch assembly 61 in a transmission manner, at this time, the clutch assembly 61 disconnects the transmission between the other output end and the lifting assembly 53, so that the motor 6 can independently drive the protective shell 4 to slide through the transmission assembly 62, and when the height of the monitoring box 1 is adjusted, the monitoring box 1 can drive the shaft sleeve 622 to slide on the outer side of the sliding shaft 621, so that the sliding shaft 621 and the shaft sleeve 622 can adapt to the monitoring boxes 1 with different heights, meanwhile, the transmission between the motor 6 and the sliding shaft 621 is disconnected through the clutch assembly 61, and the motor 6 is independently connected with the lifting assembly 53 in a transmission manner, so that the motor 6 can adjust the height of the monitoring box 1 through the lifting assembly 53. By such design, the motor 6 can flexibly control the sliding of the protective housing 4, or the height of the monitoring box 1 can be adjusted by the lifting assembly 53. Such a design increases the operability and flexibility of the system so that both the height adjustment of the monitor box 1 and the sliding of the protective shell 4 can be performed independently when required.
Referring to fig. 2, 3 and 4, the clutch assembly 61 includes spline shafts 611 fixedly disposed at the two output ends of the motor 6; a tooth sleeve A612 and a tooth sleeve B613 are respectively slidably disposed on the outer sides of the two spline shafts 611 and are rotatably disposed at the two ends of the U-shaped plate 614; a spring A615 and a spring B616 are respectively disposed on the outer sides of the tooth sleeve A612 and the tooth sleeve B613, and the spring A615 and the spring B616 are disposed on the outer sides of the U-shaped plate 614; the tooth sleeve A612 is disposed under the tooth sleeve C617, the tooth sleeve C617 is fixedly disposed at the bottom of the sliding shaft 621, the tooth sleeve B613 is disposed over the tooth sleeve D618, the tooth sleeve D618 is rotatably disposed inside the support 51 and drives the lifting assembly 53 to operate when rotated; a limiting block 619 is fixedly disposed on the outer side of the U-shaped plate 614, and a plunger 513 is inserted on the outer side of the support 51.
In this embodiment, when the limiting block 619 is located below the plunger 513, the U-shaped plate 614 drives the tooth sleeve B613 and the tooth sleeve D618 to be clamped with each other, and simultaneously drives the tooth sleeve a612 and the tooth sleeve C617 to be separated from each other, in this state, the motor 6 drives the tooth sleeve B613 to rotate through the spline shaft 611 at the output end, and then drives the tooth sleeve D618 to rotate through the tooth sleeve B613, so as to drive the lifting assembly 53 to operate, and achieve height adjustment of the monitoring box 1, and by such design, the height adjustment of the monitoring box 1 can be driven by the motor 6, so that the use is more convenient; after finishing the regulation, upwards remove U-shaped plate 614 through stopper 619 for U-shaped plate 614 drives tooth cover A612 and tooth cover C617 block each other, and drives tooth cover B613 simultaneously and keep away from tooth cover D618, then inserts the inserted bar 513 and establish the bottom at stopper 619, carries out spacing to stopper 619 and U-shaped plate 614 through inserted bar 513, and such design has guaranteed tooth cover A612 and tooth cover C617 block each other throughout in the monitoring process, in order to guarantee that motor 6 drives protective housing 4 through drive assembly 62 and normally slides in the monitoring process. Through the design, the height of the monitoring box 1 is conveniently and fast adjusted, the normal operation of the motor 6 in the monitoring process is guaranteed through the limiting block 619 and the inserted link 513, and the monitoring stability and reliability are guaranteed.
Referring to fig. 1, 3 and 4, the lifting assembly 53 includes a gear B531, the gear B531 is coaxially and fixedly arranged with a gear sleeve D618, a gear C532 is engaged with the outer side of the gear B531, the gear C532 is rotatably arranged in the support 51, a screw 533 is fixedly arranged on the inner side of the gear C532, a slide plate 534 is arranged on the outer side of the screw 533 in a threaded manner, the slide plate 534 is fixedly arranged at the bottom of the support 52, and the slide plate 534 is slidably arranged on the inner side of the support 51.
In this embodiment, when the gear sleeve D618 rotates, the gear B531 is driven to rotate synchronously, then, the gear B531 drives the screw 533 to rotate through the gear C532, the screw 533 drives the outer slide plate 534 to slide and lift on the inner side of the support 51, and the slide plate 534 pushes the support column 52 to lift, so that the support column 52 drives the monitor box 1 to adjust the height.
The embodiment of the application also discloses a using method of the engineering measurement monitoring system based on big data, which comprises the following steps.
In step S1, the laser instrument is fixed on the outside of the building, and the monitoring box 1 is placed in alignment with the laser line so that the laser line can be projected into the interior of the collection box 2 to be collected.
Step S2, when monitoring, the protective shell 4 is driven by the motor 6 in the support assembly 5 to slide toward the front of the collection port, and when the protective shell 4 stops sliding, the protection plate 3 is driven to open automatically, so that the collection ports of the monitoring box 1 and the collection box 2 are in an open state.
And S3, collecting laser emitted by the laser instrument through the collecting box 2, and more intuitively observing the lifting and tilting conditions of the building by monitoring the change of the laser line.
And S4, when the lifting or tilting condition of the building is monitored, shooting the external condition of the building through a shooting unit on the monitoring box, and transmitting the shot image into a deformation monitoring module in the monitoring box for image recognition so as to monitor the external deformation condition of the building.
Step S3 includes the following steps.
In step S3.1, coordinate values on the plane coordinate system a22 and the plane coordinate system B23 are read based on machine vision by the camera a24, the camera B25, and the camera C26.
In step S3.2, a spatial line segment, i.e. the straight line on which the measured laser line 71 lies, is obtained from the coordinate points on the two planes; the monitored laser line is thus determined from two coordinate points in space (see the sketch after these steps).
And S3.3, simulating and generating corresponding straight lines at the monitoring terminal according to the acquired coordinate points, and marking each straight line according to the measurement sequence.
And S3.4, monitoring the change condition of the building according to the change condition of the plurality of line segments.
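To make step S3.2 concrete, the following Python sketch builds the measured line from the two plane readings. The coordinate convention (plane A at the entrance, plane B at an assumed distance behind it along the initial laser axis) and the function name are illustrative assumptions, not part of the patent.

```python
import numpy as np

def laser_line_from_readings(point_a_xy, point_b_xy, plane_distance):
    """Build the measured laser line from the two plane readings.

    point_a_xy     -- (x, y) read on plane coordinate system A (entrance side)
    point_b_xy     -- (x, y) read on plane coordinate system B (far side)
    plane_distance -- assumed spacing between the two planes along the
                      initial laser axis (a calibration constant)
    Returns a point on the line and its unit direction vector.
    """
    p_a = np.array([point_a_xy[0], point_a_xy[1], 0.0])
    p_b = np.array([point_b_xy[0], point_b_xy[1], float(plane_distance)])
    direction = p_b - p_a
    direction = direction / np.linalg.norm(direction)
    return p_a, direction

# Example: the laser spot drifted 2 mm horizontally between the two planes
origin, direction = laser_line_from_readings((0.0, 0.0), (2.0, 0.0), 500.0)
print(origin, direction)
```

Comparing the lines obtained from successive measurements then reveals settlement or tilting, as described in steps S3.3 and S3.4.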
Step S4 includes the following steps.
Step S4.1, capturing an external image of the building by the imaging unit and transmitting the captured image to the deformation monitoring module in the monitoring box 1.
In step S4.2, the processor of the deformation monitoring module performs image preprocessing on the incoming image, including but not limited to the following steps.
In step S4.2.1, the illumination is compensated using a reference-white algorithm. The number of pixels at each gray value is counted, and the brightest 5% of pixels are taken as the reference white. The average brightness of the reference white is aveGray = Gray_ref / Gray_refNum, where Gray_ref is the sum of the reference-white gray values and Gray_refNum is the number of reference-white pixels. The illumination compensation coefficient is then coe = 255 / aveGray, and each original pixel value is multiplied by coe to obtain the illumination-compensated pixel value.
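A rough illustration of this reference-white compensation follows; the 5% threshold comes from the text, while the function name and the NumPy-based implementation are assumptions.

```python
import numpy as np

def reference_white_compensation(gray_img, top_fraction=0.05):
    """Illumination compensation by reference white (step S4.2.1, a sketch).

    gray_img: 2-D uint8 array; the brightest top_fraction of pixels is
    treated as the reference white.
    """
    flat = gray_img.ravel().astype(np.float64)
    ref_num = max(1, int(round(flat.size * top_fraction)))
    ref_white = np.sort(flat)[-ref_num:]        # brightest 5% of pixels
    ave_gray = ref_white.sum() / ref_num        # aveGray = Gray_ref / Gray_refNum
    coe = 255.0 / ave_gray                      # illumination compensation coefficient
    compensated = np.clip(flat * coe, 0, 255)   # scale every original pixel value
    return compensated.reshape(gray_img.shape).astype(np.uint8)
```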
In step S4.2.2, the histogram is equalized, and the gray scale of each pixel in the image is changed by changing the histogram of the image, so as to enhance the contrast of the image with a smaller dynamic range.
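Step S4.2.2 corresponds to standard histogram equalization; a minimal sketch using OpenCV (an assumed tooling choice) is:

```python
import cv2

# Histogram equalization (step S4.2.2): redistributes gray levels so that
# images with a small dynamic range gain contrast.
def equalize(gray_img):
    return cv2.equalizeHist(gray_img)   # expects a single-channel 8-bit image
```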
In step S4.2.3, the image is normalized to convert it to a fixed standard form. Using the Min-Max normalization method, the maximum and minimum values are found by traversing every pixel in the image matrix, and the data are normalized with the formula x' = (x - min(x)) / (max(x) - min(x)), where x is the original pixel value and x' is the normalized pixel value.
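The Min-Max normalization can be written directly from the formula; the guard against a flat image is an added assumption.

```python
import numpy as np

def min_max_normalize(img):
    """Min-Max normalization: x' = (x - min(x)) / (max(x) - min(x))."""
    x = img.astype(np.float64)
    x_min, x_max = x.min(), x.max()
    if x_max == x_min:                  # flat image: avoid division by zero
        return np.zeros_like(x)
    return (x - x_min) / (x_max - x_min)
```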
In step S4.2.4, geometric transformation is performed: the acquired image is processed by geometric transformations such as translation, transposition, mirroring, rotation and scaling, so as to correct the systematic and random errors introduced by the instrument position (imaging angle, perspective relation and even the lens itself) when the camera acquires the image.
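As one example of these geometric corrections, the sketch below rotates the image about its centre with OpenCV; the correction angle would come from a calibration step that the text does not specify, so it is shown here as a hypothetical parameter.

```python
import cv2

def rotate(img, angle_deg):
    """Rotate the image about its centre (one of the geometric corrections
    of step S4.2.4); angle_deg is a hypothetical calibration value."""
    h, w = img.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, 1.0)
    return cv2.warpAffine(img, m, (w, h))
```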
In step S4.2.5, filtering and denoising are performed: Gaussian filtering is applied to the image, which is a weighted-averaging process over the whole image in which the value of each pixel is obtained as a weighted average of that pixel and the other pixel values in its neighborhood.
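A minimal sketch of this Gaussian filtering, again assuming OpenCV; the 5x5 kernel size is an assumed choice, not given in the text.

```python
import cv2

def gaussian_denoise(img, ksize=5, sigma=0):
    """Gaussian filtering (step S4.2.5): each output pixel is a weighted
    average of its neighbourhood; the 5x5 kernel size is an assumed choice."""
    return cv2.GaussianBlur(img, (ksize, ksize), sigma)
```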
And S4.3, performing image recognition on the processed image by using a building component recognition model of the deformation monitoring module, and recognizing each part of the building in the image.
And S4.4, the building deformation recognition model of the deformation monitoring module carries out image recognition on each part of the building, and recognizes the deformation condition of each part of the building.
And S4.5, the signal transmitter of the deformation monitoring module transmits the deformation condition of each part of the building to the monitoring terminal for display so that a user can check the deformation condition of the building.
Step S4.3 comprises the following steps.
In step S4.3.1, building component images are collected from a number of different angles, including doors, windows and walls. The images are preprocessed, the processed images are used as features, and the name of the building component is used as the label. The features and labels form the dataset: 100 samples were collected, including 25 doors, 24 windows, 15 walls, 24 columns and 12 glass panels, and split 3:1 into a training set and a test set. The 75 training samples include 20 doors, 19 windows, 9 walls, 20 columns and 7 glass panels; the 25 test samples include 5 doors, 5 windows, 6 walls, 4 columns and 5 glass panels.
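A sketch of the 3:1 split described above, using scikit-learn as an assumed tooling choice; the placeholder list stands in for the preprocessed image features.

```python
from sklearn.model_selection import train_test_split

# Hypothetical stand-ins for the 100 preprocessed (image, label) samples
labels = (["door"] * 25 + ["window"] * 24 + ["wall"] * 15
          + ["column"] * 24 + ["glass"] * 12)
images = list(range(len(labels)))     # placeholders for the image features

# 3:1 split; stratify keeps the class proportions similar in both sets
train_x, test_x, train_y, test_y = train_test_split(
    images, labels, test_size=0.25, random_state=42, stratify=labels)
print(len(train_x), len(test_x))      # 75 25
```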
And step S4.3.2, building a building component identification model. The building component recognition model based on machine learning adopts a YOLO algorithm.
The YOLO algorithm divides the input image into S x S grids, and the grid into which the center coordinates of a building component's ground truth box fall is responsible for detecting that component. Each grid cell predicts B bounding boxes, each with building component position coordinates (x, y) and a confidence value, together with scores for the C building component categories (the C categories include doors, windows, walls, etc.).
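For reference, the output tensor implied by this description has shape S x S x (B*5 + C) in the original YOLO formulation, where each box carries (x, y, w, h, confidence); the concrete values below are illustrative only.

```python
# Shape of the YOLO output tensor implied by the description above.
S, B, C = 7, 2, 5          # e.g. 5 classes: door, window, wall, column, glass
output_shape = (S, S, B * 5 + C)
print(output_shape)        # (7, 7, 15)
```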
Step S4.3.3, training the built machine learning based building component identification model using the training set.
Step S4.3.4, the trained machine-learning-based building component recognition model is evaluated using the test set, by calculating the accuracy, precision, recall and F1 value of the model's recognition.
TP is the number of wall samples correctly identified as walls, FN is the number of wall samples incorrectly identified as not being walls, TN is the number of non-wall samples correctly identified as not being walls, and FP is the number of non-wall samples incorrectly identified as walls.
The test set contains 25 samples, of which 6 are labelled as walls. The model identifies 8 samples as walls, of which 5 are correct and 3 are incorrect; it identifies 17 samples as not being walls, of which 16 are correct and 1 is incorrect. Thus TP = 5, FP = 3, TN = 16, FN = 1.
For walls.
The accuracy is accuracy = (TP + TN) / (TP + FP + TN + FN) = (5 + 16) / (5 + 3 + 16 + 1) = 84%.
The precision is precision = TP / (TP + FP) = 5 / (5 + 3) = 62.5%.
The recall is recall = TP / (TP + FN) = 5 / (5 + 1) = 83.3%.
The F1 value is F1 = 2 x precision x recall / (precision + recall) = 71.4%.
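The wall example can be reproduced with a small helper that computes all four metrics from the confusion counts; the same function applies to the crack example given later for the deformation model.

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, precision, recall and F1 from a binary confusion matrix."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)   # harmonic mean
    return accuracy, precision, recall, f1

# Wall example above: TP=5, FP=3, TN=16, FN=1
print(classification_metrics(5, 3, 16, 1))   # approx (0.84, 0.625, 0.833, 0.714)
```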
And S4.3.5, inputting the preprocessed image into the building component recognition model, recognizing the various components in the image, and inputting the recognition result into the building deformation recognition model.
Step S4.4 includes the following steps.
And S4.4.1, collecting deformation images of the building at various angles, including cracking, bending, twisting, stretching, expanding, contracting and the like, carrying out image preprocessing on the collected images, and taking the preprocessed images as characteristics and deformation conditions as labels. The signature + tag constituted the dataset, 2000 samples were collected, 600 slits, 385 bends, 350 twists, 280 stretches, 215 swells, 170 contractions, and 3:1 into training set and test set, wherein the training set comprises 500 cracks, 300 bends, 250 twists, 200 stretches, 150 expands and 100 contracts; 500 test set samples, including 100 cracks, 85 bends, 100 twists, 80 stretches, 65 expands, 70 contracts.
And S4.4.2, building a building deformation identification model. The building deformation recognition model based on machine learning adopts VGG-16 neural network algorithm.
VGG-16 extracts features through 13 convolutional layers and 5 pooling layers, and the final 3 fully connected layers complete the classification task; a code sketch of this architecture follows the steps below.
In step S4.4.2.1, the input image of size 224 x 224 x 3 is convolved twice with 64 convolution kernels of size 3 x 3, stride 1 and zero padding, and activated by the ReLU activation function; the output size is 224 x 224 x 64.
Step S4.4.2.2, max pooling with a 2 x 2 filter and stride 2; the output size is 112 x 112 x 64.
Step S4.4.2.3, two convolutions with 128 kernels of size 3 x 3, stride 1 and zero padding, activated by the ReLU activation function; the output size is 112 x 112 x 128.
Step S4.4.2.4, max pooling with a 2 x 2 filter and stride 2; the output size is 56 x 56 x 128.
Step S4.4.2.5, three convolutions with 256 kernels of size 3 x 3, stride 1 and zero padding, activated by the ReLU activation function; the output size is 56 x 56 x 256.
Step S4.4.2.6, max pooling with a 2 x 2 filter and stride 2; the output size is 28 x 28 x 256.
Step S4.4.2.7, three convolutions with 512 kernels of size 3 x 3, stride 1 and zero padding, activated by the ReLU activation function; the output size is 28 x 28 x 512.
Step S4.4.2.8, max pooling with a 2 x 2 filter and stride 2; the output size is 14 x 14 x 512.
Step S4.4.2.9, three convolutions with 512 kernels of size 3 x 3, stride 1 and zero padding, activated by the ReLU activation function; the output size is 14 x 14 x 512.
Step S4.4.2.10, max pooling with a 2 x 2 filter and stride 2; the output size is 7 x 7 x 512.
Step S4.4.2.11, Flatten() flattens the data into a one-dimensional vector of 512 x 7 x 7 = 25088 values.
Step S4.4.2.12, two fully connected layers of size 1 x 4096 and one layer of size 1 x C (C being the number of deformation categories), activated by the ReLU activation function.
Step S4.4.2.13, classification by a softmax classifier.
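As referenced above, the following is a minimal sketch of this VGG-16 classifier in Keras; the framework choice and the number of classes are assumptions, and the layer counts follow the standard VGG-16 configuration described in the steps.

```python
from tensorflow.keras import layers, models

def build_vgg16(num_classes):
    """VGG-16 classifier following steps S4.4.2.1 to S4.4.2.13 (a sketch)."""
    model = models.Sequential()
    model.add(layers.Input(shape=(224, 224, 3)))
    # Five convolution blocks: (filters, number of 3x3 convolutions)
    for filters, convs in [(64, 2), (128, 2), (256, 3), (512, 3), (512, 3)]:
        for _ in range(convs):
            model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
        model.add(layers.MaxPooling2D(pool_size=2, strides=2))
    model.add(layers.Flatten())                       # 7*7*512 = 25088 values
    model.add(layers.Dense(4096, activation="relu"))
    model.add(layers.Dense(4096, activation="relu"))
    model.add(layers.Dense(num_classes, activation="softmax"))
    return model

# e.g. six deformation categories: crack, bend, twist, stretch, expand, contract
model = build_vgg16(num_classes=6)
model.summary()
```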
And step S4.4.3, training the built building deformation identification model based on machine learning by using the training set.
Step S4.4.4, evaluating the trained building deformation recognition model based on machine learning by using a test set; and evaluating by calculating the accuracy rate, the precision rate, the recall rate, the F1 value and the like of the building deformation recognition model recognition.
The number of samples that are actually cracked is TP, the number of samples that are not cracked is FN, the number of samples that are not cracked is TN, and the number of samples that are not cracked is FP.
The test set comprises 500 samples, of which 100 are labelled as cracks. The building deformation recognition model identifies 120 samples as cracks, of which 90 are identified correctly and 30 incorrectly; it identifies 380 samples as not cracks, of which 370 are identified correctly and 10 incorrectly. Thus TP=90, FP=30, TN=370, FN=10.
For the crack category:
Accuracy = (TP+TN)/(TP+FP+TN+FN) = (90+370)/500 = 92%.
Precision = TP/(TP+FP) = 90/(90+30) = 75%.
Recall = TP/(TP+FN) = 90/(90+10) = 90%.
F1 = 2×Precision×Recall/(Precision+Recall) = 2×0.75×0.90/(0.75+0.90) ≈ 81.8%.
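These figures can be reproduced with a few lines of plain Python (a minimal sketch using the confusion-matrix counts above; the variable names are illustrative):

# Confusion-matrix counts for the crack category of the test set
TP, FP, TN, FN = 90, 30, 370, 10

accuracy  = (TP + TN) / (TP + FP + TN + FN)                 # 0.92
precision = TP / (TP + FP)                                  # 0.75
recall    = TP / (TP + FN)                                  # 0.90
f1        = 2 * precision * recall / (precision + recall)   # ~0.818

print(f"accuracy={accuracy:.1%}, precision={precision:.1%}, "
      f"recall={recall:.1%}, F1={f1:.1%}")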
And S4.4.5, carrying out building deformation recognition on each building component according to the recognition results passed in from the building component recognition model, and transmitting the recognition results to the monitoring terminal.
In view of the above, when the monitoring system for engineering measurement based on big data disclosed in the embodiment of the application is used, the laser instrument is horizontally installed on the outer side of a building, and the centers of the plane coordinate system A 22 and the plane coordinate system B 23 are adjusted to be collinear with the initial laser line 7; during adjustment, the laser is aligned with the center of the plane coordinate system B 23 according to the picture shot by the camera C 26.
When the height of the monitoring box 1 is adjusted, the limiting block 619 is moved below the inserted rod 513. The U-shaped plate 614 then drives the tooth sleeve B 613 and the tooth sleeve D 618 into mutual engagement while driving the tooth sleeve A 612 and the tooth sleeve C 617 apart. The motor 6 drives the tooth sleeve B 613 to rotate through the spline shaft 611 at its output end, the tooth sleeve B 613 in turn drives the tooth sleeve D 618, and the tooth sleeve D 618 drives the gear B 531 in the lifting assembly 53 to rotate. The gear B 531 drives the screw 533 to rotate through the gear C 532, the screw 533 drives the sliding plate 534 to slide and lift on the inner side of the supporting seat 51, and the sliding plate 534 drives the supporting column 52 to lift, so that the height of the monitoring box 1 is adjusted. After the adjustment is completed, the U-shaped plate 614 is moved upwards through the limiting block 619, so that the tooth sleeve A 612 and the tooth sleeve C 617 engage with each other while the tooth sleeve B 613 moves away from the tooth sleeve D 618; the inserted rod 513 is then inserted below the limiting block 619 to lock the limiting block 619 and the U-shaped plate 614, ensuring that the tooth sleeve A 612 and the tooth sleeve C 617 remain engaged throughout the monitoring process.
When the settlement and tilt of the building need to be measured, the motor 6 drives the spline shaft 611, which rotates the tooth sleeve A 612; the tooth sleeve A 612 drives the tooth sleeve C 617, and the tooth sleeve C 617 drives the sliding shaft 621 in the transmission component 62 to rotate. The shaft sleeve 622 then drives the gear D 623 in the interlayer 12 to rotate, the gear D 623 drives the gear E 624, and the gear E 624 drives the protective shell 4 to slide through the rack 44. When the motor 6 drives the protective shell 4 to slide towards the front of the collection port, the protective shell 4 drives the slide rods 42 to slide on the tops of the sliding tables 11; when a slide rod 42 brings its tooth segment 43 to the bottom of the gear A 31, the gear A 31 rotates the protection plate 3 upwards under the drive of the tooth segment 43, so that the protection plate 3 is automatically opened before the protective shell 4 stops sliding and the laser can enter the collection box 2. Meanwhile, the protective shell 4 lengthens the passage between the collection port and the outside, reducing the influence of ambient light on the light inside the collection box 2 and ensuring that a clear laser spot is captured.
When the measurement is started, the control module 15 remotely switches on the laser instrument through the wireless communication module 16, so that the laser instrument emits laser into the collection box 2. If the building settles or tilts, it carries the laser instrument with it, and the measured laser line 71 emitted by the laser instrument enters the collection box 2 along a shifted path. At this moment the light supplementing lamp 21 in the collection box 2 is switched on to provide suitable illumination, so that the camera A 24, the camera B 25 and the camera C 26 can clearly read the coordinate values on the plane coordinate system A 22 and the plane coordinate system B 23. The data processing module 13 processes the information collected by the three cameras, and the data transmission module 14 transmits the processed data to the monitoring terminal. The monitoring terminal obtains a spatial line segment from the coordinate points on the two planes, that is, the line on which the measured laser line 71 lies is determined by the two coordinate points in space, and the settlement and tilt of the building can be observed more intuitively from the changes of the successively collected line segments.
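A minimal sketch of how the monitoring terminal could reconstruct the measured laser line from the two coordinate readings (assuming Python with NumPy; the plane separation D and all coordinate values are illustrative assumptions, not values from the patent):

import numpy as np

D = 0.50  # assumed distance (m) between plane coordinate system A and plane coordinate system B

def laser_line(coord_a, coord_b, depth=D):
    """Build the measured laser line from the (y, z) readings on the two planes.

    coord_a: (y, z) read on plane coordinate system A (near the collection port)
    coord_b: (y, z) read on plane coordinate system B (far side of the collection box)
    Returns the two 3D points and the unit direction vector of the line through them.
    """
    p_a = np.array([0.0,   coord_a[0], coord_a[1]])
    p_b = np.array([depth, coord_b[0], coord_b[1]])
    direction = (p_b - p_a) / np.linalg.norm(p_b - p_a)
    return p_a, p_b, direction

# The initial laser line 7 passes through the centers of both coordinate systems
_, _, dir_initial = laser_line((0.0, 0.0), (0.0, 0.0))

# A later measurement: the readings shift because the building has settled or tilted
p_a, p_b, dir_measured = laser_line((0.002, -0.004), (0.003, -0.007))

tilt = np.degrees(np.arccos(np.clip(np.dot(dir_initial, dir_measured), -1.0, 1.0)))
settlement_mm = p_a[2] * 1000.0   # vertical offset of the entry point versus the initial center

print(f"tilt = {tilt:.3f} deg, settlement at plane A = {settlement_mm:.1f} mm")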
After the collection is completed, the control module 15 switches off the laser instrument through the wireless communication module 16, and the motor 6 drives the protective shell 4 back to the outer side of the monitoring box 1, so that the protection plate 3 automatically closes the collection ports of the monitoring box 1 and the collection box 2, ensuring that the collection box 2 is protected before and after use.

Claims (10)

1. The monitoring system for engineering measurement based on big data comprises a monitoring box (1), a protection plate (3), a protection shell (4), a supporting component (5) and a motor (6), and is characterized in that a collecting box (2) is fixedly arranged in the monitoring box (1), and collecting ports are formed in the same sides of the collecting box (2) and the monitoring box (1); the intelligent monitoring device is characterized in that a sliding table (11) is fixedly arranged on the outer side of the monitoring box (1), a camera A (24) and a camera B (25) are fixedly arranged on one side and the top of the collecting box (2) respectively, a camera C (26) is fixedly arranged on one side, far away from a collecting port, of the collecting box (2), and a storage battery (17) is fixedly arranged at the bottom of the monitoring box (1); a data processing module (13), a data transmission module (14), a control module (15), a wireless communication module (16) and a deformation monitoring module are fixedly arranged on one side of the inside of the monitoring box (1), wherein the data processing module (13) is used for processing coordinate information acquired by the camera A (24), the camera B (25) and the camera C (26); the data transmission module (14) is used for sending the acquired coordinate information to the monitoring terminal; the deformation monitoring module comprises a receiver, a processor, a building component recognition model, a building deformation recognition model and a signal transmitter, wherein the receiver is used for receiving signals sent by a monitoring terminal and building external images shot by a camera A (24), a camera B (25) and a camera C (26); the processor is used for preprocessing the shot building external image; the building component recognition model is used for carrying out image recognition on the preprocessed image so as to recognize each component of the building in the image; the building deformation recognition model is used for recognizing each part of the building so as to recognize the deformation condition of each part of the building; the signal transmitter is used for transmitting the deformation information of each part of the identified building to the monitoring terminal; the control module (15) is used for controlling the collection work of the collection box (2) and controlling the opening and closing of the camera A (24), the camera B (25) and the camera C (26), and simultaneously remotely controlling a laser instrument on the outer side of the building through the wireless communication module (16);
The protection plate (3) is rotatably arranged at the acquisition port of the monitoring box (1);
the protective shell (4) is arranged on the outer side of the monitoring box (1) in a sliding mode along the direction of the collecting port, and the protective shell (4) drives the protective plate (3) to be automatically opened in the sliding process;
the supporting component (5) is arranged at the bottom of the monitoring box (1);
the motor (6) is fixedly arranged in the supporting component (5), and the motor (6) is used for driving the protective shell (4) to slide.
2. The monitoring system for engineering measurement based on big data according to claim 1, wherein a light supplementing lamp (21) is fixedly arranged inside the collection box (2), a plane coordinate system A (22) is arranged on the same plane on one side and the top of the collection box (2), a plane coordinate system B (23) is arranged on one side, far away from the collection port, of the collection box (2) parallel to the plane coordinate system A (22), and the centers of the plane coordinate system A (22) and the plane coordinate system B (23) are all collinear with an initial laser line (7) emitted by the laser instrument.
3. The engineering measurement monitoring system based on big data according to claim 2, wherein two slide bars (42) are fixedly connected to the inner side of the protective housing (4) symmetrically, the slide bars (42) are slidably arranged at the top of the corresponding sliding table (11), one ends of the slide bars (42) away from the collection port are fixedly provided with tooth segments (43), the top of the monitoring box (1) is rotatably provided with two gears A (31), the tooth segments (43) are movably meshed with the corresponding gears A (31), the gears A (31) are coaxially and fixedly arranged at the rotating ends of the protective plate (3), racks (44) are fixedly arranged on the inner side of the protective housing (4), the racks (44) drive the protective housing (4) to slide under the transmission of the motor (6), the top of the protective housing (4) is fixedly provided with a solar panel (45), the solar panel (45) is electrically connected with the storage battery (17), the bottom of the protective housing (4) is provided with a notch (41), and the notch (41) is located on the outer side of the storage battery (17).
4. A big data based engineering survey monitoring system according to claim 3, wherein one of the output ends of the motor (6) drives the protective housing (4) to slide through a transmission component (62), the transmission component (62) comprises a sliding shaft (621), a shaft sleeve (622) is arranged on the outer side of the sliding shaft (621) in a sliding manner along the axial direction, one end of the sliding shaft (621) is in transmission arrangement with the output end of the motor (6), a gear D (623) is fixedly arranged on the other end of the shaft sleeve (622), the shaft sleeve (622) is rotatably arranged in the support component (5), the gear D (623) is rotatably arranged in the monitoring box (1), an interlayer (12) is arranged in the monitoring box (1) corresponding to the gear D (623), a gear E (624) is arranged on the outer side of the gear D (623) in a meshed manner, the gear E (624) is arranged on the outer side of the rack (44) in a meshed manner, and the gear E (624) is rotatably arranged in the interlayer (12).
5. The monitoring system for engineering measurement based on big data according to claim 4, wherein the supporting component (5) comprises a supporting seat (51), a supporting column (52) is slidably arranged on the inner side of the supporting seat (51), the supporting column (52) is fixedly arranged at the bottom of the monitoring box (1), the supporting column (52) is rotatably arranged on the outer side of the shaft sleeve (622), a lifting assembly (53) for driving the supporting column (52) to lift is arranged in the supporting seat (51), a fixing seat (511) for fixing the motor (6) is fixedly arranged in the supporting seat (51), the sliding shaft (621) is rotatably arranged in the supporting seat (51) through the fixing seat (511), a control switch (512) for controlling the operation of the motor (6) is fixedly arranged on the outer side of the supporting seat (51), the two output ends of the motor (6) are respectively in transmission arrangement with the sliding shaft (621) and the lifting assembly (53) through a clutch component (61), the clutch component (61) comprises a spline shaft (611), the spline shaft (611) is fixedly arranged at the output end of the motor (6), a tooth sleeve A (612) and a tooth sleeve B (613) are respectively arranged on the outer side of the spline shaft (611), the tooth sleeve A (612) and the tooth sleeve B (613) are respectively rotatably arranged at the two ends of a U-shaped plate (614), a spring A (615) and a spring B (616) are respectively arranged on the outer sides of the tooth sleeve A (612) and the tooth sleeve B (613), the spring A (615) and the spring B (616) are located on the outer side of the U-shaped plate (614), the tooth sleeve A (612) is located at the bottom of a tooth sleeve C (617), the tooth sleeve C (617) is fixedly arranged at the bottom of the sliding shaft (621), the tooth sleeve B (613) is located at the top of a tooth sleeve D (618), the tooth sleeve D (618) is rotatably arranged inside the supporting seat (51), the tooth sleeve D (618) drives the lifting assembly (53) to operate when rotating, a limiting block (619) is fixedly arranged on the outer side of the U-shaped plate (614), an inserted rod (513) is inserted into the outer side of the supporting seat (51), the lifting assembly (53) comprises a gear B (531), the gear B (531) is coaxially arranged with the tooth sleeve D (618), a gear C (532) is meshed with the gear B (531), a screw (533) is rotatably arranged inside the supporting seat (51) and is driven by the gear C (532), a sliding plate (534) is arranged on the screw (533) in a threaded manner, the sliding plate (534) is fixedly arranged at the bottom of the supporting column (52), and the sliding plate (534) is arranged on the inner side of the supporting seat (51) in a sliding mode.
6. The method of using a big data based monitoring system for engineering survey according to claim 5, comprising the steps of:
step S1, fixing a laser instrument on the outer side of a building, and placing a monitoring box (1) to be aligned with a laser line so that the laser line can be shot into the collecting box (2) to be collected;
step S2, when monitoring is carried out, the protective shell (4) is driven to slide towards the front of the collection port by the motor (6) in the supporting component (5), and when the protective shell (4) is about to stop sliding, the protection plate (3) is driven to open automatically, so that the collection ports of the monitoring box (1) and the collection box (2) are in an opened state;
step S3, collecting laser emitted by a laser instrument through a collecting box (2), and more intuitively observing lifting and tilting conditions of the building by monitoring the change of laser lines;
and S4, when the lifting or tilting condition of the building is monitored, shooting the external condition of the building through a shooting unit on the monitoring box, and transmitting the shot image into a deformation monitoring module in the monitoring box for image recognition so as to monitor the external deformation condition of the building.
7. The method of using a big data based monitoring system for engineering survey according to claim 6, wherein the step S3 comprises the steps of:
Step S3.1, reading coordinate values on a plane coordinate system A (22) and a plane coordinate system B (23) based on machine vision through a camera A (24), a camera B (25) and a camera C (26);
step S3.2, obtaining a space line segment, namely a path of the laser line (71) to be detected, according to the coordinate points on the two planes, and further determining the monitored laser line according to the two coordinate points in the space;
s3.3, simulating and generating corresponding straight lines at the monitoring terminal according to the acquired coordinate points, and marking each straight line according to the measurement sequence;
and S3.4, monitoring the change condition of the building according to the change condition of the plurality of line segments.
8. The method of using a big data based monitoring system for engineering survey according to claim 6, wherein step S4 comprises the steps of:
s4.1, shooting an external image of a building through a shooting unit and transmitting the shot image into a deformation monitoring module in the monitoring box;
s4.2, the processor of the deformation monitoring module performs image preprocessing on the input image;
s4.3, performing image recognition on the processed image by a building component recognition model of the deformation monitoring module, and recognizing each part of a building in the image;
S4.4, carrying out image recognition on each part of the building by using a building deformation recognition model of the deformation monitoring module, and recognizing deformation conditions of each part of the building;
and S4.5, the signal transmitter of the deformation monitoring module transmits the deformation condition of each part of the building to the monitoring terminal for display so that a user can check the deformation condition of the building.
9. The method of using a big data based monitoring system for engineering survey according to claim 8, wherein the step S4.3 comprises the steps of:
s4.3.1, collecting building component images with various angles including doors, windows and walls, preprocessing the images, taking the processed images as characteristics, taking the names of the building components as labels, taking the characteristics and the labels as data sets, and dividing the images into a training set and a testing set according to a certain proportion;
step S4.3.2, building a building component recognition model based on machine learning, wherein the building component recognition model based on machine learning adopts a YOLO algorithm;
step S4.3.3, training the built building component recognition model based on machine learning by using a training set;
step S4.3.4, evaluating the trained building component recognition model based on machine learning by using a test set, and evaluating the accuracy, precision, recall rate and F1 value of the building component recognition model by calculating;
And S4.3.5, inputting the input preprocessed image into a building component recognition model, recognizing various components in the image, and inputting the recognition result into a building deformation recognition model.
10. The method of using a big data based monitoring system for engineering survey according to claim 8, wherein the step S4.4 comprises the steps of:
s4.4.1 collecting deformation images of a building at various angles including cracking, bending, torsion, stretching, expansion and contraction, preprocessing the collected images, taking the preprocessed images as characteristics, taking deformation conditions as labels, forming a data set by the characteristics and the labels, and dividing the data set into a training set and a test set according to a certain proportion;
s4.4.2, building a building deformation identification model based on machine learning, wherein the building deformation identification model based on machine learning adopts a VGG-16 neural network algorithm;
s4.4.3, training the built building deformation recognition model based on machine learning by using a training set;
s4.4.4, evaluating the trained building deformation recognition model based on machine learning by using a test set; evaluating the accuracy, the precision, the recall rate and the F1 value of the building deformation recognition model recognition by calculating;
S4.4.5, building deformation recognition is performed for each building component based on the incoming recognition result of the building component recognition model, and the recognition result is transmitted to the monitoring terminal.
CN202310973146.3A 2023-08-04 2023-08-04 Big data-based monitoring system for engineering measurement and application method thereof Active CN116697922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310973146.3A CN116697922B (en) 2023-08-04 2023-08-04 Big data-based monitoring system for engineering measurement and application method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310973146.3A CN116697922B (en) 2023-08-04 2023-08-04 Big data-based monitoring system for engineering measurement and application method thereof

Publications (2)

Publication Number Publication Date
CN116697922A CN116697922A (en) 2023-09-05
CN116697922B true CN116697922B (en) 2023-12-12

Family

ID=87829655

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310973146.3A Active CN116697922B (en) 2023-08-04 2023-08-04 Big data-based monitoring system for engineering measurement and application method thereof

Country Status (1)

Country Link
CN (1) CN116697922B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008203214A (en) * 2007-02-22 2008-09-04 Taiko Denki Co Ltd Work deformation/distortion detecting method
CN105806242A (en) * 2016-04-15 2016-07-27 同济大学 Surface type measuring device adopting laser rotary scanning
CN105841629A (en) * 2016-05-24 2016-08-10 上海建为历保工程科技股份有限公司 Photogrammetry monitoring system for monitoring inclination and sedimentation of cultural building relics
CN208847130U (en) * 2018-09-29 2019-05-10 中南大学 A kind of girder construction deformation monitoring system based on CCD
WO2020199538A1 (en) * 2019-04-04 2020-10-08 中设设计集团股份有限公司 Bridge key component disease early-warning system and method based on image monitoring data
CN210089628U (en) * 2019-04-26 2020-02-18 苏州利力升光电科技有限公司 Multi-scale two-dimensional measurement system
WO2021068848A1 (en) * 2019-10-09 2021-04-15 山东大学 Tunnel structure disease multi-scale measurement and intelligent diagnosis system and method
CN214951500U (en) * 2021-05-28 2021-11-30 云南万合建筑工程有限公司 Building engineering construction operation environment intelligent monitoring system based on big data
CN115993096A (en) * 2023-02-08 2023-04-21 金陵科技学院 High-rise building deformation measuring method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Automated deformation monitoring system using measuring robots; Gao Gaiping, Li Shuangping, Su Aijun, Zhang Jingsheng, Cui Zhengquan; Yangtze River (Renmin Changjiang), Issue 03; full text *

Also Published As

Publication number Publication date
CN116697922A (en) 2023-09-05

Similar Documents

Publication Publication Date Title
CN108564065B (en) Cable tunnel open fire identification method based on SSD
CN111381579B (en) Cloud deck fault detection method and device, computer equipment and storage medium
CN107767377B (en) Liquid crystal display defect and dust distinguishing method and detection device based on binocular vision system
CN105812746A (en) Target detection method and system
CN110120074B (en) Cable positioning method for live working robot in complex environment
CN107255874A (en) A kind of electric energy meter liquid-crystal apparatus fault detection method and system
CN109447090B (en) Shield door obstacle detection method and system
CN114821376B (en) Unmanned aerial vehicle image geological disaster automatic extraction method based on deep learning
CN111212203A (en) Self-adaptive face detection and recognition device
CN114905512B (en) Panoramic tracking and obstacle avoidance method and system for intelligent inspection robot
CN107333123A (en) Detecting system of focusing and focusing detection method
CN105376485B (en) Two-way real-time vehicle chassis image combining method based on Linear Array Realtime video camera
CN116697922B (en) Big data-based monitoring system for engineering measurement and application method thereof
CN111683225A (en) Multi-view camera, multi-channel video signal processing method and module
CN214097188U (en) Automatic detection device for battery case
CN112489017A (en) Intelligent identification method and system for power equipment faults
CN112781518A (en) House deformation monitoring method and system
CN112541478A (en) Insulator string stain detection method and system based on binocular camera
CN108615057B (en) CNN-based abnormity identification method for cable tunnel lighting equipment
CN116429329A (en) Building curtain wall leakage detection and identification system, method and water spraying device
CN112801072B (en) Elevator non-flat-layer door opening fault recognition device and method based on computer vision
CN111882619B (en) Sea surface target identification method for simulating and testing visual equipment on intelligent ship
CN114155432A (en) Meter reading identification method based on robot
CN114299141A (en) Two-degree-of-freedom flame recognition device and method applied to fire-fighting robot
CN118149722A (en) Big data-based monitoring method for engineering measurement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant