CN211453394U - Non-contact detection system for quality of building outer wall surface - Google Patents

Non-contact detection system for quality of building outer wall surface Download PDF

Info

Publication number
CN211453394U
CN211453394U (application CN202020129119.XU)
Authority
CN
China
Prior art keywords
module
aerial vehicle
unmanned aerial
wall surface
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202020129119.XU
Other languages
Chinese (zh)
Inventor
倪文晖
仇铮
李畅
杨晓峰
毛金龙
陈敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Construction Engineering Quality Inspection Center
Original Assignee
Wuxi Construction Engineering Quality Inspection Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Construction Engineering Quality Inspection Center filed Critical Wuxi Construction Engineering Quality Inspection Center
Priority to CN202020129119.XU priority Critical patent/CN211453394U/en
Application granted granted Critical
Publication of CN211453394U publication Critical patent/CN211453394U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The utility model relates to the technical field of building inspection equipment, and specifically to a non-contact system for detecting the quality of the building exterior wall finish layer: the hollowing size, crack length, cold and hot bridge size, and detachment size of the finish layer. The non-contact detection system for the quality of the building exterior wall surface comprises an unmanned aerial vehicle, a UAV remote controller, and a data measurement, control, and processing device. The device is mounted on the UAV and comprises a telemetry module, a ground human-computer interaction module, a self-stabilizing measurement and control module, and an image/video storage module and first image wireless transmission module arranged inside the UAV. Carrying the device, the UAV is remotely controlled from the ground by the UAV remote controller, or follows a preset plan, to inspect the exterior wall finish layer at any height of the building.

Description

Non-contact detection system for quality of building outer wall surface
Technical Field
The utility model relates to the technical field of building inspection equipment, and specifically to a non-contact system for detecting the quality of the building exterior wall finish layer: the hollowing size, crack length, cold and hot bridge size, and detachment size of the finish layer.
Background
During service, the exterior wall finish layer of a building (the face-brick finish, insulation layer, plaster layer, and other structural layers on the outdoor side of the exterior wall body, hereinafter collectively "finish layer") may develop quality defects such as hollowing, cracking, and cold and hot bridges, caused by construction quality, material quality, weather conditions, biological erosion, and other factors. These defects reduce the building's overall energy-saving performance and, in severe cases, cause the finish layer to detach, endangering life and property. Accurate detection and evaluation of the hollowing size, crack length, cold and hot bridge size, and detachment size of the exterior wall finish layer is therefore of great significance for building energy conservation, safety control, and related purposes.
Existing quality inspection of building exterior wall surfaces mainly relies on field pull-off (bond-strength) tests, the tapping method, and infrared thermal imaging. The pull-off test is destructive, and the tapping method can cover only a limited area, so both can be applied to only part of the facade, and their results can hardly represent the quality of the whole finish layer. Conventional hand-held infrared thermal imaging can survey the whole exterior wall, but shooting angle, heat reflection from the wall, and similar factors prevent accurate judgment of surface-quality defects. Moreover, it yields only qualitative results and cannot quantify hollowing size, crack length, cold and hot bridge size, or detachment size. Although simple to operate, the method gives poorly reproducible measurements, defect locations cannot be accurately restored from its results, and invisible defects such as hollowing and cold and hot bridges are very difficult for it to detect.
In view of these shortcomings, invention patent application CN106501316A discloses a detection device that mounts an infrared thermometer, a cross-line laser marker, and a camera on an unmanned aerial vehicle, together with a corresponding method. It enlarges the inspection range to some extent compared with the tapping method and pull-off testing, but it does not address quantitative measurement of crack length, the size of temperature-anomaly regions, or similar quality problems;
invention patent application CN107202793A discloses a detection device that mounts an infrared thermal imager and a camera on a UAV, with a corresponding method; it likewise does not address quantitative measurement of crack length, the size of temperature-anomaly regions, or similar quality problems;
invention patent CN102914261B, in the field of opto-electronic integration, discloses a device consisting of a laser range finder, an infrared thermal imager, and a data processing system, with a corresponding implementation method.
Disclosure of Invention
In view of the above shortcomings, the utility model aims to provide a non-contact detection system for the quality of the building exterior wall finish layer that locates and quantitatively measures hollowing, cracks, cold and hot bridges, detachment, and similar defects, providing a more accurate data basis for judging and evaluating quality problems of the finish layer.
To achieve the above object, the utility model adopts the following technical scheme. The detection principle of the non-contact detection system for the quality of the building exterior wall surface is to photograph the target plane horizontally, measure the shooting distance and the horizontal angle between the shooting optical axis and the target plane, read the target's size from the scale in the captured picture, and obtain its actual size by triangular transformation. The scheme is implemented as follows. The system photographs the entire building exterior wall finish layer; it measures the shooting distance and the horizontal angle between the optical axis and the wall in real time, and a microprocessor computes a scale coefficient in real time according to the algorithm provided by the utility model. On site or afterwards, the surveyor reads the scale values of hollowing, cracks, cold and hot bridges, detachment, and other defects in the captured pictures, and the algorithm converts those readings into actual defect sizes. The scheme accounts for, and controls, the influence of shooting angle on the measurement, improving the accuracy of the results.
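The triangular transformation described above can be sketched in code. The patent's own scale-coefficient algorithm is not disclosed in this text, so the Python sketch below assumes a standard pinhole-camera model with first-order cosine foreshortening; the function names, the focal-length parameter `f_px`, and the formula itself are illustrative assumptions, not the utility model's actual algorithm. Here `alpha_deg` is taken as the deviation of the optical axis from the wall normal (0 for a head-on shot); the patent's angle between the axis and the wall plane is the complement of this.

```python
import math

def scale_coefficient(D, alpha_deg, f_px):
    """Metres of wall per on-screen scale unit (pixel).

    D         -- measured shooting distance, in metres
    alpha_deg -- deviation of the optical axis from the wall normal,
                 in degrees (0 = head-on shot)
    f_px      -- lens focal length expressed in pixel units (assumed
                 known from camera calibration; illustrative parameter)

    Hypothetical formula: pinhole model with cosine foreshortening,
    NOT the patent's undisclosed algorithm.
    """
    alpha = math.radians(alpha_deg)
    # One pixel at distance D spans D / f_px metres on a plane normal
    # to the axis; oblique viewing stretches that by 1 / cos(alpha).
    return D / (f_px * math.cos(alpha))

def actual_size(reading_px, D, alpha_deg, f_px):
    """Actual defect size = on-screen scale reading x scale coefficient."""
    return reading_px * scale_coefficient(D, alpha_deg, f_px)
```

With the axis perpendicular to the wall the coefficient reduces to D / f_px, which is why the system works to keep the shot horizontal and the angle error small.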
A non-contact detection system for the quality of the building exterior wall surface is realized with the following technical scheme:
The non-contact detection system for the quality of the building exterior wall surface consists of an unmanned aerial vehicle, a UAV remote controller, and a data measurement, control, and processing device. The device is mounted on the UAV and comprises a telemetry module, a ground human-computer interaction module, a self-stabilizing measurement and control module, and an image/video storage module and first image wireless transmission module arranged inside the UAV.
Carrying the measurement and control system, the UAV is remotely controlled from the ground by the UAV remote controller, or follows a preset plan, to inspect the exterior wall finish layer at any height of the building.
The self-stabilizing measurement and control module comprises a self-stabilizing gimbal, a rotation angle sensor, an infrared thermal imager, a visible-light camera, a laser range finder, and a data measurement and control processing module.
The rotation angle sensor is mounted on the Z axis of the self-stabilizing gimbal. The infrared thermal imager, the visible-light camera, and the laser range finder are mounted with their optical axes coaxial. The data measurement and control processing module is mounted on the face of the infrared thermal imager away from the lens.
The infrared thermal imager, the visible-light camera, and the laser range finder are each connected to the data measurement and control processing module, which is also connected to the image/video storage module, the rotation angle sensor, and the telemetry module. The telemetry module is also connected to the image/video storage module and communicates wirelessly with the ground human-computer interaction module. The image/video storage module is also connected to the first image wireless transmission module, which communicates wirelessly with the ground human-computer interaction module.
The data measurement and control processing module consists of a first on-screen display (OSD) generator, a second OSD generator, a first microprocessor, a barometric altitude sensor, and a first acceleration vertical-angle sensor.
The telemetry module consists of a third OSD generator, a second microprocessor, a second acceleration vertical-angle sensor, a remote-control signal analysis microprocessor, and a first wireless transmission module.
The ground human-computer interaction module consists of a second image wireless transmission module, a display, a second wireless transmission module, a remote-control signal encoding microprocessor, and a keyboard.
The UAV carries the telemetry module, the self-stabilizing measurement and control module, the image/video storage module, and the first image wireless transmission module, and is remotely controlled from the ground by the UAV remote controller or inspects the exterior wall finish layer at any height of the building according to a preset plan.
The non-contact method for detecting the quality of the building exterior wall surface (the hollowing size, crack length, cold and hot bridge size, detachment size, and similar quality problems of the finish layer) comprises the following steps:
1. When the data measurement, control, and processing device is assembled, the infrared thermal imager, the visible-light camera, and the laser range finder are mounted with their optical axes coaxial and are connected to the corresponding ports of the data measurement and control processing module. After start-up, the UAV carries the telemetry module, the self-stabilizing measurement and control module, the image/video storage module, and the first image wireless transmission module, and photographs the building's exterior wall finish layer under remote control from the UAV remote controller or according to a preset plan.
2. The infrared thermal imager, visible-light camera, and laser range finder of the self-stabilizing measurement and control module continuously photograph the exterior wall finish layer and measure the shooting distance; the images and distance data are transmitted to the data measurement and control processing module. During shooting, the vertical angle of the lens is monitored by the first acceleration vertical-angle sensor built into the data measurement and control processing module; the first microprocessor processes the readings, and the first and second OSD generators superimpose them on the infrared thermal image and the visible-light image respectively, so that ground personnel can monitor and control the shot to keep it horizontal. Likewise, the vertical angle of the Z axis is monitored by the second acceleration vertical-angle sensor built into the telemetry module; the second microprocessor processes the readings, and the third OSD generator superimposes them on the infrared thermal image, so that ground personnel can keep the Z axis perpendicular to the horizontal plane.
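The vertical-angle monitoring in step 2 can be illustrated with a short sketch. How the accelerometer readings are converted to an angle is not stated in the patent, so the axis convention below (sensor x axis along the optical axis) and the function name are assumptions; the formula is the usual tilt-from-gravity computation for a static 3-axis accelerometer such as the LSM303.

```python
import math

def lens_vertical_angle(ax, ay, az):
    """Pitch of the lens axis, in degrees, from a static 3-axis
    accelerometer reading (raw counts or g units; only ratios matter).

    Assumed convention: the sensor x axis lies along the optical axis.
    A level lens leaves gravity entirely in the y/z plane -> 0 degrees.
    The OSD overlay would show this value so the ground operator can
    keep it near zero, i.e. maintain horizontal shooting.
    """
    return math.degrees(math.atan2(ax, math.hypot(ay, az)))
```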
3. In the data measurement and control processing module, the first and second OSD generators superimpose scales on the infrared thermal image and the visible-light image of the target plane respectively. The first microprocessor analyses the distance data and passes it to the two OSD generators for superimposition on the two images. Meanwhile, the barometric altitude sensor measures the system's current altitude in real time; the altitude data are analysed by the first microprocessor and likewise superimposed on both images by the two OSD generators. The data-superimposed video is transmitted to the telemetry module and the image/video storage module, and then relayed in real time to the ground human-computer interaction module by the first image wireless transmission module.
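The altitude value superimposed in step 3 comes from a barometric sensor. The patent does not give the pressure-to-altitude conversion; a common approximation for BMP280-class sensors is the international standard-atmosphere formula, sketched below. The reference pressure `p0_pa` would in practice be taken at the launch point so that the overlay shows height relative to the ground.

```python
def pressure_altitude(p_pa, p0_pa=101325.0):
    """Altitude in metres from barometric pressure (Pa), using the
    standard-atmosphere approximation commonly applied to BMP280-class
    sensors. p0_pa is the reference (launch-point or sea-level)
    pressure; relative height follows by differencing two readings.
    Illustrative conversion, not specified by the patent."""
    return 44330.0 * (1.0 - (p_pa / p0_pa) ** (1.0 / 5.255))
```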
4. When the UAV's ascent/descent reaches a preset height difference, or the ground human-computer interaction module issues a detection command, the first microprocessor starts a single detection. It runs the procedure that measures the horizontal angle α between the lens optical axis and the target plane (explained further below); when the α measurement completes, it reads the shooting distance D, computes the target scale coefficients for the infrared thermal image and the visible-light image, and sends these coefficients, together with the UAV's flight altitude, to the first and second OSD generators for superimposition on the respective images. The first microprocessor then sends a storage command to the infrared thermal imager and the telemetry module; the telemetry module forwards it to the image/video storage module, which stores the data-superimposed imagery as pictures or video. At the same time, the data-superimposed picture is sent in real time to the ground human-computer interaction module through the first image wireless transmission module. This completes a single detection. If the α measured for the captured picture does not meet the error-control requirement, the operator adjusts the UAV's heading and issues a new detection command through the ground human-computer interaction module.
5. Surveyors quantitatively measure the hollowing size, crack length, cold and hot bridge size, detachment size, and similar defects in real time from the scale readings, scale coefficient, and measurement-error control factors shown on the display of the ground human-computer interaction module. The results can be analysed further afterwards by reading the pictures stored in the image/video storage module and the target-plane temperature-field distribution files stored in the infrared thermal imager.
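Steps 4 and 5 together amount to: gate each shot on the measured angle α, then convert scale readings to real sizes with the coefficient K. The tolerance value and the K formula below are illustrative assumptions (the patent's error-control requirement and coefficient algorithm are not given in this text); the sketch only shows the control flow of a single detection.

```python
import math

ALPHA_TOL_DEG = 5.0  # assumed error-control threshold; the patent
                     # requires alpha to meet an error-control
                     # requirement but states no number here

def single_detection(reading_px, D, alpha_deg, f_px):
    """One detection cycle: reject the frame if the optical-axis
    deviation alpha exceeds tolerance (the operator re-orients the
    UAV and re-triggers); otherwise return the defect's actual size.

    K uses an assumed pinhole + cos(alpha) model, not the patent's
    undisclosed algorithm; f_px is an assumed calibration parameter.
    """
    if abs(alpha_deg) > ALPHA_TOL_DEG:
        return None  # angle out of spec -> adjust heading, re-shoot
    K = D / (f_px * math.cos(math.radians(alpha_deg)))
    return reading_px * K
```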
The utility model's non-contact detection method and system for the quality of the building exterior wall finish layer combines a UAV, optical imaging, and laser ranging, and uses image superimposition and the principle of triangular transformation to achieve non-contact quantitative measurement of hollowing size, crack length, cold and hot bridge size, detachment size, and similar quality problems on high-rise exterior walls. Compared with other prior art, the utility model provides concrete algorithms for these quantities and analyses and controls the measurement-error factors, improving measurement accuracy and thereby providing a data basis for evaluating the quality of the exterior wall finish layer, which is of great significance for building energy conservation, safety control, and related purposes.
Drawings
The present invention will be further explained with reference to the accompanying drawings:
FIG. 1 is a schematic view of a non-contact detection system for the quality of the outer wall surface of a building according to the present invention;
FIG. 2 is an exploded view of the non-contact detection system for the quality of the exterior wall surface of a building according to the present invention;
FIG. 3 is an exploded view of the self-stabilizing measurement and control module of the present invention;
FIG. 4 is a structural diagram of a data measurement control processing device of the present invention;
FIG. 5 is a block diagram of the data measurement and control processing module of the present invention;
FIG. 6 is a diagram of a telemetry module configuration of the present invention;
FIG. 7 is a structure diagram of the ground human-computer interaction module of the present invention;
FIG. 8 is a flow chart of the data measurement and control process of the present invention;
FIG. 9 is a schematic diagram of target dimension calculation according to the present invention;
FIG. 10 is a schematic diagram of the measurement of the shooting angle α according to the present invention;
FIG. 11 is a scene view of building exterior-wall surface quality detection according to the present invention;
FIG. 12 is an output plot of a building exterior-wall surface quality detection result according to the present invention.
In the figures: 1. telemetry module; 2. unmanned aerial vehicle; 3. ground human-computer interaction module; 4. self-stabilizing measurement and control module; 5. UAV remote controller; 6. rotation angle sensor; 7. infrared thermal imager; 8. visible-light camera; 9. laser range finder; 10. self-stabilizing gimbal; 11. gimbal Z axis; 12. gimbal X axis; 13. gimbal Y axis; 14. data measurement and control processing module; 15. first image wireless transmission module; 16. image/video storage module; 17. first on-screen display generator; 18. second on-screen display generator; 19. first microprocessor; 20. first acceleration vertical-angle sensor; 21. second acceleration vertical-angle sensor; 22. third on-screen display generator; 23. second microprocessor; 24. first wireless transmission module; 25. remote-control signal analysis microprocessor; 26. second image wireless transmission module; 27. second wireless transmission module; 28. remote-control signal encoding microprocessor; 29. barometric altitude sensor; 30. building exterior wall; 31. crack in the exterior wall finish layer; 32. detached area of the exterior wall finish layer; 33. X-axis scale of the detection output plot; 34. Y-axis scale of the detection output plot; 35. aiming frame of the detection output plot; 36. shooting height in the detection output plot; 37. vertical angle of the gimbal Z axis (11) in the detection output plot; 38. vertical angle of the gimbal Y axis (13) in the detection output plot; 39. shooting angle α in the detection output plot; 40. shooting distance D in the detection output plot; 41. scale coefficient K in the detection output plot; 42. actual side length of the target corresponding to the aiming frame (35) in the detection output plot.
Detailed Description
Referring to FIGS. 1-12, the non-contact detection system for the quality of the building exterior wall surface consists of an unmanned aerial vehicle (2), a UAV remote controller (5), and a data measurement, control, and processing device. The device is mounted on the UAV (2) and comprises a telemetry module (1), a ground human-computer interaction module (3), a self-stabilizing measurement and control module (4), and an image/video storage module (16) and first image wireless transmission module (15) arranged inside the UAV (2).
Carrying the data measurement, control, and processing device, the UAV (2) is remotely controlled from the ground by the UAV remote controller (5), or follows a preset plan, to inspect the exterior wall finish layer at any height of the building.
The self-stabilizing measurement and control module (4) comprises a self-stabilizing gimbal (10), a rotation angle sensor (6), an infrared thermal imager (7), a visible-light camera (8), a laser range finder (9), and a data measurement and control processing module (14).
The rotation angle sensor (6) is mounted on the Z axis (11) of the self-stabilizing gimbal (10). The infrared thermal imager (7), the visible-light camera (8), and the laser range finder (9) are mounted with their optical axes coaxial. The data measurement and control processing module (14) is mounted on the face of the infrared thermal imager (7) away from the lens.
The infrared thermal imager (7), the visible-light camera (8), and the laser range finder (9) are each connected to the data measurement and control processing module (14), which is also connected to the image/video storage module (16), the rotation angle sensor (6), and the telemetry module (1). The telemetry module (1) is also connected to the image/video storage module (16) and communicates wirelessly with the ground human-computer interaction module (3). The image/video storage module (16) is also connected to the first image wireless transmission module (15), which communicates wirelessly with the ground human-computer interaction module (3).
The data measurement and control processing module (14) consists of a first on-screen display (OSD) generator (17), a second OSD generator (18), a first microprocessor (19), a barometric altitude sensor (29), and a first acceleration vertical-angle sensor (20).
The telemetry module (1) consists of a third OSD generator (22), a second microprocessor (23), a second acceleration vertical-angle sensor (21), a remote-control signal analysis microprocessor (25), and a first wireless transmission module (24).
The ground human-computer interaction module (3) consists of a second image wireless transmission module (26), a display, a second wireless transmission module (27), a remote-control signal encoding microprocessor (28), and a keyboard.
The UAV (2) carries the telemetry module (1), the self-stabilizing measurement and control module (4), the image/video storage module (16), and the first image wireless transmission module (15), and is remotely controlled from the ground by the UAV remote controller (5) or inspects the exterior wall finish layer at any height of the building according to a preset plan.
The UAV remote controller (5) is a FUTABA 16SZ remote controller.
The rotation angle sensor (6) is an AS5600 magnetic encoder. The infrared thermal imager (7) is a FLIR Vue Pro thermal imager. The visible-light camera (8) is a 700-line PAL/NTSC dual-mode camera. The laser range finder (9) is a Deke L10 laser lidar. The self-stabilizing gimbal (10) is a gimbal dedicated to the FLIR Vue.
The first image wireless transmission module (15) is a FOXEER ClearTX 5.8 GHz video transmitter. The image/video storage module (16) is a Runcology AHD dual-channel video recording and photographing module. The first OSD generator (17), second OSD generator (18), and third OSD generator (22) are AT6457 on-screen display generators. The first microprocessor (19) and second microprocessor (23) are MEGA328P microprocessors. The first acceleration vertical-angle sensor (20) and second acceleration vertical-angle sensor (21) are LSM303 sensors. The first wireless transmission module (24) and second wireless transmission module (27) are RF24L01 modules. The remote-control signal analysis microprocessor (25) is an STC15W401AS microprocessor. The second image wireless transmission module (26) is a 5.8 GHz video receiver. The remote-control signal encoding microprocessor (28) is an STC12C5A60S2 microprocessor. The barometric altitude sensor (29) is a BMP280 sensor.
The non-contact method for detecting the quality of the building exterior wall surface (the hollowing size, crack length, cold and hot bridge size, detachment size, and similar quality problems of the finish layer) comprises the following steps:
1. When the data measurement, control, and processing device is assembled, the infrared thermal imager (7), the visible-light camera (8), and the laser range finder (9) are mounted with their optical axes coaxial and are connected to the corresponding ports of the data measurement and control processing module (14). After start-up, the UAV (2) carries the telemetry module (1), the self-stabilizing measurement and control module (4), the image/video storage module (16), and the first image wireless transmission module (15), and photographs the building's exterior wall finish layer under remote control from the UAV remote controller (5) or according to a preset plan.
2. The infrared thermal imager (7), visible-light camera (8), and laser range finder (9) of the self-stabilizing measurement and control module (4) continuously photograph the exterior wall finish layer and measure the shooting distance; the images and distance data are transmitted to the data measurement and control processing module (14), as shown in FIG. 4. During shooting, the vertical angle of the lens is monitored by the first acceleration vertical-angle sensor (20) built into the data measurement and control processing module (14); the first microprocessor (19) processes the readings, and the first OSD generator (17) and second OSD generator (18) superimpose them on the infrared thermal image and the visible-light image respectively, so that ground personnel can monitor and control the shot to keep it horizontal. Likewise, the vertical angle of the Z axis (11) is monitored by the second acceleration vertical-angle sensor (21) built into the telemetry module (1); the second microprocessor (23) processes the readings, and the third OSD generator (22) superimposes them on the infrared thermal image, so that ground personnel can keep the Z axis (11) perpendicular to the horizontal plane.
3. In the data measurement and control processing module (14), the first OSD generator (17) and second OSD generator (18) superimpose scales on the infrared thermal image and the visible-light image of the target plane respectively. The first microprocessor (19) analyses the distance data and passes it to the two OSD generators (17, 18) for superimposition on the two images. Meanwhile, the barometric altitude sensor (29) measures the system's current altitude in real time; the altitude data are analysed by the first microprocessor (19) and likewise superimposed on both images by the OSD generators (17, 18). The data-superimposed video is transmitted to the telemetry module (1) and the image/video storage module (16), and then relayed in real time to the ground human-computer interaction module (3) by the first image wireless transmission module (15).
4. When the ascent/descent of the UAV (2) reaches a preset height difference, or the ground human-computer interaction module (3) issues a detection command, the first microprocessor (19) starts a single detection. It runs the procedure that measures the horizontal angle α between the lens optical axis and the target plane (explained further below); when the α measurement completes, it reads the shooting distance D, computes the target scale coefficients for the infrared thermal image and the visible-light image, and sends these coefficients, together with the UAV's flight altitude, to the first OSD generator (17) and second OSD generator (18) for superimposition on the respective images. The first microprocessor (19) then sends a storage command to the infrared thermal imager (7) and the telemetry module (1); the telemetry module (1) forwards it to the image/video storage module (16), which stores the data-superimposed imagery as pictures or video. At the same time, the data-superimposed picture is sent in real time to the ground human-computer interaction module (3) through the first image wireless transmission module (15). This completes a single detection. If the α measured for the captured picture does not meet the error-control requirement, the operator adjusts the UAV's heading and issues a new detection command through the ground human-computer interaction module (3).
5. Surveyors quantitatively measure the hollowing size, crack length, cold and hot bridge size, detachment size, and similar defects in real time from the scale readings, scale coefficient, and measurement-error control factors shown on the display of the ground human-computer interaction module (3). The results can be analysed further afterwards by reading the pictures stored in the image/video storage module (16) and the target-plane temperature-field distribution files stored in the infrared thermal imager (7).
Further, the step 3 of performing scale stacking processing on the infrared thermal imaging graph and the visible light graph of the target plane is to stack a preset scale picture on each frame of the video data stream by using the on-screen display generator for measuring the proportional size of the targetn
Furthermore, the remote control instruction that ground human-computer interaction module (3) sent the detection in step 4 means that the detection personnel is transmitted to by the keyboard input detection instruction that sets up on ground human-computer interaction module (3) remote control signal code microprocessor (28) that set up on ground human-computer interaction module (3) encode, retransmit to wireless transmission module (27) that set up on ground human-computer interaction module (3), retransmit to after telemetry module (1) analyzes, microprocessor (19) that set up in data measurement and control processing module (14).
Further, in the step 4, the horizontal angle between the optical axis of the lens and the target planeαThe measurement procedure was:αthe angle is measured indirectly as shown in fig. 10. When the single detection is triggered, the first microprocessor (19) records the current position of the lens as an initial position, then sends an instruction to the self-stabilizing cradle head (10) to enable the lens to horizontally deflect by a certain angle, and then the laser range finder (9) measures the distance from the lens to the auxiliary distance measuring pointC 1 Is a distance ofoC 1 And calculating a yaw angle from an initial azimuth and a current azimuth by a rotation angle sensor (6) mounted in a Z-axis direction (11) of the self-stabilizing pan/tilt head (10)γ 1 . Then the microprocessor I (19) sends an instruction to the self-stabilizing cradle head (10) to enable the lens to horizontally deflect and reversely deflect a smaller angle, and at the moment, the laser range finder (9) measures the distance from the lens to an auxiliary distance measuring pointC 2 Is a distance ofoC 2 And through the rotation angle sensing in the Z-axis direction (11) of the self-stabilizing pan/tilt head (10)The device (6) calculates the deflection angle from the initial position and the current positionγ 2 And the measurement is circulated until the lens deflects reversely to the initial orientation. Thereby obtaining a series ofoC i Andγ i and by the formula
Figure DEST_PATH_IMAGE001
Figure DEST_PATH_IMAGE002
Figure DEST_PATH_IMAGE003
Figure DEST_PATH_IMAGE004
Computingα i i=1, 2, 3 …) end upαTake a seriesα i Arithmetic mean of values. Wherein:αthe angle is a measurement error control factor, the method of use of which will be further explained,q i pfor the center shot pointpTo the footq i iDistance of =1, 2, 3 …),oC i for lenses to auxiliary distance-measuring pointsC i iDistance of =1, 2, 3 …),Dfor taking a shot to a shooting pointpThe distance of (a) to (b),γ i deflection angle from initial orientation to auxiliary ranging orientation, see in detail the shooting angle of FIG. 10αMeasuring schematic diagram.
Further, the algorithm in step 4 is shown in fig. 9 according to the principle of triangle transformation, and has the following advantages according to the principle of similar triangle
Figure DEST_PATH_IMAGE005
Wherein:Dthe distance from the center of the lens to the object,dThe distance from the center of the lens to the photosensitive element, namely the focal length,lTo capture the size of the object imaged on the photosensitive element,A’B’calculating the points in the schematic for the target dimensions of FIG. 9A’To pointB’Length of (d). Firstly, setting:
Figure DEST_PATH_IMAGE006
whereinnThe superimposed scale reading for imaging the target or the number of pixels the target occupies on the light sensing element, as shown in FIG. 9
Figure DEST_PATH_IMAGE007
κSetting the following for the actual size of the image on the corresponding photosensitive element or the actual width or height of a single pixel on the photosensitive element of the reading of the unit scale:
Figure DEST_PATH_IMAGE008
then there are:
Figure DEST_PATH_IMAGE009
Figure DEST_PATH_IMAGE010
Figure DEST_PATH_IMAGE011
. Then according to the triangle transformation has
Figure DEST_PATH_IMAGE012
Figure DEST_PATH_IMAGE013
Further conversion is carried out to obtain:
Figure DEST_PATH_IMAGE014
Figure DEST_PATH_IMAGE015
then, there are:
Figure DEST_PATH_IMAGE016
. Wherein:ABis a target size,DIn order to measure the distance by the laser,n A’p n B’p the number of superimposed scale readings or pixels for the target is shown in FIG. 9,RatioIs a lens coefficient obtained by calibration and stored in an EEPROM of a microprocessor I (19),αFor measuring horizontal angle between optical axis of lens and target planeA quantity error control factor,ApCalculating the points in the schematic for the target dimensions of FIG. 9ATo pointpThe distance of,BpCalculating the points in the schematic for the target dimensions of FIG. 9BTo pointpThe distance of,A’pCalculating the points in the schematic for the target dimensions of FIG. 9A’To pointpThe distance of,B’pCalculating the points in the schematic for the target dimensions of FIG. 9B’To pointpThe distance of (c). Preferably, let
Figure DEST_PATH_IMAGE017
Figure DEST_PATH_IMAGE018
Further converting the formula to obtain:
Figure DEST_PATH_IMAGE019
Figure DEST_PATH_IMAGE020
. According to the above formula, whenαWhen approaching 90 DEG, Δ1,Δ2Limit value of
Figure DEST_PATH_IMAGE021
Therefore, only the shooting direction needs to be adjusted to make the horizontal angle between the optical axis of the lens and the target planeαControlled within a certain range around 90 deg., delta1,Δ2It can be controlled to a sufficiently small value. At this time, the process of the present invention,
Figure DEST_PATH_IMAGE022
thereby simplifying the calculation process and ensuring higher measurement accuracy.
Further, theRatioA fixed value for a specific fixed-focus lens, different fixed-focus lensesRatioAre different from each other. In the device, a lensRatioValues are obtained by calibration. The calibration process comprises the following steps: vertically shooting plane target objects with known sizes at different distances and calculating to obtain a series of target objectsRatio i The value of the one or more of,
Figure DEST_PATH_IMAGE023
wherein,n i is as followsiThe scale reading of the target object of known size at the time of the second shot,L i is as followsiThe actual size of the object of known size at the time of the secondary shot,D i as the shooting distance at the time of the ith shooting,D i the estimated distance in practical applications should be covered. Finally, the product is processedRatioValue takingRatio i Is arithmetic mean of
Figure DEST_PATH_IMAGE024
. Of infra-red thermal imagers and visible light camerasRatioThe values are respectively calibrated and stored in a memory EEPROM of the microprocessor I (19).
Examples
As shown in figs. 1-12, the measurement system of the utility model comprises an unmanned aerial vehicle (2), an unmanned aerial vehicle remote controller (5), and a data measurement control processing device mounted on the unmanned aerial vehicle (2). Furthermore, the data measurement control processing device comprises a telemetry module (1), a ground human-computer interaction module (3), a self-stabilization measurement and control module (4), and an image video storage module (16) and a first image wireless transmission module (15) arranged in the unmanned aerial vehicle (2).
Carrying the data measurement control processing device, the unmanned aerial vehicle (2) is remotely controlled from the ground by the unmanned aerial vehicle remote controller (5), or follows a preset scheme, to detect the building outer wall surface layer at any height.
Furthermore, the self-stabilization measurement and control module (4) comprises a self-stabilization holder (10), a rotation angle sensor (6), an infrared thermal imager (7), a visible light camera (8), a laser range finder (9) and a data measurement and control processing module (14).
Further, the rotation angle sensor (6) is mounted in a Z-axis direction (11) of the self-stabilizing pan/tilt head (10). Furthermore, the infrared thermal imager (7), the visible light camera (8) and the laser range finder (9) are arranged in an optical coaxial manner. Furthermore, the data measurement and control processing module (14) is installed on the surface, deviating from the lens, of the infrared thermal imager (7).
Furthermore, the infrared thermal imager (7), the visible light camera (8) and the laser range finder (9) are respectively connected with a data measurement and control processing module (14). The data measurement and control module (14) is also connected with the image video storage module (16), the rotation angle sensor (6) and the telemetry module (1). The telemetering module (1) is also connected with the image video storage module (16) and is in wireless communication with the ground human-computer interaction module (3). The image video storage module is also connected with the image wireless transmission module I (15) and is in wireless communication with the ground human-computer interaction module (3).
Furthermore, the data measurement and control processing module (14) is composed of a first on-screen display generator (17), a second on-screen display generator (18), a first microprocessor (19), an air pressure height sensor (29) and a first acceleration vertical angle sensor (20).
Furthermore, the telemetry module consists of a third on-screen display generator (22), a second microprocessor (23), a second acceleration vertical angle sensor (21), a remote control signal analysis microprocessor (25) and a wireless transmission module (24).
Furthermore, the ground human-computer interaction module consists of a second image wireless transmission module (26), a display, a wireless transmission module (27), a remote control signal coding microprocessor (28) and a keyboard.
The concrete implementation steps of the utility model for non-contact detection of quality problems of the building outer wall surface layer, such as hollowing size, crack length, cold and hot bridge size, and falling-off size, are as follows:
1. When the data measurement control processing device is set up, the infrared thermal imager (7), the visible light camera (8) and the laser range finder (9) are arranged optically coaxially and connected to the corresponding ports of the data measurement and control processing module (14). After the system is started, the unmanned aerial vehicle (2), carrying the telemetry module (1), the self-stabilization measurement and control module (4), the image video storage module (16) and the first image wireless transmission module (15), is remotely controlled by the unmanned aerial vehicle remote controller (5) or shoots the building outer wall surface layer according to a preset scheme.
2. The infrared thermal imager (7), the visible light camera (8) and the laser range finder (9) of the self-stabilization measurement and control module (4) continuously shoot the building outer wall surface layer and measure the shooting distance; the image and distance data are transmitted to the data measurement and control processing module (14), as shown in fig. 4. During shooting, the vertical angle of the lens is monitored by the first acceleration vertical angle sensor (20) arranged in the data measurement and control processing module (14); the monitoring data are processed by the first microprocessor (19) and superimposed on the infrared thermal image and the visible light picture by the on-screen display generators (17) and (18) respectively, and ground inspectors analyse and control them to ensure horizontal shooting. Likewise, the vertical angle of the Z axis (11) is monitored by the second acceleration vertical angle sensor (21) arranged in the telemetry module (1); the monitoring data are processed by the second microprocessor (23) and superimposed on the infrared thermal image by the third on-screen display generator (22), and ground inspectors analyse and control them to ensure that the Z axis (11) is perpendicular to the horizontal plane.
3. In the data measurement and control processing module (14), the on-screen display generators (17) and (18) perform scale superposition processing on the infrared thermal image and the visible light image of the target plane respectively. The first microprocessor (19) analyses the distance data and transmits them to the on-screen display generators (17) and (18), which superimpose them on the infrared thermal image and the visible light picture respectively. Meanwhile, the air pressure height sensor (29) detects the current height of the system in real time; the height data are transmitted to the first microprocessor (19) for analysis and then superimposed on the infrared thermal image and the visible light picture by the on-screen display generators (17) and (18) respectively. After the data-superimposed video is transmitted to the telemetry module (1) and the image video storage module (16), it is transmitted in real time to the ground human-computer interaction module (3) by the first image wireless transmission module (15).
4. When the ascent/descent height difference of the unmanned aerial vehicle (2) reaches a preset value, or the ground human-computer interaction module (3) sends a detection remote control instruction, the first microprocessor (19) starts to execute a single detection. The first microprocessor (19) executes the measurement procedure for the horizontal angle α between the optical axis of the lens and the target plane (explained further below); after the α angle measurement is finished, it reads the shooting distance D, calculates the proportionality coefficients of the target object in the infrared thermal image and the visible light image, and transmits the proportionality coefficients and the flight altitude of the unmanned aerial vehicle to the on-screen display generators (17) and (18) to be superimposed on the infrared thermal image and the visible light image respectively. The first microprocessor (19) then sends a storage instruction to the infrared thermal imager (7) and the telemetry module (1); the telemetry module (1) forwards the storage instruction to the image video storage module (16), which stores the data-superimposed image as a picture or video. Meanwhile, the data-superimposed picture is sent in real time to the ground human-computer interaction module (3) through the first image wireless transmission module (15). This concludes the single detection. If the measured α angle of the captured image does not meet the error control requirement, the inspector adjusts the orientation of the unmanned aerial vehicle and sends a detection instruction again through the ground human-computer interaction module (3).
5. Inspectors quantitatively measure quality problems of the outer wall surface layer, such as hollowing size, crack length, cold and hot bridge size, and falling-off size, from the scale readings, the proportionality coefficient and the measurement error control factor shown on the display picture of the ground human-computer interaction module (3). The pictures stored in the image video storage module (16) and the target-plane temperature field distribution file stored in the infrared thermal imager (7) can also be read later to further analyse the detection result.
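The control flow of one detection in step 4 can be sketched in Python. This is an illustrative sketch, not the device firmware: the tolerance value `ALPHA_TOL_DEG`, the `RATIO` constant and the `Frame` container are assumptions introduced for the example.

```python
RATIO = 1000.0        # hypothetical calibrated lens coefficient (see Ratio calibration below)
ALPHA_TOL_DEG = 5.0   # assumed error-control band around 90 degrees

class Frame:
    """Stand-in for one captured frame with on-screen-display overlays."""
    def __init__(self):
        self.overlays = {}

def single_detection(alpha_deg, distance_m, frame):
    """One detection cycle following step 4: check the alpha error-control
    factor, then compute the scale factor K = D / Ratio and burn alpha, D
    and K into the frame. Returns K, or None when the operator must re-aim
    the unmanned aerial vehicle and trigger detection again."""
    if abs(alpha_deg - 90.0) > ALPHA_TOL_DEG:
        return None                        # alpha fails the error control
    K = distance_m / RATIO                 # metres of wall per scale unit
    frame.overlays.update(alpha=alpha_deg, D=distance_m, K=K)
    return K

frame = Frame()
print(single_detection(89.0, 10.0, frame))   # 0.01
print(single_detection(70.0, 10.0, frame))   # None
```

With K available, an inspector converts any on-screen scale reading to a physical size by a single multiplication, which is why the α check precedes everything else.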
Further, the scale superposition processing of the infrared thermal image and the visible light image of the target plane in step 3 means that the on-screen display generators superimpose a preset scale picture on each frame of the video data stream, from which the proportional size n of the target is read.
Further, the remote control instruction for detection sent by the ground human-computer interaction module (3) in step 4 means that an inspector inputs a detection instruction through the keyboard arranged on the ground human-computer interaction module (3); the instruction is encoded by the remote control signal coding microprocessor (28) arranged on the ground human-computer interaction module (3), transmitted by the wireless transmission module (27) arranged on the ground human-computer interaction module (3), parsed by the telemetry module (1), and finally passed to the first microprocessor (19) arranged in the data measurement and control processing module (14).
Further, in step 4 the horizontal angle α between the optical axis of the lens and the target plane is measured indirectly, as shown in fig. 10. When a single detection is triggered, the first microprocessor (19) records the current orientation of the lens as the initial orientation, then commands the self-stabilizing cradle head (10) to deflect the lens horizontally by a certain angle; the laser range finder (9) then measures the distance oC_1 from the lens to the auxiliary ranging point C_1, and the rotation angle sensor (6) mounted in the Z-axis direction (11) of the self-stabilizing pan/tilt head (10) computes the yaw angle γ_1 from the initial and current azimuths. The first microprocessor (19) then commands the self-stabilizing cradle head (10) to deflect the lens horizontally in the reverse direction by a smaller step; the laser range finder (9) measures the distance oC_2 to the auxiliary ranging point C_2 and the rotation angle sensor (6) computes the deflection angle γ_2 from the initial and current azimuths. The measurement cycles in this way until the lens has deflected back to the initial orientation. A series of oC_i and γ_i is thus obtained, and by the formulas
C_iq_i = oC_i · sin(γ_i)
q_ip = D − oC_i · cos(γ_i)
tan(α_i) = C_iq_i / q_ip
α_i = arctan[ oC_i · sin(γ_i) / (D − oC_i · cos(γ_i)) ]
the values α_i (i = 1, 2, 3, …) are computed, and the final α is taken as the arithmetic mean of the series of α_i values. Here the α angle is the measurement error control factor (its use is explained further below); q_ip is the distance from the central shot point p to the foot q_i (i = 1, 2, 3, …); oC_i is the distance from the lens to the auxiliary ranging point C_i (i = 1, 2, 3, …); D is the distance from the lens to the shot point p; γ_i is the deflection angle from the initial azimuth to the auxiliary ranging azimuth. See the shooting angle α measurement schematic of fig. 10.
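The sweep described above can be sketched numerically. This assumes the foot-of-perpendicular geometry reconstructed here (q_i as the foot of the perpendicular from C_i onto the optical axis); the folding of reverse-side samples into (0, π) and all function names are illustrative assumptions, not part of the patent.

```python
import math

def alpha_from_aux_point(D, oC, gamma):
    """One alpha_i estimate from one auxiliary ranging point.
    D: laser distance lens -> shot point p; oC: lens -> auxiliary point C_i;
    gamma: signed yaw from the initial azimuth, in radians."""
    Cq = oC * math.sin(gamma)        # C_i to its foot q_i on the optical axis
    qp = D - oC * math.cos(gamma)    # foot q_i to the shot point p (signed)
    a = math.atan2(Cq, qp)
    return a if a >= 0 else a + math.pi   # fold reverse-side samples into (0, pi)

def measure_alpha(D, samples):
    """samples: (oC_i, gamma_i) pairs gathered while the gimbal sweeps
    one way and back; the final alpha is the arithmetic mean of alpha_i."""
    alphas = [alpha_from_aux_point(D, oC, g) for oC, g in samples]
    return sum(alphas) / len(alphas)

# A wall exactly perpendicular to the optical axis (alpha = 90 deg):
# every auxiliary point then lies on the plane x = D, so oC = D / cos(gamma).
D = 10.0
samples = [(D / math.cos(g), g) for g in (0.05, 0.10, -0.05, -0.10)]
print(round(math.degrees(measure_alpha(D, samples)), 1))  # 90.0
```

The mean over samples on both sides of the initial azimuth damps ranging noise, which is presumably why the procedure sweeps in both directions rather than using a single auxiliary point.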
Further, the algorithm in step 4 is based on the triangle transformation shown in fig. 9. By the principle of similar triangles,
A'B' / l = D / d
Wherein: D is the distance from the center of the lens to the object, d is the distance from the center of the lens to the photosensitive element (i.e. the focal length), l is the size of the image of the target on the photosensitive element, and A'B' is the length from point A' to point B' in the target size calculation schematic of fig. 9. First, set:
l = n · κ
wherein n is the superimposed scale reading of the target image, or the number of pixels the target occupies on the photosensitive element; as shown in fig. 9,
n = n_A'p + n_B'p
κ is the actual size on the photosensitive element corresponding to a unit scale reading, or the actual width or height of a single pixel on the photosensitive element. Set:
Ratio = d / κ
then there are:
A'B' = l · D / d = n · κ · D / d = n · D / Ratio
A'p = n_A'p · D / Ratio
B'p = n_B'p · D / Ratio
Then, by the triangle transformation:
A'p = Ap · D · sin(α) / (D − Ap · cos(α))
B'p = Bp · D · sin(α) / (D + Bp · cos(α))
Further conversion is carried out to obtain:
Ap = A'p · D / (D · sin(α) + A'p · cos(α)) = n_A'p · D / (Ratio · sin(α) + n_A'p · cos(α))
Bp = B'p · D / (D · sin(α) − B'p · cos(α)) = n_B'p · D / (Ratio · sin(α) − n_B'p · cos(α))
then, there are:
AB = Ap + Bp = n_A'p · D / (Ratio · sin(α) + n_A'p · cos(α)) + n_B'p · D / (Ratio · sin(α) − n_B'p · cos(α))
Wherein: AB is the target size, D is the laser-measured distance, n_A'p and n_B'p are the superimposed scale readings or pixel counts of the target shown in fig. 9, Ratio is a lens coefficient obtained by calibration and stored in the EEPROM of the first microprocessor (19), α, the horizontal angle between the optical axis of the lens and the target plane, is the measurement error control factor, and Ap, Bp, A'p and B'p are the distances from points A, B, A' and B' to point p in the target size calculation schematic of fig. 9. Preferably, let
Δ1 = n_A'p · cos(α) / (Ratio · sin(α))
Δ2 = n_B'p · cos(α) / (Ratio · sin(α))
Further converting the formula to obtain:
Ap = n_A'p · D / (Ratio · sin(α) · (1 + Δ1))
Bp = n_B'p · D / (Ratio · sin(α) · (1 − Δ2))
According to the above formulas, as α approaches 90°, the limits of Δ1 and Δ2 are
lim(α→90°) Δ1 = lim(α→90°) Δ2 = 0
Therefore, as long as the shooting direction is adjusted according to the measured α angle so that the horizontal angle α between the optical axis of the lens and the target plane is controlled within an acceptable range around 90°, Δ1 and Δ2 can be kept acceptably small. At this time,
AB ≈ (n_A'p + n_B'p) · D / Ratio = n · D / Ratio
In one embodiment, the first microprocessor (19) calculates the scaling factor
K = D / Ratio
and superimposes the calculation result on the infrared thermal image and the visible light picture through the on-screen display generators (17) and (18) respectively, thereby simplifying the calculation process while retaining high measurement accuracy.
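Assuming the formulas reconstructed above, the exact and simplified size calculations can be compared in a short sketch; the function and variable names are illustrative, not from the patent.

```python
import math

def target_size(n_Ap, n_Bp, D, Ratio, alpha):
    """Exact target size AB = Ap + Bp from the scale readings n_A'p, n_B'p
    on either side of the shot point p, the laser distance D, the calibrated
    lens coefficient Ratio, and the shooting angle alpha in radians."""
    Ap = n_Ap * D / (Ratio * math.sin(alpha) + n_Ap * math.cos(alpha))
    Bp = n_Bp * D / (Ratio * math.sin(alpha) - n_Bp * math.cos(alpha))
    return Ap + Bp

def target_size_simplified(n_Ap, n_Bp, D, Ratio):
    """Simplified form used when alpha is held near 90 degrees:
    AB ~= n * K with the scale factor K = D / Ratio."""
    return (n_Ap + n_Bp) * D / Ratio

# At alpha = 90 deg the two forms coincide; a couple of degrees off they
# differ only by the small Delta_1, Delta_2 correction terms.
exact = target_size(50, 70, 10.0, 1000.0, math.radians(88.0))
approx = target_size_simplified(50, 70, 10.0, 1000.0)
print(round(approx, 3))                      # 1.2
print(abs(exact - approx) / approx < 0.01)   # True
```

The comparison illustrates why the device only enforces an error band around 90° rather than an exact angle: within a few degrees, the simplified n · D / Ratio reading is already accurate to well under one percent for these sample values.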
Further, Ratio is a fixed value for a specific fixed-focus lens, and different fixed-focus lenses have different Ratio values. In this device, the Ratio value of each lens is obtained by calibration. The calibration process is as follows: plane targets of known size are shot perpendicularly at different distances, and a series of Ratio_i values is calculated:
Ratio_i = n_i · D_i / L_i
wherein n_i is the scale reading of the known-size target at the i-th shot, L_i is the actual size of the known-size target at the i-th shot, and D_i is the shooting distance at the i-th shot; the D_i should cover the distances expected in practical applications. Finally, Ratio is taken as the arithmetic mean of the Ratio_i values:
Ratio = (Ratio_1 + Ratio_2 + … + Ratio_m) / m
The Ratio values of the infrared thermal imager and the visible light camera are calibrated separately and stored in the EEPROM memory of the first microprocessor (19).
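The calibration loop can be sketched as follows; the function name and the synthetic shot list are illustrative assumptions.

```python
def calibrate_ratio(shots):
    """shots: (n_i, L_i, D_i) triples -- scale reading, known target size,
    and shooting distance for each perpendicular calibration shot.
    Each shot gives Ratio_i = n_i * D_i / L_i; the stored Ratio is the
    arithmetic mean over all shots."""
    ratios = [n * D / L for n, L, D in shots]
    return sum(ratios) / len(ratios)

# Synthetic shots of a 0.5 m target consistent with a true Ratio of 1000:
# the scale reading halves each time the shooting distance doubles.
shots = [(100, 0.5, 5.0), (50, 0.5, 10.0), (25, 0.5, 20.0)]
print(calibrate_ratio(shots))   # 1000.0
```

Spreading the calibration distances D_i over the range expected in the field, as the text requires, means any residual distance-dependent lens error is averaged into the stored Ratio rather than biasing one end of the working range.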
Fig. 11 illustrates a quality inspection scene for a building outer wall surface. After the system starts, the unmanned aerial vehicle (2), carrying the telemetry module (1), the self-stabilization measurement and control module (4), and the image video storage module (16) and first image wireless transmission module (15) arranged in it, is remotely controlled by the unmanned aerial vehicle remote controller (5) or, according to a preset scheme, shoots the building outer wall (30) bearing a surface layer crack (31) and surface layer falling-off (32); the shot picture is transmitted to the ground human-computer interaction module (3) in real time. When the ascent/descent height difference of the unmanned aerial vehicle (2) reaches a preset value or the ground human-computer interaction module (3) sends a detection remote control instruction, the system starts detection and transmits the detection result to the ground human-computer interaction module (3) in real time.
The detection output result picture comprises: a scale X axis (33), a scale Y axis (34), an aiming frame (35), the shooting height (36), the vertical angle (37) of the Z axis (11) of the self-stabilizing pan/tilt, the vertical angle (38) of the Y axis (13) of the self-stabilizing pan/tilt, the shooting angle α (39), the shooting distance D (40), the proportionality coefficient K (41), and the actual side length (42) of the target object corresponding to the aiming frame (35), as shown in fig. 12.
In a proper shooting azimuth, when the vertical angle (37) of the Z axis (11) of the self-stabilizing pan/tilt and the vertical angle (38) of the Y axis (13) of the self-stabilizing pan/tilt meet the requirements, and the measurement error control factor, the shooting angle α (39), meets the error control requirement, a ground inspector reads the falling-off area (32) of the building outer wall surface layer against the scale X axis (33) and Y axis (34) and multiplies the readings by the proportionality coefficient K (41) to obtain the actual sizes of the falling-off (32) in the length and height directions.
Preferably, if the unmanned aerial vehicle is adjusted to a proper shooting direction and distance so that the boundary of the aiming frame (35) exactly matches the boundary of the target object, the length or height of the target object can be read directly from the actual side length (42) of the target object corresponding to the aiming frame (35).

Claims (5)

1. A non-contact detection system for the quality of the outer wall surface of a building is characterized by comprising an unmanned aerial vehicle, an unmanned aerial vehicle remote controller and a data measurement control processing device;
the data measurement control processing device is installed on the unmanned aerial vehicle and comprises a telemetering module, a ground human-computer interaction module, a self-stabilization measurement and control module, an image video storage module and an image wireless transmission module I, wherein the image video storage module and the image wireless transmission module I are arranged in the unmanned aerial vehicle;
the unmanned aerial vehicle, carrying the data measurement control processing device, is remotely controlled from the ground by the unmanned aerial vehicle remote controller or detects the building outer wall surface layer at any height according to a preset scheme.
2. The system of claim 1, wherein the self-stabilization measurement and control module comprises a self-stabilization holder, a rotation angle sensor, an infrared thermal imager, a visible light camera, a laser range finder, and a data measurement and control processing module;
the rotation angle sensor is mounted in the Z-axis direction of the self-stabilizing holder, the infrared thermal imager, the visible light camera and the laser range finder are arranged in an optical coaxial mode, and the data measurement and control processing module is mounted on the surface, away from the lens, of the infrared thermal imager;
the infrared thermal imager, the visible light camera and the laser range finder are respectively connected with the data measurement and control processing module; the data measurement and control module is also connected with the image video storage module, the rotation angle sensor and the telemetering module, and the telemetering module is also connected with the image video storage module and wirelessly communicated with the ground human-computer interaction module; the image video storage module is also connected with the image wireless transmission module I and is in wireless communication with the ground human-computer interaction module;
the data measurement and control processing module consists of on-screen display generators, a first microprocessor, an air pressure height sensor and an acceleration vertical angle sensor.
3. The system of claim 1, wherein the telemetry module comprises a third on-screen display generator, a second microprocessor, an acceleration vertical angle sensor, a remote control signal analysis microprocessor, and a wireless transmission module.
4. The system of claim 1, wherein the ground human-computer interaction module comprises a second image wireless transmission module, a display, a wireless transmission module, a remote control signal coding microprocessor and a keyboard.
5. The system of claim 1, wherein the unmanned aerial vehicle, carrying the measurement system, is remotely controlled from the ground by the remote controller or detects the outer wall surface layer of the building at any height according to a preset scheme.
CN202020129119.XU 2020-01-20 2020-01-20 Non-contact detection system for quality of building outer wall surface Active CN211453394U (en)

Publications (1)

Publication Number Publication Date
CN211453394U true CN211453394U (en) 2020-09-08

Family

ID=72299739

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202020129119.XU Active CN211453394U (en) 2020-01-20 2020-01-20 Non-contact detection system for quality of building outer wall surface

Country Status (1)

Country Link
CN (1) CN211453394U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111103297A (en) * 2020-01-20 2020-05-05 无锡市建筑工程质量检测中心 Non-contact detection method and system for quality of building outer wall surface
CN111103297B (en) * 2020-01-20 2024-07-12 无锡市建筑工程质量检测中心 Non-contact detection method and detection system for quality of building exterior wall surface layer

Similar Documents

Publication Publication Date Title
CN111103297B (en) Non-contact detection method and detection system for quality of building exterior wall surface layer
CN211553794U (en) Data measurement and control processing device of non-contact detection system for quality of building outer wall surface
CN111103296A (en) Data measurement control processing device of non-contact detection system for quality of building outer wall surface
CN106091946B (en) Self-calibration measuring device and method for bridge deformation or displacement parameter
CN109632103B (en) High-altitude building temperature distribution and surface crack remote monitoring system and monitoring method
CN106197288B (en) Self-calibration measuring device and method for large structure vertical displacement or deformation
CN106197287B (en) Self-calibration measuring device and method for large scale structure composition deformation or displacement parameter
CN102853916B (en) Method and system for conducting remote infrared temperature measurement on coal pile surfaces
CN102183237A (en) Device and method for measuring two-waveband cloud height of foundation
CN113793367B (en) Visual measurement and power identification system and method for engineering structure corner displacement
CN206223097U (en) For the vertical displacement of large structure body and the measurement apparatus of ground settlement
CN109269525B (en) Optical measurement system and method for take-off or landing process of space probe
CN107607091A (en) A kind of method for measuring unmanned plane during flying flight path
CN109631771A (en) Crack dynamic change sensor and crack measurement method based on the processing of more target images
CN211453394U (en) Non-contact detection system for quality of building outer wall surface
CN111122053B (en) Device and method for detecting early unstable leakage of small reservoir dam body
CN111707374B (en) Distance estimation method and system for human body infrared thermometer
CN115797998A (en) Dual-waveband intelligent temperature measuring device and intelligent temperature measuring method based on image fusion
CN106303412B (en) Refuse dump based on monitoring image is displaced remote real time monitoring method
CN108163223B (en) Portable aircraft infrared stealth performance evaluation device and method
Liu et al. Multi-instrument characterization of optical turbulence at the Ali observatory
CN109685362B (en) Building legacy protection evaluation system and method based on intelligent network
CN114235157A (en) Thermal infrared imager with TOF sensor
CN207097004U (en) A kind of infrared fire-fighting survey meter of characteristics of human body
CN113405666A (en) Human body temperature difference detection method and device based on infrared thermal imaging for face recognition

Legal Events

Date Code Title Description
GR01 Patent grant