CN112419287A - Building deflection determination method and device and electronic equipment


Info

Publication number
CN112419287A
CN112419287A
Authority
CN
China
Prior art keywords
image
matching
detected
determining
target
Prior art date
Legal status
Pending
Application number
CN202011364789.0A
Other languages
Chinese (zh)
Inventor
王一妍 (Wang Yiyan)
赵文一 (Zhao Wenyi)
江子君 (Jiang Zijun)
Current Assignee
Hangzhou Ruhr Technology Co Ltd
Original Assignee
Hangzhou Ruhr Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Ruhr Technology Co Ltd filed Critical Hangzhou Ruhr Technology Co Ltd
Priority to CN202011364789.0A
Publication of CN112419287A
Legal status: Pending

Classifications

    • G06T 7/0004: Image analysis; industrial image inspection
    • G01B 11/16: Measuring deformation in a solid by optical techniques, e.g. optical strain gauge
    • G06T 7/13: Segmentation; edge detection
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06V 10/751: Image or video pattern matching; comparing pixel values or feature values having positional relevance, e.g. template matching
    • G06T 2207/10004: Image acquisition modality; still image; photographic image
    • G06T 2207/20016: Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; pyramid transform
    • G06T 2207/30108: Industrial image inspection
    • G06T 2207/30132: Masonry; concrete

Abstract

The invention discloses a building deflection determination method and device and electronic equipment. The method comprises the following steps: acquiring a reference image and at least one image to be measured of a target building; performing correlation matching on each image to be measured against a template image in the reference image, and determining a target matching area corresponding to each image to be measured; and determining the deflection of the target building based on the position change between the template image and the target matching area corresponding to each image to be measured. The deflection of the target building is thus determined on the basis of image matching, which improves both the speed and the accuracy of building deflection determination.

Description

Building deflection determination method and device and electronic equipment
Technical Field
Embodiments of the invention relate to the technical field of building deformation testing, and in particular to a method and an apparatus for determining building deflection, and to electronic equipment.
Background
Buildings such as bridges, house beams, and high-rise structures bear directly on the safety of people's lives and property. Deflection is the amount by which a building or one of its elements bends in the horizontal or vertical direction; for example, the middle of a bridge bends downwards, and a high-rise building bends laterally. Deflection is a key index for evaluating building safety and an important component of building inspection: it directly reflects whether the deformation of the building structure exceeds the safe allowable range, and it provides effective parameters for damage identification and health monitoring of the building structure.
As an example, the following methods are mainly used to measure bridge deflection. The leveling-instrument method uses a leveling rod and a level to measure the vertical height difference between two points on the bridge deck and calculates the deflection from it; it imposes strict requirements on the measuring environment, covers only a small deflection range, and cannot provide many measuring points. The dial-indicator method amplifies the displacement transmitted by a measuring rod through a gear mechanism and displays the reading on a circular dial; it can only be used under the bridge or on a pier where a support can be erected, and because a support must be erected, its working efficiency is low, it is inconvenient to use, and its field application is very limited. The communicating-pipe method exploits the principle of communicating vessels (the pressure at points on the same horizontal plane is equal); the apparatus usually consists of a flexible plastic pipe about 10 mm in diameter joined by several three-way connectors, placed on the bridge and read manually with an ordinary millimeter scale, so its precision is low. The laser-interferometer method mounts a prism or reflector at the point to be measured and uses a laser interferometer to measure the change in distance to the observation point; once the vibration amplitude of the bridge exceeds a certain range, the light spot can no longer be captured and the measurement fails. Finally, digital-image-processing methods paste or fix specially shaped targets on the bridge and compute the actual displacement of the bridge from the displacement of the target marks between two images taken before and after the displacement, followed by spatial coordinate conversion; however, the targets must be installed on the sides of the bridge or on its box girders, which requires government permission, presents a certain technical difficulty, and does not meet the requirement of operational portability.
Disclosure of Invention
The invention provides a building deflection determination method, a building deflection determination apparatus, and electronic equipment, which determine the position change of a target building based on image matching and determine the deflection of the target building from that position change. This improves the speed and accuracy of building deflection measurement, while also suiting low-contrast buildings and harsh measurement environments.
In a first aspect, an embodiment of the present invention provides a method for determining building deflection, including:
acquiring a reference image and at least one image to be detected of a target building;
performing correlation matching on the images to be detected according to template images in the reference images, and determining a target matching area corresponding to each image to be detected;
and determining the deflection of the target building based on the position change of the target matching area corresponding to each image to be detected and the template image.
In a second aspect, an embodiment of the present invention further provides a building deflection determining apparatus, including:
the image acquisition module is used for acquiring a reference image and at least one image to be detected of a target building;
the matching module is used for performing correlation matching on the images to be detected according to the template images in the reference images and determining a target matching area corresponding to each image to be detected;
and the deflection determining module is used for determining the deflection of the target building based on the position changes of the target matching area corresponding to each image to be detected and the template image.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement a method of building deflection determination as provided by embodiments of the present invention.
In a fourth aspect, embodiments of the present invention further provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements a method for determining building deflection according to embodiments of the present invention.
The embodiment of the invention has the following advantages or beneficial effects:
A reference image and at least one image to be measured of a target building are acquired; correlation matching is performed on each image to be measured against a template image in the reference image, and a target matching area corresponding to each image to be measured is determined; the deflection of the target building is then determined from the position change between the template image and each target matching area. Deflection is thus determined on the basis of image matching, which improves both the speed and the accuracy of building deflection determination.
Drawings
In order to illustrate the technical solutions of the exemplary embodiments of the present invention more clearly, the drawings used in describing the embodiments are briefly introduced below. Clearly, the described drawings show only some of the embodiments of the invention, not all of them, and a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of a method for determining building deflection according to an embodiment of the present invention;
fig. 2 is a schematic diagram of selecting a template image according to an embodiment of the present invention;
fig. 3 is an edge image of a pyramid structure according to an embodiment of the present invention;
fig. 4 is a schematic flow chart of a method for determining deflection of a building according to a second embodiment of the present invention;
fig. 5 is a simplified diagram of a building photography according to a second embodiment of the present invention;
FIG. 6 is a simplified vertical imaging diagram according to a second embodiment of the present invention;
fig. 7 is a schematic structural diagram of a building deflection determining apparatus according to a third embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flow chart of a method for determining the deflection of a building according to an embodiment of the present invention. The method is applicable to situations where the deflection of a target building needs to be determined from images of the target building captured at various points in time. It may be executed by a building deflection determining apparatus, which may be implemented in hardware and/or software, and it specifically comprises the following steps:
and S110, acquiring a reference image and at least one image to be measured of the target building.
The target building is a building whose deflection needs to be determined in order to judge its degree of structural deformation, such as a bridge, a railway, a wall, or a house girder; in this embodiment, the target building may be a bridge. The reference image and the at least one image to be detected all come from the same image frame sequence, i.e. the same captured video; the reference image is used to map part of its content onto each image to be detected, so that this partial content corresponds one-to-one with the points at the same spatial positions in each image to be detected. The first frame of the captured video is taken as the reference image, and each frame after the first is taken as an image to be detected. It will be understood that multiple videos can be captured continuously for the target building, yielding multiple reference images and their corresponding images to be detected. The deflection corresponding to each frame of the captured videos is thereby obtained, a deflection-versus-time curve is formed from the temporal order of the frames, and the deflection change of the target building is monitored in real time.
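The last step above, assembling per-frame results into a deflection-versus-time curve, can be sketched as follows. This is an illustrative assumption, not the patent's code: the function name, the millimeter-per-pixel scale factor, and the sample displacement values are all made up, and the 38 fps frame rate simply reuses the camera value quoted in this embodiment.

```python
# Illustrative sketch (not the patent's implementation): convert per-frame
# pixel displacements of the target matching area into a deflection time
# series. scale_mm_per_px and the sample displacements are made-up values.
def deflection_time_series(displacements_px, scale_mm_per_px, fps):
    """Return (time_s, deflection_mm) pairs, one per measured frame."""
    return [((i + 1) / fps, d * scale_mm_per_px)
            for i, d in enumerate(displacements_px)]

# One reference frame followed by three measured frames at 38 fps.
series = deflection_time_series([0.0, 1.2, 2.5], scale_mm_per_px=0.8, fps=38.0)
```

The pixel-to-millimeter scale itself depends on the imaging geometry, which the second embodiment addresses with its photography diagrams.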
In this embodiment, the video to be detected is captured by an image acquisition system comprising a camera, an inclinometer, and a tripod. Given the long shooting distance (over 300 meters), the high precision requirement (millimeter level), the requirement that the camera frame rate exceed the vibration frequency of the bridge under passing vehicles, and the demands on video transmission distance and speed, an industrial camera with high resolution, small pixel size, high frame rate, and long transmission distance can be selected (for example, the Genie Nano M2020 industrial camera of Zhiqian Vision science and technology Limited, with a resolution of 2064×1544, a pixel size of 3.45 μm, a frame rate of 38 fps, and a transmission distance of up to 100 meters). For video acquisition of the target building, the SV-5014H near-infrared zoom lens is selected in view of cost, size, ease of installation, integration, and protection; compared with a common CCTV lens it offers high resolution and high contrast, and it can be applied to bridge deflection monitoring at observation distances from tens of meters up to 500 meters.
In general, an optical image capturing device cannot measure the angle between the plane of the camera and the horizontal plane, so a biaxial inclinometer must be added to obtain the inclination of the camera plane. The choice of inclinometer mainly considers accuracy, single-axis versus two-axis measurement, durability, and sampling frequency. A two-axis dynamic inclinometer (RST460) with an accuracy of 0.005°, an effective range of ±15°, and a sampling frequency of 20 Hz can therefore be used. In addition, to keep the inclinometer and the camera on the same horizontal plane, a flat plate of aviation aluminum can be added; the plate is 8 mm thick and does not bend noticeably under the load of the lens and other equipment.
To facilitate mounting the camera and lens and to keep the observation height, observation angle, and erection point stable, a Mi Pong MTT609A heavy tripod with a hydraulic damping head can be used to fix the lens.
The image acquisition system is in communication connection or electric connection with the electronic equipment executing the method, and transmits the acquired video to be detected.
And S120, performing correlation matching on the images to be detected according to the template images in the reference images, and determining target matching areas corresponding to the images to be detected.
The template image is a partial area of the reference image used for matching, one by one, against partial content of each image to be detected. As shown in fig. 2, region 1 may serve as the template image and region 2 as an alternative template image; regions 3 and 4 contain few texture features and do not lie within the bridge itself, so they cannot serve as template images.
The target matching area is the area of the image to be detected with the highest information correlation with the template image. Correlation matching determines the region of the image to be detected that corresponds to the template image according to the information correlation between the template image and each region of the image to be detected, where the information correlation may concern the gradients, gray values, or gray-level trends of the pixels. Specifically, regions of the same size as the template image are selected from the image to be detected with a sliding window whose size equals that of the template image; the window may move left to right and then top to bottom, or top to bottom and then left to right, which the present application does not limit. The information correlation between each region and the template image is computed, the correlations of the regions are compared, and the region with the highest correlation is taken as the target matching area.
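The sliding-window search just described can be sketched as a brute-force scan. This is an illustrative version under stated assumptions, not the patent's code: the similarity measure here is a simple negative sum of squared differences, standing in for the correlation measures the embodiment develops below, and all names are made up.

```python
# Illustrative brute-force sliding-window search (not the patent's code):
# scan every template-sized window left-to-right, top-to-bottom and keep
# the best-scoring one. score_fn is any similarity measure (higher = better).
import numpy as np

def best_match(image, template, score_fn):
    H, W = image.shape
    h, w = template.shape
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(H - h + 1):          # top to bottom
        for c in range(W - w + 1):      # left to right
            s = score_fn(image[r:r + h, c:c + w], template)
            if s > best_score:
                best_score, best_rc = s, (r, c)
    return best_rc

# Negative sum of squared differences as a stand-in similarity measure.
def neg_ssd(window, template):
    return -float(np.sum((window.astype(float) - template.astype(float)) ** 2))
```

In practice the per-window score would be one of the correlation measures described next, and the scan would be accelerated (e.g. by the pyramid scheme of this embodiment) rather than exhaustive.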
Optionally, performing correlation matching on the images to be detected according to the template images in the reference image, and determining a target matching area corresponding to each image to be detected, including: and performing edge correlation matching and/or gray scale correlation matching on the image to be detected based on the template image in the reference image, and determining a target matching area corresponding to each image to be detected based on an edge matching result and/or a gray scale matching result.
Edge correlation matching extracts the edge information of the template image and determines the correlation between this edge information and the image to be detected. Gray-scale correlation matching determines the correlation of the pixel gray values between the template image and the image to be detected. The edge features of the template image may be extracted in advance: when they satisfy a preset condition, edge correlation matching is performed; when they do not, gray-scale correlation matching is performed. In some embodiments, edge correlation matching and gray-scale correlation matching may be performed simultaneously on the template image and the image to be detected, and the edge matching result and the gray-scale matching result are combined to determine the target matching area, improving its accuracy. In this embodiment, performing edge correlation matching and/or gray-scale correlation matching on the image to be detected based on the template image improves the precision of the target matching area of each image to be detected, and thus the precision of the determined deflection of the target building.
Optionally, performing gray-scale correlation matching on the image to be detected based on the template image in the reference image includes: performing correlation matching between the information of each pixel in the template image and the pixel information of each region of the image to be detected, determining the matching degree of each region, and taking the region with the highest matching degree as the gray-scale matching result.
Each region of the same size as the template image is selected from the image to be detected at a preset step, and the matching degree of each region is determined from the correlation between the pixel information of the template image and that of the region, which satisfies the following formula:

$$\mathrm{ncc}(r,c)=\frac{1}{n}\sum_{(u,v)}\frac{\bigl(t(u,v)-m_t\bigr)\bigl(i(r+u,\,c+v)-m_i(r,c)\bigr)}{s_t\,s_i(r,c)}$$

where $\mathrm{ncc}(r,c)$ is the matching degree between the template image and the region of the image to be detected, of the same size as the template image, centred at $(r,c)$; $r$ and $c$ are the horizontal and vertical coordinates of pixels in the image to be detected; $u$ and $v$ are the horizontal and vertical coordinates of pixels in the template image; $t(u,v)$ is the gray value of the pixel with coordinates $(u,v)$ in the template image; $i(r+u,c+v)$ is the gray value of the pixel with coordinates $(r+u,c+v)$ in the image to be detected; $m_t$ and $s_t^2$ are the gray mean and gray variance of the template image; $m_i(r,c)$ and $s_i^2(r,c)$ are the gray mean and gray variance of the region; and $n$ is the number of pixels in the template image. The quantities $m_t$, $s_t^2$, $m_i(r,c)$, and $s_i^2(r,c)$ satisfy the following formulas:

$$m_t=\frac{1}{n}\sum_{(u,v)}t(u,v),\qquad s_t^2=\frac{1}{n}\sum_{(u,v)}\bigl(t(u,v)-m_t\bigr)^2,$$

$$m_i(r,c)=\frac{1}{n}\sum_{(u,v)}i(r+u,\,c+v),\qquad s_i^2(r,c)=\frac{1}{n}\sum_{(u,v)}\bigl(i(r+u,\,c+v)-m_i(r,c)\bigr)^2.$$

Since $\mathrm{ncc}(r,c)$ represents the matching degree between the template image and the region centred at $(r,c)$, a matching degree between $-1$ and $1$ is obtained for each region of the image to be detected. The larger the absolute value of $\mathrm{ncc}(r,c)$, the more similar the region is to the template image; an absolute value of $1$ means the region equals the template image after a linear gray-value transformation. The region with the largest absolute matching degree is therefore taken as the gray-scale matching result. In this embodiment, correlation matching between the pixel information of each region of the image to be detected and that of the template image determines the matching degree of each region, so that the region with the highest matching degree becomes the gray-scale matching result; the target matching area is thus obtained accurately from image gray values, making the method suitable for deflection measurement of target buildings with no texture features or with unclear texture features.
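A minimal NumPy sketch of the normalized cross-correlation above follows; it is an illustration under stated assumptions, not the patent's code. One deliberate deviation: the compared region is indexed by its top-left corner rather than its centre, which only shifts the coordinate convention.

```python
# Assumed sketch of the normalized cross-correlation matching degree.
# (r, c) here is the TOP-LEFT corner of the compared region, not its centre.
import numpy as np

def ncc(image, template, r, c):
    """Matching degree in [-1, 1] between the template and the
    template-sized region of `image` starting at (r, c)."""
    t = template.astype(float)
    h, w = t.shape
    win = image[r:r + h, c:c + w].astype(float)
    n = t.size
    mt, mi = t.mean(), win.mean()                    # gray means
    st = np.sqrt(((t - mt) ** 2).sum() / n)          # template std dev
    si = np.sqrt(((win - mi) ** 2).sum() / n)        # region std dev
    if st == 0 or si == 0:
        return 0.0  # flat patch: correlation undefined, treat as no match
    return float(((t - mt) * (win - mi)).sum() / (n * st * si))
```

Because both patches are centred by their means and divided by their standard deviations, the score is invariant to linear gray-value changes, matching the observation above that an absolute value of 1 indicates equality up to a linear transformation.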
Optionally, performing edge correlation matching on the image to be detected based on the template image in the reference image, including: carrying out downsampling processing on the template image to obtain a template image set with a pyramid structure; performing edge extraction on each image in the template image set to obtain edge images with pyramid structures; and performing correlation matching on the edge image and the image to be detected layer by layer based on the pyramid structure.
Edge extraction extracts information about the areas of the template image where the gray level changes discontinuously along the normal direction; the extracted edges comprise the edge amplitude and the gradient direction. It can be realized with operators such as Roberts, Sobel, Prewitt, Canny, or Kirsch. Preferably, the Canny operator is used to extract the edges of each image in the template image set, suppressing noise in the template image while identifying the actual edges of each image as far as possible, which improves the precision of the target matching area.
The down-sampling process reduces the resolution of the image and forms edge images at different scales. Specifically, if the image size is M×N, 2-fold down-sampling yields an image of size (M/2)×(N/2), for example by deleting the pixels in the even rows and even columns; alternatively, all the pixels of each 2×2 region of the image may be converted into one pixel whose value is the mean of the pixels in that region. By iteratively down-sampling the template image, a template image set of gradually decreasing resolution is obtained, and hence the edge images of a pyramid structure of gradually decreasing resolution. The bottom layer of the pyramid is the edge image extracted from the original, un-downsampled template image, and each higher layer is obtained by down-sampling the layer below it. Illustratively, as shown in fig. 3, with the number of pyramid layers set to 3, three edge images are generated, with sizes equal to the original template image, 1/4 of it, and 1/16 of it, respectively. Down-sampling the template image according to the set number of pyramid layers thus generates edge images of gradually decreasing resolution, reducing the computation in the subsequent matching of the edge images against the image to be detected and speeding up matching.
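The pyramid construction described above can be sketched as follows. This is an assumed stand-in, not the patent's code: 2×2 block averaging implements the down-sampling variant mentioned in the text, while a central-difference gradient magnitude replaces the Canny operator for brevity.

```python
# Assumed sketch of building edge images of a pyramid structure.
import numpy as np

def downsample2(img):
    """2x down-sample by averaging each 2x2 block, as described above."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w].astype(float)
    return (img[0::2, 0::2] + img[0::2, 1::2] +
            img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def edge_pyramid(template, levels=3):
    """Level 0 is the full-resolution template's edge image; each higher
    level is computed from a further 2x down-sampled template. A
    gradient-magnitude map stands in for the Canny operator here."""
    pyr = []
    img = template.astype(float)
    for _ in range(levels):
        gy, gx = np.gradient(img)        # central-difference gradients
        pyr.append(np.hypot(gx, gy))     # edge amplitude per pixel
        img = downsample2(img)
    return pyr
```

With levels=3 and a 16×16 template this yields edge images of sizes 16×16, 8×8, and 4×4, mirroring the 1, 1/4, 1/16 size ratios of fig. 3.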
The specific process of matching the edge image against the image to be detected layer by layer is as follows: the search range is enlarged step by step from a corner of the image to be detected, i.e. the number of pixels participating in similarity matching is increased gradually to obtain each search range, and the similarity of each search range of the image to be detected is obtained from all the pixel information within that range and the corresponding pixel information of the edge image.
Optionally, performing correlation matching between the edge images and the image to be detected layer by layer based on the pyramid structure includes: matching the edge images against each image to be detected layer by layer from the top layer to the bottom layer of the pyramid, and stopping the correlation matching of the next layer if the correlation matching result of the current layer's edge image against the image to be detected is not empty.
When the edge images are matched against the image to be detected layer by layer from the top layer to the bottom layer, the similarity between the current layer's edge image and each search range of the image to be detected is computed. If the similarity of every search range of the image to be detected is smaller than a preset threshold, the correlation matching of the current layer fails, its result is empty, and the correlation matching continues with the next layer. If the similarities of the search ranges are not all smaller than the preset threshold, the correlation matching of the current layer succeeds and its result is not empty. By matching layer by layer, from low-resolution to high-resolution edge images, the matching time of the correlation matching is reduced while its matching precision is improved.
Optionally, matching the edge image against the image to be detected layer by layer includes: determining the gradient information of each point of the edge image in two mutually perpendicular directions; determining the matching degree of each region of the image to be detected from this gradient information; and, when the matching degree satisfies a preset judgment condition, determining the edge matching result from the corresponding region of the image to be detected and stopping the correlation matching of that image.
The gradient information in mutually perpendicular directions may be the gradients in the x and y directions. The matching degree of each region of the image to be detected is determined from the gradient information of each point of the edge image, satisfying the following formula:

$$S_{u,v}=\frac{1}{n}\sum_{i=1}^{n}\frac{d_i^{x}\,e^{x}_{u+x_i,\,v+y_i}+d_i^{y}\,e^{y}_{u+x_i,\,v+y_i}}{\lVert d_i\rVert\,\lVert e_{u+x_i,\,v+y_i}\rVert}$$

where $S_{u,v}$ is the matching degree of the corresponding region of the image to be detected; $n$ is the number of pixel points in the edge image; $x_i$ and $y_i$ are the horizontal and vertical coordinates of the $i$-th pixel point of the edge image; $u$ and $v$ give the position of the region in the horizontal and vertical directions of the image to be detected; $d_i^{x}$ and $d_i^{y}$ are the gradients of the $i$-th pixel point of the edge image in the x and y directions; and $e^{x}_{u+x_i,v+y_i}$ and $e^{y}_{u+x_i,v+y_i}$ are the gradients in the x and y directions at the point with coordinates $(u+x_i, v+y_i)$ of the image to be detected.
In the above formula, the matching degree between each region and the edge image can be obtained for different numbers of pixels in the horizontal and vertical directions of each region in the image to be detected. The regions of the image to be detected are obtained by gradually enlarging the matching range. In order to accelerate the matching speed with the template image, the matching degree against the gradient information of each point in the edge image is preferentially determined starting from the region with a small matching range in the image to be detected; when the matching degree of the current matching region of the image to be detected meets a preset judgment condition, matching degree calculation is not carried out for the pixel points outside the current matching region. This reduces the number of pixel points participating in the calculation of the matching degree, thereby improving the matching speed of correlation matching.
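A sketch of a gradient-based matching degree of this kind (the normalized dot-product form of each term is an assumption; gradients are passed in precomputed rather than derived from raw pixels):

```python
import math

def matching_degree(edge_grads, img_grad_x, img_grad_y, u, v):
    """Matching degree S_{u,v} of the region at offset (u, v).

    edge_grads: list of (x_i, y_i, gx_i, gy_i) for each edge-image point.
    img_grad_x / img_grad_y: 2-D lists of gradients of the image to be
    detected, indexed [row][col] = [y][x].
    Each term is the normalized dot product of the two gradient vectors,
    so S lies in [-1, 1]; 1 means identical gradient directions.
    """
    n = len(edge_grads)
    s = 0.0
    for (xi, yi, gx, gy) in edge_grads:
        tx = img_grad_x[v + yi][u + xi]
        ty = img_grad_y[v + yi][u + xi]
        norm = math.hypot(gx, gy) * math.hypot(tx, ty)
        if norm > 0:  # skip points where either gradient vanishes
            s += (gx * tx + gy * ty) / norm
    return s / n
```

With gradients that agree exactly at both edge points, the score reaches its maximum of 1.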
For example, the preset judgment condition may be $S_{m}>S_{\min}$, wherein $S_{m}$ is the matching degree of a certain region containing $m$ pixel points on the image to be detected with the edge image, and $S_{\min}$ is the minimum matching score, which may be:

$$S_{\min}=s\left(1-g+g\,\frac{m}{n}\right)$$

wherein $s$ is the user-set matching degree threshold, $g$ is the greediness degree, taking a value from 0 to 1, which can be set according to the selection of the user, and $n$ is the number of pixel points of the edge image. Specifically, when $g$ is 0, $S_{\min}=s$; that is, the preset judgment condition can be regarded as a safety standard, namely all pixel points in the image to be detected are substituted into the matching degree calculation. If the value of $g$ is 1, $S_{\min}=s\,m/n$; that is, the preset judgment condition can be regarded as a strict standard, and as long as the matching degree of $m$ points on the image to be detected meets the preset judgment condition, the correlation matching of the image to be detected is stopped. When the matching degree of the current matching region of the image to be detected meets the preset judgment condition, matching degree calculation is no longer carried out for the pixel points outside the current matching region, so the number of the pixel points participating in the calculation of the matching degree is reduced, and the matching speed of correlation matching is improved.
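The greediness-controlled early stop can be sketched as follows (the interpolation S_min = s·(1 − g + g·m/n) between the safe and strict standards is an assumption consistent with the two limiting cases described above, not the patent's exact formula):

```python
def s_min(s_threshold, g, m, n):
    """Minimum partial matching score after m of n points.

    g = 0 -> s_min = s_threshold          (safety standard: effectively all
                                           n points enter the calculation)
    g = 1 -> s_min = s_threshold * m / n  (strict standard: m points whose
                                           average meets the rate suffice)
    """
    return s_threshold * (1 - g + g * m / n)

def match_with_early_stop(term_scores, s_threshold, g):
    """Accumulate per-point score terms (each at most 1/n after division),
    stopping once the partial sum exceeds the greediness-dependent minimum
    score; returns (score, number_of_points_used)."""
    n = len(term_scores)
    s_m = 0.0
    for m, t in enumerate(term_scores, start=1):
        s_m += t / n
        if s_m > s_min(s_threshold, g, m, n):
            return s_m, m  # stop: remaining points are skipped
    return s_m, n
```

With a perfect match and threshold 0.8, the strict standard (g = 1) stops after the first point, while the safety standard (g = 0) keeps computing until the partial sum itself exceeds the threshold.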
And S130, determining the deflection of the target building based on the position change of the target matching area corresponding to each image to be detected and the template image.
The position of the target matching region can be represented by the coordinates of a certain pixel point of the target matching region in the image to be detected, and the position of the template image can be represented by the coordinates of that pixel point in the reference image; the pixel point can be the central point of the target matching region or a vertex of the target matching region, which is not limited in the present application. The position change of the pixel point is determined by comparing its coordinates in the image to be detected with its coordinates in the reference image, and the position change of the image to be detected relative to the template image is determined based on the position change of the pixel point. Specifically, after the position change of each image to be detected relative to the template image is obtained, the position change is mapped to the actual displacement of the target building. For example, if the reference image and the image to be detected are both parallel to the bridge floor, the actual displacement amount is equal to (position change / shooting focal length) × shooting distance. The deflection of the target building is obtained according to the actual displacements mapped from the position changes of each target matching region and the template image; the deflection of the target building can be represented by the series of actual displacements, by the average value of the series of actual displacements, or by the maximum displacement in the series.
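The mapping from pixel position change to actual displacement, and the three ways of expressing deflection mentioned above, can be sketched as follows (camera parallel to the bridge floor assumed; pixel change and focal length in the same pixel units):

```python
def actual_displacement(position_change, focal_length, shooting_distance):
    """Actual displacement = position change / shooting focal length x
    shooting distance (valid when the reference image and the image to be
    detected are parallel to the bridge floor)."""
    return position_change / focal_length * shooting_distance

def deflection(displacements, mode="series"):
    """Deflection expressed as the displacement series itself, its average,
    or the maximum (by magnitude) displacement in the series."""
    if mode == "series":
        return list(displacements)
    if mode == "mean":
        return sum(displacements) / len(displacements)
    if mode == "max":
        return max(displacements, key=abs)
    raise ValueError(mode)
```

For example, a 2-pixel shift with a 1000-pixel focal length at 50 m distance maps to a 0.1 m displacement.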
According to the technical scheme of this embodiment, a reference image and at least one image to be detected of a target building are obtained; correlation matching is performed on the images to be detected according to the template image in the reference image, a target matching region corresponding to each image to be detected is determined, and the deflection of the target building is determined based on the position change between the target matching region corresponding to each image to be detected and the template image. The deflection of the target building is thus determined based on image matching, which improves both the speed and the accuracy of determining the building deflection, and is suitable for measurement scenes with harsh environments.
Example two
Fig. 4 is a schematic flow chart of a building deflection determining method according to a second embodiment of the present invention, and in this embodiment, based on the above embodiments, further optimization is performed on "determining deflection of a target building based on a change in position of the template image and a target matching region corresponding to each image to be measured". Wherein explanations of the same or corresponding terms as those of the above embodiments are omitted. Referring to fig. 4, the method for determining the building deflection provided by the embodiment includes:
s410, acquiring a reference image and at least one image to be measured of the target building.
And S420, performing correlation matching on the images to be detected according to the template images in the reference images, and determining target matching areas corresponding to the images to be detected.
And S430, determining the position coordinates of each target matching area.
The position coordinates refer to the position coordinates of each target matching region in each image to be detected; the position coordinates can be the coordinates of the central point of the target matching region in the image to be detected, the coordinates of a vertex in the image to be detected, or the coordinates of the pixel point with the highest matching degree in the image to be detected.
Optionally, determining the position coordinates of each target matching area includes: determining a first coordinate of a target matching area, wherein the first coordinate is a coordinate of a center point of the target matching area or a coordinate of a position point with the highest matching degree in the target matching area; and determining sub-pixel level coordinates determined based on the first coordinates and pixel values of adjacent points of the first coordinates as position coordinates of the target matching region.
Determining the sub-pixel level coordinate based on the first coordinate and the pixel values of the adjacent points of the first coordinate satisfies the following formula:

$$f(x)=x-\frac{b}{2a}$$

wherein the first coordinate is $i(x,y)$, $x$ is the abscissa of the first coordinate $i(x,y)$, $f(x)$ is the abscissa of the sub-pixel level coordinate, and $a$, $b$ and $c$ are all intermediate coefficients of the parabola fitted through the first coordinate and its left and right adjacent pixel points, which can be obtained by the following formulas:

$$a=\frac{g_{i-1}-2g+g_{i+1}}{2}$$

$$b=\frac{g_{i+1}-g_{i-1}}{2}$$

$$c=g$$

wherein the left and right pixel points of the first coordinate are respectively represented by $i-1$ and $i+1$, $x_{i-1}$ and $x_{i+1}$ are the abscissas of the pixel points $i-1$ and $i+1$ (at unit pixel spacing, $x_{i\pm1}=x\pm1$), $g_{i-1}$ is the gray value of the pixel point $i-1$, $g_{i+1}$ is the gray value of the pixel point $i+1$, and $g$ is the gray value of the pixel point of the first coordinate.
It is understood that, through the above formulas, the ordinate f (y) of the sub-pixel level coordinate can be determined, so as to obtain the complete sub-pixel level coordinate. In this embodiment, the sub-pixel-level coordinates of the target matching region are obtained, and the sub-pixel-level coordinates are determined as the position coordinates of the target matching region, so that the precision of the position coordinates of the target matching region is improved, the precision of the position change between the target matching region and the template image is improved, and the accuracy of the deflection of the target building is improved.
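A sketch of the three-point parabolic sub-pixel refinement along one axis (unit pixel spacing assumed; applying the same function to the vertical neighbors yields the ordinate f(y)):

```python
def subpixel_peak(x, g_prev, g, g_next):
    """Fit a parabola through (x-1, g_prev), (x, g), (x+1, g_next) and
    return the abscissa of its vertex: f(x) = x - b / (2a)."""
    a = (g_prev - 2.0 * g + g_next) / 2.0
    b = (g_next - g_prev) / 2.0
    if a == 0:  # three collinear samples: no refinement possible
        return float(x)
    return x - b / (2.0 * a)
```

For samples drawn from a parabola peaking at 5.25, the function recovers the peak exactly, illustrating how sub-pixel precision sharpens the position coordinates of the target matching region.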
And S440, determining real coordinates corresponding to the images to be detected and real coordinates of the template images based on the shooting angle, the shooting distance, the position coordinates of each target matching area and the position coordinates of the template images.
The shooting distance refers to the distance from the camera to the target building when the reference image and each image to be detected are collected, and the shooting angle refers to the included angle between the camera and the horizontal plane where the target building is located; as shown in fig. 5, θ in the drawing is the included angle between the shooting direction of the camera and the horizontal plane of the bridge. The real coordinates corresponding to each image to be detected refer to the actual coordinates in the target building corresponding to the position coordinates of the target matching region, and the real coordinates of the template image refer to the actual coordinates in the target building corresponding to the position coordinates of the template image. Based on the shooting angle, the shooting distance, the position coordinates of each target matching region and the position coordinates of the template image, the real coordinates corresponding to each image to be detected and the template image can be uniquely mapped. For example, assuming that the target building in this embodiment is a bridge, the deflection is mainly reflected as a displacement in the vertical direction, so only the ordinate of the real coordinate needs to be calculated, which specifically satisfies the following formula:
$$x_{A}=\frac{\hat{y}\,D}{f\cos\theta}$$

wherein $x_{A}$ is the vertical coordinate of the real coordinate corresponding to each image to be detected, $\hat{y}$ is the ordinate of the position coordinate of each target matching region, $f$ is the focal length of the camera for collecting the reference image and each image to be detected, $\theta$ is the shooting angle, and $D$ is the shooting distance. The vertical coordinate of the real coordinate of the template image can be calculated by the same formula. As shown in fig. 6, fig. 6 is a simplified image of the bridge in the vertical direction, and AB represents the displacement of the bridge in the vertical direction, namely the bridge deflection.
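A minimal sketch of this vertical-coordinate mapping (the 1/cos θ correction for a tilted camera is an assumption reconstructed from the described geometry; at θ = 0 it reduces to position change / focal length × shooting distance):

```python
import math

def real_ordinate(pixel_ordinate, focal_length, theta, distance):
    """Vertical real coordinate x_A = (pixel ordinate * D) / (f * cos(theta)).

    pixel_ordinate: ordinate of the position coordinate (pixels).
    focal_length:   camera focal length in the same pixel units.
    theta:          shooting angle between camera axis and horizontal (rad).
    distance:       shooting distance D from camera to target building.
    """
    return pixel_ordinate * distance / (focal_length * math.cos(theta))
```

Subtracting the template image's real ordinate computed the same way from each image-to-be-detected ordinate gives the displacement AB, i.e. the bridge deflection.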
S450, determining the deflection of the target building based on the real coordinates corresponding to the images to be detected and the real coordinates of the template images.
The real coordinates corresponding to each image to be detected are compared with the real coordinates of the template image to obtain a plurality of displacement amounts of the real coordinates corresponding to each image to be detected relative to the real coordinates of the template image, so that the deflection of the target building is determined according to the plurality of displacement amounts.
According to the technical scheme of this embodiment, the position coordinates of each target matching region are determined; the real coordinates corresponding to each image to be detected and the real coordinates of the template image are determined based on the shooting angle, the shooting distance, the position coordinates of each target matching region and the position coordinates of the template image; and the deflection of the target building is determined based on the real coordinates corresponding to each image to be detected and the real coordinates of the template image. The deflection of the building is thus determined according to the matched coordinates of image matching, which is suitable for measurement in harsh environments and improves the convenience of deflection measurement.
EXAMPLE III
Fig. 7 is a schematic structural diagram of a building deflection determining apparatus according to a third embodiment of the present invention, which is applicable to a situation where deflection of a target building needs to be determined according to a captured image of each time point of the target building, and the apparatus specifically includes: an image acquisition module 710, a matching module 720, and a deflection determination module 730.
An image obtaining module 710, configured to obtain a reference image and at least one image to be detected of a target building;
the matching module 720 is used for performing correlation matching on the images to be detected according to the template images in the reference images and determining target matching areas corresponding to the images to be detected;
and a deflection determining module 730, configured to determine the deflection of the target building based on the position changes of the target matching region corresponding to each image to be detected and the template image.
In this embodiment, a reference image and at least one image to be detected of a target building are acquired through the image acquisition module; correlation matching is performed on the images to be detected through the matching module according to the template image in the reference image, and a target matching region corresponding to each image to be detected is determined; the deflection determining module then determines the deflection of the target building based on the position change between the target matching region corresponding to each image to be detected and the template image. The deflection of the target building is thus determined based on image matching, which improves both the speed and the accuracy of determining the building deflection, and is suitable for measurement scenes with harsh environments.
On the basis of the foregoing apparatus, optionally, the matching module 720 includes:
the matching sub-module is used for performing edge correlation matching and/or gray scale correlation matching on the image to be detected based on the template image in the reference image;
and the matching determination submodule is used for determining a target matching area corresponding to each image to be detected based on the edge matching result and/or the gray matching result.
Optionally, the matching sub-module includes:
the edge matching unit is used for carrying out edge correlation matching on the image to be detected based on the template image in the reference image and comprises the following steps: carrying out downsampling processing on the template image to obtain a template image set with a pyramid structure; performing edge extraction on each image in the template image set to obtain edge images with pyramid structures; and performing correlation matching on the edge image and the image to be detected layer by layer based on the pyramid structure.
The gray matching unit is used for carrying out gray correlation matching on the image to be detected based on the template image in the reference image and comprises the following steps: and respectively carrying out relevance matching on the information of each pixel point in the template image and the information of the pixel point in each area in the image to be detected, determining the matching degree, and determining the area with the highest matching degree as a gray matching result.
Optionally, the edge matching unit includes:
the layer-by-layer matching subunit is used for performing correlation matching on the edge image and each image to be detected layer by layer from the top layer to the bottom layer based on the pyramid structure;
and the judging subunit is used for judging the correlation matching result between the edge image of the current layer and each image to be detected, and stopping the correlation matching of the next layer if the result is not null.
Optionally, the layer-by-layer matching subunit is specifically configured to determine gradient information of each point in the edge image in the mutually perpendicular direction; determining the matching degree of each region on the image to be detected based on the gradient information of each point in the edge image, determining an edge matching result according to the corresponding region of the matching degree in the image to be detected when the matching degree meets a preset judgment condition, and stopping the correlation matching of the image to be detected.
Optionally, the deflection determination module 730 includes:
the position coordinate determination submodule is used for determining the position coordinates of each target matching area;
the real coordinate determination submodule is used for determining the real coordinate corresponding to each image to be detected and the real coordinate of the template image based on the shooting angle, the shooting distance, the position coordinate of each target matching area and the position coordinate of the template image;
and the deflection determining submodule is used for determining the deflection of the target building based on the real coordinates corresponding to the images to be detected and the real coordinates of the template image.
Optionally, the position coordinate determination submodule is specifically configured to determine a first coordinate of the target matching region, where the first coordinate is a coordinate of a center point of the target matching region or a coordinate of a position point with a highest matching degree in the target matching region; and determining sub-pixel level coordinates determined based on the first coordinates and pixel values of adjacent points of the first coordinates as position coordinates of the target matching region.
The building deflection determining device provided by the embodiment of the invention can execute the building deflection determining method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the executing method.
It should be noted that, the units and modules included in the system are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
Example four
Fig. 8 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. FIG. 8 illustrates a block diagram of an exemplary electronic device 80 suitable for use in implementing embodiments of the present invention. The electronic device 80 shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiment of the present invention.
As shown in fig. 8, the electronic device 80 is in the form of a general purpose computing device. The components of the electronic device 80 may include, but are not limited to: one or more processors or processing units 801, a system memory 802, and a bus 803 that couples various system components including the system memory 802 and the processing unit 801.
Bus 803 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
The electronic device 80 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 80 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 802 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)804 and/or cache memory 805. The electronic device 80 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 806 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 8, and commonly referred to as a "hard drive"). Although not shown in FIG. 8, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to the bus 803 by one or more data media interfaces. Memory 802 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 808 having a set (at least one) of program modules 807 may be stored, for instance, in memory 802, such program modules 807 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may include an implementation of a network environment. Program modules 807 generally perform the functions and/or methodologies of embodiments of the present invention as described herein.
The electronic device 80 may also communicate with one or more external devices 809 (e.g., keyboard, pointing device, display 810, etc.), with one or more devices that enable a user to interact with the electronic device 80, and/or with any devices (e.g., network card, modem, etc.) that enable the electronic device 80 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 811. Also, the electronic device 80 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 812. As shown, the network adapter 812 communicates with the other modules of the electronic device 80 over the bus 803. It should be appreciated that although not shown in FIG. 8, other hardware and/or software modules may be used in conjunction with electronic device 80, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 801 executes various functional applications and data processing by running the program stored in the system memory 802, for example implementing the building deflection determination method provided by this embodiment, the method including:
acquiring a reference image and at least one image to be detected of a target building;
performing correlation matching on the images to be detected according to the template images in the reference images, and determining target matching areas corresponding to the images to be detected;
and determining the deflection of the target building based on the position changes of the target matching area corresponding to each image to be detected and the template image.
Of course, those skilled in the art will appreciate that the processor may also implement the technical solution of the method for determining the deflection of the building provided in any embodiment of the present invention.
EXAMPLE five
The present embodiments provide a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the building deflection determination method steps as provided by any of the embodiments of the present invention, the method comprising:
acquiring a reference image and at least one image to be detected of a target building;
performing correlation matching on the images to be detected according to the template images in the reference images, and determining target matching areas corresponding to the images to be detected;
and determining the deflection of the target building based on the position changes of the target matching area corresponding to each image to be detected and the template image.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A method for determining building deflection, comprising:
acquiring a reference image and at least one image to be detected of a target building;
performing correlation matching on the images to be detected according to template images in the reference images, and determining a target matching area corresponding to each image to be detected;
and determining the deflection of the target building based on the position change of the target matching area corresponding to each image to be detected and the template image.
2. The method according to claim 1, wherein the performing correlation matching on the images to be detected according to the template images in the reference image and determining the target matching area corresponding to each image to be detected comprises:
and performing edge correlation matching and/or gray scale correlation matching on the images to be detected based on the template images in the reference images, and determining a target matching area corresponding to each image to be detected based on an edge matching result and/or a gray scale matching result.
3. The method of claim 2, wherein the performing edge correlation matching on the image to be tested based on the template image in the reference image comprises:
performing downsampling processing on the template image to obtain a template image set with a pyramid structure;
performing edge extraction on each image in the template image set to obtain edge images with pyramid structures;
and performing correlation matching on the edge image and the image to be detected layer by layer based on a pyramid structure.
4. The method of claim 3, wherein the pyramid-based correlation matching of the edge image and the image to be tested layer by layer comprises:
and the pyramid structure performs correlation matching on the edge image and the image to be detected layer by layer from the top layer to the bottom layer, and stops the correlation matching of the next layer if the correlation matching result of the edge image of the current layer and the image to be detected is not empty.
5. The method of claim 3, wherein the step of performing correlation matching on the edge image and the image to be tested layer by layer comprises:
determining gradient information of each point in the edge image in mutually perpendicular directions;
determining the matching degree of each region on the image to be detected based on the gradient information of each point in the edge image, determining an edge matching result according to the corresponding region of the matching degree in the image to be detected when the matching degree meets a preset judgment condition, and stopping the correlation matching of the image to be detected.
6. The method according to claim 2, wherein performing gray-scale correlation matching on the image to be detected based on the template image in the reference image comprises:
performing correlation matching between the pixel information in the template image and the pixel information in each region of the image to be detected to determine the matching degree of each region, and determining the region with the highest matching degree as the gray-scale matching result.
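One common way to realise the gray-scale correlation of claim 6 is zero-mean normalized cross-correlation (NCC) over a sliding window, keeping the region with the highest matching degree. The sketch below is an illustrative brute-force version, not the patent's implementation; a real system would vectorise or FFT-accelerate the scan.

```python
import numpy as np

def ncc(template, window):
    """Zero-mean normalized cross-correlation between the template and one region."""
    t = template - template.mean()
    w = window - window.mean()
    denom = np.sqrt((t * t).sum() * (w * w).sum())
    return float((t * w).sum() / denom) if denom > 0 else 0.0

def gray_match(template, image):
    """Score every region of the image to be detected against the template and
    return the top-left corner of the region with the highest matching degree."""
    th, tw = template.shape
    best, best_pos = -2.0, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            score = ncc(template, image[y:y + th, x:x + tw])
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```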
7. The method according to claim 1, wherein determining the deflection of the target building based on the position change between the target matching area corresponding to each image to be detected and the template image comprises:
determining the position coordinates of each target matching area;
determining the real coordinates corresponding to each image to be detected and the real coordinates of the template image based on the shooting angle, the shooting distance, the position coordinates of each target matching area, and the position coordinates of the template image; and
determining the deflection of the target building based on the real coordinates corresponding to each image to be detected and the real coordinates of the template image.
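A minimal pinhole-camera sketch of the pixel-to-real-coordinate conversion in claim 7: the metres-per-pixel scale follows from the shooting distance, focal length, and sensor pixel pitch, and a 1/cos(shooting angle) term is one possible correction for a tilted optical axis. The focal length, pixel pitch, and tilt correction here are illustrative assumptions; the claim does not specify a camera model.

```python
import math

def pixel_scale(shooting_distance_m, focal_length_mm, pixel_pitch_um):
    """Metres of target motion per image pixel for a fronto-parallel target
    (pinhole model: scale = Z * p / f)."""
    return shooting_distance_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

def deflection(template_xy, target_xy, shooting_distance_m, shooting_angle_deg,
               focal_length_mm=50.0, pixel_pitch_um=4.0):
    """Vertical deflection in metres from the pixel-position change between the
    template image and one target matching area. The 1/cos(angle) factor is an
    assumed correction for a camera tilted relative to the target plane."""
    s = pixel_scale(shooting_distance_m, focal_length_mm, pixel_pitch_um)
    dy_px = target_xy[1] - template_xy[1]
    return dy_px * s / math.cos(math.radians(shooting_angle_deg))
```

For example, under these assumed parameters, at a 50 m shooting distance with a 50 mm lens and 4 µm pixels, one pixel of image motion corresponds to 4 mm of real motion.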
8. The method according to claim 7, wherein determining the position coordinates of each target matching area comprises:
determining a first coordinate of the target matching area, wherein the first coordinate is the coordinate of the center point of the target matching area or the coordinate of the position point with the highest matching degree within the target matching area; and
determining sub-pixel-level coordinates, computed from the first coordinate and the pixel values of points neighboring the first coordinate, as the position coordinates of the target matching area.
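Claim 8's sub-pixel refinement can be illustrated with the standard three-point parabola fit along each axis, using the matching-degree values at the peak's immediate neighbours. The parabola model is an assumption for illustration; the claim only requires that neighbouring pixel values be used.

```python
import numpy as np

def subpixel_peak(scores, peak):
    """Refine an integer peak location (the first coordinate) to sub-pixel
    precision by fitting a parabola through the peak and its two neighbours
    along each axis. The peak must not lie on the border of `scores`."""
    y, x = peak
    refined = []
    for axis, c in ((0, y), (1, x)):
        lo = scores[y - (axis == 0), x - (axis == 1)]
        hi = scores[y + (axis == 0), x + (axis == 1)]
        mid = scores[y, x]
        denom = lo - 2.0 * mid + hi
        # Vertex of the parabola through (c-1, lo), (c, mid), (c+1, hi)
        offset = 0.5 * (lo - hi) / denom if denom != 0 else 0.0
        refined.append(c + offset)
    return tuple(refined)
```

On an exactly quadratic score surface this fit recovers the true sub-pixel peak; on real matching-degree maps it gives a good approximation near the maximum.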
9. A building deflection determining apparatus, comprising:
an image acquisition module configured to acquire a reference image and at least one image to be detected of a target building;
a matching module configured to perform correlation matching on the images to be detected according to a template image in the reference image and to determine a target matching area corresponding to each image to be detected; and
a deflection determining module configured to determine the deflection of the target building based on the position change between the target matching area corresponding to each image to be detected and the template image.
10. An electronic device, comprising:
one or more processors; and
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the building deflection determination method as claimed in any one of claims 1-8.
CN202011364789.0A 2020-11-27 2020-11-27 Building deflection determination method and device and electronic equipment Pending CN112419287A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011364789.0A CN112419287A (en) 2020-11-27 2020-11-27 Building deflection determination method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN112419287A true CN112419287A (en) 2021-02-26


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113566730A (en) * 2021-07-29 2021-10-29 广东电网有限责任公司 Battery expansion deformation detection system and method
CN113610782A (en) * 2021-07-21 2021-11-05 武汉理工大学 Building deformation monitoring method and equipment and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103438819A (en) * 2013-08-28 2013-12-11 华北电力大学(保定) Transformer substation tubular busbar deflection monitoring method
US20150287215A1 (en) * 2012-10-22 2015-10-08 Sony Computer Entertainment Inc. Image processor and image processing method
WO2015189629A2 (en) * 2014-06-13 2015-12-17 Bangor University Improvements in and relating to the display of images
CN106157329A (en) * 2015-04-20 2016-11-23 中兴通讯股份有限公司 A kind of adaptive target tracking method and device
WO2018000731A1 (en) * 2016-06-28 2018-01-04 华南理工大学 Method for automatically detecting curved surface defect and device thereof
CN109754429A (en) * 2018-12-14 2019-05-14 东南大学 A kind of deflection of bridge structure measurement method based on image
CN109815307A (en) * 2019-02-13 2019-05-28 北京百度网讯科技有限公司 Location determining method, device, equipment and medium
CN110186383A (en) * 2019-05-31 2019-08-30 上海大学 Monocular camera deflection metrology method based on the variation of the target point elevation angle
CN110634137A (en) * 2019-09-26 2019-12-31 杭州鲁尔物联科技有限公司 Bridge deformation monitoring method, device and equipment based on visual perception
CN110702343A (en) * 2019-09-20 2020-01-17 武汉中岩科技股份有限公司 Deflection measurement system and method based on stereoscopic vision
CN111060136A (en) * 2019-12-11 2020-04-24 深圳大学 Deflection measurement correction method, device and system
CN111076880A (en) * 2020-01-11 2020-04-28 东南大学 Multi-point deflection measuring method of long-span bridge considering camera attitude change

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yang Jiyun et al., "Research on target imaging recognition in image-based deflection measurement", Chinese Journal of Scientific Instrument, no. 06 *
Wang Xiang et al., "Research on image recognition test technology for dynamic bridge deflection", World Bridges, no. 03 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination