CN116818233A - High-precision bridge state monitoring method and system adopting laser and machine vision - Google Patents

High-precision bridge state monitoring method and system adopting laser and machine vision

Info

Publication number
CN116818233A
Authority
CN
China
Prior art keywords: bridge, image, spot, laser, monitoring
Prior art date
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202310722342.3A (originally filed in Chinese (zh))
Inventors: 滕龙寅, 谭江云, 王梦珂, 路东明
Current and original assignee: Nanjing University of Science and Technology (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by Nanjing University of Science and Technology
Priority application: CN202310722342.3A
Publication: CN116818233A

Abstract

The application provides a high-precision bridge health state monitoring method and system using laser and machine vision. An industrial camera collects images of a graduated scale; the effective area containing the scale graduations and the laser spot is extracted from each collected color spot image, followed by affine-transformation correction, denoising, defogging, binarization and morphological transformation to obtain a binarized image of the spot. The binary image is used to mask the color spot image, a Gaussian fit to the gray information of unsaturated points yields the centroid coordinate of the laser spot in the image coordinate system, and differencing this ordinate with the initially recorded spot-centroid ordinate gives the centroid displacement. A pixel-to-real conversion matrix then yields the actual deflection of the bridge, which is uploaded to a cloud server over the FTP protocol; the cloud server combines temperature and humidity data from the bridge monitoring site to generate instantaneous and sustained bad-state thresholds for the bridge and issues early warnings. The application offers simple operation, high measurement precision and low hardware-resource requirements.

Description

High-precision bridge state monitoring method and system adopting laser and machine vision
Technical Field
The application relates to a remote non-contact measurement technology, in particular to a high-precision bridge state monitoring method and system adopting laser and machine vision.
Background
Measurement techniques are of great importance in a modern society where mechanized equipment is highly concentrated. Measuring equipment is now expected not only to offer high precision, high efficiency and strong environmental adaptability, but also to be compact and convenient, highly automated and low in cost. Bridge deflection detection, in particular, often demands long-duration and all-weather operation, requirements that traditional measurement methods struggle to satisfy simultaneously. Exploring an automated, high-precision, real-time, long-distance and non-contact bridge deflection monitoring method is therefore of great significance.
Non-contact measurement methods generally include acoustic measurement, photoelectric measurement and radar measurement. Acoustic and photoelectric techniques are currently the most widely used, but both place fairly strict requirements on certain physical properties of the measured object, without which reliable results are difficult to obtain. Radar measurement achieves high accuracy but likewise imposes strict requirements on the measured object and is often expensive.
Disclosure of Invention
The application aims to provide a high-precision bridge state monitoring method and system adopting laser and machine vision.
The technical scheme realizing the application is as follows: a high-precision bridge state monitoring method adopting laser and machine vision comprises the following steps:
Step 1: install a laser module box on the side of a bridge pier, and install an observation dark box equipped with a supplementary light source, a graduated scale, an industrial camera and a bracket on the side of the bridge-deck center. The graduated scale faces the industrial camera so that the laser beam strikes the scale perpendicularly, with the initial position of the target spot at the 0 graduation mark. The industrial camera acquires images of the graduated scale; when the deflection of the bridge changes, the position of the laser spot changes.
Step 2: the on-site industrial computer extracts the effective area containing the scale graduations and the laser spot from each acquired color spot image and applies affine-transformation correction; denoising, defogging, binarization and morphological transformation are then performed on the color spot image to obtain a binarized spot image. The binary image is used to mask the color spot image, a Gaussian fit to the gray information of unsaturated points yields the centroid coordinate of the laser spot in the image coordinate system, the difference between this ordinate and the initially recorded spot-centroid ordinate gives the centroid displacement, and a pixel-to-real conversion matrix yields the actual deflection of the bridge.
Step 3: superimpose the actual deflection on the real-time image and upload it to a cloud server over the FTP protocol. The cloud server combines temperature and humidity data from the bridge monitoring site to generate instantaneous and sustained bad-state thresholds, and issues an early warning when the deflection exceeds the instantaneous threshold or remains above the sustained threshold for a period of time.
For the affine-transformation correction, a set of black-and-white checkerboard images is captured with the camera, the intrinsic and extrinsic parameter matrices of the camera are calculated by comparing the square spacing measured on the images with the actual spacing, and the camera distortion is corrected.
The denoising adopts bilateral filtering, whose template formula is:
w(i,j,k,l) = exp( -((i-k)^2 + (j-l)^2)/(2σ_d^2) - (f(i,j) - f(k,l))^2/(2σ_r^2) )
where (i,j) are the coordinates of the center pixel of the template window; (k,l) are the coordinates of the other pixels of the template window; σ_d and σ_r are the standard deviations of the Gaussian kernels of the spatial-domain and range-domain templates respectively; and f(i,j) is the gray value of the target point before processing.
The defogging adopts the dark-channel defogging formula:
J_dark(k) = min_{j∈w_k} ( min_{c∈{r,g,b}} J_c(j) )
where k denotes a point coordinate in the image, c is one of the three color channels r, g and b, J_c is the image before processing, J_dark is the image after processing, and w_k is a rectangular window centered on k.
For the binarization, the color image is converted to a gray image through a matrix, pixels whose brightness exceeds a decision threshold are marked white, and all other pixels are marked black. The color-to-gray conversion formula is:
Y = 0.299·R + 0.587·G + 0.114·B
where Y is the brightness value of the image and R, G and B are the three channel values of the input image.
The threshold is an empirical value: since the illumination inside the dark box is stable and the brightness of the background and the graduated scale in the spot images captured by the industrial camera is almost constant, 180 is selected as the threshold.
The morphological processing performs morphological filtering on the binary image with a structuring element, using the formula:
B' = (B ⊖ S) ⊕ S
where B' is the binary image after processing, B is the binary image before processing, S is a 3×3 white-square structuring element, and ⊖ and ⊕ denote the erosion and dilation operations respectively.
For the Gaussian fitting of unsaturated-point gray information, the upper-left corner of the image is taken as the origin, rightward as the positive x-axis and downward as the positive y-axis. The center coordinate of the laser spot in the image coordinate system is obtained from the preprocessed binary spot image as:
x = [ (x_2^2 - x_3^2)·ln g_1 + (x_3^2 - x_1^2)·ln g_2 + (x_1^2 - x_2^2)·ln g_3 ] / ( 2[ (x_2 - x_3)·ln g_1 + (x_3 - x_1)·ln g_2 + (x_1 - x_2)·ln g_3 ] )
and analogously for y, where x and y are the horizontal and vertical coordinates of the center; x_1, x_2, x_3 are the abscissas of any three spot pixels sharing the same ordinate; y_1, y_2, y_3 are the ordinates of any three spot pixels sharing the same abscissa; and g_1, g_2, g_3 are the gray values at those pixels.
The spot centroid is then obtained by weighting each spot pixel according to its distance from the center.
The bridge deflection is calculated as:
s = scale · Δy = scale · (y_1 - y_0)
where s is the bridge deflection, scale is the pixel scale (the actual length represented by a single pixel), y_1 and y_0 are the current and reference spot-centroid ordinates respectively, and Δy is the spot-centroid displacement.
The bad-state threshold of the bridge deflection is calculated according to the following formula:
where b is an empirical constant taken as 1.5762, σ is the average structural coefficient of the bridge, t_i is the real-time temperature at the bridge, t_o is the average temperature of the region where the bridge is located, RH_i is the real-time humidity at the bridge, RH_o is the average humidity of the region, and the remaining term is the historical average of the bridge deflection under the same load.
A high-precision bridge state monitoring system implements the above high-precision bridge state monitoring method adopting laser and machine vision, thereby realizing high-precision bridge state monitoring.
Compared with the prior art, the application has the following notable advantages: (1) the red laser spot is identified as it falls on the graduated scale, and the difference between the spot-centroid ordinate and the reference-centroid ordinate offers high precision and little susceptibility to external influence; (2) different thresholds are generated automatically from cloud big data, so the measured deflection better reflects the health condition of the bridge and managers can service and maintain it in time; (3) the whole process runs automatically, with high automation, strong real-time performance and simple operation; (4) few hardware resources are required, long-term deflection monitoring is convenient, and the cost is low.
Drawings
FIG. 1 is a schematic diagram of the high-precision single-point bridge deflection monitoring method combining a laser reference with image processing.
FIG. 2 is the overall framework diagram of the high-precision single-point bridge deflection monitoring method combining a laser reference with image processing.
FIG. 3 is a flow chart of the high-precision single-point bridge deflection monitoring method combining a laser reference with image processing.
FIG. 4 is a laser spot image from the high-precision single-point bridge deflection monitoring method combining a laser reference with image processing.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The application discloses a high-precision single-point bridge deflection monitoring method by combining laser reference with image processing, which comprises the following steps of:
Step 1: a cuboid laser module box fitted with a red laser is installed on the bridge pier, and a cuboid observation dark box fitted with a supplementary light source, a graduated scale, an industrial camera and a bracket is installed on the side of the bridge. The graduated scale faces the industrial camera so that the laser beam strikes the scale perpendicularly, with the initial position of the target spot at the 0 graduation mark. The industrial camera acquires images of the graduated scale; when the deflection of the bridge changes, the position of the laser spot changes. The laser module box and the observation dark box each have a rectangular opening in one side wall, and their other five faces are opaque.
Step 2: the on-site industrial computer extracts the effective area containing the scale graduations and the laser spot from the acquired color image and applies affine-transformation correction. Bilateral filtering and dark-channel-prior defogging are applied to the color spot image to obtain a denoised color image; color recognition and threshold decision on the denoised image yield a binary spot image; and morphological processing of that binary image yields the preprocessed binary spot image.
For the affine-transformation correction, a set of black-and-white checkerboard images is captured with the camera, the intrinsic and extrinsic parameter matrices of the camera are calculated by comparing the square spacing measured on the images with the actual spacing, and the camera distortion is corrected.
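The same image-versus-actual spacing comparison also yields the pixel-to-real scale used later for deflection conversion. A minimal sketch, assuming already-detected checkerboard corners in a distortion-corrected, frontal image; the corner coordinates, the 25 mm square size and the function name are illustrative, not values from the patent:

```python
import math

def pixel_scale(corners_px, square_mm):
    """Estimate mm-per-pixel from corner positions along one checkerboard row.

    corners_px: list of (x, y) corner coordinates in the corrected image.
    square_mm: known real-world spacing between adjacent corners.
    """
    dists = [math.dist(corners_px[i], corners_px[i + 1])
             for i in range(len(corners_px) - 1)]
    mean_px = sum(dists) / len(dists)  # average pixel spacing between corners
    return square_mm / mean_px         # real length represented by one pixel

# corners detected 50 px apart for 25 mm squares -> 0.5 mm per pixel
corners = [(100.0, 200.0), (150.0, 200.0), (200.0, 200.0)]
print(pixel_scale(corners, 25.0))  # 0.5
```

In a full implementation the corner coordinates would come from a checkerboard detector applied after the distortion correction described above.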
The template formula of the bilateral filtering is:
w(i,j,k,l) = exp( -((i-k)^2 + (j-l)^2)/(2σ_d^2) - (f(i,j) - f(k,l))^2/(2σ_r^2) )
where (i,j) are the coordinates of the center pixel of the template window; (k,l) are the coordinates of the other pixels of the template window; σ_d and σ_r are the standard deviations of the Gaussian kernels of the spatial-domain and range-domain templates respectively; and f(i,j) is the gray value of the target point before processing.
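A direct, unoptimized rendering of this template can clarify the roles of σ_d and σ_r. This is a sketch assuming the standard bilateral weight definition (spatial Gaussian times range Gaussian, normalized over the window), not the patent's implementation:

```python
import numpy as np

def bilateral_template(f, i, j, radius, sigma_d, sigma_r):
    """Normalized bilateral weights w(i,j,k,l) centered on pixel (i, j):
    a spatial Gaussian in (i-k, j-l) times a range Gaussian in gray difference."""
    w = np.zeros((2 * radius + 1, 2 * radius + 1))
    for k in range(i - radius, i + radius + 1):
        for l in range(j - radius, j + radius + 1):
            spatial = ((i - k) ** 2 + (j - l) ** 2) / (2.0 * sigma_d ** 2)
            rng = (float(f[i, j]) - float(f[k, l])) ** 2 / (2.0 * sigma_r ** 2)
            w[k - i + radius, l - j + radius] = np.exp(-(spatial + rng))
    return w / w.sum()  # normalize so the template weights sum to 1

# filter one pixel of a small gray patch
img = np.full((5, 5), 100.0)
img[2, 2] = 110.0  # a bright speck to smooth
w = bilateral_template(img, 2, 2, 1, sigma_d=1.0, sigma_r=10.0)
filtered = float((w * img[1:4, 1:4]).sum())  # filtered value at (2, 2)
```

The filtered value lands between the speck and its surroundings: a small σ_r preserves edges, while a large σ_r approaches plain Gaussian smoothing.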
the dark channel defogging formula is as follows:
wherein k represents a point coordinate in the image, c is one of three color channels of r, g and b, J c To process the pre-image, J dark For processing the post-image, w k A rectangular window centered on k;
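The dark-channel computation itself is simple enough to sketch directly. A minimal numpy version, assuming an H×W×3 float image and a square window (window size is an illustrative choice):

```python
import numpy as np

def dark_channel(img, radius=1):
    """Per-pixel minimum over r, g, b, then a minimum filter over the
    (2*radius+1)-square window w_k around each pixel k."""
    per_pixel_min = img.min(axis=2)  # inner min over the color channels
    h, w = per_pixel_min.shape
    padded = np.pad(per_pixel_min, radius, mode='edge')
    out = np.empty_like(per_pixel_min)
    for y in range(h):
        for x in range(w):  # outer min over the window w_k
            out[y, x] = padded[y:y + 2 * radius + 1,
                               x:x + 2 * radius + 1].min()
    return out

# a uniformly bright patch, except one pixel with a dark blue channel
img = np.ones((4, 4, 3)) * 0.9
img[1, 1, 2] = 0.1
dc = dark_channel(img, radius=1)  # low wherever the window touches (1, 1)
```

In the dark-channel prior, this map then drives the haze transmission estimate used to recover the defogged image.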
the color recognition and threshold judgment are specifically to convert a color image into a gray image through a matrix, wherein pixels with brightness values larger than a judgment threshold are selected to be marked as white, and other pixels are marked as black. The conversion formula in which a color image is converted into a grayscale image is as follows:
wherein Y represents the brightness value of the image, and R, G and B are three channel values of the input image;
the threshold value is an empirical value, and the method for determining the threshold value is specifically to observe that the illumination condition in the camera bellows is stable, and the brightness of the background and the graduated scale in the facula image shot by the industrial camera is almost unchanged, so 180 is selected as the threshold value.
The morphological processing performs morphological filtering on the binary image with a structuring element, using the formula:
B' = (B ⊖ S) ⊕ S
where B' is the binary image after processing, B is the binary image before processing, S is a 3×3 white-square structuring element, and ⊖ and ⊕ denote the erosion and dilation operations respectively.
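For illustration, a minimal numpy version of this erosion-then-dilation (a morphological opening) with the 3×3 white-square structuring element; a sketch, not the patent's implementation:

```python
import numpy as np

def _filter(binary, op, size=3):
    """Slide a size x size window and apply op (min = erosion, max = dilation)."""
    r = size // 2
    padded = np.pad(binary, r, mode='constant')  # zero padding outside the image
    h, w = binary.shape
    out = np.zeros_like(binary)
    for y in range(h):
        for x in range(w):
            out[y, x] = op(padded[y:y + size, x:x + size])
    return out

def opening(binary, size=3):
    """B' = (B erode S) dilate S with a square structuring element S."""
    return _filter(_filter(binary, np.min, size), np.max, size)

img = np.zeros((7, 7), dtype=np.uint8)
img[1:4, 1:4] = 1   # a solid 3x3 spot, which the opening preserves
img[5, 5] = 1       # an isolated noise pixel, which the opening removes
cleaned = opening(img)
```

Opening removes white specks smaller than the structuring element while leaving the laser spot (larger than 3×3) essentially intact, which is why it suits spot-mask cleanup here.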
Step 3: the binary image is used to mask out the color spot image. Taking the upper-left corner of the acquired color image as the origin, rightward as the positive x-axis and downward as the positive y-axis, the Gaussian fitting of unsaturated-point gray information applied to the color spot image yields the theoretical spot center, and weighting each spot pixel by its distance from that center yields the spot centroid.
the formula of the unsaturated point gray information Gaussian fitting method is as follows:
wherein, x and y are respectively the horizontal and vertical coordinates of the center, x 1 ,x 2 ,x 3 The y is the abscissa of any three pixel points with the same ordinate and different abscissas on the light spot 1 ,y 2 ,y 3 And g represents the gray value of the pixel point to be taken, wherein the gray value is the ordinate of the pixel point with the same arbitrary three abscissas and different abscissas on the light spot.
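A sketch of this fit, assuming the standard three-point log-Gaussian formulation: fitting ln g = a·x² + b·x + c through three samples and taking the parabola vertex −b/(2a) as the center. The function name and the test profile are illustrative:

```python
import math

def gauss_center_1d(samples):
    """samples: three (coordinate, gray value) pairs on a Gaussian profile.
    Returns the fitted center coordinate (the vertex of the log-parabola)."""
    (x1, g1), (x2, g2), (x3, g3) = samples
    L1, L2, L3 = math.log(g1), math.log(g2), math.log(g3)
    num = L1 * (x2**2 - x3**2) + L2 * (x3**2 - x1**2) + L3 * (x1**2 - x2**2)
    den = 2.0 * (L1 * (x2 - x3) + L2 * (x3 - x1) + L3 * (x1 - x2))
    return num / den

# gray values sampled from exp(-(x - 4.3)^2 / 8): the sub-pixel center 4.3
# is recovered exactly, since the log of a Gaussian is a true parabola
profile = lambda x: math.exp(-(x - 4.3) ** 2 / 8)
center = gauss_center_1d([(3, profile(3)), (4, profile(4)), (5, profile(5))])
print(center)  # 4.3 (up to float rounding)
```

Using only unsaturated points matters: a clipped (saturated) sample no longer lies on the Gaussian, which would bias the fitted center.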
Step 4: a reference spot coordinate is calibrated, and the spot-centroid ordinate is differenced with the reference-centroid ordinate. When the deflection of the bridge changes, the position of the laser spot changes, and the recorded change in the spot position coordinate corresponds to the actual deflection of the bridge.
the calculation formula of the deflection of the bridge is as follows:
s=scale·Δy=scale·(y 1 -y 0 )
wherein s is bridge deflection, scale is pixel scale, i.e. the actual length corresponding to the size of a single pixel, y 1 And y 0 The current light spot centroid ordinate and the reference light spot centroid ordinate are respectively, and deltay is the light spot centroid displacement.
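The conversion itself is a one-liner; a sketch with illustrative numbers (0.05 mm per pixel and centroid ordinates in pixels are assumed values, not from the patent):

```python
def deflection_mm(scale_mm_per_px, y_current, y_reference):
    """s = scale * (y1 - y0): spot-centroid shift in pixels, scaled to mm."""
    return scale_mm_per_px * (y_current - y_reference)

# the centroid moved 12.8 px downward at 0.05 mm/px -> 0.64 mm of deflection
s = deflection_mm(0.05, 342.8, 330.0)
print(round(s, 3))  # 0.64
```

Because the image y-axis points downward, a positive s corresponds to downward bridge deflection under load.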
Step 5: with machine-vision technology, a real-time image covering the bridge-safety key parameters, including bridge deflection, is acquired; the measured data are superimposed on the real-time image and uploaded to a cloud server, and bridge managers access the server through a PC web page or a mobile-phone APP to obtain the captured field images and measurement data. After receiving the bridge deflection, the cloud server combines the bridge's temperature, humidity and historical deflection data to generate instantaneous and sustained bad-state thresholds, and sends a warning reminding managers to service and maintain the bridge when the deflection exceeds the instantaneous threshold or remains above the sustained threshold for a period of time.
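A hedged sketch of the upload step: the record layout, filename, host and credentials below are illustrative assumptions — the patent only states that measurements reach the cloud server over FTP together with site temperature and humidity:

```python
import io
from datetime import datetime, timezone
from ftplib import FTP

def build_record(deflection_mm, temp_c, humidity_pct, ts):
    """One CSV measurement line: timestamp, deflection, temperature, humidity."""
    return (f"{ts:%Y-%m-%dT%H:%M:%SZ},"
            f"{deflection_mm:.3f},{temp_c:.1f},{humidity_pct:.1f}\n")

def upload(record, host="cloud.example.com", user="bridge", password="secret"):
    """Push one record to the (hypothetical) cloud server over FTP."""
    with FTP(host) as ftp:  # not executed in this sketch
        ftp.login(user, password)
        ftp.storbinary("STOR deflection.csv", io.BytesIO(record.encode()))

rec = build_record(0.64, 21.5, 63.0,
                   datetime(2023, 6, 16, tzinfo=timezone.utc))
print(rec, end="")  # 2023-06-16T00:00:00Z,0.640,21.5,63.0
```

Batching records and uploading periodically, rather than per frame, would keep the demand on the field computer low, in line with the low-hardware-resource goal stated above.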
The solving formula of the bad-state threshold of the bridge deflection is as follows:
The above formula is an empirical formula summarized from measured data, where t_i is the real-time temperature at the bridge, t_o is the average temperature of the region where the bridge is located, RH_i is the real-time humidity at the bridge, and RH_o is the average humidity of the region.
Uploading the measured data to the cloud for storage reduces the memory requirement on the field computing equipment. It also lets bridge managers access the server through a PC web page or mobile APP to obtain the captured field images and measurement data without manual operation or on-site observation. The data are read from the database without manual updates: once new data reach the database from the monitoring site, the web page and APP refresh automatically. Deflection data over any chosen period can be queried and analyzed retrospectively; after a time interval is selected, the result is presented as a chart, from which a manager can clearly see the variation of the deflection data as well as its maximum, minimum and median over that period. The system also collects temperature and humidity data at the bridge monitoring site and, combined with the design parameters and historical deflection data of the specific bridge, generates the instantaneous and sustained bad-state thresholds.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations are described; nevertheless, any combination of these technical features that involves no contradiction should be considered within the scope of this description.
The above examples merely represent a few embodiments of the present application and are described in detail, but they are not to be construed as limiting its scope. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the application, all of which fall within its scope. Accordingly, the scope of protection should be assessed by the appended claims.

Claims (10)

1. A high-precision bridge state monitoring method adopting laser and machine vision, characterized by comprising the following steps:
step 1: installing a laser module box on the side of a bridge pier, and installing an observation dark box equipped with a supplementary light source, a graduated scale, an industrial camera and a bracket on the side of the bridge-deck center, the graduated scale facing the industrial camera so that the laser beam strikes the scale perpendicularly and the initial position of the target spot lies at the 0 graduation mark; acquiring graduated-scale images with the industrial camera, the position of the laser spot changing when the deflection of the bridge changes;
step 2: extracting, by the on-site industrial computer, the effective area containing the scale graduations and the laser spot from each acquired color spot image and applying affine-transformation correction; performing denoising, defogging, binarization and morphological transformation on the color spot image to obtain a binarized spot image; masking the color spot image with the binary image, applying the Gaussian fitting of unsaturated-point gray information to obtain the centroid coordinate of the laser spot in the image coordinate system, differencing this ordinate with the initially recorded spot-centroid ordinate to obtain the centroid displacement, and obtaining the actual deflection of the bridge through a pixel-to-real conversion matrix;
and step 3: superimposing the actual deflection on the real-time image, uploading it to a cloud server over the FTP protocol, combining, by the cloud server, temperature and humidity data from the bridge monitoring site to generate instantaneous and sustained bad-state thresholds for the bridge, and issuing an early warning when the deflection exceeds the instantaneous threshold or remains above the sustained threshold for a period of time.
2. The high-precision bridge state monitoring method adopting laser and machine vision according to claim 1, characterized in that for the affine-transformation correction, a set of black-and-white checkerboard images is captured with the camera, the intrinsic and extrinsic parameter matrices of the camera are calculated by comparing the square spacing measured on the images with the actual spacing, and the camera distortion is corrected.
3. The high-precision bridge state monitoring method adopting laser and machine vision according to claim 1, characterized in that the denoising adopts bilateral filtering, whose template formula is:
w(i,j,k,l) = exp( -((i-k)^2 + (j-l)^2)/(2σ_d^2) - (f(i,j) - f(k,l))^2/(2σ_r^2) )
where (i,j) are the coordinates of the center pixel of the template window; (k,l) are the coordinates of the other pixels of the template window; σ_d and σ_r are the standard deviations of the Gaussian kernels of the spatial-domain and range-domain templates respectively; and f(i,j) is the gray value of the target point before processing.
4. The high-precision bridge state monitoring method adopting laser and machine vision according to claim 1, characterized in that the defogging adopts the dark-channel defogging formula:
J_dark(k) = min_{j∈w_k} ( min_{c∈{r,g,b}} J_c(j) )
where k denotes a point coordinate in the image, c is one of the three color channels r, g and b, J_c is the image before processing, J_dark is the image after processing, and w_k is a rectangular window centered on k.
5. The high-precision bridge state monitoring method adopting laser and machine vision according to claim 1, characterized in that the binarization converts the color image to a gray image through a matrix, pixels whose brightness exceeds a decision threshold being marked white and all other pixels black, wherein the color-to-gray conversion formula is:
Y = 0.299·R + 0.587·G + 0.114·B
where Y is the brightness value of the image and R, G and B are the three channel values of the input image;
the threshold is an empirical value: since the illumination inside the dark box is stable and the brightness of the background and the graduated scale in the spot images captured by the industrial camera is almost constant, 180 is selected as the threshold.
6. The high-precision bridge state monitoring method adopting laser and machine vision according to claim 1, characterized in that the morphological processing performs morphological filtering on the binary image with a structuring element, using the formula:
B' = (B ⊖ S) ⊕ S
where B' is the binary image after processing, B is the binary image before processing, S is a 3×3 white-square structuring element, and ⊖ and ⊕ denote the erosion and dilation operations respectively.
7. The high-precision bridge state monitoring method adopting laser and machine vision according to claim 1, characterized in that the Gaussian fitting of unsaturated-point gray information takes the upper-left corner of the image as the origin, rightward as the positive x-axis and downward as the positive y-axis, and obtains the center coordinate of the laser spot in the image coordinate system from the preprocessed binary spot image as:
x = [ (x_2^2 - x_3^2)·ln g_1 + (x_3^2 - x_1^2)·ln g_2 + (x_1^2 - x_2^2)·ln g_3 ] / ( 2[ (x_2 - x_3)·ln g_1 + (x_3 - x_1)·ln g_2 + (x_1 - x_2)·ln g_3 ] )
and analogously for y, where x and y are the horizontal and vertical coordinates of the center; x_1, x_2, x_3 are the abscissas of any three spot pixels sharing the same ordinate; y_1, y_2, y_3 are the ordinates of any three spot pixels sharing the same abscissa; and g_1, g_2, g_3 are the gray values at those pixels;
the spot centroid is then obtained by weighting each spot pixel according to its distance from the center.
8. The high-precision bridge state monitoring method adopting laser and machine vision according to claim 1, characterized in that the bridge deflection is calculated as:
s = scale · Δy = scale · (y_1 - y_0)
where s is the bridge deflection, scale is the pixel scale (the actual length represented by a single pixel), y_1 and y_0 are the current and reference spot-centroid ordinates respectively, and Δy is the spot-centroid displacement.
9. The high-precision bridge state monitoring method adopting laser and machine vision according to claim 1, characterized in that the bad-state threshold of the bridge deflection is calculated according to the following formula:
where b is an empirical constant taken as 1.5762, σ is the average structural coefficient of the bridge, t_i is the real-time temperature at the bridge, t_o is the average temperature of the region where the bridge is located, RH_i is the real-time humidity at the bridge, RH_o is the average humidity of the region, and the remaining term is the historical average of the bridge deflection under the same load.
10. A high-precision bridge state monitoring system, characterized in that it implements the high-precision bridge state monitoring method adopting laser and machine vision according to any one of claims 1-9, thereby realizing high-precision bridge state monitoring.
CN202310722342.3A 2023-06-16 2023-06-16 High-precision bridge state monitoring method and system adopting laser and machine vision Pending CN116818233A (en)

Publications (1)
CN116818233A, published 2023-09-29

Family ID: 88113929



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination