CN116907349A - Universal switch state identification method based on image processing

Universal switch state identification method based on image processing

Info

Publication number
CN116907349A
CN116907349A
Authority
CN
China
Prior art keywords
state
switch
predicted
screenshot
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311167482.5A
Other languages
Chinese (zh)
Other versions
CN116907349B (en)
Inventor
贺亮
岑亮
易炜
吴雷
刘云川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Hongbao Technology Co ltd
Beijing Baolong Hongrui Technology Co ltd
Original Assignee
Chongqing Hongbao Technology Co ltd
Beijing Baolong Hongrui Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Hongbao Technology Co ltd, Beijing Baolong Hongrui Technology Co ltd filed Critical Chongqing Hongbao Technology Co ltd
Priority to CN202311167482.5A priority Critical patent/CN116907349B/en
Publication of CN116907349A publication Critical patent/CN116907349A/en
Application granted granted Critical
Publication of CN116907349B publication Critical patent/CN116907349B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/36Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761Proximity, similarity or dissimilarity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Nonlinear Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a general switch state identification method based on image processing, which mainly comprises the following steps: S100: performing initial configuration on the switch to be identified; S200: completing identification of the state of the switch to be predicted using an improved SSIM algorithm. The invention requires no data collection, labeling, or training, is suitable for identifying switches of many different styles, consumes almost no extra computing power, and can be deployed on a wide range of embedded devices.

Description

Universal switch state identification method based on image processing
Technical Field
The invention belongs to the technical field of industrial or engineering computing, and particularly relates to a general switch state identification method based on image processing.
Background
A large number and variety of switches are found on equipment and instruments both in daily life and in industrial parks, work sites, and oil and gas wells. To automate processes, assist users, save manpower, and reduce personnel safety risks, visual intelligent recognition technology has been introduced in many scenarios, so that the on/off state is recognized automatically by an algorithm. Devices based on visual recognition have the advantage that they can be deployed non-invasively, without changing the structure of the original equipment.
The current mainstream practice is to collect a large number of pictures of the target switch, label them manually, train an object detection model, and finally deploy it. The drawbacks are obvious: a large amount of training image data must be collected, and in practice the collection and labeling costs are high and very time-consuming; model-based algorithms place high computing-power demands on the deployment hardware, which further raises deployment cost; and the trained model can only recognize the specific switch style it was trained on, so once a switch is replaced the model can hardly recognize it at all, giving it no universality.
Disclosure of Invention
In order to solve the technical problems, the invention discloses a general switch state identification method based on image processing, which comprises the following steps:
s100: performing initial configuration on a switch to be identified;
s200: the identification of the state of the switch to be predicted is accomplished using a modified SSIM algorithm, comprising:
s201: reading a camera picture to obtain an image to be predicted;
s202: reading the coordinate position of the switch area, the state-on screenshot and the state-off screenshot;
s203: cutting out the switch area of the image to be predicted, of the state-on screenshot and of the state-off screenshot according to the coordinate position, to obtain a small image to be predicted, a state-on small image and a state-off small image;
s204: respectively preprocessing the small image to be predicted, the state-on small image and the state-off small image;
s205: calculating the similarity between the preprocessed small image to be predicted and the state-on small image using the improved SSIM algorithm to obtain S1;
s206: calculating the similarity between the preprocessed small image to be predicted and the state-off small image using the improved SSIM algorithm to obtain S2;
s207: setting a matching threshold T1;
s208: if S1 is larger than S2 and S1 is larger than T1, the state of the switch to be predicted is on; if S2 is larger than S1 and S2 is larger than T1, the state of the switch to be predicted is off; if both S1 and S2 are smaller than T1, the identification fails and alarm prompt information is output.
Preferably, the step S100 further includes:
s101: reading a camera picture;
s102: adjusting the switch state to be identified to be on;
s103: The screenshot is saved as a state-on screenshot;
s104: selecting a switch area to be identified by a frame, and storing the coordinate position of the switch area;
s105: adjusting the switch state to be identified to be off;
s106: The screenshot is saved as a state-off screenshot.
Preferably, the coordinate positions in the step S104 refer to coordinates of an upper left corner and a lower right corner of the frame.
Preferably, the preprocessing in step S204 includes scaling to a fixed size, median filtering denoising, and gray-scale image conversion.
Preferably, the alarm prompt information in step S208 includes that the switch to be predicted is occluded or that the camera view has moved.
Preferably, the improved SSIM algorithm comprises the steps of:
s301: for the graph x to be compared, an average pixel gray value is calculated
wherein ,representing the number of pixels of the graph x, +.>A gray value representing a single pixel point;
for the graphs y to be compared, an average pixel gray value is calculated
wherein ,representing the number of pixels of the map y, +.>Represents the gray value of a single pixel, and +.>
S302: for the graphs x and y to be compared, calculating the standard deviation of the gray value of the pixel,/>
S303: respectively calculating brightness indexes of two images x and y to be comparedContrast index->And structural index->
S304: according to the formulaCalculating the similarity of the two graphs x, y to be compared, wherein +.>Is the duty ratio of brightness index +.>For the duty cycle of contrast index +.>Is the duty ratio of the structural index.
Preferably, the brightness indexContrast index->And structural index->Is calculated as follows:
wherein ,,/>is constant and is->Is the covariance of images x and y.
Preferably, the said,/>,/>Is calculated as follows:
wherein ,,/>is constant and is->Is the pixel value.
Preferably, an arrangement is provided0.1 @, @>0.5%>1.
Through this technical scheme, no data acquisition, labeling, or training is required, the method is suitable for identifying switches of many different styles, and almost no extra computing power is consumed, so the method can be deployed on a wide range of embedded devices.
Drawings
FIG. 1 is a flow chart of a method for identifying a general switch state based on image processing according to one embodiment of the present invention;
FIG. 2 is a configuration flow diagram provided in one embodiment of the invention;
FIG. 3 is a predictive flow diagram provided in one embodiment of the invention;
fig. 4 is a schematic diagram of an actual scenario application provided in one embodiment of the present invention.
Detailed Description
In order for those skilled in the art to understand the technical solutions disclosed in the present invention, the technical solutions of the various embodiments will be described below with reference to the embodiments and the related fig. 1 to 4, where the described embodiments are some embodiments, but not all embodiments of the present invention.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will appreciate that the embodiments described herein may be combined with other embodiments.
Referring to fig. 1, in one embodiment, the present invention discloses a method for identifying a general switch state based on image processing, the method comprising the steps of:
s100: performing initial configuration on a switch to be identified;
s200: identification of the state of the switch to be predicted is accomplished using a modified SSIM algorithm.
For this embodiment, the method is applied to a water-pipe valve and consists of two main parts, a configuration flow and a prediction flow. The method performs a similarity calculation on any configured switch using the improved SSIM structural-similarity algorithm, compares the result with a set matching threshold, and outputs a judgment. The whole scheme is simple to operate: no data acquisition or labeling is required, no model needs to be trained, and the switch type can be specified by the user, so the universality is high and the method is suitable for identifying switches of many different styles. At the same time, almost no extra computing power is consumed, so the method can be deployed on a wide range of embedded devices, greatly reducing the performance and energy-consumption requirements of edge deployment.
In another embodiment, the step S100 further includes:
s101: reading a camera picture;
s102: adjusting the switch state to be identified to be on;
s103: The screenshot is saved as a state-on screenshot;
s104: selecting a switch area to be identified by a frame, and storing the coordinate position of the switch area;
s105: adjusting the switch state to be identified to be off;
s106: The screenshot is saved as a state-off screenshot.
For this embodiment, as shown in fig. 2, the configuration flow first reads the camera picture, ensuring that the switch to be identified is clearly visible. The switch is then turned to the on state and the screenshot is recorded as the state-on screenshot; the switch area to be identified is frame-selected and the coordinates of the upper left and lower right corners of the frame are stored; the switch is then turned to the off state, a screenshot is captured again and recorded as the state-off screenshot, and the configuration is complete.
In this embodiment, the switch to be identified is set to the on state, the camera picture is read, the screenshot is saved and recorded as the state-on screenshot, and the switch to be identified is frame-selected on that screenshot. The manual frame selection serves two purposes: first, when several switches appear in the picture, the frame indicates which switch is to be identified; second, it reduces the computation of the algorithm by specifying a small region to be processed. After the frame selection, the upper left corner coordinates (x0, y0) and lower right corner coordinates (x1, y1) are saved. The switch is then turned to the off state and the screenshot is stored and recorded as the state-off screenshot. Because the switch position does not change, the frame does not need to be selected again.
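A minimal OpenCV sketch of this configuration step (the camera index, window name, and file names are illustrative assumptions, not specified by the patent):

```python
import cv2

cap = cv2.VideoCapture(0)                      # camera index is an assumption
ok, frame_on = cap.read()                      # operator has turned the switch ON
cv2.imwrite("state_on.png", frame_on)          # save the state-on screenshot

# Frame-select the switch; selectROI returns (x, y, w, h), converted to the two corners
x, y, w, h = cv2.selectROI("select switch area", frame_on)
x0, y0, x1, y1 = x, y, x + w, y + h            # e.g. (226, 202) and (302, 262) in Fig. 4

ok, frame_off = cap.read()                     # operator has turned the switch OFF
cv2.imwrite("state_off.png", frame_off)        # save the state-off screenshot
```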
In another embodiment, the coordinates in step S104 are the coordinates of the upper left corner and the lower right corner of the frame.
In another embodiment, the step S200 further includes:
s201: reading a camera picture to obtain an image to be predicted;
s202: reading the coordinate position of the switch area, the state-on screenshot and the state-off screenshot;
s203: cutting out the switch area of the image to be predicted, of the state-on screenshot and of the state-off screenshot according to the coordinate position, to obtain a small image to be predicted, a state-on small image and a state-off small image;
s204: respectively preprocessing the small image to be predicted, the state-on small image and the state-off small image;
s205: calculating the similarity between the preprocessed small image to be predicted and the state-on small image using the improved SSIM algorithm to obtain S1;
s206: calculating the similarity between the preprocessed small image to be predicted and the state-off small image using the improved SSIM algorithm to obtain S2;
s207: setting a matching threshold T1;
s208: if S1 is larger than S2 and S1 is larger than T1, the state of the switch to be predicted is on; if S2 is larger than S1 and S2 is larger than T1, the state of the switch to be predicted is off; if both S1 and S2 are smaller than T1, the identification fails and alarm prompt information is output.
For this embodiment, once configured, the method can be deployed for use. As shown in fig. 3, the prediction flow first obtains the camera picture and takes a screenshot. The screenshot to be identified, the state-on screenshot, and the state-off screenshot are then each cropped according to the frame-selected coordinates, yielding a small image to be identified, a state-on small image, and a state-off small image, which are then preprocessed. After preprocessing, the improved SSIM similarity is calculated: the preprocessed small image to be identified is compared with the preprocessed state-on small image to obtain a similarity value S1, and with the preprocessed state-off small image to obtain a similarity value S2. These are then compared against a matching threshold T1, which is usually set empirically for different scenarios and is typically set to 0.7 here. If S1 is greater than S2 and S1 is greater than T1, the identification succeeds and the output result is on; if S2 is greater than S1 and S2 is greater than T1, the identification succeeds and the output result is off; if both S1 and S2 are smaller than T1, the identification fails and alarm information is output.
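A minimal sketch of the decision rule in step S208 (the function name and the "alarm" return value are illustrative, not from the patent):

```python
def decide_switch_state(s1: float, s2: float, t1: float = 0.7) -> str:
    """Decision rule of step S208: compare the two SSIM similarities with the threshold T1."""
    if s1 > s2 and s1 > t1:
        return "on"          # patch to be predicted matches the state-on template
    if s2 > s1 and s2 > t1:
        return "off"         # patch to be predicted matches the state-off template
    return "alarm"           # neither template matches: occlusion or camera movement suspected
```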
In another embodiment, the preprocessing in step S204 includes scaling to a fixed size, median filtering denoising, and gray scale map conversion.
For this embodiment, the images are first uniformly scaled to a fixed size of 100 pixels in width, then denoised using median filtering with a kernel size of 3, and finally the three-channel color image is converted to a single-channel gray image.
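A minimal OpenCV sketch of this preprocessing step (the 100 x 100 target size follows the worked example later in the description; the function name is illustrative):

```python
import cv2

def preprocess(img_bgr, size=(100, 100)):
    """Preprocessing per S204: resize, median-filter with kernel 3, convert to gray."""
    small = cv2.resize(img_bgr, size, interpolation=cv2.INTER_NEAREST)  # any interpolation works
    denoised = cv2.medianBlur(small, 3)                                 # 3x3 median filter
    return cv2.cvtColor(denoised, cv2.COLOR_BGR2GRAY)                   # single-channel gray image
```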
In another embodiment, the alarm prompt information in step S208 includes that the switch to be predicted is occluded or that the camera view has moved.
For this embodiment, this is typically due to occlusion of the picture or camera position movement, requiring reconfiguration.
In another embodiment, the improved SSIM algorithm includes the steps of:
s301: for the graph x to be compared, an average pixel gray value is calculated
wherein ,representing the number of pixels of the graph x, +.>A gray value representing a single pixel point;
for the graphs y to be compared, an average pixel gray value is calculated
wherein ,representing the number of pixels of the map y, +.>Represents the gray value of a single pixel, and +.>
S302: for the graphs x and y to be compared, calculating the standard deviation of the gray value of the pixel,/>
S303: respectively calculating brightness indexes of two images x and y to be comparedContrast index->And structural index->
S304: according to the formulaCalculating the similarity of the two graphs x, y to be compared, wherein +.>Is the duty ratio of brightness index +.>For the duty cycle of contrast index +.>Is the duty ratio of the structural index.
For this embodiment, the calculation process is as follows: for each image, the average pixel gray value is first calculated, where N takes the value 10000, because after scaling to 100 x 100 the total number of pixels is 100 x 100 = 10000; for a single-channel gray image, N = image width x image height. The standard deviation of the pixel gray values is then calculated. Next, the brightness index, contrast index, and structure index of the two images to be compared are calculated. Once the three indices are obtained, the improved SSIM similarity can be calculated according to the formula
$$\mathrm{SSIM}(x,y) = l(x,y)^{\alpha}\cdot c(x,y)^{\beta}\cdot s(x,y)^{\gamma}$$
where $\mathrm{SSIM}(x,y)$ denotes the SSIM similarity of the two images.
In another embodiment, the brightness index $l(x,y)$, contrast index $c(x,y)$ and structure index $s(x,y)$ are calculated as follows:
$$l(x,y) = \frac{2\mu_x\mu_y + C_1}{\mu_x^2 + \mu_y^2 + C_1}, \qquad c(x,y) = \frac{2\sigma_x\sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2}, \qquad s(x,y) = \frac{\sigma_{xy} + C_3}{\sigma_x\sigma_y + C_3}$$
wherein $C_1$, $C_2$ and $C_3$ are constants and $\sigma_{xy}$ is the covariance of images x and y.
For this embodiment, x and y denote the two input images to be compared. $C_1$ is an empirically set constant used to avoid instability when the denominator of $l(x,y)$ approaches 0; $C_2$ is likewise an empirically set constant used to avoid instability when the denominator of $c(x,y)$ approaches 0; and $C_3$ is an empirically set constant used to avoid instability when the denominator of $s(x,y)$ approaches 0.
In another embodiment, $C_1$, $C_2$ and $C_3$ are calculated as follows:
$$C_1 = (K_1 L)^2, \qquad C_2 = (K_2 L)^2, \qquad C_3 = C_2 / 2$$
wherein $K_1$ and $K_2$ are constants and $L$ is the pixel value range.
For this embodiment, $K_1$ and $K_2$ are empirically set constants satisfying $K_1, K_2 < 1$, here $K_1 = 0.01$ and $K_2 = 0.03$, and the pixel value range $L$ is taken as 255.
In another embodiment, $\alpha$ is set to 0.1, $\beta$ to 0.5 and $\gamma$ to 1.
For this embodiment, in the original SSIM algorithm these three weights all default to 1. That is not suitable for the present invention, because in its application scenario the structure index matters most, and the scene may face environmental changes such as lighting or nighttime. Experiments showed that the influence of the brightness and contrast indices needs to be reduced, while still accounting for switches whose state is indicated by a light contrast. The invention verifies experimentally that the best effect is obtained with $\alpha = 0.1$, $\beta = 0.5$, $\gamma = 1$.
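The following is a minimal NumPy sketch of the improved SSIM computation described above, using the stated weights α = 0.1, β = 0.5, γ = 1 and constants K1 = 0.01, K2 = 0.03, L = 255; the choice C3 = C2/2 and the function name are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def improved_ssim(x, y, alpha=0.1, beta=0.5, gamma=1.0, k1=0.01, k2=0.03, L=255.0):
    """Global (single-window) SSIM with adjustable weights, per steps S301-S304."""
    x = x.astype(np.float64).ravel()
    y = y.astype(np.float64).ravel()
    mu_x, mu_y = x.mean(), y.mean()            # S301: average pixel gray values
    sd_x, sd_y = x.std(), y.std()              # S302: standard deviations (divide by N)
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()  # covariance of the two images
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    c3 = c2 / 2.0                              # assumption: conventional SSIM choice
    l = (2 * mu_x * mu_y + c1) / (mu_x ** 2 + mu_y ** 2 + c1)   # brightness index
    c = (2 * sd_x * sd_y + c2) / (sd_x ** 2 + sd_y ** 2 + c2)   # contrast index
    s = (cov_xy + c3) / (sd_x * sd_y + c3)                      # structure index
    return (l ** alpha) * (c ** beta) * (s ** gamma)            # S304: weighted product
```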
In another embodiment, as shown in figure 4,
the first step is to perform a configuration operation. Reading a camera picture, turning a switch into an on state, storing a screenshot, recording the screenshot as a state screenshot, manually selecting a switch area to be identified by a frame, and recording corresponding coordinates: the upper left corner (226,202), lower right corner (302,262), then the switch is turned to the off state, and the screenshot is saved and recorded as a state off screenshot.
The second step is to deploy the prediction flow.
First, the camera picture is read and a screenshot is saved and recorded as the screenshot to be identified. The frame-selection coordinates are then read, and the state-on screenshot, the state-off screenshot, and the screenshot to be identified are cropped according to the recorded frame-selection coordinates to obtain a state-on small image, a state-off small image, and a small image to be predicted. In practice, the picture is read as a two-dimensional array and the region with x-coordinate between 226 and 302 and y-coordinate between 202 and 262 is extracted.
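A minimal sketch of this cropping step (file name is illustrative; note that a NumPy image array indexes rows, i.e. the y-axis, first):

```python
import cv2

x0, y0, x1, y1 = 226, 202, 302, 262           # frame-selection corners from the configuration step
frame = cv2.imread("to_predict.png")           # image read as an H x W x 3 array
patch = frame[y0:y1, x0:x1]                    # crop: rows are y, columns are x
```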
Preprocessing is then performed. So that similarities computed from images of different original sizes can be compared against the same threshold, each image is scaled to a fixed size of 100 x 100, i.e. both width and height are 100; any interpolation algorithm can be used for scaling, such as nearest-neighbor interpolation. After scaling, median filtering is applied for denoising to improve the robustness of the algorithm, with the neighborhood size of the median filter set to 3. The image is then converted to gray scale: because the original image is a three-channel RGB color image, it is converted to a single-channel gray image using the formula GRAY = B x 0.114 + G x 0.587 + R x 0.299, where GRAY denotes the gray value and B, G, R denote the pixel values of the blue, green, and red channels of the original color image. This step yields the preprocessed state-on small image x, the preprocessed state-off small image y, and the preprocessed small image z to be detected.
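A small NumPy sketch of this weighted grayscale conversion (equivalent to what cv2.cvtColor does on a BGR array; the function name is illustrative):

```python
import numpy as np

def to_gray(img_bgr):
    """GRAY = B*0.114 + G*0.587 + R*0.299 on an OpenCV-style BGR array."""
    b, g, r = img_bgr[..., 0], img_bgr[..., 1], img_bgr[..., 2]
    return (0.114 * b + 0.587 * g + 0.299 * r).astype(np.uint8)
```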
The SSIM similarity is then calculated according to the improved SSIM algorithm described above. Assuming the three images to be compared each have N pixels, the average pixel gray value of each image is first calculated:
$$\mu = \frac{1}{N}\sum_{i=1}^{N} p_i$$
where $N$ is the total number of pixels of the input image (for a single-channel gray image, N = image width x image height) and $p_i$ is the gray value of each pixel. With an image width and height of 100, the average pixel gray value of the preprocessed state-on small image x is $\mu_x = 128.3684$, of the preprocessed state-off small image y is $\mu_y = 122.0649$, and of the preprocessed small image z to be detected is $\mu_z = 50.4655$.
The standard deviation of the pixel gray values of each image is then calculated:
$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(p_i-\mu)^2}$$
giving a standard deviation of $\sigma_x = 44.0961$ for the preprocessed state-on small image x and $\sigma_y = 51.9707$ for the preprocessed state-off small image y. By the same calculation, the standard deviation of the preprocessed small image z to be detected is $\sigma_z = 20.0469$.
The brightness index of the preprocessed state-on small image x and the preprocessed small image z to be detected is then calculated as
$$l(x,z) = \frac{2\mu_x\mu_z + C_1}{\mu_x^2 + \mu_z^2 + C_1}$$
where $C_1$ is an empirically set constant that prevents an abnormal result when the denominator approaches 0, calculated as
$$C_1 = (K_1 L)^2$$
where $K_1$ is an empirically set constant that only needs to be smaller than 1, here set to $K_1 = 0.01$, and $L$ is the pixel value range, taken as 255. The value of $l(x,z)$ is thus obtained.
The brightness index $l(y,z)$ of the preprocessed state-off small image y and the preprocessed small image z to be detected is calculated in the same way.
The contrast index of the preprocessed state-on small image x and the preprocessed small image z to be detected is then calculated as
$$c(x,z) = \frac{2\sigma_x\sigma_z + C_2}{\sigma_x^2 + \sigma_z^2 + C_2}$$
where $C_2$ is an empirically set constant that prevents an abnormal result when the denominator approaches 0, calculated as
$$C_2 = (K_2 L)^2$$
where $K_2$ is an empirically set constant smaller than 1, here $K_2 = 0.03$, and $L$ is the pixel value range, taken as 255. The value of $c(x,z)$ is thus obtained.
The contrast index $c(y,z)$ of the preprocessed state-off small image y and the preprocessed small image z to be detected is calculated in the same way.
The structure index of the preprocessed state-on small image x and the preprocessed small image z to be detected is then calculated as
$$s(x,z) = \frac{\sigma_{xz} + C_3}{\sigma_x\sigma_z + C_3}$$
where $\sigma_{xz}$ is the covariance of images x and z:
$$\sigma_{xz} = \frac{1}{N}\sum_{i=1}^{N}(x_i-\mu_x)(z_i-\mu_z)$$
It can be appreciated that $x_i$ and $z_i$ are corresponding pixel points of images x and z, and $\mu_x$ and $\mu_z$ are the average pixel gray values of images x and z calculated by the algorithm described above. $C_3$ is an empirically set constant that prevents an abnormal result when the denominator approaches 0, calculated as
$$C_3 = C_2 / 2$$
The value of $s(x,z)$ is thus obtained, and the structure index $s(y,z)$ of the preprocessed state-off small image y and the preprocessed small image z to be detected is calculated in the same way.
Finally, the SSIM similarity of the two images being compared is calculated according to the formula
$$\mathrm{SSIM}(x,z) = l(x,z)^{\alpha}\cdot c(x,z)^{\beta}\cdot s(x,z)^{\gamma}$$
where $\alpha$ is the weight of the brightness index, $\beta$ the weight of the contrast index, and $\gamma$ the weight of the structure index. In the default SSIM algorithm these three values are all 1; in the application scenario of the invention the structure index matters most, and the scene may face environmental changes such as lighting and nighttime, so the influence of the brightness and contrast indices is reduced while still accounting for switches whose state is indicated by a light contrast. Experimental verification showed that the best effect is obtained with $\alpha = 0.1$, $\beta = 0.5$, $\gamma = 1$.
This finally yields the SSIM similarity $S(x,z)$ between the preprocessed state-on small image x and the preprocessed small image z to be detected, and the SSIM similarity $S(y,z)$ between the preprocessed state-off small image y and the preprocessed small image z to be detected.
Finally, these are compared with the matching threshold T1 = 0.7 set experimentally and empirically, and since S(x,z) > S(y,z) and S(x,z) > T1 are satisfied, the state of the switch under detection is judged to be on.
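Assuming the illustrative helpers sketched earlier (preprocess, improved_ssim, decide_switch_state) and illustrative file names, the worked example reduces to a few lines:

```python
import cv2

# End-to-end check of the worked example, reusing the hypothetical helpers sketched above.
img_on  = preprocess(cv2.imread("state_on.png")[202:262, 226:302])
img_off = preprocess(cv2.imread("state_off.png")[202:262, 226:302])
img_z   = preprocess(cv2.imread("to_predict.png")[202:262, 226:302])

s1 = improved_ssim(img_on, img_z)    # S(x, z)
s2 = improved_ssim(img_off, img_z)   # S(y, z)
print(decide_switch_state(s1, s2))   # "on" when S(x,z) > S(y,z) and S(x,z) > 0.7
```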
Finally, it is pointed out that a person skilled in the art, in light of this disclosure, can make numerous variations without departing from the scope of the claims, and all such variations fall within the scope of protection of the invention.

Claims (9)

1. A method for identifying a general switch state based on image processing, the method comprising the steps of:
s100: performing initial configuration on a switch to be identified;
s200: the identification of the state of the switch to be predicted is accomplished using a modified SSIM algorithm, comprising:
s201: reading a camera picture to obtain an image to be predicted;
s202: reading the coordinate position of the switch area, the state-on screenshot and the state-off screenshot;
s203: cutting out the switch area of the image to be predicted, of the state-on screenshot and of the state-off screenshot according to the coordinate position, to obtain a small image to be predicted, a state-on small image and a state-off small image;
s204: respectively preprocessing the small image to be predicted, the state-on small image and the state-off small image;
s205: calculating the similarity between the preprocessed small image to be predicted and the state-on small image using the improved SSIM algorithm to obtain S1;
s206: calculating the similarity between the preprocessed small image to be predicted and the state-off small image using the improved SSIM algorithm to obtain S2;
s207: setting a matching threshold T1;
s208: if S1 is larger than S2 and S1 is larger than T1, the state of the switch to be predicted is on; if S2 is larger than S1 and S2 is larger than T1, the state of the switch to be predicted is off; if both S1 and S2 are smaller than T1, the identification fails and alarm prompt information is output.
2. The method as set forth in claim 1, wherein the step S100 further includes:
s101: reading a camera picture;
s102: adjusting the switch state to be identified to be on;
s103: The screenshot is saved as a state-on screenshot;
s104: selecting a switch area to be identified by a frame, and storing the coordinate position of the switch area;
s105: adjusting the switch state to be identified to be off;
s106: The screenshot is saved as a state-off screenshot.
3. The method of claim 2, wherein the coordinate positions in step S104 refer to coordinates of an upper left corner and a lower right corner of the frame.
4. The method of claim 1, wherein the preprocessing in step S204 includes scaling to a fixed size, median filtering denoising, and gray-scale conversion.
5. The method of claim 1, wherein the alarm prompt information in step S208 includes that the switch to be predicted is occluded or that the camera view has moved.
6. The method of claim 1, wherein the improved SSIM algorithm comprises the steps of:
S301: for the image x to be compared, calculating the average pixel gray value
$$\mu_x = \frac{1}{N}\sum_{i=1}^{N} x_i$$
wherein $N$ represents the number of pixels of image x and $x_i$ represents the gray value of a single pixel;
for the image y to be compared, calculating the average pixel gray value
$$\mu_y = \frac{1}{N}\sum_{i=1}^{N} y_i$$
wherein $N$ represents the number of pixels of image y, $y_i$ represents the gray value of a single pixel, and the two images have the same number of pixels;
S302: for the images x and y to be compared, calculating the standard deviations of the pixel gray values
$$\sigma_x = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i-\mu_x)^2}, \qquad \sigma_y = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(y_i-\mu_y)^2};$$
S303: respectively calculating the brightness index $l(x,y)$, contrast index $c(x,y)$ and structure index $s(x,y)$ of the two images x and y to be compared;
S304: calculating the similarity of the two images x, y to be compared according to the formula
$$\mathrm{SSIM}(x,y) = l(x,y)^{\alpha}\cdot c(x,y)^{\beta}\cdot s(x,y)^{\gamma}$$
wherein $\alpha$ is the weight of the brightness index, $\beta$ is the weight of the contrast index, and $\gamma$ is the weight of the structure index.
7. The method of claim 6, wherein the brightness index $l(x,y)$, contrast index $c(x,y)$ and structure index $s(x,y)$ are calculated as follows:
$$l(x,y) = \frac{2\mu_x\mu_y + C_1}{\mu_x^2 + \mu_y^2 + C_1}, \qquad c(x,y) = \frac{2\sigma_x\sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2}, \qquad s(x,y) = \frac{\sigma_{xy} + C_3}{\sigma_x\sigma_y + C_3}$$
wherein $C_1$, $C_2$ and $C_3$ are constants and $\sigma_{xy}$ is the covariance of images x and y.
8. The method of claim 7, wherein $C_1$, $C_2$ and $C_3$ are calculated as follows:
$$C_1 = (K_1 L)^2, \qquad C_2 = (K_2 L)^2, \qquad C_3 = C_2 / 2$$
wherein $K_1$ and $K_2$ are constants and $L$ is the pixel value range.
9. The method of claim 1, wherein $\alpha$ is set to 0.1, $\beta$ to 0.5 and $\gamma$ to 1.
CN202311167482.5A 2023-09-12 2023-09-12 Universal switch state identification method based on image processing Active CN116907349B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311167482.5A CN116907349B (en) 2023-09-12 2023-09-12 Universal switch state identification method based on image processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311167482.5A CN116907349B (en) 2023-09-12 2023-09-12 Universal switch state identification method based on image processing

Publications (2)

Publication Number Publication Date
CN116907349A true CN116907349A (en) 2023-10-20
CN116907349B CN116907349B (en) 2023-12-08

Family

ID=88360602

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311167482.5A Active CN116907349B (en) 2023-09-12 2023-09-12 Universal switch state identification method based on image processing

Country Status (1)

Country Link
CN (1) CN116907349B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104331710A (en) * 2014-11-19 2015-02-04 集美大学 On-off state recognition system
CN106250902A (en) * 2016-07-29 2016-12-21 武汉大学 Power system on off state detection method based on characteristics of image template matching
CN108805862A (en) * 2018-05-02 2018-11-13 南京大学 A kind of tag discrimination methods based on improved structure similarity
CN112819094A (en) * 2021-02-25 2021-05-18 北京时代民芯科技有限公司 Target detection and identification method based on structural similarity measurement
CN113822180A (en) * 2021-09-07 2021-12-21 深圳市长龙铁路电子工程有限公司 Air switch on-off state identification method and device, electronic equipment and storage medium
WO2022121129A1 (en) * 2020-12-12 2022-06-16 南方电网调峰调频发电有限公司 Fire recognition method and apparatus, and computer device and storage medium
CN114639022A (en) * 2022-03-28 2022-06-17 广东电网有限责任公司 Switch cabinet on-off state identification method and system based on SUFR template matching
CN114821309A (en) * 2022-04-12 2022-07-29 福建省海峡智汇科技有限公司 Indoor transformer substation switch and indicator lamp state identification method and system
CN114972817A (en) * 2022-04-25 2022-08-30 深圳创维-Rgb电子有限公司 Image similarity matching method, device and storage medium
CN116109849A (en) * 2022-12-30 2023-05-12 浙江工业大学 SURF feature matching-based high-voltage isolating switch positioning and state identification method


Also Published As

Publication number Publication date
CN116907349B (en) 2023-12-08

Similar Documents

Publication Publication Date Title
US9710716B2 (en) Computer vision pipeline and methods for detection of specified moving objects
JP4309926B2 (en) Facial feature point detection apparatus, facial feature point detection method, and program
KR102153607B1 (en) Apparatus and method for detecting foreground in image
US8306262B2 (en) Face tracking method for electronic camera device
EP2605180A2 (en) User detecting apparatus, user detecting method and a user detecting program
WO2006008944A1 (en) Image processor, image processing method, image processing program, and recording medium on which the program is recorded
KR20080038356A (en) Image processing method and apparatus, digital camera, and recording medium recording image processing program
WO2013135033A1 (en) Tunnel deformation online monitoring system based on image analysis and application thereof
JP5441670B2 (en) Image processing apparatus and control method thereof
JP2017005389A (en) Image recognition device, image recognition method, and program
CN113012383B (en) Fire detection alarm method, related system, related equipment and storage medium
CN114842397B (en) Real-time old man falling detection method based on anomaly detection
US20130028470A1 (en) Image processing apparatus, image processing method, and comupter readable recording device
JP4798042B2 (en) Face detection device, face detection method, and face detection program
JP2007067560A (en) Imaging apparatus and its control method, computer program and recording medium
JP4662258B2 (en) Image processing method and apparatus, digital camera apparatus, and recording medium recording image processing program
US8818096B2 (en) Apparatus and method for detecting subject from image
CN114158163B (en) Intelligent ship lighting control method, device, equipment and storage medium
WO2021098359A1 (en) Lane line recognizing method, device, equipment, and storage medium
US20230394829A1 (en) Methods, systems, and computer-readable storage mediums for detecting a state of a signal light
WO2023025010A1 (en) Stroboscopic banding information recognition method and apparatus, and electronic device
JP2017229061A (en) Image processing apparatus, control method for the same, and imaging apparatus
JP2010160743A (en) Apparatus and method for detecting object
KR101044903B1 (en) Fire detecting method using hidden markov models in video surveillance and monitoring system
JP5159390B2 (en) Object detection method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant