CN114973131A - Full-automatic fisheye opening and closing indicator state identification method and system - Google Patents

Full-automatic fisheye opening and closing indicator state identification method and system Download PDF

Info

Publication number
CN114973131A
Authority
CN
China
Prior art keywords
indicator
area
size
state
standard deviation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210549947.2A
Other languages
Chinese (zh)
Inventor
丁健配
蔡富东
吕昌峰
刘焕云
帅民伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jinan Xinxinda Electric Technology Co ltd
Original Assignee
Jinan Xinxinda Electric Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jinan Xinxinda Electric Technology Co ltd filed Critical Jinan Xinxinda Electric Technology Co ltd
Priority to CN202210549947.2A priority Critical patent/CN114973131A/en
Publication of CN114973131A publication Critical patent/CN114973131A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the field of image recognition and provides a full-automatic fisheye opening and closing indicator state recognition method and system. The method comprises: acquiring an indicator monitoring image; preprocessing the indicator monitoring image and extracting the contour information of each connected domain from the preprocessed indicator image to determine the elliptical panel region to be detected; selecting, according to the minimum intra-class standard deviation principle, the segmentation with the smallest intra-class standard deviation, segmenting the elliptical panel region to be detected into four fan-shaped regions, and selecting the most vivid region as the observation window by comparing the color vividness of each region; and judging the state of the indicator according to the color information of the pixel points in the observation window. Without any prior annotation information, the invention determines the elliptical panel region to be detected from the contour information of the connected domains of the indicator monitoring image and automatically segments the observation window region, so that the state of the indicator is judged from the color information of the pixel points in the observation window.

Description

Full-automatic fisheye opening and closing indicator state identification method and system
Technical Field
The invention belongs to the technical field of image recognition, and particularly relates to a full-automatic fisheye opening and closing indicator state recognition method and system.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
An existing transformer substation disconnecting switch detection and identification method is based on the Fast R-CNN model: a deep neural network extracts and analyzes features of the disconnecting switch in the inspection image. This approach effectively improves the accuracy and speed of switch detection and identification and has strong robustness, but it requires a large amount of prior annotation and training, its inference is computationally expensive, and it is difficult to meet real-time detection requirements on edge devices.
A conventional fisheye-type opening and closing state identification method for transformer substations acquires and analyzes the indicator image from directly above: taking the intersection of the boundary lines of the two observation windows as the circle center and the position of the character inside an observation window as the end point, it measures the angle by which the line from center to end point deviates from the observation window boundary, identifies the opening and closing state from this angle, and thereby quantitatively analyzes how completely the switch has opened or closed. In this method a YOLOv3 network detects the observation windows of the fisheye indicator, which again requires extensive annotation and training; character detection relies on template matching, so template images of the opening and closing identification characters must be prepared in advance for different models, fonts and angles.
In addition, another method segments the non-observation-window region of the fisheye opening and closing image collected by the acquisition device, obtains a mask image of the character region by XOR-ing the segmented images, extracts the original image under the character-region mask, and segments it on the S channel to obtain the character image on the S channel; the character image on the S channel is connected to obtain the center point of the connected character image, from which the offset angle of the characters in the character region is determined; the original image under the character-region mask is then segmented on the H channel to obtain the character image on the H channel, a pixel histogram of that image is computed, and the fisheye indicator is judged to be in the switching-off state when the pixel values of the character image fall within a first preset range, and in the switching-on state otherwise. At detection time, however, several manually annotated feature points and seed points must be supplied in advance, and re-annotation is needed whenever a new device is deployed or the camera position changes, which makes large-scale application cumbersome.
Disclosure of Invention
In order to solve the above problems, the invention provides a full-automatic fisheye opening and closing indicator state identification method and system. Without any prior annotation information, the elliptical panel region to be detected is determined from the contour information of the connected domains of the indicator monitoring image, the observation window region is then segmented automatically, and the state of the indicator is judged from the color information of the pixel points in the observation window. The method adapts to fisheye indicators of various colors and patterns and remains robust under small to moderate amounts of reflected light.
According to some embodiments, a first aspect of the present invention provides a method for identifying a state of a full-automatic fisheye opening and closing indicator, which adopts the following technical solutions:
a full-automatic fisheye opening and closing indicator state identification method comprises the following steps:
acquiring a monitor image of the indicator;
preprocessing the indicator monitoring image, and extracting contour information of each connected domain according to the preprocessed indicator image to determine an elliptical panel area to be detected;
according to the minimum intra-class standard deviation principle, selecting the segmentation with the smallest intra-class standard deviation, segmenting the elliptical panel region to be detected into four fan-shaped regions, and selecting the most vivid region as the observation window by comparing the color vividness of each region;
and judging the state of the indicator according to the color information of the pixel points in the observation window.
Further, the pre-processing based on the indicator monitoring image comprises:
carrying out image graying on the monitoring image of the indicator;
carrying out binarization segmentation on the grayed indicator monitoring image by using a sauvola algorithm;
negating the binarization result;
and (5) performing mathematical morphology open operation by using the kernel with the size of 5 x 5 to remove the impurity points, and obtaining a preprocessed indicator monitoring image.
Further, the extracting contour information of each connected domain according to the preprocessed indicator image to determine the elliptical panel area to be detected includes:
detecting the outline of the preprocessed indicator monitoring image, and storing an outline result in a non-compression mode, namely, sequentially storing each point of the outline to obtain all the outlines of the preprocessed indicator monitoring image;
and traversing each contour to be screened, selecting a contour with the highest similarity with the ellipse, and taking the fitted ellipse as the area of the elliptical panel to be detected.
Further, traversing each contour to be screened, selecting the contour with the highest similarity to an ellipse, and using its fitted ellipse as the elliptical panel region to be detected comprises:
denoting each contour as Contours_i, where i is the contour index;
determining the convex hull of each contour, denoted ConvexHull_i;
determining the ellipse fitted to each contour to obtain the elliptical contour Ellipse_i;
calculating the shape similarity between every Contours_i and ConvexHull_i, denoted Sim1_i;
calculating the shape similarity between every ConvexHull_i and Ellipse_i, denoted Sim2_i;
taking Sim_i = (Sim1_i + 0.0001) * (Sim2_i + 0.0001) as the evaluation index, where a smaller value indicates higher similarity among the contour, the convex hull and the fitted ellipse;
and selecting the contour with the smallest Sim_i, taking its fitted ellipse as the elliptical panel region to be detected, denoted Ellipse, the mask of this region being denoted Mask and its center point Center.
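A minimal Python/OpenCV sketch of this contour-screening step is given below. It is an illustration under stated assumptions: the function name select_panel_ellipse, the minimum-area filter of 100 pixels, and the use of cv2.matchShapes as the shape-similarity measure for Sim1 and Sim2 are choices made here for illustration and are not prescribed by the text above.

```python
import cv2
import numpy as np

def select_panel_ellipse(binary_img):
    """Pick the contour whose shape is closest to an ellipse (smallest Sim_i).

    Returns (ellipse, mask, center) or None if no usable contour exists.
    """
    contours, _ = cv2.findContours(binary_img, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)  # OpenCV 4
    best = None
    for cnt in contours:
        # fitEllipse needs at least 5 points; the area threshold is an illustrative filter.
        if len(cnt) < 5 or cv2.contourArea(cnt) < 100:
            continue
        hull = cv2.convexHull(cnt)
        ellipse = cv2.fitEllipse(cnt)
        # Rasterize the fitted ellipse so it can be compared to the hull as a contour.
        ell_img = np.zeros(binary_img.shape, np.uint8)
        cv2.ellipse(ell_img, ellipse, 255, 1)
        ell_cnts, _ = cv2.findContours(ell_img, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
        if not ell_cnts:
            continue
        sim1 = cv2.matchShapes(cnt, hull, cv2.CONTOURS_MATCH_I1, 0.0)          # Sim1_i
        sim2 = cv2.matchShapes(hull, ell_cnts[0], cv2.CONTOURS_MATCH_I1, 0.0)  # Sim2_i
        sim = (sim1 + 0.0001) * (sim2 + 0.0001)  # smaller value = more ellipse-like
        if best is None or sim < best[0]:
            best = (sim, ellipse)
    if best is None:
        return None
    ellipse = best[1]
    mask = np.zeros(binary_img.shape, np.uint8)
    cv2.ellipse(mask, ellipse, 255, -1)                   # filled Mask of the panel region
    center = tuple(int(round(v)) for v in ellipse[0])     # Center
    return ellipse, mask, center
```

Rasterizing the fitted ellipse lets the same shape-similarity call compare the convex hull and the ellipse on equal terms.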
Further, selecting, according to the minimum intra-class standard deviation principle, the segmentation with the smallest intra-class standard deviation, segmenting the elliptical panel region to be detected into four fan-shaped regions, and selecting the most vivid region as the observation window by comparing the color vividness of each region comprises:
traversing each point of the indicator monitoring image inside the elliptical panel region to be detected, calculating its angle relative to the center point of the region and rounding it to the integer d; if d equals 360, d is set to 0, so the integer d ∈ [0, 359];
establishing 360 sets, denoted Points_d, and assigning each point, according to its angle d, to the set Points_d corresponding to that angle;
establishing three arrays Avg_R, Avg_G and Avg_B, traversing the integer angle d over [0, 359], calculating the mean of the points in each Points_d on the red, green and blue channels and storing the means into Avg_R[d], Avg_G[d] and Avg_B[d], so as to obtain the mean of the point set of each angle on each of the red, green and blue components;
based on the mean of the point set of each angle on each of the red, green and blue components, with the minimum intra-class standard deviation principle as the segmentation criterion, segmenting the complete 360-degree elliptical panel region into 4 continuous and disjoint elliptical sector regions denoted A_n, n ∈ {1, 2, 3, 4}, where A_1 and A_3 have opposite central angles and A_2 and A_4 have opposite central angles;
calculating, for each pixel point (x, y) of the original image inside the elliptical region,
MAXRGB(x,y) = max(R(x,y), G(x,y), B(x,y))
MINRGB(x,y) = min(R(x,y), G(x,y), B(x,y))
COLORFUL(x,y) = (MAXRGB(x,y) - MINRGB(x,y)) * MAXRGB(x,y)
where R(x, y), G(x, y), B(x, y) respectively represent the red, green and blue component values of the pixel (x, y) and COLORFUL(x, y) represents the color vividness of the pixel (x, y);
and comparing the means of COLORFUL(x, y) of the pixels in the 4 sector regions, taking the sector region with the largest mean as the finally determined observation window.
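A short sketch of the vividness comparison, assuming the four sector masks have already been produced by the segmentation described above; the helper names colorfulness and pick_observation_window are illustrative only.

```python
import numpy as np

def colorfulness(img_bgr):
    """COLORFUL(x, y) = (max(R, G, B) - min(R, G, B)) * max(R, G, B), per pixel."""
    img = img_bgr.astype(np.float32)
    maxrgb = img.max(axis=2)
    minrgb = img.min(axis=2)
    return (maxrgb - minrgb) * maxrgb

def pick_observation_window(img_bgr, sector_masks):
    """Return the index of the sector whose mean COLORFUL value is largest."""
    vivid = colorfulness(img_bgr)
    means = [vivid[m > 0].mean() for m in sector_masks]  # mean vividness per sector
    return int(np.argmax(means))
```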
Further, based on the mean of the point set of each angle on each of the red, green and blue components, with the minimum intra-class standard deviation principle as the segmentation criterion, the complete 360-degree elliptical panel region is segmented into 4 continuous and disjoint elliptical sector regions by the following specific steps:
for any A_1, its central angle is denoted SIZE_A_1; by prior knowledge SIZE_A_1 ∈ [30, 90] and SIZE_A_1 is an integer; then the central angle of A_2 is SIZE_A_2 = 180 - SIZE_A_1, the central angle of A_3 is SIZE_A_3 = SIZE_A_1, and the central angle of A_4 is SIZE_A_4 = 180 - SIZE_A_1;
for any A_1, its starting angle is denoted START_A_1, where START_A_1 ∈ [0, 359] and START_A_1 is an integer; then the angular interval of A_1 is [START_A_1, START_A_1 + SIZE_A_1), and by analogy the angular intervals of A_2, A_3 and A_4 are obtained;
taking the red component as an example, the standard deviation among the per-angle means inside region A_n is calculated, i.e. the standard deviation of the values of the array Avg_R over the index range [START_A_n, START_A_n + SIZE_A_n), denoted
Std_R(A_n) = standard deviation of { Avg_R[d] : d ∈ [START_A_n, START_A_n + SIZE_A_n) };
the standard deviation under the green component, Std_G(A_n), and the standard deviation under the blue component, Std_B(A_n), are obtained in the same way;
the intra-class standard deviation of each A_n after region color fusion is calculated as:
[formula image in the original: Std(A_n), the fused intra-class standard deviation combining Std_R(A_n), Std_G(A_n) and Std_B(A_n)]
for an arbitrarily determined START_A_1 and SIZE_A_1, taken as one candidate segmentation, the weighted sum of the intra-class standard deviations of all regions under this segmentation is calculated:
[formula image in the original: the weighted sum of Std(A_n) over the four regions for the segmentation determined by START_A_1 and SIZE_A_1]
all segmentations, i.e. all combinations of START_A_1 and SIZE_A_1, are traversed, and the START_A_1 and SIZE_A_1 that minimize this weighted sum are taken as the optimal result; the final 4 continuous and disjoint elliptical sector regions are determined based on the optimal result.
Further, determining the state of the indicator according to the color information of the pixel points in the observation window comprises:
converting the indicator monitoring image into the HSV color space and calculating the average saturation of all pixels in the observation window, denoted S_AVG;
judging, for any pixel point (x, y) in the observation window, whether it is red or green according to its color information, with the formulas:
MAXRGB(x,y) = max(R(x,y), G(x,y), B(x,y))
[formula images in the original: the binary indicators Red(x, y) and Green(x, y), defined from R(x, y), G(x, y), B(x, y), MAXRGB(x, y), the saturation S(x, y) and the average saturation S_AVG]
wherein R(x, y), G(x, y), B(x, y) respectively represent the red, green and blue component values of the pixel (x, y) and S(x, y) represents its saturation; Red(x, y) = 1 means the pixel (x, y) is red and Green(x, y) = 1 means it is green; both may be 0, or both may equal 1 (e.g. for pure yellow), without affecting the detection result;
counting the numbers of all pixels in the observation window judged to be red and green, denoted RedCount and GreenCount, with the formulas:
RedCount = ∑_(x,y) Red(x, y)
GreenCount = ∑_(x,y) Green(x, y)
and finally judging the current state of the fisheye opening and closing indicator: if RedCount is greater than GreenCount + 100 the current state is judged to be the "closing state", if GreenCount is greater than RedCount + 100 the current state is judged to be the "opening state", and otherwise the current state is judged to be the "abnormal state".
According to some embodiments, the second aspect of the present invention provides a full-automatic fisheye opening and closing indicator state identification system, which adopts the following technical solutions:
the utility model provides a full-automatic fisheye divide-shut brake indicator state identification system, includes:
an image acquisition module configured to acquire a pointer monitoring image;
the oval panel area determining module is configured to preprocess the indicator monitoring image and extract contour information of each connected domain according to the preprocessed indicator image to determine an oval panel area to be detected;
the observation window area determining module is configured to select a division mode with the minimum in-class standard deviation according to a minimum in-class standard deviation principle, divide the elliptical panel area to be detected into four fan-shaped areas, and select an area with the most vivid color as an observation window by comparing the color vividness of each area;
and the indicator state judging module is configured to judge the state of the indicator according to the color information of the pixel points in the observation region window.
According to some embodiments, a third aspect of the invention provides a computer-readable storage medium.
A computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps in a fully automatic fisheye opening and closing indicator state identification method according to the first aspect.
According to some embodiments, a fourth aspect of the invention provides a computer apparatus.
A computer device comprising a memory, a processor and a computer program stored on the memory and operable on the processor, wherein the processor executes the program to implement the steps of the method for identifying the state of the fully automatic fisheye opening and closing indicator according to the first aspect.
Compared with the prior art, the invention has the beneficial effects that:
the invention realizes the identification, positioning and segmentation of the elliptical panel area on the basis of no need of prior marking information; meanwhile, the area of the observation window can be automatically segmented without marking or templates; the method gradually increases pixel level characteristics from two levels of angles and areas, can adapt to fish-eye opening and closing of various colors and patterns, has good robustness under the condition of light reflection in a medium and small range, and can effectively identify the fish-eye opening and closing state under the condition of no mark and no template.
The invention provides a fish eye type opening and closing observation window segmentation method based on the minimum intra-class standard deviation principle, which can automatically segment the area of an observation window without marking or a template; the method gradually increases the pixel level characteristics from two levels of angles and areas, can adapt to fish-eye type opening and closing of various colors and patterns, and has good robustness under the condition of light reflection in a medium and small range. The minimum intra-class standard deviation principle is equal to the maximum inter-class standard deviation even if the intra-class standard deviation of each segmentation region is minimized, and has good adaptivity (no threshold selection is needed) to the class of the two-peak segmentation problem with obvious foreground and background differences.
The invention provides a fisheye type opening and closing oval panel detection method based on contour information, which realizes identification, positioning and segmentation of oval panel areas on the basis of no need of prior marking information. The method mainly identifies the outline of the connected domain, and selects the target closest to the ellipse as the detected ellipse panel area by calculating the similarity among the convex hull of the outline, the fitting ellipse and the outline.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention without limiting it.
Fig. 1 is a flowchart of a method for identifying a state of a full-automatic fisheye opening and closing indicator according to an embodiment of the invention;
fig. 2 shows the detection result and the segmentation result of the opening and closing elliptical panel region in the full-automatic fisheye opening and closing indicator state identification method according to the embodiment of the invention.
Detailed Description
The invention is further described with reference to the following figures and examples.
It is to be understood that the following detailed description is exemplary and is intended to provide further explanation of the invention as claimed. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
The embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
Example one
As shown in fig. 1, the present embodiment provides a full-automatic fisheye opening and closing indicator state identification method, and the present embodiment is exemplified by applying the method to a server; it is to be understood that the method may also be applied to a terminal, or to a system including a terminal and a server, implemented through interaction between the terminal and the server. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms. The terminal may be, but is not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal and the server may be directly or indirectly connected through wired or wireless communication, which is not limited in this application. In this embodiment, the method includes the steps of:
acquiring a monitor image of the indicator;
preprocessing the indicator monitoring image, and extracting contour information of each connected domain according to the preprocessed indicator image to determine an elliptical panel area to be detected;
according to the minimum intra-class standard deviation principle, selecting the segmentation with the smallest intra-class standard deviation, segmenting the elliptical panel region to be detected into four fan-shaped regions, and selecting the most vivid region as the observation window by comparing the color vividness of each region;
and judging the state of the indicator according to the color information of the pixel points in the observation window.
Specifically, the method of this embodiment includes the following steps:
step 1, detecting elliptical panel area, namely fish-eye type opening and closing elliptical panel detection method based on contour information
Briefly described: after the indicator monitoring image is preprocessed, extracting outline information of each connected domain, selecting an outline closest to the ellipse by calculating the similarity among a convex hull of the outline, a fitting ellipse and the outline, and taking the fitted ellipse as a detected ellipse panel area.
The detailed steps are as follows:
1.1 graying of the image.
1.2 Binarize the image with the Sauvola algorithm. In this implementation the neighborhood used to compute the local mean and variance is 75 × 75, and the parameters of the local threshold T are R = 128 and k = 0.2.
1.3 negating the binarization result of the step 1.2.
1.4 mathematical morphology opening with 5 x 5 sized kernels to eliminate outliers.
1.5 detecting the contour, and storing the contour result in a non-compression mode, namely, sequentially storing each point of the contour.
1.6 After excluding inner contours (holes) and contours whose area is too small, traverse each remaining contour to be screened, select the contour closest to an ellipse, and use its fitted ellipse as the detected panel region.
The method comprises the following specific steps:
1.6.1 Denote each contour as Contours_i, where i is the contour index (the same below);
1.6.2 Compute the convex hull of each contour, denoted ConvexHull_i;
1.6.3 Compute the ellipse fitted to each contour, obtaining the elliptical contour Ellipse_i;
1.6.4 Compute the shape similarity between every Contours_i and ConvexHull_i, denoted Sim1_i;
1.6.5 Compute the shape similarity between every ConvexHull_i and Ellipse_i, denoted Sim2_i;
1.6.6 Take Sim_i = (Sim1_i + 0.0001) * (Sim2_i + 0.0001) as the evaluation index; the smaller the value, the higher the similarity (degree of coincidence) among the contour, the convex hull and the fitted ellipse; the 0.0001 added in the calculation prevents one factor from being too small (e.g. 1e-6) and unbalancing the weighting;
1.6.7 Select the contour with the smallest Sim_i and take its fitted ellipse as the panel region, denoted Ellipse; the mask of this region is denoted Mask and its center point Center. The result is shown in fig. 2, in which the white ellipse is the detected elliptical panel region.
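A minimal sketch of the preprocessing in steps 1.1 to 1.4, assuming scikit-image's threshold_sauvola stands in for the Sauvola binarization with the 75 × 75 window and k = 0.2 stated in step 1.2 (its r parameter is assumed to correspond to the R = 128 mentioned there); the function name preprocess_indicator is illustrative. Together with the select_panel_ellipse sketch shown earlier, this covers step 1 end to end.

```python
import cv2
import numpy as np
from skimage.filters import threshold_sauvola

def preprocess_indicator(img_bgr):
    """Steps 1.1-1.4: grayscale, Sauvola binarization, inversion, 5x5 opening."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)                 # 1.1 grayscale
    thresh = threshold_sauvola(gray, window_size=75, k=0.2, r=128)   # 1.2 local threshold T
    binary = (gray > thresh).astype(np.uint8) * 255
    binary = cv2.bitwise_not(binary)                                 # 1.3 invert
    kernel = np.ones((5, 5), np.uint8)
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)        # 1.4 remove noise points
    return binary
```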
Step 2, determination of the observation window region, i.e. a fisheye opening and closing observation window segmentation method based on the minimum intra-class standard deviation principle;
Brief description: the elliptical region obtained in step 1 is divided into 4 fan-shaped regions; all possible segmentations are traversed and, according to the minimum intra-class standard deviation principle, the segmentation with the smallest intra-class standard deviation is selected as the final segmentation result; the most vivid region is then selected as the observation window by comparing the color vividness of each region (a code sketch of this step is given after step 2.6).
The detailed steps are as follows:
2.1 Traverse each point of the original image inside the Mask region, calculate its angle relative to Center and round it to the integer d; if d equals 360, set d to 0, so the integer d ∈ [0, 359].
2.2 Build 360 sets, denoted Points_d, and assign each point, according to the angle calculated in step 2.1, to the set Points_d corresponding to its angle d.
2.3 Build 3 arrays Avg_R, Avg_G and Avg_B; traverse the integer angle d over [0, 359], compute the mean of the points in each Points_d on the red, green and blue channels and store them into Avg_R[d], Avg_G[d] and Avg_B[d]. This yields the mean of the point set at every angle on each of the red, green and blue components. At this point each array has size 360; to support the subsequent calculations, each array is expanded to size 720 so that for any integer d ∈ [0, 359], Avg_R[d+360] = Avg_R[d], Avg_G[d+360] = Avg_G[d] and Avg_B[d+360] = Avg_B[d].
2.4 Taking the means obtained in step 2.3 as the base data and the minimum intra-class standard deviation principle as the segmentation criterion, segment the complete 360-degree panel region into 4 continuous and disjoint elliptical sector regions, denoted A_n, n ∈ {1, 2, 3, 4} (here and below the subscript n is the region number), requiring that A_1 and A_3 have opposite central angles and that A_2 and A_4 have opposite central angles. The specific steps are as follows:
2.4.1 For any A_1, denote its central angle as SIZE_A_1; by prior knowledge set SIZE_A_1 ∈ [30, 90] with SIZE_A_1 an integer; then the central angle of A_2 is SIZE_A_2 = 180 - SIZE_A_1, the central angle of A_3 is SIZE_A_3 = SIZE_A_1, and the central angle of A_4 is SIZE_A_4 = 180 - SIZE_A_1.
2.4.2 For any A_1, denote its starting angle as START_A_1, with START_A_1 ∈ [0, 359] an integer; then the angular interval of A_1 is [START_A_1, START_A_1 + SIZE_A_1), and by analogy the angular intervals of A_2, A_3 and A_4 are obtained.
2.4.3 On the basis of steps 2.4.1 and 2.4.2, taking the red component as an example, compute the standard deviation among the per-angle means inside region A_n, i.e. the standard deviation of the values of the array Avg_R over the index range [START_A_n, START_A_n + SIZE_A_n), denoted
Std_R(A_n) = standard deviation of { Avg_R[d] : d ∈ [START_A_n, START_A_n + SIZE_A_n) };
the standard deviation under the green component, Std_G(A_n), and the standard deviation under the blue component, Std_B(A_n), are obtained in the same way.
2.4.4 Compute for each A_n the intra-class standard deviation after region color fusion:
[formula image in the original: Std(A_n), the fused intra-class standard deviation combining Std_R(A_n), Std_G(A_n) and Std_B(A_n)]
2.4.5 For an arbitrarily determined START_A_1 and SIZE_A_1, taken as one candidate segmentation, compute the weighted sum of the intra-class standard deviations of all regions under this segmentation:
[formula image in the original: the weighted sum of Std(A_n) over the four regions for the segmentation determined by START_A_1 and SIZE_A_1]
2.4.6 Traverse all segmentations, i.e. all combinations of START_A_1 and SIZE_A_1, and take the START_A_1 and SIZE_A_1 that minimize this weighted sum as the optimal result; the ranges of the 4 regions are thereby determined. As shown in fig. 2, the optimal segmentation result is the 4 sectors separated by the 4 white "lines" inside the elliptical region.
2.5 for each pixel point (x, y) in the original image within the elliptical area, calculate
MAXRGB(x,y)=max(R(x,y),G(x,y),B(x,y)) (3)
MINRGB(x,y)=min(R(x,y),G(x,y),B(x,y)) (4)
COLORFUL(x,y)=(MAXRGB-MINRGB)*MAXRGB (5)
Wherein, R (x, y), G (x, y), B (x, y) respectively represent the red, green and blue component values of the pixel (x, y), and COLORFUL (x, y) represents the color vividness of the pixel (x, y) (similar to the saturation of HSV color space).
2.6 compare the averages of the colorfull (x, y) of the pixels in the 4 sectors, and take the sector with the largest average as the final determined observation window.
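The sketch below, referenced in the brief description of step 2, illustrates the angle binning of steps 2.1 to 2.3 and the exhaustive search of step 2.4. The per-channel intra-class standard deviation follows the text, but the color-fusion formula and the weights of the weighted sum are given only as images above, so the Euclidean fusion of the three channel standard deviations and the weighting by angular size used here are assumptions; all names are illustrative.

```python
import numpy as np

def per_angle_means(img_bgr, mask, center):
    """Steps 2.1-2.3: mean R, G, B of the masked pixels at every integer angle,
    duplicated to length 720 so wrap-around intervals can be sliced directly."""
    ys, xs = np.nonzero(mask)
    cx, cy = center
    d = np.rint(np.degrees(np.arctan2(ys - cy, xs - cx)) % 360).astype(int) % 360
    counts = np.maximum(np.bincount(d, minlength=360), 1)
    avg = np.empty((3, 720), np.float64)
    for ch, idx in zip(range(3), (2, 1, 0)):       # R, G, B channels of a BGR image
        sums = np.bincount(d, weights=img_bgr[ys, xs, idx].astype(np.float64), minlength=360)
        avg[ch, :360] = sums / counts
        avg[ch, 360:] = avg[ch, :360]
    return avg

def best_segmentation(avg):
    """Step 2.4: search all (START_A_1, SIZE_A_1) and keep the segmentation whose
    weighted intra-class standard deviation is smallest."""
    best = (np.inf, 0, 30)
    for size1 in range(30, 91):                    # SIZE_A_1 in [30, 90]
        sizes = (size1, 180 - size1, size1, 180 - size1)
        for start1 in range(360):                  # START_A_1 in [0, 359]
            starts = [start1]
            for s in sizes[:-1]:
                starts.append(starts[-1] + s)
            total = 0.0
            for st, sz in zip(starts, sizes):
                seg = avg[:, st:st + sz]
                fused = np.linalg.norm(seg.std(axis=1))   # ASSUMED fusion of the R/G/B stds
                total += (sz / 360.0) * fused             # ASSUMED weighting by angular size
            if total < best[0]:
                best = (total, start1, size1)
    return best[1], best[2]                        # optimal START_A_1, SIZE_A_1
```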
Step 3, judging the current opening and closing state of the fisheye type opening and closing brake
And 3.1, converting the image into an HSV color space, and calculating the average saturation of all pixels in the observation window obtained in the step 2 and marking as S _ AVG.
3.2 to any pixel point (x, y) in the observation window, judge whether it is red or green according to its color information, the formula is as follows:
MAXRGB(x,y)=max(R(x,y),G(x,y),B(x,y)) (3)
[formula images in the original: the binary indicators Red(x, y) and Green(x, y), defined from R(x, y), G(x, y), B(x, y), MAXRGB(x, y), the saturation S(x, y) and the average saturation S_AVG]
where R(x, y), G(x, y), B(x, y) respectively represent the red, green and blue component values of the pixel (x, y) and S(x, y) represents its saturation; Red(x, y) = 1 means the pixel (x, y) is red and Green(x, y) = 1 means it is green; both may be 0, or both may equal 1 (e.g. for pure yellow), without affecting the detection result.
3.3 count the number of all pixels in the viewing window that were judged red and green in step 3.2, denoted as red count and GreenCount, as follows:
RedCount = ∑_(x,y) Red(x, y) (8)
GreenCount = ∑_(x,y) Green(x, y) (9)
and 3.4, finally judging the current state of the fisheye opening and closing brake, judging the current state to be in an 'on state' if the RedCount is greater than GreenCount +100, judging the current state to be in an 'off state' if the GreenCount is greater than RedCount +100, and judging the current state to be in an 'abnormal state' if the GreenCount is greater than RedCount + 100.
Example two
This embodiment provides a full-automatic fisheye opening and closing indicator state identification system, which includes:
an image acquisition module configured to acquire an indicator monitoring image;
an elliptical panel region determining module configured to preprocess the indicator monitoring image and extract the contour information of each connected domain from the preprocessed indicator image to determine the elliptical panel region to be detected;
an observation window region determining module configured to select, according to the minimum intra-class standard deviation principle, the segmentation with the smallest intra-class standard deviation, to segment the elliptical panel region to be detected into four fan-shaped regions, and to select the most vivid region as the observation window by comparing the color vividness of each region;
and an indicator state judging module configured to judge the state of the indicator according to the color information of the pixel points in the observation window.
The implementation and application scenarios of the above modules are the same as those of the corresponding steps in Example One, but are not limited to the disclosure of Example One. It should be noted that the modules described above, as part of a system, may be implemented in a computer system such as a set of computer-executable instructions.
In the foregoing embodiments, the descriptions of the embodiments have different emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The proposed system can be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the above-described modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules may be combined or integrated into another system, or some features may be omitted, or not executed.
EXAMPLE III
The present embodiment provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps in a fully automatic fisheye opening and closing indicator state identification method as described in the first embodiment above.
Example four
The embodiment provides a computer device, which includes a memory, a processor and a computer program stored in the memory and executable on the processor, and when the processor executes the program, the processor implements the steps in the method for identifying the state of the full-automatic fisheye opening and closing indicator according to the first embodiment.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, it is not intended to limit the scope of the present invention, and it should be understood by those skilled in the art that various modifications and variations can be made without inventive efforts by those skilled in the art based on the technical solution of the present invention.

Claims (10)

1. The full-automatic fisheye opening and closing indicator state identification method is characterized by comprising the following steps of:
acquiring a monitor image of the indicator;
preprocessing the indicator monitoring image, and extracting contour information of each connected domain according to the preprocessed indicator image to determine an elliptical panel area to be detected;
according to the minimum intra-class standard deviation principle, selecting the segmentation with the smallest intra-class standard deviation, segmenting the elliptical panel region to be detected into four fan-shaped regions, and selecting the most vivid region as the observation window by comparing the color vividness of each region;
and judging the state of the indicator according to the color information of the pixel points in the observation window.
2. The method for identifying the state of the full-automatic fisheye opening and closing indicator as claimed in claim 1, wherein the preprocessing based on the indicator monitoring image comprises:
carrying out image graying on the monitoring image of the indicator;
carrying out binarization segmentation on the grayed indicator monitoring image by using a sauvola algorithm;
negating the binarization result;
and (5) performing mathematical morphology open operation by using the kernel with the size of 5 x 5 to remove the impurity points, and obtaining a preprocessed indicator monitoring image.
3. The method for identifying the state of the full-automatic fisheye opening and closing indicator as claimed in claim 1, wherein the step of extracting the contour information of each connected domain according to the preprocessed indicator image to determine the elliptical panel region to be detected comprises the steps of:
detecting the outline of the preprocessed indicator monitoring image, and storing an outline result in a non-compression mode, namely, sequentially storing each point of the outline to obtain all the outlines of the preprocessed indicator monitoring image;
and traversing each contour to be screened, selecting a contour with the highest similarity with the ellipse, and taking the fitted ellipse as the area of the elliptical panel to be detected.
4. The full-automatic fisheye opening and closing indicator state identification method of claim 3, wherein traversing each contour to be screened, selecting the contour with the highest similarity to an ellipse, and using its fitted ellipse as the elliptical panel region to be detected comprises:
denoting each contour as Contours_i, where i is the contour index;
determining the convex hull of each contour, denoted ConvexHull_i;
determining the ellipse fitted to each contour to obtain the elliptical contour Ellipse_i;
calculating the shape similarity between every Contours_i and ConvexHull_i, denoted Sim1_i;
calculating the shape similarity between every ConvexHull_i and Ellipse_i, denoted Sim2_i;
taking Sim_i = (Sim1_i + 0.0001) * (Sim2_i + 0.0001) as the evaluation index, where a smaller value indicates higher similarity among the contour, the convex hull and the fitted ellipse;
and selecting the contour with the smallest Sim_i, taking its fitted ellipse as the elliptical panel region to be detected, denoted Ellipse, the mask of this region being denoted Mask and its center point Center.
5. The method for identifying the state of the full-automatic fisheye opening and closing indicator as claimed in claim 1, wherein selecting, according to the minimum intra-class standard deviation principle, the segmentation with the smallest intra-class standard deviation as the final segmentation result and then selecting the most vivid region as the observation window by comparing the color vividness of each region comprises:
traversing each point of the indicator monitoring image inside the elliptical panel region to be detected, calculating its angle relative to the center point of the region and rounding it to the integer d; if d equals 360, d is set to 0, so the integer d ∈ [0, 359];
establishing 360 sets, denoted Points_d, and assigning each point, according to its angle d, to the set Points_d corresponding to that angle;
establishing three arrays Avg_R, Avg_G and Avg_B, traversing the integer angle d over [0, 359], calculating the mean of the points in each Points_d on the red, green and blue channels and storing the means into Avg_R[d], Avg_G[d] and Avg_B[d], so as to obtain the mean of the point set of each angle on each of the red, green and blue components;
based on the mean of the point set of each angle on each of the red, green and blue components, with the minimum intra-class standard deviation principle as the segmentation criterion, segmenting the complete 360-degree elliptical panel region into 4 continuous and disjoint elliptical sector regions denoted A_n, n ∈ {1, 2, 3, 4}, where A_1 and A_3 have opposite central angles and A_2 and A_4 have opposite central angles;
calculating each pixel point (x, y) in the elliptic region in the original image
MAXRGB(x,y)=max(R(x,y),G(x,y),B(x,y))
MINRGB(x,y)=min(R(x,y),G(x,y),B(x,y))
COLORFUL(x,y)=(MAXRGB-MINRGB)*MAXRGB
Wherein, R (x, y), G (x, y), B (x, y) represent the red, green and blue component values of the pixel (x, y) respectively, COLORFUL (x, y) represents the color vividness of the pixel (x, y);
and comparing the averages of COLORFUL (x, y) of the pixels in the 4 sector areas, and taking the sector area with the largest average as the finally determined observation window.
6. The method for identifying the state of the full-automatic fisheye opening and closing indicator as claimed in claim 5, wherein the complete 360-degree elliptical panel region is segmented into 4 continuous and disjoint elliptical sector regions based on the mean of the point set of each angle on each of the red, green and blue components, with the minimum intra-class standard deviation principle as the segmentation criterion, by the following specific steps:
for any A_1, its central angle is denoted SIZE_A_1; by prior knowledge SIZE_A_1 ∈ [30, 90] and SIZE_A_1 is an integer; then the central angle of A_2 is SIZE_A_2 = 180 - SIZE_A_1, the central angle of A_3 is SIZE_A_3 = SIZE_A_1, and the central angle of A_4 is SIZE_A_4 = 180 - SIZE_A_1;
for any A_1, its starting angle is denoted START_A_1, where START_A_1 ∈ [0, 359] and START_A_1 is an integer; then the angular interval of A_1 is [START_A_1, START_A_1 + SIZE_A_1), and by analogy the angular intervals of A_2, A_3 and A_4 are obtained;
taking the red component as an example, the standard deviation among the per-angle means inside region A_n is calculated, i.e. the standard deviation of the values of the array Avg_R over the index range [START_A_n, START_A_n + SIZE_A_n), denoted
Std_R(A_n) = standard deviation of { Avg_R[d] : d ∈ [START_A_n, START_A_n + SIZE_A_n) };
the standard deviation under the green component, Std_G(A_n), and the standard deviation under the blue component, Std_B(A_n), are obtained in the same way;
the intra-class standard deviation of each A_n after region color fusion is calculated as:
[formula image in the original: Std(A_n), the fused intra-class standard deviation combining Std_R(A_n), Std_G(A_n) and Std_B(A_n)]
for an arbitrarily determined START_A_1 and SIZE_A_1, taken as one candidate segmentation, the weighted sum of the intra-class standard deviations of all regions under this segmentation is calculated:
[formula image in the original: the weighted sum of Std(A_n) over the four regions for the segmentation determined by START_A_1 and SIZE_A_1]
all segmentations, i.e. all combinations of START_A_1 and SIZE_A_1, are traversed, and the START_A_1 and SIZE_A_1 that minimize this weighted sum are taken as the optimal result; the final 4 continuous and disjoint elliptical sector regions are determined based on the optimal result.
7. The method for identifying the state of the full-automatic fisheye opening and closing indicator as claimed in claim 1, wherein determining the state of the indicator according to the color information of the pixel points in the observation window comprises:
converting the indicator monitoring image into the HSV color space and calculating the average saturation of all pixels in the observation window, denoted S_AVG;
judging, for any pixel point (x, y) in the observation window, whether it is red or green according to its color information, with the formulas:
MAXRGB(x,y) = max(R(x,y), G(x,y), B(x,y))
[formula images in the original: the binary indicators Red(x, y) and Green(x, y), defined from R(x, y), G(x, y), B(x, y), MAXRGB(x, y), the saturation S(x, y) and the average saturation S_AVG]
wherein R(x, y), G(x, y), B(x, y) respectively represent the red, green and blue component values of the pixel (x, y) and S(x, y) represents its saturation; Red(x, y) = 1 means the pixel (x, y) is red and Green(x, y) = 1 means it is green; both may be 0, or both may equal 1 (e.g. for pure yellow), without affecting the detection result;
counting the numbers of all pixels in the observation window judged to be red and green, denoted RedCount and GreenCount, with the formulas:
RedCount = ∑_(x,y) Red(x, y)
GreenCount = ∑_(x,y) Green(x, y)
and finally judging the current state of the fisheye opening and closing indicator: if RedCount is greater than GreenCount + 100 the current state is judged to be the "closing state", if GreenCount is greater than RedCount + 100 the current state is judged to be the "opening state", and otherwise the current state is judged to be the "abnormal state".
8. A full-automatic fisheye opening and closing indicator state identification system, characterized by comprising:
an image acquisition module configured to acquire an indicator monitoring image;
an elliptical panel region determining module configured to preprocess the indicator monitoring image and extract the contour information of each connected domain from the preprocessed indicator image to determine the elliptical panel region to be detected;
an observation window region determining module configured to select, according to the minimum intra-class standard deviation principle, the segmentation with the smallest intra-class standard deviation, to segment the elliptical panel region to be detected into four fan-shaped regions, and to select the most vivid region as the observation window by comparing the color vividness of each region;
and an indicator state judging module configured to judge the state of the indicator according to the color information of the pixel points in the observation window.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of a method for fully automatic status recognition of a fisheye opening and closing indicator according to any one of claims 1-7.
10. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor executes the program to implement the steps of a fully automatic fisheye opening and closing indicator status recognition method as claimed in any one of claims 1-7.
CN202210549947.2A 2022-05-20 2022-05-20 Full-automatic fisheye opening and closing indicator state identification method and system Pending CN114973131A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210549947.2A CN114973131A (en) 2022-05-20 2022-05-20 Full-automatic fisheye opening and closing indicator state identification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210549947.2A CN114973131A (en) 2022-05-20 2022-05-20 Full-automatic fisheye opening and closing indicator state identification method and system

Publications (1)

Publication Number Publication Date
CN114973131A true CN114973131A (en) 2022-08-30

Family

ID=82985120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210549947.2A Pending CN114973131A (en) 2022-05-20 2022-05-20 Full-automatic fisheye opening and closing indicator state identification method and system

Country Status (1)

Country Link
CN (1) CN114973131A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117351499A (en) * 2023-12-04 2024-01-05 深圳市铁越电气有限公司 Split-combined indication state identification method, system, computer equipment and medium
CN117351499B (en) * 2023-12-04 2024-02-02 深圳市铁越电气有限公司 Split-combined indication state identification method, system, computer equipment and medium

Similar Documents

Publication Publication Date Title
CN111860533B (en) Image recognition method and device, storage medium and electronic device
CN108108761B (en) Rapid traffic signal lamp detection method based on deep feature learning
EP3455782B1 (en) System and method for detecting plant diseases
Chen et al. Automatic license-plate location and recognition based on feature salience
KR101640998B1 (en) Image processing apparatus and image processing method
CN109558806B (en) Method for detecting high-resolution remote sensing image change
WO2018019194A1 (en) Image recognition method, terminal, and nonvolatile storage medium
Agrawal et al. Grape leaf disease detection and classification using multi-class support vector machine
CN110458835B (en) Image processing method, device, equipment, system and medium
CN106803257B (en) Method for segmenting disease spots in crop disease leaf image
CN103577838A (en) Face recognition method and device
CN103295013A (en) Pared area based single-image shadow detection method
Le et al. Real time traffic sign detection using color and shape-based features
CN102184404B (en) Method and device for acquiring palm region in palm image
CN103034838A (en) Special vehicle instrument type identification and calibration method based on image characteristics
CN112132153B (en) Tomato fruit identification method and system based on clustering and morphological processing
CN112464942A (en) Computer vision-based overlapped tobacco leaf intelligent grading method
CN111695373B (en) Zebra stripes positioning method, system, medium and equipment
CN111709305B (en) Face age identification method based on local image block
CN114973131A (en) Full-automatic fisheye opening and closing indicator state identification method and system
CN113449639A (en) Non-contact data acquisition method for instrument by gateway of Internet of things
CN110544262A (en) cervical cell image segmentation method based on machine vision
Das et al. Human face detection in color images using HSV color histogram and WLD
CN111738310B (en) Material classification method, device, electronic equipment and storage medium
CN113344047A (en) Platen state identification method based on improved K-means algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination