CN114820616A - Equipment state detection method and device for flashing mode indicator light - Google Patents

Equipment state detection method and device for flashing mode indicator light

Info

Publication number
CN114820616A
Authority
CN
China
Prior art keywords
indicator light
image
equipment
images
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210745792.XA
Other languages
Chinese (zh)
Other versions
CN114820616B (en)
Inventor
李斌山
韩丹
常金琦
胡坤
雒厂辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Mengpa Xinchuang Technology Co ltd
Original Assignee
Beijing Mengpa Xinchuang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Mengpa Xinchuang Technology Co ltd filed Critical Beijing Mengpa Xinchuang Technology Co ltd
Priority to CN202210745792.XA priority Critical patent/CN114820616B/en
Publication of CN114820616A publication Critical patent/CN114820616A/en
Application granted granted Critical
Publication of CN114820616B publication Critical patent/CN114820616B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/90 - Determination of colour characteristics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20212 - Image combination
    • G06T 2207/20221 - Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a device state detection method and device for a flashing mode indicator light. The method comprises the following steps: acquiring a plurality of images of equipment to be detected at different preset time frames, wherein all the images of the equipment to be detected contain indicator light information; fusing all the acquired images of the equipment to be detected to form a fused image; carrying out target detection on the fused image and giving the position and color information of the indicator light; extracting the indicator light image blocks of all the images of the equipment to be detected according to the indicator light position, analyzing the pixel point characteristic values based on the color information of the indicator light image blocks, and giving the indicator light flashing mode; and giving the state of the equipment to be detected based on the corresponding relation between the indicator light flashing mode and the equipment state. By accurately and efficiently identifying the flashing state of the indicator light, intelligent detection of the equipment state is achieved.

Description

Equipment state detection method and device for flashing mode indicator light
Technical Field
The invention relates to the field of equipment state detection, in particular to an equipment state detection method and device for a flash mode indicator lamp.
Background
In the inspection scene of an information machine room, the indicator light is one of the most prominent features indicating the running state of equipment. Indicator light modes are divided into a color mode and a flashing mode, where the color mode usually uses red to indicate a fault, yellow to indicate an alarm, green to indicate normal and blue to indicate operation. In view of this, researchers have proposed recognizing the color of the indicator light with image acquisition equipment and image processing algorithms, and using the color as the basis for judging the operating or fault state of the equipment.
For example, patent publication No. CN112100039A discloses a device failure alarm method and system that determines the operating status of a device by identifying and detecting indicator light information. By replacing manual judgment and processing of this information, it improves the automatic monitoring capability of the machine room.
The existing methods all belong to static detection: they judge equipment faults and output alarms only by detecting and identifying the color information of the indicator light. By contrast, there is little research on indicator light detection in flashing mode.
Patent publication No. CN111815912A discloses a device status detection method, system, apparatus and readable storage medium that determine device operating status data from the flashing pattern features of a warning lamp, and determine the color and position features of the indicator lamp with a general feature recognition algorithm. It can meet the requirements of industrial sites, but its degree of intelligence is still insufficient.
In a real machine room inspection scene, the color and position information of the indicator light given by existing feature recognition means cannot meet the requirement of judging the flicker state of the indicator light, and the conventional image acquisition mode introduces errors into the flicker frequency that affect the judgment of the equipment state.
Therefore, how to design a method for detecting the device status of the blinking mode indicator light to achieve intelligent detection of the device status by accurately and efficiently identifying the blinking status of the indicator light is a problem to be solved by those skilled in the art.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a method and a device for detecting the equipment state of a flashing mode indicator light. Through accurate and efficient identification of the flashing state of the indicator light, intelligent detection of the equipment state is achieved.
In a first aspect, the present invention provides a device status detection method for a blinking mode indicator light, comprising the steps of:
acquiring a plurality of images of equipment to be detected in different preset time frames, wherein all the images of the equipment to be detected contain indicator light information;
fusing all the acquired images of the equipment to be detected to form a fused image;
carrying out target detection on the fusion image, and giving position and color information of the indicator light;
extracting indicator light image blocks of all the images of the equipment to be detected according to the positions of the indicator lights, analyzing the characteristic values of the pixel points based on the color information of the indicator light image blocks, and giving an indicator light flashing mode;
and giving the state of the equipment to be detected based on the corresponding relation between the indicating lamp flashing mode and the equipment state.
Through the combination of time frame information, image fusion and color change information, the identification precision of the indicator lamp can be improved, and the flashing parameters can be efficiently and quickly given, so that the flashing mode of the indicator lamp is obtained, and the running state of the equipment to be detected is given.
Further, fusing all the acquired images of the equipment to be detected to form a fused image specifically includes the following steps:
taking the images of the equipment to be detected from a portion of consecutive time frames, converting them one by one into grayscale images, and respectively giving the covariance matrix of the pixels of each grayscale image;
calculating respective eigenvalue and eigenvector according to all given covariance matrixes;
sorting the eigenvectors corresponding to the eigenvalues according to the magnitude of the eigenvalues, and calculating respective principal components to obtain a first principal component;
and based on the first principal component, performing inverse transformation on all the images of the equipment to be detected to obtain a fused image.
When identifying the flashing state of the indicator light, the pixel values at the indicator light differ greatly between the images of the equipment to be detected taken at different preset time frames, so directly performing position and color analysis on all of the images produces large errors; image fusion makes it possible to determine the position of the indicator light accurately.
Further, the covariance matrix of the grayscale map is specifically expressed as:
Cov = (formula given as an image in the original publication)
where Cov is the covariance matrix of the grayscale image, c_xy is the pixel value at horizontal position x and vertical position y, M is the number of pixels of the grayscale image in the horizontal direction, and N is the number of pixels of the grayscale image in the vertical direction;
the formula for the calculation of the principal components is as follows:
P = (formula given as an image in the original publication)
where P is the principal component, G_i is the pixel value matrix of the i-th grayscale image, V_i is the eigenvector of the i-th grayscale image, and K is the number of images of the equipment to be detected taken from the portion of consecutive time frames;
based on the first principal component, all the images of the equipment to be detected are inversely transformed, and the calculation formula is as follows:
I_f = (formula given as an image in the original publication)
where I_f is the pixel value after the inverse transformation, P_1 is the first principal component, λ_i is the eigenvalue of the i-th image of the equipment to be detected, and L is the number of all the images of the equipment to be detected.
The covariance matrix is adopted and the mode of sequencing according to the characteristic values is adopted to express the correlation among the pixel points in the image, so that the information of the correlation and the larger difference among the pixel points under different dimensions is reserved, and high-frequency parts (such as indicator light positions) in all the images to be detected are reserved.
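As an illustration of this fusion step, the sketch below performs a PCA-style fusion of the captured frames with NumPy and OpenCV. It is a minimal sketch under stated assumptions, not the patented implementation: the covariance here is computed across the stacked grayscale frames and the frames are blended with weights taken from the leading eigenvector, which is a common PCA-fusion variant and may differ in detail from the per-image covariance formulation described above.

import numpy as np
import cv2

def pca_fuse(frames):
    # Convert each BGR frame to a floating-point grayscale image.
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY).astype(np.float64) for f in frames]
    # Stack frames as rows: one variable per frame, one observation per pixel.
    data = np.stack([g.ravel() for g in grays])
    # Covariance between frames (K x K) and its eigen-decomposition.
    cov = np.cov(data)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Eigenvector belonging to the largest eigenvalue acts as the "first principal component" direction.
    first = eigvecs[:, np.argmax(eigvals)]
    weights = np.abs(first) / np.abs(first).sum()
    # Weighted combination of the grayscale frames plays the role of the inverse transform.
    fused = sum(w * g for w, g in zip(weights, grays))
    return np.clip(fused, 0, 255).astype(np.uint8)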
Further, the target detection is a pre-trained indicator light detection model, and the pre-training process of the indicator light detection model specifically includes:
collecting N training images containing indicator lamps;
marking the type of the indicator light in each training image to form a corresponding label, and generating a training data set in which the training images correspond to the labels one by one;
and pre-training the target detection model by utilizing the training data set to reach a set convergence range, completing the pre-training process and generating an indicator light detection model.
Further, extracting the indicator light image blocks of all the images of the equipment to be detected according to the indicator light position, analyzing the pixel point characteristic values based on the color information of the indicator light image blocks, and giving the indicator light flashing mode specifically includes:
representing the position of the indicator light obtained by target detection as the coordinate position of the target frame;
based on the coordinate position of the target frame, extracting image blocks corresponding to all the images of the equipment to be detected according to a time frame sequence, acquiring the image blocks of the indicator light, and forming an image block data set of the indicator light;
carrying out gray level transformation on the indicator light image block data set to obtain an indicator light image gray level set;
calculating the average gray of each indicating lamp gray image in the indicating lamp image gray set to form an average gray array of all the image gray images of the equipment to be detected;
calculating the characteristic value of the average gray array, and giving out the flicker parameter of the indicator light in the image to be detected;
extracting time frame information of all equipment images to be detected, and calculating the time interval of continuous equipment images to be detected according to the time frame information;
and giving a flashing mode of the indicator light according to the flashing parameters and the time interval of the indicator light.
Through calculation of a gray array in a gray map, the change parameters of the image block of the indicator light in a preset time frame are given, so that the flicker mode of the indicator light is determined. By adopting a calculation mode of characteristic values in the gray level array and combining time information of the collected images and position positioning of image fusion, the flash mode identification of the indicator lamp can be ensured to be efficient and rapid, and the identification accuracy can be ensured.
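The patch-extraction and gray-averaging steps above can be sketched as follows. This is an illustrative sketch only: the target-frame coordinates are assumed to be the pixel coordinates of the upper-left and lower-right corners, matching the representation described in the detailed embodiment.

import numpy as np
import cv2

def average_gray_array(frames, box):
    # box: (x1, y1, x2, y2) coordinates of the detected indicator-light target frame.
    x1, y1, x2, y2 = box
    averages = []
    for frame in frames:                                # frames are in time-frame order
        patch = frame[y1:y2, x1:x2]                     # indicator-light image block
        gray = cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY)  # gray-level transform of the block
        averages.append(float(gray.mean()))             # average gray of the W x H block
    return np.asarray(averages)                         # average gray array, one value per frame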
Further, calculating the average gray scale of each indicating lamp gray scale image in the indicating lamp image gray scale set to form an average gray scale array of all the image gray scale images of the equipment to be detected, specifically comprising:
the calculation formula of the average gray scale of each indicating lamp gray scale image is as follows:
Ḡ = (1 / (W × H)) · Σ_{x=1}^{W} Σ_{y=1}^{H} g(x, y)
where W is the width of the target frame, H is the height of the target frame, Ḡ is the average gray of a single grayscale image, and g(x, y) is the pixel value of the single grayscale image at position (x, y);
the average gray array of the grayscale images of all the equipment to be detected is:
A = [Ḡ_1, Ḡ_2, …, Ḡ_L]
where L is the number of all the images of the equipment to be detected.
Further, calculating the characteristic value of the average gray array and giving the flicker parameter of the indicator light in the images to be detected specifically includes:
giving the variance of the average gray array, wherein the calculation formula is:
σ² = (1 / L) · Σ_{i=1}^{L} (Ḡ_i − μ)²
where σ² is the variance of the average gray array, Ḡ_i is the average gray of the i-th grayscale image, and μ is the mean of the average gray array;
and analyzing the fluctuation of the average gray level array based on the relation between the variance value and the set variance threshold value, and judging the flicker parameter of the indicator light.
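A minimal sketch of this variance check is given below; the threshold value is an assumed placeholder, since the patent only states that a preset variance threshold is compared against.

import numpy as np

def is_blinking(gray_array, variance_threshold=100.0):
    # Large fluctuation of the average gray array indicates a flashing indicator light.
    return float(np.var(gray_array)) > variance_threshold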
Further, according to the flashing parameters and the time interval of the indicator light, a flashing mode of the indicator light is given, and the method specifically comprises the following steps:
based on the numerical value of the average gray array, giving out a cycle period of an image gray set of the indicator light;
and obtaining the flashing frequency of the indicator light according to the cycle period and the time interval, wherein the calculation formula is as follows:
F = 1 / (C × T)
where F is the flicker frequency, T is the time interval, and C is the cycle period;
and comparing the alarm rules according to the flashing frequency and the color information, and giving a flashing mode of the indicator lamp.
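The frequency computation can be illustrated as below, assuming (as the definitions suggest) that the cycle period C is measured in frames and the time interval T is the spacing between consecutive frames in seconds, so that F = 1 / (C × T).

def blink_frequency(cycle_period_frames, frame_interval_s):
    # Flicker frequency in Hz from the cycle period (frames) and frame interval (seconds).
    return 1.0 / (cycle_period_frames * frame_interval_s)

# Example: a cycle period of 4 frames captured 0.25 s apart gives 1 / (4 * 0.25) = 1 Hz.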
Further, based on the value of the average gray array, a cycle period of the gray set of the image of the indicator light is given, which specifically includes:
carrying out numerical transformation on the average gray array, wherein the transformation formula is as follows:
X(ω) = Σ_{n=0}^{L−1} x_n · e^{−jωn}
where X(ω) is the frequency-domain response of the average gray array, x_n is the n-th value in the average gray array, ω is the digital frequency of the average gray array, j denotes the imaginary unit, and n is the index of a single grayscale image, taking values from 0 to L − 1;
taking the absolute value of the frequency-domain response and calculating, from its frequency characteristic, the frequency at which the response is highest, according to:
ω_max = argmax_ω |X(ω)|
where ω_max is the frequency at which the frequency-domain response is highest, and |X(ω)| is the absolute value of the frequency-domain response;
taking ω_max as the frequency of the average gray array, the cycle period of the average gray array is given by:
C = 2π / ω_max
after judging whether the indicator lamp has a flashing state or not, the frequency of the array is given based on the gray array, the cycle period is obtained through the continuous image identification mode, and the flashing mode of the indicator lamp can be accurately given.
In a second aspect, the present invention also provides an apparatus for implementing the device status detection method for blinking mode indicator light as described above, comprising:
the acquisition unit is used for acquiring images of all equipment to be detected containing indicator light information in a plurality of different preset time frames;
the processing unit is used for fusing the acquired image of the equipment to be detected to form a fused image, detecting a target on the fused image and giving position and color information of the indicator light;
and the identification unit is used for extracting all indicator light image blocks of the image of the equipment to be detected according to the positions of the indicator lights, analyzing the characteristic values of the pixel points based on the color information of the indicator light image blocks, giving out the flashing mode of the indicator lights, and giving out the state of the equipment to be detected based on the corresponding relation between the flashing mode of the indicator lights and the state of the equipment.
The invention provides at least the following beneficial effects:
(1) the images of all the equipment to be detected are fused, and the characteristic values of the covariance matrix of each gray level image are calculated, so that the high-frequency part (such as the position of an indicator light) in the images is reserved, the identification obstacle caused by the difference of pixels at the positions of the indicator light in the images of different time frames is overcome, and the accuracy of target detection is improved.
(2) Through calculation of the gray level array, time information in the collected image and position positioning of image fusion are combined, and a complete, accurate and efficient detection method for the indicating lamp flashing mode according to image analysis is provided.
(3) The integration of different preset time frame images, the position determination of the indicator lamp and the characteristic analysis of the pixel points of the image blocks of the indicator lamp realize the accuracy and the high efficiency of the flash mode identification of the indicator lamp, and further achieve the intelligent detection of the equipment state.
Drawings
FIG. 1 is a flow chart of device status detection for a blinking mode indicator light according to the present invention;
FIG. 2 is a schematic diagram of an image of an apparatus to be inspected according to the present invention;
FIG. 3 is a flow chart of image fusion of all images to be examined according to the present invention;
FIG. 4 is a flow chart of an indicator light blinking pattern analysis provided by the present invention;
fig. 5 is a diagram of a device status detection apparatus for a blinking mode indicator light according to the present invention.
Detailed Description
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
As shown in fig. 1, the present invention provides a device status detection method for a blinking mode indicator light, comprising the steps of:
acquiring a plurality of images of equipment to be detected in different preset time frames, wherein all the images of the equipment to be detected contain indicator light information;
fusing all the acquired images of the equipment to be detected to form a fused image;
carrying out target detection on the fusion image, and giving position and color information of the indicator light;
extracting indicator light image blocks of all the images of the equipment to be detected according to the positions of the indicator lights, analyzing the characteristic values of the pixel points based on the color information of the indicator light image blocks, and giving an indicator light flashing mode;
and giving the state of the equipment to be detected based on the corresponding relation between the indicating lamp flashing mode and the equipment state.
As shown in fig. 2, a reference numeral (i) is an indicator light position for displaying a device state, and an image of a device to be detected needs to contain indicator light information, that is, the obtained image to be detected contains indicator lights.
The determination of the preset time frames needs to consider the signal frequency of the flashing of the indicator light of the device to be detected. Generally, the image acquisition frequency is fixed and is more than twice the frequency of the indicator light signal. At the same time, the total length of the time frames is guaranteed to be greater than twice the period of the indicator light signal. This ensures that the whole flicker process of the indicator light is covered while keeping the flicker frequency identification calculation efficient and convenient.
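As a worked example of these two constraints (capture frequency at least twice the indicator signal frequency, total time-frame length at least twice the signal period), a minimal helper is sketched below; the exact margins chosen here are assumptions.

def sampling_plan(indicator_freq_hz):
    frame_rate_hz = 2.0 * indicator_freq_hz      # minimum acquisition frequency
    duration_s = 2.0 / indicator_freq_hz         # minimum total length of the time frames
    n_frames = int(round(frame_rate_hz * duration_s)) + 1
    return frame_rate_hz, duration_s, n_frames

# Example: a 1 Hz indicator light needs at least a 2 Hz capture rate over at least 2 s, i.e. 5 frames.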
Through the combination of time frame information, image fusion and color change information, the identification precision of the indicator lamp can be improved, and the flashing parameters can be efficiently and quickly given, so that the flashing mode of the indicator lamp is obtained, and the running state of the equipment to be detected is given.
As shown in fig. 3, fusing all the acquired images of the device to be detected to form a fused image, specifically including:
taking the images of the equipment to be detected from a portion of consecutive time frames, converting them one by one into grayscale images, and respectively giving the covariance matrix of the pixels of each grayscale image, wherein the images of the equipment to be detected are RGB three-channel images and the grayscale conversion formula is:
Gray = (formula given as an image in the original publication)
calculating respective eigenvalue and eigenvector according to all given covariance matrixes;
sorting the eigenvectors corresponding to the eigenvalues according to the magnitude of the eigenvalues, and calculating respective principal components to obtain a first principal component;
and based on the first principal component, performing inverse transformation on all the images of the equipment to be detected to obtain a fused image.
When identifying the flashing state of the indicator light, the pixel values at the indicator light differ greatly between the images of the equipment to be detected taken at different preset time frames, so directly performing position and color analysis on all of the images produces large errors; image fusion makes it possible to determine the position of the indicator light accurately.
The images of the equipment to be inspected in partial continuous time frames are selected from all the images of the equipment to be inspected, and the continuous time frames can cover at least one state capable of showing 'bright' when the indicator lamps flicker. Therefore, the high-frequency part in the fused image can be reserved according to the difference and the correlation of the pixel points in the indicator lamp area.
The covariance matrix of the grayscale map is specifically expressed as:
Cov = (formula given as an image in the original publication)
where Cov is the covariance matrix of the grayscale image, c_xy is the pixel value at horizontal position x and vertical position y, M is the number of pixels of the grayscale image in the horizontal direction, and N is the number of pixels of the grayscale image in the vertical direction;
the formula for the calculation of the principal components is as follows:
P = (formula given as an image in the original publication)
where P is the principal component, G_i is the pixel value matrix of the i-th grayscale image, V_i is the eigenvector of the i-th grayscale image, and K is the number of images of the equipment to be detected taken from the portion of consecutive time frames;
based on the first principal component, all the images of the equipment to be detected are inversely transformed, and the calculation formula is as follows:
I_f = (formula given as an image in the original publication)
where I_f is the pixel value after the inverse transformation, P_1 is the first principal component, λ_i is the eigenvalue of the i-th image of the equipment to be detected, and L is the number of all the images of the equipment to be detected.
By adopting the covariance matrix and the mode of sequencing according to the characteristic values, the relevance among the pixel points in the image to be detected can be accurately represented, the information of the relevance and the larger difference among the pixel points under different dimensions is reserved, and the high-frequency parts (such as the positions of the indicator lamps) in all the images to be detected are reflected.
All the obtained images of the equipment to be detected are fused to form a fused image, and the method can also comprise the following steps:
converting all the images of the equipment to be detected into gray level images one by one to form a gray level image set;
based on the maximum gray value rule, fusing every two images of the gray level image set, wherein the calculation formula is as follows:
f(x, y) = max(g_1(x, y), g_2(x, y))
where f(x, y) is the pixel value of the fused image at position (x, y), and g_1(x, y), g_2(x, y) are the gray values at (x, y) of the two grayscale images being fused pairwise.
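This alternative fusion reduces to an element-wise maximum over all grayscale frames, since repeated pairwise maxima give the same result; a minimal sketch:

import numpy as np
import cv2

def max_gray_fuse(frames):
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    fused = grays[0]
    for g in grays[1:]:
        fused = np.maximum(fused, g)   # pairwise maximum-gray-value fusion over the whole set
    return fused                       # the indicator stays bright if it is lit in any frame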
The indicator light detection model used for target detection needs to be pre-trained, and the specific pre-training process includes the following steps:
collecting N training images containing indicator lamps;
marking the type of the indicator light in each training image to form a corresponding label, and generating a training data set in which the training images correspond to the labels one by one;
and pre-training the target detection model by utilizing the training data set to reach a set convergence range, completing the pre-training process and generating an indicator light detection model.
When pre-training the indicator light detection model, the selected training images are not limited to indicator lights in flashing mode; the pre-training process is sufficient to train features such as color information and position information.
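The data-set construction described above can be sketched as follows. The directory layout and the JSON label format (indicator class plus bounding box) are assumptions made for illustration; any object-detection trainer can then consume the resulting image-label pairs until the set convergence range is reached.

import json
from pathlib import Path

def build_training_set(image_dir, label_dir):
    samples = []
    for img_path in sorted(Path(image_dir).glob("*.jpg")):
        label_path = Path(label_dir) / (img_path.stem + ".json")
        with open(label_path) as f:
            label = json.load(f)   # e.g. {"class": "red_indicator", "box": [x1, y1, x2, y2]}
        samples.append({"image": str(img_path), "label": label})
    return samples                 # one-to-one image/label pairs for pre-training the detector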
As shown in fig. 4, extracting the indicator light image blocks of all the images of the equipment to be detected according to the indicator light position, analyzing the pixel point characteristic values based on the color information of the indicator light image blocks, and giving the indicator light flashing mode specifically includes:
the position of the indicator light obtained by target detection is represented as the coordinate position of a target frame, and the coordinate position of the target frame can be represented by the coordinate positions of an upper left point and a lower right point;
based on the coordinate position of the target frame, extracting image blocks corresponding to all the images of the equipment to be detected according to a time frame sequence, acquiring the image blocks of the indicator light, and forming an image block data set of the indicator light;
carrying out gray level transformation on the indicator light image block data set to obtain an indicator light image gray level set;
calculating the average gray of each indicating lamp gray image in the indicating lamp image gray set to form an average gray array of all the image gray images of the equipment to be detected;
calculating the characteristic value of the average gray array, and giving out the flicker parameter of the indicator light in the image to be detected;
extracting time frame information of all equipment images to be detected, and calculating the time interval of continuous equipment images to be detected according to the time frame information;
and giving a flashing mode of the indicator light according to the flashing parameters and the time interval of the indicator light.
Through calculation of a gray array in a gray map, the change parameters of the image block of the indicator light in a preset time frame are given, so that the flicker mode of the indicator light is determined. By adopting a calculation mode of characteristic values in the gray level array and combining time information of the collected images and position positioning of image fusion, the flash mode identification of the indicator lamp can be ensured to be efficient and rapid, and the identification accuracy can be ensured.
Calculating the average gray scale of each indicating lamp gray scale image in the indicating lamp image gray scale set to form an average gray scale array of all the image gray scale images of the equipment to be detected, and specifically comprising the following steps of:
the calculation formula of the average gray scale of each indicating lamp gray scale image is as follows:
Ḡ = (1 / (W × H)) · Σ_{x=1}^{W} Σ_{y=1}^{H} g(x, y)
where W is the width of the target frame, H is the height of the target frame, Ḡ is the average gray of a single grayscale image, and g(x, y) is the pixel value of the single grayscale image at position (x, y);
the average gray array of the grayscale images of all the equipment to be detected is:
A = [Ḡ_1, Ḡ_2, …, Ḡ_L]
where L is the number of all the images of the equipment to be detected.
Calculating the characteristic value of the average gray array and giving the flicker parameter of the indicator light in the images to be detected specifically includes:
giving the variance of the average gray array, wherein the calculation formula is:
σ² = (1 / L) · Σ_{i=1}^{L} (Ḡ_i − μ)²
where σ² is the variance of the average gray array, Ḡ_i is the average gray of the i-th grayscale image, and μ is the mean of the average gray array;
and analyzing the fluctuation of the average gray level array based on the relation between the variance value and the set variance threshold value, and judging the flicker parameter of the indicator light.
Giving an indicator light flashing mode according to the flashing parameters and the time interval of the indicator light, and specifically comprising the following steps:
based on the numerical value of the average gray array, giving out a cycle period of an image gray set of the indicator light;
and obtaining the flashing frequency of the indicator light according to the cycle period and the time interval, wherein the calculation formula is as follows:
F = 1 / (C × T)
where F is the flicker frequency in Hz, T is the time interval, and C is the cycle period;
and comparing the alarm rules according to the flashing frequency and the color information, and giving a flashing mode of the indicator lamp.
Based on the value of the average gray array, a cycle period of an image gray set of the indicator light is given, and the method specifically comprises the following steps:
carrying out numerical transformation on the average gray array, wherein the transformation formula is as follows:
X(ω) = Σ_{n=0}^{L−1} x_n · e^{−jωn}
where X(ω) is the frequency-domain response of the average gray array, x_n is the n-th value in the average gray array, ω is the digital frequency of the average gray array, j denotes the imaginary unit, and n is the index of a single grayscale image, taking values from 0 to L − 1;
taking the absolute value of the frequency-domain response and calculating, from its frequency characteristic, the frequency at which the response is highest, according to:
ω_max = argmax_ω |X(ω)|
where ω_max is the frequency at which the frequency-domain response is highest, and |X(ω)| is the absolute value of the frequency-domain response;
taking ω_max as the frequency of the average gray array, the cycle period of the average gray array is given by:
C = 2π / ω_max
after judging whether the indicator lamp has a flashing state or not, the frequency of the array is given based on the gray array, the cycle period is obtained through the continuous image identification mode, and the flashing mode of the indicator lamp can be accurately given.
The given flashing mode comprises information of two aspects of color and flashing, and the running state of the equipment at the moment can be indicated through accurate identification of the flashing mode.
For example, the alarm rule for the operation of the device may be set as: yellow flashing (1 Hz): the system gives a serious error alarm; red flashing (1 Hz): the system gives an emergency error alarm. By comparing the detected color and frequency, that is, the flicker mode, against the alarm rule, an alarm signal is output.
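A minimal sketch of this comparison, using the example rule table above (the tolerance on the frequency match is an assumed parameter):

ALARM_RULES = {
    ("yellow", 1.0): "serious error alarm",
    ("red", 1.0): "emergency error alarm",
}

def check_alarm(color, blink_freq_hz, tolerance_hz=0.2):
    # Compare the detected flicker mode (color + frequency) against the alarm rules.
    for (rule_color, rule_freq), alarm in ALARM_RULES.items():
        if color == rule_color and abs(blink_freq_hz - rule_freq) <= tolerance_hz:
            return alarm           # output the matching alarm signal
    return None                    # no rule matched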
As shown in fig. 5, the present invention also provides an apparatus for device status detection of a blinking mode indicator light, comprising:
the acquisition unit is used for acquiring images of all equipment to be detected containing indicator light information in a plurality of different preset time frames;
the processing unit is used for fusing the acquired image of the equipment to be detected to form a fused image, detecting a target on the fused image and giving position and color information of the indicator light;
and the identification unit is used for extracting all indicator light image blocks of the image of the equipment to be detected according to the positions of the indicator lights, analyzing the characteristic values of the pixel points based on the color information of the indicator light image blocks, giving out the flashing mode of the indicator lights, and giving out the state of the equipment to be detected based on the corresponding relation between the flashing mode of the indicator lights and the state of the equipment.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention. It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A device status detection method for a blinking mode indicator light, comprising the steps of:
acquiring a plurality of images of equipment to be detected in different preset time frames, wherein all the images of the equipment to be detected contain indicator light information;
fusing all the acquired images of the equipment to be detected to form a fused image;
carrying out target detection on the fusion image, and giving position and color information of the indicator light;
extracting indicator light image blocks of all the images of the equipment to be detected according to the positions of the indicator lights, analyzing the characteristic values of the pixel points based on the color information of the indicator light image blocks, and giving an indicator light flashing mode;
and giving the state of the equipment to be detected based on the corresponding relation between the indicating lamp flashing mode and the equipment state.
2. The apparatus status detecting method according to claim 1, wherein fusing all the acquired images of the apparatus to be detected to form a fused image, specifically comprises:
taking partial images of the equipment to be detected in continuous time frames, converting the partial images into gray-scale images one by one, and respectively giving covariance matrixes of pixels of the gray-scale images;
calculating respective eigenvalue and eigenvector according to all given covariance matrixes;
sorting the eigenvectors corresponding to the eigenvalues according to the magnitude of the eigenvalues, and calculating respective principal components to obtain a first principal component;
and based on the first principal component, performing inverse transformation on all the images of the equipment to be detected to obtain a fused image.
3. The apparatus state detection method according to claim 2, wherein the covariance matrix of the gray-scale map is specifically expressed as:
Cov = (formula given as an image in the original publication)
where Cov is the covariance matrix of the grayscale image, c_xy is the pixel value at horizontal position x and vertical position y, M is the number of pixels of the grayscale image in the horizontal direction, and N is the number of pixels of the grayscale image in the vertical direction;
the formula for the calculation of the principal components is as follows:
P = (formula given as an image in the original publication)
where P is the principal component, G_i is the pixel value matrix of the i-th grayscale image, V_i is the eigenvector of the i-th grayscale image, and K is the number of images of the equipment to be detected taken from the portion of consecutive time frames;
based on the first principal component, all the images of the equipment to be detected are inversely transformed, and the calculation formula is as follows:
I_f = (formula given as an image in the original publication)
where I_f is the pixel value after the inverse transformation, P_1 is the first principal component, λ_i is the eigenvalue of the i-th image of the equipment to be detected, and L is the number of all the images of the equipment to be detected.
4. The device status detection method according to claim 1, wherein the target detection is a pre-trained indicator light detection model, and the pre-training process of the indicator light detection model specifically includes:
collecting N training images containing indicator lamps;
marking the type of the indicator light in each training image to form a corresponding label, and generating a training data set in which the training images correspond to the labels one by one;
and pre-training the target detection model by utilizing the training data set to reach a set convergence range, completing the pre-training process and generating an indicator light detection model.
5. The device status detection method according to claim 1, wherein extracting the indicator light image blocks of all the images of the equipment to be detected according to the indicator light positions, analyzing the pixel point characteristic values based on the color information of the indicator light image blocks, and giving the indicator light flashing mode specifically comprises the steps of:
representing the position of the indicator light obtained by target detection as the coordinate position of the target frame;
based on the coordinate position of the target frame, extracting image blocks corresponding to all the images of the equipment to be detected according to a time frame sequence, acquiring the image blocks of the indicator light, and forming an image block data set of the indicator light;
carrying out gray level transformation on the indicator light image block data set to obtain an indicator light image gray level set;
calculating the average gray of each indicating lamp gray image in the indicating lamp image gray set to form an average gray array of all the image gray images of the equipment to be detected;
calculating the characteristic value of the average gray array, and giving out the flicker parameter of the indicator light in the image to be detected;
extracting time frame information of all equipment images to be detected, and calculating the time interval of continuous equipment images to be detected according to the time frame information;
and giving a flashing mode of the indicator light according to the flashing parameters and the time interval of the indicator light.
6. The apparatus status detecting method according to claim 5, wherein the calculating of the average gray scale of each indicating lamp gray scale image in the indicating lamp image gray scale set to form the average gray scale array of all the detected apparatus image gray scale images specifically comprises:
the calculation formula of the average gray scale of each indicating lamp gray scale image is as follows:
Ḡ = (1 / (W × H)) · Σ_{x=1}^{W} Σ_{y=1}^{H} g(x, y)
where W is the width of the target frame, H is the height of the target frame, Ḡ is the average gray of a single grayscale image, and g(x, y) is the pixel value of the single grayscale image at position (x, y);
the average gray array of the grayscale images of all the equipment to be detected is:
A = [Ḡ_1, Ḡ_2, …, Ḡ_L]
where L is the number of all the images of the equipment to be detected.
7. The apparatus status detecting method according to claim 5, wherein calculating the feature value of the average gray array and providing the flicker parameter of the indicator light in the image to be detected specifically comprises:
and giving the variance value of the average gray level array, wherein the calculation formula is as follows:
σ² = (1 / L) · Σ_{i=1}^{L} (Ḡ_i − μ)²
where σ² is the variance of the average gray array, Ḡ_i is the average gray of the i-th grayscale image, and μ is the mean of the average gray array;
and analyzing the fluctuation of the average gray level array based on the relation between the variance value and the set variance threshold value, and judging the flicker parameter of the indicator light.
8. The device status detection method according to claim 7, wherein the giving of the blinking pattern of the indicator light according to the blinking parameter and the time interval of the indicator light specifically comprises:
based on the numerical value of the average gray array, giving out a cycle period of an image gray set of the indicator light;
and obtaining the flashing frequency of the indicator light according to the cycle period and the time interval, wherein the calculation formula is as follows:
F = 1 / (C × T)
where F is the flicker frequency, T is the time interval, and C is the cycle period;
and comparing the alarm rules according to the flashing frequency and the color information, and giving a flashing mode of the indicator lamp.
9. The apparatus state detection method according to claim 8, wherein the step of giving a cycle period of the gray set of the image of the indicator light based on the value of the average gray array specifically comprises:
carrying out numerical transformation on the average gray array, wherein the transformation formula is as follows:
X(ω) = Σ_{n=0}^{L−1} x_n · e^{−jωn}
where X(ω) is the frequency-domain response of the average gray array, x_n is the n-th value in the average gray array, ω is the digital frequency of the average gray array, j denotes the imaginary unit, and n is the index of a single grayscale image, taking values from 0 to L − 1;
taking the absolute value of the frequency-domain response and calculating, from its frequency characteristic, the frequency at which the response is highest, according to:
ω_max = argmax_ω |X(ω)|
where ω_max is the frequency at which the frequency-domain response is highest, and |X(ω)| is the absolute value of the frequency-domain response;
taking ω_max as the frequency of the average gray array, the cycle period of the average gray array is given by:
C = 2π / ω_max.
10. an apparatus for implementing the method for detecting the status of the device according to any one of claims 1 to 9, comprising:
the acquisition unit is used for acquiring images of all equipment to be detected containing indicator light information in a plurality of different preset time frames;
the processing unit is used for fusing the acquired image of the equipment to be detected to form a fused image, detecting a target on the fused image and giving position and color information of the indicator light;
and the identification unit is used for extracting all indicator light image blocks of the image of the equipment to be detected according to the positions of the indicator lights, analyzing the characteristic values of the pixel points based on the color information of the indicator light image blocks, giving out the flashing mode of the indicator lights, and giving out the state of the equipment to be detected based on the corresponding relation between the flashing mode of the indicator lights and the state of the equipment.
CN202210745792.XA 2022-06-29 2022-06-29 Equipment state detection method and device for flashing mode indicator light Active CN114820616B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210745792.XA CN114820616B (en) 2022-06-29 2022-06-29 Equipment state detection method and device for flashing mode indicator light

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210745792.XA CN114820616B (en) 2022-06-29 2022-06-29 Equipment state detection method and device for flashing mode indicator light

Publications (2)

Publication Number Publication Date
CN114820616A true CN114820616A (en) 2022-07-29
CN114820616B CN114820616B (en) 2023-01-10

Family

ID=82522622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210745792.XA Active CN114820616B (en) 2022-06-29 2022-06-29 Equipment state detection method and device for flashing mode indicator light

Country Status (1)

Country Link
CN (1) CN114820616B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107832770A (en) * 2017-11-08 2018-03-23 浙江国自机器人技术有限公司 A kind of equipment routing inspection method, apparatus, system, storage medium and crusing robot
WO2021018144A1 (en) * 2019-07-31 2021-02-04 浙江商汤科技开发有限公司 Indication lamp detection method, apparatus and device, and computer-readable storage medium
CN112395928A (en) * 2019-08-19 2021-02-23 珠海格力电器股份有限公司 Method for automatically detecting equipment state operation
CN111626139A (en) * 2020-04-30 2020-09-04 上海允登信息科技有限公司 Accurate detection method for fault information of IT equipment in machine room

Also Published As

Publication number Publication date
CN114820616B (en) 2023-01-10

Similar Documents

Publication Publication Date Title
CN101430195B (en) Method for computing electric power line ice-covering thickness by using video image processing technology
CN103761529A (en) Open fire detection method and system based on multicolor models and rectangular features
CN111047655A (en) High-definition camera cloth defect detection method based on convolutional neural network
CN113592828B (en) Nondestructive testing method and system based on industrial endoscope
CN108921099A (en) Moving ship object detection method in a kind of navigation channel based on deep learning
WO2018010386A1 (en) Method and system for component inversion testing
CN109409289A (en) A kind of electric operating safety supervision robot security job identifying method and system
CN114494185B (en) Electrical equipment fault detection method based on RGB-T multi-scale feature fusion
CN114463296B (en) Light-weight part defect detection method based on single sample learning
CN106872488A (en) A kind of double surface defect visible detection methods of rapid large-area transparent substrate and device
CN111199194A (en) Automobile intelligent cabin instrument testing method based on machine vision and deep learning
CN114581760B (en) Equipment fault detection method and system for machine room inspection
CN114441452B (en) Optical fiber pigtail detection method
CN114820616B (en) Equipment state detection method and device for flashing mode indicator light
CN114820597B (en) Smelting product defect detection method, device and system based on artificial intelligence
CN117677969A (en) Defect detection method and device
CN112241691B (en) Channel ice condition intelligent identification method based on unmanned aerial vehicle inspection and image characteristics
CN116052041A (en) Indicating lamp state identification method based on depth network
CN109635684A (en) A kind of food traceability system
CN112070164A (en) Dry and wet sludge classification method and device
CN106568782A (en) Machine vision-based method for conducting color quantization of colored bottle cap image
CN112508946A (en) Cable tunnel abnormity detection method based on antagonistic neural network
CN116563770B (en) Method, device, equipment and medium for detecting vehicle color
CN113688748B (en) Fire detection model and method
CN112364795B (en) Automatic identification method for number lamps and situation awareness method for two ships meeting

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant