CN111639647B - Indicator light state identification method and device, computer equipment and storage medium - Google Patents


Publication number
CN111639647B
CN111639647B
Authority
CN
China
Prior art keywords
image
identified
indicator lamp
information
color
Prior art date
Legal status
Active
Application number
CN202010441220.3A
Other languages
Chinese (zh)
Other versions
CN111639647A
Inventor
肖娟
王秋阳
吴亦歌
郑博超
李德民
Current Assignee
Shenzhen Sunwin Intelligent Co Ltd
Original Assignee
Shenzhen Sunwin Intelligent Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Sunwin Intelligent Co Ltd
Priority to CN202010441220.3A
Publication of CN111639647A
Application granted
Publication of CN111639647B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 — Arrangements for image or video recognition or understanding
    • G06V10/20 — Image preprocessing
    • G06V10/255 — Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 — Pattern recognition
    • G06F18/20 — Analysing
    • G06F18/22 — Matching criteria, e.g. proximity measures
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 — Arrangements for image or video recognition or understanding
    • G06V10/20 — Image preprocessing
    • G06V10/26 — Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 — Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 — Arrangements for image or video recognition or understanding
    • G06V10/40 — Extraction of image or video features
    • G06V10/56 — Extraction of image or video features relating to colour
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 — Arrangements for image or video recognition or understanding
    • G06V10/70 — Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 — Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 — Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 — Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 — Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 — Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The invention relates to an indicator light state identification method and device, a computer device, and a storage medium. The method comprises: placing a mark beside the indicator lamp and making a template image to obtain information about the region of interest corresponding to the indicator lamp; acquiring an image to be identified; performing feature matching between the image to be identified and the template image, and acquiring target position information from the region-of-interest information corresponding to the indicator lamp; cropping the image to be identified according to the target position information and extracting color features to obtain color feature information; generating the indication state of the indicator lamp from the color feature information; and feeding the indication state of the indicator lamp back to the terminal for display. The invention enhances the stability of correcting the current image to be identified, marks rectangular frames in different modes for different LED indicator lamps to reduce the workload of manual marking, has low environmental requirements, and can be applied in various environments with universality.

Description

Indicator light state identification method and device, computer equipment and storage medium
Technical Field
The present invention relates to a method for identifying an indicator light, and more particularly, to a method, an apparatus, a computer device, and a storage medium for identifying the status of an indicator light.
Background
With the continuous development of the national economy, modern society depends more and more on electric power, and power system faults often cause huge losses to the national economy. The transformer substation is a core facility of the power system; keeping each instrument within standard is of great importance and provides the main basis for judging the running state of the transformer equipment. An intelligent patrol robot can patrol a data center continuously for 24 hours, collect environmental data, read abnormal conditions of main equipment in real time and raise alarms automatically, greatly improving the reliability and standardization of inspection. Using intelligent robots to assist manual inspection reduces labor intensity, improves operation efficiency and lowers operation and maintenance costs.
With the popularization of automatic substation inspection robots in recent years, research on automatic reading and identification of substation pointer meters has achieved some results, but several problems still slow down the popularization and application of the robots. At present, the difficulty for LED indicator light state identification methods applied to substation inspection robots is that unlit LED indicator lamps with small targets are detected unstably and easily lost, so the state of the indicator lamp cannot be accurately estimated.
The prior art mentions an image segmentation method based on a fuzziness threshold, an image recognition method based on color intensity, and an image threshold segmentation method based on gray-level processing for indicator light state identification. These three methods are effective when the LED indicator lamp is clearly visible against a simple background under normal illumination, but unlit lamps are difficult to detect in complex environments with large illumination changes and blur. There is also an indicator light detection and recognition method based on color attributes, in which color features are efficiently extracted by mapping the input image from the original RGB color space to an 11-dimensional color attribute space. This method solves the missed-detection problem caused by image distortion due to illumination, viewing angle and other factors, but does not solve the problem of target size. Of course, a method of manually marking the LED indicator lamps can also be adopted: the position information of the LED indicator lamps is known in advance, the detection step is not needed, and the algorithm is simplified; but in an actual scene there are many LED indicator lamps, and marking them one by one becomes laborious. LED indicator lamps can also be detected by deep learning methods, which are more robust than traditional methods, but speed is a major problem: real-time performance is difficult to reach on low hardware configurations, which affects the judgment of the flickering state of the LED indicator lamp.
Therefore, it is necessary to design a new method that enhances the stability of correcting the current image to be identified, reduces the workload of manual marking, has low environmental requirements, and has universality.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide an indicator lamp state identification method, an indicator lamp state identification device, computer equipment and a storage medium.
In order to achieve the above purpose, the present invention adopts the following technical scheme: the indicator light state identification method comprises the following steps:
marking a mark beside the indicator lamp, and making a template image to obtain relevant information of the region of interest corresponding to the indicator lamp;
acquiring an image to be identified;
performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp;
cutting the image to be identified and extracting color features according to the target position information to obtain color feature information;
generating an indication state of an indication lamp according to the color characteristic information;
and feeding back the indication state of the indication lamp to the terminal so as to display the indication state at the terminal.
The further technical scheme is as follows: marking a mark beside the indicator lamp, and making a template image to obtain relevant information of the region of interest corresponding to the indicator lamp, comprises the following steps:
marking a mark beside the indicator light;
selecting a clear field image to obtain a template image, and marking a region of interest corresponding to the indicator lamp;
and storing the corresponding template image and the relevant information of the region of interest corresponding to the indicator lamp according to the inspection point.
The further technical scheme is as follows: the feature matching of the image to be identified and the template image, and the acquisition of the target position information according to the relevant information of the region of interest corresponding to the indicator lamp, comprises the following steps:
extracting characteristic points of the image to be identified and the template image to obtain the characteristic points of the image to be identified and the characteristic points of the template image;
matching the feature points of the image to be identified and the feature points of the template image to obtain feature matching pairs;
obtaining an affine matrix of the feature matching pair;
and obtaining the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the relevant information of the region of interest corresponding to the indicator lamp, so as to obtain the target position information.
The further technical scheme is as follows: the matching of the feature points of the image to be identified and the feature points of the template image to obtain a feature matching pair comprises the following steps:
calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set;
and screening the feature points of the image to be identified and the feature points of the template image corresponding to the similarity which is not lower than the set threshold in the similarity set to form feature matching pairs.
The further technical scheme is as follows: the step of clipping the image to be identified and extracting color features according to the target position information to obtain color feature information comprises the following steps:
cutting the image to be identified according to the target position information to obtain an intermediate image;
converting the intermediate image from RGB image space to YIQ image space to obtain a target image;
and acquiring a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color characteristic information.
The further technical scheme is as follows: the generating the indication state of the indicator lamp according to the color characteristic information comprises the following steps:
judging whether the color characteristic information is larger than a color threshold value or not;
if the color characteristic information is larger than a color threshold value, the indication state of the indicator lamp is a lighting state;
and if the color characteristic information is not greater than the color threshold value, the indication state of the indicator lamp is an extinction state.
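The threshold decision in the steps above can be sketched as a minimal function; the function name and the example values are illustrative, not taken from the patent:

```python
def indicator_state(color_feature: float, color_threshold: float) -> str:
    """Decide the indication state from the color feature information
    (fraction of lit pixels) and a preset color threshold."""
    # Feature above the threshold -> lighting state; otherwise extinction state.
    return "lighting" if color_feature > color_threshold else "extinction"
```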
The invention also provides an indicator lamp state recognition device, which is characterized by comprising:
the preprocessing unit is used for marking a mark beside the indicator lamp and making a template image so as to obtain relevant information of the region of interest corresponding to the indicator lamp;
the image acquisition unit is used for acquiring an image to be identified;
the matching unit is used for carrying out feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp;
the color information acquisition unit is used for cutting the image to be identified and extracting color characteristics according to the target position information so as to obtain color characteristic information;
a state generating unit for generating an indication state of the indicator lamp according to the color characteristic information;
and the state feedback unit is used for feeding back the indication state of the indication lamp to the terminal so as to display the indication state at the terminal.
The further technical scheme is as follows: the preprocessing unit includes:
a marking subunit for marking a mark beside the indicator light;
the template generation subunit is used for selecting a clear field image to obtain a template image and marking out a region of interest corresponding to the indicator lamp;
and the storage subunit is used for storing the corresponding template image and the relevant information of the region of interest corresponding to the indicator lamp according to the inspection point.
The invention also provides a computer device which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the method when executing the computer program.
The present invention also provides a storage medium storing a computer program which, when executed by a processor, performs the above-described method.
Compared with the prior art, the invention has the following beneficial effects: by arranging a mark beside the indicator lamp, the stability of correcting the current image to be identified is enhanced; by preparing a template image with the region of interest corresponding to the indicator lamp and marking rectangular frames in different modes for different LED indicator lamps, the workload of manual marking is reduced; after the image to be identified is obtained, feature point matching is performed against the template image, and the state of the indicator lamp is judged after color feature extraction. The method has low environmental requirements, can be applied in various environments and to LED indicator lamps of various shapes and sizes, and thus has universality.
The invention is further described below with reference to the drawings and specific embodiments.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an application scenario of an indicator light state recognition method according to an embodiment of the present invention;
fig. 2 is a flow chart of an indicator light status recognition method according to an embodiment of the present invention;
fig. 3 is a schematic sub-flowchart of an indicator light status recognition method according to an embodiment of the present invention;
fig. 4 is a schematic sub-flowchart of an indicator light status recognition method according to an embodiment of the present invention;
fig. 5 is a schematic sub-flowchart of an indicator light status recognition method according to an embodiment of the present invention;
fig. 6 is a schematic sub-flowchart of an indicator light status recognition method according to an embodiment of the present invention;
fig. 7 is a schematic sub-flowchart of an indicator light status recognition method according to an embodiment of the present invention;
FIG. 8 is a schematic block diagram of an indicator light status recognition device according to an embodiment of the present invention;
fig. 9 is a schematic block diagram of a preprocessing unit of the indicator light state recognition device provided by the embodiment of the invention;
fig. 10 is a schematic block diagram of a matching unit of the indicator light state recognition device according to the embodiment of the present invention;
FIG. 11 is a schematic block diagram of a matching pair generation subunit of the indicator light status recognition device provided by an embodiment of the present invention;
fig. 12 is a schematic block diagram of a color information acquisition unit of the indicator light state recognition device according to the embodiment of the present invention;
fig. 13 is a schematic block diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic diagram of an application scenario of an indicator light status recognition method according to an embodiment of the present invention. Fig. 2 is a schematic flowchart of an indicator light status recognition method according to an embodiment of the present invention. The indicator light state identification method is applied to the server. The server and the terminal are in data interaction, wherein the server can be placed on the inspection robot, a camera is arranged on the inspection robot, the shot image is transmitted to the server for extraction and analysis, the analyzed state is displayed on the terminal, and the terminal can be a display screen and is integrated on the inspection robot.
Fig. 2 is a flowchart of an indicator light status recognition method according to an embodiment of the present invention. As shown in fig. 2, the method includes the following steps S110 to S160.
S110, marking a mark beside the indicator lamp, and making a template image to obtain relevant information of the region of interest corresponding to the indicator lamp.
In this embodiment, the template image refers to a field image that can serve as a reference, and the information about the region of interest corresponding to the indicator light refers to information such as the coordinates of the position of the indicator light. Arranging a mark with obvious features beside the LED indicator lamp enhances the stability of correcting the current image to be identified, and marking different LED indicator lamps with rectangular frames in different modes reduces the workload of manual marking.
In one embodiment, referring to fig. 3, the step S110 may include steps S111 to S113.
S111, marking a mark beside the indicator lamp.
In this embodiment, a mark with obvious key point features is attached to the side of the on-site LED indicator lamp.
A mark with obvious key-point features refers to a mark with distinct corner features, such as a barcode or a two-dimensional code, or a specially made image. When the current image to be identified has a horizontal offset or an angular rotation relative to the template image, the two images need to be aligned, so the feature points of the marks in the image to be identified and in the template image are extracted respectively.
S112, selecting a clear field image to obtain a template image, and marking a region of interest corresponding to the indicator lamp.
Specifically, a clear live image is selected as the template image and labeled, that is, an ROI (region of interest) is drawn for each LED indicator lamp.
And S113, storing the corresponding template image and the relevant information of the region of interest corresponding to the indicator lamp according to the inspection points.
The region of interest corresponding to the indicator lamps means a rectangular frame drawn for each LED indicator lamp. If there are few LED indicator lamps, a rectangular frame can be marked for each one individually. If there are many LED indicator lamps distributed in a matrix, the rectangular frames can be marked as a table: the number of rows and columns of the table is given according to the arrangement matrix of the LED indicator lamps in the image, and a rectangular frame is then dragged down from the upper-left corner point of the LED indicator lamps on the image, generating a table with the same dimensions as the LED arrangement matrix, where each cell of the table gives the coordinate position of one LED indicator lamp.
The template image and the information about the region of interest corresponding to the indicator light are stored under the name of the inspection point. For example, the template image of inspection point 1 is named point1.jpg, and each inspection point corresponds to an information file; the labeling information of inspection point 1, for instance, is stored in point1.txt, and the total number of stored entries is recorded as n. The coordinate information of each rectangular frame is (i, j), representing the LED indicator lamp in the ith row and jth column.
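The storage scheme above can be sketched as follows. The file layout (one rectangle per line with its (i, j) indices) is an assumption for illustration, since the patent does not fix an exact format for point1.txt:

```python
# Hypothetical serialization of the ROI annotations for inspection point 1.
# Each entry stores the row index i, column index j, and the rectangle
# (x, y, w, h) in template-image coordinates.
rois = [
    (1, 1, (40, 60, 18, 18)),   # 1st row, 1st LED indicator lamp
    (1, 2, (70, 60, 18, 18)),   # 1st row, 2nd LED indicator lamp
]

# Serialize: one line per rectangle -> contents of point1.txt; n = len(rois).
lines = [f"{i} {j} {x} {y} {w} {h}" for i, j, (x, y, w, h) in rois]
text = "\n".join(lines)

# Parse the file contents back into the same structure.
parsed = []
for line in text.splitlines():
    i, j, x, y, w, h = map(int, line.split())
    parsed.append((i, j, (x, y, w, h)))
```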
The method has low environmental requirements, can be applied in various environments and to LED indicator lamps of various shapes and sizes, and thus has universality.
S120, acquiring an image to be identified.
In this embodiment, the image to be identified refers to a current live photo taken by the inspection robot.
And S130, performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp.
In this embodiment, the target position information refers to coordinate information of the current indicator light in the image to be identified.
In one embodiment, referring to fig. 4, the step S130 may include steps S131 to S134.
S131, extracting feature points of the image to be identified and the template image to obtain feature points of the image to be identified and feature points of the template image.
In this embodiment, the feature point refers to an indicator light within the image.
Specifically, the feature points of the template image and of the current image to be identified are acquired using the ORB (Oriented FAST and Rotated BRIEF) feature extraction method, an algorithm for fast feature point extraction and description.
And S132, matching the feature points of the image to be identified and the feature points of the template image to obtain a feature matching pair.
In this embodiment, the feature matching pair refers to a feature point combination in which the similarity between two feature points in the image to be identified and the template image satisfies the requirement.
Matching is performed on feature descriptors, which are usually vectors; the distance between two feature descriptors reflects the degree of similarity between the two feature points. Different distance measures may be chosen depending on the descriptor type: for floating-point descriptors, the Euclidean distance may be used; for binary descriptors, the Hamming distance may be used, where the Hamming distance between two binary strings is the number of bit positions in which they differ.
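For binary descriptors such as ORB's BRIEF bit strings, the Hamming distance can be computed by XOR-ing the bytes and counting set bits; a minimal sketch:

```python
def hamming_distance(d1: bytes, d2: bytes) -> int:
    """Number of differing bits between two equal-length binary descriptors."""
    assert len(d1) == len(d2)
    # XOR leaves a 1 bit exactly where the descriptors disagree.
    return sum(bin(a ^ b).count("1") for a, b in zip(d1, d2))
```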
Given a method of computing descriptor similarity, feature point matching is the problem of finding, in a set of feature points, the point most similar to a given feature point. Matching can be carried out by the following methods:
brute-force matching: the distances between a given feature point's descriptor and all other feature point descriptors are computed, the distances are sorted, and the closest one is taken as the matching point;
filtering of wrong matches: the criterion is that the Hamming distance of a matched pair should be less than twice the minimum distance; pairs whose distance exceeds this value are considered wrong matches and filtered out, while pairs below it are considered correct matches;
cross-matching (cross-filtering): simply put, the matching is performed again in the reverse direction using the already matched points; if a point's match still maps back to the original point, the match is considered correct. For example, if feature point A is matched by brute force to feature point B, then feature point B is matched in turn; if its match is still feature point A, the match is considered correct, otherwise incorrect;
KNN matching (k-nearest-neighbor matching): the K points most similar to the feature point are selected during matching, and if these K points are sufficiently distinct from each other, the most similar one is selected as the matching point; typically K = 2, i.e. for each match the two nearest neighbors are returned. If the ratio of the second-nearest to the nearest match distance is large enough, i.e. the vector distances are far enough apart, the match is considered correct; the ratio threshold is typically around 2;
random sample consensus: a homography matrix between the two images is computed from the matching points, and the reprojection error is then used to judge whether a given match is correct.
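The KNN ratio test described above can be sketched as follows (pure Python with Euclidean distance for floating-point descriptors; a real pipeline would use a library matcher). Following the patent's phrasing, a match is kept when the second-nearest distance is at least about twice the nearest:

```python
def knn_ratio_match(query_descs, train_descs, ratio=2.0):
    """For each query descriptor, find its two nearest train descriptors
    and keep the match only if second_dist / first_dist >= ratio."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    matches = []  # (query_index, train_index) pairs that pass the test
    for qi, q in enumerate(query_descs):
        # Rank all train descriptors by distance to this query descriptor.
        ranked = sorted(range(len(train_descs)),
                        key=lambda ti: dist(q, train_descs[ti]))
        first, second = ranked[0], ranked[1]
        d1 = dist(q, train_descs[first])
        d2 = dist(q, train_descs[second])
        # Unambiguous nearest neighbor -> accept the match.
        if d1 == 0 or d2 / d1 >= ratio:
            matches.append((qi, first))
    return matches
```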
In one embodiment, referring to fig. 5, the step S132 may include steps S1321 to S1322.
S1321, calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set.
In this embodiment, the similarity set refers to a set of the similarity of all feature points of the image to be identified and all feature points of the template image.
S1322, screening feature points of the image to be identified and feature points of the template image, which correspond to the similarity which is not lower than the set threshold value in the similarity set, so as to form feature matching pairs.
Feature point matching is carried out by the knnMatch (k-nearest-neighbor match) algorithm to obtain feature point matching pairs. In practical applications there are often wrong matching pairs, which would introduce large errors if carried into the final motion model, so the RANSAC (RANdom SAmple Consensus) algorithm is used to eliminate them.
S133, obtaining an affine matrix of the feature matching pair.
In this embodiment, the affine matrix refers to the matrix of an affine transformation; an affine transformation, also called an affine mapping, is a linear transformation of one vector space followed by a translation into another vector space.
Specifically, the affine matrix is solved from the retained matching pairs according to the perspective-transformation principle.
S134, obtaining the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the relevant information of the region of interest corresponding to the indicator lamp, so as to obtain the target position information.
The position information of the indicator lamp in the image to be identified can be obtained from the affine matrix and the relevant information of the region of interest corresponding to the indicator lamp: the affine matrix is used to locate the region of interest of the indicator lamp in the image to be identified, and the target position information is then obtained from the relevant information of that region of interest.
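Mapping the region of interest through the affine matrix can be sketched as transforming the ROI's corners and taking their bounding box. This is a minimal NumPy sketch; the `map_roi` name and the (x, y, w, h) ROI convention are assumptions for illustration:

```python
import numpy as np

def map_roi(M, roi):
    """Map a template-image ROI (x, y, w, h) into the image to be
    identified using a 2x3 affine matrix M, returning the axis-aligned
    bounding box of the four transformed corners."""
    x, y, w, h = roi
    corners = np.array([[x, y, 1], [x + w, y, 1],
                        [x, y + h, 1], [x + w, y + h, 1]], float)
    mapped = corners @ np.asarray(M, float).T   # N x 2 transformed corners
    x0, y0 = mapped.min(axis=0)
    x1, y1 = mapped.max(axis=0)
    return (float(x0), float(y0), float(x1 - x0), float(y1 - y0))

# A translation-only affine matrix shifts the ROI without resizing it.
M = [[1, 0, 5], [0, 1, -2]]
print(map_roi(M, (10, 10, 4, 4)))  # → (15.0, 8.0, 4.0, 4.0)
```

The resulting box is the target position information used to crop the image to be identified in the following step.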
And S140, cutting the image to be identified and extracting color features according to the target position information to obtain color feature information.
In this embodiment, the color feature information refers to the percentage of the indicator-lamp color within the cropped image to be identified.
In one embodiment, referring to fig. 6, the step S140 may include steps S141 to S143.
S141, clipping the image to be identified according to the target position information to obtain an intermediate image.
In the present embodiment, the intermediate image refers to an image including only the indicator lamp.
S142, converting the intermediate image from the RGB image space into the YIQ image space to obtain a target image.
In the present embodiment, the target image refers to an intermediate image converted into YIQ image space.
S143, acquiring a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color characteristic information.
Color feature extraction converts the RGB image space into the YIQ image space and obtains a binary image by thresholding the I component of the YIQ space: pixels whose I value is greater than the threshold are set to 1, and those not greater are set to 0. In this embodiment the threshold is preset to 230.
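The RGB-to-YIQ conversion and I-component thresholding can be sketched as below. The conversion matrix is the standard NTSC one; rescaling I to the 0..255 range is an assumption added here (raw I of a 0..255 RGB image lies roughly in ±152, so a threshold of 230 presumes some such scaling), and the function and variable names are illustrative:

```python
import numpy as np

# Standard NTSC RGB -> YIQ matrix; the I (orange-cyan) component responds
# strongly to the red glow of a lit LED indicator.
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def i_channel_binary(rgb, thresh=230):
    """Convert an HxWx3 RGB image (0..255) to YIQ and binarize the I
    component: 1 where I > thresh, else 0.  Rescaling I to 0..255 is an
    assumption, since the threshold of 230 presumes that range."""
    yiq = rgb.astype(float) @ RGB2YIQ.T
    i = yiq[..., 1]
    i = 255.0 * (i - i.min()) / (np.ptp(i) + 1e-9)  # rescale to 0..255
    return (i > thresh).astype(np.uint8)

# 2x2 toy crop: one saturated red pixel (lit LED), three dark pixels.
img = np.array([[[255, 0, 0], [10, 10, 10]],
                [[12, 12, 12], [8, 8, 8]]], dtype=np.uint8)
print(i_channel_binary(img))  # → [[1 0]
                              #    [0 0]]
```

Only the saturated red pixel exceeds the threshold, so the binary image isolates the lit portion of the lamp.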
S150, generating an indication state of the indicator lamp according to the color characteristic information.
In one embodiment, referring to fig. 7, the step S150 may include steps S151 to S153.
S151, judging whether the color characteristic information is larger than a color threshold value or not;
S152, if the color characteristic information is larger than a color threshold value, the indication state of the indicator lamp is a lighting state;
and S153, if the color characteristic information is not greater than a color threshold value, the indication state of the indicator lamp is an off state.
The state of the LED indicator lamp is judged from the color feature information, namely the percentage of 1-elements of the binary image within the whole target image region: if this percentage is greater than the set color threshold, the LED indicator lamp is on; otherwise it is off. In this embodiment the color threshold is 0.01.
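The final decision reduces to comparing the fraction of 1-pixels against the color threshold. A minimal sketch, with the threshold of 0.01 taken from the text and the `lamp_state` name assumed for illustration:

```python
import numpy as np

def lamp_state(binary, color_threshold=0.01):
    """Decide the LED state from the binary image: 'on' when the fraction
    of 1-pixels exceeds the color threshold (0.01 in this embodiment)."""
    fraction = np.count_nonzero(binary) / binary.size
    return "on" if fraction > color_threshold else "off"

bright = np.zeros((10, 10), np.uint8)
bright[:2, :2] = 1                       # 4 of 100 pixels lit -> 4%
dark = np.zeros((10, 10), np.uint8)      # no pixels lit
print(lamp_state(bright), lamp_state(dark))  # → on off
```

The low threshold means even a small lit region within the cropped ROI is enough to report the lamp as on.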
And S160, feeding back the indication state of the indicator lamp to the terminal so as to display the indication state at the terminal.
According to the indicator lamp state identification method, placing a marker beside the indicator lamp strengthens the stability of correcting the current image to be identified. A template image is prepared and the region of interest corresponding to the indicator lamp is set, with different LED indicator lamps marked by rectangular frames in different modes, which reduces the workload of manual marking. After the image to be identified is obtained, its feature points are matched against the template image and the state of the indicator lamp is judged after color feature extraction. The method makes no strict demands on the environment, can be applied in various environments and to LED indicator lamps of various shapes and sizes, and therefore has universality.
Fig. 8 is a schematic block diagram of an indicator light status recognition device 300 according to an embodiment of the present invention. As shown in fig. 8, the present invention further provides an indicator light status recognition device 300 corresponding to the above indicator light status recognition method. The indicator light state recognition apparatus 300 includes a unit for performing the above-described indicator light state recognition method, and may be configured in a server. Specifically, referring to fig. 8, the indicator light state recognition apparatus 300 includes a preprocessing unit 301, an image acquisition unit 302, a matching unit 303, a color information acquisition unit 304, a state generation unit 305, and a state feedback unit 306.
The preprocessing unit 301 is configured to mark a mark beside the indicator light, and make a template image to obtain relevant information of a region of interest corresponding to the indicator light; an image acquisition unit 302, configured to acquire an image to be identified; the matching unit 303 is configured to perform feature matching on the image to be identified and the template image, and obtain target position information according to the relevant information of the region of interest corresponding to the indicator light; a color information obtaining unit 304, configured to clip the image to be identified and extract color features according to the target position information, so as to obtain color feature information; a state generating unit 305 for generating an indication state of the indicator lamp according to the color feature information; and the state feedback unit 306 is configured to feedback the indication state of the indicator lamp to the terminal, so as to display the indication state at the terminal.
In one embodiment, as shown in fig. 9, the preprocessing unit 301 includes a marking subunit 3011, a template generation subunit 3012, and a save subunit 3013.
A marking subunit 3011, configured to mark a marker beside the indicator light; the template generation subunit 3012 is configured to select a clear field image to obtain a template image, and mark a region of interest corresponding to the indicator light; and the storage subunit 3013 is used for storing the corresponding template image and the relevant information of the region of interest corresponding to the indicator lamp according to the inspection point.
In one embodiment, as shown in fig. 10, the matching unit 303 includes a feature point extracting subunit 3031, a matching pair generating subunit 3032, a matrix acquiring subunit 3033, and a position acquiring subunit 3034.
The feature point extraction subunit 3031 is configured to perform feature point extraction on the image to be identified and the template image to obtain feature points of the image to be identified and feature points of the template image; a matching pair generating subunit 3032, configured to match the feature points of the image to be identified with the feature points of the template image to obtain feature matching pairs; a matrix acquisition subunit 3033, configured to acquire an affine matrix of the feature matching pair; the position obtaining subunit 3034 is configured to obtain the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the relevant information of the region of interest corresponding to the indicator lamp, so as to obtain the target position information.
In one embodiment, as shown in fig. 11, the matching pair generating subunit 3032 includes a similarity calculating subunit 30321 and a filtering subunit 30322.
A similarity calculating subunit 30321, configured to calculate a similarity between the feature points of the image to be identified and the feature points of the template image, so as to obtain a similarity set; and the screening subunit 30322 is configured to screen feature points of the image to be identified and feature points of the template image corresponding to the similarity not lower than the set threshold in the similarity set to form a feature matching pair.
In one embodiment, as shown in fig. 12, the color information acquisition unit 304 includes a clipping subunit 3041, a conversion subunit 3042, and an information extraction subunit 3043.
A clipping subunit 3041, configured to clip the image to be identified according to the target position information, so as to obtain an intermediate image; a conversion subunit 3042, configured to convert the intermediate image from the RGB image space into the YIQ image space, so as to obtain a target image; the information extraction subunit 3043 is configured to obtain a binary image corresponding to the target image and calculate the percentage of the binary image in the whole target image, so as to obtain color feature information.
In an embodiment, the state generating unit 305 is configured to determine whether the color feature information is greater than a color threshold; if the color feature information is greater than the color threshold, the indication state of the indicator lamp is a lighting state; and if the color feature information is not greater than the color threshold, the indication state of the indicator lamp is an off state.
It should be noted that, as will be clearly understood by those skilled in the art, the specific implementation process of the indicator light state recognition device 300 and each unit may refer to the corresponding description in the foregoing method embodiments, and for convenience and brevity of description, the description is omitted here.
The above-described indicator light state recognition means 300 may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 13.
Referring to fig. 13, fig. 13 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a server, where the server may be a stand-alone server or may be a server cluster formed by a plurality of servers.
With reference to FIG. 13, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032 includes program instructions that, when executed, cause the processor 502 to perform a method of identifying status of an indicator light.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for the execution of a computer program 5032 in the non-volatile storage medium 503, which computer program 5032, when executed by the processor 502, causes the processor 502 to perform a method of identifying status of an indicator light.
The network interface 505 is used for network communication with other devices. It will be appreciated by those skilled in the art that the structure shown in fig. 13 is merely a block diagram of some of the structures associated with the present application and does not constitute a limitation of the computer device 500 to which the present application is applied, and that a particular computer device 500 may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to execute a computer program 5032 stored in a memory to implement the steps of:
Marking a mark beside the indicator lamp, and making a template image to obtain relevant information of the region of interest corresponding to the indicator lamp; acquiring an image to be identified; performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp; cutting the image to be identified and extracting color features according to the target position information to obtain color feature information; generating an indication state of an indication lamp according to the color characteristic information; and feeding back the indication state of the indication lamp to the terminal so as to display the indication state at the terminal.
In an embodiment, when the step of marking an identifier beside the indicator light and making a template image to obtain relevant information of the region of interest corresponding to the indicator light is implemented by the processor 502, the following steps are specifically implemented:
marking a mark beside the indicator light; selecting a clear field image to obtain a template image, and marking a region of interest corresponding to the indicator lamp; and storing the corresponding template image and the relevant information of the region of interest corresponding to the indicator lamp according to the inspection point.
In an embodiment, when the processor 502 performs the step of performing feature matching on the image to be identified and the template image, and obtaining the target position information according to the relevant information of the region of interest corresponding to the indicator, the following steps are specifically implemented:
Extracting characteristic points of the image to be identified and the template image to obtain the characteristic points of the image to be identified and the characteristic points of the template image; matching the feature points of the image to be identified and the feature points of the template image to obtain feature matching pairs; obtaining an affine matrix of the feature matching pair; and obtaining the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the relevant information of the region of interest corresponding to the indicator lamp, so as to obtain the target position information.
In an embodiment, when the processor 502 performs the matching between the feature points of the image to be identified and the feature points of the template image to obtain the feature matching pair, the following steps are specifically implemented:
calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set; and screening the feature points of the image to be identified and the feature points of the template image corresponding to the similarity which is not lower than the set threshold in the similarity set to form feature matching pairs.
In an embodiment, when the step of clipping the image to be identified according to the target position information and extracting the color features to obtain the color feature information is implemented by the processor 502, the following steps are specifically implemented:
Cutting the image to be identified according to the target position information to obtain an intermediate image; converting the intermediate image from the RGB image space into the YIQ image space to obtain a target image; and acquiring a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color characteristic information.
In an embodiment, when the step of generating the indication state of the indicator light according to the color feature information is implemented by the processor 502, the following steps are specifically implemented:
judging whether the color characteristic information is larger than a color threshold value or not; if the color characteristic information is larger than the color threshold value, the indication state of the indicator lamp is a lighting state; and if the color characteristic information is not greater than the color threshold value, the indication state of the indicator lamp is an off state.
It should be appreciated that in embodiments of the present application, the processor 502 may be a central processing unit (Central Processing Unit, CPU), the processor 502 may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSPs), application specific integrated circuits (Application Specific Integrated Circuit, ASICs), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. Wherein the general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Those skilled in the art will appreciate that all or part of the flow in a method embodying the above described embodiments may be accomplished by computer programs instructing the relevant hardware. The computer program comprises program instructions, and the computer program can be stored in a storage medium, which is a computer readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the flow steps of the embodiments of the method described above.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer readable storage medium. The storage medium stores a computer program which, when executed by a processor, causes the processor to perform the steps of:
marking a mark beside the indicator lamp, and making a template image to obtain relevant information of the region of interest corresponding to the indicator lamp; acquiring an image to be identified; performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp; cutting the image to be identified and extracting color features according to the target position information to obtain color feature information; generating an indication state of an indication lamp according to the color characteristic information; and feeding back the indication state of the indication lamp to the terminal so as to display the indication state at the terminal.
In an embodiment, when the processor executes the computer program to implement the step of marking an identifier beside the indicator light and making a template image to obtain relevant information of the region of interest corresponding to the indicator light, the specific implementation steps include:
marking a mark beside the indicator light; selecting a clear field image to obtain a template image, and marking a region of interest corresponding to the indicator lamp; and storing the corresponding template image and the relevant information of the region of interest corresponding to the indicator lamp according to the inspection point.
In an embodiment, when the processor executes the computer program to perform the step of performing feature matching on the image to be identified and the template image, and acquiring the target position information according to the relevant information of the region of interest corresponding to the indicator, the processor specifically performs the following steps:
extracting characteristic points of the image to be identified and the template image to obtain the characteristic points of the image to be identified and the characteristic points of the template image; matching the feature points of the image to be identified and the feature points of the template image to obtain feature matching pairs; obtaining an affine matrix of the feature matching pair; and obtaining the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the relevant information of the region of interest corresponding to the indicator lamp, so as to obtain the target position information.
In an embodiment, when the processor executes the computer program to match the feature points of the image to be identified and the feature points of the template image to obtain the feature matching pair, the processor specifically realizes the following steps:
calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set; and screening the feature points of the image to be identified and the feature points of the template image corresponding to the similarity which is not lower than the set threshold in the similarity set to form feature matching pairs.
In one embodiment, when the processor executes the computer program to implement the step of cropping the image to be identified according to the target position information and extracting the color features to obtain the color feature information, the processor specifically implements the following steps:
cutting the image to be identified according to the target position information to obtain an intermediate image; converting the intermediate image from the RGB image space into the YIQ image space to obtain a target image; and acquiring a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color characteristic information.
In one embodiment, when the processor executes the computer program to implement the step of generating the indication state of the indicator lamp according to the color feature information, the processor specifically implements the following steps:
judging whether the color characteristic information is larger than a color threshold value or not; if the color characteristic information is larger than the color threshold value, the indication state of the indicator lamp is a lighting state; and if the color characteristic information is not greater than the color threshold value, the indication state of the indicator lamp is an off state.
The storage medium may be a U-disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk, or other various computer-readable storage media that can store program codes.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied in electronic hardware, in computer software, or in a combination of the two, and that the elements and steps of the examples have been generally described in terms of function in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be combined, divided and deleted according to actual needs. In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The integrated unit may be stored in a storage medium if implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention is essentially or a part contributing to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a terminal, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and substitutions of equivalents may be made and equivalents will be apparent to those skilled in the art without departing from the scope of the invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (5)

1. An indicator lamp state identification method, characterized by comprising the following steps:
marking a mark beside the indicator lamp, and making a template image to obtain relevant information of the region of interest corresponding to the indicator lamp;
acquiring an image to be identified;
performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp;
cutting the image to be identified and extracting color features according to the target position information to obtain color feature information;
generating an indication state of an indication lamp according to the color characteristic information;
feeding back the indication state of the indication lamp to the terminal so as to display the indication state at the terminal;
the feature matching of the image to be identified and the template image, and the acquisition of the target position information according to the relevant information of the region of interest corresponding to the indicator lamp, comprises the following steps:
Extracting characteristic points of the image to be identified and the template image to obtain the characteristic points of the image to be identified and the characteristic points of the template image;
matching the feature points of the image to be identified and the feature points of the template image to obtain feature matching pairs;
obtaining an affine matrix of the feature matching pair;
obtaining the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the relevant information of the region of interest corresponding to the indicator lamp so as to obtain the target position information;
the matching of the feature points of the image to be identified and the feature points of the template image to obtain a feature matching pair comprises the following steps:
calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set;
screening feature points of the image to be identified and feature points of the template image corresponding to the similarity which is not lower than a set threshold in the similarity set to form feature matching pairs;
the step of clipping the image to be identified and extracting color features according to the target position information to obtain color feature information comprises the following steps:
cutting the image to be identified according to the target position information to obtain an intermediate image;
Converting the intermediate image from the RGB image space into the YIQ image space to obtain a target image;
acquiring a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color characteristic information;
the generating the indication state of the indicator lamp according to the color characteristic information comprises the following steps:
judging whether the color characteristic information is larger than a color threshold value or not;
if the color characteristic information is larger than a color threshold value, the indication state of the indicator lamp is a lighting state;
and if the color characteristic information is not greater than the color threshold value, the indication state of the indicator lamp is an off state.
2. The method for identifying the status of an indicator lamp according to claim 1, wherein marking a mark beside the indicator lamp and making a template image to obtain information about a region of interest corresponding to the indicator lamp, comprises:
marking a mark beside the indicator light;
selecting a clear field image to obtain a template image, and marking a region of interest corresponding to the indicator lamp;
and storing the corresponding template image and the relevant information of the region of interest corresponding to the indicator lamp according to the inspection point.
3. An indicator lamp state recognition device, characterized by comprising:
the preprocessing unit is used for marking a mark beside the indicator lamp and making a template image so as to obtain relevant information of the region of interest corresponding to the indicator lamp;
the image acquisition unit is used for acquiring an image to be identified;
the matching unit is used for carrying out feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp;
the color information acquisition unit is used for cutting the image to be identified and extracting color characteristics according to the target position information so as to obtain color characteristic information;
a state generating unit for generating an indication state of the indicator lamp according to the color characteristic information;
the state feedback unit is used for feeding back the indication state of the indication lamp to the terminal so as to display the indication state at the terminal;
the preprocessing unit includes:
a marking subunit for marking a mark beside the indicator light;
the template generation subunit is used for selecting a clear field image to obtain a template image and marking out a region of interest corresponding to the indicator lamp;
the storage subunit is used for storing the corresponding template image and the relevant information of the region of interest corresponding to the indicator lamp according to the inspection point;
The matching unit comprises a feature point extraction subunit, a matching pair generation subunit, a matrix acquisition subunit and a position acquisition subunit;
the characteristic point extraction subunit is used for extracting characteristic points of the image to be identified and the template image to obtain characteristic points of the image to be identified and characteristic points of the template image; the matching pair generating subunit is used for matching the characteristic points of the image to be identified with the characteristic points of the template image to obtain a characteristic matching pair; a matrix acquisition subunit, configured to acquire an affine matrix of the feature matching pair; the position acquisition subunit is used for acquiring the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the related information of the region of interest corresponding to the indicator lamp so as to obtain target position information;
the matching pair generation subunit comprises a similarity calculation subunit and a screening subunit;
the similarity calculation subunit is used for calculating the similarity between the feature points of the image to be identified and the feature points of the template image, so as to obtain a similarity set;
the screening subunit is used for screening out the feature points of the image to be identified and the feature points of the template image whose similarity in the similarity set is not lower than a set threshold, so as to form the feature matching pairs;
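The matching steps recited above (feature point extraction, similarity screening, affine matrix estimation, and mapping the region of interest to the target position) can be illustrated with a minimal NumPy sketch. This is not the patented implementation: the choice of cosine similarity over descriptor vectors, the least-squares affine fit, the function names, and the 0.8 threshold are all assumptions for illustration.

```python
import numpy as np

def match_features(desc_query, desc_template, threshold=0.8):
    """Match descriptor rows by cosine similarity; keep pairs whose
    similarity is not lower than the set threshold (hypothetical value)."""
    # Normalize rows so the dot product equals cosine similarity.
    q = desc_query / np.linalg.norm(desc_query, axis=1, keepdims=True)
    t = desc_template / np.linalg.norm(desc_template, axis=1, keepdims=True)
    sim = q @ t.T                        # the "similarity set", shape (Nq, Nt)
    best = sim.argmax(axis=1)            # best template point per query point
    keep = sim[np.arange(len(q)), best] >= threshold
    return [(int(i), int(best[i])) for i in np.flatnonzero(keep)]

def estimate_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine matrix mapping src_pts -> dst_pts."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])   # homogeneous coords, (N, 3)
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)    # solve A @ M = dst, (3, 2)
    return M.T                                     # conventional (2, 3) affine

def map_roi(M, roi_pts):
    """Apply the affine matrix to template ROI corners to get the
    indicator lamp's position (target position information)."""
    pts = np.asarray(roi_pts, dtype=float)
    return pts @ M[:, :2].T + M[:, 2]
```

In practice a library routine with outlier rejection (e.g. a RANSAC-based affine estimator) would replace the plain least-squares fit, since some screened matches may still be wrong.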
The color information acquisition unit comprises a cropping subunit, a conversion subunit and an information extraction subunit;
the cropping subunit is used for cropping the image to be identified according to the target position information, so as to obtain an intermediate image;
the conversion subunit is used for converting the intermediate image from the RGB color space to the YIQ color space, so as to obtain a target image;
the information extraction subunit is used for acquiring a binary image corresponding to the target image and calculating the percentage of the binary image within the whole target image, so as to obtain the color feature information;
the state generating unit is configured to judge whether the color feature information is greater than a color threshold; if the color feature information is greater than the color threshold, the indication state of the indicator lamp is a lit state; and if the color feature information is not greater than the color threshold, the indication state of the indicator lamp is an extinguished state.
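The color extraction and state decision above (crop, RGB-to-YIQ conversion, binarization, percentage, threshold comparison) can likewise be sketched. This is a hedged illustration, not the patented method: it assumes a red lamp so that the I channel of YIQ carries the signal, and both thresholds are hypothetical values.

```python
import numpy as np

# NTSC RGB -> YIQ conversion matrix (rows: Y, I, Q).
RGB2YIQ = np.array([
    [0.299,  0.587,  0.114],
    [0.596, -0.274, -0.322],
    [0.211, -0.523,  0.312],
])

def lamp_state(rgb_crop, i_threshold=0.2, color_threshold=0.05):
    """Classify a cropped indicator-lamp image as lit or off.

    rgb_crop: float array in [0, 1], shape (H, W, 3), already cut from the
    image to be identified using the target position information.
    """
    yiq = rgb_crop @ RGB2YIQ.T
    # A red/orange lamp pushes the I channel positive; binarize on it
    # (channel choice and i_threshold are assumptions for a red lamp).
    binary = yiq[..., 1] > i_threshold
    ratio = binary.mean()        # percentage of the target image that is "on"
    return "lit" if ratio > color_threshold else "off"
```

For lamps of other colors, a different YIQ channel (or sign) would be binarized; the percentage-against-threshold decision stays the same.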
4. A computer device, characterized in that it comprises a memory storing a computer program, and a processor which, when executing the computer program, implements the method according to any one of claims 1 to 2.
5. A storage medium storing a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 2.
CN202010441220.3A 2020-05-22 2020-05-22 Indicator light state identification method and device, computer equipment and storage medium Active CN111639647B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010441220.3A CN111639647B (en) 2020-05-22 2020-05-22 Indicator light state identification method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111639647A CN111639647A (en) 2020-09-08
CN111639647B (en) 2023-07-25

Family

ID=72329036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010441220.3A Active CN111639647B (en) 2020-05-22 2020-05-22 Indicator light state identification method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111639647B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111950535B (en) * 2020-09-23 2022-07-12 苏州科达科技股份有限公司 Traffic signal lamp color, color recognition method, electronic device and storage medium
CN112115897B (en) * 2020-09-24 2023-12-22 深圳市赛为智能股份有限公司 Multi-pointer instrument alarm detection method, device, computer equipment and storage medium
CN112364740B (en) * 2020-10-30 2024-04-19 交控科技股份有限公司 Unmanned aerial vehicle room monitoring method and system based on computer vision
CN112364780A (en) * 2020-11-11 2021-02-12 许继集团有限公司 Method for identifying state of indicator lamp
CN117094966B (en) * 2023-08-21 2024-04-05 青岛美迪康数字工程有限公司 Tongue image identification method and device based on image amplification and computer equipment
CN117670884A (en) * 2024-01-31 2024-03-08 深圳中科精工科技有限公司 Image labeling method, device, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106529556A (en) * 2016-11-16 2017-03-22 国家电网公司 Visual inspection system for instrument indicator lamp
CN107103330A (en) * 2017-03-31 2017-08-29 深圳市浩远智能科技有限公司 A kind of LED status recognition methods and device
CN107392116A (en) * 2017-06-30 2017-11-24 广州广电物业管理有限公司 A kind of indicator lamp recognition methods and system
CN107832770A (en) * 2017-11-08 2018-03-23 浙江国自机器人技术有限公司 A kind of equipment routing inspection method, apparatus, system, storage medium and crusing robot
CN109711414A (en) * 2018-12-19 2019-05-03 国网四川省电力公司信息通信公司 Equipment indicating lamp color identification method and system based on camera image acquisition
CN111178200A (en) * 2019-12-20 2020-05-19 海南车智易通信息技术有限公司 Identification method of instrument panel indicator lamp and computing equipment


Similar Documents

Publication Publication Date Title
CN111639647B (en) Indicator light state identification method and device, computer equipment and storage medium
CN112906694B (en) Reading correction system and method for transformer substation inclined pointer instrument image
CN110659636B (en) Pointer instrument reading identification method based on deep learning
CN112115893A (en) Instrument panel pointer reading identification method and device, computer equipment and storage medium
Stent et al. Detecting change for multi-view, long-term surface inspection.
CN112418216B (en) Text detection method in complex natural scene image
CN112508098B (en) Dial plate positioning and automatic reading pointer type meter value identification method and system
AU2020103716A4 (en) Training method and device of automatic identification device of pointer instrument with numbers in natural scene
CN110533654A (en) The method for detecting abnormality and device of components
CN110738164A (en) Part abnormity detection method, model training method and device
CN111768498A (en) Visual positioning method and system based on dense semantic three-dimensional map and mixed features
CN111563896A (en) Image processing method for catenary anomaly detection
CN111353502B (en) Digital table identification method and device and electronic equipment
CN114758249A (en) Target object monitoring method, device, equipment and medium based on field night environment
CN116168351A (en) Inspection method and device for power equipment
CN114120094A (en) Water pollution identification method and system based on artificial intelligence
CN109614512B (en) Deep learning-based power equipment retrieval method
CN115019294A (en) Pointer instrument reading identification method and system
CN114998432A (en) YOLOv 5-based circuit board detection point positioning method
CN114627461A (en) Method and system for high-precision identification of water gauge data based on artificial intelligence
CN111738259A (en) Tower state detection method and device
CN116977250A (en) Defect detection method and device for industrial parts and computer equipment
Zhang et al. A YOLOv3-Based Industrial Instrument Classification and Reading Recognition Method
Li et al. HybridPoint: Point Cloud Registration Based on Hybrid Point Sampling and Matching
CN114549513A (en) Part identification method, part identification device, quality inspection method, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant