CN111639647A - Indicating lamp state identification method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN111639647A
CN111639647A
Authority
CN
China
Prior art keywords
image
indicator light
information
color
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010441220.3A
Other languages
Chinese (zh)
Other versions
CN111639647B (en)
Inventor
肖娟
王秋阳
吴亦歌
郑博超
李德民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Sunwin Intelligent Co Ltd
Original Assignee
Shenzhen Sunwin Intelligent Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sunwin Intelligent Co Ltd filed Critical Shenzhen Sunwin Intelligent Co Ltd
Priority to CN202010441220.3A priority Critical patent/CN111639647B/en
Publication of CN111639647A publication Critical patent/CN111639647A/en
Application granted granted Critical
Publication of CN111639647B publication Critical patent/CN111639647B/en
Legal status: Active

Classifications

    • G06V 10/255: Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G06V 10/267: Segmentation of patterns by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V 10/56: Extraction of image or video features relating to colour
    • G06V 10/751: Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • Y02B 20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The invention relates to an indicator light state identification method and device, computer equipment and a storage medium. The method comprises: placing a marker beside the indicator light and creating a template image to obtain information about the region of interest corresponding to the indicator light; acquiring an image to be identified; performing feature matching between the image to be identified and the template image, and obtaining target position information according to the information about the region of interest corresponding to the indicator light; cropping the image to be identified and extracting color features according to the target position information to obtain color feature information; generating the indication state of the indicator light according to the color feature information; and feeding the indication state of the indicator light back to the terminal for display. The invention enhances the stability of correcting the current image to be recognized, labels different LED indicator lights with rectangular frames in different modes so as to reduce the workload of manual labeling, places few demands on the environment, can be applied in a wide variety of environments, and is universal.

Description

Indicating lamp state identification method and device, computer equipment and storage medium
Technical Field
The invention relates to an indicator light identification method, in particular to an indicator light state identification method, an indicator light state identification device, computer equipment and a storage medium.
Background
With the continuous development of the national economy, modern society depends ever more heavily on electric power, and a power system fault can cause enormous economic losses. The substation is a core facility of the power system, and the correct reading of each of its instruments is of great importance, serving as the main basis for judging the operating state of transformer equipment. An intelligent inspection robot can patrol the site continuously 24 hours a day, collect environmental data, read the abnormal conditions of key equipment in real time, and raise an alarm automatically, greatly improving the reliability and consistency of inspection. Using intelligent robots to assist manual inspection reduces labor intensity, improves operating efficiency and lowers operation and maintenance costs.
In recent years, with the spread of automatic substation inspection robots, research on automatic reading and identification of substation pointer instruments has made some progress, but several problems still slow its adoption. For the LED indicator light state identification methods currently applied to substation inspection robots, the main difficulty is that small, unlit LED indicator lights are detected unstably and easily lost, so the state of the indicator light cannot be estimated accurately.
In the prior art, indicator light states have been identified by image segmentation based on a fuzzy threshold, image identification based on color chroma, and image threshold segmentation based on gray-level processing. These three methods are effective when the background is simple, the illumination is normal and the LED indicator light is clearly visible, but they struggle to detect unlit lights in complex environments with blurred imagery or large illumination changes. There is also an indicator light detection and recognition method based on color attributes, which maps the input image from the original RGB color space into an 11-dimensional color attribute space to extract color features efficiently. Although this solves the missed detections caused by image distortion due to illumination, viewing angle and similar factors, it cannot cope with the small size of the target. Alternatively, the LED indicator lights can be labeled manually; since their positions are then known in advance, the detection step can be skipped and the algorithm simplified, but in a real scene the number of LED indicator lights is large, and labeling them one by one becomes laborious. Finally, LED indicator lights can be detected with deep learning, which is more robust than the traditional methods, but speed is a major problem: real-time detection is difficult on low-end hardware, which affects the judgment of a flashing LED indicator light's state.
Therefore, it is necessary to design a new method that enhances the stability of correcting the current image to be recognized, reduces the workload of manual labeling, places few demands on the environment, and is universally applicable.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides an indicator light state identification method, an indicator light state identification device, computer equipment and a storage medium.
In order to achieve the purpose, the invention adopts the following technical scheme: the status identification method of the indicator lamp comprises the following steps:
placing a marker beside the indicator light, and creating a template image to obtain information about the region of interest corresponding to the indicator light;
acquiring an image to be identified;
performing feature matching between the image to be identified and the template image, and obtaining target position information according to the information about the region of interest corresponding to the indicator light;
cropping the image to be identified and extracting color features according to the target position information to obtain color feature information;
generating the indication state of the indicator light according to the color feature information;
and feeding the indication state of the indicator light back to the terminal for display at the terminal.
In a further embodiment, placing a marker beside the indicator light and creating a template image to obtain the information about the region of interest corresponding to the indicator light comprises the following steps:
placing a marker beside the indicator light;
selecting a clear field image as the template image, and labeling the region of interest corresponding to the indicator light;
and storing the corresponding template image and the information about the region of interest corresponding to the indicator light according to the inspection point.
In a further embodiment, performing feature matching between the image to be identified and the template image and obtaining the target position information according to the information about the region of interest corresponding to the indicator light comprises the following steps:
extracting feature points from the image to be identified and from the template image to obtain the feature points of the image to be identified and the feature points of the template image;
matching the feature points of the image to be identified with the feature points of the template image to obtain feature matching pairs;
obtaining an affine matrix from the feature matching pairs;
and obtaining the position of the indicator light in the image to be identified according to the affine matrix of the feature matching pairs and the information about the region of interest corresponding to the indicator light, so as to obtain the target position information.
In a further embodiment, matching the feature points of the image to be identified with the feature points of the template image to obtain feature matching pairs comprises the following steps:
calculating the similarity between the feature points of the image to be identified and the feature points of the template image to obtain a similarity set;
and screening out the feature points of the image to be identified and the feature points of the template image whose similarity in the set is not lower than a set threshold, to form feature matching pairs.
In a further embodiment, cropping the image to be identified and extracting color features according to the target position information to obtain the color feature information comprises:
cropping the image to be identified according to the target position information to obtain an intermediate image;
converting the intermediate image from the RGB image space into the YIQ image space to obtain a target image;
and obtaining a binary image corresponding to the target image, and calculating the percentage of the binary image within the whole target image to obtain the color feature information.
In a further embodiment, generating the indication state of the indicator light according to the color feature information comprises:
judging whether the color feature information is greater than a color threshold;
if the color feature information is greater than the color threshold, the indication state of the indicator light is the lit state;
and if the color feature information is not greater than the color threshold, the indication state of the indicator light is the unlit state.
The invention also provides an indicator light state identification device, comprising:
the preprocessing unit, used for placing a marker beside the indicator light and creating a template image to obtain the information about the region of interest corresponding to the indicator light;
the image acquisition unit, used for acquiring the image to be identified;
the matching unit, used for performing feature matching between the image to be identified and the template image and obtaining the target position information according to the information about the region of interest corresponding to the indicator light;
the color information acquisition unit, used for cropping the image to be identified and extracting color features according to the target position information to obtain the color feature information;
the state generating unit, used for generating the indication state of the indicator light according to the color feature information;
and the state feedback unit, used for feeding the indication state of the indicator light back to the terminal for display at the terminal.
In a further embodiment, the preprocessing unit comprises:
the labeling subunit, used for placing a marker beside the indicator light;
the template generation subunit, used for selecting a clear field image as the template image and labeling the region of interest corresponding to the indicator light;
and the storage subunit, used for storing the corresponding template image and the information about the region of interest corresponding to the indicator light according to the inspection point.
The invention also provides computer equipment which comprises a memory and a processor, wherein the memory is stored with a computer program, and the processor realizes the method when executing the computer program.
The invention also provides a storage medium storing a computer program which, when executed by a processor, is operable to carry out the method as described above.
Compared with the prior art, the invention has the following beneficial effects. By placing a marker beside the indicator light, the invention enhances the stability of correcting the current image to be recognized; by setting up a template image and the region of interest corresponding to each indicator light, and labeling different LED indicator lights with rectangular frames in different modes, it reduces the workload of manual labeling. After the image to be identified is acquired, its feature points are matched against the template image, and the state of the indicator light is judged after color features are extracted. The method places few demands on the environment, can be applied in a wide variety of environments and to LED indicator lights of various shapes and sizes, and is therefore universal.
The invention is further described below with reference to the accompanying drawings and specific embodiments.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic view of an application scenario of an indicator light state identification method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of an indicator light status identification method according to an embodiment of the present invention;
fig. 3 is a schematic sub-flow chart of an indicator light status identification method according to an embodiment of the present invention;
fig. 4 is a schematic sub-flow chart of an indicator light status identification method according to an embodiment of the present invention;
fig. 5 is a schematic sub-flow chart of an indicator light status identification method according to an embodiment of the present invention;
fig. 6 is a schematic sub-flow chart of an indicator light status identification method according to an embodiment of the present invention;
fig. 7 is a schematic sub-flow chart of an indicator light status identification method according to an embodiment of the present invention;
fig. 8 is a schematic block diagram of an indicator light status identification apparatus provided in an embodiment of the present invention;
fig. 9 is a schematic block diagram of a preprocessing unit of the indicator light state identification apparatus provided in the embodiment of the present invention;
fig. 10 is a schematic block diagram of a matching unit of the indicator light state identification apparatus provided in the embodiment of the present invention;
fig. 11 is a schematic block diagram of a matching pair generation subunit of the indicator light state identification apparatus provided in the embodiment of the present invention;
fig. 12 is a schematic block diagram of a color information obtaining unit of the indicator light state identification apparatus according to the embodiment of the present invention;
FIG. 13 is a schematic block diagram of a computer device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic view of an application scenario of an indicator light state identification method according to an embodiment of the present invention, and fig. 2 is a schematic flow chart of the method. The indicator light state identification method is applied to a server. The server exchanges data with a terminal: the server can be mounted on an inspection robot equipped with a camera, the captured images are passed to the server for extraction and analysis, and the resulting state is displayed on the terminal, which can be a display screen integrated on the inspection robot.
Fig. 2 is a schematic flowchart of an indicator light state identification method according to an embodiment of the present invention. As shown in fig. 2, the method includes the following steps S110 to S160.
S110, placing a marker beside the indicator light, and creating a template image to obtain the information about the region of interest corresponding to the indicator light.
In this embodiment, the template image is a live image that can serve as a reference, and the information about the region of interest corresponding to the indicator light comprises data such as the coordinates of the indicator light's position. Attaching a marker with distinctive features beside the LED indicator light enhances the stability of correcting the current image to be recognized, and labeling different LED indicator lights with rectangular frames in different modes reduces the workload of manual labeling.
In an embodiment, referring to fig. 3, the step S110 may include steps S111 to S113.
S111, placing a marker beside the indicator light.
In this embodiment, a marker with distinctive key-point features is pasted beside the LED indicator light on site.
A marker with distinctive key-point features is one with obvious corners, such as a bar code, a two-dimensional code, or a purpose-made image. When the current image to be recognized has a horizontal offset, an angular rotation or a similar deviation from the template image, the two images must be aligned, which is done by extracting the marker's feature points from both the image to be recognized and the template image.
And S112, selecting a clear field image to obtain a template image, and marking the region of interest corresponding to the indicator light.
Specifically, a clear field image is selected as the template image and labeled, that is, a region of interest (ROI) is drawn around each LED indicator light.
S113, storing the corresponding template image and the related information of the region of interest corresponding to the indicator lamp according to the inspection point.
The region of interest corresponding to an indicator light is a rectangular frame drawn around each LED indicator light. If there are few LED indicator lights, a rectangular frame can be labeled for each one individually. If there are many and they are arranged in a matrix, they can be labeled in table mode: the number of rows and columns of the table is set according to the arrangement of the LED indicator lights in the image, and then, clicking from the upper-left corner of the image and dragging down, a table matching the LED arrangement matrix is generated, in which each small cell gives the coordinate position of one LED indicator light.
Storing the template image and the information about the region of interest corresponding to the indicator light means saving the template image and the annotation information under the name of the inspection point. For example, the template image of inspection point 1 is named point1.jpg, and each inspection point corresponds to an information file; the annotation information of inspection point 1 is stored in point1.txt. The stored information comprises the total number of targets, recorded as n, and the coordinate information of each rectangular frame together with an index (i, j), where the frame indexed (i, j) represents the jth LED indicator light in the ith row.
The method places few demands on the environment, can be applied in various environments and to LED indicator lights of various shapes and sizes, and is universal.
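The per-point storage scheme described above (for example, point1.jpg with a matching point1.txt) can be sketched in Python. The exact file layout below, a count line followed by one `x y w h row col` record per rectangle, is an assumed format: the patent specifies only the point-based naming and the stored fields, not their serialization.

```python
import os

def save_annotations(point_name, boxes, out_dir="."):
    """Store ROI annotations for one inspection point as <point>.txt.

    boxes: list of (x, y, w, h, row, col) tuples, one rectangle per LED
    indicator; (row, col) is the lamp's (i, j) position in the matrix.
    """
    path = os.path.join(out_dir, point_name + ".txt")
    with open(path, "w") as f:
        f.write("%d\n" % len(boxes))  # total number of targets n
        for x, y, w, h, row, col in boxes:
            f.write("%d %d %d %d %d %d\n" % (x, y, w, h, row, col))
    return path

def load_annotations(point_name, in_dir="."):
    """Read back the annotation file written by save_annotations."""
    path = os.path.join(in_dir, point_name + ".txt")
    with open(path) as f:
        n = int(f.readline())
        return [tuple(int(v) for v in f.readline().split()) for _ in range(n)]
```

At runtime the server would look up point1.jpg and point1.txt by the current inspection point's name before matching.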
And S120, acquiring an image to be identified.
In this embodiment, the image to be recognized refers to a current live picture taken by the inspection robot.
And S130, performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp.
In this embodiment, the target position information refers to coordinate information of the current indicator light in the image to be recognized.
In an embodiment, referring to fig. 4, the step S130 may include steps S131 to S134.
S131, extracting the characteristic points of the image to be recognized and the template image to obtain the characteristic points of the image to be recognized and the characteristic points of the template image.
In the present embodiment, the feature points refer to the distinctive points of the marker within the image.
Specifically, the feature points of the template image and of the current image to be recognized are obtained using the ORB (Oriented FAST and Rotated BRIEF) feature extraction method.
And S132, matching the characteristic points of the image to be recognized and the characteristic points of the template image to obtain a characteristic matching pair.
In this embodiment, the feature matching pair refers to a feature point combination in which the similarity of two feature points in the image to be recognized and the template image meets the requirement.
Feature matching is performed on feature descriptors, which are usually vectors; the distance between two feature descriptors reflects their degree of similarity, that is, how alike the two feature points are. Different distance measures can be selected depending on the descriptor type: for floating-point descriptors, the Euclidean distance can be used; for binary descriptors, the Hamming distance, i.e. the number of bit positions in which two binary strings differ.
Given a way of computing descriptor similarity, matching feature points means finding, within the feature point set, the most similar feature point. Matching can be carried out by the following methods:
Brute-force matching: compute the distances between a given feature point's descriptor and all other feature point descriptors, sort the resulting distances, and take the closest one as the matching point.
Filtering out wrong matches:
By the twice-minimum-distance rule, a matched pair whose Hamming distance exceeds twice the minimum distance observed among all matches is considered a wrong match and filtered out; pairs within this value are kept as correct matches.
Cross matching, i.e. cross filtering: after matching once, match again in the reverse direction from the matched point; if this returns the original point, the match is considered correct. For example, if feature point A is matched by brute force to feature point B, matching from B in the other direction should return feature point A; if it does, the match is considered correct, otherwise it is a wrong match.
KNN matching, i.e. K-nearest-neighbor matching: at matching time the K points most similar to the feature point are selected, and if the K points differ from one another sufficiently, the most similar one is chosen as the matching point. Usually K = 2 is chosen, i.e. nearest-neighbor matching: two nearest-neighbor matches are returned for each query, and if the distance ratio between the first and the second match is large enough, the first is considered a correct match; the threshold for the ratio is typically around 2.
Random sample consensus: compute a homography matrix between the two images using the matching points, and then use the reprojection error to judge whether a given match is correct.
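The twice-minimum-distance filter described above can be sketched in pure NumPy on binary descriptors. The helper names and the floor on the threshold are illustrative choices, not part of the patent:

```python
import numpy as np

def hamming_distance(a, b):
    """Number of differing bits between two uint8 descriptor vectors."""
    return int(np.unpackbits(np.bitwise_xor(a, b)).sum())

def filter_matches_by_min_distance(query, train, factor=2.0):
    """Brute-force nearest-neighbour matching with the 2x-min-distance rule.

    Each query descriptor is matched to its closest train descriptor by
    Hamming distance; pairs whose distance exceeds `factor` times the
    smallest observed distance are discarded as likely wrong matches.
    Returns (query_index, train_index, distance) tuples.
    """
    matches = []
    for qi, q in enumerate(query):
        dists = [hamming_distance(q, t) for t in train]
        ti = int(np.argmin(dists))
        matches.append((qi, ti, dists[ti]))
    min_dist = min(d for _, _, d in matches)
    # keep a small floor so an exact 0-distance match does not reject all pairs
    threshold = max(factor * min_dist, 1.0)
    return [m for m in matches if m[2] <= threshold]
```

In practice an optimized matcher would replace the Python loop, but the filtering rule is the same.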
In one embodiment, referring to fig. 5, the step S132 may include steps S1321 to S1322.
S1321, calculating the similarity of the feature points of the image to be recognized and the feature points of the template image to obtain a similarity set.
In this embodiment, the similarity set refers to a set formed by the similarities of all the feature points of the image to be recognized and all the feature points of the template image.
S1322, feature points of the image to be recognized and feature points of the template image corresponding to the similarity which is not lower than the set threshold in the similarity set are screened to form feature matching pairs.
Feature point matching is carried out with the knnMatch (K-nearest-neighbor match) algorithm to obtain feature point matching pairs. In practical applications there are often wrong matching pairs, and if these are introduced into the final motion model they cause a large error, so the RANSAC (RANdom SAmple Consensus) algorithm is used to eliminate them.
And S133, acquiring an affine matrix of the feature matching pairs.
In this embodiment, the affine matrix is the matrix obtained by an affine transformation. Affine transformation, also called affine mapping, means that in geometry a vector space undergoes one linear transformation followed by a translation into another vector space.
Specifically, the affine matrix is obtained from the retained matching pairs according to the perspective transformation principle.
And S134, acquiring the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the related information of the region of interest corresponding to the indicator lamp to obtain the target position information.
According to the affine matrix and the relevant information of the region of interest corresponding to the indicator lamp, the position of the indicator lamp in the image to be identified can be obtained: the affine matrix is used to locate the region of interest of the indicator lamp, and the target position information is then derived from the relevant information of that region of interest.
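One plausible way to carry out this step, sketched with an illustrative helper (the 2x3 matrix layout matches the usual affine convention; the function name and ROI format `(x, y, w, h)` are assumptions, not from the patent):

```python
import numpy as np

def map_roi(affine, roi):
    """Map a template-image ROI (x, y, w, h) into the image to be identified
    with a 2x3 affine matrix, returning the bounding box of the warped corners."""
    x, y, w, h = roi
    corners = np.array([[x, y], [x + w, y], [x, y + h], [x + w, y + h]], float)
    # Apply A*p + t to each corner (A is the left 2x2 block, t the last column).
    warped = corners @ affine[:, :2].T + affine[:, 2]
    x0, y0 = warped.min(axis=0)
    x1, y1 = warped.max(axis=0)
    return tuple(map(float, (x0, y0, x1 - x0, y1 - y0)))

# Identity rotation/scale plus a (10, 20) shift: the ROI simply translates.
A = np.array([[1.0, 0.0, 10.0],
              [0.0, 1.0, 20.0]])
print(map_roi(A, (5, 5, 30, 15)))  # (15.0, 25.0, 30.0, 15.0)
```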
And S140, cutting and extracting color features of the image to be recognized according to the target position information to obtain color feature information.
In this embodiment, the color feature information is the proportion of indicator-light-colored pixels in the cropped image to be recognized.
In an embodiment, referring to fig. 6, the step S140 may include steps S141 to S143.
And S141, cutting the image to be recognized according to the target position information to obtain an intermediate image.
In the present embodiment, the intermediate image refers to an image including only the indicator lamp.
And S142, converting the intermediate image according to a mode of converting an RGB image space into a YIQ image space to obtain a target image.
In the present embodiment, the target image refers to an intermediate image converted into the YIQ image space.
S143, obtaining a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color feature information.
Color feature extraction here refers to a color space conversion, namely converting the RGB image space into the YIQ image space, and then binarizing the I component of the YIQ image by a threshold method: pixels whose I value is greater than the threshold are set to 1, and pixels whose I value is less than the threshold are set to 0. The threshold is preset to 230 in this embodiment.
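A sketch of this conversion and thresholding (the RGB-to-YIQ matrix is the standard NTSC definition, which is outside the patent text; 230 is the threshold the embodiment states, and whether the I component must first be rescaled for 8-bit input is an assumption, so the example below uses a lower threshold for visibility):

```python
import numpy as np

# Standard NTSC RGB -> YIQ transform matrix (not stated in the patent).
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def i_component_binary(rgb, thresh=230):
    """Convert an HxWx3 RGB image to YIQ, then threshold the I component:
    1 where I > thresh, 0 otherwise (230 is the embodiment's value)."""
    yiq = rgb.astype(float) @ RGB2YIQ.T
    return (yiq[..., 1] > thresh).astype(np.uint8)

# One red and one black pixel; red has I = 0.596 * 255, roughly 152.
img = np.array([[[255, 0, 0], [0, 0, 0]]], dtype=np.uint8)
print(i_component_binary(img, thresh=100))  # [[1 0]]
```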
And S150, generating the indicating state of the indicating lamp according to the color characteristic information.
In an embodiment, referring to fig. 7, the step S150 may include steps S151 to S153.
S151, judging whether the color characteristic information is larger than a color threshold value;
s152, if the color characteristic information is larger than a color threshold value, the indicating state of the indicating lamp is a lighting state;
and S153, if the color characteristic information is not greater than the color threshold, the indicating state of the indicator lamp is a turning-off state.
The state of the LED indicator lamp is judged from the color feature information by calculating the percentage of pixels with value 1 in the binary image over the whole target image area: if this percentage is greater than the set color threshold, the LED indicator lamp is on; otherwise it is not on. In this embodiment, the color threshold is 0.01.
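This decision rule is a one-liner over the binarised region (function name is illustrative; 0.01 is the embodiment's stated color threshold):

```python
import numpy as np

def indicator_state(binary, color_thresh=0.01):
    """Declare the LED 'on' when the fraction of 1-pixels in the binarised
    target region exceeds the colour threshold (0.01 in the embodiment)."""
    ratio = binary.mean()  # mean of a 0/1 image = fraction of 1-pixels
    return "on" if ratio > color_thresh else "off"

roi = np.zeros((20, 20), dtype=np.uint8)
roi[0, :8] = 1  # 8 of 400 pixels lit -> ratio 0.02 > 0.01
print(indicator_state(roi))                                 # on
print(indicator_state(np.zeros((20, 20), dtype=np.uint8)))  # off
```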
And S160, feeding back the indication state of the indicator lamp to the terminal to display the indication state at the terminal.
According to the method for identifying the state of the indicator lamp, a mark is arranged beside the indicator lamp, which enhances the stability of correcting the current image to be identified. The template image and the region of interest corresponding to the indicator lamp are set in advance, and different LED indicator lamps are marked with rectangular frames in different modes, which reduces the workload of manual marking. After the image to be identified is obtained, it is matched against the feature points of the template image, and the state of the indicator lamp is judged after color feature extraction. The method does not place high requirements on the environment, can be applied in various environments and to LED indicator lamps of various shapes and sizes, and therefore has universality.
Fig. 8 is a schematic block diagram of an indicator light status identification apparatus 300 according to an embodiment of the present invention. As shown in fig. 8, the present invention also provides an indicator light status recognition apparatus 300 corresponding to the above indicator light status recognition method. The indicator lamp status recognition apparatus 300 includes a unit for performing the above-described indicator lamp status recognition method, and the apparatus may be configured in a server. Specifically, referring to fig. 8, the indicator light state recognition apparatus 300 includes a preprocessing unit 301, an image acquisition unit 302, a matching unit 303, a color information acquisition unit 304, a state generation unit 305, and a state feedback unit 306.
The preprocessing unit 301 is configured to mark a mark beside the indicator light and make a template image to obtain information related to an area of interest corresponding to the indicator light; an image acquisition unit 302 for acquiring an image to be recognized; the matching unit 303 is configured to perform feature matching on the image to be recognized and the template image, and acquire target position information according to the relevant information of the region of interest corresponding to the indicator light; a color information obtaining unit 304, configured to perform clipping and color feature extraction on the image to be recognized according to the target position information, so as to obtain color feature information; a state generating unit 305 for generating an indication state of an indicator light according to the color feature information; and a state feedback unit 306 for feeding back the indication state of the indicator lamp to the terminal for displaying at the terminal.
In one embodiment, as shown in fig. 9, the preprocessing unit 301 includes a marking subunit 3011, a template generating subunit 3012, and a saving subunit 3013.
A marking subunit 3011, configured to mark a mark beside the indicator light; a template generation subunit 3012, configured to select a clear field image to obtain a template image, and mark an area of interest corresponding to the indicator light; and the saving subunit 3013 is configured to save the corresponding template image and the information related to the region of interest corresponding to the indicator light according to the inspection point.
In one embodiment, as shown in fig. 10, the matching unit 303 includes a feature point extracting sub-unit 3031, a matching pair generating sub-unit 3032, a matrix acquiring sub-unit 3033, and a position acquiring sub-unit 3034.
A feature point extracting subunit 3031, configured to perform feature point extraction on the image to be identified and the template image to obtain feature points of the image to be identified and feature points of the template image; a matching pair generating subunit 3032, configured to match the feature points of the image to be recognized and the feature points of the template image to obtain feature matching pairs; a matrix obtaining subunit 3033, configured to obtain an affine matrix of the feature matching pairs; and the position obtaining subunit 3034 is configured to obtain, according to the affine matrix of the feature matching pair and the relevant information of the region of interest corresponding to the indicator lamp, the position information of the indicator lamp in the image to be identified, so as to obtain the target position information.
In one embodiment, as shown in fig. 11, the matching pair generation subunit 3032 includes a similarity calculation subunit 30321 and a screening subunit 30322.
The similarity calculation subunit 30321 is used for calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set; a screening subunit 30322, configured to screen feature points of the to-be-identified image and feature points of the template image corresponding to the similarity not lower than the set threshold in the similarity set, so as to form a feature matching pair.
In one embodiment, as shown in fig. 12, the color information obtaining unit 304 includes a cropping subunit 3041, a converting subunit 3042, and an information extracting subunit 3043.
A cutting subunit 3041, configured to cut the image to be identified according to the target position information to obtain an intermediate image; a conversion subunit 3042, configured to convert the intermediate image according to a manner of converting an RGB image space into a YIQ image space, so as to obtain a target image; the information extracting subunit 3043 is configured to obtain a binary image corresponding to the target image, and calculate a percentage of the binary image in the entire target image to obtain color feature information.
In an embodiment, the state generating unit 305 is configured to determine whether the color feature information is greater than a color threshold; if the color characteristic information is larger than a color threshold value, the indicating state of the indicating lamp is a lighting state; and if the color characteristic information is not greater than the color threshold value, the indicating state of the indicator lamp is a turning-off state.
It should be noted that, as can be clearly understood by those skilled in the art, the specific implementation processes of the indicator light state identification apparatus 300 and each unit may refer to the corresponding descriptions in the foregoing method embodiments, and for convenience and brevity of description, no further description is provided herein.
The indicator light state recognition apparatus 300 may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 13.
Referring to fig. 13, fig. 13 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a server, wherein the server may be an independent server or a server cluster composed of a plurality of servers.
Referring to fig. 13, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer programs 5032 include program instructions that, when executed, cause the processor 502 to perform a method of indicator light status identification.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for the operation of the computer program 5032 in the non-volatile storage medium 503, and when the computer program 5032 is executed by the processor 502, the processor 502 may be caused to execute an indicator light status identification method.
The network interface 505 is used for network communication with other devices. Those skilled in the art will appreciate that the architecture shown in fig. 13 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer device 500 to which the disclosed aspects apply; a particular computer device 500 may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to run the computer program 5032 stored in the memory to implement the following steps:
marking a mark beside the indicator light, and making a template image to obtain the related information of the region of interest corresponding to the indicator light; acquiring an image to be identified; performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator light; cutting the image to be recognized and extracting color features according to the target position information to obtain color feature information; generating an indication state of an indicator lamp according to the color characteristic information; and feeding back the indication state of the indicator lamp to the terminal for displaying at the terminal.
In an embodiment, when the processor 502 implements the steps of marking a mark beside the indicator light and making a template image to obtain the information related to the region of interest corresponding to the indicator light, the following steps are specifically implemented:
marking a mark beside the indicator light; selecting a clear field image to obtain a template image, and marking an interested area corresponding to the indicator light; and storing the corresponding template image and the related information of the region of interest corresponding to the indicator lamp according to the inspection point.
In an embodiment, when implementing the steps of performing feature matching on the image to be recognized and the template image, and acquiring the target position information according to the relevant information of the region of interest corresponding to the indicator light, the processor 502 specifically implements the following steps:
extracting characteristic points of the image to be recognized and the template image to obtain the characteristic points of the image to be recognized and the characteristic points of the template image; matching the characteristic points of the image to be recognized and the characteristic points of the template image to obtain characteristic matching pairs; obtaining an affine matrix of the feature matching pairs; and acquiring the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the related information of the region of interest corresponding to the indicator lamp to obtain the target position information.
In an embodiment, when the processor 502 implements the matching of the feature points of the image to be recognized and the feature points of the template image to obtain a feature matching pair step, the following steps are implemented:
calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set; and screening the feature points of the image to be identified and the feature points of the template image corresponding to the similarity which is not lower than the set threshold in the similarity set to form feature matching pairs.
In an embodiment, when the step of performing the cropping and the color feature extraction on the image to be recognized according to the target position information to obtain the color feature information is implemented by the processor 502, the following steps are specifically implemented:
cutting the image to be recognized according to the target position information to obtain an intermediate image; converting the intermediate image according to a mode of converting an RGB image space into a YIQ image space to obtain a target image; and acquiring a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color characteristic information.
In an embodiment, when the processor 502 implements the step of generating the indication state of the indicator light according to the color feature information, the following steps are specifically implemented:
judging whether the color characteristic information is larger than a color threshold value; if the color characteristic information is larger than a color threshold value, the indicating state of the indicating lamp is a lighting state; and if the color characteristic information is not greater than the color threshold value, the indicating state of the indicator lamp is a turning-off state.
It should be understood that, in the embodiment of the present application, the processor 502 may be a Central Processing Unit (CPU), and the processor 502 may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
It will be understood by those skilled in the art that all or part of the flow of the method implementing the above embodiments may be implemented by a computer program instructing associated hardware. The computer program includes program instructions, and the computer program may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the flow steps of the embodiments of the method described above.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer-readable storage medium. The storage medium stores a computer program, wherein the computer program, when executed by a processor, causes the processor to perform the steps of:
marking a mark beside the indicator light, and making a template image to obtain the related information of the region of interest corresponding to the indicator light; acquiring an image to be identified; performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator light; cutting the image to be recognized and extracting color features according to the target position information to obtain color feature information; generating an indication state of an indicator lamp according to the color characteristic information; and feeding back the indication state of the indicator lamp to the terminal for displaying at the terminal.
In an embodiment, when the processor executes the computer program to implement the steps of marking a mark beside the indicator light and making a template image to obtain information related to the region of interest corresponding to the indicator light, the following steps are specifically implemented:
marking a mark beside the indicator light; selecting a clear field image to obtain a template image, and marking an interested area corresponding to the indicator light; and storing the corresponding template image and the related information of the region of interest corresponding to the indicator lamp according to the inspection point.
In an embodiment, when the processor executes the computer program to implement the steps of performing feature matching on the image to be recognized and the template image, and acquiring the target position information according to the information related to the region of interest corresponding to the indicator light, the following steps are specifically implemented:
extracting characteristic points of the image to be recognized and the template image to obtain the characteristic points of the image to be recognized and the characteristic points of the template image; matching the characteristic points of the image to be recognized and the characteristic points of the template image to obtain characteristic matching pairs; obtaining an affine matrix of the feature matching pairs; and acquiring the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the related information of the region of interest corresponding to the indicator lamp to obtain the target position information.
In an embodiment, when the processor executes the computer program to implement the step of matching the feature points of the image to be recognized and the feature points of the template image to obtain a feature matching pair, the following steps are specifically implemented:
calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set; and screening the feature points of the image to be identified and the feature points of the template image corresponding to the similarity which is not lower than the set threshold in the similarity set to form feature matching pairs.
In an embodiment, when the processor executes the computer program to implement the step of performing cropping and color feature extraction on the image to be recognized according to the target position information to obtain the color feature information, the following steps are specifically implemented:
cutting the image to be recognized according to the target position information to obtain an intermediate image; converting the intermediate image according to a mode of converting an RGB image space into a YIQ image space to obtain a target image; and acquiring a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color characteristic information.
In an embodiment, when the processor executes the computer program to implement the step of generating the indication state of the indicator light according to the color feature information, the processor specifically implements the following steps:
judging whether the color characteristic information is larger than a color threshold value; if the color characteristic information is larger than a color threshold value, the indicating state of the indicating lamp is a lighting state; and if the color characteristic information is not greater than the color threshold value, the indicating state of the indicator lamp is a turning-off state.
The storage medium may be any of various computer-readable storage media capable of storing the program, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above in general functional terms. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, various elements or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be merged, divided and deleted according to actual needs. In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a terminal, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. The method for identifying the state of the indicator lamp is characterized by comprising the following steps:
marking a mark beside the indicator light, and making a template image to obtain the related information of the region of interest corresponding to the indicator light;
acquiring an image to be identified;
performing feature matching on the image to be identified and the template image, and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator light;
cutting the image to be recognized and extracting color features according to the target position information to obtain color feature information;
generating an indication state of an indicator lamp according to the color characteristic information;
and feeding back the indication state of the indicator lamp to the terminal for displaying at the terminal.
2. The method for identifying the status of the indicator light according to claim 1, wherein the marking a mark beside the indicator light and making a template image to obtain the information related to the region of interest corresponding to the indicator light comprises:
marking a mark beside the indicator light;
selecting a clear field image to obtain a template image, and marking an interested area corresponding to the indicator light;
and storing the corresponding template image and the related information of the region of interest corresponding to the indicator lamp according to the inspection point.
3. The method for identifying the status of the indicator light according to claim 2, wherein the step of performing feature matching on the image to be identified and the template image and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator light comprises the steps of:
extracting characteristic points of the image to be recognized and the template image to obtain the characteristic points of the image to be recognized and the characteristic points of the template image;
matching the characteristic points of the image to be recognized and the characteristic points of the template image to obtain characteristic matching pairs;
obtaining an affine matrix of the feature matching pairs;
and acquiring the position information of the indicator lamp in the image to be identified according to the affine matrix of the feature matching pair and the related information of the region of interest corresponding to the indicator lamp to obtain the target position information.
4. The method for identifying the status of the indicator light of claim 3, wherein the matching the feature points of the image to be identified and the feature points of the template image to obtain a feature matching pair comprises:
calculating the similarity of the feature points of the image to be identified and the feature points of the template image to obtain a similarity set;
and screening the feature points of the image to be identified and the feature points of the template image corresponding to the similarity which is not lower than the set threshold in the similarity set to form feature matching pairs.
5. The method for identifying the status of the indicator light according to claim 1, wherein the cropping and color feature extraction of the image to be identified according to the target position information to obtain color feature information comprises:
cutting the image to be recognized according to the target position information to obtain an intermediate image;
converting the intermediate image according to a mode of converting an RGB image space into a YIQ image space to obtain a target image;
and acquiring a binary image corresponding to the target image, and calculating the percentage of the binary image in the whole target image to obtain color characteristic information.
6. The method for identifying the status of the indicator light according to claim 5, wherein the generating the indication status of the indicator light according to the color feature information comprises:
judging whether the color characteristic information is larger than a color threshold value;
if the color characteristic information is larger than a color threshold value, the indicating state of the indicating lamp is a lighting state;
and if the color characteristic information is not greater than the color threshold value, the indicating state of the indicator lamp is a turning-off state.
7. Indicator lamp state recognition device, its characterized in that includes:
the preprocessing unit is used for marking a mark beside the indicator light and making a template image so as to obtain the related information of the region of interest corresponding to the indicator light;
the image acquisition unit is used for acquiring an image to be identified;
the matching unit is used for carrying out feature matching on the image to be identified and the template image and acquiring target position information according to the relevant information of the region of interest corresponding to the indicator lamp;
the color information acquisition unit is used for cutting the image to be recognized and extracting color features according to the target position information to obtain color feature information;
the state generating unit is used for generating the indicating state of the indicating lamp according to the color characteristic information;
and the state feedback unit is used for feeding back the indication state of the indicator lamp to the terminal so as to display the indication state at the terminal.
8. The indicator light status recognition apparatus according to claim 7, wherein the preprocessing unit includes:
the marking subunit is used for marking a mark beside the indicator light;
the template generation subunit is used for selecting a clear field image to obtain a template image and marking an interested area corresponding to the indicator light;
and the storage subunit is used for storing the corresponding template image and the related information of the region of interest corresponding to the indicator lamp according to the inspection point.
9. A computer device, characterized in that the computer device comprises a memory, on which a computer program is stored, and a processor, which when executing the computer program implements the method according to any of claims 1 to 6.
10. A storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 6.
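The flow claimed above (store a template and ROI per inspection point, crop the target region, extract a color feature, and threshold it to decide on/off) can be sketched as below. This is a minimal illustration, not the patented implementation: the pixel layout, the helper names, and the color-feature definition (mean saturation-weighted brightness) are all hypothetical stand-ins, and the template-matching step the patent relies on is omitted.

```python
# Hypothetical sketch of the recognition flow in claims 6-8. The patent does
# not specify the color feature or the data structures; these are stand-ins.

template_store = {}  # storage subunit (claim 8): per-inspection-point records

def register_inspection_point(point_id, template_image, roi_rect):
    """Keep the template image and region of interest (x, y, w, h)."""
    template_store[point_id] = {"template": template_image, "roi": roi_rect}

def crop(image, rect):
    """Cut the ROI out of an image given as rows of (b, g, r) tuples."""
    x, y, w, h = rect
    return [px for row in image[y:y + h] for px in row[x:x + w]]

def color_feature(pixels):
    """Hypothetical color feature: mean saturation-weighted brightness."""
    total = 0.0
    for b, g, r in pixels:
        v = max(b, g, r)                               # brightness
        s = 0.0 if v == 0 else (v - min(b, g, r)) / v  # saturation
        total += s * v
    return total / max(len(pixels), 1)

def indicator_state(image, roi_rect, color_threshold=40.0):
    """State generating unit (claim 6): on if the color feature exceeds
    the color threshold, otherwise off."""
    feature = color_feature(crop(image, roi_rect))
    return "on" if feature > color_threshold else "off"
```

In practice the ROI offset would first be corrected by matching the marker/template in the live image (the matching unit); here the ROI is assumed to be already aligned.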
CN202010441220.3A 2020-05-22 2020-05-22 Indicator light state identification method and device, computer equipment and storage medium Active CN111639647B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010441220.3A CN111639647B (en) 2020-05-22 2020-05-22 Indicator light state identification method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111639647A true CN111639647A (en) 2020-09-08
CN111639647B CN111639647B (en) 2023-07-25

Family

ID=72329036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010441220.3A Active CN111639647B (en) 2020-05-22 2020-05-22 Indicator light state identification method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111639647B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106529556A (en) * 2016-11-16 2017-03-22 国家电网公司 Visual inspection system for instrument indicator lamp
CN107103330A (en) * 2017-03-31 2017-08-29 深圳市浩远智能科技有限公司 A kind of LED status recognition methods and device
CN107392116A (en) * 2017-06-30 2017-11-24 广州广电物业管理有限公司 A kind of indicator lamp recognition methods and system
CN107832770A (en) * 2017-11-08 2018-03-23 浙江国自机器人技术有限公司 A kind of equipment routing inspection method, apparatus, system, storage medium and crusing robot
CN109711414A (en) * 2018-12-19 2019-05-03 国网四川省电力公司信息通信公司 Equipment indicating lamp color identification method and system based on camera image acquisition
CN111178200A (en) * 2019-12-20 2020-05-19 海南车智易通信息技术有限公司 Identification method of instrument panel indicator lamp and computing equipment

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111950535A (en) * 2020-09-23 2020-11-17 苏州科达科技股份有限公司 Traffic signal lamp color recognition method, electronic device and storage medium
CN111950535B (en) * 2020-09-23 2022-07-12 苏州科达科技股份有限公司 Traffic signal lamp color recognition method, electronic device and storage medium
CN112115897A (en) * 2020-09-24 2020-12-22 深圳市赛为智能股份有限公司 Multi-pointer instrument alarm detection method and device, computer equipment and storage medium
CN112115897B (en) * 2020-09-24 2023-12-22 深圳市赛为智能股份有限公司 Multi-pointer instrument alarm detection method, device, computer equipment and storage medium
CN112364740A (en) * 2020-10-30 2021-02-12 交控科技股份有限公司 Unmanned machine room monitoring method and system based on computer vision
CN112364740B (en) * 2020-10-30 2024-04-19 交控科技股份有限公司 Unmanned machine room monitoring method and system based on computer vision
CN112364780A (en) * 2020-11-11 2021-02-12 许继集团有限公司 Method for identifying state of indicator lamp
CN117094966A (en) * 2023-08-21 2023-11-21 青岛美迪康数字工程有限公司 Tongue image identification method and device based on image amplification and computer equipment
CN117094966B (en) * 2023-08-21 2024-04-05 青岛美迪康数字工程有限公司 Tongue image identification method and device based on image amplification and computer equipment
CN117670884A (en) * 2024-01-31 2024-03-08 深圳中科精工科技有限公司 Image labeling method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN111639647B (en) 2023-07-25

Similar Documents

Publication Publication Date Title
CN111639647B (en) Indicator light state identification method and device, computer equipment and storage medium
CN112115893A (en) Instrument panel pointer reading identification method and device, computer equipment and storage medium
CN109522900B (en) Natural scene character recognition method and device
US11189022B2 (en) Automatic detection, counting, and measurement of logs using a handheld device
CN112418216B (en) Text detection method in complex natural scene image
CN110533654A (en) The method for detecting abnormality and device of components
CN110119680A (en) A kind of electrical cabinet wiring automatic errordetecting system based on image recognition
AU2020103716A4 (en) Training method and device of automatic identification device of pointer instrument with numbers in natural scene
CN116168351B (en) Inspection method and device for power equipment
CN111044149A (en) Method and device for detecting temperature abnormal point of voltage transformer and readable storage medium
CN116721107B (en) Intelligent monitoring system for cable production quality
CN111553176B (en) Wireless transmission checking method and system suitable for wiring of substation screen cabinet
CN113225461A (en) System and method for detecting video monitoring scene switching
CN113780484B (en) Industrial product defect detection method and device
CN111738319A (en) Clustering result evaluation method and device based on large-scale samples
CN109614512B (en) Deep learning-based power equipment retrieval method
CN111079826A (en) SLAM and image processing fused construction progress real-time identification method
CN113706455B (en) Rapid detection method for damage of 330kV cable porcelain insulator sleeve
CN114612393A (en) Monocular vision-based reflective part pose estimation method
CN114694130A (en) Method and device for detecting telegraph poles and pole numbers along railway based on deep learning
CN110472085B (en) Three-dimensional image searching method, system, computer device and storage medium
CN113793370A (en) Three-dimensional point cloud registration method and device, electronic equipment and readable medium
CN110110795B (en) Image classification method and device
CN110211200B (en) Dental arch wire generating method and system based on neural network technology
CN111639643A (en) Character recognition method, character recognition device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant