CN114155380A - Instrument panel image identification method and device, computer equipment and storage medium


Info

Publication number
CN114155380A
CN114155380A (application CN202111443293.7A)
Authority
CN
China
Prior art keywords
image data
robust feature
acceleration
instrument panel
feature
Prior art date
Legal status
Pending
Application number
CN202111443293.7A
Other languages
Chinese (zh)
Inventor
林佳润
李露琼
刘梓权
林捷
林梓衡
张华欣
陈文旭
林峰
赖楷文
杨康宜
崔畅
林沐
刘宇嘉
张艺妮
魏济
Current Assignee
Guangdong Power Grid Co Ltd
Shantou Power Supply Bureau of Guangdong Power Grid Co Ltd
Original Assignee
Guangdong Power Grid Co Ltd
Shantou Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Power Grid Co Ltd and Shantou Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority to CN202111443293.7A
Publication of CN114155380A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computing Systems (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a method, a device, computer equipment and a storage medium for identifying an image of an instrument panel. The method comprises the following steps: obtaining first image data collected by an automatic aircraft facing an instrument panel of a transformer substation; extracting a first accelerated robust feature from the first image data; matching the first accelerated robust feature with a second accelerated robust feature; if the matching is successful, aligning the first image data with the second image data according to the first and second accelerated robust features; and identifying the reading of the instrument panel of that type in the aligned first image data. By aligning the first image data, the embodiment of the invention makes the image data acquired by the automatic aircraft better suited to scale identification, so that the reading of the instrument panel can be identified more accurately when the pointer position is calculated from the aligned first image data, improving the reliability of instrument panel readings obtained from image data collected by the automatic aircraft.

Description

Instrument panel image identification method and device, computer equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of image recognition, in particular to a method and a device for recognizing an image of a dashboard, computer equipment and a storage medium.
Background
With the rapid development of instrumentation technology, instruments and meters are widely applied across industries. During the operation of a power system, instruments and meters are commonly used to monitor the running state and health of the power lines and power equipment of a transformer substation, indicating the state information of the equipment through a pointer and scales.
Driven by the digitalisation of the power grid, the intelligent inspection mode based on integrated transmission-and-distribution automatic aircraft is being actively practised in substation inspection work, and automatic-aircraft inspection projects for the power transformation profession have entered practical testing and even popularisation. The automatic aircraft generates a large amount of image data for various instruments and meters during patrol operation, and quantifying this image data still depends on manual reading and entry at present.
Firstly, with the proposal and strategic deployment of concepts such as the digital twin, the acquired power equipment data is used not only to judge during patrol whether the state of the equipment meets safety requirements, but also, combined with historical data, to systematically analyse the long-term running state of the equipment, so the precision requirements on equipment data keep rising. When data is read manually, each person's reading of the power data displayed in the image data of different instruments carries individual and accidental errors, the data is difficult to quantify under a single standard, and its accuracy is hard to guarantee. Secondly, because the volume of image data collected by the unmanned aerial vehicle during patrol is large, the workload of manual entry increases. Moreover, images collected from different instruments can look similar, so erroneous data is easily entered into the designated power equipment data collection table when images are confused during manual reading and entry.
Disclosure of Invention
The embodiment of the invention provides a method and a device for identifying an image of an instrument panel, computer equipment and a storage medium, aiming to solve the problems that, when the pointer data of each instrument panel of a transformer substation is read manually, different personnel introduce individual errors and similar instrument panels are confused, so that data is misread or its accuracy is low.
In a first aspect, an embodiment of the present invention provides a method for identifying an image of a dashboard, including:
acquiring first image data collected by an automatic aircraft facing an instrument panel of a transformer substation;
extracting a first accelerated robust feature from the first image data;
matching the first acceleration robust feature with a second acceleration robust feature, wherein the second acceleration robust feature is derived from second image data, and the second image data is image data acquired from a designated type of instrument panel in the transformer substation at a preset angle;
if the matching is successful, aligning the first image data with the second image data according to the first acceleration robust feature and the second acceleration robust feature;
identifying a reading of the dashboard under the type in the aligned first image data.
In a second aspect, an embodiment of the present invention further provides an instrument panel image recognition apparatus, including:
the first image data acquisition module is used for acquiring first image data collected by an automatic aircraft facing an instrument panel of the transformer substation;
a first accelerated robust feature extraction module for extracting a first accelerated robust feature from the first image data;
the acceleration robust feature matching module is used for matching the first acceleration robust feature with a second acceleration robust feature, wherein the second acceleration robust feature is derived from second image data, the second image data is image data acquired from a designated instrument panel in the transformer substation at a preset angle, and if the matching is successful, a first image data alignment module is called;
a first image data alignment module to align the first image data with the second image data according to the first acceleration robust feature and the second acceleration robust feature;
and the instrument panel reading identification module is used for identifying the reading of the instrument panel under the type in the aligned first image data.
In a third aspect, an embodiment of the present invention further provides a computer device, where the computer device includes:
one or more processors;
a memory for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the dashboard image recognition method according to the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when executed by a processor, the computer program implements the dashboard image recognition method according to the first aspect.
In the embodiment of the invention, first image data collected by the automatic aircraft facing an instrument panel of the transformer substation is acquired, a first acceleration robust feature is extracted from the first image data and matched against the second acceleration robust feature of second image data collected in advance at a preset angle, and the first image data is aligned after a successful match. The alignment makes the first image data better suited to scale identification, and avoids scale-identification failure caused by distortion of the instrument panel area when the automatic aircraft collects the first image data from varying angles towards the instrument panels of power equipment in the transformer substation. The reading of the specified type of instrument panel is then identified in the aligned first image data, completing the identification of the instrument panel reading. Because reading is carried out for a specified type of instrument panel, identification confusion caused by the similar appearance of different types of instrument panels is avoided when first image data is collected facing different types of instrument panels. In the embodiment of the invention, images are collected for the instrument panel, and the operation data of the power equipment indicated by the instrument panel in the transformer substation is read from the pointer position of the instrument panel; compared with manual reading of a large number of dial images, this reduces reading errors caused by individual differences between readers, improves the accuracy of data collection, and saves manpower.
Further, according to the embodiment of the invention, aligning the first image data with the second image data reduces the influence of distortion caused by non-fixed-angle acquisition when the automatic aircraft collects the first image data of the instrument panel, so that the reading of the instrument panel is identified more accurately when the pointer position is calculated from the aligned first image data, and the reliability of instrument panel readings obtained from image data collected by the automatic aircraft is improved.
Drawings
Fig. 1 is a flowchart of an instrument panel image recognition method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an instrument panel image recognition apparatus according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a computer device according to a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of an instrument panel image recognition method according to an embodiment of the present invention. The embodiment is applicable to cases where manually read scale values from instrument panel image data are of low accuracy and the reading workload is large. The method may be executed by an instrument panel image recognition device, which may be implemented in software and/or hardware and configured in a computer device, and specifically includes the following steps:
Step 101, obtaining first image data collected by an automatic aircraft facing an instrument panel of a transformer substation.
In this embodiment, the transformer substation is a place where voltage is changed, and various kinds of power equipment serving this voltage-changing work are provided there. To ensure the safe operation of the transformer substation, the power equipment in the substation is generally monitored in real time so as to prevent accidents, maintain failed power equipment in time, and reduce potential safety hazards. With the proposal of the smart-grid concept and out of consideration for personnel safety, automatic aircraft are increasingly adopted to patrol the electric devices in the transformer substation. During patrol, the camera carried by the automatic aircraft can collect an image of the instrument panel that monitors the operation data of the power equipment as first image data, which is used to identify the reading of the instrument panel.
Step 102, extracting a first accelerated robust feature from the first image data.
In this embodiment, when the automatic aircraft collects images, it is often constrained by the approach angles to different substations and by the positions of different power devices within them, so the instrument panel in the collected first image data may be distorted, affecting subsequent reading identification. This embodiment therefore extracts a first acceleration robust feature from the first image data and uses it to correct the position of the distorted first image data, making reading identification more accurate and improving the inspection quality of the automatic aircraft.
In one embodiment of the invention, the process of extracting the first accelerated robust feature from the first image data is embodied as:
and obtaining coordinate values of pixel points in the first image data as accelerated robust feature calculation data.
In this embodiment, the Speeded-Up Robust Features (SURF) algorithm may be used to extract the first accelerated robust features of the first image data. The SURF algorithm is an accelerated version of the Scale-Invariant Feature Transform (SIFT) algorithm; it is generally several times faster than SIFT and more stable across multiple images. When the SURF algorithm is applied to extract the first acceleration robust feature, the coordinate values of each pixel point in the first image data are extracted first and then input into the Hessian matrix, a square matrix composed of the second-order partial derivatives of a real-valued function whose argument is a vector. The input of the Hessian matrix is the coordinate value of each pixel point of the image data, from which the accelerated robust features of the image, which can also be expressed as feature points, are obtained. Therefore, in this embodiment, the coordinate value of each pixel point in the first image data may be obtained as the calculation data for the accelerated robust features. Substituting the accelerated robust feature calculation data into the feature matrix yields a first target feature matrix; for example, when the coordinate value of a pixel point is (x, y), inputting (x, y) into the Hessian matrix is expressed as:
$$H(f(x, y)) = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x \partial y} \\ \dfrac{\partial^2 f}{\partial x \partial y} & \dfrac{\partial^2 f}{\partial y^2} \end{bmatrix}$$

where H represents the Hessian matrix, $\frac{\partial^2 f}{\partial x \partial y}$ represents the mixed second-order partial derivative in the horizontal and vertical directions, $\frac{\partial^2 f}{\partial x^2}$ represents the second-order partial derivative in the horizontal direction at the pixel point of the current acceleration robust feature, $\frac{\partial^2 f}{\partial y^2}$ represents the second-order partial derivative in the vertical direction, and f represents the coordinate-value function.
The acceleration robust feature has scale independence, so Gaussian filtering can be performed through the Hessian matrix to obtain the first acceleration robust features of the first image data. The principle is to first judge, for each pixel point in the first image data, whether the pixel point represented by the acceleration robust feature is an extreme point, where extreme points include maximum points and minimum points; the pixel points determined to be extreme points are then taken as first acceleration robust features. Furthermore, in this embodiment, three-dimensional linear interpolation may be used to obtain first acceleration robust features at sub-pixel level, while acceleration robust features below a certain threshold are removed; alternatively, the extremum criterion may be tightened so that only larger maxima and smaller minima survive, yielding fewer acceleration robust features as the first acceleration robust features, so that the obtained first acceleration robust features characterise the first image data more strongly.
The value of the first target feature matrix is obtained as the first feature value. In this embodiment, the process of finding extreme points is expressed by obtaining this value, which is calculated as the determinant of the first target matrix, represented in this embodiment as:
$$\det(H) = \frac{\partial^2 f}{\partial x^2} \cdot \frac{\partial^2 f}{\partial y^2} - \left( \frac{\partial^2 f}{\partial x \partial y} \right)^2$$

where det(H) denotes the determinant of the Hessian matrix.
In this embodiment, after the matrix value of the first target matrix is calculated for each pixel point, extreme points are screened within a preset range. For example, the value obtained at each pixel point is compared with the values obtained at the 26 points adjacent to it in the two-dimensional image space and the scale space, that is, its three-dimensional neighbourhood (a 3 × 3 × 3 cube of 27 points minus the point itself); if the value at the pixel point is the maximum or the minimum among these 26 points, the pixel point can be taken as a first acceleration robust feature. In the calculation process of this embodiment, the value obtained at the pixel point may be taken as the first feature value and the values of the 26 adjacent pixel points as second feature values, so that the selection of first acceleration robust features is expressed as: selecting the pixel points whose first feature value is larger than the largest, or smaller than the smallest, of the second feature values as first acceleration robust features.
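For illustration, the determinant response and extremum screening described above can be sketched as follows. This is a simplified, single-scale approximation of the idea, not the patented implementation: central finite differences stand in for the box-filter derivatives of SURF, and the 26-point scale-space check is collapsed to an 8-neighbour check. All function names are illustrative.

```python
# Simplified sketch: approximate det(H) with central finite differences,
# then keep pixels whose response is a strict local maximum or minimum.

def hessian_det(img, x, y):
    """det(H) at (x, y) using central differences; img is a 2-D list of floats."""
    dxx = img[y][x + 1] - 2 * img[y][x] + img[y][x - 1]
    dyy = img[y + 1][x] - 2 * img[y][x] + img[y - 1][x]
    dxy = (img[y + 1][x + 1] - img[y + 1][x - 1]
           - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4.0
    return dxx * dyy - dxy * dxy

def extremum_points(img, threshold=0.0):
    """Pixels whose det(H) response is a strict local max/min above a threshold."""
    h, w = len(img), len(img[0])
    resp = [[hessian_det(img, x, y) if 0 < x < w - 1 and 0 < y < h - 1 else 0.0
             for x in range(w)] for y in range(h)]
    points = []
    for y in range(2, h - 2):
        for x in range(2, w - 2):
            centre = resp[y][x]
            if abs(centre) < threshold:
                continue  # discard weak responses, as the description suggests
            neigh = [resp[y + dy][x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dx, dy) != (0, 0)]
            if centre > max(neigh) or centre < min(neigh):
                points.append((x, y))
    return points
```

On a flat image with a single bright pixel, the bright pixel produces the strongest determinant response and is selected as a feature point.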
And 103, matching the first acceleration robust feature with a second acceleration robust feature, wherein the second acceleration robust feature is derived from second image data, the second image data is image data acquired from a dashboard of a specified type in the transformer substation at a preset angle, and if the matching is successful, executing step 104.
In this embodiment, after the first acceleration robust feature is obtained, it may be matched with the second acceleration robust feature. The purpose of the matching is to align the possibly distorted first image data against the second image data, which serves as a template; the second acceleration robust feature is a set of pixel points representing the features of the second image data. In this embodiment, before the first image data is collected, second image data can be collected for a specified type of instrument panel in the transformer substation, with the acquisition angle kept fixed and accurate during collection.
In one embodiment of the present invention, the specific process of obtaining the second accelerated robust feature is represented as:
the method includes the steps that original image data collected by an instrument panel facing to a designated type of a transformer substation at a preset angle are obtained, and the instrument panel is not distorted in the original image data collected at the preset angle.
In this embodiment, after the original image data is obtained, the instrument panel area in the original image data may be cut out to serve as the second image data in order to simplify the matching process and the calibration process of the second accelerated robust feature.
The scale points and the circle centre of the instrument panel are calibrated in the second image data and taken as the second acceleration robust feature. In this embodiment, the second acceleration robust feature is obtained by calibrating the scale points and the centre point of the instrument panel in the second image data. Because the final target of this embodiment is to identify the reading of the instrument panel, using the scale points and circle centre of the instrument panel in the second image data as the second acceleration robust feature allows the scale positions and pointer position in the first image data to be corrected better, improving the accuracy of reading identification for the first image data.
In one embodiment of the present invention, the process of matching the first accelerated robust feature with the second accelerated robust feature may be specifically expressed as:
in this embodiment, when the first acceleration robust feature and the second acceleration robust feature are matched, the similarity between the first acceleration robust feature and the second acceleration robust feature may be measured by using the euclidean distance, and then the most similar first acceleration robust feature and the second acceleration robust feature are matched together. In this example. The smaller the euclidean distance, the higher the similarity of the first acceleration robust feature and the second acceleration robust feature. After the first acceleration robust feature of the first image data is selected and obtained, main direction distribution can be performed on the first acceleration robust feature, for example, haar wavelet features in a circular neighborhood of the first acceleration robust feature are counted. That is, in the circular neighborhood of the feature point, the sum of the horizontal and vertical haar wavelet features of all the points in the 60-degree sector is counted, then the sector is rotated at intervals of 0.2 radian and the haar wavelet feature value in the region is counted again, and finally the direction of the sector with the largest value is taken as the main direction of the feature point. Further, in this embodiment, a 4 × 4 rectangular region block may be selected around the first acceleration robust feature, the obtained rectangular region block direction is the main direction obtained by the foregoing calculation, and then haar wavelet features in the horizontal direction and the vertical direction of 25 pixel points in a single sub-region of the rectangular region block are counted, where the horizontal direction and the vertical direction are both relative to the main direction. The haar wavelet features include a sum of horizontal direction values, a sum of vertical direction values, a sum of horizontal direction absolute values, and a sum of vertical direction absolute values. 
These four values are then taken as the feature vector of each sub-region, so in this embodiment, for a first acceleration robust feature point, the four-value vectors of the 4 × 4 = 16 sub-regions of the rectangular block are concatenated into a 64-dimensional vector serving as the descriptor of the first acceleration robust feature; the descriptor of the second acceleration robust feature is obtained in the same way. Calculating the Euclidean distance between the first and second acceleration robust features then means calculating the Euclidean distance between their 64-dimensional descriptors.
And if the Euclidean distance is smaller than a preset distance threshold, determining that the first acceleration robust feature and the second acceleration robust feature are successfully matched. In this embodiment, a distance threshold of the euclidean distance may be set, and when the euclidean distance is smaller than the distance threshold, it may be determined that the first acceleration robust feature and the second acceleration robust feature are successfully matched.
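The descriptor-matching step above can be sketched as follows. This is a minimal nearest-neighbour illustration under the assumption that descriptor extraction has already produced 64-dimensional vectors; the function name and threshold value are illustrative, not part of the patent.

```python
# Sketch of the matching step: each feature carries a 64-dimensional
# descriptor, and a pair is accepted when the Euclidean distance between
# the descriptors falls below a preset threshold.
import math

def match_features(desc1, desc2, dist_threshold=0.3):
    """For each descriptor in desc1, find its nearest neighbour in desc2
    by Euclidean distance; keep the pair if the distance is below the
    threshold. Returns a list of (i, j) index pairs."""
    matches = []
    for i, d1 in enumerate(desc1):
        best_j, best_dist = -1, float("inf")
        for j, d2 in enumerate(desc2):
            dist = math.dist(d1, d2)  # Euclidean distance in 64 dimensions
            if dist < best_dist:
                best_j, best_dist = j, dist
        if best_dist < dist_threshold:
            matches.append((i, best_j))
    return matches
```

With two nearly identical descriptor sets, every feature matches its counterpart; raising `dist_threshold` trades match recall against false matches.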
In an embodiment of the present invention, when the first acceleration robust feature is matched with the second acceleration robust feature, whether the two features cannot match may additionally be determined from the traces of their feature matrices, for example:
the coordinate value of the first acceleration robust feature is substituted into the feature matrix to obtain a second target feature matrix, and the coordinate value of the second acceleration robust feature is substituted into the feature matrix to obtain a third target feature matrix.
In this embodiment, if the signs of the matrix traces of the first and second acceleration robust features are the same, the two features have contrast changes in the same direction; if the signs differ, their contrast changes are in opposite directions, and the possibility that the features are similar is excluded directly, even if the Euclidean distance is 0. Therefore, in this embodiment, it may first be determined whether the signs of the first and second matrix traces are the same: if so, matching continues according to the Euclidean distance; if not, the match between the first and second acceleration robust features is directly determined to have failed.
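The trace-sign pre-check above can be sketched as follows. This is an illustrative simplification: the sign of the Hessian trace distinguishes bright-on-dark blobs from dark-on-light ones, so pairs whose traces differ in sign can be rejected before any Euclidean distance is computed. The function names are illustrative.

```python
# Sketch of the trace-sign pre-check: features whose Hessian traces differ
# in sign have opposite contrast changes and are rejected outright.

def same_contrast(trace1, trace2):
    """True when both traces have the same sign (zero passes either way)."""
    return trace1 * trace2 >= 0

def prefiltered_pairs(traces1, traces2):
    """Index pairs that survive the sign check and may proceed to
    Euclidean-distance matching."""
    return [(i, j)
            for i, t1 in enumerate(traces1)
            for j, t2 in enumerate(traces2)
            if same_contrast(t1, t2)]
```

Because the check is a single multiplication, it prunes candidate pairs far more cheaply than a 64-dimensional distance computation.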
Step 104 aligns the first image data with the second image data based on the first acceleration robust feature and the second acceleration robust feature.
In this embodiment, the successfully matched first and second accelerated robust features allow the first image data to be aligned with the second image data, exerting the template-correction effect of the second image data.
In one embodiment of the present invention, the specific process of aligning the first image data and the second image data is represented as:
A perspective transformation matrix is generated according to the coordinate values of the successfully matched first and second acceleration robust features. In this embodiment, the matched first and second acceleration robust features represent the different appearances, under different viewing angles, of the same position point on the same object. Therefore, a perspective transformation matrix, i.e. a homography matrix, can be generated from the correspondence between the successfully matched first and second acceleration robust features, and each pixel point in the distorted first image data can be corrected according to it, reducing the identification difficulty. For example, when the coordinate values of the first acceleration robust feature are (x1, y1) and those of the matching second acceleration robust feature are (x2, y2), the perspective transformation is expressed as:

$$\begin{bmatrix} x_2 \\ y_2 \\ 1 \end{bmatrix} \sim \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ 1 \end{bmatrix}$$

where h11 to h33 are the coefficients of the perspective transformation matrix, which can be solved by substituting the coordinate values of a plurality of successfully matched first and second acceleration robust features.
The first image data is aligned using the perspective transformation matrix. In this embodiment, after the perspective transformation matrix is obtained, all the pixel points in the first image data may be substituted into it to obtain new pixel points with transformed coordinate values; combining all the new pixel points yields the aligned first image data, completing the alignment of the first image data.
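The coordinate substitution described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the homography matrix is taken as given (in practice it would be estimated from at least four matched feature pairs), and the function name is illustrative.

```python
# Sketch of the alignment step: map a pixel coordinate through a 3x3
# perspective (homography) matrix, dividing by the homogeneous coordinate.

def warp_point(H, x, y):
    """Map (x, y) through homography H, given as a 3x3 nested list."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w
```

For a pure translation the bottom row is (0, 0, 1), w stays 1, and the mapping reduces to shifting the point; a general homography additionally encodes rotation, scale, and perspective foreshortening.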
Step 105, reading of the instrument panel under the type is identified in the aligned first image data.
In this embodiment, the reading of the specified type of instrument panel collected by the automatic aircraft may be identified by reading the pointer position and the scale positions of the instrument panel in the aligned first image data. Thanks to the adjustment based on the second acceleration robust feature, the pointer position and scale positions in the aligned first image data return from their distorted state to undistorted positions; aligning the first image data is equivalent to correcting first image data shot by the automatic aircraft from a non-preset angle into an image shot at the preset angle.
In one embodiment of the present invention, the specific process of identifying the dashboard reading is as follows:
the aligned first image data is grayed, and in this embodiment, after the aligned first image data is obtained, graying is first performed on the first image data. In the first image data collected for the dashboard, the pointer is usually black or other dark color due to the nature of the dashboard, e.g., the surface of the dashboard is usually white over a large area. Therefore, after the gray processing is performed on the first image data, the scale of the instrument panel pointed by the pointer and the center of the instrument panel have sudden change of the gray value.
A gray value of the second acceleration robust feature mapped into the aligned first image data is acquired as a first gray value. In this embodiment, after the first image data is aligned, the second acceleration robust features in the second image data may be mapped to pixel points in the aligned first image data to obtain the corresponding center position and scale positions of the dashboard in the first image data, and the gray values of the pixel points at the center position and scale positions are acquired as first gray values.
The first gray values are traversed. After the first gray values of the pixel points representing the scales and the circle center of the instrument panel are obtained, the pixel points whose gray value changes abruptly can be found by traversing the first gray values; these pixel points are taken as the scale pointed to by the pointer and the circle center of the instrument panel.
A connecting line formed by the second acceleration robust features whose first gray value changes abruptly is obtained. In this embodiment, after the scale position pointed to by the pointer and the circle center position of the instrument panel are obtained from the mapped second acceleration robust features, a line between the scale position and the center position may be drawn to represent the pointer of the instrument panel.
The gray values of the pixel points on the connecting line are sequentially acquired as second gray values. In this embodiment, after the connecting line representing the instrument panel pointer is determined, the gray values of the pixel points on the line may be acquired in sequence starting from the circle center position and taken as the second gray values.
If the second gray values are continuously greater than and/or less than a preset gray threshold, the connecting line is determined to be the pointer position of the instrument panel. In this embodiment, because the length, thickness, and color depth of the pointer differ between instrument panel types, different gray thresholds can be set for different types of instrument panel. For example, when the pointer of the instrument panel is black and the surface is white, the gray threshold may be set to 5 to allow for the influence of lighting on the gray value; when the second gray values are all less than 5, the connecting line can represent the pointer of the instrument panel. In this embodiment, the number of consecutive second gray values that must be greater than and/or less than the preset gray threshold before the connecting line is determined to represent the pointer may also be set according to the length of the pointer.
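The "continuously below the threshold" test along the centre-to-scale line can be sketched as follows. The sampling scheme and the minimum run length are illustrative choices; the threshold of 5 follows the black-pointer example above:

```python
import numpy as np

def line_pixels(p0, p1):
    """Integer pixel coordinates sampled along the segment p0 -> p1
    (points given as (x, y))."""
    (x0, y0), (x1, y1) = p0, p1
    n = int(max(abs(x1 - x0), abs(y1 - y0))) + 1
    xs = np.linspace(x0, x1, n).round().astype(int)
    ys = np.linspace(y0, y1, n).round().astype(int)
    return xs, ys

def is_pointer(gray, centre, tick, threshold=5, min_run=10):
    """Treat the centre->tick line as the pointer when at least
    `min_run` consecutive samples along it stay below the gray
    threshold. `min_run` is an illustrative parameter, not a value
    fixed by the patent."""
    xs, ys = line_pixels(centre, tick)
    vals = gray[ys, xs]          # row index = y, column index = x
    run = best = 0
    for v in vals:
        run = run + 1 if v < threshold else 0
        best = max(best, run)
    return best >= min_run
```

Tying `min_run` to the known pointer length of the panel type implements the final remark above: the line counts as the pointer only when enough consecutive second gray values fall under the threshold.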
The reading of the instrument panel indicated by the connecting line is identified according to the type of the instrument panel. In this embodiment, after it is determined that the connecting line represents the pointer of the instrument panel, the characteristics of the instrument panel, such as the numerical value corresponding to each scale, can be determined according to the type of the instrument panel, and the reading of the scale indicated by the connecting line can then be calculated using plane geometry.
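The final geometric step, converting the pointer line into a reading, can be sketched as linear interpolation between the dial's zero-scale and full-scale angles. The angle convention (degrees anticlockwise from the +x axis, with image y pointing down) and the example dial layout are assumptions for illustration; the real angles and value range come from the panel type:

```python
import math

def pointer_reading(centre, tick, zero_angle_deg, full_angle_deg,
                    min_value, max_value):
    """Map the centre->tick pointer direction onto the dial's value
    range by interpolating between the (assumed known) zero-scale and
    full-scale angles, sweeping clockwise from zero to full scale."""
    dx = tick[0] - centre[0]
    dy = centre[1] - tick[1]        # image y grows downwards
    angle = math.degrees(math.atan2(dy, dx)) % 360
    sweep = (zero_angle_deg - full_angle_deg) % 360
    frac = (zero_angle_deg - angle) % 360 / sweep
    return min_value + frac * (max_value - min_value)
```

For a typical 270-degree dial with zero at 225 degrees (lower left) and full scale at 315 degrees (lower right), a pointer straight up reads the midpoint of the range.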
In the embodiment of the invention, first image data acquired by the automatic aircraft facing an instrument panel of a transformer substation is acquired, first acceleration robust features are extracted from it and matched with the second acceleration robust features of second image data acquired in advance at a preset angle, and the first image data is aligned after matching, making it more suitable for scale identification. This avoids scale-identification failure caused by distortion of the instrument panel area in the collected first image data when the automatic aircraft faces the instrument panel of the power equipment in the transformer substation at a different angle. The reading of the specified type of instrument panel is then identified in the aligned first image data, completing the identification of the instrument panel reading. Because the reading is identified for a specified type of instrument panel, confusion caused by the similar appearance of different types of instrument panels when first image data is collected facing them is avoided. In the embodiment of the invention, images are acquired of the instrument panel and the operating data of the power equipment indicated by the instrument panel in the transformer substation is read from the pointer position. Compared with manual reading, when a large number of dial images must be read, this reduces reading errors caused by individual differences between readers, improves the accuracy of data collection, and saves manpower.
Further, according to the embodiment of the invention, the first image data is aligned with the second image data, which reduces the influence of distortion of the first image data caused by non-fixed-angle acquisition when the automatic aircraft collects instrument panel images. The reading of the instrument panel is therefore identified more accurately when the pointer position is calculated from the aligned first image data, improving the reliability of instrument panel reading performed by collecting instrument panel image data with an automatic aircraft.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Example two
Fig. 2 is a schematic structural diagram of an instrument panel image recognition apparatus according to a second embodiment of the present invention, which may specifically include the following modules:
the first image data acquisition module 210 is used for acquiring first image data acquired by an instrument panel facing a transformer substation of the automatic aircraft;
a first accelerated robust feature extraction module 220 for extracting a first accelerated robust feature from the first image data;
the acceleration robust feature matching module 230 is configured to match the first acceleration robust feature with a second acceleration robust feature, where the second acceleration robust feature is derived from second image data, the second image data is image data acquired from a dashboard of a designated type in the substation at a preset angle, and if the matching is successful, the first image data alignment module 240 is invoked;
a first image data alignment module 240 for aligning the first image data with the second image data according to the first and second accelerated robust features;
a dashboard reading identification module 250, configured to identify, in the aligned first image data, a reading of the dashboard of the specified type.
In one embodiment of the present invention, the first accelerated robust feature extraction module 220 comprises:
the acceleration robust feature calculation data acquisition module is used for acquiring coordinate values of pixel points in the first image data as acceleration robust feature calculation data;
the first target characteristic matrix acquisition module is used for substituting the accelerated robust characteristic calculation data into a characteristic matrix to obtain a first target characteristic matrix;
a first eigenvalue acquisition module, configured to acquire an eigenvalue of the first target eigenvalue matrix as a first eigenvalue;
the characteristic value comparison module is used for comparing the first characteristic value with a second characteristic value, and the second characteristic value is obtained by calculating coordinate values of 26 adjacent pixel points of the pixel points in a two-dimensional image space and a scale space;
the first acceleration robust feature selection module is used for selecting the pixel points to which the first characteristic value and/or the second characteristic value with the largest numerical value and the smallest numerical value belong as first acceleration robust features;
in one embodiment of the present invention, the accelerated robust feature matching module 230 comprises:
the Euclidean distance calculation module is used for calculating the Euclidean distance between the first acceleration robust feature and the second acceleration robust feature, and if the Euclidean distance is smaller than a preset distance threshold value, the matching success determination module is called;
and the matching success determining module is used for determining that the first accelerated robust feature and the second accelerated robust feature are successfully matched.
In one embodiment of the present invention, the accelerated robust feature matching module 230 further comprises:
the second target characteristic matrix obtaining module is used for substituting the coordinate values of the first acceleration robust feature into the characteristic matrix to obtain a second target characteristic matrix;
the third target characteristic matrix obtaining module is used for substituting the coordinate values of the second acceleration robust feature into the characteristic matrix to obtain a third target characteristic matrix;
the matrix trace acquisition module is used for calculating matrix traces of the second target characteristic matrix and the third target characteristic matrix as a first matrix trace and a second matrix trace respectively;
a matrix trace judging module, configured to judge whether the signs of the first matrix trace and the second matrix trace are the same, and if so, invoke the Euclidean distance calculation module to match the first acceleration robust feature with the second acceleration robust feature, and if not, invoke the matching failure determination module;
a matching failure determination module to determine that the first accelerated robust feature and the second accelerated robust feature fail to match.
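The matching flow of these submodules — a trace-sign pre-check followed by the Euclidean distance test — can be sketched as follows; the distance threshold is an illustrative value, not one fixed by the patent:

```python
import numpy as np

def match_features(desc1, tr1, desc2, tr2, dist_threshold=0.3):
    """Match two feature sets. A pair is rejected outright when the
    traces of their target characteristic matrices have opposite signs
    (a cheap contrast check), and accepted when the Euclidean distance
    between the descriptors falls below the threshold."""
    matches = []
    for i, (d1, t1) in enumerate(zip(desc1, tr1)):
        for j, (d2, t2) in enumerate(zip(desc2, tr2)):
            if t1 * t2 < 0:          # opposite trace signs: cannot match
                continue
            if np.linalg.norm(d1 - d2) < dist_threshold:
                matches.append((i, j))
    return matches
```

The sign pre-check avoids computing distances for feature pairs that differ in bright-on-dark versus dark-on-bright contrast, which is why the matrix trace judging module is invoked before the Euclidean distance calculation module.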
In one embodiment of the present invention, the first image data alignment module 240 includes:
the perspective transformation matrix acquisition module is used for generating a perspective transformation matrix according to the coordinate values of the first acceleration robust feature and the second acceleration robust feature which are successfully matched;
a first image data perspective alignment module to align the first image data using the perspective transformation matrix.
In one embodiment of the present invention, the instrument panel reading identification module 250 includes:
the graying processing module is used for graying the aligned first image data;
a first gray value obtaining module, configured to acquire a gray value of the second acceleration robust feature mapped into the aligned first image data as a first gray value;
the first gray value traversing module is used for traversing the first gray value;
a connection line obtaining module, configured to obtain a connection line formed by the second acceleration robust feature in which the first gray value is mutated;
the second gray value acquisition module is used for sequentially acquiring the gray values of all the pixel points on the connecting line as second gray values, and if the second gray values are continuously larger than and/or smaller than a preset gray threshold value, the pointer position determination module is called;
the pointer position determining module is used for determining the connecting line as the pointer position of the instrument panel;
and the reading identification module is used for identifying the reading of the instrument panel indicated by the connecting line by combining the type of the instrument panel.
In an embodiment of the present invention, the instrument panel image recognition apparatus further includes:
the system comprises an original image data acquisition module, a data acquisition module and a data acquisition module, wherein the original image data is acquired by an automatic aircraft facing to a meter panel of a specified type of a transformer substation at a preset angle, and the meter panel is not distorted in the original image data acquired by the automatic aircraft at the preset angle;
the second image data acquisition module is used for cutting an instrument panel area in the original image data to be used as second image data;
and the second acceleration robust feature acquisition module is used for calibrating the scale points and the circle center of the instrument panel on the second image data as second acceleration robust features.
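The template preparation performed by these three modules — cropping the dashboard region out of the reference shot and recording the calibrated scale points and circle centre relative to the crop — can be sketched as follows; the dictionary layout and field names are illustrative:

```python
import numpy as np

def make_template(original, panel_box, tick_points, centre):
    """Crop the dashboard region from the distortion-free reference
    image and store the manually calibrated tick points and dial
    centre in the crop's coordinate frame.
    panel_box: (x0, y0, x1, y1); points given as (x, y)."""
    x0, y0, x1, y1 = panel_box
    crop = original[y0:y1, x0:x1]
    shift = np.array([x0, y0])
    return {
        "image": crop,                                        # second image data
        "ticks": [tuple(np.array(p) - shift) for p in tick_points],
        "centre": tuple(np.array(centre) - shift),
    }
```

The stored tick points and centre play the role of the second acceleration robust features that later image data is matched and aligned against.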
The instrument image recognition device provided by the embodiment of the invention can execute the instrument panel image recognition method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a computer device according to a third embodiment of the present invention. FIG. 3 illustrates a block diagram of an exemplary computer device 12 suitable for use in implementing embodiments of the present invention. The computer device 12 shown in FIG. 3 is only an example and should not impose any limitation on the scope of use or functionality of embodiments of the present invention.
As shown in FIG. 3, computer device 12 is in the form of a general purpose computing device. The components of computer device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 3, and commonly referred to as a "hard drive"). Although not shown in FIG. 3, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with computer device 12, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, computer device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via network adapter 20. As shown, network adapter 20 communicates with the other modules of computer device 12 via bus 18. It should be understood that although not shown in FIG. 3, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, implementing the dashboard image recognition method provided by the embodiment of the present invention.
Example four
The fourth embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the dashboard image recognition method, and can achieve the same technical effect, and in order to avoid repetition, the detailed description is omitted here.
A computer readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An instrument panel image recognition method is characterized by comprising the following steps:
acquiring first image data acquired by an instrument panel facing a transformer substation of an automatic aircraft;
extracting a first accelerated robust feature from the first image data;
matching the first acceleration robust feature with a second acceleration robust feature, wherein the second acceleration robust feature is derived from second image data, and the second image data is image data acquired from a designated type of instrument panel in the transformer substation at a preset angle;
if the matching is successful, aligning the first image data with the second image data according to the first acceleration robust feature and the second acceleration robust feature;
identifying, in the aligned first image data, a reading of the dashboard of the specified type.
2. The method of claim 1, wherein extracting the first accelerated robust feature from the first image data comprises:
obtaining coordinate values of pixel points in the first image data as accelerated robust feature calculation data;
substituting the accelerated robust feature calculation data into a feature matrix to obtain a first target feature matrix;
acquiring a characteristic value of the first target characteristic matrix as a first characteristic value;
comparing the first characteristic value with a second characteristic value, wherein the second characteristic value is calculated by coordinate values of 26 adjacent pixel points of the pixel points in a two-dimensional image space and a scale space;
and selecting the pixel point to which the first characteristic value and/or the second characteristic value with the largest numerical value and the smallest numerical value belong as a first acceleration robust characteristic.
3. The method of claim 1, wherein matching the first accelerated robust feature with the second accelerated robust feature comprises:
calculating Euclidean distances between the first acceleration robust feature and the second acceleration robust feature;
and if the Euclidean distance is smaller than a preset distance threshold, determining that the first acceleration robust feature and the second acceleration robust feature are successfully matched.
4. The method of claim 2, wherein said matching the first accelerated robust feature with the second accelerated robust feature further comprises:
substituting the coordinate value of the first accelerated robust feature into the feature matrix to obtain a second target feature matrix;
substituting the coordinate values of the second accelerated robust feature into the feature matrix to obtain a third target feature matrix;
calculating matrix traces of the second target characteristic matrix and the third target characteristic matrix to be respectively used as a first matrix trace and a second matrix trace;
judging whether signs of the first matrix trace and the second matrix trace are the same;
if the signs are the same, matching the first accelerated robust feature with the second accelerated robust feature according to the Euclidean distance;
and if not, determining that the first acceleration robust feature and the second acceleration robust feature fail to be matched.
5. The method according to any of claims 1-4, wherein said aligning the first image data with the second image data according to the first accelerated robust feature and the second accelerated robust feature comprises:
generating a perspective transformation matrix according to the coordinate values of the first acceleration robust feature and the second acceleration robust feature which are successfully matched;
aligning the first image data using the perspective transformation matrix.
6. The method of claim 5, wherein identifying, in the aligned first image data, the reading of the dashboard of the specified type comprises:
performing graying processing on the aligned first image data;
acquiring a gray value of the second acceleration robust feature mapped to the aligned first image data as a first gray value;
traversing the first gray value;
obtaining a connecting line formed by the second acceleration robust features with the first gray value subjected to mutation;
sequentially acquiring the gray value of each pixel point on the connecting line as a second gray value;
if the second gray value is continuously larger than and/or smaller than a preset gray threshold, determining the connection line as the pointer position of the instrument panel;
identifying a reading of the dashboard indicated by the wiring in conjunction with the type of the dashboard.
7. The method according to any one of claims 1-4, further comprising:
acquiring original image data collected by an automatic aircraft facing to a meter panel of a specified type of a transformer substation at a preset angle, wherein the meter panel is not distorted in the original image data collected by the automatic aircraft at the preset angle;
cutting an instrument panel area in the original image data to be used as second image data;
and calibrating the scale points and the circle center of the instrument panel on the second image data as a second accelerated robust feature.
8. A meter image recognition apparatus, comprising:
the first image data acquisition module is used for acquiring first image data acquired by an instrument panel of the automatic aircraft facing the transformer substation;
a first accelerated robust feature extraction module for extracting a first accelerated robust feature from the first image data;
the acceleration robust feature matching module is used for matching the first acceleration robust feature with a second acceleration robust feature, wherein the second acceleration robust feature is derived from second image data, the second image data is image data acquired from a designated type of instrument panel in the transformer substation at a preset angle, and if the matching is successful, a first image data alignment module is called;
a first image data alignment module to align the first image data with the second image data according to the first acceleration robust feature and the second acceleration robust feature;
and the instrument panel reading identification module is used for identifying, in the aligned first image data, the reading of the instrument panel of the specified type.
9. A computer device, characterized in that the computer device comprises:
one or more processors;
a memory for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the dashboard image recognition method of any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out a dashboard image recognition method according to any one of claims 1-7.
CN202111443293.7A 2021-11-30 2021-11-30 Instrument panel image identification method and device, computer equipment and storage medium Pending CN114155380A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111443293.7A CN114155380A (en) 2021-11-30 2021-11-30 Instrument panel image identification method and device, computer equipment and storage medium


Publications (1)

Publication Number Publication Date
CN114155380A true CN114155380A (en) 2022-03-08

Family

ID=80454841

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111443293.7A Pending CN114155380A (en) 2021-11-30 2021-11-30 Instrument panel image identification method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114155380A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115877993A (en) * 2023-02-21 2023-03-31 北京和利时系统工程有限公司 Three-dimensional view display method and device based on digital twins


Similar Documents

Publication Publication Date Title
CN111414934A (en) Pointer type meter reading automatic identification method based on fast R-CNN and U-Net
CN110119680B (en) Automatic error checking system of regulator cubicle wiring based on image recognition
CN111062282A (en) Transformer substation pointer type instrument identification method based on improved YOLOV3 model
CN109409385B (en) Automatic identification method for pointer instrument
CN109801267B (en) Inspection target defect detection method based on feature point detection and SVM classifier
CN111368906B (en) Pointer type oil level meter reading identification method based on deep learning
CN111814740B (en) Pointer instrument reading identification method, device, computer equipment and storage medium
CN110634137A (en) Bridge deformation monitoring method, device and equipment based on visual perception
US11657644B2 (en) Automatic ruler detection
Huang et al. Automatic identification and location technology of glass insulator self-shattering
CN112683169A (en) Object size measuring method, device, equipment and storage medium
CN111563896A (en) Image processing method for catenary anomaly detection
CN108992033B (en) Grading device, equipment and storage medium for vision test
Zhuo et al. Machine vision detection of pointer features in images of analog meter displays
CN114155380A (en) Instrument panel image identification method and device, computer equipment and storage medium
CN111553176A (en) Wireless transmission checking method and system suitable for wiring of transformer substation cubicle
CN114998432A (en) YOLOv 5-based circuit board detection point positioning method
CN114863129A (en) Instrument numerical analysis method, device, equipment and storage medium
CN117953048A (en) Swivel bridge attitude monitoring system and method based on computer vision
CN113705350A (en) Pointer instrument reading identification method and device for transformer substation, medium and electronic equipment
CN114092542B (en) Bolt measurement method and system based on two-dimensional vision
CN115205155A (en) Distorted image correction method and device and terminal equipment
CN114782822A (en) Method and device for detecting state of power equipment, electronic equipment and storage medium
CN114255458A (en) Method and system for identifying reading of pointer instrument in inspection scene
CN113111849A (en) Human body key point detection method, device, system and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination