CN115311471B - Shuttle kiln sintering condition image identification method


Info

Publication number
CN115311471B
CN115311471B
Authority
CN
China
Prior art keywords: edge, flame, pixel point, furnace wall, determining
Legal status: Active
Application number: CN202211194569.7A
Other languages: Chinese (zh)
Other versions: CN115311471A
Inventor: Bai Li (白丽)
Current Assignee: Rudong Yanfeng Steel Structure Co., Ltd.
Original Assignee: Rudong Yanfeng Steel Structure Co., Ltd.
Application filed by Rudong Yanfeng Steel Structure Co., Ltd.
Priority to CN202211194569.7A
Publication of CN115311471A
Application granted
Publication of CN115311471B

Classifications

    • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06T 7/11 - Region-based segmentation
    • G06T 7/187 - Segmentation; edge detection involving region growing, region merging or connected component labelling
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/90 - Determination of colour characteristics
    • G06V 10/22 - Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide detection or recognition
    • G06V 10/54 - Extraction of image or video features relating to texture
    • G06V 10/774 - Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V 10/82 - Recognition or understanding using neural networks
    • G06T 2207/20081 - Training; learning
    • G06T 2207/20084 - Artificial neural networks [ANN]

Abstract

The invention relates to the technical field of data identification, and in particular to a shuttle kiln sintering condition image identification method, which comprises the following steps: obtaining, in real time, the connected domain corresponding to each edge in the edge region to be processed; determining the furnace wall edge straight line significant coefficient, furnace wall edge chroma significant coefficient and furnace wall edge roughness significant coefficient corresponding to each edge, and from these the furnace wall edge significant coefficient; obtaining a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes from the plurality of corner point target clusters corresponding to each edge, and from these the flame outer flame edge significant coefficient; and determining the flame segmentation edge significant coefficient of each edge from its furnace wall edge significant coefficient and flame outer flame edge significant coefficient, then determining the target flame region image and, from it, the current sintering condition of the shuttle kiln to be detected. The invention uses data identification technology to detect the current shuttle kiln sintering condition, effectively improving the detection accuracy of the shuttle kiln sintering condition.

Description

Shuttle kiln sintering condition image identification method
Technical Field
The invention relates to the technical field of data identification, in particular to a shuttle kiln sintering condition image identification method.
Background
The shuttle kiln is an intermittent (batch-type) thermal kiln. Owing to its flexible production scheduling and simple, convenient operation it is widely used in small-scale intermittent ceramic production, and control of its sintering condition has an important influence on the quality of the products fired in a ceramic shuttle kiln; sintering here refers to the traditional process of converting a powdery material into a compact body. The existing method for identifying the sintering condition of a ceramic shuttle kiln relies mainly on thermocouple detection assisted by manual fire observation. A thermocouple, however, can only measure the local temperature in the kiln, its detection precision is low, and the measured data cannot faithfully reflect the specific situation of the sintering area. In addition, manual fire observation is labour-intensive and demands long experience, which keeps the degree of automation and the production efficiency of shuttle kiln ceramic production low. Industrial thermal imagers can identify the sintering condition accurately, but this detection method is costly and unsuitable for small-scale ceramic production.
With the development of machine vision and image processing technology, flame image recognition algorithms based on deep learning have appeared. Such an algorithm extracts the features or attributes of the acquired flame images through the feature layers of a deep neural network and classifies the flame images accordingly, overcoming the inherent sensitivity of traditional image processing algorithms to noisy data and achieving high classification and recognition accuracy. However, owing to the structure and material of the shuttle kiln, the light formed by reflection of the flame's bright light on the kiln wall closely resembles the flame's outer flame, so the flame in the image is not easily segmented, an accurate flame image cannot be obtained, and the detection accuracy of the shuttle kiln sintering condition is poor.
Disclosure of Invention
In order to solve the technical problem of poor detection accuracy of the sintering condition of the shuttle kiln in the prior art, the invention aims to provide an image identification method of the sintering condition of the shuttle kiln.
The invention provides a shuttle kiln sintering condition image identification method, which comprises the following steps:
acquiring a flame image of a shuttle kiln to be detected during sintering in real time, and performing a preprocessing operation on the flame image to obtain the region to be processed of a flame gray image;
performing edge detection processing on the region to be processed of the flame gray image to obtain the edge region to be processed, and further obtaining the connected domains corresponding to all edges in the edge region to be processed;
determining furnace wall edge straight line significant coefficients, furnace wall edge chromaticity significant coefficients and furnace wall edge roughness significant coefficients corresponding to each edge according to the position of each pixel point in a connected domain corresponding to each edge in the edge region to be processed, and further determining the furnace wall edge significant coefficients corresponding to each edge;
carrying out corner point detection processing on each edge in the edge area to be processed to obtain a plurality of corner point target clusters corresponding to each edge, and further obtaining a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge;
determining a flame outer flame edge significant coefficient corresponding to each edge according to a plurality of corner point target clusters, a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge;
determining a flame segmentation edge significant coefficient corresponding to each edge according to the furnace wall edge significant coefficient and the flame outer flame edge significant coefficient corresponding to each edge, and further determining a target flame region image;
and determining the current sintering condition of the shuttle kiln to be detected according to the target flame area image and the pre-constructed and trained working condition detection neural network.
Further, the step of determining the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge comprises the following steps:
determining a fitting straight line corresponding to each edge, fitting goodness of the fitting straight line, a window area corresponding to each pixel point in the connected domain corresponding to each edge and a connected domain expansion area corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge area to be processed;
determining a fitting expansion area corresponding to each pixel point of the fitting straight line according to the fitting straight line corresponding to each edge, and further determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the goodness of fit of the fitting straight line corresponding to each edge, the fitting expansion area corresponding to each pixel point of the fitting straight line, and each pixel point in the connected domain corresponding to each edge;
determining the furnace wall edge chroma significant coefficient corresponding to each edge according to the RGB values of each pixel point in the window area corresponding to each pixel point in the connected domain corresponding to each edge, and further determining the furnace wall edge roughness significant coefficient corresponding to each edge according to the gray values of all pixel points in the connected domain expansion region corresponding to each edge.
Further, the step of determining the furnace wall edge straight line significant coefficient corresponding to each edge comprises:
counting the number of outlier pixels corresponding to each edge according to each pixel point in the fitting expansion region corresponding to each pixel point of the fitting straight line corresponding to each edge and each pixel point in the connected domain corresponding to each edge;
determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the number of pixel points in the connected domain corresponding to each edge, the number of outlier pixel points and the goodness of fit of the fitting straight line, wherein the calculation formula is:

$$SL = r \cdot \left(1 - \frac{a}{b}\right)$$

wherein $SL$ is the furnace wall edge straight line significant coefficient corresponding to each edge, $r$ is the goodness of fit of the fitting straight line corresponding to each edge, $a$ is the number of outlier pixel points corresponding to each edge, and $b$ is the number of pixel points in the connected domain corresponding to each edge.
Further, the step of determining the furnace wall edge chroma significant coefficient corresponding to each edge comprises the following steps:
determining a first target pixel point and a second target pixel point corresponding to each pixel point in the connected domain according to the RGB values of each pixel point in the window region corresponding to each pixel point in the connected domain corresponding to each edge;

determining the primary color difference index of each pixel point in the connected domain corresponding to each edge according to the RGB values of the first target pixel point and the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge;

and determining the median value of the primary color difference indexes of the connected domain corresponding to each edge according to the primary color difference index of each pixel point in the connected domain corresponding to each edge, and taking the median value as the furnace wall edge chroma significant coefficient of the corresponding edge.
Further, the calculation formula for determining the primary color difference index of each pixel point in the connected domain corresponding to each edge is:

$$pp = \frac{\max\!\left(R_{2}-R_{1},\; G_{2}-G_{1},\; B_{2}-B_{1}\right)}{\varepsilon}$$

wherein $pp$ is the primary color difference index of each pixel point in the connected domain corresponding to each edge; $R_{2}$, $G_{2}$ and $B_{2}$ are the $R$, $G$ and $B$ values of the second target pixel point corresponding to each pixel point in the connected domain; $R_{1}$, $G_{1}$ and $B_{1}$ are the $R$, $G$ and $B$ values of the first target pixel point corresponding to each pixel point in the connected domain; $\varepsilon$ is a non-zero hyper-parameter; and $\max$ is the maximum-value function.
Further, the step of determining the furnace wall edge roughness significant coefficient for each edge comprises:
determining the energy value of each pixel point in the connected domain expansion area according to the gray value of each pixel point in the connected domain expansion area corresponding to each edge;
and calculating the energy value mean value of the connected domain expansion region corresponding to each edge according to the energy value of each pixel point in the connected domain expansion region corresponding to each edge, and taking the energy value mean value as the furnace wall edge roughness significant coefficient corresponding to the corresponding edge.
Further, the step of obtaining the plurality of flame sharp corner gliding gradients and the plurality of corner point distance indexes corresponding to each edge comprises:

determining the flame sharp corner vertex angle area of each corner point target cluster corresponding to each edge according to the position of each corner point in the plurality of corner point target clusters corresponding to each edge, and determining the plurality of corner point distance indexes corresponding to each edge;

determining the flame sharp corner valley bottom area corresponding to any two adjacent flame sharp corner vertex angle areas according to the position of each pixel point between the two adjacent vertex angle areas corresponding to each edge;

and determining the flame sharp corner gliding gradient corresponding to each edge according to the positions of the centroids of any two adjacent flame sharp corner vertex angle areas corresponding to each edge and the position of the centroid of the corresponding flame sharp corner valley bottom area.
Further, the calculation formula for determining each flame sharp corner gliding gradient corresponding to each edge is:

$$S = \left\lceil \frac{1}{2}\left(\left|\frac{y_{1}-y_{0}}{x_{1}-x_{0}}\right| + \left|\frac{y_{2}-y_{0}}{x_{2}-x_{0}}\right|\right)\right\rceil$$

wherein $S$ is the flame sharp corner gliding gradient corresponding to any two adjacent flame sharp corner vertex angle areas of each edge; $(x_{1}, y_{1})$ and $(x_{2}, y_{2})$ are the abscissas and ordinates of the centroids of the two adjacent flame sharp corner vertex angle areas; $(x_{0}, y_{0})$ is the abscissa and ordinate of the centroid of the flame sharp corner valley bottom area corresponding to the two adjacent vertex angle areas; and $\lceil\cdot\rceil$ denotes rounding up.
Further, the calculation formula for determining the flame outer flame edge significant coefficient corresponding to each edge is:

$$OFE = N \cdot \max(S) \cdot \sum_{i=1}^{m} d_{i}$$

wherein $OFE$ is the flame outer flame edge significant coefficient corresponding to each edge; $N$ is the total number of corner points within the plurality of corner point target clusters of each edge; $\max(S)$ is the maximum of the plurality of flame sharp corner gliding gradients corresponding to each edge; $d_{i}$ is the $i$-th corner point distance index corresponding to each edge; $m$ is the number of corner point target clusters corresponding to each edge; and $\max$ is the maximum-value function.
Further, the step of determining the target flame region image comprises:
calculating a flame segmentation edge significant coefficient mean value according to the flame segmentation edge significant coefficient corresponding to each edge, and taking the flame segmentation edge significant coefficient mean value as a flame segmentation edge threshold value;
if the flame segmentation edge significant coefficient corresponding to a certain edge is larger than the flame segmentation edge threshold value, judging that the edge is the segmentation edge of the flame outer flame and the furnace wall background, otherwise, judging that the edge is not the segmentation edge of the flame outer flame and the furnace wall background, and further obtaining a plurality of segmentation edges of the flame outer flame and the furnace wall background;
and determining a target flame area image according to a plurality of segmentation edges of the flame outer flame and the background of the furnace wall and the to-be-processed area of the flame gray level image.
The invention has the following beneficial effects:
the invention provides a shuttle kiln sintering condition image identification method, which utilizes a data identification technology to accurately identify a target flame area image in a flame image, takes the target flame area image as a reference image for detecting the sintering condition of a shuttle kiln, effectively improves the detection accuracy of the sintering condition of the shuttle kiln, and has lower detection cost; acquiring a flame image during sintering of the shuttle kiln to be detected in real time, preprocessing the flame image to obtain a to-be-processed area of the flame gray image, and further obtaining a communication area corresponding to each edge in the to-be-processed area. In order to eliminate the influence caused by noise and external interference, preprocessing operation is carried out on the acquired flame image, in addition, in order to facilitate the analysis of subsequent steps and reduce the range of image recognition, graying processing is carried out on the flame image, the flame area in the flame gray image is extracted to be used as the area to be processed, the target flame area image can be accurately extracted in the follow-up process, edge detection and connected domain analysis are carried out on the area to be processed of the flame gray image, and the connected domain corresponding to each edge in the edge area to be processed is obtained; determining furnace wall edge straight line significant coefficients, furnace wall edge chroma significant coefficients and furnace wall edge roughness significant coefficients corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge region to be processed, and further determining the furnace wall edge significant coefficients corresponding to each edge. The furnace wall edge significant coefficient corresponding to each edge is constructed by mathematical modeling technology through the edge characteristics that the edge in the shuttle kiln wall background area is close to a straight line, the texture distribution of the edge is rough, the colors and the brightness of two sides of the edge are equal. The larger the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to the edge are, the larger the furnace wall edge significant coefficient corresponding to the edge is; and determining the outer flame edge significant coefficient corresponding to each edge according to the plurality of corner point target clusters, the plurality of flame corner angle gliding gradients and the plurality of corner point distance indexes corresponding to each edge. And performing corner detection on each edge based on the shape characteristics of the flame, wherein the larger the number of corner points of a corner point target cluster corresponding to the edge is, the higher the possibility that the edge is the flame outer flame edge is. 
The flame sharp corner gliding gradient is the slope from the vertex angle peaks of two adjacent flame sharp corners down to the bottom of the intervening valley; the larger this slope, the more violent the combustion, so determining the gliding gradients of the flame sharp corners corresponding to each edge effectively helps judge whether an edge is a flame outer flame edge. The flame segmentation edge significant coefficient corresponding to each edge is determined from the furnace wall edge significant coefficient and the flame outer flame edge significant coefficient of that edge, and the target flame region image is then determined. Each edge has its own furnace wall edge significant coefficient and flame outer flame edge significant coefficient, and the ratio of the two gives the flame segmentation edge significant coefficient. Using this coefficient, the segmentation edges between the flame outer flame and the furnace wall background can be screened from the edges in the edge region to be processed, and these segmentation edges then yield the target flame region image, which accurately represents the flame characteristic information during sintering of the shuttle kiln and is free of the other interference factors in the collected flame image. Finally, the current sintering condition of the shuttle kiln to be detected is determined from the target flame region image and the pre-constructed and trained working condition detection neural network; using the target flame region image as the input of this network effectively improves its detection precision and thus the detection accuracy of the shuttle kiln sintering condition.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention; other drawings can be derived from them by those skilled in the art without creative effort.
FIG. 1 is a flow chart of an image recognition method for sintering conditions of a shuttle kiln according to the present invention;
fig. 2 is a schematic diagram of a region to be processed of a flame gray scale image in an embodiment of the present invention.
Detailed Description
To further explain the technical means and effects of the present invention adopted to achieve the predetermined objects, the following detailed description of the embodiments, structures, features and effects of the technical solutions according to the present invention will be given with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The embodiment provides a shuttle kiln sintering condition image identification method, as shown in fig. 1, the method comprises the following steps:
(1) Acquiring a flame image of the shuttle kiln to be detected during sintering in real time, and performing a preprocessing operation on the flame image to obtain the region to be processed of the flame gray image, through the following steps:
(1-1) acquiring a flame image of the shuttle kiln to be detected during sintering in real time.
In this embodiment, an industrial camera is arranged directly facing the ceramic-producing shuttle kiln to be detected, to collect the flame image during sintering. By adjusting the shooting angle and the focusing multiple of the industrial camera, the flame area in the shuttle kiln to be detected is kept close to the center of the flame image. The industrial camera acquires the flame image during sintering of the shuttle kiln to be detected in real time, and the flame image is a visible-light RGB image.
(1-2) Preprocessing the flame image during sintering of the shuttle kiln to be detected to obtain the region to be processed of the flame gray image.
In order to enhance the accuracy of the acquired flame image, a preprocessing operation is performed on the flame image during sintering of the shuttle kiln to be detected; this operation eliminates the influence of image noise and of some external interference factors. Specifically, Gaussian filtering is adopted to denoise each channel of the flame image, i.e., a Gaussian function is convolved with the flame image to eliminate random noise, and a graying operation is then performed on the Gaussian-filtered flame image to obtain the flame gray image. According to priori knowledge, the position of the flame source of the shuttle kiln to be detected is fixed and the size of the flame area lies within a fixed range, so the area where the flame is located in the flame gray image is marked manually with a rectangular frame; this area is called the region to be processed of the flame gray image.
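For concreteness, the following is a minimal sketch of this preprocessing step, assuming OpenCV and NumPy; the file name, kernel size and rectangular-frame coordinates are illustrative assumptions, not values given by this embodiment.

```python
import cv2
import numpy as np

# Read the visible-light flame image (OpenCV stores channels as BGR).
flame_bgr = cv2.imread("shuttle_kiln_flame.png")

# Gaussian filtering denoises each channel independently,
# i.e. a Gaussian kernel is convolved with the image.
denoised = cv2.GaussianBlur(flame_bgr, ksize=(5, 5), sigmaX=1.0)

# Graying operation -> flame gray image.
flame_gray = cv2.cvtColor(denoised, cv2.COLOR_BGR2GRAY)

# The camera and flame source are fixed, so the region to be processed
# is a fixed, manually marked rectangle (illustrative coordinates).
x, y, w, h = 120, 80, 400, 300
region_to_process = flame_gray[y:y + h, x:x + w]
region_bgr = denoised[y:y + h, x:x + w]   # same region, color version
```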
It should be noted that when the region to be processed is delimited, the flame should be completely contained within it, and some furnace wall background may also be present in it. Since the position of the industrial camera generally does not change, the coordinate range of the region to be processed of the flame gray image is fixed. A schematic diagram of the region to be processed of the flame gray image is shown in fig. 2; the rectangular frames marked 201 and 202 are both regions to be processed, and the subsequent steps of this embodiment are explained with these two regions.
(2) Carrying out edge detection processing on the region to be processed of the flame gray image to obtain the edge region to be processed, and further obtaining the connected domain corresponding to each edge in the edge region to be processed, through the following steps:
and (2-1) carrying out edge detection processing on the to-be-processed area of the flame gray level image to obtain the to-be-processed edge area.
It should be noted that, a main purpose of this embodiment is to accurately partition a flame region in a region to be processed of a flame grayscale image, where the flame region is a region that only includes an outer flame and an inner flame, that is, the outer flame with low flame brightness is partitioned from a furnace wall background, and an edge corresponding to a joint of the outer flame and the furnace wall background is found, so that edge detection processing needs to be performed on the region to be processed of the flame grayscale image.
Because the inner flame of the flame has the highest brightness and is bright yellow while the outer flame has lower brightness and is orange-red, the difference between the outer flame and the inner flame is large and forms a clear boundary; the flame inner flame area in the region to be processed can therefore be accurately segmented by Otsu's threshold segmentation method (OTSU), giving the inner flame area of the region to be processed. Edge detection is then performed on the region to be processed of the flame gray image with the Canny edge detection algorithm to obtain the edge region to be processed, which is a binary image. Both Otsu's threshold segmentation method and the Canny edge detection algorithm are prior art, are not within the protection scope of the present invention, and are not elaborated here.
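A minimal sketch of these two operations, continuing the `region_to_process` array from the preprocessing sketch; the Canny thresholds are illustrative assumptions.

```python
# Otsu's method separates the bright, bright-yellow inner flame from the
# darker rest of the region; the threshold is chosen automatically.
_, inner_flame_mask = cv2.threshold(
    region_to_process, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Canny edge detection yields the binary edge region to be processed.
edge_region = cv2.Canny(region_to_process, threshold1=50, threshold2=150)
```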
And (2-2) obtaining a connected domain corresponding to each edge in the edge region to be processed according to the edge region to be processed.
Due to the irregularity of the shape and color presented by the flame and the rough, striped inner wall of the shuttle kiln, a plurality of edges are detected in the region to be processed; these can be called the edges to be distinguished, and they fall into three categories: edges between the flame inner flame and the flame outer flame, edges between the flame outer flame and the furnace wall background, and edges generated within the furnace wall background.
In order to prevent the edges at the junction of the inner flame and the outer flame from interfering with the extraction of an accurate flame image, the edges formed at this junction are removed according to the position information of the edge pixel points in the inner flame area of the region to be processed, which reduces the number of edges to be analyzed subsequently and improves the efficiency of flame image identification. To make each edge in the edge region to be processed clearer and more accurate, an opening operation is first performed on the edge region to improve the independence of each edge, and a closing operation is then performed to connect edges of the same type that are unclear or broken, giving the independent edges of the edge region to be processed. Connected domain analysis is then performed on each edge with a connected domain algorithm to obtain the connected domain corresponding to each edge in the edge region to be processed; each edge has a corresponding connected domain, which is analyzed in the subsequent steps. The implementation processes of the opening operation, the closing operation and the connected domain algorithm are all prior art, are not within the protection scope of the invention, and are not elaborated here.
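The sketch below outlines this clean-up, continuing the arrays from the previous sketches; the 3 x 3 structuring element and the dilation used to mask the inner/outer flame junction are assumptions of this sketch.

```python
kernel = np.ones((3, 3), np.uint8)

# Remove edges at the junction of inner and outer flame: pixels of the
# (slightly dilated) inner flame mask are cleared from the edge map.
junction = cv2.dilate(inner_flame_mask, kernel, iterations=1)
edge_region[junction > 0] = 0

# Opening improves the independence of each edge; closing reconnects
# unclear, broken fragments of the same edge.
edge_region = cv2.morphologyEx(edge_region, cv2.MORPH_OPEN, kernel)
edge_region = cv2.morphologyEx(edge_region, cv2.MORPH_CLOSE, kernel)

# Connected domain analysis: one label per independent edge; each list
# entry holds the (row, col) pixel coordinates of one connected domain.
num_labels, labels = cv2.connectedComponents(edge_region)
edge_domains = [np.column_stack(np.where(labels == k))
                for k in range(1, num_labels)]
```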
(3) Determining furnace wall edge straight line significant coefficients, furnace wall edge chroma significant coefficients and furnace wall edge roughness significant coefficients corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge region to be processed, and further determining the furnace wall edge significant coefficients corresponding to each edge.
First, the furnace wall edge straight line significant coefficient, furnace wall edge chroma significant coefficient and furnace wall edge roughness significant coefficient corresponding to each edge are determined according to the position of each pixel point in the connected domain corresponding to each edge in the edge region to be processed, through the following steps:
and (3-1) determining a fitting straight line corresponding to each edge, the fitting goodness of the fitting straight line, a window area corresponding to each pixel point in the connected domain corresponding to each edge and a connected domain expansion area corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge area to be processed.
In this embodiment, based on the position of each pixel point in the connected domain corresponding to each edge in the edge region to be processed, each pixel point in the connected domain corresponding to each edge is fitted to obtain a fitted straight line corresponding to each edge, and then according to the fitted straight line corresponding to each edge, the goodness of fit of the fitted straight line corresponding to each edge is obtained and recorded as
Figure 308640DEST_PATH_IMAGE003
. And constructing a sliding window with the size of 5 x 5 by taking each pixel point in the connected domain corresponding to each edge in the edge region to be processed as a central point, and sliding the sliding window on the connected domain corresponding to each edge to obtain the sliding window corresponding to each pixel point in the connected domain. Taking the sliding window corresponding to each pixel point as a window area, wherein each pixel point in the connected domain has the corresponding window area, and the area formed by all the pixel points in each sliding window is used as a connected domain expansion areaEach edge has its corresponding connected domain extension region. The processes of constructing the sliding window, fitting the straight line and determining the goodness of fit are all prior art and are not within the scope of the invention, and are not described in detail herein.
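A sketch of the line fit and of the connected domain expansion region, assuming each connected domain is given as an array of (row, col) coordinates as in the sketch above; fitting y on x assumes the edge is not perfectly vertical.

```python
def fit_line_goodness(pixels):
    """Least-squares line through a connected domain and its goodness of
    fit r (the coefficient of determination R^2)."""
    ys = pixels[:, 0].astype(float)
    xs = pixels[:, 1].astype(float)
    k, c = np.polyfit(xs, ys, deg=1)
    residual = ys - (k * xs + c)
    ss_tot = np.sum((ys - ys.mean()) ** 2) + 1e-9   # guard against zero
    r = 1.0 - np.sum(residual ** 2) / ss_tot
    return k, c, r

def expansion_region(pixels, shape, half=2):
    """Union of the 5x5 sliding windows centred on each domain pixel."""
    mask = np.zeros(shape, np.uint8)
    for y, x in pixels:
        mask[max(0, y - half):y + half + 1,
             max(0, x - half):x + half + 1] = 1
    return np.column_stack(np.where(mask == 1))
```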
And (3-2) determining a fitting expansion area corresponding to each pixel point of the fitting straight line according to the fitting straight line corresponding to each edge, and further determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the goodness of fit of the fitting straight line corresponding to each edge, the fitting expansion area corresponding to each pixel point of the fitting straight line and each pixel point in the communication domain corresponding to each edge.
In this embodiment, using the fitting straight line corresponding to each edge obtained in step (3-1), a window of size 3 × 3 is created centred on each pixel point of the fitting straight line, and the window area corresponding to each pixel point of the fitting straight line is called its fitting expansion area.
Determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the goodness of fit of the fitting straight line corresponding to each edge and the fitting expansion area corresponding to each pixel point of the fitting straight line, through the following steps:
(3-2-1) counting the number of the outlier pixels corresponding to each edge according to each pixel point in the fitting expansion region corresponding to each pixel point of the fitting straight line corresponding to each edge and each pixel point in the connected domain corresponding to each edge.
In this embodiment, based on each pixel point in the fitting expansion region corresponding to each pixel point of the fitting straight line and each pixel point in the connected domain, the number of pixel points in the connected domain is recorded as $b$; the pixel points that lie in the connected domain but not in any fitting expansion region are counted, these pixel points are called outlier pixel points, and their number is recorded as $a$.
And (3-2-2) determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the number of pixel points in the connected domain corresponding to each edge, the number of outlier pixel points and the fitting goodness of the fitted straight line.
Firstly, it should be noted that the shuttle kiln wall has a compact structure and is mostly constructed from regular vertical or horizontal objects, so the edges of the shuttle kiln wall background are relatively close to straight lines. To facilitate the subsequent determination of the furnace wall background edges among the plurality of edges to be distinguished, this embodiment determines, based on the above analysis, the furnace wall edge straight line significant coefficient corresponding to each edge from the number of pixel points in the connected domain, the number of outlier pixel points and the goodness of fit of the fitting straight line, through mathematical modeling; the calculation formula is:

$$SL = r \cdot \left(1 - \frac{a}{b}\right)$$

wherein $SL$ is the furnace wall edge straight line significant coefficient corresponding to each edge, $r$ is the goodness of fit of the fitting straight line corresponding to each edge, $a$ is the number of outlier pixel points corresponding to each edge, and $b$ is the number of pixel points in the connected domain corresponding to each edge.
When the goodness of fit of the fitting straight line of an edge is larger and the number of outlier pixel points is smaller, the furnace wall edge straight line significant coefficient of that edge is larger; conversely, when the goodness of fit is smaller and the number of outlier pixel points is larger, the coefficient is smaller.
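A sketch of this computation, using the reconstructed form $SL = r(1 - a/b)$ given above; the 3 x 3 fitting expansion areas are rasterised from the fitting straight line.

```python
def straight_line_coefficient(pixels, k, c, r, shape, half=1):
    """Furnace wall edge straight line significant coefficient of one edge."""
    fit_mask = np.zeros(shape, np.uint8)
    for x in range(int(pixels[:, 1].min()), int(pixels[:, 1].max()) + 1):
        y = int(round(k * x + c))
        if 0 <= y < shape[0]:
            # 3x3 fitting expansion area around each fitted-line pixel.
            fit_mask[max(0, y - half):y + half + 1,
                     max(0, x - half):x + half + 1] = 1
    b = len(pixels)                                         # domain pixels
    a = int(np.sum(fit_mask[pixels[:, 0], pixels[:, 1]] == 0))  # outliers
    return r * (1.0 - a / b)
```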
(3-3) Determining the furnace wall edge chroma significant coefficient corresponding to each edge according to the RGB values of each pixel point in the window area corresponding to each pixel point in the connected domain corresponding to each edge, and further determining the furnace wall edge roughness significant coefficient corresponding to each edge according to the gray value of each pixel point in the connected domain expansion region corresponding to each edge, through the following steps:

(3-3-1) Determining the furnace wall edge chroma significant coefficient corresponding to each edge according to the RGB values of each pixel point in the window area corresponding to each pixel point in the connected domain corresponding to each edge.
It should be noted that the inner wall and the outer wall of the shuttle kiln are artificially and specially constructed to show obvious color difference, the inner wall of the shuttle kiln is greatly influenced by flame illumination, and the image shows bright orange yellow, while the outer wall of the shuttle kiln is slightly influenced by flame illumination, and the image shows obviously dull orange. Based on the analysis, the furnace wall background edges can be distinguished by calculating the primary color difference of the pixel points, and then the furnace wall edge chromaticity significant coefficient corresponding to each edge is determined, and the method comprises the following steps:
(3-3-1-1) Determining the first target pixel point and the second target pixel point corresponding to each pixel point in the connected domain according to the RGB values of each pixel point in the window region corresponding to each pixel point in the connected domain corresponding to each edge.

In this embodiment, based on the window region corresponding to each pixel point in the connected domain obtained in step (3-1), the position of each pixel point in that window region can be determined. Because the flame image of the shuttle kiln to be detected during sintering is a visible-light RGB image, the RGB value of each pixel point in the window area is read from the flame image at the corresponding position. Among the pixel points in the window area corresponding to each pixel point in the connected domain, the pixel point with the larger R, G and B values and the pixel point with the smaller R, G and B values are selected: the pixel point with the larger R, G and B values in the window area is taken as the second target pixel point, and the pixel point with the smaller R, G and B values is taken as the first target pixel point, so each pixel point in the connected domain has a corresponding first target pixel point and second target pixel point.

It should be noted that the first and second target pixel points corresponding to each pixel point are the characteristic pixel points of its window region, and determining them improves the accuracy of the primary color difference index computed subsequently.
(3-3-1-2) Determining the primary color difference index of each pixel point in the connected domain corresponding to each edge according to the RGB values of the first target pixel point and the second target pixel point corresponding to each pixel point in the connected domain.
In this embodiment, the differences between the RGB values of the second target pixel point and those of the first target pixel point corresponding to each pixel point in the connected domain are calculated; each pixel point in the connected domain thus has 3 primary color difference values, and the maximum of the 3 is selected as the primary color difference index of the corresponding pixel point. The calculation formula of the primary color difference index is:

$$pp = \frac{\max\!\left(R_{2}-R_{1},\; G_{2}-G_{1},\; B_{2}-B_{1}\right)}{\varepsilon}$$

wherein $pp$ is the primary color difference index of each pixel point in the connected domain corresponding to each edge; $R_{2}$, $G_{2}$ and $B_{2}$ are the $R$, $G$ and $B$ values of the second target pixel point corresponding to each pixel point in the connected domain; $R_{1}$, $G_{1}$ and $B_{1}$ are the $R$, $G$ and $B$ values of the first target pixel point corresponding to each pixel point in the connected domain; $\varepsilon$ is a non-zero hyper-parameter; and $\max$ is the maximum-value function.
It should be noted that $\varepsilon$ is a non-zero hyper-parameter whose function is to adjust the value range of the formula; its empirical value is 255. Each pixel point in the connected domain corresponding to each edge has a corresponding primary color difference index, and the larger the primary color difference index, the higher the significance of the edge chroma determined subsequently.
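A sketch of steps (3-3-1-1) and (3-3-1-2) for a single connected-domain pixel, assuming the `region_bgr` array from the preprocessing sketch. The embodiment only says "larger" and "smaller" R, G and B values; ranking the window pixels by their channel sum is an assumption of this sketch.

```python
EPS = 255.0   # the non-zero hyper-parameter, empirical value 255

def primary_color_difference(img_bgr, y, x, half=2):
    """pp = max(R2-R1, G2-G1, B2-B1) / eps for one domain pixel."""
    win = img_bgr[max(0, y - half):y + half + 1,
                  max(0, x - half):x + half + 1].reshape(-1, 3).astype(int)
    first = win[np.argmin(win.sum(axis=1))]    # smaller R, G, B values
    second = win[np.argmax(win.sum(axis=1))]   # larger R, G, B values
    return float((second - first).max()) / EPS
```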
(3-3-1-3) determining the median value of the primary color difference index of the connected domain corresponding to each edge according to the primary color difference index of each pixel point in the connected domain corresponding to each edge, and taking the median value of the primary color difference index as the furnace wall edge chroma significant coefficient corresponding to the corresponding edge.
In this embodiment, based on the primary color difference index of each pixel point in the connected domain corresponding to each edge obtained in step (3-3-1-2), the median value of the primary color difference indexes of the connected domain is calculated; the median is obtained by arranging the primary color difference indexes of the pixel points in order of magnitude to form a sequence and taking the index at the middle position of the sequence. The calculation formula of the furnace wall edge chroma significant coefficient is:

$$\Delta c = \operatorname{median}\!\left(pp_{1}, pp_{2}, \ldots, pp_{b}\right)$$

wherein $\Delta c$ is the furnace wall edge chroma significant coefficient corresponding to each edge, $pp_{1}, pp_{2}, \ldots, pp_{b}$ is the sequence formed by the primary color difference indexes of the pixel points in the connected domain corresponding to each edge, $b$ is the number of primary color difference indexes of the connected domain, and $\operatorname{median}$ is the median function.
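The chroma coefficient then reduces to a median over the domain, as in this short sketch continuing the helper above (`domain_pixels` is assumed to be one entry of `edge_domains`):

```python
pp_values = [primary_color_difference(region_bgr, y, x)
             for y, x in domain_pixels]
delta_c = float(np.median(pp_values))   # furnace wall edge chroma coefficient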
It should be noted that at the rough parts of the inner wall of the shuttle kiln the difference between the two sides of an edge corresponding to a stripe is obvious, whereas the difference across the edge formed by the flame outer flame and the furnace wall background is not obvious. Therefore, the larger the primary color difference indexes in the connected domain corresponding to an edge, the larger the furnace wall edge chroma significant coefficient $\Delta c$ of that edge; and the larger $\Delta c$ is, the less likely the edge is a segmentation edge between the flame outer flame and the furnace wall background.
And (3-3-2) determining the furnace wall edge roughness significant coefficient corresponding to each edge according to the gray value of each pixel point in the connected domain expansion region corresponding to each edge.
First, it should be noted that a smoke abatement device is provided in the shuttle kiln; although flame combustion generates smoke and dust, their effect is weak, the layers within the flame are uniform, and the vicinity of the segmentation edge formed by the flame outer flame and the furnace wall background is smooth. The texture distribution of the inner wall of the shuttle kiln, in contrast, is rough, so the vicinity of each edge of the furnace wall background is rough. The significance of the furnace wall edge roughness corresponding to each edge is therefore determined by analyzing the texture characteristics in the connected domain expansion region corresponding to each edge, through the following steps:
(3-3-2-1) determining the energy value of each pixel point in the connected domain expansion region according to the gray value of each pixel point in the connected domain expansion region corresponding to each edge.
In this embodiment, based on the connected domain expansion region corresponding to each edge obtained in step (3-1), the gray value of each pixel point of that region in the flame gray image is obtained. Within the connected domain expansion region corresponding to each edge, the Laws texture measurement method is used to obtain the energy value of each pixel point, and the energy values of the pixel points in the region are recorded as $I_{1}, I_{2}, \ldots, I_{M}$, where $M$ is the number of pixel points in the connected domain expansion region corresponding to each edge. The implementation of the Laws texture measurement method is prior art, is not within the protection scope of the present invention, and is not described in detail here.
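A sketch of one common Laws texture energy computation, assuming SciPy; the choice of the E5L5 mask and the 15 x 15 energy window are conventional defaults, not values stated in this embodiment.

```python
from scipy.ndimage import convolve, uniform_filter

L5 = np.array([1, 4, 6, 4, 1], float)    # level vector
E5 = np.array([-1, -2, 0, 2, 1], float)  # edge vector

def laws_energy(gray):
    """Per-pixel Laws texture energy of a gray image."""
    img = gray.astype(float)
    img -= uniform_filter(img, size=15)          # remove local illumination
    response = convolve(img, np.outer(E5, L5))   # E5L5 2-D Laws mask
    return uniform_filter(np.abs(response), size=15)

energy = laws_energy(region_to_process)
# Energy values I_1..I_M of one edge's expansion region, and their mean:
# exp_pixels = expansion_region(domain_pixels, region_to_process.shape)
# delta_r = energy[exp_pixels[:, 0], exp_pixels[:, 1]].mean()
```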
(3-3-2-2) calculating the mean value of the energy values of the connected domain expansion regions corresponding to the edges according to the energy values of the pixels in the connected domain expansion regions corresponding to the edges, and taking the mean value of the energy values as the furnace wall edge roughness significant coefficients corresponding to the corresponding edges.
In this embodiment, based on the energy values $I_{1}, I_{2}, \ldots, I_{M}$ of the pixel points in the connected domain expansion region corresponding to each edge obtained in step (3-3-2-1), the mean energy value of the connected domain expansion region is calculated and taken as the furnace wall edge roughness significant coefficient. The calculation formula of the furnace wall edge roughness significant coefficient corresponding to each edge is:

$$\Delta r = \frac{1}{M}\sum_{n=1}^{M} I_{n}$$

wherein $\Delta r$ is the furnace wall edge roughness significant coefficient of each edge, $M$ is the number of pixel points in the connected domain expansion region corresponding to each edge ($M$ is non-zero), and $I_{n}$ is the energy value of the $n$-th pixel point in the connected domain expansion region corresponding to each edge.
It should be noted that the rougher the texture distribution of the inner wall of the shuttle kiln to be detected, the larger the mean energy value of the connected domain expansion region corresponding to the edge, i.e., the larger the furnace wall edge roughness significant coefficient $\Delta r$; and the larger $\Delta r$ is, the less likely the edge is a segmentation edge between the flame outer flame and the furnace wall background.
Thus, the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge are obtained, and the furnace wall edge significant coefficient corresponding to each edge is calculated according to the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge.
In this embodiment, the significance of each edge as a background edge of the furnace wall is analyzed from three angles to determine a probability indicator of each edge as a background edge of the furnace wall, which is the furnace wall edge significance coefficient in this embodiment. Specifically, the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge are multiplied, and the multiplied numerical value is used as the furnace wall edge significant coefficient of the corresponding edge, and the calculation formula is as follows:
$$FWE = L \cdot C \cdot R$$

wherein $FWE$ is the furnace wall edge significant coefficient corresponding to each edge, $L$ is the furnace wall edge straight line significant coefficient corresponding to each edge, $C$ is the furnace wall edge chroma significant coefficient corresponding to each edge, and $R$ is the furnace wall edge roughness significant coefficient corresponding to each edge.
It should be noted that when the furnace wall edge straight line significant coefficient $L$, the furnace wall edge chroma significant coefficient $C$ and the furnace wall edge roughness significant coefficient $R$ of a certain edge are all larger, the furnace wall edge significant coefficient $FWE$ of that edge is larger, that is, the edge is more likely to be an edge of the furnace wall background.
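A one-line combination of the three coefficients, shown only to make the data flow concrete; the three inputs are assumed to come from the preceding steps, and the function name is illustrative:

```python
def furnace_wall_edge_coefficient(L: float, C: float, R: float) -> float:
    """Furnace wall edge significant coefficient: the product of the
    straight-line, chroma and roughness significant coefficients."""
    return L * C * R
```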
(4) And carrying out corner point detection processing on each edge in the edge region to be processed to obtain a plurality of corner point target clusters corresponding to each edge, and further obtaining a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge.
In this embodiment, corner detection is performed on each edge in the edge region to be processed to obtain a plurality of corners corresponding to each edge, and the corners corresponding to each edge are clustered by a DBSCAN clustering algorithm with a radius of 3, so that each edge forms a plurality of corner clusters. The implementation processes of corner detection and the DBSCAN clustering algorithm are both prior art, are not within the protection scope of the present invention, and are not described in detail here. Corner clusters containing more than 2 corners are selected from the corner clusters corresponding to each edge, and these are called corner target clusters, thereby obtaining the $K$ corner target clusters corresponding to each edge. Obtaining a plurality of flame sharp corner glide gradients and a plurality of corner distance indexes corresponding to each edge according to the plurality of corner target clusters corresponding to each edge comprises the following steps:
and (4-1) determining a flame tip angle vertex angle area of each corner point target cluster corresponding to each edge according to the position of each corner point in the plurality of corner point target clusters corresponding to each edge, and determining a plurality of corner point distance indexes corresponding to each edge.
(4-1-1) Determining the flame sharp corner vertex angle region of each corner target cluster corresponding to each edge. The flame in a flame image of the shuttle kiln sintering process is not regular in shape, and a plurality of flame sharp corners can appear at the top of the flame; each such sharp corner can carry a plurality of corner points, and the more corner points in a corner target cluster, the higher the combustion degree of the flame sharp corner corresponding to that cluster. Based on this analysis, the approximate position of the corresponding flame sharp corner can be obtained from the positions of the corner points in the corner target cluster: the convex hull of each corner target cluster is determined from the positions of its corner points, and the convex hull region of each corner target cluster corresponding to each edge is taken as the flame sharp corner vertex angle region. The process of determining a convex hull is prior art, is not within the protection scope of the present invention, and will not be described in detail here.
(4-1-2) Determining a plurality of corner distance indexes corresponding to each edge. According to the positions of the corner points in the plurality of corner target clusters corresponding to each edge, the distance between every two adjacent corner points in each corner target cluster is calculated, giving the corner distances of each corner target cluster. The mean value of the corner distances of each corner target cluster is then calculated from these corner distances and their number, and this mean value is taken as the corner distance index. Each corner target cluster has a corresponding corner distance index, so each edge can correspond to a plurality of corner distance indexes, as illustrated in the sketch below.
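The following sketch illustrates steps (4), (4-1-1) and (4-1-2) under stated assumptions: Shi–Tomasi corners stand in for the unspecified corner detector, scikit-learn supplies DBSCAN with eps=3, and "adjacent" corners are taken as consecutive points after sorting by abscissa; none of these choices is fixed by the patent text.

```python
import numpy as np
import cv2
from sklearn.cluster import DBSCAN

def corner_target_clusters(edge_mask, eps=3.0, min_corners=3):
    """Detect corners on one edge, cluster them with DBSCAN (radius 3),
    and keep clusters with more than 2 corners as corner target clusters."""
    pts = cv2.goodFeaturesToTrack(edge_mask, maxCorners=200,
                                  qualityLevel=0.01, minDistance=2)
    if pts is None:
        return []
    pts = pts.reshape(-1, 2)
    labels = DBSCAN(eps=eps, min_samples=1).fit_predict(pts)
    clusters = [pts[labels == k] for k in set(labels)]
    return [c for c in clusters if len(c) >= min_corners]

def apex_region_and_distance_index(cluster):
    """Convex hull of a corner target cluster = flame sharp corner vertex
    angle region; mean distance between consecutive corners = distance index."""
    hull = cv2.convexHull(cluster.astype(np.float32))
    order = cluster[np.argsort(cluster[:, 0])]          # sort by abscissa
    gaps = np.linalg.norm(np.diff(order, axis=0), axis=1)
    return hull, float(gaps.mean())
```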
And (4-2) determining flame sharp corner valley bottom areas corresponding to any two adjacent flame sharp corner vertex angle areas according to the positions of all pixel points between any two adjacent flame sharp corner vertex angle areas corresponding to each edge.
In this embodiment, it is determined that at least one valley-like region exists between two adjacent flame sharp corner vertex angle regions corresponding to an edge. Based on the positions of the pixel points between any two adjacent flame sharp corner vertex angle regions corresponding to each edge, the pixel point with the largest vertical coordinate in the flame edge image between the two adjacent vertex angle regions is selected, and its vertical coordinate is recorded as $v$; if a plurality of pixel points share the maximum vertical coordinate, all of them are selected. Taking the selected pixel point(s) with the maximum vertical coordinate as the starting point, the edge pixel points connected to the starting point whose vertical coordinates fall within the interval $[v-w, v]$ between the two adjacent vertex angle regions are traversed, where $w$ is 10. A convex hull is constructed from the positions of these edge pixel points, and this convex hull region is called the flame sharp corner valley bottom region.
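A sketch of this valley-bottom search, assuming image coordinates where y grows downward (so the valley bottom has the largest ordinate), an 8-connected flood fill from the lowest edge pixel, and an `x_range` argument giving the abscissa span between the two vertex angle regions; these framing details are assumptions:

```python
import numpy as np
import cv2
from collections import deque

def valley_bottom_region(edge_mask, x_range, w=10):
    """Convex hull of edge pixels connected to the lowest edge pixel
    between two apex regions, restricted to ordinates in [v - w, v]."""
    ys, xs = np.nonzero(edge_mask)
    between = (xs >= x_range[0]) & (xs <= x_range[1])
    ys, xs = ys[between], xs[between]
    v = ys.max()                               # largest ordinate = valley bottom
    seeds = [(y, x) for y, x in zip(ys, xs) if y == v]

    keep = ys >= v - w                         # ordinate interval [v - w, v]
    allowed = set(zip(ys[keep].tolist(), xs[keep].tolist()))
    seen, queue = set(seeds), deque(seeds)
    while queue:                               # BFS over 8-connected edge pixels
        y, x = queue.popleft()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                nb = (y + dy, x + dx)
                if nb in allowed and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
    pts = np.array([(x, y) for y, x in seen], dtype=np.float32)
    return cv2.convexHull(pts)
```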
And (4-3) determining the downward sliding gradient of the flame sharp angles corresponding to each edge according to the positions of the mass centers in any two adjacent flame sharp angle vertex areas corresponding to each edge and the positions of the mass centers in the corresponding flame sharp angle valley bottom areas.
In this embodiment, based on any two adjacent flame sharp corner vertex angle regions corresponding to each edge and the corresponding flame sharp corner valley bottom region, distance disturbance and angle disturbance are set, and OpenCV (Open Source Computer Vision Library) recognition technology is used to obtain the positions of the centroids of any two adjacent flame sharp corner vertex angle regions and of the corresponding flame sharp corner valley bottom region. According to these centroid positions, the slopes between each of the two adjacent vertex angle regions and the corresponding valley bottom region are determined, that is, the plurality of flame sharp corner glide gradients corresponding to each edge are calculated. The calculation formula is:
$$S = \left\lceil \frac{1}{2}\left( \left|\frac{y_1 - y_0}{x_1 - x_0}\right| + \left|\frac{y_2 - y_0}{x_2 - x_0}\right| \right) \right\rceil$$

wherein $S$ is a flame sharp corner glide gradient corresponding to the edge, $y_1$ and $x_1$ are the ordinate and abscissa of the centroid of one of the two adjacent flame sharp corner vertex angle regions, $y_2$ and $x_2$ are the ordinate and abscissa of the centroid of the other adjacent flame sharp corner vertex angle region, $y_0$ and $x_0$ are the ordinate and abscissa of the centroid of the flame sharp corner valley bottom region corresponding to the two adjacent vertex angle regions, and $\lceil \cdot \rceil$ denotes rounding up.
It should be noted that every two adjacent flame sharp corner vertex angle regions of an edge, together with the corresponding flame sharp corner valley bottom region, yield one flame sharp corner glide gradient; if an edge corresponds to $l$ flame sharp corner vertex angle regions, the edge corresponds to $l-1$ flame sharp corner glide gradients, so each edge corresponds to a plurality of glide gradients. The larger the difference between the ordinate of the centroid of a vertex angle region and the ordinate of the centroid of the valley bottom region, the larger the flame sharp corner glide gradient; the smaller this difference, the smaller the glide gradient.
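A sketch of the gradient computation under the reconstruction above; centroids via image moments stand in for the unspecified OpenCV recognition step, and the averaging of the two slopes plus the ε guard against vertical alignment are assumptions:

```python
import math
import cv2

def centroid(region_mask):
    """Centroid (x, y) of a binary region via image moments."""
    m = cv2.moments(region_mask, binaryImage=True)
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def glide_gradient(apex1_mask, apex2_mask, valley_mask, eps=1e-6):
    """Flame sharp corner glide gradient between two adjacent vertex
    angle regions and their shared valley bottom region."""
    (x1, y1), (x2, y2) = centroid(apex1_mask), centroid(apex2_mask)
    x0, y0 = centroid(valley_mask)
    s1 = abs((y1 - y0) / (x1 - x0 + eps))   # slope apex 1 -> valley
    s2 = abs((y2 - y0) / (x2 - x0 + eps))   # slope apex 2 -> valley
    return math.ceil(0.5 * (s1 + s2))       # round up, per the formula
```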
(5) And determining the flame outer flame edge significant coefficient corresponding to each edge according to the plurality of corner point target clusters corresponding to each edge, the plurality of flame sharp corner gliding gradients and the plurality of corner point distance indexes.
In this embodiment, the total number of corners in the plurality of corner target clusters corresponding to each edge, the number sequence composed of the plurality of flame sharp corner glide gradients, and the plurality of corner distance indexes are obtained, and a data modeling method is used to construct the flame outer flame edge significant coefficient corresponding to each edge. Two situations exist when constructing this coefficient: either the edge has corresponding corner target clusters, or it does not. The calculation formula is as follows:
$$OFE = \begin{cases} N \cdot \max(S) \cdot \displaystyle\sum_{i=1}^{K} d_i, & K > 0 \\[1ex] 0, & K = 0 \end{cases}$$

wherein $OFE$ is the flame outer flame edge significant coefficient corresponding to each edge, $N$ is the total number of corners in the corner target clusters corresponding to each edge, $S$ is the number sequence composed of the plurality of flame sharp corner glide gradients corresponding to each edge, $d_i$ is the $i$-th corner distance index corresponding to each edge, $K$ is the number of corner target clusters corresponding to each edge, and $\max$ is the maximum-value function.
It should be noted that the larger the total number $N$ of corners in the corner target clusters of a certain edge, the more corner distance indexes there are, and the larger the flame sharp corner glide gradients, the larger the flame outer flame edge significant coefficient $OFE$ of the edge, that is, the more likely the edge is to be a segmentation edge between the flame outer flame and the furnace wall background.
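A sketch matching the piecewise reconstruction above; the exact combination of $N$, $\max(S)$ and the distance indexes is the assumption flagged earlier:

```python
def outer_flame_edge_coefficient(n_corners, glide_gradients, distance_indexes):
    """Flame outer flame edge significant coefficient OFE.
    Returns 0 when the edge has no corner target clusters (K == 0)."""
    if not distance_indexes:
        return 0.0
    return n_corners * max(glide_gradients) * sum(distance_indexes)
```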
(6) And determining the flame segmentation edge significant coefficient corresponding to each edge according to the furnace wall edge significant coefficient and the flame outer flame edge significant coefficient corresponding to each edge, and further determining the target flame area image.
In this embodiment, the flame segmentation edge significant coefficient corresponding to each edge is determined by calculating the ratio of the flame outer flame edge significant coefficient to the furnace wall edge significant coefficient of each edge, and the calculation formula is as follows:
$$FSE = \frac{OFE}{FWE}$$

wherein $FSE$ is the flame segmentation edge significant coefficient corresponding to each edge, $OFE$ is the flame outer flame edge significant coefficient corresponding to each edge, and $FWE$ is the furnace wall edge significant coefficient corresponding to each edge.
It should be noted that each edge in the edge region to be processed has its corresponding flame segmentation edge significant coefficient. When the flame outer flame edge significant coefficient $OFE$ of a certain edge is larger and its furnace wall edge significant coefficient $FWE$ is smaller, the flame segmentation edge significant coefficient $FSE$ of the edge is larger, that is, the edge is more likely to be a segmentation edge between the flame outer flame and the furnace wall background.
After obtaining the flame segmentation edge significant coefficient corresponding to each edge, determining a target flame area image according to the flame segmentation edge significant coefficient corresponding to each edge, wherein the method comprises the following steps of:
and (6-1) calculating a flame segmentation edge significant coefficient mean value according to the flame segmentation edge significant coefficient corresponding to each edge, and taking the flame segmentation edge significant coefficient mean value as a flame segmentation edge threshold value.
In order to facilitate the subsequent screening of the segmentation edges between the flame outer flame and the furnace wall background from the edges to be distinguished, a flame segmentation edge threshold needs to be determined from the flame segmentation edge significant coefficients corresponding to the edges. Because this threshold is determined adaptively from the mean value of the flame segmentation edge significant coefficients of all edges, it is a more reliable reference and is more conducive to subsequently obtaining accurate segmentation edges between the flame outer flame and the furnace wall background.
(6-2) if the flame segmentation edge significant coefficient corresponding to a certain edge is greater than the flame segmentation edge threshold value, judging that the edge is the segmentation edge of the flame outer flame and the furnace wall background, otherwise, judging that the edge is not the segmentation edge of the flame outer flame and the furnace wall background, and further obtaining a plurality of segmentation edges of the flame outer flame and the furnace wall background.
When the flame segmentation edge significant coefficient $FSE$ of a certain edge is greater than the flame segmentation edge threshold $T$, the edge is recorded as a segmentation edge between the flame outer flame and the furnace wall background; when the $FSE$ of a certain edge is less than the threshold $T$, the edge is recorded as a disturbance edge, that is, the edge is not a segmentation edge between the flame outer flame and the furnace wall background. In this way, the plurality of segmentation edges between the flame outer flame and the furnace wall background in the edge region to be processed are obtained.
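A sketch of steps (6-1) and (6-2), assuming the per-edge coefficients are already available as parallel lists and adding an ε guard against a zero denominator:

```python
import numpy as np

def select_segmentation_edges(edges, fwe, ofe, eps=1e-6):
    """Keep edges whose flame segmentation edge coefficient FSE = OFE / FWE
    exceeds the adaptive threshold (mean FSE over all edges)."""
    fse = np.asarray(ofe, dtype=float) / (np.asarray(fwe, dtype=float) + eps)
    threshold = fse.mean()                  # flame segmentation edge threshold
    return [e for e, s in zip(edges, fse) if s > threshold]
```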
And (6-3) determining a target flame area image according to a plurality of segmentation edges of the flame outer flame and the background of the furnace wall and the to-be-processed area of the flame gray level image.
In this embodiment, based on the plurality of segmentation edges between the flame outer flame and the furnace wall background in the edge region to be processed obtained in step (6-2), and on the to-be-processed region of the flame gray level image from step (1-2), the flame outer flame region in the to-be-processed region of the flame gray level image is extracted using these segmentation edges. The flame outer flame region includes the flame inner flame region, and the flame outer flame region in the to-be-processed region is taken as the target flame region image.
(7) And determining the current sintering condition of the shuttle kiln to be detected according to the target flame area image and the pre-constructed and trained working condition detection neural network.
Inputting the target flame area image into the pre-constructed and trained working condition detection neural network outputs the current sintering working condition of the shuttle kiln to be detected. The sintering working condition of this embodiment has four stages: a low-temperature preheating stage from normal temperature to 300 ℃, an oxidative decomposition stage from 300 ℃ to 950 ℃, a high-temperature sintering stage from 950 ℃ to 1300 ℃, and a heat preservation stage at about 1300 ℃; the normal temperature can be set by the implementer according to local actual conditions.
The framework of the working condition detection neural network adopts the convolutional neural network ResNet (Deep Residual Network), the optimization algorithm adopts the AdaGrad algorithm, the loss function is the cross entropy loss function, and the training image set is a plurality of target flame region images. The construction and training process of the working condition detection neural network is prior art, is not within the protection scope of the present invention, and will not be described in detail here.
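A minimal PyTorch sketch of this setup: the ResNet backbone, AdaGrad optimizer and cross-entropy loss come from the text above, while the ResNet-18 depth, the four-class head and all hyper-parameters are illustrative assumptions:

```python
import torch
import torch.nn as nn
from torchvision import models

# ResNet backbone with a 4-way head (one class per sintering stage).
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 4)

optimizer = torch.optim.Adagrad(model.parameters(), lr=0.01)  # AdaGrad
criterion = nn.CrossEntropyLoss()                             # cross entropy

def train_step(images, labels):
    """One optimization step over a batch of target flame region images."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```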
The training image set may be acquired as follows: an industrial camera is used to photograph flame images of the sintering area of a ceramic shuttle kiln to obtain flame image samples, and a K-type thermocouple combined with an infrared thermometer is used to measure the corresponding internal temperature of the kiln. Shooting starts from 50 ℃; every time the temperature rises by 5 ℃, a group of in-kiln flame images is captured (about 20 images per group), until 1300 ℃ is reached, giving more than 5000 shuttle kiln flame images with corresponding temperatures in total, that is, 250 groups of image data samples with different temperature values. To prevent overfitting of the network and to improve training accuracy, the training image set used for network training should be as large as possible. However, manually collecting flame images is laborious and easily disturbed by kiln smoke, so a data enhancement method is used to enlarge the data volume. According to the actual image acquisition conditions, 5 data enhancement methods are selected: adding noise, increasing image brightness, rotation, flipping and scaling. The 250 groups of image data samples with different temperature values are processed so that each group is expanded to 100 images, giving a training image set of 25000 flame images in total. Based on the 25000 collected flame images and using image recognition technology, the target flame region images corresponding to the 25000 flame images are obtained by following the process of determining the target flame region image in steps (1) to (6), and these target flame region images are used as training samples for the working condition detection neural network.
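A sketch of the five augmentation operations using OpenCV and NumPy; the noise level, brightness offset, rotation angle and scale factor are illustrative assumptions, not values from the patent:

```python
import numpy as np
import cv2

def augment(img, rng=np.random.default_rng()):
    """Apply the five augmentations named above: noise, brightness,
    rotation, flip, scaling. Returns one augmented copy per method."""
    h, w = img.shape[:2]
    noisy = np.clip(img + rng.normal(0, 10, img.shape), 0, 255).astype(np.uint8)
    bright = cv2.convertScaleAbs(img, alpha=1.0, beta=30)      # +30 brightness
    rot_m = cv2.getRotationMatrix2D((w / 2, h / 2), 15, 1.0)   # rotate 15 deg
    rotated = cv2.warpAffine(img, rot_m, (w, h))
    flipped = cv2.flip(img, 1)                                 # horizontal flip
    scaled = cv2.resize(img, None, fx=1.2, fy=1.2)             # scale up 1.2x
    return [noisy, bright, rotated, flipped, scaled]
```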
Therefore, the method uses data identification technology to detect the current sintering condition of the shuttle kiln to be detected, that is, to determine which of the four temperature stages the current sintering condition is in, and the determination of the target flame region image effectively improves the accuracy of shuttle kiln sintering condition detection.
The above-mentioned embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A shuttle kiln sintering condition image identification method is characterized by comprising the following steps:
acquiring a flame image of a shuttle kiln to be detected during sintering in real time, and performing pretreatment operation on the flame image to obtain a to-be-treated area of a flame gray image;
carrying out edge detection processing on the to-be-processed area of the flame gray level image to obtain an edge region to be processed, and further obtaining the connected domain corresponding to each edge in the edge region to be processed;
determining the furnace wall edge straight line significant coefficient, furnace wall edge chroma significant coefficient and furnace wall edge roughness significant coefficient corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge region to be processed, and further determining the furnace wall edge significant coefficient corresponding to each edge;
carrying out corner point detection processing on each edge in the edge area to be processed to obtain a plurality of corner point target clusters corresponding to each edge, and further obtaining a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge;
determining a flame outer flame edge significant coefficient corresponding to each edge according to a plurality of corner point target clusters, a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge;
determining a flame segmentation edge significant coefficient corresponding to each edge according to the furnace wall edge significant coefficient and the flame outer flame edge significant coefficient corresponding to each edge, and further determining a target flame region image;
and determining the current sintering condition of the shuttle kiln to be detected according to the target flame area image and the pre-constructed and trained working condition detection neural network.
2. The shuttle kiln sintering condition image recognition method as claimed in claim 1, wherein the step of determining the furnace wall edge straight line significant coefficient, the furnace wall edge chroma significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge comprises:
determining a fitting straight line corresponding to each edge, fitting goodness of the fitting straight line, a window area corresponding to each pixel point in the connected domain corresponding to each edge and a connected domain expansion area corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge area to be processed;
determining a fitting expansion area corresponding to each pixel point of the fitting straight line according to the fitting straight line corresponding to each edge, and further determining furnace wall edge straight line significant coefficients corresponding to each edge according to the goodness of fit of the fitting straight line corresponding to each edge, the fitting expansion area corresponding to each pixel point of the fitting straight line and each pixel point in a connected domain corresponding to each edge;
determining the furnace wall edge chroma significant coefficient corresponding to each edge according to the RGB value of each pixel point in the window area corresponding to each pixel point in the connected domain corresponding to each edge, and determining the furnace wall edge roughness significant coefficient corresponding to each edge according to the gray value of each pixel point in the connected domain expansion region corresponding to each edge.
3. The shuttle kiln sintering condition image recognition method as claimed in claim 2, wherein the step of determining the furnace wall edge straight line significant coefficient corresponding to each edge comprises the steps of:
counting the number of outlier pixels corresponding to each edge according to each pixel point in the fitting expansion region corresponding to each pixel point of the fitting straight line corresponding to each edge and each pixel point in the connected domain corresponding to each edge;
determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the number of pixel points in the connected domain corresponding to each edge, the number of outlier pixel points and the fitting goodness of the fitting straight line, wherein the calculation formula is as follows:
$$L = r^2 \times \left(1 - \frac{a}{b}\right)$$

wherein $L$ is the furnace wall edge straight line significant coefficient corresponding to each edge, $r^2$ is the goodness of fit of the fitting straight line corresponding to each edge, $a$ is the number of outlier pixel points corresponding to each edge, and $b$ is the number of pixel points in the connected domain corresponding to each edge.
4. The shuttle kiln sintering condition image recognition method as claimed in claim 2, wherein the step of determining the furnace wall edge chroma significant coefficient corresponding to each edge comprises the steps of:
determining a first target pixel point and a second target pixel point corresponding to each pixel point in the connected domain according to the RGB values of the pixel points in the window area corresponding to each pixel point in the connected domain corresponding to each edge;
determining the primary color difference index of each pixel point in the connected domain corresponding to each edge according to the RGB values of the first target pixel point and the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge;
and determining the primary color difference index median value of the connected domain corresponding to each edge according to the primary color difference index of each pixel point in the connected domain corresponding to each edge, and taking the primary color difference index median value as the furnace wall edge chroma significant coefficient corresponding to the corresponding edge.
5. The shuttle kiln sintering condition image identification method as claimed in claim 4, wherein the calculation formula for determining the primary color difference index of each pixel point in the connected domain corresponding to each edge is as follows:
$$pp = \frac{\max\left(\left|R_2 - R_1\right|,\ \left|G_2 - G_1\right|,\ \left|B_2 - B_1\right|\right)}{\left|R_2 - R_1\right| + \left|G_2 - G_1\right| + \left|B_2 - B_1\right| + \varepsilon}$$

wherein $pp$ is the primary color difference index of each pixel point in the connected domain corresponding to each edge; $R_2$, $G_2$ and $B_2$ are the $R$, $G$ and $B$ values of the RGB value of the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge; $R_1$, $G_1$ and $B_1$ are the $R$, $G$ and $B$ values of the RGB value of the first target pixel point corresponding to each pixel point in the connected domain corresponding to each edge; $\varepsilon$ is a non-zero hyper-parameter; and $\max$ is the maximum-value function.
6. The shuttle kiln sintering condition image recognition method as claimed in claim 2, wherein the step of determining the furnace wall edge roughness significant coefficient corresponding to each edge comprises:
determining the energy value of each pixel point in the connected domain expansion area according to the gray value of each pixel point in the connected domain expansion area corresponding to each edge;
and calculating the energy value mean value of the connected domain expansion region corresponding to each edge according to the energy value of each pixel point in the connected domain expansion region corresponding to each edge, and taking the energy value mean value as the furnace wall edge roughness significant coefficient corresponding to the corresponding edge.
7. The shuttle kiln sintering condition image identification method as claimed in claim 1, wherein the step of obtaining a plurality of flame sharp corner downslide gradients and a plurality of corner point distance indexes corresponding to each edge comprises:
determining a flame tip angle vertex angle area of each corner point target cluster corresponding to each edge according to the position of each corner point in a plurality of corner point target clusters corresponding to each edge, and determining a plurality of corner point distance indexes corresponding to each edge;
determining the flame sharp corner valley bottom region corresponding to any two adjacent flame sharp corner vertex angle regions according to the position of each pixel point between any two adjacent flame sharp corner vertex angle regions corresponding to each edge;
and determining the gliding gradient of the flame sharp angles corresponding to each edge according to the positions of the mass centers in any two adjacent flame sharp angle vertex areas corresponding to each edge and the positions of the mass centers in the flame sharp angle valley bottom areas corresponding to the edge.
8. The shuttle kiln sintering condition image identification method according to claim 7, wherein the calculation formula for determining the downward sliding gradient of the sharp angle of the flame corresponding to each edge is as follows:
$$S = \left\lceil \frac{1}{2}\left( \left|\frac{y_1 - y_0}{x_1 - x_0}\right| + \left|\frac{y_2 - y_0}{x_2 - x_0}\right| \right) \right\rceil$$

wherein $S$ is a flame sharp corner glide gradient corresponding to each edge, $y_1$ and $x_1$ are the ordinate and abscissa of the centroid of one of any two adjacent flame sharp corner vertex angle regions, $y_2$ and $x_2$ are the ordinate and abscissa of the centroid of the other of the two adjacent flame sharp corner vertex angle regions, $y_0$ and $x_0$ are the ordinate and abscissa of the centroid of the flame sharp corner valley bottom region corresponding to the two adjacent vertex angle regions, and $\lceil \cdot \rceil$ denotes rounding up.
9. The shuttle kiln sintering condition image recognition method as claimed in claim 1, wherein the calculation formula for determining the flame outer flame edge significant coefficient corresponding to each edge is as follows:
$$OFE = \begin{cases} N \cdot \max(S) \cdot \displaystyle\sum_{i=1}^{K} d_i, & K > 0 \\[1ex] 0, & K = 0 \end{cases}$$

wherein $OFE$ is the flame outer flame edge significant coefficient corresponding to each edge, $N$ is the total number of corners in the corner target clusters corresponding to each edge, $S$ is the number sequence composed of the plurality of flame sharp corner glide gradients corresponding to each edge, $d_i$ is the $i$-th corner distance index corresponding to each edge, $K$ is the number of corner target clusters corresponding to each edge, and $\max$ is the maximum-value function.
10. The shuttle kiln sintering condition image recognition method as claimed in claim 1, wherein the step of further determining the target flame area image comprises:
calculating a flame segmentation edge significant coefficient mean value according to the flame segmentation edge significant coefficient corresponding to each edge, and taking the flame segmentation edge significant coefficient mean value as a flame segmentation edge threshold value;
if the flame segmentation edge significant coefficient corresponding to a certain edge is larger than the flame segmentation edge threshold value, judging that the edge is the segmentation edge of the flame outer flame and the furnace wall background, otherwise, judging that the edge is not the segmentation edge of the flame outer flame and the furnace wall background, and further obtaining a plurality of segmentation edges of the flame outer flame and the furnace wall background;
and determining a target flame area image according to a plurality of segmentation edges of the flame outer flame and the background of the furnace wall and the to-be-processed area of the flame gray level image.