CN111753848B - Oil stain degree identification method and system - Google Patents

Oil stain degree identification method and system

Info

Publication number
CN111753848B
CN111753848B (application CN202010568398.4A)
Authority
CN
China
Prior art keywords
oil stain
texture
oil
degree
image
Prior art date
Legal status
Active
Application number
CN202010568398.4A
Other languages
Chinese (zh)
Other versions
CN111753848A (en)
Inventor
陈佳期
陈旭
李密
颜茂春
陈嘉华
罗伟华
Current Assignee
Fujian Strait Zhihui Technology Co ltd
Original Assignee
Fujian Strait Zhihui Technology Co ltd
Priority date
Filing date: 2020-06-19
Publication date: 2021-03-19
Application filed by Fujian Strait Zhihui Technology Co ltd
Priority to CN202010568398.4A
Publication of CN111753848A
Application granted
Publication of CN111753848B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • G06T5/30Erosion or dilatation, e.g. thinning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration using histogram techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • G06T7/41Analysis of texture based on statistical description of texture
    • G06T7/44Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an oil stain degree identification method and system. The method acquires an initial picture containing the oil stain to be identified and converts it into a grayscale image; direction-based valley detection extracts the oil stain texture features from the grayscale image, and thinning the binary image of those features yields an oil stain texture map; deep learning identification then retrieves texture maps in an oil stain texture library that are similar to the extracted map, and the oil stain degree of the initial picture is obtained by indexing the degree recorded for the matched library textures. The method can quickly identify and grade the degree of oil contamination in a picture, and is more efficient and more accurate than manual inspection.

Description

Oil stain degree identification method and system
Technical Field
The invention relates to the technical field of oil stain image recognition, in particular to a method and a system for recognizing oil stain degree.
Background
In some special application scenarios, such as kitchen air ducts in the catering industry and other oil-fume environments, heavy fume and the accumulated grease that attracts cockroaches and other insects greatly increase the environmental stress on components such as air conditioners. Although modern air conditioners and air vents have improved considerably at resisting oil fume, they still face a serious challenge in such a changeable environment. A section of duct separates the air vent from the air-conditioner body; fume entering through the vent contaminates the duct and degrades the quality of the discharged air, and because the duct cannot easily be disassembled for cleaning, user experience deteriorates further. At present, the sealed duct is difficult to disassemble freely and its internal oil stain condition cannot be observed in time, so a scheme that can automatically identify the degree of oil contamination is urgently needed.
Disclosure of Invention
To solve the problem that the prior art cannot automatically identify the degree of oil contamination, the invention provides an oil contamination degree identification method and system in which image processing and deep learning are combined to identify the oil contamination degree automatically.
In one aspect, the present invention provides a method for identifying the degree of oil contamination, comprising the steps of:
S1: acquiring an initial picture containing the oil stain to be identified, and converting the initial picture into a grayscale image;
S2: obtaining oil stain texture features in the grayscale image through direction-based valley detection, and thinning the binary image of the oil stain texture features to obtain an oil stain texture map;
S3: finding texture maps in an oil stain texture library that are similar to the oil stain texture map through deep learning identification, and obtaining the oil stain degree of the initial picture by indexing the oil stain degree of the texture maps in the library.
Preferably, thinning the binary image of the oil stain texture features in step S2 to obtain the oil stain texture map specifically includes: sequentially applying Otsu thresholding, contrast limiting, dilation and erosion, and an iterative thinning algorithm to the oil stain texture features to obtain the oil stain texture map. This image-processing chain yields a clear oil stain texture map, which makes the oil stain degree easy to judge.
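The following is a minimal OpenCV sketch of this preferred processing chain (contrast limiting, Otsu thresholding, dilation/erosion, iterative thinning). The parameter values, the ordering of CLAHE before Otsu, and the use of cv2.ximgproc.thinning (which requires the opencv-contrib-python package) are illustrative assumptions, not the patented implementation.

```python
import cv2
import numpy as np

def extract_texture_map(gray: np.ndarray) -> np.ndarray:
    """Turn a grayscale oil-stain image into a thinned texture map (illustrative values)."""
    # Contrast limiting (CLAHE) on the grayscale input; applied before Otsu here,
    # although the text lists Otsu first, because CLAHE operates on gray levels.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    equalized = clahe.apply(gray)

    # Otsu thresholding; the stain is darker than the background, so invert
    # to make the stain the white foreground.
    _, binary = cv2.threshold(equalized, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # Dilation followed by erosion (morphological closing) removes small gaps
    # and noise and connects adjacent texture fragments.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)

    # Iterative (Zhang-Suen) thinning down to a one-pixel-wide skeleton.
    skeleton = cv2.ximgproc.thinning(closed,
                                     thinningType=cv2.ximgproc.THINNING_ZHANGSUEN)
    return skeleton

# Example use (hypothetical file name):
# texture = extract_texture_map(cv2.imread("duct.jpg", cv2.IMREAD_GRAYSCALE))
```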
Further preferably, the contrast limiting uses a contrast-limited adaptive histogram equalization (CLAHE) algorithm, which avoids the noise amplification of ordinary histogram equalization.
Further preferably, the iterative thinning algorithm is the Zhang-Suen thinning algorithm. The thinned image of the oil stain texture obtained in this way facilitates deep learning training and recognition.
Preferably, step S3 is preceded by: classifying oil stain texture maps by similarity through deep learning training, constructing the oil stain texture library, and indexing the oil stain degree within the library. The constructed library provides the reference data for the deep learning identification.
Further preferably, the deep learning is based on a dense convolutional network (DenseNet), and the training model is a DenseNet121 model. Dense convolutional networks have the advantage of being narrower and having fewer parameters.
Preferably, the oil stain texture features include the orientation, length and density of the oil stain texture. Recognizing several texture features further improves the accuracy of the oil stain degree judgment.
According to a second aspect of the invention, a computer-readable storage medium is proposed, on which one or more computer programs are stored, which when executed by a computer processor implement the above-mentioned method.
According to a third aspect of the invention, a system for oil contamination level identification is proposed, the system comprising:
a preprocessing unit, configured to obtain an initial picture containing the oil stain to be identified and convert the initial picture into a grayscale image;
an oil stain texture map acquisition unit, configured to obtain the oil stain texture features in the grayscale image through direction-based valley detection and to thin the binary image of the oil stain texture features into an oil stain texture map; and
an oil stain degree judging unit, configured to find texture maps in an oil stain texture library that are similar to the oil stain texture map through deep learning identification, and to obtain the oil stain degree of the initial picture by indexing the oil stain degree of the texture maps in the library.
Preferably, the system further includes an oil stain texture library construction unit, configured to classify oil stain texture maps by similarity through deep learning training, construct the oil stain texture library, and index the oil stain degree within it.
The invention provides an oil stain degree identification method and system. The method uses direction-based valley detection to obtain the oil stain texture features in a grayscale image and thins the binary image of those features into an oil stain texture map; deep learning identification then finds similar texture maps in an oil stain texture library, and the oil stain degree of the initial picture is obtained by indexing the degree of the matched library textures. A deep neural network both classifies textures when the library is built and performs the final identification; compared with manual inspection the identification is more accurate and faster, and the method can be widely applied wherever oil stain identification is needed.
Drawings
The accompanying drawings are included to provide a further understanding of the embodiments and are incorporated in and constitute a part of this specification. They illustrate embodiments and, together with the description, serve to explain the principles of the invention. Other features, objects and advantages of the present application will become more apparent upon reading the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings, in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow chart of a method for identification of oil contamination level according to an embodiment of the present application;
FIG. 3 is a graph of the effect of oil stain image processing according to an embodiment of the present application;
FIG. 4 is a block diagram of an identification system for oil contamination level according to an embodiment of the present application;
FIG. 5 is a schematic block diagram of a computer system suitable for use in implementing an electronic device according to embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments and the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which the identification method for the degree of oil contamination of the embodiments of the present application may be applied.
As shown in FIG. 1, system architecture 100 may include a data server 101, a network 102, and a main server 103. Network 102 serves as the medium providing a communication link between data server 101 and main server 103. Network 102 may include various connection types, such as wired or wireless communication links, or fiber optic cables.
The main server 103 may be a server that provides various services, such as a data processing server that processes information uploaded by the data server 101. The data processing server can identify the oil stain degree.
It should be noted that the oil contamination level identification method provided in the embodiment of the present application is generally executed by the main server 103, and accordingly, the device for the oil contamination level identification method is generally disposed in the main server 103.
The data server and the main server may each be hardware or software. As hardware, each may be implemented as a distributed cluster of multiple servers or as a single server. As software, each may be implemented as multiple pieces of software or software modules (for example, modules used to provide distributed services) or as a single piece of software or software module.
It should be understood that the numbers of data servers, networks, and main servers in fig. 1 are merely illustrative. There may be any number of data servers, networks, and main servers, as required by the implementation.
Fig. 2 shows a flow chart of an identification method for the degree of oil contamination according to an embodiment of the present application. As shown in fig. 2, the method comprises the steps of:
s201: and acquiring an initial picture containing the oil stain degree to be identified, and converting the initial picture into a gray-scale image. The conversion of the gray-scale image can facilitate the image processing such as the binarization of the subsequent oil stain image, and the like, so as to obtain better recognition effect.
S202: obtain the oil stain texture features in the grayscale image through direction-based valley detection, and thin the binary image of the oil stain texture features to obtain an oil stain texture map. Binarization reduces the image to two colors, black and white; in many situations it allows the color and background information to be ignored while the more important shape information is retained, and because the amount of information is greatly reduced after binarization, further processing becomes easier. The oil stain image processing effect is shown in fig. 3.
In a specific embodiment, grayscale analysis of an oil stain image shows that the gray level of an oil stain area is low while the background gray level is relatively high, so a longitudinal cross-section generally has a valley shape and the oil stain lies at the valley positions. The direction-based valley detection proceeds as follows: 1) extract the valley area: for each pixel of the image f(x, y), compute the convolution sums of the 9 x 9 neighborhood centered on that pixel with 8 directional operators, and replace the gray value of the pixel with the maximum of the 8 convolution sums, giving a new valley feature matrix g(x, y); 2) coarse segmentation with threshold 0: threshold g(x, y) at 0, keeping values greater than 0 and setting values less than 0 to 0, which yields the coarsely segmented image h(x, y); 3) second coarse segmentation with the mean of the non-zero elements of h(x, y): count the non-zero elements of h(x, y), divide their sum by their count to obtain the non-zero mean, then threshold h(x, y) with this mean, replacing the gray value of any pixel larger than the mean with the mean and leaving the others unchanged, which yields the new segmented image k(x, y); 4) enhance k(x, y) with fuzzy enhancement and record the enhanced image as k'(x, y); 5) finally, finely segment k'(x, y) with the Niblack algorithm. This procedure separates the main area of the oil stain texture.
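A sketch of steps 1)-3) is given below. The exact 9 x 9 directional operators are not specified in the patent text, so the kernels here are illustrative valley (dark-line) detectors built by rotating a horizontal template; fuzzy enhancement and the Niblack fine segmentation (steps 4 and 5) are omitted.

```python
import cv2
import numpy as np

def directional_valley_map(gray: np.ndarray) -> np.ndarray:
    """Steps 1-3 of the direction-based valley detection (illustrative kernels)."""
    f = gray.astype(np.float32)

    # Horizontal valley template: dark center row flanked by brighter rows.
    base = np.zeros((9, 9), dtype=np.float32)
    base[4, :] = -2.0               # center row (the valley)
    base[2, :] = base[6, :] = 1.0   # flanking rows

    # Rotate the template to get 8 orientations (0 to 157.5 degrees).
    kernels = []
    center = (4.0, 4.0)
    for angle in np.arange(0.0, 180.0, 22.5):
        rot = cv2.getRotationMatrix2D(center, angle, 1.0)
        kernels.append(cv2.warpAffine(base, rot, (9, 9)))

    # 1) Valley feature matrix g: maximum directional response per pixel.
    responses = [cv2.filter2D(f, cv2.CV_32F, k) for k in kernels]
    g = np.max(np.stack(responses, axis=0), axis=0)

    # 2) Coarse segmentation with threshold 0: keep positive responses only.
    h = np.where(g > 0, g, 0.0)

    # 3) Second coarse segmentation: clip values above the mean of the
    #    non-zero responses down to that mean.
    nonzero = h[h > 0]
    if nonzero.size:
        mean = nonzero.mean()
        h = np.where(h > mean, mean, h)
    return h
```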
In a preferred embodiment, the accuracy of the valley detection can be improved with an algorithm based on the first-order differential operator of Canny edge detection. By processing the gradient with high and low hysteresis thresholds and applying non-maximum suppression to the preliminary gradient image, the Canny algorithm captures as much of the true gradient as possible while removing a large number of non-edge responses, making it possible to locate single-pixel oil stain edge features.
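A minimal Canny call is shown below; the hysteresis thresholds and the input file name are illustrative and would be tuned to the camera and lighting.

```python
import cv2

gray = cv2.imread("duct.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input file
# threshold2 is the high hysteresis threshold, threshold1 the low one;
# OpenCV applies non-maximum suppression internally.
edges = cv2.Canny(gray, threshold1=50, threshold2=150, apertureSize=3)
```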
In a specific embodiment, thinning the binary image of the oil stain texture features into an oil stain texture map specifically includes sequentially applying Otsu thresholding, contrast limiting, dilation and erosion, and an iterative thinning algorithm to the oil stain texture features. The Otsu algorithm splits the oil stain image at a candidate threshold into a foreground part and a background part, above and below the threshold respectively, and computes the between-class variance of the two parts; the larger the between-class variance, the larger the gray-level difference between them. Thresholds from 0 to 255 are traversed until the between-class variance reaches its maximum, and that threshold is used to convert the grayscale oil stain image into a binary image.
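The exhaustive search described above can be written out directly; this sketch mirrors the description (traverse thresholds 0-255 and keep the one with maximal between-class variance) and is equivalent in effect to cv2.threshold with THRESH_OTSU.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Pick the threshold that maximizes the between-class variance of an 8-bit image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 = hist[:t + 1].sum() / total          # background weight
        w1 = 1.0 - w0                            # foreground weight
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (hist[:t + 1] * np.arange(t + 1)).sum() / (w0 * total)
        mu1 = (hist[t + 1:] * np.arange(t + 1, 256)).sum() / (w1 * total)
        var_between = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# binary = (gray > otsu_threshold(gray)).astype(np.uint8) * 255
```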
Ordinary histogram equalization can enhance contrast, but it applies the same histogram transformation to every pixel of the image; this works well when the pixel distribution is fairly uniform, and poorly when the image contains regions that are clearly darker or brighter than the rest. A local histogram is therefore used: for each pixel, the histogram of a rectangular neighborhood around it is equalized, with boundary pixels extended by mirroring rather than duplication. A larger neighborhood rectangle lowers the contrast gain and a smaller one raises it; when the pixels of a region are very close in value the local histogram becomes very sharp, a very narrow pixel range is mapped onto the full range, and the noise of flat areas is amplified. To solve this noise-amplification problem, the invention limits the contrast with the contrast-limited adaptive histogram equalization (CLAHE) algorithm: the part of each local histogram above a set height is redistributed to the lower bins, which reduces the slope of the CDF. The higher the clip limit is set, the higher the resulting contrast.
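In OpenCV the two controls discussed above map onto cv2.createCLAHE: clipLimit caps the per-tile histogram height (the excess is redistributed to the lower bins) and tileGridSize sets the local neighborhood. The values shown and the input file name are illustrative.

```python
import cv2

gray = cv2.imread("duct.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical 8-bit input
# Higher clipLimit -> stronger contrast (and more residual noise);
# larger tiles -> weaker local contrast enhancement.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
equalized = clahe.apply(gray)
```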
In a specific embodiment, dilation and erosion are used to eliminate noise, separate individual image elements, connect adjacent elements in the oil stain image, find local maximum or minimum regions, and compute the gradient of the oil stain image. The image (a local area A of the original image) is convolved with a kernel (anchored at point B). Dilation is a local-maximum operation: convolving with B takes the maximum pixel value over the area covered by B and assigns it to the pixel at the anchor point, which enlarges the highlighted regions; its operator is written "+". Erosion keeps a traversed position white only when its whole neighborhood is white and otherwise sets it to black, which shrinks the image; the process is exactly the reverse of dilation, and its operator is written "-". Dilating and eroding the oil stain image removes the noise of the oil stain texture image and improves the recognition accuracy.
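The basic OpenCV operations are shown below on a binarized stain image; the kernel shape and size are illustrative, and the file name is hypothetical.

```python
import cv2

binary = cv2.imread("stain_binary.png", cv2.IMREAD_GRAYSCALE)  # hypothetical 0/255 image
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))     # structuring element B

dilated = cv2.dilate(binary, kernel, iterations=1)   # local maximum: bright regions grow
eroded = cv2.erode(binary, kernel, iterations=1)     # local minimum: bright regions shrink

# Opening (erode then dilate) removes small bright noise specks;
# the morphological gradient (dilate - erode) outlines the texture edges.
opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
gradient = cv2.morphologyEx(binary, cv2.MORPH_GRADIENT, kernel)
```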
In a specific embodiment, binary-image processing tasks such as OCR recognition and matching thin the characters in order to obtain the skeleton of the image; the skeleton produced by the Zhang-Suen thinning algorithm is one of the image features commonly used for recognition or pattern matching. The same iterative thinning is applied here to the oil stain texture map, yielding clearer and more accurate oil stain texture features and making training and recognition of the oil stain degree easier. The thinning iteration consists of two sub-steps. First, loop over all foreground pixels and mark for deletion every pixel P1 that satisfies all of the following conditions (pixels that do not satisfy them are not marked):
1. 2 <= N(P1) <= 6
2. S(P1) = 1
3. P2 * P4 * P6 = 0
4. P4 * P6 * P8 = 0
Here N(P1) is the number of foreground pixels among the 8 pixels adjacent to P1, and S(P1) is the number of 0-to-1 transitions in the sequence P2, P3, ..., P9, P2, where 0 denotes background and 1 denotes foreground.
Second, similarly to the first sub-step, conditions 1 and 2 stay the same while conditions 3 and 4 change slightly, and every pixel P1 satisfying the following conditions is marked for deletion:
1. 2 <= N(P1) <= 6
2. S(P1) = 1
3. P2 * P4 * P8 = 0
4. P2 * P6 * P8 = 0
The two sub-steps are repeated until neither of them marks any pixel for deletion; the output is the skeleton of the thinned binary image.
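A direct, unoptimized Python transcription of the two passes above is given below as a sketch. It assumes a 0/1 image whose foreground does not touch the border (a one-pixel background margin).

```python
import numpy as np

def zhang_suen_thinning(binary: np.ndarray) -> np.ndarray:
    """Zhang-Suen thinning of a 0/1 image (1 = foreground); returns the skeleton."""
    img = (binary > 0).astype(np.uint8).copy()

    def neighbours(r, c, im):
        # P2..P9, clockwise starting from the pixel directly above P1.
        return [im[r - 1, c], im[r - 1, c + 1], im[r, c + 1], im[r + 1, c + 1],
                im[r + 1, c], im[r + 1, c - 1], im[r, c - 1], im[r - 1, c - 1]]

    def transitions(n):
        # S(P1): number of 0 -> 1 transitions in the sequence P2..P9, P2.
        seq = n + [n[0]]
        return sum(1 for a, b in zip(seq, seq[1:]) if a == 0 and b == 1)

    rows, cols = img.shape
    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for r in range(1, rows - 1):
                for c in range(1, cols - 1):
                    if img[r, c] != 1:
                        continue
                    n = neighbours(r, c, img)
                    p2, p4, p6, p8 = n[0], n[2], n[4], n[6]
                    if not (2 <= sum(n) <= 6 and transitions(n) == 1):
                        continue
                    if step == 0 and p2 * p4 * p6 == 0 and p4 * p6 * p8 == 0:
                        to_delete.append((r, c))
                    elif step == 1 and p2 * p4 * p8 == 0 and p2 * p6 * p8 == 0:
                        to_delete.append((r, c))
            for r, c in to_delete:
                img[r, c] = 0
            changed = changed or bool(to_delete)
    return img
```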
S203: use deep learning identification to find texture maps in the oil stain texture library that are similar to the oil stain texture map, and obtain the oil stain degree of the initial picture by indexing the oil stain degree of the matched library textures.
In a specific embodiment, deep learning training classifies oil stain texture maps by similarity, builds the oil stain texture library, and indexes the oil stain degree within it. The oil stain degree is trained and recognized with deep learning based on a dense convolutional network (DenseNet), which alleviates the vanishing-gradient problem, strengthens feature propagation, reuses features more effectively, and reduces the number of parameters to a certain extent. In a dense convolutional network the input of every layer comes from the outputs of all preceding layers. One advantage of DenseNet is that the network is narrower and has fewer parameters, largely because the dense-block design keeps the number of output feature maps of each convolutional layer inside a dense block small (fewer than 100) rather than hundreds or thousands wide as in other networks. This connection pattern also makes the propagation of features and gradients more effective and the network easier to train.
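A hedged training/inference sketch with torchvision's DenseNet121 follows (torchvision >= 0.13 API). The number of classes, the 224 x 224 input size, the optimizer and the learning rate are assumptions for illustration; the patent does not publish its training configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # assumed severity grades, e.g. mild / medium / severe

model = models.densenet121(weights=None)  # pass ImageNet weights here to fine-tune
model.classifier = nn.Linear(model.classifier.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimization step on a batch of texture maps shaped (N, 3, 224, 224)."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def predict(image: torch.Tensor) -> int:
    """Return the index of the most similar texture class for one (1, 3, 224, 224) input."""
    model.eval()
    return int(model(image).argmax(dim=1).item())
```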
In a specific embodiment, the oil stain texture features include the orientation, length and density of the oil stain texture. Training on these texture features allows classification: with reasonable threshold ranges the oil stain degree can be divided into three grades, mild, medium and severe. Through the processing of the oil stain picture and deep learning training and recognition, the feature values of a picture to be identified are obtained, the oil stain degree can be judged and graded directly, and the oil stain texture library can be continuously maintained and updated so that subsequent identification becomes more accurate.
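The sketch below shows one possible way to compute such texture features from a thinned texture map and map them to the three grades. The feature definitions and the threshold values are hypothetical, chosen only to illustrate the idea of setting reasonable threshold ranges.

```python
import numpy as np

def texture_features(skeleton: np.ndarray) -> dict:
    """Illustrative stand-ins for orientation, length and density of the texture.

    `skeleton` is the 0/1 thinned texture map; these definitions are assumptions,
    not values taken from the patent.
    """
    fg = np.argwhere(skeleton > 0)
    density = fg.shape[0] / skeleton.size        # fraction of texture pixels
    length = float(fg.shape[0])                  # total skeleton length in pixels
    if fg.shape[0] > 1:
        # Dominant orientation from the principal axis of the texture pixels.
        cov = np.cov(fg.T.astype(np.float64))
        eigvals, eigvecs = np.linalg.eigh(cov)
        principal = eigvecs[:, np.argmax(eigvals)]
        orientation = float(np.degrees(np.arctan2(principal[0], principal[1])))
    else:
        orientation = 0.0
    return {"density": density, "length": length, "orientation": orientation}

def grade_severity(density: float) -> str:
    """Map texture density to a severity grade using hypothetical thresholds."""
    if density < 0.02:
        return "mild"
    if density < 0.08:
        return "medium"
    return "severe"
```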
With continued reference to fig. 4, fig. 4 shows a block diagram of an oil stain degree identification system according to an embodiment of the invention. The system comprises a preprocessing unit 401, an oil stain texture map acquisition unit 402 and an oil stain degree judging unit 403.
In a specific embodiment, the preprocessing unit 401 is configured to obtain an initial picture containing the oil stain to be identified and convert it into a grayscale image; the oil stain texture map acquisition unit 402 is configured to obtain the oil stain texture features in the grayscale image through direction-based valley detection and to thin the binary image of the oil stain texture features into an oil stain texture map; and the oil stain degree judging unit 403 is configured to find texture maps in the oil stain texture library that are similar to the oil stain texture map through deep learning identification and to obtain the oil stain degree of the initial picture by indexing the oil stain degree of the matched library textures.
In a specific embodiment, the system further includes an oil stain texture library construction unit 404, configured to classify oil stain texture maps by similarity through deep learning training, construct the oil stain texture library, and index the oil stain degree within it.
Referring now to FIG. 5, shown is a block diagram of a computer system 500 suitable for use in implementing the electronic device of an embodiment of the present application. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 5, the computer system 500 includes a Central Processing Unit (CPU)501 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the system 500 are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input portion 506 including a keyboard, a mouse, and the like; an output portion 507 including a display such as a Liquid Crystal Display (LCD) and a speaker; a storage portion 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card, a modem, or the like. The communication section 509 performs communication processing via a network such as the internet. The driver 510 is also connected to the I/O interface 505 as necessary. A removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 510 as necessary, so that a computer program read out therefrom is mounted into the storage section 508 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable storage medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program performs the above-described functions defined in the method of the present application when executed by the Central Processing Unit (CPU) 501. It should be noted that the computer readable storage medium of the present application can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable storage medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present application may be implemented by software or hardware.
As another aspect, the present application also provides a computer-readable storage medium, which may be included in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer readable storage medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire an initial picture containing the oil stain to be identified and convert it into a grayscale image; obtain the oil stain texture features in the grayscale image through direction-based valley detection and thin the binary image of the oil stain texture features into an oil stain texture map; and find texture maps in the oil stain texture library that are similar to the oil stain texture map through deep learning identification and obtain the oil stain degree of the initial picture by indexing the oil stain degree of the matched library textures.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (10)

1. A method for identifying the degree of oil contamination, comprising:
S1: acquiring an initial picture containing the oil stain to be identified, and converting the initial picture into a grayscale image;
S2: obtaining oil stain texture features in the grayscale image through direction-based valley detection, and thinning the binary image of the oil stain texture features to obtain an oil stain texture map;
S3: finding texture maps in an oil stain texture library that are similar to the oil stain texture map through deep learning identification, and obtaining the oil stain degree of the initial picture by indexing the oil stain degree of the texture maps in the library.
2. The oil contamination degree identification method according to claim 1, wherein thinning the binary image of the oil stain texture features in step S2 to obtain the oil stain texture map specifically comprises: sequentially applying Otsu thresholding, contrast limiting, dilation and erosion, and an iterative thinning algorithm to the oil stain texture features to obtain the oil stain texture map.
3. The oil contamination level identification method according to claim 2, wherein the contrast limiting means employs a contrast limited adaptive histogram equalization algorithm.
4. The oil contamination level identification method according to claim 3, wherein the iterative thinning algorithm employs the Zhang-Suen thinning algorithm.
5. The method for recognizing an oil contamination level according to claim 1, further comprising, before step S3: classifying oil stain texture maps by similarity through deep learning training, constructing the oil stain texture library, and indexing the oil stain degree in the oil stain texture library.
6. The oil contamination level identification method according to claim 5, wherein the deep learning is dense convolutional network-based deep learning, and the deep learning training model is a densenet121 model.
7. The method for oil contamination level identification according to claim 1, wherein the oil contamination texture features comprise the orientation, length and density of the oil contamination texture.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a computer processor, carries out the method of any one of claims 1 to 7.
9. An identification system for oil contamination level, comprising:
a preprocessing unit, configured to obtain an initial picture containing the oil stain to be identified and convert the initial picture into a grayscale image;
an oil stain texture map acquisition unit, configured to obtain oil stain texture features in the grayscale image through direction-based valley detection and to thin the binary image of the oil stain texture features into an oil stain texture map; and
an oil stain degree judging unit, configured to find texture maps in an oil stain texture library that are similar to the oil stain texture map through deep learning identification, and to obtain the oil stain degree of the initial picture by indexing the oil stain degree of the texture maps in the library.
10. The identification system for oil contamination level according to claim 9, further comprising an oil stain texture library construction unit, configured to classify oil stain texture maps by similarity through deep learning training, construct the oil stain texture library, and index the oil stain degree in it.
CN202010568398.4A 2020-06-19 2020-06-19 Oil stain degree identification method and system Active CN111753848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010568398.4A CN111753848B (en) 2020-06-19 2020-06-19 Oil stain degree identification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010568398.4A CN111753848B (en) 2020-06-19 2020-06-19 Oil stain degree identification method and system

Publications (2)

Publication Number Publication Date
CN111753848A CN111753848A (en) 2020-10-09
CN111753848B (en) 2021-03-19

Family

ID=72675847

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010568398.4A Active CN111753848B (en) 2020-06-19 2020-06-19 Oil stain degree identification method and system

Country Status (1)

Country Link
CN (1) CN111753848B (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6195448B1 (en) * 1997-02-28 2001-02-27 Michael Schiller Finger imaging apparatus
AU2003252765B2 (en) * 1999-02-05 2006-06-29 Samsung Electronics Co., Ltd. Image texture retrieving method and apparatus thereof
US8452046B2 (en) * 2008-10-07 2013-05-28 Honeywell International Inc. Method and apparatus for automatic sediment or sludge detection, monitoring, and inspection in oil storage and other facilities
CN104952077B (en) * 2015-06-18 2018-02-16 深圳辰通智能股份有限公司 A kind of bill images greasy dirt detection method and system
CN108010035A (en) * 2017-11-07 2018-05-08 深圳市金城保密技术有限公司 Finger vena image segmentation method and its system, terminal based on the detection of direction paddy shape
CN108852236A (en) * 2018-06-20 2018-11-23 佛山市顺德区美的洗涤电器制造有限公司 Wash electric appliance and its detergent throwing device, method
CN109271966B (en) * 2018-10-15 2021-10-26 广州广电运通金融电子股份有限公司 Identity authentication method, device and equipment based on finger veins
CN110004664B (en) * 2019-04-28 2021-07-16 深圳数联天下智能科技有限公司 Clothes stain recognition method and device, washing machine and storage medium

Also Published As

Publication number Publication date
CN111753848A (en) 2020-10-09

Similar Documents

Publication Publication Date Title
CN111709420B (en) Text detection method, electronic device and computer readable medium
CN107784301B (en) Method and device for recognizing character area in image
Yang et al. An adaptive logical method for binarization of degraded document images
Mukherjee et al. Enhancement of image resolution by binarization
CN113688838B (en) Red handwriting extraction method and system, readable storage medium and computer equipment
CN112052842B (en) Palm vein-based personnel identification method and device
Azad et al. New method for optimization of license plate recognition system with use of edge detection and connected component
Kilic et al. Turkish vehicle license plate recognition using deep learning
CN111507337A (en) License plate recognition method based on hybrid neural network
CN112967191A (en) Image processing method, image processing device, electronic equipment and storage medium
CN109241865B (en) Vehicle detection segmentation algorithm under weak contrast traffic scene
Maity et al. Background modeling and foreground extraction in video data using spatio-temporal region persistence features
CN111753848B (en) Oil stain degree identification method and system
Satrasupalli et al. End to end system for hazy image classification and reconstruction based on mean channel prior using deep learning network
CN116052129A (en) Traffic sign detection method and device based on multi-feature fusion and storage medium
Tung et al. Efficient uneven-lighting image binarization by support vector machines
CN114022856A (en) Unstructured road travelable area identification method, electronic device and medium
Boudissa et al. semantic segmentation of traffic landmarks using classical computer vision and U-Net model
CN111612836B (en) Identification method and system for hollow circular pointer type instrument
CN114283087A (en) Image denoising method and related equipment
Yang et al. A novel binarization approach for license plate
Salau An effective graph-cut segmentation approach for license plate detection
CN111476243A (en) Image character recognition method and device
Zhang et al. Using Gaussian Kernels to Remove Uneven Shading from a Document Image
Navastara et al. Video-Based License Plate Recognition Using Single Shot Detector and Recurrent Neural Network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant