CN115205596A - Image classification method, device, equipment and storage medium - Google Patents

Image classification method, device, equipment and storage medium

Info

Publication number
CN115205596A
CN115205596A
Authority
CN
China
Prior art keywords
image set
value
image
standard image
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210855654.7A
Other languages
Chinese (zh)
Inventor
韩金城
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202210855654.7A
Publication of CN115205596A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/764 Arrangements using pattern recognition or machine learning, using classification, e.g. of video objects
    • G06V 10/30 Image preprocessing; noise filtering
    • G06V 10/54 Extraction of image or video features relating to texture
    • G06V 10/56 Extraction of image or video features relating to colour
    • G06V 10/806 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level, of extracted features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the field of artificial intelligence, and discloses an image classification method, which comprises the following steps: converting an original image set into a preset HSV space to obtain an original image set based on the HSV space, and performing a denoising operation on the original image set based on the HSV space to obtain a standard image set; calculating the color moments of the standard image set to obtain the color features of the standard image set; performing a local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and performing a local neighborhood enhancement operation on a plurality of local features to obtain texture features of the standard image set; and splicing the color features and the texture features, and transmitting the spliced standard image feature set to a preset image classifier to obtain an image classification result of the standard image set. The invention also relates to blockchain technology, where the original image set can be stored in blockchain nodes. The invention also provides an image classification device, equipment and a medium. The invention can improve the accuracy of image classification.

Description

Image classification method, device, equipment and storage medium
Technical Field
The invention relates to the field of artificial intelligence, in particular to an image classification method, device, equipment and storage medium.
Background
Image classification is an image processing method that distinguishes objects of different classes in an image based on the different image features reflected in the image information. The traditional image classification method converts a color image into a grayscale image, extracts features from the grayscale image and classifies the image according to those features. However, extracting only grayscale features discards the color information of the image, so the extracted image feature information is insufficient, resulting in low image classification accuracy.
Disclosure of Invention
The invention provides an image classification method, an image classification device, image classification equipment and a storage medium, and mainly aims to improve the accuracy of image classification.
In order to achieve the above object, the present invention provides an image classification method, including:
obtaining an original image set, converting the original image set into a preset HSV (hue, saturation, value) space to obtain an original image set based on the HSV space, and performing denoising operation on the original image set based on the HSV space to obtain a standard image set;
calculating the color moments of the standard image set to obtain the color characteristics of the standard image set;
performing local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and performing local neighborhood enhancement operation on a plurality of local features to obtain texture features of the standard image set;
and splicing the color features and the texture features to obtain a standard image feature set, and transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image set.
Optionally, the calculating color moments of the standard image set to obtain color features of the standard image set includes:
respectively calculating a first order color moment, a second order color moment and a third order color moment of the standard image set;
and summarizing the first order color moment, the second order color moment and the third order color moment to obtain the color characteristics.
Optionally, the performing a local neighborhood difference operation on the standard image set to obtain a local feature of the standard image set includes:
acquiring an image pixel matrix in the standard image set, and determining a central pixel and each neighborhood pixel according to the image pixel matrix;
and identifying the adjacent pixels corresponding to each neighborhood pixel, comparing the neighborhood pixels with the adjacent pixels, and performing a binarization operation on the image pixel matrix according to the comparison result to obtain a plurality of local features of the standard image set.
Optionally, the performing a local neighborhood enhancement operation on the plurality of local features to obtain texture features of the standard image set includes:
performing relative difference calculation on pixels in the local features to obtain a first local texture feature;
calculating the average deviation of pixels in the local features to obtain a second local texture feature;
and splicing the first local texture feature and the second local texture feature to obtain the texture feature of the standard image set.
Optionally, the converting the original image set into a preset HSV space to obtain an original image set based on an HSV space includes:
extracting the red value, the green value and the blue value of any pixel point in the original image set;
respectively carrying out normalization processing on the red value, the green value and the blue value to obtain a red normalization value, a green normalization value and a blue normalization value;
substituting the red normalization value, the green normalization value and the blue normalization value into a preset HSV conversion formula to obtain a hue value, a saturation value and a brightness value;
if the hue value is smaller than a preset hue threshold value, adding the hue value and a hue standard value for calculation to obtain a final hue value;
determining an original image set based on HSV space from the final hue value, the saturation value and the brightness value.
Optionally, the transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image feature set includes:
constructing a plurality of hyperplane functions of the standard image feature set;
determining two parallel hyperplane functions in the hyperplane functions by using a preset geometric interval, and performing formula conversion on the two parallel hyperplane functions to obtain a constraint condition;
converting the constraint condition into an unconstrained problem by utilizing the Lagrange multiplier method, and solving the unconstrained problem to obtain the optimal hyperplane between the two parallel hyperplane functions;
and classifying the standard image feature set by using the optimal hyperplane to obtain an image classification result of the standard image set.
Optionally, the denoising operation is performed on the original image set based on the HSV space to obtain a standard image set, and the denoising operation includes:
superposing a preset filtering window with the pixel position of the upper left corner image in the original image set, sliding the filtering window according to a preset step length until the filtering window is superposed with the pixel position of the lower right corner image in the original image set, and sequentially reading the pixel gray value corresponding to the superposed image pixel position;
sorting the pixel gray values to obtain sorted pixel gray values;
and searching a median set of the gray values of the sorted pixels, and sequentially selecting a median from the median set to replace the middle value of the gray value of the pixel to obtain the standard image set.
In order to solve the above problem, the present invention also provides an image classification apparatus, comprising:
the space conversion module is used for acquiring an original image set, converting the original image set into a preset HSV space to obtain an original image set based on the HSV space, and performing denoising operation on the original image set based on the HSV space to obtain a standard image set;
the color feature extraction module is used for calculating color moments of the standard image set to obtain color features of the standard image set;
the texture feature extraction module is used for performing local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and performing local neighborhood enhancement operation on a plurality of local features to obtain texture features of the standard image set;
and the image classification module is used for splicing the color features and the texture features to obtain a standard image feature set, and transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image set.
In order to solve the above problem, the present invention also provides an electronic device, including:
a memory storing at least one computer program; and
and a processor executing the computer program stored in the memory to implement the image classification method described above.
In order to solve the above problem, the present invention also provides a computer-readable storage medium, in which at least one computer program is stored, the at least one computer program being executed by a processor in an electronic device to implement the image classification method described above.
In the embodiment of the invention, firstly, the original image set is converted into the preset HSV space, so that the brightness, the tone and the vividness of the color can be visually expressed, the subsequent color characteristic can be conveniently extracted, and the noise in the original image set can be removed by carrying out the denoising operation on the original image set based on the HSV space, so that the details in the image are more prominent, and the image quality is higher; secondly, color distribution characteristics of the image can be completely expressed by calculating color moments of the standard image set, local neighborhood difference operation is carried out on the standard image set to obtain local characteristics of the standard image set, local neighborhood enhancement operation is carried out on a plurality of local characteristics, more complete texture characteristics can be extracted, and accuracy of subsequent image classification is guaranteed; finally, by splicing the color features and the texture features, more complete image feature information can be obtained, the obtained standard image feature set is transmitted to a preset image classifier, an image classification result of the standard image set is obtained, and the accuracy of image classification can be improved. Therefore, the image classification method, the image classification device, the image classification equipment and the storage medium provided by the embodiment of the invention can improve the accuracy of image classification.
Drawings
Fig. 1 is a schematic flowchart of an image classification method according to an embodiment of the present invention;
FIG. 2 is a detailed flowchart illustrating a step of an image classification method according to an embodiment of the present invention;
FIG. 3 is a detailed flowchart illustrating a step of an image classification method according to an embodiment of the present invention;
fig. 4 is a schematic block diagram of an image classification apparatus according to an embodiment of the present invention;
fig. 5 is a schematic internal structural diagram of an electronic device implementing an image classification method according to an embodiment of the present invention;
the implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
The embodiment of the invention provides an image classification method. The execution subject of the image classification method includes, but is not limited to, at least one of electronic devices such as a server and a terminal, which can be configured to execute the method provided by the embodiments of the present application. In other words, the image classification method may be performed by software or hardware installed in a terminal device or a server device, and the software may be a blockchain platform. The server includes but is not limited to: a single server, a server cluster, a cloud server or a cloud server cluster, and the like.
Referring to a schematic flow diagram of an image classification method provided in an embodiment of the present invention shown in fig. 1, in the embodiment of the present invention, the image classification method includes the following steps S1 to S4:
s1, obtaining an original image set, converting the original image set into a preset HSV space to obtain an original image set based on the HSV space, and performing denoising operation on the original image set based on the HSV space to obtain a standard image set.
In the embodiment of the invention, the original image set consists of RGB color images determined by an actual scene, such as road driving images, object images, person images and the like. The preset HSV space refers to three color channels of an image, which are H (hue), S (saturation) and V (value), respectively, wherein hue represents the kind of color and is usually expressed as an angle; saturation represents the purity of the color: the higher the value, the purer and more vivid the color, and the lower the value, the duller the color; value represents how bright the color is. According to the embodiment of the invention, the original image set is converted into the preset HSV space, so that the brightness, the hue and the vividness of the color can be expressed very intuitively, which facilitates the subsequent extraction of color features.
In the embodiment of the invention, small white or black noise pixels may randomly appear in the original image set, making the image unclear and degrading image quality. Therefore, the embodiment of the invention removes the noise in the original image set by performing the denoising operation on the original image set based on the HSV space, so that the details in the image are more prominent, the image is clearer and the image quality is higher.
As an embodiment of the present invention, the converting the original image set into a preset HSV space to obtain an original image set based on an HSV space includes:
extracting the red value, the green value and the blue value of any pixel point in the original image set; respectively carrying out normalization processing on the red value, the green value and the blue value to obtain a red normalization value, a green normalization value and a blue normalization value; substituting the red normalization value, the green normalization value and the blue normalization value into a preset HSV conversion formula to obtain a hue value, a saturation value and a brightness value; if the hue value is smaller than a preset hue threshold value, adding the hue value and a hue standard value for calculation to obtain a final hue value; determining an original image set based on HSV space from the final hue value, the saturation value and the brightness value.
The red value, the green value and the blue value of any pixel point in the original image set are respectively R, G and B of the image, and the difference of R, G and B in the image can influence the color presented by the image. And carrying out normalization processing on the red value, the green value and the blue value, namely converting the R, G and B values to be between 0 and 1.
Further, the preset HSV conversion formula is as follows:
V=max(R,G,B)
S = (V - min(R, G, B)) / V, and S = 0 when V = 0
H = 60 × (G - B) / (V - min(R, G, B)), when V = R
H = 60 × (2 + (B - R) / (V - min(R, G, B))), when V = G
H = 60 × (4 + (R - G) / (V - min(R, G, B))), when V = B
wherein H represents the hue value; s represents the saturation value; v represents the brightness value; r represents a red value of the original image set; g represents a green value of the original image set; b represents the blue value of the original image set.
Preferably, the hue threshold value is 0 and the hue standard value is 360.
For example, if the hue value is 50, 50 is directly output as the final hue value; if the hue value is -1, the hue value -1 and the hue standard value 360 are added to obtain 359, and 359 is taken as the final hue value.
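For illustration, the per-pixel conversion described above could be sketched as follows; this is a minimal Python sketch assuming 8-bit RGB input and the standard RGB-to-HSV formulas, and the function name is illustrative rather than taken from the patent.

```python
def rgb_to_hsv_pixel(r: int, g: int, b: int):
    """Convert one 8-bit RGB pixel to HSV (H in degrees, S and V in [0, 1])."""
    # Normalize the red, green and blue values to [0, 1].
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    v = max(r, g, b)                       # brightness value
    delta = v - min(r, g, b)
    s = 0.0 if v == 0 else delta / v       # saturation value
    if delta == 0:
        h = 0.0                            # achromatic pixel: hue is undefined, set to 0
    elif v == r:
        h = 60.0 * (g - b) / delta
    elif v == g:
        h = 60.0 * (2.0 + (b - r) / delta)
    else:                                  # v == b
        h = 60.0 * (4.0 + (r - g) / delta)
    if h < 0:                              # hue below the hue threshold 0:
        h += 360.0                         # add the hue standard value 360
    return h, s, v

print(rgb_to_hsv_pixel(200, 30, 60))       # roughly (349.5, 0.85, 0.78)
```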
Further, the denoising operation is performed on the original image set based on the HSV space to obtain a standard image set, and the denoising operation includes:
superposing a preset filtering window with the pixel position of the upper left corner image in the original image set, sliding the filtering window according to a preset step length until the filtering window is superposed with the pixel position of the lower right corner image in the original image set, and sequentially reading the pixel gray value corresponding to the superposed image pixel position; sorting the pixel gray values to obtain sorted pixel gray values; and searching a median set of the gray values of the sorted pixels, and sequentially selecting a median from the median set to replace the intermediate value of the gray value of the pixel to obtain the standard image set.
The filtering window may be a 3 × 3 matrix template, the step size may be 1, and the pixel gray value refers to a value recording the brightness of an image in the original image set. For example, if the pixel gray values covered by the window coinciding with the upper-left image position are 18, 16, 25, 44, 2, 7, 6, 5 and 80, the sorted pixel gray values are 2, 5, 6, 7, 16, 18, 25, 44 and 80, and the value in the middle position of the sorted gray values is the median 16. The median 16 replaces the center pixel value 2 of the window; since the center value 2 is a noise signal, the noise can be eliminated through median replacement. The filtering window is then slid with a step of 1, the median corresponding to each window position is obtained in turn, and the center values are replaced in turn by these medians to obtain the standard image set.
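As an illustration of the median-filter denoising just described, a minimal Python/NumPy sketch might look as follows; border handling by edge replication is an assumption of this sketch (the text does not specify it), and the names are illustrative.

```python
import numpy as np

def median_denoise(channel: np.ndarray, window: int = 3, step: int = 1) -> np.ndarray:
    """Slide a window over the channel and replace each center value by the window median."""
    pad = window // 2
    padded = np.pad(channel, pad, mode="edge")   # border replication is an assumed choice
    out = channel.copy()
    for y in range(0, channel.shape[0], step):
        for x in range(0, channel.shape[1], step):
            patch = padded[y:y + window, x:x + window]
            out[y, x] = np.median(patch)         # middle value of the sorted window
    return out

# The 3 x 3 example from the text: the center value 2 is replaced by the median 16.
patch = np.array([[18, 16, 25],
                  [44,  2,  7],
                  [ 6,  5, 80]])
print(np.median(patch))                          # 16.0
```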
And S2, calculating the color moment of the standard image set to obtain the color characteristics of the standard image set.
In the embodiment of the present invention, the color moments are a color feature representation method; the color features are used for describing surface properties of a scene corresponding to an image or an image area.
According to the embodiment of the invention, the color moments in the standard image set are calculated, and the color features of the standard image set are extracted according to the color moments, so that the color feature distribution of the image can be determined, the color distribution features of the image can be completely expressed, and the accuracy of color feature extraction is improved.
As an embodiment of the present invention, referring to fig. 2, the calculating the color moments of the standard image set to obtain the color features of the standard image set includes the following steps S21 to S22:
s21, respectively calculating a first order color moment, a second order color moment and a third order color moment of the standard image set;
and S22, summarizing the first-order color moment, the second-order color moment and the third-order color moment to obtain the color characteristics.
Wherein the first order moment of color, the second order moment of color, and the third order moment of color can be calculated by the following formulas:
μ_i = (1/N) · Σ_{j=1..N} P_ij
τ_i = [ (1/N) · Σ_{j=1..N} (P_ij - μ_i)² ]^(1/2)
s_i = [ (1/N) · Σ_{j=1..N} (P_ij - μ_i)³ ]^(1/3)
where μ_i represents the first order color moment; τ_i represents the second order color moment; s_i represents the third order color moment; P_ij represents the value of the j-th pixel in the i-th color channel component of the standard image set; and N represents the number of pixels in the image.
In the embodiment of the invention, because the color information is mainly distributed in the low-order moment, the color tendency of the image can be represented by calculating the first-order color moment, the distribution range of the color of the image can be represented by calculating the second-order color moment, the symmetry of the color distribution of the image can be represented by calculating the third-order color moment, and the color distribution characteristic of the image can be completely expressed by respectively calculating the first-order color moment, the second-order color moment and the third-order color moment of the standard image set.
In one embodiment of the present invention, since the image contains three color components H, S, and V, and each color component has 3 lower order moments of color, the first, second, and third moments of color of the image are summarized to form a 9-dimensional color feature as follows:
F_color = [μ_h, μ_s, μ_v, τ_h, τ_s, τ_v, s_h, s_s, s_v]
where F_color represents the color feature; μ_h, μ_s and μ_v represent the hue, saturation and value components of the first order color moment; τ_h, τ_s and τ_v represent the hue, saturation and value components of the second order color moment; and s_h, s_s and s_v represent the hue, saturation and value components of the third order color moment.
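As an illustration of steps S21 to S22, the 9-dimensional color-moment feature could be computed as in the following Python/NumPy sketch; the mean, standard deviation and skewness forms of the three color moments are assumed here (the formulas appear only as images in the original), and the function name is illustrative.

```python
import numpy as np

def color_moments(hsv_image: np.ndarray) -> np.ndarray:
    """Compute the 9-dimensional color-moment feature of one HSV image of shape (H, W, 3)."""
    first, second, third = [], [], []
    for i in range(3):                               # one color channel component at a time
        p = hsv_image[:, :, i].astype(np.float64).ravel()
        n = p.size
        mu = p.sum() / n                             # first order color moment (mean)
        tau = np.sqrt(((p - mu) ** 2).sum() / n)     # second order color moment (std. deviation)
        s = np.cbrt(((p - mu) ** 3).sum() / n)       # third order color moment (cube root of skew)
        first.append(mu)
        second.append(tau)
        third.append(s)
    # F_color = [mu_h, mu_s, mu_v, tau_h, tau_s, tau_v, s_h, s_s, s_v]
    return np.array(first + second + third)

hsv = np.random.rand(64, 64, 3)                      # stand-in for one image of the standard set
print(color_moments(hsv).shape)                      # (9,)
```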
And S3, carrying out local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and carrying out local neighborhood enhancement operation on a plurality of local features to obtain texture features of the standard image set.
In the embodiment of the invention, the local neighborhood difference operation refers to extracting local features based on the differences between neighborhood pixels and representing each pixel of the image in binary form; the local neighborhood enhancement operation is based on the idea that the neighborhood of a given pixel carries a large amount of texture information, which is mainly used for texture representation; the texture feature is a global feature that reflects the visual appearance of homogeneous regions in an image and describes surface-structure arrangement properties of an object surface that vary slowly or periodically.
According to the embodiment of the invention, the local characteristics of the standard image set are obtained by carrying out local neighborhood difference operation on the standard image set, and the texture characteristics of the standard image set are obtained by carrying out local neighborhood enhancement operation on a plurality of local characteristics, so that more complete texture characteristics can be extracted, and the accuracy of subsequent image classification is ensured.
As an embodiment of the present invention, referring to fig. 3, the performing a local neighborhood difference operation on the standard image set to obtain a local feature of the standard image set includes the following steps S31 to S32:
s31, acquiring an image pixel matrix in the standard image set, and determining a central pixel and each neighborhood pixel according to the image pixel matrix;
s32, identifying adjacent pixels corresponding to the adjacent pixels, comparing the adjacent pixels with the adjacent pixels, and performing binarization operation on an image pixel matrix according to a comparison result to obtain a plurality of local features of the standard image set.
The image pixel matrix is the portion of image pixels that coincides with a 3 × 3 matrix template; the central pixel is the pixel corresponding to the center of the 3 × 3 matrix, the neighborhood pixels are the 8 pixels surrounding the central pixel in the 3 × 3 matrix, and the adjacent pixels are the pixels adjacent to each neighborhood pixel.
Specifically, the central pixel I_c of a 3 × 3 image pixel matrix and its 8 neighborhood pixels are determined, and the 8 neighborhood pixels are sequentially marked as I_1, I_2, I_3, I_4, I_5, I_6, I_7 and I_8. When the index of a neighborhood pixel is odd, it has 4 adjacent pixels; for example, the adjacent pixels of I_1 are [I_2, I_3, I_7, I_8]. When the index is even, it has 2 adjacent pixels; for example, the adjacent pixels of I_2 are [I_1, I_3].
In an embodiment of the invention, each neighborhood pixel is compared with its adjacent pixels. Taking the adjacent pixels [I_2, I_3, I_7, I_8] of I_1 as an example, I_1 is compared with I_2, I_3, I_7 and I_8 separately; when the value of I_2, I_3, I_7 or I_8 is greater than the value of I_1, that value is replaced with "1", and when it is not greater than the value of I_1, it is replaced with "0", which realizes the binarization operation. Comparing each neighborhood pixel with its corresponding adjacent pixels yields 8 different image pixel matrices, i.e. 8 local features of the standard image set.
Further, the performing a local neighborhood enhancement operation on the plurality of local features to obtain texture features of the standard image set includes:
performing relative difference calculation on pixels in the local features to obtain a first local texture feature; carrying out average deviation calculation on pixels in the local features to obtain a second local texture feature; and splicing the first local texture feature and the second local texture feature to obtain the texture feature of the standard image set.
Wherein, the first local texture feature refers to a texture feature which can resist the influence of illumination; the relative difference calculation means that a first relative difference is obtained by calculating the relative difference between a neighborhood pixel and an adjacent pixel in the local feature; then, calculating the relative difference between the central pixel and the neighborhood pixel in the local feature to obtain a second relative difference; and performing exclusive-or operation on the first relative difference and the second relative difference to obtain a first local texture feature, so that the extracted texture feature can resist the influence of illumination better.
Further, the first relative difference and the second relative difference may be calculated by the following formulas:
B_1,i = sign(S_i, I_i)
B_2,i = sign(S_i, I_c)
where B_1,i represents the first relative difference; B_2,i represents the second relative difference; sign() represents the relative difference function; S_i represents an adjacent pixel; I_i represents a neighborhood pixel; I_c represents the central pixel; and i denotes the pixel index.
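To make the sign-based computation concrete, the following Python/NumPy sketch computes a relative-difference code for one 3 × 3 window. The neighbor ordering, the adjacency sets beyond the two examples given above, and the reduction of the XOR pattern to a single bit per neighborhood pixel are all assumptions of this sketch; the names are illustrative, not from the patent.

```python
import numpy as np

# Adjacency sets: the entries for I_1 and I_2 follow the examples in the text;
# the remaining entries are assumed by rotational symmetry.
ADJACENCY = {1: [2, 3, 7, 8], 2: [1, 3], 3: [4, 5, 1, 2], 4: [3, 5],
             5: [6, 7, 3, 4], 6: [5, 7], 7: [8, 1, 5, 6], 8: [7, 1]}
# Positions of I_1..I_8 in a 3x3 window, clockwise from the top-left corner (assumed ordering).
COORDS = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]

def sign_bits(values, reference):
    """Binarization: 1 where an adjacent value exceeds the reference, else 0."""
    return np.array([1 if v > reference else 0 for v in values], dtype=np.uint8)

def local_sign_code(window: np.ndarray) -> int:
    """Sign-based local texture code of one 3x3 window (first local texture feature)."""
    c = window[1, 1]
    I = [window[y, x] for y, x in COORDS]
    bits = []
    for i in range(1, 9):
        adj = [I[j - 1] for j in ADJACENCY[i]]
        b1 = sign_bits(adj, I[i - 1])       # relative difference w.r.t. the neighborhood pixel
        b2 = sign_bits(adj, c)              # relative difference w.r.t. the central pixel
        bits.append(int(np.any(b1 ^ b2)))   # XOR of the two patterns, reduced to one bit
    return int("".join(map(str, bits)), 2)  # 8-bit local texture code

window = np.array([[18, 16, 25], [44, 2, 7], [6, 5, 80]])
print(local_sign_code(window))
```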
In an embodiment of the present invention, the second local texture feature is a local texture feature that mitigates the effect of chromatic dispersion. Since some images in the standard image set may be affected by chromatic dispersion, causing excessive local pixel deviations, the deviation is eliminated by performing the average deviation calculation on the plurality of local features, so that more accurate texture features can be extracted.
In an embodiment of the present invention, the average deviation calculation obtains a first deviation by calculating the average deviation between a neighborhood pixel and its adjacent pixels in the local feature, then obtains a second deviation by calculating the average deviation between the central pixel and the neighborhood pixels, and compares the first deviation with the second deviation to obtain the second local texture feature.
Further, the first deviation and the second deviation may be calculated by the following formula:
M_i = (1/k_i) · Σ_{j=1..k_i} |S_ij - I_i|
T_c = (1/8) · Σ_{i=1..8} |I_i - I_c|
where M_i represents the first deviation; S_ij represents the j-th of the k_i adjacent pixels of the neighborhood pixel I_i; I_i represents a neighborhood pixel; T_c represents the second deviation; I_c represents the central pixel; and i denotes the pixel index.
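Under the same assumptions (neighbor ordering, adjacency sets, and a mean-absolute-deviation reading of the two deviation formulas, which appear only as images in the original), the deviation-based comparison could be sketched as follows.

```python
import numpy as np

# Same (partly assumed) adjacency sets and neighbor ordering as in the sign-based sketch above.
ADJACENCY = {1: [2, 3, 7, 8], 2: [1, 3], 3: [4, 5, 1, 2], 4: [3, 5],
             5: [6, 7, 3, 4], 6: [5, 7], 7: [8, 1, 5, 6], 8: [7, 1]}
COORDS = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]

def local_deviation_code(window: np.ndarray) -> int:
    """Deviation-based local texture code of one 3x3 window (second local texture feature)."""
    c = float(window[1, 1])
    I = [float(window[y, x]) for y, x in COORDS]
    t_c = np.mean([abs(v - c) for v in I])               # second deviation T_c
    bits = []
    for i in range(1, 9):
        adj = [I[j - 1] for j in ADJACENCY[i]]
        m_i = np.mean([abs(s - I[i - 1]) for s in adj])  # first deviation M_i
        bits.append(1 if m_i > t_c else 0)               # compare first and second deviation
    return int("".join(map(str, bits)), 2)

window = np.array([[18, 16, 25], [44, 2, 7], [6, 5, 80]])
print(local_deviation_code(window))
```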
And S4, splicing the color features and the texture features to obtain a standard image feature set, and transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image set.
In the embodiment of the invention, the color features and the texture features can be spliced into the standard image features through a preset vector splicing (concatenation) mechanism; by splicing the color features and the texture features, more complete image feature information can be obtained.
In the embodiment of the invention, the image classifier may be an image classification model constructed based on the working principle of a support vector machine.
According to the embodiment of the invention, the standard image feature set is transmitted to the preset image classifier to obtain the image classification result of the standard image feature set, so that the accuracy of image classification can be improved.
Specifically, given an existing color feature vector v_1 ∈ R^n and a texture feature vector v_2 ∈ R^m (n and m being the feature dimensions), splicing v_1 and v_2 along the same dimension yields the standard image feature set v = [v_1, v_2].
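For illustration, the splicing step itself is a plain vector concatenation; the dimensions below are examples only (the 9-dimensional color moments plus an assumed texture-feature length).

```python
import numpy as np

v1 = np.random.rand(9)           # color feature vector, v1 in R^n (here n = 9)
v2 = np.random.rand(512)         # texture feature vector, v2 in R^m (m = 512 is only an example)
v = np.concatenate([v1, v2])     # standard image feature vector v = [v1, v2]
print(v.shape)                   # (521,)
```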
As an embodiment of the present invention, the transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image feature set includes:
constructing a plurality of hyperplane functions of the standard image feature set; determining two parallel hyperplane functions among the hyperplane functions by using a preset geometric interval, and performing formula conversion on the two parallel hyperplane functions to obtain a constraint condition; converting the constraint condition into an unconstrained problem by utilizing the Lagrange multiplier method, and solving the unconstrained problem to obtain the optimal hyperplane between the two parallel hyperplane functions; and classifying the standard image feature set by using the optimal hyperplane to obtain an image classification result of the standard image set.
Wherein, the maximum distance between the two parallel hyperplane functions is the maximum interval, and the constraint condition can be obtained according to the maximum interval; the constraint condition is that an optimal value of the objective function is found in a limited space; the optimal hyperplane is the plane that segments the standard set of image features.
In an embodiment of the present invention, the optimal hyperplane is obtained by the following formula:
f(x) = w^T x + b
where f(x) represents the optimal hyperplane function, w^T represents the transpose of the normal vector w of the hyperplane, x represents the standard image feature set, and b represents the real-valued displacement (bias) term.
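The classifier described above is a maximum-margin hyperplane, i.e. a support vector machine. As an illustration only, such a classifier could be realized with scikit-learn's SVC, which is a library choice of this sketch rather than something named in the patent; the toy data stands in for the standard image feature set and its labels.

```python
import numpy as np
from sklearn.svm import SVC

# Toy stand-in for the standard image feature set: 100 images, 521-D features, 3 classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 521))
y = rng.integers(0, 3, size=100)

# A linear-kernel SVM finds the maximum-margin (optimal) hyperplane f(x) = w^T x + b
# by solving the constrained problem via its Lagrangian dual.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)
print(clf.predict(X[:5]))          # predicted classes for the first five feature vectors
```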
In the embodiment of the invention, firstly, the original image set is converted into the preset HSV space, so that the brightness, the tone and the vividness of the color can be visually expressed, the subsequent color characteristic can be conveniently extracted, and the noise in the original image set can be removed by carrying out the denoising operation on the original image set based on the HSV space, so that the details in the image are more prominent, and the image quality is higher; secondly, color distribution characteristics of the image can be completely expressed by calculating color moments of the standard image set, local neighborhood difference operation is carried out on the standard image set to obtain local characteristics of the standard image set, local neighborhood enhancement operation is carried out on a plurality of local characteristics, more complete texture characteristics can be extracted, and accuracy of subsequent image classification is guaranteed; finally, by splicing the color features and the texture features, more complete image feature information can be obtained, the obtained standard image feature set is transmitted to a preset image classifier, an image classification result of the standard image set is obtained, and the accuracy of image classification can be improved. Therefore, the image classification method provided by the embodiment of the invention can improve the accuracy of image classification.
The image classification apparatus 100 of the present invention may be installed in an electronic device. According to the implemented functions, the image classification apparatus may include a spatial conversion module 101, a color feature extraction module 102, a texture feature extraction module 103, and an image classification module 104, which may also be referred to as a unit, and refer to a series of computer program segments capable of being executed by a processor of an electronic device and performing a fixed function, and stored in a memory of the electronic device.
In the present embodiment, the functions regarding the respective modules/units are as follows:
the space conversion module 101 is configured to obtain an original image set, convert the original image set into a preset HSV space, obtain an original image set based on the HSV space, and perform a denoising operation on the original image set based on the HSV space, so as to obtain a standard image set.
In the embodiment of the invention, the original image set consists of RGB color images determined by an actual scene, such as road driving images, object images, person images and the like. The preset HSV space refers to three color channels of an image, which are H (hue), S (saturation) and V (value), respectively, wherein hue represents the kind of color and is usually expressed as an angle; saturation represents the purity of the color: the higher the value, the purer and more vivid the color, and the lower the value, the duller the color; value represents how bright the color is. According to the embodiment of the invention, the original image set is converted into the preset HSV space, so that the brightness, the hue and the vividness of the color can be expressed very intuitively, which facilitates the subsequent extraction of color features.
In the embodiment of the invention, small white or black noise pixels may randomly appear in the original image set, making the image unclear and degrading image quality. Therefore, the embodiment of the invention removes the noise in the original image set by performing the denoising operation on the original image set based on the HSV space, so that the details in the image are more prominent, the image is clearer and the image quality is higher.
As an embodiment of the present invention, the space transformation module 101 transforms the original image set into a preset HSV space by performing the following operations to obtain an original image set based on an HSV space, including:
extracting the red value, the green value and the blue value of any pixel point in the original image set;
respectively carrying out normalization processing on the red value, the green value and the blue value to obtain a red normalization value, a green normalization value and a blue normalization value;
substituting the red normalization value, the green normalization value and the blue normalization value into a preset HSV conversion formula to obtain a hue value, a saturation value and a brightness value;
if the hue value is smaller than a preset hue threshold value, adding the hue value and a hue standard value for calculation to obtain a final hue value;
determining an original image set based on HSV space from the final hue value, the saturation value and the brightness value.
The red value, the green value and the blue value of any pixel point in the original image set are respectively R, G and B of the image, and the difference of R, G and B in the image can influence the color presented by the image. And carrying out normalization processing on the red value, the green value and the blue value, namely converting the R, G and B values to between 0 and 1.
Further, the preset HSV conversion formula is as follows:
V=max(R,G,B)
S = (V - min(R, G, B)) / V, and S = 0 when V = 0
H = 60 × (G - B) / (V - min(R, G, B)), when V = R
H = 60 × (2 + (B - R) / (V - min(R, G, B))), when V = G
H = 60 × (4 + (R - G) / (V - min(R, G, B))), when V = B
wherein H represents the hue value; s represents the saturation value; v represents the brightness value; r represents a red value of the original image set; g represents a green value of the original image set; b represents the blue value of the original image set.
Preferably, the hue threshold value is 0 and the hue standard value is 360.
For example, if the hue value is 50, 50 is directly output as the final hue value; if the hue value is -1, the hue value -1 and the hue standard value 360 are added to obtain 359, and 359 is taken as the final hue value.
Further, the denoising operation is performed on the original image set based on the HSV space to obtain a standard image set, and the denoising operation includes: superposing a preset filtering window with the pixel position of the upper left corner image in the original image set, sliding the filtering window according to a preset step length until the filtering window is superposed with the pixel position of the lower right corner image in the original image set, and sequentially reading the pixel gray value corresponding to the superposed image pixel position; sorting the pixel gray values to obtain sorted pixel gray values; and searching a median set of the gray values of the sorted pixels, and sequentially selecting a median from the median set to replace the intermediate value of the gray value of the pixel to obtain the standard image set.
The filtering window may be a 3x3 matrix template, the step size may be 1, and the pixel gray value refers to a value recording the brightness of an image in the original image set. For example, if the pixel gray values covered by the window coinciding with the upper-left image position are 18, 16, 25, 44, 2, 7, 6, 5 and 80, the sorted pixel gray values are 2, 5, 6, 7, 16, 18, 25, 44 and 80, and the value in the middle position of the sorted gray values is the median 16. The median 16 replaces the center pixel value 2 of the window; since the center value 2 is a noise signal, the noise can be eliminated through median replacement. The filtering window is then slid with a step of 1, the median corresponding to each window position is obtained in turn, and the center values are replaced in turn by these medians to obtain the standard image set.
The color feature extraction module 102 is configured to calculate color moments of the standard image set to obtain color features of the standard image set.
In the embodiment of the invention, the color moment is a color feature representation method; the color features are used for describing surface properties of a scene corresponding to an image or an image area.
According to the embodiment of the invention, the color moment in the standard image set is calculated, and the color feature of the standard image set is extracted according to the color moment, so that the color feature distribution of the image can be determined, the color distribution feature of the image can be completely expressed, and the accuracy of extracting the color feature is improved.
As an embodiment of the present invention, the color feature extraction module 102 calculates the color moments of the standard image set by performing the following operations to obtain the color features of the standard image set, including:
respectively calculating a first-order color moment, a second-order color moment and a third-order color moment of the standard image set;
and summarizing the first-order color moment, the second-order color moment and the third-order color moment to obtain the color characteristics.
Wherein the first order moment of color, the second order moment of color, and the third order moment of color can be calculated by the following formulas:
μ_i = (1/N) · Σ_{j=1..N} P_ij
τ_i = [ (1/N) · Σ_{j=1..N} (P_ij - μ_i)² ]^(1/2)
s_i = [ (1/N) · Σ_{j=1..N} (P_ij - μ_i)³ ]^(1/3)
where μ_i represents the first order color moment; τ_i represents the second order color moment; s_i represents the third order color moment; P_ij represents the value of the j-th pixel in the i-th color channel component of the standard image set; and N represents the number of pixels in the image.
In the embodiment of the invention, because the color information is mainly distributed in the low-order moment, the color tendency of the image can be represented by calculating the first-order color moment, the distribution range of the color of the image can be represented by calculating the second-order color moment, the symmetry of the color distribution of the image can be represented by calculating the third-order color moment, and the color distribution characteristic of the image can be completely expressed by respectively calculating the first-order color moment, the second-order color moment and the third-order color moment of the standard image set.
In an embodiment of the present invention, since the image includes three color components of H, S, and V, and each color component has 3 low-order moments of color, the first-order moment of color, the second-order moment of color, and the third-order moment of color of the image are summarized to form a 9-dimensional color feature represented as follows:
F_color = [μ_h, μ_s, μ_v, τ_h, τ_s, τ_v, s_h, s_s, s_v]
where F_color represents the color feature; μ_h, μ_s and μ_v represent the hue, saturation and value components of the first order color moment; τ_h, τ_s and τ_v represent the hue, saturation and value components of the second order color moment; and s_h, s_s and s_v represent the hue, saturation and value components of the third order color moment.
The texture feature extraction module 103 is configured to perform local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and perform local neighborhood enhancement operation on a plurality of the local features to obtain texture features of the standard image set.
In the embodiment of the invention, the local neighborhood difference operation refers to extracting local features based on the differences between neighborhood pixels and representing each pixel of the image in binary form; the local neighborhood enhancement operation is based on the idea that the neighborhood of a given pixel carries a large amount of texture information, which is mainly used for texture representation; the texture feature is a global feature that reflects the visual appearance of homogeneous regions in an image and describes surface-structure arrangement properties of an object surface that vary slowly or periodically.
According to the embodiment of the invention, the local characteristics of the standard image set are obtained by carrying out local neighborhood difference operation on the standard image set, and the texture characteristics of the standard image set are obtained by carrying out local neighborhood enhancement operation on a plurality of local characteristics, so that more complete texture characteristics can be extracted, and the accuracy of subsequent image classification is ensured.
As an embodiment of the present invention, the texture feature extraction module 103 performs a local neighborhood difference operation on the standard image set by performing the following operations to obtain a local feature of the standard image set, including:
acquiring an image pixel matrix in the standard image set, and determining a central pixel and each neighborhood pixel according to the image pixel matrix;
and identifying adjacent pixels corresponding to each neighborhood pixel, comparing the neighborhood pixels with the adjacent pixels, and performing binarization operation on an image pixel matrix according to a comparison result to obtain a plurality of local features of the standard image set.
The image pixel matrix is the portion of image pixels that coincides with a 3x3 matrix template; the central pixel is the pixel corresponding to the center of the 3x3 matrix, the neighborhood pixels are the 8 pixels surrounding the central pixel in the 3x3 matrix, and the adjacent pixels are the pixels adjacent to each neighborhood pixel.
Specifically, the central pixel I_c of a 3 × 3 image pixel matrix and its 8 neighborhood pixels are determined, and the 8 neighborhood pixels are sequentially marked as I_1, I_2, I_3, I_4, I_5, I_6, I_7 and I_8. When the index of a neighborhood pixel is odd, it has 4 adjacent pixels; for example, the adjacent pixels of I_1 are [I_2, I_3, I_7, I_8]. When the index is even, it has 2 adjacent pixels; for example, the adjacent pixels of I_2 are [I_1, I_3].
In an embodiment of the invention, each neighborhood pixel is compared with its adjacent pixels. Taking the adjacent pixels [I_2, I_3, I_7, I_8] of I_1 as an example, I_1 is compared with I_2, I_3, I_7 and I_8 separately; when the value of I_2, I_3, I_7 or I_8 is greater than the value of I_1, that value is replaced with "1", and when it is not greater than the value of I_1, it is replaced with "0", which realizes the binarization operation. Comparing each neighborhood pixel with its corresponding adjacent pixels yields 8 different image pixel matrices, i.e. 8 local features of the standard image set.
Further, the texture feature extraction module 103 may be further configured to perform a local neighborhood enhancement operation on a plurality of the local features to obtain texture features of the standard image set, including:
performing relative difference calculation on pixels in the local features to obtain a first local texture feature;
carrying out average deviation calculation on pixels in the local features to obtain a second local texture feature;
and splicing the first local texture feature and the second local texture feature to obtain the texture feature of the standard image set.
Wherein, the first local texture feature refers to a texture feature which can resist the influence of illumination; the relative difference calculation means that a first relative difference is obtained by calculating the relative difference between a neighborhood pixel and an adjacent pixel in the local feature; then, calculating the relative difference between the central pixel and the neighborhood pixel in the local feature to obtain a second relative difference; and performing exclusive-or operation on the first relative difference and the second relative difference to obtain the first local texture feature, so that the extracted texture feature can resist the influence of illumination better.
Further, the first relative difference and the second relative difference may be calculated by the following formulas:
B_1,i = sign(S_i, I_i)
B_2,i = sign(S_i, I_c)
where B_1,i represents the first relative difference; B_2,i represents the second relative difference; sign() represents the relative difference function; S_i represents an adjacent pixel; I_i represents a neighborhood pixel; I_c represents the central pixel; and i denotes the pixel index.
In an embodiment of the present invention, the second local texture feature is a local texture feature that mitigates the effect of chromatic dispersion. Since some images in the standard image set may be affected by chromatic dispersion, causing excessive local pixel deviations, the deviation is eliminated by performing the average deviation calculation on the plurality of local features, so that more accurate texture features can be extracted.
In an embodiment of the present invention, the average deviation calculation obtains a first deviation by calculating the average deviation between a neighborhood pixel and its adjacent pixels in the local feature, then obtains a second deviation by calculating the average deviation between the central pixel and the neighborhood pixels, and compares the first deviation with the second deviation to obtain the second local texture feature.
Further, the first deviation and the second deviation may be calculated by the following formula:
M_i = (1/k_i) · Σ_{j=1..k_i} |S_ij - I_i|
T_c = (1/8) · Σ_{i=1..8} |I_i - I_c|
where M_i represents the first deviation; S_ij represents the j-th of the k_i adjacent pixels of the neighborhood pixel I_i; I_i represents a neighborhood pixel; T_c represents the second deviation; I_c represents the central pixel; and i denotes the pixel index.
The image classification module 104 is configured to splice the color features and the texture features to obtain a standard image feature set, and transmit the standard image feature set to a preset image classifier to obtain an image classification result of the standard image set.
In the embodiment of the invention, the color features and the texture features can be spliced into the standard image features through a preset vector splicing (concatenation) mechanism; by splicing the color features and the texture features, more complete image feature information can be obtained.
In the embodiment of the invention, the image classifier may be an image classification model constructed based on the working principle of a support vector machine.
According to the embodiment of the invention, the standard image feature set is transmitted to the preset image classifier to obtain the image classification result of the standard image feature set, so that the accuracy of image classification can be improved.
Specifically, given an existing color feature vector v_1 ∈ R^n and a texture feature vector v_2 ∈ R^m (n and m being the feature dimensions), splicing v_1 and v_2 along the same dimension yields the standard image feature set v = [v_1, v_2].
As an embodiment of the present invention, the image classification module 104 transmits the standard image feature set to a preset image classifier by performing the following operations to obtain an image classification result of the standard image feature set, including:
constructing a plurality of hyperplane functions of the standard image feature set;
determining two parallel hyperplane functions in the hyperplane functions by using a preset geometric interval, and performing formula conversion on the two parallel hyperplane functions to obtain a constraint condition;
converting the constraint condition into an unconstrained problem by utilizing the Lagrange multiplier method, and solving the unconstrained problem to obtain the optimal hyperplane between the two parallel hyperplane functions;
and classifying the standard image feature set by using the optimal hyperplane to obtain an image classification result of the standard image set.
Wherein the distance between the two parallel hyperplane functions is the maximum margin, and the constraint condition can be obtained from this maximum margin; the constraint condition means finding the optimal value of the objective function within a restricted space; the optimal hyperplane is the plane that separates the standard image feature set.
In an embodiment of the present invention, the optimal hyperplane is obtained by the following formula:
f(x) = w^T x + b
wherein f(x) represents the optimal hyperplane function, w^T represents the transpose of the normal vector of the hyperplane, x represents the standard image feature set, and b is the real-valued displacement (bias) term.
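The sketch below illustrates the maximum-margin principle described above with a generic linear support vector machine from scikit-learn; the feature data, labels and classifier settings are made up for the example and do not reproduce the exact classifier training procedure of this embodiment.

```python
import numpy as np
from sklearn.svm import SVC

X = np.random.rand(40, 19)          # 40 spliced feature vectors (made-up data)
y = np.array([0] * 20 + [1] * 20)   # two image classes

clf = SVC(kernel="linear")          # linear kernel: f(x) = w^T x + b
clf.fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
x_new = np.random.rand(19)
print(np.sign(w @ x_new + b))       # which side of the optimal hyperplane (+1 / -1)
print(clf.predict([x_new]))         # corresponding class label
```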
In the embodiment of the invention, firstly, the original image set is converted into the preset HSV space, so that the brightness, hue and vividness of the colors can be expressed intuitively, which facilitates subsequent color feature extraction; the denoising operation performed on the HSV-space original image set removes noise from the original image set, making the details in the images more prominent and the image quality higher. Secondly, calculating the color moments of the standard image set allows the color distribution characteristics of the images to be expressed completely; performing the local neighborhood difference operation on the standard image set yields the local features of the standard image set, and performing the local neighborhood enhancement operation on the plurality of local features allows more complete texture features to be extracted, which guarantees the accuracy of subsequent image classification. Finally, splicing the color features and the texture features yields more complete image feature information, and transmitting the resulting standard image feature set to the preset image classifier yields the image classification result of the standard image set, which can improve the accuracy of image classification. Therefore, the image classification device provided by the embodiment of the invention can improve the accuracy of image classification.
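As a rough end-to-end illustration of the pipeline summarized above, the sketch below uses OpenCV built-ins for the HSV conversion and the median denoising, and NumPy for the color moments; the 3×3 filter size, the simplified stand-in texture descriptor and the assumption of a valid image path are choices made for the example only.

```python
import cv2
import numpy as np

def color_moments(hsv):
    """First-, second- and third-order color moments for each HSV channel."""
    feats = []
    for ch in cv2.split(hsv):
        ch = ch.astype(np.float64)
        mean = ch.mean()                              # first-order color moment
        std = ch.std()                                # second-order color moment
        skew = np.cbrt(((ch - mean) ** 3).mean())     # third-order color moment
        feats += [mean, std, skew]
    return np.array(feats)

def image_features(path):
    bgr = cv2.imread(path)                        # original image (path assumed valid)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)    # HSV-space image
    std_img = cv2.medianBlur(hsv, 3)              # median denoising -> standard image
    color = color_moments(std_img)
    # stand-in texture descriptor: mean absolute neighborhood differences on V
    v = std_img[:, :, 2].astype(np.float64)
    texture = np.array([np.abs(np.diff(v, axis=0)).mean(),
                        np.abs(np.diff(v, axis=1)).mean()])
    return np.concatenate([color, texture])       # spliced image feature vector
```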
Fig. 5 is a schematic structural diagram of an electronic device for implementing the image classification method according to the present invention.
The electronic device may comprise a processor 10, a memory 11, a communication bus 12 and a communication interface 13, and may further comprise a computer program, such as an image classification program, stored in the memory 11 and executable on the processor 10.
The memory 11 includes at least one type of readable storage medium, which includes flash memory, removable hard disk, multimedia card, card-type memory (e.g., SD or DX memory), magnetic memory, local disk, optical disk, and the like. The memory 11 may in some embodiments be an internal storage unit of the electronic device, for example a removable hard disk of the electronic device. The memory 11 may also be an external storage device of the electronic device in other embodiments, such as a plug-in removable hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card, and the like, provided on the electronic device. Further, the memory 11 may also include both an internal storage unit and an external storage device of the electronic device. The memory 11 may be used not only to store application software installed in the electronic device and various types of data, such as the code of the image classification program, but also to temporarily store data that has been output or is to be output.
The processor 10 may in some embodiments be composed of a single packaged integrated circuit, or of a plurality of integrated circuits packaged with the same or different functions, including one or more central processing units (CPUs), microprocessors, digital processing chips, graphics processors, combinations of various control chips, and the like. The processor 10 is the control unit of the electronic device; it connects the various components of the whole electronic device by means of various interfaces and lines, and executes the various functions of the electronic device and processes data by running or executing the programs or modules stored in the memory 11 (e.g., the image classification program) and calling the data stored in the memory 11.
The communication bus 12 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The bus may be divided into an address bus, a data bus, a control bus, and the like. The communication bus 12 is arranged to enable connection and communication between the memory 11, the at least one processor 10, and other components. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or only one type of bus.
Fig. 5 shows only an electronic device with components, and those skilled in the art will appreciate that the structure shown in fig. 5 does not constitute a limitation of the electronic device, and may include fewer or more components than shown, or some components may be combined, or a different arrangement of components.
For example, although not shown, the electronic device may further include a power supply (such as a battery) for supplying power to each component, and preferably, the power supply may be logically connected to the at least one processor 10 through a power management device, so that functions such as charge management, discharge management, and power consumption management are implemented through the power management device. The power supply may also include any component of one or more dc or ac power sources, recharging devices, power failure detection circuitry, power converters or inverters, power status indicators, and the like. The electronic device may further include various sensors, a bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
Optionally, the communication interface 13 may include a wired interface and/or a wireless interface (such as a WI-FI interface, a bluetooth interface, etc.), which are generally used to establish a communication connection between the electronic device and other electronic devices.
Optionally, the communication interface 13 may further include a user interface, which may be a Display (Display), an input unit (such as a Keyboard), and optionally, a standard wired interface and a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable, among other things, for displaying information processed in the electronic device and for displaying a visualized user interface.
It is to be understood that the embodiments described are illustrative only and are not to be construed as limiting the scope of the claims.
The image classification program stored in the memory 11 of the electronic device is a combination of a plurality of computer program instructions which, when executed by the processor 10, can implement:
obtaining an original image set, converting the original image set into a preset HSV (hue, saturation, value) space to obtain an original image set based on the HSV space, and performing denoising operation on the original image set based on the HSV space to obtain a standard image set;
calculating color moments of the standard image set to obtain color features of the standard image set;
performing local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and performing local neighborhood enhancement operation on a plurality of local features to obtain texture features of the standard image set;
and splicing the color features and the texture features to obtain a standard image feature set, and transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image set.
Specifically, the processor 10 may refer to the description of the relevant steps in the embodiment corresponding to fig. 1 for a specific implementation method of the computer program, which is not described herein again.
Further, the integrated modules/units of the electronic device, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable medium. The computer-readable medium may be non-volatile or volatile. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), and the like.
Embodiments of the present invention may also provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor of an electronic device, the computer program may implement:
obtaining an original image set, converting the original image set into a preset HSV (hue, saturation, value) space to obtain an original image set based on the HSV space, and performing denoising operation on the original image set based on the HSV space to obtain a standard image set;
calculating color moments of the standard image set to obtain color features of the standard image set;
performing local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and performing local neighborhood enhancement operation on a plurality of local features to obtain texture features of the standard image set;
and splicing the color features and the texture features to obtain a standard image feature set, and transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image set.
Further, the computer-readable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
In the several embodiments provided in the present invention, it should be understood that the disclosed medium, apparatus, device and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned.
The blockchain is a novel application mode of computer technologies such as distributed data storage, peer-to-peer transmission, consensus mechanisms and encryption algorithms. A blockchain is essentially a decentralized database: a series of data blocks linked by cryptographic methods, each of which contains the information of a batch of network transactions and is used to verify the validity (anti-counterfeiting) of that information and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the system claims may also be implemented by one unit or means in software or hardware. Terms such as first and second are used to denote names and do not denote any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. A method of image classification, the method comprising:
obtaining an original image set, converting the original image set into a preset HSV (hue, saturation, value) space to obtain an original image set based on the HSV space, and performing denoising operation on the original image set based on the HSV space to obtain a standard image set;
calculating the color moments of the standard image set to obtain the color characteristics of the standard image set;
performing local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and performing local neighborhood enhancement operation on a plurality of local features to obtain texture features of the standard image set;
and splicing the color features and the texture features to obtain a standard image feature set, and transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image set.
2. The image classification method according to claim 1, wherein the calculating of the color moments of the standard image set to obtain the color features of the standard image set comprises:
respectively calculating a first-order color moment, a second-order color moment and a third-order color moment of the standard image set;
and summarizing the first-order color moment, the second-order color moment and the third-order color moment to obtain the color characteristics.
3. The image classification method according to claim 1, wherein the performing a local neighborhood difference operation on the standard image set to obtain local features of the standard image set comprises:
acquiring an image pixel matrix in the standard image set, and determining a central pixel and each neighborhood pixel according to the image pixel matrix;
and identifying adjacent pixels corresponding to each neighborhood pixel, comparing the neighborhood pixels with the adjacent pixels, and performing binarization operation on an image pixel matrix according to a comparison result to obtain a plurality of local features of the standard image set.
4. The image classification method according to claim 1, wherein the performing a local neighborhood enhancement operation on the plurality of local features to obtain texture features of the standard image set includes:
performing relative difference calculation on pixels in the local features to obtain a first local texture feature;
calculating the average deviation of pixels in the local features to obtain a second local texture feature;
and splicing the first local texture feature and the second local texture feature to obtain the texture feature of the standard image set.
5. The image classification method according to claim 1, wherein the converting the original image set into a predetermined HSV space to obtain an original image set based on HSV space comprises:
extracting the red value, the green value and the blue value of any pixel point in the original image set;
respectively carrying out normalization processing on the red value, the green value and the blue value to obtain a red normalization value, a green normalization value and a blue normalization value;
substituting the red normalization value, the green normalization value and the blue normalization value into a preset HSV conversion formula to obtain a hue value, a saturation value and a brightness value;
if the hue value is smaller than a preset hue threshold value, adding the hue value and a hue standard value for calculation to obtain a final hue value;
determining an original image set based on HSV space from the final hue value, the saturation value and the brightness value.
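Purely as an illustration of the per-pixel conversion recited in this claim, the sketch below applies the common RGB-to-HSV formulas; a hue threshold of 0 and a hue standard value of 360 degrees are assumed stand-ins for the claim's preset hue threshold and hue standard value.

```python
def rgb_to_hsv_pixel(r, g, b):
    """Convert one pixel's red/green/blue values (0-255) to hue,
    saturation and value, using the common conversion formulas."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0      # normalization
    mx, mn = max(r, g, b), min(r, g, b)
    diff = mx - mn

    if diff == 0:
        h = 0.0
    elif mx == r:
        h = 60 * ((g - b) / diff)
    elif mx == g:
        h = 60 * ((b - r) / diff) + 120
    else:
        h = 60 * ((r - g) / diff) + 240

    if h < 0:                # hue below the (assumed) threshold: add the standard value
        h += 360
    s = 0.0 if mx == 0 else diff / mx
    v = mx
    return h, s, v

print(rgb_to_hsv_pixel(200, 30, 60))
```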
6. The image classification method according to claim 1, wherein the transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image feature set comprises:
constructing a plurality of hyperplane functions of the standard image feature set;
determining two parallel hyperplane functions in the hyperplane functions by using a preset geometric interval, and performing formula conversion on the two parallel hyperplane functions to obtain a constraint condition;
converting the constraint condition into an unconstrained condition by using the Lagrange multiplier method, and calculating the unconstrained condition to obtain an optimal hyperplane between the two parallel hyperplane functions;
and classifying the standard image feature set by using the optimal hyperplane to obtain an image classification result of the standard image set.
7. The image classification method according to claim 1, wherein the denoising operation is performed on the HSV space-based original image set to obtain a standard image set, and the method comprises:
superposing a preset filtering window on the upper-left-corner pixel position of an image in the original image set, sliding the filtering window according to a preset step length until the filtering window coincides with the lower-right-corner pixel position of the image, and sequentially reading the pixel gray values corresponding to the superposed image pixel positions;
sorting the pixel gray values to obtain sorted pixel gray values;
and searching a median set of the gray values of the sorted pixels, and sequentially selecting a median from the median set to replace the middle value of the gray value of the pixel to obtain the standard image set.
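As an illustration of the sliding-window median filtering recited in this claim, a minimal NumPy sketch for one image channel; the 3×3 window and the step of one pixel are assumed values for the preset filtering window and the preset step length.

```python
import numpy as np

def median_denoise(channel, k=3, step=1):
    """Slide a k x k window from the top-left to the bottom-right pixel
    position, sort the grey values under the window, and replace the
    covered center value with the median of the sorted values."""
    h, w = channel.shape
    out = channel.copy()
    r = k // 2
    for y in range(0, h - k + 1, step):
        for x in range(0, w - k + 1, step):
            vals = np.sort(channel[y:y + k, x:x + k], axis=None)  # sorted grey values
            out[y + r, x + r] = vals[len(vals) // 2]              # median replaces center
    return out

noisy = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
print(median_denoise(noisy))
```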
8. An image classification apparatus, characterized in that the apparatus comprises:
the system comprises a space conversion module, a standard image acquisition module and a denoising module, wherein the space conversion module is used for acquiring an original image set, converting the original image set into a preset HSV space to obtain an original image set based on the HSV space, and performing denoising operation on the original image set based on the HSV space to obtain a standard image set;
the color feature extraction module is used for calculating color moments of the standard image set to obtain color features of the standard image set;
the texture feature extraction module is used for performing local neighborhood difference operation on the standard image set to obtain local features of the standard image set, and performing local neighborhood enhancement operation on a plurality of local features to obtain texture features of the standard image set;
and the image classification module is used for splicing the color features and the texture features to obtain a standard image feature set, and transmitting the standard image feature set to a preset image classifier to obtain an image classification result of the standard image set.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the image classification method of any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the image classification method according to any one of claims 1 to 7.
CN202210855654.7A 2022-07-20 2022-07-20 Image classification method, device, equipment and storage medium Pending CN115205596A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210855654.7A CN115205596A (en) 2022-07-20 2022-07-20 Image classification method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210855654.7A CN115205596A (en) 2022-07-20 2022-07-20 Image classification method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115205596A true CN115205596A (en) 2022-10-18

Family

ID=83582976

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210855654.7A Pending CN115205596A (en) 2022-07-20 2022-07-20 Image classification method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115205596A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118155203A (en) * 2024-04-02 2024-06-07 济南餐农网络科技有限公司 Multistage filtering fresh food sorting method
CN118155203B (en) * 2024-04-02 2024-10-25 济南餐农网络科技有限公司 Multistage filtering fresh food sorting method

Similar Documents

Publication Publication Date Title
CN111652845B (en) Automatic labeling method and device for abnormal cells, electronic equipment and storage medium
WO2022110712A1 (en) Image enhancement method and apparatus, electronic device and computer readable storage medium
CN111695609B (en) Target damage degree judging method and device, electronic equipment and storage medium
CN112507923B (en) Certificate copying detection method and device, electronic equipment and medium
CN114627435B (en) Intelligent light adjusting method, device, equipment and medium based on image recognition
CN108764352A (en) Duplicate pages content detection algorithm and device
CN113610934B (en) Image brightness adjustment method, device, equipment and storage medium
CN112508145A (en) Electronic seal generation and verification method and device, electronic equipment and storage medium
CN114512085A (en) Visual color calibration method of TFT (thin film transistor) display screen
CN117455762A (en) Method and system for improving resolution of recorded picture based on panoramic automobile data recorder
CN113420684A (en) Report recognition method and device based on feature extraction, electronic equipment and medium
CN117315369A (en) Fundus disease classification method and device based on neural network
CN112507903A (en) False face detection method and device, electronic equipment and computer readable storage medium
CN115760854A (en) Deep learning-based power equipment defect detection method and device and electronic equipment
CN115205596A (en) Image classification method, device, equipment and storage medium
CN112541899B (en) Incomplete detection method and device of certificate, electronic equipment and computer storage medium
CN112561893B (en) Picture matching method and device, electronic equipment and storage medium
CN114463685A (en) Behavior recognition method and device, electronic equipment and storage medium
CN112183520A (en) Intelligent data information processing method and device, electronic equipment and storage medium
CN114742828B (en) Intelligent analysis method and device for workpiece damage assessment based on machine vision
CN115423716B (en) Image enhancement method, device, equipment and storage medium based on multidimensional filtering
CN113222873B (en) Image data enhancement method and device based on two-dimensional Gaussian distribution and storage medium
CN116013091B (en) Tunnel monitoring system and analysis method based on traffic flow big data
CN117934417A (en) Method, device, equipment and medium for identifying apparent defects of road based on neural network
CN118172313A (en) Cervical digital slice image quality monitoring method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination