CN113255847B - Tire wear degree prediction method based on a generative adversarial network - Google Patents
- Publication number: CN113255847B (application CN202110769828.3A)
- Authority
- CN
- China
- Prior art keywords
- tire
- image
- generator
- loss function
- forged
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F18/241 — Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415 — Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- G06N3/045 — Combinations of networks
- G06N3/047 — Probabilistic or stochastic networks
- G06N3/048 — Activation functions
- G06N3/08 — Learning methods
- G06T7/60 — Analysis of geometric attributes
- G06T2207/10004 — Still image; Photographic image
- G06T2207/20076 — Probabilistic image processing
- G06T2207/20081 — Training; Learning
- G06T2207/20084 — Artificial neural networks [ANN]
Abstract
The invention relates to a tire wear degree prediction method based on a generative adversarial network, characterized by comprising the following steps: S1: preprocessing a captured photograph of the tire side; S2: reconstructing the tire side image processed in S1 into a tire front image using the IST-GAN network model framework; S3: predicting the tire wear degree from the converted tire front image using the TWP prediction model framework to obtain a corresponding prediction conclusion. The invention requires no large amount of repeated manual measurement, saving labor cost; the tire wear degree can be predicted from a single side photograph of the vehicle tire, realizing remote prediction of tire wear, facilitating an advance plan for tire replacement, and saving time.
Description
Technical Field
The invention relates to the technical field of tread-pattern wear identification and measurement in the tire industry, and in particular to a tire wear degree prediction method based on a generative adversarial network.
Background
It is well known that friction between the tire and the road surface is the source of vehicle drive, braking and steering. The pattern designed into the tread of an automobile tire effectively improves the friction between the tire and the ground as well as the tire's water storage and drainage capacity, and aids heat dissipation of the tire.
When the depth of the tire pattern falls below a critical value, the friction between the tire and the ground decreases significantly and the tire's water storage and drainage capacity is greatly affected. When a water film exists on the road surface, hydroplaning readily occurs, posing a serious traffic safety hazard.
At present, there are two main methods for detecting tread pattern depth. The first, shown in FIG. 1, is to manually measure several main grooves on the same section of the tread with a tire tread depth gauge or vernier caliper and take the average. The second, shown in FIG. 2, scans the tire surface horizontally with a laser sensor; its disadvantage is the high cost of the laser, processor and other equipment, which prevents broad adoption. Moreover, neither technique can measure tread pattern depth remotely.
Under normal vehicle use, with the tire still mounted, a user generally cannot capture a strictly frontal image of the tire, but can easily obtain a side photograph with a mobile phone; such a side photograph, however, does not clearly reveal the tire's linear pattern grooves.
In view of the above, the present invention provides an image conversion method that can reconstruct a tire side image into the corresponding tire front image, thereby enabling remote, fast and simple prediction of the tire wear degree.
Disclosure of Invention
Based on the above technical background, the main object of the present invention is to provide a tire wear degree prediction method based on a generative adversarial network, which enables remote measurement: the tire wear degree is predicted from a captured tire side photograph, the tire wear condition is known in time, and the investment of manpower and material resources is effectively reduced.
In order to solve the problems, the technical scheme adopted by the invention is as follows:
a tire wear degree prediction method based on a generative adversarial network, characterized by comprising the following steps:
s1: preprocessing the shot pictures of the side surfaces of the tires;
s2: reconstructing the tire side image processed by the S1 into a tire front image by using an IST-GAN network model framework;
s3: and predicting the tire wear degree of the converted front image of the tire by using a TWP prediction model frame to obtain a corresponding prediction conclusion.
Further, the preprocessing of the tire side photograph in S1 means adjusting the format and pixel size of the photograph and converting its color to grayscale.
Further, the training process of the IST-GAN network model framework in S2 and the TWP prediction model framework in S3 both depend on the building of a tire sample data set, which includes:
taking a certain number of tires as samples, capturing, at the same position of each sample tire, a front photograph (perpendicular to the tire tread) and a side photograph (at a certain inclination angle to the tire tread) to obtain a series of tire front images and side images, and preprocessing the photographs to obtain a data set for training the IST-GAN network model framework;
measuring and recording the linear groove depth of each sample tire, setting three threshold labels as the classification basis according to the linear groove depth, and sorting the samples into three classification data sets of different wear degrees, namely replacement recommended, good and excellent, for training the TWP prediction model framework;
further, in order to ensure that the tire side image in S2 can be accurately reconstructed into a tire front image, the IST-GAN network model framework designs two cyclic conversion branches based on the two generators G1, G2, and the two generators G1, G2 are trained and optimized by means of the two discriminators D1, D2 in the two cyclic conversion branches;
the two loop transition branches comprise a forward loop consistency transition branch and a reverse loop consistency transition branch;
in the forward cycle-consistency conversion branch, the generator G1 takes the tire front image p as input and synthesizes a forged tire side image G1(p); the generator G2 then reconstructs a corresponding tire front image G2(G1(p)) from the forged tire side image. The forward cycle consistency formula is p → G1(p) → G2(G1(p)) ≈ p.
In the reverse cycle-consistency conversion branch, the generator G2 takes the tire side image s as input and synthesizes a forged tire front image G2(s); the generator G1 then reconstructs a corresponding tire side image G1(G2(s)) from the forged tire front image. The reverse cycle consistency formula is s → G2(s) → G1(G2(s)) ≈ s.
In order to ensure that each picture can be mapped to its target during the bidirectional cycle-consistency conversion, a cycle-consistency loss function is designed:
L_cyc(G1, G2) = E_{p∼I_P}[‖G2(G1(p)) − p‖₁] + E_{s∼I_s}[‖G1(G2(s)) − s‖₁]
wherein generator G1 implements the mapping from the tire front data set I_P to the tire side data set I_s; generator G2 implements the mapping from the tire side data set I_s to the tire front data set I_P; E[·] denotes expectation; p∼I_P denotes a tire front image randomly sampled from the data set I_P; s∼I_s denotes a tire side image randomly sampled from the data set I_s; and ‖·‖₁ denotes the L1 norm of a matrix. The corresponding generators G1 and G2 are learned by minimizing this loss.
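As an illustration only, the cycle-consistency loss above can be sketched in NumPy; the identity-mapping stand-ins for G1 and G2 are hypothetical and serve only to show that perfectly cycle-consistent generators yield zero loss:

```python
import numpy as np

def cycle_consistency_loss(G1, G2, front_batch, side_batch):
    """L_cyc(G1, G2) = E[||G2(G1(p)) - p||_1] + E[||G1(G2(s)) - s||_1]."""
    forward = np.mean(np.abs(G2(G1(front_batch)) - front_batch))
    backward = np.mean(np.abs(G1(G2(side_batch)) - side_batch))
    return forward + backward

# Toy stand-in generators: identity mappings are perfectly cycle-consistent,
# so the loss evaluates to exactly zero.
G1 = lambda x: x
G2 = lambda x: x
p = np.random.rand(4, 256, 256)  # batch of tire front images
s = np.random.rand(4, 256, 256)  # batch of tire side images
print(cycle_consistency_loss(G1, G2, p, s))  # → 0.0
```

In a real training run G1 and G2 would be neural networks, and this loss would be minimized jointly with the adversarial terms described below.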
During the forward cycle conversion, a forward adversarial loss function is designed between the generator G1 and the discriminator D1, so that D1 can compare the synthesized forged tire side image with real tire side images and thereby select the better-synthesized (i.e., lowest-loss) forged tire side image. The forward adversarial loss function is:
L_GAN(G1, D1, I_P, I_s) = E_{s∼I_s}[log D1(s)] + E_{p∼I_P}[log(1 − D1(G1(p)))]
During the reverse cycle conversion, a reverse adversarial loss function is designed between the generator G2 and the discriminator D2, so that D2 can compare the synthesized forged tire front image with real tire front images and thereby select the better-synthesized (i.e., lowest-loss) forged tire front image. The reverse adversarial loss function is:
L_GAN(G2, D2, I_s, I_P) = E_{p∼I_P}[log D2(p)] + E_{s∼I_s}[log(1 − D2(G2(s)))]
finally, a better IST-GAN network model is obtained by minimizing the overall target loss. The objective function of the IST-GAN network model framework is:
L(G1, G2, D1, D2) = L_GAN(G1, D1, I_P, I_s) + L_GAN(G2, D2, I_s, I_P) + λ·L_cyc(G1, G2)
The parameter λ controls the relative importance of the discriminator loss functions and the generator loss function: the larger λ is, the higher the weight of the generator's cycle-consistency loss, so reducing the cycle-consistency loss becomes more important; that is, during training the IST-GAN network model focuses more on reducing the generator loss.
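For illustration, the adversarial terms and the λ-weighted overall objective can be sketched as follows; the discriminator stand-in, the λ value and all numeric inputs are hypothetical, not taken from the patent:

```python
import numpy as np

def adversarial_loss(D, real_batch, fake_batch):
    """E[log D(real)] + E[log(1 - D(fake))], with a small eps for numerical stability."""
    eps = 1e-8
    return np.mean(np.log(D(real_batch) + eps)) + np.mean(np.log(1.0 - D(fake_batch) + eps))

def ist_gan_objective(gan_forward, gan_reverse, cycle, lam=10.0):
    """L(G1, G2, D1, D2) = L_GAN(G1, D1) + L_GAN(G2, D2) + lam * L_cyc(G1, G2)."""
    return gan_forward + gan_reverse + lam * cycle

# A discriminator that outputs 0.5 everywhere (cannot distinguish real from fake)
# contributes log(0.5) + log(0.5) per adversarial term.
D = lambda batch: np.full(batch.shape[0], 0.5)
x = np.zeros((4, 16))
gan_term = adversarial_loss(D, x, x)
print(ist_gan_objective(gan_term, gan_term, cycle=0.2, lam=10.0))
```

A larger `lam` makes the cycle term dominate the sum, which is exactly the weighting behavior described above.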
And testing the trained IST-GAN network model framework by using a test set, and outputting a reconstructed tire front image.
Further, the IST-GAN network model framework is trained with a learning rate of 2e-4, a batch size of 1 and 300 iteration epochs, using an Intel(R) Core i7-9700 CPU; the whole training process takes about 9 hours.
Further, the TWP prediction model framework in S3 consists of three input branches, three convolutional layers, a fully connected layer and a classification layer; the training process comprises the following steps:
s31, inputting, through the three input branches, the three labeled classification data sets representing different wear degrees;
s32, mapping the tire image reconstructed in the S2 to a hidden layer feature space through three layers of convolutional layers, and performing feature extraction;
s33, inputting the output feature tensor into the fully connected layer, and mapping the learned distributed feature representation to the sample label space;
and S34, classifying the tire wear degree of the tire into one of three different wear degrees through the classification layer.
Furthermore, during training, the model is corrected by recording the loss value and accuracy of each training epoch until training is complete.
Furthermore, the three convolutional layers are built by a convolution-pooling-activation rule, wherein the size of a convolution kernel is 5 x 5, the moving step length of the convolution kernel is 1, the number of the convolution kernels is 64, the pooling size is 2 x 2, the pooling step length is 2, and the pooling type is maximum pooling;
the convolutional layer and the fully-connected layer both utilize a nonlinear activation function LeakyReLU, and the classification layer utilizes a softmax activation function.
Further, softmax classification is used as the output layer, and cross entropy is selected as the loss function; the loss function value is calculated via softmax:
L = −Σ_{i=1}^{k} label_i · log(q_i)
where k is the number of categories, label denotes the label value of the input data, and q denotes the predicted value of the input data. Secondly, the effectiveness of the method is evaluated by the output accuracy:
accuracy = (1/N) · Σ_{j=1}^{N} 1(ŷ_j = y_j)
where y_j denotes the actual value of the input data, i.e., the true class of the tire front image, ŷ_j denotes the predicted class, and N is the number of test samples.
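A minimal NumPy sketch of the softmax output layer and the cross-entropy loss above; the logit values are made up purely for illustration:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))          # subtract max for numerical stability
    return e / e.sum()

def cross_entropy(label_onehot, q):
    """L = -sum_{i=1}^{k} label_i * log(q_i) over the k classes."""
    return -np.sum(label_onehot * np.log(q + 1e-12))

logits = np.array([2.0, 0.5, 0.1])     # hypothetical raw scores for the three wear classes
q = softmax(logits)
label = np.array([1.0, 0.0, 0.0])      # ground truth: "replacement recommended"
print(cross_entropy(label, q))
```

The loss approaches zero as the predicted distribution q concentrates on the true class, which is what minimizing it during TWP training encourages.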
The tire wear degree prediction method based on a generative adversarial network combines two network models, a GAN and a deep-learning classifier: the IST-GAN network model reconstructs samples, expands the sample set and converts image style, and combining the strengths of the two models improves accuracy and robustness. The IST-GAN network model adopts mutually verifying bidirectional cycle conversion; compared with a unidirectional cycle, the conversion between tire front data and tire side data is more realistic, so a better generator is obtained through training. The data set needs no manual annotation, which saves substantial labor cost and improves efficiency. For the user, the method predicts the tire wear condition from a single photograph of the tire side, which greatly improves convenience while reducing the after-sales service cost of tire companies.
Drawings
FIG. 1 is a schematic view of manually measuring tire groove depth using a tire tread depth gauge;
FIG. 2 is a schematic view of measuring tire groove depth using a laser;
FIG. 3 is a captured tire front image and tire side image;
FIG. 4 is a flowchart of an image style conversion method framework IST-GAN;
fig. 5 is a flowchart of a tire wear level prediction method framework TWP.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
In daily life, under normal vehicle use and with the tire still mounted, a user generally cannot capture a strictly frontal image of the vehicle tire, but can usually obtain a side photograph easily with a mobile phone; such a side photograph, however, does not clearly reveal the tire's linear pattern grooves. Based on this, the tire wear degree prediction method based on a generative adversarial network of this embodiment involves an image style conversion method by which a corresponding tire front image can be reconstructed from a tire side image, thereby assisting in the prediction of the tire wear degree.
Furthermore, without specialized measuring tools it is difficult to determine the groove depth of a tire, and thus its degree of wear. The method of this embodiment therefore also involves a simple and efficient tire wear degree prediction method that predicts the wear degree from the reconstructed tire front image, helping the vehicle owner learn the tire wear condition before visiting a repair shop.
The image style conversion method described in this embodiment is mainly to construct a framework based on a generation countermeasure network, which is defined as an IST-GAN network model framework in this embodiment, and output a reconstructed tire front image by training the model framework.
The tire wear degree prediction method described in this embodiment mainly builds a deep-learning-based framework, the TWP prediction model framework, trains it on three classification data sets of different wear degrees, and classifies the reconstructed tire front image input, thereby predicting which of the three classes the tire belongs to: replacement recommended, good or excellent.
Since the training of the above two model frames must depend on a certain number of data sets, before the training of the two model frames, a tire sample data set needs to be established first, and the establishing method of the tire sample data set includes:
taking a certain number of tires as samples, capturing, at the same position of each sample tire, a front photograph (perpendicular to the tire tread) and a side photograph (at a certain inclination angle to the tire tread), and preprocessing the photographs. The preprocessing unifies the format and pixel size of the collected tire front and side images (in this embodiment, they are batch-cropped to 256 × 256 pixels) and converts the processed images to grayscale so as to meet the training requirements of the subsequent frameworks. Of the processed tire front and side images, 80% of the image data is used as the training set and the remaining 20% as the test set for training the IST-GAN network model framework;
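A minimal sketch of the described preprocessing, assuming a simple center crop and standard luminosity weights for the grayscale conversion (neither detail is specified in the embodiment):

```python
import numpy as np

def preprocess(rgb, out_size=256):
    """Center-crop an H x W x 3 image to out_size x out_size and convert to grayscale."""
    h, w, _ = rgb.shape
    top, left = (h - out_size) // 2, (w - out_size) // 2
    crop = rgb[top:top + out_size, left:left + out_size]
    # Luminosity weights are an assumption; the embodiment only states "convert to gray".
    return crop @ np.array([0.299, 0.587, 0.114])

photo = np.random.rand(300, 400, 3)   # a stand-in for a captured tire photograph
gray = preprocess(photo)
print(gray.shape)  # → (256, 256)
```

In practice one would run every collected front and side photograph through such a function before splitting the data 80/20 into training and test sets.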
measuring and recording the linear groove depth of each sample tire, setting three threshold labels as the classification basis according to the linear groove depth, and sorting the samples into three classification data sets of different wear degrees, namely replacement recommended, good and excellent, for training the TWP prediction model framework. In this embodiment, taking a passenger car as an example, a tire with a linear groove depth in the range 1.6-3.5 mm is defined as replacement recommended, 3.5-6 mm as good, and 6-8 mm as excellent; for other vehicle types such as trucks, the thresholds set by the national standard may be used for classification.
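The passenger-car threshold labels can be expressed as a small lookup; the exact boundary handling (half-open intervals) is an assumption, since the embodiment only gives the ranges:

```python
def wear_class(groove_depth_mm):
    """Map a passenger-car linear groove depth (mm) to the three labels defined above."""
    if 1.6 <= groove_depth_mm < 3.5:
        return "replacement recommended"
    if 3.5 <= groove_depth_mm < 6.0:
        return "good"
    if 6.0 <= groove_depth_mm <= 8.0:
        return "excellent"
    raise ValueError("depth outside the passenger-car ranges given in the embodiment")

print(wear_class(2.0))   # → replacement recommended
print(wear_class(7.0))   # → excellent
```

Other vehicle types would use the same structure with the thresholds substituted from the applicable national standard.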
The tire wear degree prediction method based on a generative adversarial network of this embodiment comprises the following steps:
s1: preprocessing a shot tire side photo, wherein the preprocessing comprises adjusting the format and pixels of the photo, and converting the color of the processed image into gray;
s2: reconstructing the tire side image processed by the S1 into a tire front image by using an IST-GAN network model framework;
s3: and predicting the tire wear degree of the converted front image of the tire by using a TWP prediction model frame to obtain a corresponding prediction conclusion.
In this embodiment, in order to ensure that the tire side image in S2 can accurately reconstruct the tire front image, the IST-GAN network model framework designs two cyclic conversion branches based on the two generators G1 and G2, and the two generators G1 and G2 are trained and optimized by means of the two discriminators D1 and D2 in the two cyclic conversion branches;
the two loop transition branches comprise a forward loop consistency transition branch and a reverse loop consistency transition branch;
in the forward cycle-consistency conversion branch, the generator G1 takes the tire front image p as input and synthesizes a forged tire side image G1(p); the generator G2 then reconstructs a corresponding tire front image G2(G1(p)) from the forged tire side image. The forward cycle consistency formula is p → G1(p) → G2(G1(p)) ≈ p.
In the reverse cycle-consistency conversion branch, the generator G2 takes the tire side image s as input and synthesizes a forged tire front image G2(s); the generator G1 then reconstructs a corresponding tire side image G1(G2(s)) from the forged tire front image. The reverse cycle consistency formula is s → G2(s) → G1(G2(s)) ≈ s.
In order to ensure that each picture can be mapped to its target during the bidirectional cycle-consistency conversion, a cycle-consistency loss function is designed:
L_cyc(G1, G2) = E_{p∼I_P}[‖G2(G1(p)) − p‖₁] + E_{s∼I_s}[‖G1(G2(s)) − s‖₁]
wherein generator G1 implements the mapping from the tire front data set I_P to the tire side data set I_s; generator G2 implements the mapping from the tire side data set I_s to the tire front data set I_P; E[·] denotes expectation; p∼I_P denotes a tire front image randomly sampled from the data set I_P; s∼I_s denotes a tire side image randomly sampled from the data set I_s; and ‖·‖₁ denotes the L1 norm of a matrix. The corresponding generators G1 and G2 are learned by minimizing this loss.
During the forward cycle conversion, a forward adversarial loss function is designed between the generator G1 and the discriminator D1, so that D1 can compare the synthesized forged tire side image with real tire side images and thereby select the better-synthesized (i.e., lowest-loss) forged tire side image. The forward adversarial loss function is:
L_GAN(G1, D1, I_P, I_s) = E_{s∼I_s}[log D1(s)] + E_{p∼I_P}[log(1 − D1(G1(p)))]
During the reverse cycle conversion, a reverse adversarial loss function is designed between the generator G2 and the discriminator D2, so that D2 can compare the synthesized forged tire front image with real tire front images and thereby select the better-synthesized (i.e., lowest-loss) forged tire front image. The reverse adversarial loss function is:
L_GAN(G2, D2, I_s, I_P) = E_{p∼I_P}[log D2(p)] + E_{s∼I_s}[log(1 − D2(G2(s)))]
Finally, a better IST-GAN network model is obtained by minimizing the overall target loss. The objective function of the IST-GAN network model framework is:
L(G1, G2, D1, D2) = L_GAN(G1, D1, I_P, I_s) + L_GAN(G2, D2, I_s, I_P) + λ·L_cyc(G1, G2)
The parameter λ controls the relative importance of the discriminator loss functions and the generator loss function: the larger λ is, the higher the weight of the generator's cycle-consistency loss, so reducing the cycle-consistency loss becomes more important; that is, during training the IST-GAN network model focuses more on reducing the generator loss.
And testing the trained IST-GAN network model framework by using a test set, and outputting a reconstructed tire front image.
The IST-GAN network model framework of this embodiment is trained with a learning rate of 2e-4, a batch size of 1 and 300 iteration epochs, using an Intel(R) Core i7-9700 CPU; the whole training process takes about 9 hours.
In this embodiment, the TWP prediction model framework in S3 consists of three input branches, three convolutional layers, a fully connected layer and a classification layer; the model framework is trained on three classification data sets of different wear degrees and classifies the reconstructed tire front image input, thereby predicting whether the tire is replacement recommended, good or excellent.
The training process comprises the following steps:
s31, inputting, through the three input branches, the three labeled classification data sets representing different wear degrees;
s32, mapping the tire image reconstructed in the S2 to a hidden layer feature space through three layers of convolutional layers, and performing feature extraction;
s33, inputting the output feature tensor into the fully connected layer, and mapping the learned distributed feature representation to the sample label space;
s34, classifying the image into one of the three wear degrees through the classification layer: if the tire linear groove depth falls in the range 1.6-3.5 mm, replacement of the tire is recommended; if it falls in the range 3.5-6 mm, the tire is good; and if it falls in the range 6-8 mm, the tire is excellent. The tire wear degree is thus obtained.
Furthermore, during training, the model is corrected by recording the loss value and accuracy of each training epoch until training is complete.
Furthermore, the three convolutional layers are built by a convolution-pooling-activation rule, wherein the size of a convolution kernel is 5 x 5, the moving step length of the convolution kernel is 1, the number of the convolution kernels is 64, the pooling size is 2 x 2, the pooling step length is 2, and the pooling type is maximum pooling;
the convolutional layer and the fully-connected layer both utilize a nonlinear activation function LeakyReLU, and the classification layer utilizes a softmax activation function.
Further, softmax classification is used as the output layer, and cross entropy is selected as the loss function; the loss function value is calculated via softmax:
L = −Σ_{i=1}^{k} label_i · log(q_i)
where k is the number of categories, label denotes the label value of the input data, and q denotes the predicted value of the input data. Secondly, the effectiveness of the method is evaluated by the output accuracy:
accuracy = (1/N) · Σ_{j=1}^{N} 1(ŷ_j = y_j)
where y_j denotes the actual value of the input data, i.e., the true class of the tire front image, ŷ_j denotes the predicted class, and N is the number of test samples.
According to the tire wear degree prediction method based on a generative adversarial network, the user only needs to photograph the tire side to obtain the tire wear degree, which makes it convenient to plan tire replacement in advance, greatly saves labor and material cost, and saves time and effort.
In summary, although the present invention has been described with reference to the preferred embodiments, it should be understood that various changes and modifications can be made by those skilled in the art without departing from the spirit and scope of the invention.
Claims (5)
1. A tire wear degree prediction method based on a generative adversarial network, characterized by comprising the following steps:
s1: preprocessing the shot pictures of the side surfaces of the tires;
s2: reconstructing the tire side image processed by the S1 into a tire front image by using an IST-GAN network model framework;
s3: predicting the tire wear degree of the converted front image of the tire by using a TWP prediction model frame to obtain a corresponding prediction conclusion;
in S2, the IST-GAN network model framework designs two cycle conversion branches based on two generators G1 and G2, and the two generators G1 and G2 are trained and optimized by means of two discriminators D1 and D2 within the two cycle conversion branches;
the two cycle conversion branches comprise a forward cycle-consistency conversion branch and a reverse cycle-consistency conversion branch;
in the forward cycle-consistency conversion branch, the generator G1 takes the tire front image p as input and synthesizes a forged tire side image G1(p), and the generator G2 reconstructs a corresponding tire front image G2(G1(p)) from the forged tire side image; the forward cycle consistency condition is p → G1(p) → G2(G1(p)) ≈ p;
in the reverse cycle-consistency conversion branch, the generator G2 takes the tire side image s as input and synthesizes a forged tire front image G2(s), and the generator G1 reconstructs a corresponding tire side image G1(G2(s)) from the forged tire front image; the reverse cycle consistency condition is s → G2(s) → G1(G2(s)) ≈ s;
in order to ensure that each picture can be mapped to its target during the bidirectional cycle-consistency conversion, a cycle-consistency loss function is designed:

L_cyc(G1, G2) = E_{p∼I_P}[‖G2(G1(p)) − p‖₁] + E_{s∼I_s}[‖G1(G2(s)) − s‖₁]

wherein the generator G1 implements the mapping from the tire front-face data set I_P to the tire side data set I_s; the generator G2 implements the mapping from the tire side data set I_s to the tire front-face data set I_P; E[·] represents the expectation; p represents a tire front image randomly sampled from I_P; s represents a tire side image randomly sampled from I_s; and ‖·‖₁ represents the L1 norm of a matrix; the corresponding generators G1 and G2 are learned by minimizing this loss;
in the forward cycle conversion process, a forward-cycle adversarial loss function is designed between the generator G1 and the discriminator D1, so that the discriminator D1 can compare the synthesized forged tire side image with the input tire side image and thereby select the forged tire side image with the smallest synthesis loss; the forward-cycle adversarial loss function is:

L_GAN(G1, D1, I_P, I_s) = E_{s∼I_s}[log D1(s)] + E_{p∼I_P}[log(1 − D1(G1(p)))]
in the reverse cycle conversion process, a reverse-cycle adversarial loss function is designed between the generator G2 and the discriminator D2, so that the discriminator D2 can compare the synthesized forged tire front image with the input tire front image and thereby select the forged tire front image with the smallest synthesis loss; the reverse-cycle adversarial loss function is:

L_GAN(G2, D2, I_s, I_P) = E_{p∼I_P}[log D2(p)] + E_{s∼I_s}[log(1 − D2(G2(s)))]
finally, the IST-GAN network model is obtained by minimizing the overall target loss; the objective function of the IST-GAN network model framework is:

L(G1, G2, D1, D2) = L_GAN(G1, D1, I_P, I_s) + L_GAN(G2, D2, I_s, I_P) + λ · L_cyc(G1, G2)

the relative importance of the discriminator loss and the generator loss is controlled by the parameter λ; the larger the value of λ, the higher the weight of the generator's cycle-consistency loss;
the training process of the IST-GAN network model framework in S2 and the TWP prediction model framework in S3 both depend on the establishment of a tire sample data set, which includes:
taking a certain number of tires as samples, capturing a front photo and a side photo at the same position of each sample tire to obtain a series of tire front images and tire side images, and preprocessing the photos to obtain a data set for training the IST-GAN network model framework;
measuring and recording the groove depth of each sample tire, setting three threshold labels according to the groove depth as the classification basis, and dividing the samples into three labeled data sets of different wear degrees, namely tire replacement recommended, tire in good condition, and tire in excellent condition, which are used for training the TWP prediction model framework;
the TWP prediction model framework in S3 is composed of three input branches, three convolutional layers, a fully-connected layer, and a classification layer; its training process comprises the following steps:
S31: inputting the three labeled classification data sets representing different wear degrees through the three input branches;
S32: mapping the tire image reconstructed in S2 to a hidden-layer feature space through the three convolutional layers and performing feature extraction;
S33: inputting the output feature tensor into the fully-connected layer, which maps the learned distributed feature representation to the sample label space;
S34: classifying the tire into one of the three different wear degrees through the classification layer.
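By way of a non-limiting numerical illustration of the cycle-consistency loss in claim 1, the loss can be sketched with toy linear "generators" standing in for G1 and G2 (the functions and data below are purely hypothetical):

```python
import numpy as np

def l1_norm(x):
    # L1 norm of a matrix, as used in the cycle-consistency loss
    return np.abs(x).sum()

def cycle_consistency_loss(G1, G2, front_images, side_images):
    # forward cycle: p -> G1(p) -> G2(G1(p)) should recover p
    forward = np.mean([l1_norm(G2(G1(p)) - p) for p in front_images])
    # reverse cycle: s -> G2(s) -> G1(G2(s)) should recover s
    reverse = np.mean([l1_norm(G1(G2(s)) - s) for s in side_images])
    return forward + reverse

# toy "generators" that are exact inverses, so the cycle loss is zero
G1 = lambda x: 2.0 * x + 1.0     # stands in for the front -> side mapping
G2 = lambda x: (x - 1.0) / 2.0   # stands in for the side -> front mapping
front = [np.ones((4, 4)), np.zeros((4, 4))]
side = [np.full((4, 4), 3.0)]
print(cycle_consistency_loss(G1, G2, front, side))  # 0.0
```

Real generators are convolutional networks trained jointly with the two adversarial terms; the parameter λ then weights this cycle-consistency term in the overall objective.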
2. The tire wear degree prediction method based on a generative adversarial network of claim 1, wherein
in S1, preprocessing the captured tire side picture means adjusting the format and pixel dimensions of the tire side image and converting the color image to grayscale.
3. The tire wear degree prediction method based on a generative adversarial network of claim 1, wherein
during training, the model is corrected by recording the loss value and accuracy of each training iteration until training is finished.
4. The tire wear degree prediction method based on a generative adversarial network of claim 1, wherein
the three convolutional layers are built following a convolution-pooling-activation pattern, wherein the convolution kernel size is 5 × 5, the convolution stride is 1, the number of convolution kernels is 64, the pooling size is 2 × 2, the pooling stride is 2, and the pooling type is max pooling;
the convolutional layers and the fully-connected layer both use the nonlinear activation function LeakyReLU, and the classification layer uses the softmax activation function.
5. The tire wear degree prediction method based on a generative adversarial network of claim 1, wherein
softmax classification is used as the output layer, cross entropy is selected as the loss function, and the loss value is computed from the softmax output:

Loss = −Σ_{i=1}^{k} label_i · log(q_i)

where k is the number of categories, label represents the tag value of the input data, and q represents the predicted value of the input data; secondly, the effectiveness of the method is evaluated by the output accuracy:

Accuracy = (number of samples whose predicted category matches the tag value) / (total number of samples)
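As a non-limiting check of the feature-map dimensions implied by claim 4 (5 × 5 kernels with stride 1, 2 × 2 max pooling with stride 2; valid padding and a 128 × 128 input resolution are assumptions not specified in the claims), the size after each convolution-pooling block can be traced:

```python
def conv_out(size, kernel=5, stride=1, pad=0):
    # output side length of a square convolution (valid padding assumed)
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    # output side length after max pooling
    return (size - kernel) // stride + 1

size = 128  # hypothetical input resolution
for block in range(1, 4):
    size = pool_out(conv_out(size))
    print(f"after conv-pool block {block}: {size} x {size} x 64")
# feature maps shrink to 62, 29, and 12 over the three blocks
```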
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110769828.3A CN113255847B (en) | 2021-07-08 | 2021-07-08 | Tire wear degree prediction method based on generation of countermeasure network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113255847A CN113255847A (en) | 2021-08-13 |
CN113255847B true CN113255847B (en) | 2021-10-01 |
Family
ID=77190851
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114663386B (en) * | 2022-03-21 | 2024-08-06 | 东南大学 | Water film removing method for airport pavement disease image |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101040637B1 (en) * | 2008-12-04 | 2011-06-13 | 한국타이어 주식회사 | Tire abrasion virtual test method and apparatus thereof |
JP7132701B2 (en) * | 2017-08-10 | 2022-09-07 | 株式会社ブリヂストン | Tire image recognition method and tire image recognition device |
US10730352B2 (en) * | 2018-02-22 | 2020-08-04 | Ford Global Technologies, Llc | System and method for tire wear prognostics |
CN110059751A (en) * | 2019-04-19 | 2019-07-26 | 南京链和科技有限公司 | A kind of tire code and tire condition recognition methods based on machine learning |
CN111976389B (en) * | 2020-08-03 | 2021-09-21 | 清华大学 | Tire wear degree identification method and device |
CN112270402A (en) * | 2020-10-20 | 2021-01-26 | 山东派蒙机电技术有限公司 | Training method and system for tire wear identification model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||