CN114120243A - Gardenia stir-frying process monitoring method based on machine vision - Google Patents

Gardenia stir-frying process monitoring method based on machine vision

Info

Publication number
CN114120243A
CN114120243A (application CN202111472898.9A)
Authority
CN
China
Prior art keywords
gardenia
image
region
monitoring
color
Prior art date
Legal status
Granted
Application number
CN202111472898.9A
Other languages
Chinese (zh)
Other versions
CN114120243B (en)
Inventor
张村
章宇珍
王云
贾哲
程朋乐
Current Assignee
Institute of Materia Medica of CAMS
Beijing Forestry University
Original Assignee
Institute of Materia Medica of CAMS
Beijing Forestry University
Priority date
Filing date
Publication date
Application filed by Institute of Materia Medica of CAMS and Beijing Forestry University
Priority to CN202111472898.9A
Publication of CN114120243A
Application granted
Publication of CN114120243B
Active legal status (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a machine-vision-based method for monitoring the charring (scorch-frying) process of gardenia. It addresses the problem that existing methods cannot monitor the gardenia charring process in real time. The procedure is as follows: a gardenia image is acquired in real time and the gardenia region is extracted from it; the inner-surface region is then removed to obtain the segmented gardenia region; from this region, the gray-level mean and standard deviation of each single-channel image under the HSV and Lab color models are extracted to form a feature vector describing the gardenia color information; principal component analysis is applied to the feature vector, the first 5 principal components are fed into the model as input, and the model outputs the charring-state data. The method improves on existing machine-vision analysis approaches and realizes real-time monitoring of the gardenia charring process.

Description

Gardenia stir-frying process monitoring method based on machine vision
Technical Field
The invention belongs to the field of processing and monitoring of traditional Chinese medicinal materials, and particularly relates to a method for monitoring a gardenia charring process based on machine vision.
Background
Gardenia (Zhi Zi) is commonly used to treat feverish dysphoria, jaundice due to damp-heat, stranguria and other symptoms. Raw gardenia is bitter and cold in nature and easily injures the spleen and stomach; stir-frying weakens this bitter-cold nature and strengthens the blood-cooling and hemostatic effects, so stir-fried gardenia is widely used in clinical practice. During the processing of gardenia decoction pieces, the apparent color change is the main basis for judging the degree of processing, but visual color identification is easily influenced by subjective judgment and by objective conditions such as individual color perception and ambient light, and is difficult to measure and quantify accurately.
In recent years, many researchers have applied machine vision to the study of traditional Chinese medicine processing, so that subjectively fuzzy color judgments can be analyzed objectively and digitally. Gardenia must be crushed before frying, and the resulting fragments expose both the outer peel and the inner surface, whose colors differ greatly. To eliminate the influence of this difference on color analysis, existing machine-vision methods take fragments out during frying and grind them into powder before measuring color, which cannot meet the requirement of real-time monitoring of the gardenia charring process.
Disclosure of Invention
The invention aims to provide a machine-vision-based method for monitoring the gardenia charring process that solves the above problems.
The method comprises the following basic steps:
Step one, acquiring a gardenia image: operate according to the gardenia charring process, start timing after the gardenia decoction pieces are added, film the charring process in real time with a camera, and read the gardenia image R0 collected by the camera every 1 min;
Step two, extracting the gardenia region: convert the image R0 from the default RGB color space to HSV, set maximum and minimum thresholds for the H, S and V channels under the HSV color model, obtain a binary mask image M1 that extracts the gardenia region, and AND the mask M1 with the original image R0 to obtain the gardenia region R1;
Step three, removing the inner-surface region: for the gardenia region R1, set maximum and minimum thresholds for the R, G and B channels under the RGB color model, obtain a binary mask image M2 that removes the inner-surface region, and AND the mask M2 with the gardenia region R1 to remove the inner-surface region and keep the remaining region R2;
Step four, extracting color feature values: for the segmented gardenia image region R2, extract the gray-level mean and standard deviation of each single-channel image under the HSV and Lab color models, obtaining 12 feature variables that describe the gardenia color information and form a feature vector;
Step five, monitoring the charring state: apply principal component analysis to the color feature vector, feed the first 5 principal components into a data model trained by deep learning, and take the charring-state data output by the model.
For the mask M1 in step two: under the HSV color model, the color characteristics of the gardenia region are H1 < H < H2, S1 < S < S2, V1 < V < V2, where H1, S1, V1 are the minimum thresholds of the three channels and H2, S2, V2 are the maximum thresholds. Pixels of the original image R0 that satisfy all three channel threshold conditions simultaneously are marked 1, and all other pixels are marked 0, yielding the binary mask image M1 that extracts the gardenia region.
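By way of illustration, a minimal sketch of this HSV thresholding step is given below, assuming OpenCV and NumPy; the threshold values, the helper name extract_gardenia_region, and the variable names are illustrative and are not taken from the patent.

```python
# Minimal sketch of step two (not the patent's exact implementation): build the
# HSV mask M1 and AND it with the original image R0. The threshold values are
# illustrative placeholders, since H1..V2 are not disclosed in the patent, and
# OpenCV loads images in BGR order rather than RGB.
import cv2
import numpy as np

def extract_gardenia_region(r0_bgr,
                            lower_hsv=(5, 80, 60),      # placeholder (H1, S1, V1)
                            upper_hsv=(30, 255, 255)):  # placeholder (H2, S2, V2)
    hsv = cv2.cvtColor(r0_bgr, cv2.COLOR_BGR2HSV)
    # Pixels satisfying all three channel thresholds are set to 255 ("1"), others to 0.
    m1 = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    r1 = cv2.bitwise_and(r0_bgr, r0_bgr, mask=m1)       # gardenia region R1
    return m1, r1
```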
For the mask M2 in step three: under the RGB color model, the color characteristics of the gardenia inner-surface region are R1 < R < R2, G1 < G < G2, B1 < B < B2, where R1, G1, B1 are the minimum thresholds of the three channels and R2, G2, B2 are the maximum thresholds. Pixels of the image R1 that satisfy all three channel threshold conditions simultaneously are marked 1, and all other pixels are marked 0, yielding the binary mask image M2 from which the inner-surface region is removed.
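A corresponding sketch for the inner-surface removal follows, under the assumption that the mask marks the lighter inner-surface pixels and must therefore be inverted before the AND operation; the RGB thresholds and the helper name remove_inner_surface are again placeholders.

```python
# Minimal sketch of step three under an assumed interpretation: threshold the
# lighter inner-surface pixels in RGB, invert that mask, and keep region R2.
# lower_rgb/upper_rgb are illustrative placeholders for (R1, G1, B1)/(R2, G2, B2).
import cv2
import numpy as np

def remove_inner_surface(r1_bgr,
                         lower_rgb=(150, 120, 90),
                         upper_rgb=(255, 220, 200)):
    rgb = cv2.cvtColor(r1_bgr, cv2.COLOR_BGR2RGB)
    inner = cv2.inRange(rgb, np.array(lower_rgb), np.array(upper_rgb))  # inner-surface pixels
    m2 = cv2.bitwise_not(inner)                     # mask with the inner surface removed
    r2 = cv2.bitwise_and(r1_bgr, r1_bgr, mask=m2)   # remaining region R2
    return m2, r2
```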
In step four, when the color feature values are extracted, the gray-level mean m and the standard deviation δ of each single-channel image are calculated as

$m = \dfrac{1}{M}\sum_{i=0}^{L-1} i\,z_i$

$\delta = \sqrt{\dfrac{1}{M}\sum_{i=0}^{L-1} (i-m)^2\,z_i}$

where M is the total number of pixels of the gardenia image, i is the gray value, z_i is the number of pixels with gray value i, and L is the number of gray levels of the single-channel image.
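For illustration, a sketch of the feature extraction follows; it averages over the pixels retained by the segmentation mask, which is one reasonable reading of "the total number of pixels of the gardenia image" M in the formula above.

```python
# Minimal sketch of step four: the 12-dimensional colour feature vector
# (mean and standard deviation of each HSV and Lab channel over the masked pixels).
import cv2
import numpy as np

def color_features(r2_bgr, mask):
    feats = []
    for code in (cv2.COLOR_BGR2HSV, cv2.COLOR_BGR2LAB):
        converted = cv2.cvtColor(r2_bgr, code)
        for c in range(3):
            channel = converted[:, :, c][mask > 0].astype(np.float64)
            feats.append(channel.mean())  # gray-level mean m of this channel
            feats.append(channel.std())   # standard deviation delta of this channel
    return np.array(feats)                # 12 feature variables
```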
In step five, monitoring the charring state: first, a data model for identifying the charring state of gardenia is established by deep learning, using a large amount of feature data as training data. After the data model has been established, the image data and feature data collected in real time are fed into the model, and the model outputs the charring-state data. At the same time, the data model works in a self-learning manner, incorporating the newly input data into its training database so that the model is continuously improved.
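The patent does not specify the deep-learning framework or the network architecture, so the sketch below stands in for step five with scikit-learn: PCA keeps the first five principal components and a small multilayer perceptron plays the role of the charring-state model. The training arrays and the four-level state labels are synthetic placeholders, not data from the patent.

```python
# Minimal sketch of step five (assumed model choice, not the patent's network):
# PCA to 5 components followed by a small MLP classifier for the charring state.
# X_train/y_train are synthetic placeholders for historical feature vectors and labels.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

X_train = np.random.rand(200, 12)            # placeholder 12-D colour features
y_train = np.random.randint(0, 4, size=200)  # placeholder states, e.g. 0 = raw .. 3 = over-charred

model = make_pipeline(
    PCA(n_components=5),                     # first 5 principal components as model input
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000),
)
model.fit(X_train, y_train)
# At run time: state = model.predict(feature_vector.reshape(1, -1))
```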
The invention has the beneficial effects that: the real-time color analysis of the gardenia charring process is realized by combining machine vision and a traditional image processing algorithm, and the real-time monitoring of the gardenia charring process is further realized.
Drawings
FIG. 1 is a flow chart of the calculation process of the present invention.
Detailed Description
The foregoing and other features of the invention will become apparent from the following specification taken in conjunction with the accompanying drawings. In the description and drawings, particular embodiments of the invention have been disclosed in detail as being indicative of some of the embodiments in which the principles of the invention may be employed, it being understood that the invention is not limited to the embodiments described, but, on the contrary, is intended to cover all modifications, variations, and equivalents falling within the scope of the appended claims.
Step 101: operate according to the gardenia charring process flow, start timing after the gardenia decoction pieces are added, film the charring process in real time with a camera, and read the gardenia image R0 collected by the camera every 1 min.
Step 201: convert the image R0 from an RGB image to an HSV image;
Step 202: under the HSV color model, the color characteristics of the gardenia region are H1 < H < H2, S1 < S < S2, V1 < V < V2, where H1, S1, V1 are the minimum thresholds of the three channels and H2, S2, V2 are the maximum thresholds; pixels of the original image R0 that satisfy all three channel threshold conditions simultaneously are marked 1 and all other pixels are marked 0, yielding the binary mask image M1 that extracts the gardenia region;
Step 203: AND the binary mask image M1 from step 202 with the original image R0 to obtain the gardenia region R1.
Step 301: for the gardenia region R1, under the RGB color model the color characteristics of the gardenia inner-surface region are R1 < R < R2, G1 < G < G2, B1 < B < B2, where R1, G1, B1 are the minimum thresholds of the three channels and R2, G2, B2 are the maximum thresholds; pixels of the image R1 that satisfy all three channel threshold conditions simultaneously are marked 1 and all other pixels are marked 0, yielding the binary mask image M2 from which the inner-surface region is removed.
Step 302: AND the binary mask image M2 from step 301 with the gardenia region R1 to remove the inner-surface region and keep the remaining region R2.
Step 401: for the segmented gardenia image region R2, under the HSV and Lab color models, extract the gray-level mean m and the standard deviation δ of each single-channel image, calculated as

$m = \dfrac{1}{M}\sum_{i=0}^{L-1} i\,z_i$

$\delta = \sqrt{\dfrac{1}{M}\sum_{i=0}^{L-1} (i-m)^2\,z_i}$

where M is the total number of pixels of the gardenia image, i is the gray value, z_i is the number of pixels with gray value i, and L is the number of gray levels of the single-channel image;
and step 402, obtaining 12 characteristic variables describing the gardenia color information calculated in the step 401 to form a characteristic vector.
Step 501, firstly, a large amount of characteristic data are used as training data, and a data model for identifying the scorching state of the gardenia is established by adopting a deep learning technology;
step 502, after the data model is established, inputting the image data and the characteristic image data collected in real time into the data model, outputting the scorching state data by the model, and simultaneously, incorporating the newly input data into the training database of the data model by the data model in a self-learning mode so as to continuously perfect the data model.
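Putting the pieces together, a sketch of the real-time monitoring loop described in steps 101 through 502 is shown below; it reuses the helper functions and the model sketched earlier, reads one frame per sampling period, and periodically refits the model on the accumulated samples as a simple stand-in for the patent's self-learning update.

```python
# Minimal sketch of the real-time loop (steps 101-502), reusing the helpers and
# the model/X_train/y_train placeholders sketched above. The camera index,
# sampling period and refit interval are assumptions, not values from the patent.
import time
import cv2
import numpy as np

def monitor(camera_index=0, period_s=60, refit_every=10):
    cap = cv2.VideoCapture(camera_index)
    X_new, y_new = [], []
    while True:
        ok, frame = cap.read()                 # image R0 for this sampling period
        if not ok:
            break
        m1, r1 = extract_gardenia_region(frame)
        m2, r2 = remove_inner_surface(r1)
        region_mask = cv2.bitwise_and(m1, m2)  # gardenia pixels minus the inner surface
        x = color_features(r2, region_mask)
        state = model.predict(x.reshape(1, -1))[0]
        print(f"charring state: {state}")
        X_new.append(x)
        y_new.append(state)                    # pseudo-label; real labels would come from experts
        if len(X_new) % refit_every == 0:      # fold new samples back into the training set
            model.fit(np.vstack([X_train, np.array(X_new)]),
                      np.concatenate([y_train, np.array(y_new)]))
        time.sleep(period_s)
    cap.release()
```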

Claims (5)

1. A method for monitoring the gardenia charring process based on machine vision, comprising the following steps:
Step one, acquiring a gardenia image: operate according to the gardenia charring process, add the gardenia decoction pieces and start timing, film the charring process of the gardenia in real time with a camera, and read the gardenia image R0 collected by the camera every 1 min;
Step two, extracting the gardenia region: convert the image R0 from the default RGB color space to HSV, set maximum and minimum thresholds for the H, S and V channels under the HSV color model, obtain a binary mask image M1 that extracts the gardenia region, and AND the mask M1 with the original image R0 to obtain the gardenia region R1;
Step three, removing the inner-surface region: for the gardenia region R1, set maximum and minimum thresholds for the R, G and B channels under the RGB color model, obtain a binary mask image M2 that removes the inner-surface region, and AND the mask M2 with the gardenia region R1 to remove the inner-surface region and keep the remaining region R2;
Step four, extracting color feature values: for the segmented gardenia image region R2, extract the gray-level mean and standard deviation of each single-channel image under the HSV and Lab color models, obtaining 12 feature variables that describe the gardenia color information and form a feature vector;
Step five, monitoring the charring state: apply principal component analysis to the color feature vector, feed the first 5 principal components into a data model trained by deep learning, and take the charring-state data output by the model.
2. The method for monitoring the gardenia charring process based on machine vision according to claim 1, wherein for the mask M1 in step two: under the HSV color model, the color characteristics of the gardenia region are H1 < H < H2, S1 < S < S2, V1 < V < V2, where H1, S1, V1 are the minimum thresholds of the three channels and H2, S2, V2 are the maximum thresholds; pixels of the original image R0 that satisfy all three channel threshold conditions simultaneously are marked 1 and all other pixels are marked 0, yielding the binary mask image M1 that extracts the gardenia region.
3. The method for monitoring the gardenia charring process based on machine vision according to claim 1, wherein for the mask M2 in step three: under the RGB color model, the color characteristics of the gardenia inner-surface region are R1 < R < R2, G1 < G < G2, B1 < B < B2, where R1, G1, B1 are the minimum thresholds of the three channels and R2, G2, B2 are the maximum thresholds; pixels of the image R1 that satisfy all three channel threshold conditions simultaneously are marked 1 and all other pixels are marked 0, yielding the binary mask image M2 from which the inner-surface region is removed.
4. The method for monitoring the gardenia charring process based on machine vision according to claim 1, wherein in step four, when the color feature values are extracted, the gray-level mean m and the standard deviation δ of each single-channel image are calculated as

$m = \dfrac{1}{M}\sum_{i=0}^{L-1} i\,z_i$

$\delta = \sqrt{\dfrac{1}{M}\sum_{i=0}^{L-1} (i-m)^2\,z_i}$

where M is the total number of pixels of the gardenia image, i is the gray value, z_i is the number of pixels with gray value i, and L is the number of gray levels of the single-channel image.
5. The method for monitoring the gardenia charring process based on machine vision according to claim 1, wherein in step five, when monitoring the charring state, a data model for identifying the charring state of gardenia is first established by deep learning, using a large amount of feature data as training data; after the data model has been established, the image data and feature data collected in real time are fed into the data model, and the model outputs the charring-state data; at the same time, the data model incorporates the newly input data into its training database in a self-learning manner, so that the model is continuously improved.
CN202111472898.9A 2021-12-01 2021-12-01 Gardenia stir-frying process monitoring method based on machine vision Active CN114120243B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111472898.9A CN114120243B (en) 2021-12-01 2021-12-01 Gardenia stir-frying process monitoring method based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111472898.9A CN114120243B (en) 2021-12-01 2021-12-01 Gardenia stir-frying process monitoring method based on machine vision

Publications (2)

Publication Number Publication Date
CN114120243A (en) 2022-03-01
CN114120243B CN114120243B (en) 2023-04-07

Family

ID=80366542

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111472898.9A Active CN114120243B (en) 2021-12-01 2021-12-01 Gardenia stir-frying process monitoring method based on machine vision

Country Status (1)

Country Link
CN (1) CN114120243B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102589470A (en) * 2012-02-14 2012-07-18 大闽食品(漳州)有限公司 Fuzzy-neural-network-based tea leaf appearance quality quantification method
CN104931428A (en) * 2015-05-26 2015-09-23 南京中医药大学 On-line control method for preparation process of fructus gardenia
CN105079180A (en) * 2015-07-06 2015-11-25 天圣制药集团股份有限公司 Processing method of decoction pieces of cape jasmine fruit
CN110244605A (en) * 2019-05-29 2019-09-17 安徽华润金蟾药业股份有限公司 A kind of line Quality Control device and method of Chinese medicine frying
CN110956217A (en) * 2019-12-06 2020-04-03 广东美的白色家电技术创新中心有限公司 Food maturity recognition method and device and computer storage medium
CN112931643A (en) * 2021-03-17 2021-06-11 河南科技大学 Control method of tea frying robot and tea frying robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘玉杰 (Liu Yujie), "Study on Optimization of the Sand-Frying Processing Technology and Quality Evaluation of Ma Qian Zi (Strychni Semen)", China Master's Theses Full-Text Database, Medicine & Health Sciences *

Also Published As

Publication number Publication date
CN114120243B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
Barata et al. Improving dermoscopy image classification using color constancy
CN108010024B (en) Blind reference tone mapping image quality evaluation method
CN109044322A (en) A kind of contactless heart rate variability measurement method
WO2018098986A1 (en) Automatic detection system and method for tongue images in traditional chinese medicine
CN111860538A (en) Tongue color identification method and device based on image processing
CN105160346B (en) A kind of greasy recognition methods of curdy fur on tongue based on texture and distribution characteristics
CN105069131B (en) The capsule endoscope image search method of view-based access control model vocabulary and partial descriptor
CN110495888B (en) Standard color card based on tongue and face images of traditional Chinese medicine and application thereof
CN107330393A (en) A kind of neonatal pain expression recognition method based on video analysis
CN109242792B (en) White balance correction method based on white object
RAJA et al. Screening diabetic retinopathy in developing countries using retinal images
CN109241963B (en) Adaboost machine learning-based intelligent identification method for bleeding point in capsule gastroscope image
CN109975292A (en) A kind of atlantic salmon and rainbow trout method for quick identification based on machine vision
CN102136077B (en) Method for automatically recognizing lip color based on support vector machine
Wang et al. Facial image medical analysis system using quantitative chromatic feature
CN113130066A (en) Tongue diagnosis image identification method based on artificial intelligence
CN117788407A (en) Training method for glaucoma image feature extraction based on artificial neural network
CN114120243B (en) Gardenia stir-frying process monitoring method based on machine vision
Mahajan et al. Quality analysis of Indian basmati rice grains using top-hat transformation
CN109636864A (en) A kind of tongue dividing method and system based on color correction Yu depth convolutional neural networks
CN109711306B (en) Method and equipment for obtaining facial features based on deep convolutional neural network
Nagano et al. Development of a skin texture evaluation system using a convolutional neural network
Kanawong et al. ZHENG classification in Traditional Chinese Medicine based on modified specular-free tongue images
Kang et al. Dental plaque quantification using mean-shift-based image segmentation
CN112464871A (en) Deep learning-based traditional Chinese medicine tongue image processing method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant