CN115631488A - Jetson Nano-based fruit maturity nondestructive testing method and system - Google Patents

Jetson Nano-based fruit maturity nondestructive testing method and system

Info

Publication number: CN115631488A
Application number: CN202211270167.0A
Authority: CN (China)
Legal status: Pending
Prior art keywords: fruit, maturity, image, information, fruits
Priority and filing date: 2022-10-18
Publication date: 2023-01-20
Other languages: Chinese (zh)
Inventors: 项新建, 周焜, 褚银泽, 姚佳娜
Current and original assignee: Zhejiang Lover Health Science and Technology Development Co Ltd

Classifications

    All under G (Physics) / G06 (Computing; calculating or counting) / G06V (Image or video recognition or understanding):
    • G06V20/68: Scenes; scene-specific elements; type of objects; food, e.g. fruit or vegetables
    • G06V10/26: Image preprocessing; segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
    • G06V10/764: Pattern recognition or machine learning; classification, e.g. of video objects
    • G06V10/774: Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V10/82: Pattern recognition or machine learning using neural networks
    • G06V20/40: Scenes; scene-specific elements in video content

Abstract

The invention belongs to the technical field of fruit maturity detection, and particularly relates to a Jetson Nano-based fruit maturity nondestructive detection method and system. The method comprises the following steps: S1, constructing a fruit maturity data set by an automatic labeling method and training the target detection network YOLOv4-tiny; S2, shooting fruit videos or images in a natural environment and, if a video is shot, extracting its frames as images; S3, feeding the images from step S2 into the trained YOLOv4-tiny network for feature extraction and classification prediction to obtain the maturity information of the fruit; S4, displaying the obtained maturity information on a display. The method enables rapid, image-based detection of fruit maturity, has strong model robustness, and effectively avoids the subjective influence of manual inspection.

Description

Jetson Nano-based fruit maturity nondestructive testing method and system
Technical Field
The invention belongs to the technical field of fruit maturity detection, and particularly relates to a Jetson Nano-based fruit maturity nondestructive detection method and system.
Background
At present, fruit maturity detection is an important component of intelligent agriculture. Traditional machine learning methods segment and preprocess the image, extract features, and judge maturity from a threshold or color-difference value. Although effective, they suffer from the following problems: 1. background interference is strong and the available data sets are small, so model robustness is poor; 2. recognition is slow, because features can usually be extracted only after the image has been preprocessed; 3. deployment is difficult: traditional machine learning methods compute slowly and place demands on the recognition environment, so they are generally deployed on a production line rather than on embedded devices.
Therefore, it is necessary to design a Jetson Nano-based method and system for nondestructive fruit maturity testing that detects maturity rapidly from images, has strong model robustness, and effectively avoids the subjective influence of manual inspection. Jetson Nano is an embedded AI platform.
For example, Chinese patent application No. CN202210200381.2 describes a fruit ripeness detection method and system based on a YOLOv4 model and a convolutional neural network. The method comprises the following steps: S1, inputting training pictures into a fruit recognition unit based on a YOLOv4 model; S2, feeding a training data set from the fruit recognition unit into a fruit ripeness detection unit to train its convolutional-neural-network deep learning model; S3, shooting pictures with a camera and a Raspberry Pi and inputting them into the fruit recognition unit; S4, inputting the image into the fruit ripeness detection unit, which outputs whether the fruit is ripe. That application combines a ripeness detection model with the YOLOv4 model and uses the lightweight YOLOv4-tiny structure as the recognition network to improve detection precision and speed, so it can be applied to everyday fruit ripeness detection and fruit storage. Its drawback is that data-set training is still needed, and a manually labeled data set still suffers from inconsistent subjective judgment between annotators and low efficiency.
Disclosure of Invention
To overcome the problems of the prior art, in which the deep learning algorithms used for fruit maturity detection still need data-set training and manually labeled data sets suffer from inconsistent subjective judgment and low efficiency, the invention provides a Jetson Nano-based fruit maturity nondestructive testing method and system that detects maturity rapidly from images, has strong model robustness, and effectively avoids the subjective influence of manual inspection.
In order to achieve the purpose of the invention, the invention adopts the following technical scheme:
A fruit maturity nondestructive testing method based on Jetson Nano comprises the following steps:
S1, constructing a fruit maturity data set by an automatic labeling method, and training the target detection network YOLOv4-tiny;
S2, shooting fruit videos or images in a natural environment; if a video is shot, extracting its frames as images;
S3, inputting the images extracted in step S2 into the trained YOLOv4-tiny target detection network for feature extraction and classification prediction to obtain the maturity information of the fruit;
S4, displaying the obtained maturity information of the fruits on a display.
Preferably, step S1 includes the steps of:
S11, collecting single-background fruit images with known maturity and fruit images with unknown maturity in a natural environment;
S12, segmenting the single-background fruit images with known maturity, extracting features, and determining a maturity judgment standard;
S13, supplementing label information for the fruit images with unknown maturity in the natural environment according to the maturity judgment standard obtained in step S12, to obtain a fruit maturity data set in the natural environment;
S14, training the lightweight target detection network YOLOv4-tiny on the natural-environment fruit maturity data set obtained in step S13.
Preferably, step S12 includes the steps of:
S121, segmenting the single-background fruit images with known maturity and removing background information;
S122, converting the images from the RGB color space to the Lab color space;
S123, performing feature extraction on the color-space-converted images with the K-means clustering algorithm and using the extracted features as the fruit maturity classification standard.
Preferably, step S13 includes the steps of:
S131, recognizing and locating the fruits in the image with a target detection algorithm and obtaining fruit coordinate information;
S132, cutting out and mapping the fruits in the image according to the obtained fruit coordinate information;
S133, classifying the ripeness of the mapped fruits by the fruit ripeness classification standard and updating the label file.
Preferably, the specific process of step S12 is as follows:
S121, converting the input image into a gray-scale image and segmenting the fruit from it by the Otsu method. Let the threshold be K; the gray-scale image is divided into two classes,
C1 = {0, 1, …, K} and C2 = {K+1, …, 255}.
Let p1 and p2 be the probabilities of a pixel falling into C1 and C2 respectively, let M_C1 and M_C2 be the mean pixel values of C1 and C2, and let M be the global pixel mean. The between-class variance is obtained from
σ² = p1·(M_C1 − M)² + p2·(M_C2 − M)²,
which simplifies to
σ² = p1·p2·(M_C1 − M_C2)².
Traverse K over the range 0 to 255; the K that maximizes σ² is the threshold.
S122, converting the segmented image from the RGB color space to the Lab color space. Record the pixel mean of the a channel as M_A, likewise the mean of the b channel as M_B, and the color ratio a/b as M_C. Cluster centers are then computed with the K-means clustering algorithm; since 3 maturity levels are to be distinguished, 3 cluster centers are needed. The formulas are:
dis(Mc_i, C_j) = √( Σ_t (Mc_it − C_jt)² )
C_jt = (1 / |S_j|) · Σ_{Mc_i ∈ S_j} Mc_it
where Mc_i denotes the i-th object (1 ≤ i ≤ n), C_j the j-th cluster center, Mc_it the t-th attribute of the i-th object, C_jt the t-th attribute of the j-th center, and |S_i| the number of objects in the i-th cluster; each cluster center corresponds to one maturity level.
The distance from each object to each cluster center is compared in turn and the object is assigned to the cluster of the nearest center, yielding 3 clusters {S_1, S_2, S_3}.
Preferably, the specific process of step S13 is as follows:
Training a depth model that only performs recognition without detecting maturity; recognizing the images of the unclassified fruit data set with the target detection algorithm; mapping each recognized fruit image back to step S122 for maturity clustering; and creating the training-set labels from the recognized position information and the obtained maturity information.
The invention also provides a Jetson Nano-based fruit maturity nondestructive testing system, comprising:
a data construction and training module for constructing a fruit maturity data set by an automatic labeling method and training the target detection network YOLOv4-tiny;
an image acquisition module for shooting fruit videos or images in a natural environment and, if a video is shot, extracting its frames as images;
a fruit maturity information acquisition module for inputting the extracted images into the trained YOLOv4-tiny target detection network for feature extraction and classification prediction to obtain the maturity information of the fruit;
an information display module for displaying the obtained maturity information of the fruits on a display.
Compared with the prior art, the invention has the following beneficial effects: (1) the invention provides an automatic labeling method based on machine learning, and the deep learning model trained on a data set produced by this method gives the target detection algorithm a maturity detection function, realizing real-time detection of fruit maturity; (2) the invention realizes rapid, image-based detection of fruit maturity; (3) the adopted model is robust and can recognize fruits in a natural environment; (4) the invention effectively avoids the subjective influence of manual inspection.
Drawings
FIG. 1 is a flow chart of a fruit ripeness detection and identification algorithm of the present invention;
FIG. 2 is a flow chart of an automatic labeling method of the present invention;
FIG. 3 is a schematic representation of a graded fruit ripeness image of the present invention;
FIG. 4 is a schematic representation of an image of an unfractionated fruit dataset according to the present invention;
FIG. 5 is a hardware structure diagram of a Jetson Nano-based fruit maturity non-destructive inspection system according to the present invention.
Detailed Description
To illustrate the embodiments of the present invention more clearly, they are described below with reference to the accompanying drawings. Obviously, the drawings in the following description are only some examples of the invention; a person skilled in the art can derive other drawings and embodiments from them without inventive effort.
Embodiment:
As shown in FIG. 1, the invention provides a Jetson Nano-based fruit maturity nondestructive testing method comprising the following steps:
S1, constructing a fruit maturity data set by an automatic labeling method, and training the target detection network YOLOv4-tiny;
S2, shooting fruit videos or images in a natural environment; if a video is shot, extracting its frames as images;
S3, inputting the images extracted in step S2 into the trained YOLOv4-tiny target detection network for feature extraction and classification prediction to obtain the maturity information of the fruit;
S4, displaying the obtained maturity information of the fruits on a display.
In addition, picking personnel can run detection in real time, or after capturing images, via the touch-operated LCD screen.
The ripeness of a fruit is classified as unripe, semi-ripe or ripe.
Step S1 includes the steps of:
S11, collecting single-background fruit images with known maturity and fruit images with unknown maturity in a natural environment;
S12, segmenting the single-background fruit images with known maturity, extracting features, and determining a maturity judgment standard;
S13, supplementing label information for the fruit images with unknown ripeness in the natural environment according to the ripeness judgment standard obtained in step S12, to obtain a fruit ripeness data set in the natural environment;
S14, training the lightweight target detection network YOLOv4-tiny on the natural-environment fruit maturity data set obtained in step S13.
As shown in fig. 2, step S12 includes the steps of:
S121, segmenting the single-background fruit images with known maturity and removing background information;
S122, converting the images from the RGB color space to the Lab color space;
S123, performing feature extraction on the color-space-converted images with the K-means clustering algorithm and using the extracted features as the fruit maturity classification standard.
Experiments show that the color ratio a/b, as a comprehensive chromaticity index, can serve as a reference index of fruit ripening. The fruit image data set is labeled automatically by combining deep learning and machine learning, which solves the inconsistency caused by personal subjectivity in manual labeling. The graded maturity images are shown in fig. 3, where from left to right the fruits are unripe, semi-ripe and ripe.
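For reference, the RGB to Lab conversion and the a/b index can be sketched with a hand-rolled sRGB to CIE Lab conversion (assuming the usual D65 white point). This is a minimal illustration, not the patent's code; the sample pixel values are invented, and in practice an image library's built-in conversion would be used:

```python
import numpy as np

def srgb_to_lab(rgb: np.ndarray) -> np.ndarray:
    """Convert an (..., 3) sRGB array in [0, 1] to CIE L*a*b* (D65 white)."""
    # Inverse gamma: sRGB -> linear RGB
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # Linear RGB -> XYZ (sRGB primaries, D65)
    m = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = lin @ m.T
    # Normalize by the D65 white point, then apply the Lab nonlinearity
    xyz = xyz / np.array([0.95047, 1.0, 1.08883])
    f = np.where(xyz > (6 / 29) ** 3, np.cbrt(xyz), xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

# a/b chroma ratio for a reddish (ripe-looking) and a greenish (unripe-looking) pixel
ripe = srgb_to_lab(np.array([0.8, 0.2, 0.2]))
unripe = srgb_to_lab(np.array([0.3, 0.7, 0.2]))
ripe_ratio = ripe[1] / ripe[2]      # a > 0 for red
unripe_ratio = unripe[1] / unripe[2]  # a < 0 for green
```

Redder (riper-looking) pixels give a larger a component and hence a larger a/b ratio, which is what makes the ratio usable as a maturity index.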
An image of the unclassified fruit data set, which is used to train the depth model, is shown in fig. 4. The unclassified data set is large, and manually labeling its maturity easily leads to inconsistent subjective classification and hence a poor training effect. Therefore, the invention proceeds with the following step S13:
S131, recognizing and locating the fruits in the image with a target detection algorithm and obtaining fruit coordinate information;
S132, cutting out and mapping the fruits in the image according to the obtained fruit coordinate information;
S133, classifying the ripeness of the mapped fruits by the fruit ripeness classification standard and updating the label file.
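Steps S131 and S132 amount to cropping each detected bounding box out of the frame. A minimal NumPy sketch, where the box list stands in for a detector's output (the coordinates and blank image here are invented for illustration):

```python
import numpy as np

def crop_detections(image: np.ndarray, boxes: list[tuple[int, int, int, int]]):
    """Crop each detected fruit region (x, y, w, h) out of the image.

    Boxes are clamped to the frame so partially out-of-frame detections
    still yield valid crops for the maturity-clustering step.
    """
    h_img, w_img = image.shape[:2]
    crops = []
    for x, y, w, h in boxes:
        x0, y0 = max(0, x), max(0, y)
        x1, y1 = min(w_img, x + w), min(h_img, y + h)
        crops.append(image[y0:y1, x0:x1])
    return crops

image = np.zeros((240, 320, 3), dtype=np.uint8)
boxes = [(10, 20, 50, 60), (300, 200, 40, 60)]  # second box runs off-frame
crops = crop_detections(image, boxes)
```

Each crop can then be passed to the segmentation and clustering steps below to assign a maturity label.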
The automatic labeling method provided by the invention has the following specific implementation process:
1. Convert the input image into a gray-scale image and segment the fruit from it by the Otsu method. Let the threshold be K; the gray-scale image is divided into two classes,
C1 = {0, 1, …, K} and C2 = {K+1, …, 255}.
Let p1 and p2 be the probabilities of a pixel falling into C1 and C2 respectively, let M_C1 and M_C2 be the mean pixel values of C1 and C2, and let M be the global pixel mean. The between-class variance is
σ² = p1·(M_C1 − M)² + p2·(M_C2 − M)²,
which simplifies to
σ² = p1·p2·(M_C1 − M_C2)².
Traverse K over the range 0 to 255; the K that maximizes σ² is the threshold.
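The Otsu search just described can be sketched directly in NumPy. This is a minimal illustration on synthetic data, not the patent's code; in practice a library routine such as OpenCV's THRESH_OTSU flag would typically be used:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the K in 0..255 that maximizes the between-class variance
    sigma^2 = p1 * p2 * (M_C1 - M_C2)^2, as in the formula above."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    levels = np.arange(256)
    best_k, best_var = 0, -1.0
    for k in range(256):
        p1 = prob[: k + 1].sum()
        p2 = 1.0 - p1
        if p1 == 0.0 or p2 == 0.0:
            continue  # one class is empty, variance undefined
        m1 = (levels[: k + 1] * prob[: k + 1]).sum() / p1  # mean of C1 = {0..K}
        m2 = (levels[k + 1:] * prob[k + 1:]).sum() / p2    # mean of C2 = {K+1..255}
        var = p1 * p2 * (m1 - m2) ** 2
        if var > best_var:
            best_var, best_k = var, k
    return best_k

# Synthetic gray-scale "image": dark background near 40, bright fruit near 200.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(40, 10, 5000), rng.normal(200, 10, 5000)])
img = img.clip(0, 255).astype(np.uint8).reshape(100, 100)
k = otsu_threshold(img)
mask = img > k  # True where the (bright) fruit is
```

On this bimodal image the chosen K falls between the two modes, so thresholding separates fruit from background.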
2. Convert the segmented image from the RGB color space to the Lab color space. Record the pixel mean of the a channel as M_A, likewise the mean of the b channel as M_B, and the color ratio a/b as M_C. Cluster centers are then computed with the K-means clustering algorithm; since 3 maturity levels are to be distinguished, 3 cluster centers are needed. The formulas are:
dis(Mc_i, C_j) = √( Σ_t (Mc_it − C_jt)² )
C_jt = (1 / |S_j|) · Σ_{Mc_i ∈ S_j} Mc_it
where Mc_i denotes the i-th object (1 ≤ i ≤ n), C_j the j-th cluster center, Mc_it the t-th attribute of the i-th object, C_jt the t-th attribute of the j-th center, and |S_i| the number of objects in the i-th cluster; each cluster center corresponds to one maturity level.
The distance from each object to each cluster center is compared in turn and the object is assigned to the cluster of the nearest center, yielding 3 clusters {S_1, S_2, S_3}.
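A plain Lloyd's K-means over per-fruit feature vectors (M_A, M_B, M_C) matching the two formulas above might look as follows. The synthetic feature values and the deterministic initialization are illustrative choices, not the patent's:

```python
import numpy as np

def kmeans(features: np.ndarray, k: int = 3, iters: int = 50):
    """Lloyd's K-means: assign each object to the nearest center by the
    Euclidean distance dis(Mc_i, C_j), then recompute each center C_j as
    the mean of its cluster S_j."""
    # Deterministic spread-out initialization (an illustrative choice).
    centers = features[np.linspace(0, len(features) - 1, k).astype(int)]
    for _ in range(iters):
        # Pairwise distances: shape (n_objects, k)
        d = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new_centers = np.array([
            features[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Synthetic (M_A, M_B, M_C) features for three well-separated maturity groups.
rng = np.random.default_rng(1)
group_means = [np.array([-20.0, 40.0, -0.5]),  # greenish: a < 0
               np.array([5.0, 35.0, 0.15]),    # turning
               np.array([30.0, 25.0, 1.2])]    # reddish: high a/b
features = np.vstack([m + rng.normal(0.0, 1.0, (30, 3)) for m in group_means])
labels, centers = kmeans(features, k=3)
```

With well-separated color statistics the three recovered clusters correspond directly to the three maturity levels.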
3. Train a depth model that only performs recognition without detecting maturity; recognize the images of the unclassified fruit data set with the target detection algorithm; map each recognized fruit image back to step 2 for maturity clustering; and create new training-set labels from the recognized position information and the obtained maturity information.
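Creating a training-set label from the position and maturity information could look like the following sketch. The YOLO-style "&lt;class&gt; &lt;cx&gt; &lt;cy&gt; &lt;w&gt; &lt;h&gt;" text format and the class ids are assumptions; the patent does not specify its label file layout:

```python
def yolo_label_lines(detections, img_w, img_h):
    """Format one label line per fruit.

    detections: list of (maturity_class, x, y, w, h) in pixel coordinates.
    Output coordinates are box center and size, normalized to image size.
    """
    lines = []
    for cls, x, y, w, h in detections:
        cx = (x + w / 2) / img_w
        cy = (y + h / 2) / img_h
        lines.append(f"{cls} {cx:.6f} {cy:.6f} {w / img_w:.6f} {h / img_h:.6f}")
    return lines

# Two fruits in a 640x480 frame; class ids 0 = unripe ... 2 = ripe (assumed).
lines = yolo_label_lines([(2, 100, 120, 64, 48), (0, 320, 240, 32, 32)], 640, 480)
label_text = "\n".join(lines)  # would be written next to the image file
```

One such text file per image, with the maturity cluster as the class id, is all a YOLOv4-tiny training run needs.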
The invention also provides a Jetson Nano-based fruit maturity nondestructive testing system, comprising:
a data construction and training module for constructing a fruit maturity data set by an automatic labeling method and training the target detection network YOLOv4-tiny;
an image acquisition module for shooting fruit videos or images in a natural environment and, if a video is shot, extracting its frames as images;
a fruit maturity information acquisition module for inputting the extracted images into the trained YOLOv4-tiny target detection network for feature extraction and classification prediction to obtain the maturity information of the fruit;
an information display module for displaying the obtained maturity information of the fruits on a display.
The equipment used is shown in fig. 5: a camera captures fruit images and transmits them to the embedded AI platform Jetson Nano for recognition and detection, and the detection result is shown on the display screen as feedback to the picking personnel. The data construction and training module and the fruit maturity information acquisition module run on a Jetson Nano development board.
The invention provides an automatic labeling method based on machine learning, with which a data set is produced to train the deep learning model, giving the target detection algorithm a maturity detection function and realizing real-time detection of fruit maturity. The invention realizes rapid, image-based detection of fruit maturity; the adopted model is robust and can recognize fruits in a natural environment; and the invention effectively avoids the subjective influence of manual inspection.
The foregoing has outlined rather broadly the preferred embodiments and principles of the present invention and it will be appreciated that those skilled in the art may devise variations of the present invention that are within the spirit and scope of the appended claims.

Claims (7)

1. A fruit maturity nondestructive testing method based on Jetson Nano is characterized by comprising the following steps:
S1, constructing a fruit maturity data set by an automatic labeling method, and training the target detection network YOLOv4-tiny;
S2, shooting fruit videos or images in a natural environment; if a video is shot, extracting its frames as images;
S3, inputting the images extracted in step S2 into the trained YOLOv4-tiny target detection network for feature extraction and classification prediction to obtain the maturity information of the fruit;
S4, displaying the obtained maturity information of the fruits on a display.
2. The Jetson Nano-based fruit maturity nondestructive testing method of claim 1, wherein step S1 comprises the steps of:
S11, collecting single-background fruit images with known maturity and fruit images with unknown maturity in a natural environment;
S12, segmenting the single-background fruit images with known maturity, extracting features, and determining a maturity judgment standard;
S13, supplementing label information for the fruit images with unknown ripeness in the natural environment according to the ripeness judgment standard obtained in step S12, to obtain a fruit ripeness data set in the natural environment;
S14, training the lightweight target detection network YOLOv4-tiny on the natural-environment fruit maturity data set obtained in step S13.
3. The Jetson Nano-based fruit maturity nondestructive testing method of claim 2, wherein step S12 comprises the steps of:
S121, segmenting the single-background fruit images with known maturity and removing background information;
S122, converting the images from the RGB color space to the Lab color space;
S123, performing feature extraction on the color-space-converted images with the K-means clustering algorithm and using the extracted features as the fruit maturity classification standard.
4. The Jetson Nano-based fruit maturity nondestructive testing method of claim 3, wherein step S13 comprises the steps of:
S131, recognizing and locating the fruits in the image with a target detection algorithm and obtaining fruit coordinate information;
S132, cutting out and mapping the fruits in the image according to the obtained fruit coordinate information;
S133, classifying the ripeness of the mapped fruits by the fruit ripeness classification standard and updating the label file.
5. The Jetson Nano-based fruit maturity nondestructive testing method according to claim 2, wherein the specific process of step S12 is as follows:
S121, converting the input image into a gray-scale image and segmenting the fruit from it by the Otsu method; letting the threshold be K, the gray-scale image is divided into two classes,
C1 = {0, 1, …, K} and C2 = {K+1, …, 255};
letting p1 and p2 be the probabilities of a pixel falling into C1 and C2 respectively, M_C1 and M_C2 the mean pixel values of C1 and C2, and M the global pixel mean, the between-class variance is obtained from
σ² = p1·(M_C1 − M)² + p2·(M_C2 − M)²,
which simplifies to
σ² = p1·p2·(M_C1 − M_C2)²;
traversing K over the range 0 to 255, the K that maximizes σ² is the threshold;
S122, converting the segmented image from the RGB color space to the Lab color space; recording the pixel mean of the a channel as M_A, likewise the mean of the b channel as M_B, and the color ratio a/b as M_C; computing cluster centers with the K-means clustering algorithm, where 3 maturity levels are to be distinguished and therefore 3 cluster centers are needed, with the formulas:
dis(Mc_i, C_j) = √( Σ_t (Mc_it − C_jt)² )
C_jt = (1 / |S_j|) · Σ_{Mc_i ∈ S_j} Mc_it
where Mc_i denotes the i-th object (1 ≤ i ≤ n), C_j the j-th cluster center, Mc_it the t-th attribute of the i-th object, C_jt the t-th attribute of the j-th center, and |S_i| the number of objects in the i-th cluster, the cluster centers corresponding to maturity levels;
comparing in turn the distance from each object to each cluster center and assigning the object to the cluster of the nearest center, yielding 3 clusters {S_1, S_2, S_3}.
6. The Jetson Nano-based fruit maturity nondestructive testing method of claim 5, wherein the specific process of step S13 is as follows:
training a depth model that only performs recognition without detecting maturity, recognizing the images of the unclassified fruit data set with a target detection algorithm, mapping the recognized fruit images back to step S122 for maturity clustering, and creating training-set labels from the recognized position information and the obtained maturity information.
7. A Jetson Nano-based fruit maturity nondestructive testing system, characterized by comprising:
a data construction and training module for constructing a fruit maturity data set by an automatic labeling method and training the target detection network YOLOv4-tiny;
an image acquisition module for shooting fruit videos or images in a natural environment and, if a video is shot, extracting its frames as images;
a fruit maturity information acquisition module for inputting the extracted images into the trained YOLOv4-tiny target detection network for feature extraction and classification prediction to obtain the maturity information of the fruit;
an information display module for displaying the obtained maturity information of the fruits on a display.
CN202211270167.0A, priority and filing date 2022-10-18: Jetson Nano-based fruit maturity nondestructive testing method and system (pending, published as CN115631488A).

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211270167.0A CN115631488A (en) 2022-10-18 2022-10-18 Jetson Nano-based fruit maturity nondestructive testing method and system

Publications (1)

Publication Number Publication Date
CN115631488A 2023-01-20

Family

ID=84907104

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211270167.0A Pending CN115631488A (en) 2022-10-18 2022-10-18 Jetson Nano-based fruit maturity nondestructive testing method and system

Country Status (1)

Country Link
CN (1) CN115631488A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117273869A (en) * 2023-11-21 2023-12-22 安徽农业大学 Intelligent agricultural product pushing method, system, device and medium based on user data
CN117273869B (en) * 2023-11-21 2024-02-13 安徽农业大学 Intelligent agricultural product pushing method, system, device and medium based on user data


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination