CN108182380B - Intelligent fisheye pupil measurement method based on machine learning - Google Patents

Intelligent fisheye pupil measurement method based on machine learning Download PDF

Info

Publication number
CN108182380B
CN108182380B (application CN201711248089.3A)
Authority
CN
China
Prior art keywords
fish
eye
pupil
classifier
fisheye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711248089.3A
Other languages
Chinese (zh)
Other versions
CN108182380A (en)
Inventor
赵瑶池
胡祝华
刘世光
张逸然
骆剑
钟杰卓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN201711248089.3A priority Critical patent/CN108182380B/en
Publication of CN108182380A publication Critical patent/CN108182380A/en
Application granted granted Critical
Publication of CN108182380B publication Critical patent/CN108182380B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/48Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/197Matching; Classification
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/80Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
    • Y02A40/81Aquaculture, e.g. of fish

Abstract

The invention relates to an intelligent, machine-learning-based method for measuring the pupil diameter of fish, comprising the following steps: acquiring a fish image, training a fish-eye classifier from the acquired fish images, locating the fish-eye region with the classifier, calculating the pixel size of the fish-eye pupil within that region, and converting the pixel size of the fish-eye pupil into the actual size. The fish-eye classifier, trained with the AdaBoost machine learning method, accurately detects the fish-eye region in the fish image; combined with computer vision and image processing techniques it enables contactless, intelligent measurement of the fish-eye pupil, ensuring the accuracy and stability of the measured data while improving measurement efficiency and automation.

Description

Intelligent fisheye pupil measurement method based on machine learning
Technical Field
The invention relates to a measurement method in the field of aquaculture, in particular to an intelligent measurement method for pupil parameters of fish eyes.
Background
In aquaculture, the morphological parameters of fish are of great significance. In selective breeding, for example, growth traits such as body length, body width, caudal peduncle length and fish-eye pupil size must be measured periodically; these indexes are important references for planning breeding and mating and are indispensable for evaluating breeding programmes. They are also important parameters for grading fish and assessing freshness, with body length, body width and especially the fish-eye pupil data playing a key role.
However, the traditional way of measuring these parameters is to take the fish out of the water and measure them one by one with tools such as rulers and vernier calipers. To reduce the severe stress response, the fish must first be anaesthetised, which is time-consuming and labour-intensive; some of the measured fish stop feeding afterwards or even fail to survive, so the measurement irreversibly affects their growth and survival. An intelligent, accurate and contactless means of measuring fish morphological parameters is therefore needed.
Computer vision and image processing make contactless measurement of fish morphological indexes possible and can greatly improve measurement efficiency. Such techniques have been used to segment the fish body, measure its parameters and then classify and sort the fish, but most existing work targets body length, body width and weight. Fish-eye pupil measurement is harder: the eye is embedded in the fish body and must first be distinguished from it, and automatically segmenting the pupil from the iris of the fish eye is difficult, which places high demands on computer-vision-based measurement. Intelligent measurement of the fish-eye pupil therefore still falls far short of practical expectations.
Disclosure of Invention
In order to reduce or eliminate this gap, the invention provides a novel intelligent method for measuring the fish-eye pupil diameter. A fish-eye classifier is trained with the AdaBoost machine learning algorithm, the fish-eye region in a fish image is detected and identified with this classifier, the pupil is then located by Hough circle transform, and the pupil diameter is calculated. The method overcomes the time and labour cost, poor precision and low efficiency of traditional fish-eye pupil measurement and achieves nondestructive, accurate and intelligent measurement.
The aim of the invention is to measure the fish-eye pupil parameters intelligently using machine learning and computer vision, achieving nondestructive, intelligent measurement of the fish-eye pupil while improving measurement efficiency.
The invention adopts the following technical scheme:
a machine learning-based fisheye pupil intelligent measurement method, the method comprising:
s1, collecting an image of a fish;
s2, calculating a fish eye classifier in the acquired fish-case image;
s3, a fish-eye classifier is utilized to obtain and extract a fish-eye region;
s4, calculating the pixel size of the pupil of the fish eye in the fish eye area;
s5, converting the pixel size of the pupil of the fish eye into the actual size of the fish eye.
Preferably, training the fish-eye classifier comprises:
S2-1, generating positive and negative fish-eye sample sets from the acquired fish images;
S2-2, extracting feature values of the samples from the sample sets;
S2-3, training the fish-eye classifier on the sample feature values with the AdaBoost algorithm.
Further, the fisheye image sample is characterized by Haar-like features, including edge features, linear features, and center-surround features.
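By way of illustration, the following Python/NumPy sketch (not part of the patent; the function names and the example two-rectangle edge feature are illustrative) shows how an integral image allows any rectangle sum, and hence a Haar-like feature value, to be computed with a constant number of lookups:

```python
import numpy as np

def integral_image(gray):
    """ii[y, x] is the sum of all pixels above and to the left of (y, x), inclusive."""
    return gray.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, x, y, w, h):
    """Sum of the w x h rectangle with top-left corner (x, y), using four lookups."""
    A = ii[y - 1, x - 1] if (x > 0 and y > 0) else 0
    B = ii[y - 1, x + w - 1] if y > 0 else 0
    C = ii[y + h - 1, x - 1] if x > 0 else 0
    D = ii[y + h - 1, x + w - 1]
    return D - B - C + A

def edge_feature(ii, x, y, w, h):
    """Two-rectangle (edge) Haar-like feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

# Example on a 24 x 24 grayscale sample patch
patch = np.random.randint(0, 256, (24, 24)).astype(np.float64)
ii = integral_image(patch)
f_val = edge_feature(ii, 4, 4, 12, 16)
```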
Preferably, detecting and extracting the fish-eye region comprises:
S3-1, graying: the manually cropped fish-eye part of the fish image is grayed and used as the positive sample set; the grayed fish image is divided into sub-images, those containing the fish eye are removed, and the remainder form the negative sample set;
S3-2, extracting Haar-like features from the sample set and calculating the feature values with an integral image;
S3-3, feeding the feature values into the fish-eye classifier to obtain the fish-eye region.
Further, the fish-eye area is a circular area.
Preferably, calculating the pixel size of the fish-eye pupil comprises:
S4-1, graying the fish-eye region;
S4-2, obtaining the pixel size of the fish-eye pupil with the Hough circle transform.
preferably, the capturing of the fish images includes,
converting the converted fisheye pupil pixel size into an actual size as follows:
Figure GDA0001597587100000031
in the above, L RP Is the actual bottom length of the platform of the image acquisition device, L CP Pixels of the bottom length of the image-capturing device
Size, L CFx Represents pupil pixel size, L RFx Representing the actual pupil size.
Compared with the closest prior art, the invention has the following beneficial effects:
1. The technical scheme of the invention measures the fish-eye pupil diameter with computer vision and image processing techniques, enabling contactless, nondestructive measurement.
2. The fish-eye classifier is trained with a machine learning method; the measurement process requires essentially no manual operation and no empirical parameters, so intelligent measurement is achieved.
3. Compared with manual measurement, the invention measures faster, avoids errors caused by operator fatigue during manual work, and improves measurement accuracy and efficiency.
Drawings
FIG. 1 is a flow chart of the present invention
FIG. 2 is a block diagram of the intelligent measurement method of the pupil of the fish eye based on machine learning;
FIG. 3 is an image of a fish sample acquired in an embodiment of the present invention;
FIG. 4 is a positive example of an embodiment of the present invention;
FIG. 5 is a negative example of an embodiment of the present invention;
FIG. 6 is an example of features used to describe fish eyes in an embodiment of the present invention;
FIG. 7 is a flow chart of cascading strong classifiers in an embodiment of the invention;
FIG. 8 is a gray scale image obtained in the step of detecting a fisheye region and the detected fisheye region in the embodiment of the invention;
FIG. 9 is a diagram of a fish-eye area where a partial fish sample is detected in an embodiment of the invention;
FIG. 10 is a graph showing the relative deviation statistics of the pupil diameter of the fish eye and the manually measured diameter in an embodiment of the invention;
FIG. 11 is a time line graph of algorithm consumption in an embodiment of the present invention.
Detailed Description
The following describes the embodiments of the present invention in detail with reference to the drawings.
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention provides a novel intelligent measuring method for pupil diameter of fish eyes, which is shown in figures 1-2 and comprises the following steps:
S1, collecting an image of the fish;
S2, training a fish-eye classifier;
S3, detecting and extracting the fish-eye region with the fish-eye classifier;
S4, calculating the pixel size of the fish-eye pupil within the fish-eye region;
S5, converting the pixel size of the fish-eye pupil into the actual size.
Preferably, training the fish-eye classifier comprises:
S2-1, generating positive and negative fish-eye image samples;
S2-2, extracting the feature values of the samples;
S2-3, training on the sample feature values with the AdaBoost algorithm to obtain the fish-eye classifier.
Further, the fisheye image sample is characterized by Haar-like features, including edge features, linear features, and center-surround features.
Preferably, detecting and extracting the fish-eye region comprises:
S3-1, graying the fish image;
S3-2, extracting Haar-like features from the grayed image obtained in step S3-1 and calculating the feature values with an integral image;
S3-3, inputting the feature values obtained in step S3-2 into the fish-eye classifier to detect and extract the fish-eye region.
Further, the fish-eye area is a circular area.
Preferably, calculating the pixel size of the fish-eye pupil comprises:
S4-1, graying the fish-eye region;
S4-2, applying the Hough circle transform to the fish-eye region to obtain the pixel size of the fish-eye pupil.
Preferably, the fish-eye pupil pixel size is converted into the actual size according to the following formula:

L_RFx = L_CFx × L_RP / L_CP

where L_RP is the actual bottom length of the image-acquisition platform, L_CP is the pixel size of the platform bottom length in the image, L_CFx is the pupil pixel size, and L_RFx is the actual pupil size.
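As a concrete illustration of this scale conversion, a short Python function (names and default values are illustrative; the 560 mm platform length and 4608-pixel extent are the calibration quoted in the embodiment below) might read:

```python
def pupil_actual_size(l_cfx_px, l_rp_mm=560.0, l_cp_px=4608.0):
    """Convert the measured pupil pixel size L_CFx to millimetres using the
    proportion L_RFx = L_CFx * L_RP / L_CP."""
    return l_cfx_px * l_rp_mm / l_cp_px

# e.g. a pupil 60 pixels across on the 560 mm / 4608 px platform
print(pupil_actual_size(60))   # about 7.3 mm
```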
Examples: taking the intelligent measurement of the eye pupil size of the oval pompano as an example, the specific method of the invention is further described:
step 1, under normal illumination conditions, using OLYMPUS TG-4, f/2.0, focal length, by Hainan university ocean college of production and research base, hainan university, new village and town salt pier, hainan blue ocean aquaculture Co., ltd: the image of the fish with the lens distortion correction is 4mm, 1000 images of the trachinotus ovatus are collected by a camera with the lens distortion correction, the format of the images is JPEG, the color mode is RGB, the size of the images is 4608 multiplied by 3456, the length of a collection platform of the images is 560mm, and the pixel size is 4608. In the fish image, 100 pieces of the sample are used for measurement, the rest of the legend is used for manufacturing positive and negative samples of the fish image, and the sizes of the positive and negative sample images are 24 x 24 and 284 x 284 respectively. The measured images are introduced into a fish-eye characteristic parameter measuring system in batches, and each image is sequentially processed in the following steps.
Step 2. From the collected fish images, one of which is shown in Fig. 3, 700 positive fish-eye sample images (Fig. 4) and 2124 negative sample images (Fig. 5) are produced. The samples are prepared as follows:
The fish-eye part is manually cropped from the fish image and, after graying, used as the positive sample set. The grayed fish image is divided into a number of small sub-images; those containing the fish eye are removed, and the remaining sub-images form the negative sample set.
The fish-eye classifier is then computed from the positive and negative sample set images with the AdaBoost algorithm, the calculation comprising:
(1) Calculating feature values
The Haar-like feature values of the positive and negative samples are calculated; examples of the Haar-like features used to describe the fish eye are shown in Fig. 6, namely edge features, linear features and centre-surround features.
(2) The AdaBoost calculation comprises:
The weak classifier h_j(x) based on a Haar-like feature is given by:

h_j(x) = 1 if p_j f_j(x) < p_j θ_j, and 0 otherwise,

where p_j is a sign factor controlling the direction of the inequality, f_j(x) represents the j-th Haar-like feature, and θ_j is the threshold corresponding to the feature value f_j, used to judge whether the detection window contains the object to be detected.
Following these steps, a set of candidate weak classifiers is generated from the Haar-like features, the weak classifier with the smallest error is selected in each round, and the selected weak classifiers are combined into a strong classifier. The algorithm is as follows:
Input: a sample set (x_1, y_1), …, (x_n, y_n), where y_i = −1 marks a negative (non-fish-eye) sample and y_i = 1 a positive (fish-eye) sample; n is the total number of samples.
Initialise: w_{1,i} = 1/(2m) for negative samples and w_{1,i} = 1/(2l) for positive samples, where m and l are the numbers of negative and positive samples.
For t = 1, …, T:
1) normalise the weights, w_{t,i} ← w_{t,i} / Σ_j w_{t,j};
2) for each feature f_j evaluate the weak classifier h_j and its weighted error ε_j = Σ_i w_{t,i} |h_j(x_i) − y_i|;
3) select the weak classifier h_t with the smallest error ε_t;
4) update the weights, w_{t+1,i} = w_{t,i} β_t^(1−e_i), where β_t = ε_t / (1 − ε_t) and e_i = 0 if sample x_i is classified correctly, 1 otherwise;
5) stop early if ε_t = 0 or ε_t ≥ 1/2.
Output: the strong classifier C(x) = 1 if Σ_{t=1}^{T} α_t h_t(x) ≥ ½ Σ_{t=1}^{T} α_t and 0 otherwise, where α_t = log(1/β_t).
(3) Cascading of strong classifiers
The strong classifiers are cascaded into a multi-stage detector (Fig. 7) to obtain a higher detection rate, and the resulting AdaBoost classifier is used for fish-eye detection.
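For illustration, a compact Python reimplementation of the weak-classifier selection and weight update described above might look as follows (it assumes the Haar-like feature values are precomputed and the labels are ±1; choosing each threshold as the per-feature median is a simplification of an exhaustive threshold search):

```python
import numpy as np

def adaboost_train(F, y, T):
    """F: (n_samples, n_features) Haar-like feature values; y: labels in {1, -1}
    (fish eye / non fish eye); T: boosting rounds.
    Returns a list of (feature index, threshold, polarity, alpha)."""
    m, l = np.sum(y == -1), np.sum(y == 1)
    w = np.where(y == -1, 1.0 / (2 * m), 1.0 / (2 * l))
    strong = []
    for _ in range(T):
        w = w / w.sum()                      # 1) normalise weights
        best = None
        for j in range(F.shape[1]):          # 2) one weak classifier per feature
            theta = np.median(F[:, j])
            for p in (1, -1):
                pred = np.where(p * F[:, j] < p * theta, 1, -1)
                err = np.sum(w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, j, theta, p, pred)
        err, j, theta, p, pred = best        # 3) minimum-error weak classifier
        err = max(err, 1e-10)
        beta = err / (1.0 - err)
        w = w * beta ** (pred == y)          # 4) shrink weights of correct samples
        strong.append((j, theta, p, np.log(1.0 / beta)))
    return strong

def strong_classify(strong, f_row):
    """Strong classifier: weighted vote of the selected weak classifiers."""
    s = sum(a for j, t, p, a in strong if p * f_row[j] < p * t)
    return 1 if s >= 0.5 * sum(a for *_, a in strong) else -1
```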
Step 3. Fish-eye detection: the test image is grayed, the fish-eye area is detected in it with the trained fish-eye classifier, the region of interest (ROI) is obtained, and the fish-eye area is separated from the fish image. The specific steps are:
(1) Inputting a fish image and converting it into a grayscale image;
(2) Extracting Haar-like features from the grayscale image and calculating the feature values with an integral image;
(3) Feeding the Haar-like feature values into the AdaBoost classifier to detect the fish-eye region, then taking the inscribed circle of the detected region to obtain the circular fish-eye area as the ROI;
(4) Separating the ROI from the fish image of step (1).
Fig. 8 shows the grayscale image obtained in this step and the detected fish-eye area. The fish-eye areas detected in the fish images of this embodiment are circled in blue in Fig. 9.
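Assuming the trained classifier has been exported as an OpenCV cascade XML file, the detection step can be sketched in Python as follows (the file name and detection parameters are illustrative):

```python
import cv2

def detect_fish_eye(image_path, cascade_path="fisheye_cascade.xml"):
    """Gray the fish image, run the trained cascade over it and return the
    grayscale image plus the first detected fish-eye ROI (or None)."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(cascade_path)
    eyes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                    minSize=(24, 24))
    if len(eyes) == 0:
        return gray, None
    x, y, w, h = eyes[0]
    roi = gray[y:y + h, x:x + w]    # fish-eye ROI separated from the fish image
    return gray, roi
```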
Step 4. Obtaining the pixel size of the pupil from the detected fish eye.
The fish-eye pupil is obtained by Hough circle transform as follows:
(1) Extracting the edges of the ROI with the Canny operator;
(2) Applying morphological operations to the extracted edges;
(3) Selecting three points on the edge and solving the circle parameters from the circle equation through these three points;
(4) Repeating step (3) to solve the circle parameters for further triples of edge points;
(5) Selecting the circle from the densest region of the parameter space as the fish-eye pupil, thereby obtaining the pupil pixel size.
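A minimal OpenCV sketch of this pupil extraction might be (parameter values are illustrative; cv2.HoughCircles runs its own Canny edge stage internally, so the explicit edge and morphology steps above are folded into a single call here):

```python
import cv2
import numpy as np

def pupil_pixel_diameter(roi_gray):
    """Detect the pupil as the strongest circle in the fish-eye ROI and return
    its diameter in pixels (None if no circle is found)."""
    blurred = cv2.medianBlur(roi_gray, 5)           # suppress noise before voting
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=roi_gray.shape[0],   # keep a single circle
                               param1=100,          # Canny high threshold
                               param2=30,           # accumulator threshold
                               minRadius=5,
                               maxRadius=roi_gray.shape[0] // 2)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    return 2 * r
```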
Step 5. Converting the pixel size of the fish-eye pupil into the actual size according to the formula

L_RFx = L_CFx × L_RP / L_CP

where L_RP is the actual bottom length of the image-acquisition platform, L_CP is the pixel size of the platform bottom length, L_CFx is the pupil pixel size, and L_RFx is the actual pupil size. The actual pupil size is therefore calculated as L_RFx = L_CFx × 5600/4608.
The measurement accuracy of this embodiment is checked with the relative error δ = (L_RF − L)/L, where L_RF is the value measured by this embodiment and L is the standard value, taken as the average of three manual measurements. The manual measurement procedure is: the anaesthetised fish is placed on a measuring board, the pupil and iris of the fish eye are each measured three times with a caliper, the results are recorded, and the average is used as L.
The relative deviation between the pupil diameters of Trachinotus ovatus measured in step 5 and the manual measurements is shown in Fig. 10; statistical analysis gives a correct detection rate of 94.2%, a missed detection rate of 1.5% and a false detection rate of 4.3%. The detailed data and actual effect of the detected pupils are given in Table 1.
The running time of this embodiment is plotted in Fig. 11: the average time per image is 324.371 ms, so the detection speed of the proposed technical scheme reaches about 185 fish per minute, whereas measured statistics show that manual pupil measurement reaches only about 8 fish per minute. Moreover, in batch measurement manual operators suffer physiological fatigue, while the proposed method keeps a stable detection speed and yields more accurate and objective results; operator proficiency also affects manual measurement time, and in such situations the advantages of the proposed scheme are even more pronounced.
Because a machine learning method is used, the method of the invention requires essentially no manual operation and no prior knowledge of the fish, so it can be extended to fish-eye measurement of most other fish species.
TABLE 1. Detailed data and actual effect of the detected fish-eye pupils (reproduced as an image in the original publication).
Finally, it should be noted that: the above embodiments are only for illustrating the technical aspects of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the above embodiments, it should be understood by those of ordinary skill in the art that: modifications and equivalents may be made to the specific embodiments of the invention without departing from the spirit and scope of the invention, which is intended to be covered by the claims.

Claims (5)

1. An intelligent machine-learning-based fish-eye pupil measurement method, applied to collected fish images, characterized by comprising the following steps:
S1, calculating a fish-eye classifier from the acquired fish images;
S2, obtaining the fish-eye region with the fish-eye classifier;
S3, calculating the pixel size of the fish-eye pupil within the fish-eye region;
S4, converting the pixel size of the fish-eye pupil into the actual size;
wherein generating the fish-eye classifier comprises:
S2-1, generating positive and negative fish-eye sample sets from the acquired fish images;
S2-2, extracting feature values of the samples from the sample sets;
S2-3, obtaining the fish-eye classifier from the sample feature values with the AdaBoost algorithm;
the fisheye classification detector values for the positive and negative sample set images are calculated using the Adaboost algorithm, the calculation comprising:
(1) Calculating a characteristic value;
haar-like feature values of positive and negative samples are calculated for describing Haar-like features of fish-eye features: edge features, linear features, and center-around features;
(2) the AdaBoost calculation comprises:
the weak classifier h_j(x) based on a Haar-like feature is:

h_j(x) = 1 if p_j f_j(x) < p_j θ_j, and 0 otherwise,

where p_j is a sign factor controlling the direction of the inequality, f_j(x) represents the j-th Haar-like feature, and θ_j is the threshold corresponding to the feature value f_j, used to judge whether the window contains the object to be detected;
a set of candidate classifiers is generated from the Haar-like features according to steps (1)–(2), the weak classifiers with the smallest errors are selected from this set, and the weak classifiers are combined into a strong classifier according to the following algorithm:
input: a sample set (x_1, y_1), …, (x_n, y_n), where y_i = −1 denotes a negative sample, i.e. non-fish-eye, y_i = 1 denotes a positive sample, i.e. fish eye, and n is the total number of samples;
initialise: for y_i = −1 the weight is w_{1,i} = 1/(2m), where m is the number of negative samples; for y_i = 1 the weight is w_{1,i} = 1/(2l), where l is the number of positive samples;
iteration: for t = 1, …, T:
1) normalise the weights: w_{t,i} ← w_{t,i} / Σ_{j=1}^{n} w_{t,j};
2) calculate for each feature f_j the weak classifier h_j, each weak classifier using only one feature, with error ε_j = Σ_i w_{t,i} |h_j(x_i) − y_i|, where j indexes the Haar-like feature and i the i-th sample;
3) select the weak classifier h_t corresponding to the smallest feature error ε_t;
4) update the weights: w_{t+1,i} = w_{t,i} β_t^(1−e_i), where β_t = ε_t / (1 − ε_t), and e_i = 0 if sample x_i is classified correctly and e_i = 1 otherwise;
5) if ε_t = 0 or ε_t ≥ 1/2, let T = t − 1 and exit the loop;
output: the strong classifier

C(x) = 1 if Σ_{t=1}^{T} α_t h_t(x) ≥ ½ Σ_{t=1}^{T} α_t, and 0 otherwise,

where α_t = log(1/β_t);
Cascade of strong classifiers:
the strong classifiers are cascaded into a multi-stage detector to obtain a higher detection rate, and the resulting AdaBoost classifier is used for fish-eye detection.
2. The method of claim 1, wherein obtaining the fish-eye region comprises:
S3-1, graying the manually cropped fish-eye part of the fish image as the positive sample set; dividing the grayed fish image into sub-images, removing those containing the fish eye, and using the remainder as the negative sample set;
S3-2, extracting Haar-like features from the sample sets and calculating the feature values with an integral image;
S3-3, inputting the feature values into the fish-eye classifier to obtain the fish-eye region.
3. The method of claim 2, wherein the fisheye region is a circular region.
4. The method of claim 1, wherein the calculating of the pixel size of the fisheye pupil comprises:
S4-1, graying the fish-eye region;
S4-2, applying the Hough circle transform to the grayed fish-eye region to obtain the pixel size of the fish-eye pupil.
5. The machine-learning-based fish-eye pupil intelligent measurement method of claim 1, wherein the device for capturing the fish images comprises a sample holder, a support arm and a camera;
the support arm supports the camera above the sample holder;
the support arm is adjustable in horizontal distance and vertical height; the recorded information includes L_RP, the actual bottom length of the image-acquisition platform, I, the acquired fish image, and L_CP, the pixel size of the platform bottom length.
CN201711248089.3A 2017-11-30 2017-11-30 Intelligent fisheye pupil measurement method based on machine learning Active CN108182380B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711248089.3A CN108182380B (en) 2017-11-30 2017-11-30 Intelligent fisheye pupil measurement method based on machine learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711248089.3A CN108182380B (en) 2017-11-30 2017-11-30 Intelligent fisheye pupil measurement method based on machine learning

Publications (2)

Publication Number Publication Date
CN108182380A CN108182380A (en) 2018-06-19
CN108182380B true CN108182380B (en) 2023-06-06

Family

ID=62545453

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711248089.3A Active CN108182380B (en) 2017-11-30 2017-11-30 Intelligent fisheye pupil measurement method based on machine learning

Country Status (1)

Country Link
CN (1) CN108182380B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110347134A (en) * 2019-07-29 2019-10-18 南京图玩智能科技有限公司 A kind of AI intelligence aquaculture specimen discerning method and cultivating system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008151471A1 (en) * 2007-06-15 2008-12-18 Tsinghua University A robust precise eye positioning method in complicated background image
WO2012039139A1 (en) * 2010-09-24 2012-03-29 パナソニック株式会社 Pupil detection device and pupil detection method
CN103136512A (en) * 2013-02-04 2013-06-05 重庆市科学技术研究院 Pupil positioning method and system
CN103440476A (en) * 2013-08-26 2013-12-11 大连理工大学 Locating method for pupil in face video
CN104482860A (en) * 2014-12-05 2015-04-01 浙江大学宁波理工学院 Automatic measuring device and method for fish type morphological parameters

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008151471A1 (en) * 2007-06-15 2008-12-18 Tsinghua University A robust precise eye positioning method in complicated background image
WO2012039139A1 (en) * 2010-09-24 2012-03-29 パナソニック株式会社 Pupil detection device and pupil detection method
CN103136512A (en) * 2013-02-04 2013-06-05 重庆市科学技术研究院 Pupil positioning method and system
CN103440476A (en) * 2013-08-26 2013-12-11 大连理工大学 Locating method for pupil in face video
CN104482860A (en) * 2014-12-05 2015-04-01 浙江大学宁波理工学院 Automatic measuring device and method for fish type morphological parameters

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Optimized human-eye detection algorithm based on AdaBoost; Liu Xiuxiu; China Master's Theses Full-text Database, Information Science and Technology; 2017-05-15; abstract, chapters 3-4 *
Research on pupil size detection algorithm based on image processing; Zhang Huimin et al.; Electro-Optic Technology Application; 2017-02-28; abstract, sections 1-3 *
Research on computer-vision-based eye feature detection for Trachinotus ovatus; Hu Zhuhua et al.; Fishery Modernization; 2017-08-31; abstract, sections 1-6 *

Also Published As

Publication number Publication date
CN108182380A (en) 2018-06-19

Similar Documents

Publication Publication Date Title
CN105389593B (en) Image object recognition methods based on SURF feature
CN104268505B (en) Fabric Defects Inspection automatic detecting identifier and method based on machine vision
US9639748B2 (en) Method for detecting persons using 1D depths and 2D texture
Wahab et al. Detecting diseases in chilli plants using K-means segmented support vector machine
CN104198497A (en) Surface defect detection method based on visual saliency map and support vector machine
CN109255757B (en) Method for segmenting fruit stem region of grape bunch naturally placed by machine vision
CN108186051B (en) Image processing method and system for automatically measuring double-apical-diameter length of fetus from ultrasonic image
WO2018059125A1 (en) Millimeter wave image based human body foreign object detection method and system
CN114972356B (en) Plastic product surface defect detection and identification method and system
CN108416814B (en) Method and system for quickly positioning and identifying pineapple head
CN105894536A (en) Method and system for analyzing livestock behaviors on the basis of video tracking
CN108230307B (en) Corn broken grain detection method based on contour centroid distance and neural network
Hekim et al. A hybrid model based on the convolutional neural network model and artificial bee colony or particle swarm optimization-based iterative thresholding for the detection of bruised apples
CN111898677A (en) Plankton automatic detection method based on deep learning
CN108182380B (en) Intelligent fisheye pupil measurement method based on machine learning
CN113378831B (en) Mouse embryo organ identification and scoring method and system
CN102680488B (en) Device and method for identifying massive agricultural product on line on basis of PCA (Principal Component Analysis)
CN113222889A (en) Industrial aquaculture counting method and device for aquatic aquaculture objects under high-resolution images
CN112465741A (en) Suspension spring, method and device for detecting defect of valve spring, and storage medium
CN107194319A (en) The mitotic mapping sorted based on SVMs and knowledge method for distinguishing
CN111553382A (en) KNN-based bighead carp classification method
Tawakal et al. The development of methods for detecting melon maturity level based on fruit skin texture using the histogram of oriented gradients and the support vector machine
CN115914560A (en) Intelligent accurate feeding method and device for sows, electronic equipment and storage medium
CN109472797A (en) Aquaculture fish three-dimensional coordinate acquisition methods based on computer vision technique
CN109685002A (en) A kind of dataset acquisition method, system and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant