CN110929755A - Poultry egg detection method, device and system, electronic equipment and storage medium - Google Patents


Info

Publication number
CN110929755A
CN110929755A (application CN201911002044.7A)
Authority
CN
China
Prior art keywords
egg
eggs
image
poultry
detected
Prior art date
Legal status
Pending
Application number
CN201911002044.7A
Other languages
Chinese (zh)
Inventor
李俊玲
Current Assignee
Beijing Haiyi Tongzhan Information Technology Co Ltd
Original Assignee
Beijing Haiyi Tongzhan Information Technology Co Ltd
Application filed by Beijing Haiyi Tongzhan Information Technology Co Ltd filed Critical Beijing Haiyi Tongzhan Information Technology Co Ltd
Priority to CN201911002044.7A priority Critical patent/CN110929755A/en
Publication of CN110929755A publication Critical patent/CN110929755A/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for recognising patterns
    • G06K 9/62 Methods or arrangements for pattern recognition using electronic means
    • G06K 9/6267 Classification techniques
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C 5/00 Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C 5/34 Sorting according to other particular properties
    • B07C 5/342 Sorting according to other particular properties according to optical properties, e.g. colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for recognising patterns
    • G06K 9/62 Methods or arrangements for pattern recognition using electronic means
    • G06K 9/6217 Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
    • G06K 9/6256 Obtaining sets of training patterns; Bootstrap methods, e.g. bagging, boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V 10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Abstract

The application relates to an egg detection method, device, system, electronic equipment and storage medium, wherein the method comprises the following steps: acquiring a first image to be detected, wherein the first image to be detected comprises an egg picked up by a pickup device; classifying the egg in the first image to be detected by big-end/small-end direction according to its contour using a pre-trained first classification model, to generate a first classification result; and generating a first classification label corresponding to the egg according to the first classification result. The technical scheme can reduce the breakage rate of eggs during transport and improve the hatchability and hatching rate during incubation. In addition, because the detection is contact-free, the eggs are protected from collision damage.

Description

Poultry egg detection method, device and system, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to a method, an apparatus, a system, an electronic device, and a storage medium for detecting eggs.
Background
A complete egg shell is elliptical, with one end large and the other small. The shell structure comprises the outer shell membrane, the inner shell membranes, and an air chamber that forms between the membranes.
Orienting the large and small ends of eggs is one of the commercial processing steps in egg grading and packaging. Its main purpose is to point the large ends of all eggs in the same direction: packaged eggs are placed in egg boxes or egg trays with the large end up, which keeps the yolk from adhering to the shell and extends storage life.
In a hatching scenario, the orientation of the large and small ends also directly affects the hatching rate and production efficiency. An embryo can develop normally only when the air chamber faces upward, and the air chamber is usually located toward the large end of the egg, so the industry standard requires eggs to be placed large end up when loading egg trays; keeping the large end up during transport also reduces the breakage rate.
Loading eggs onto trays is generally a manual operation, and eggs are often placed small end up because of human error or because the large and small ends are not visually distinct. Some larger egg plants orient the eggs with a column-splitting, turn-over type orientation device whose key motions are axial movement and turning; the horizontal deflection angle that an egg forms during the axial movement is the core parameter for computing the automatic column-splitting motion that orients the large and small ends.
The prior art judges the large and small ends mainly by manual operation or by mechanical motion. Manual operation is labor-intensive and error-prone: the human eye can only distinguish the two ends by appearance, so eggs with an indistinct shape are easily misjudged. Judging the ends mechanically subjects the eggs to collisions and damage during the mechanical motion, which reduces yield.
Disclosure of Invention
In order to solve the technical problems described above or at least partially solve the technical problems, the present application provides an egg detection method, an apparatus, a system, an electronic device and a storage medium.
In a first aspect, the present application provides a method for detecting an avian egg, comprising:
acquiring a first image to be detected, wherein the first image to be detected comprises eggs picked up by a pickup device;
classifying the poultry eggs in the first image to be detected by big-end/small-end direction according to their contours using a pre-trained first classification model, to generate a first classification result;
and generating a first classification label corresponding to the poultry egg according to the first classification result.
Optionally, the acquiring an image to be detected includes:
after the poultry eggs are picked up by the pickup device, shooting the poultry eggs to obtain a first poultry egg image;
obtaining a first image to be detected according to the first egg image;
the classifying the poultry eggs in the first image to be detected by big-end/small-end direction according to their contours using the pre-trained first classification model to generate a first classification result includes:
carrying out binarization processing on pixels of the first image to be detected to generate a binarized mask image;
inputting the mask image into the first classification model, wherein the first classification model is a binary classification model for judging whether the big end or the small end of the poultry egg faces upward;
classifying the poultry eggs according to the first classification model;
and outputting a first classification result, wherein the first classification result is used for identifying whether the big end of the poultry egg faces upwards or the small end of the poultry egg faces upwards.
Optionally, when the first egg image includes at least two eggs, obtaining the first image to be detected according to the first egg image includes:
segmenting the first egg image to obtain a first egg image comprising a single egg;
identifying profile information for the egg in the first egg image;
determining a minimum circumscribed rectangular frame of the poultry egg according to the contour information;
and cutting the first egg image according to the minimum circumscribed rectangle frame to obtain the first image to be detected.
Optionally, the method further includes:
generating a first sorting instruction according to the first sorting result, wherein the first sorting instruction is used for controlling sorting equipment to perform corresponding sorting operation on the poultry eggs;
sending the first sorting instruction to the sorting equipment.
Optionally, the first classification result further includes a first confidence that the big end of the egg faces upward or a second confidence that the small end faces upward;
the generating a first sorting instruction according to the first sorting result comprises:
when the first confidence coefficient is larger than or equal to a first preset threshold value, generating a first sorting instruction for sorting the poultry eggs to a first egg tray;
when the second confidence coefficient is larger than or equal to a second preset threshold value, generating a first sorting instruction for sorting the poultry eggs to a second egg tray;
when the first confidence is smaller than the first preset threshold, generating a first sorting instruction for sorting the eggs to a third egg tray;
and when the second confidence is smaller than the second preset threshold, generating a first sorting instruction for sorting the eggs to a fourth egg tray.
Optionally, the method further includes:
respectively acquiring sample images when the pickup equipment picks up the big head of the poultry egg and the sample images when the pickup equipment picks up the small head;
adding a label with an upward big end to a sample image when the picking device picks the big end of the egg, and adding a label with an upward small end to a sample image when the picking device picks the small end of the egg;
carrying out binarization processing on pixels of the sample image to generate a binarized mask sample image;
and performing binary classification training on the mask sample images based on a preset convolutional neural network, distinguishing the big-end and small-end directions by the curvature variation characteristics of the egg contour, to generate the first classification model.
Optionally, the pick-up device comprises a suction nozzle through which the eggs are sucked up.
Optionally, after generating the first classification tag corresponding to the egg according to the first classification result, the method further includes:
acquiring a second image to be detected, wherein the second image to be detected comprises the poultry eggs which are picked up by the pickup equipment and are illuminated;
classifying the poultry eggs in the second image to be detected according to whether air chambers are detected or not according to a pre-trained second classification model to generate a second classification result;
and generating a second classification label corresponding to the poultry egg according to the second classification result.
Optionally, the method further includes:
generating a second sorting instruction according to the second sorting result, wherein the second sorting instruction is used for controlling sorting equipment to execute corresponding sorting operation;
sending the second sort instructions to the sorting equipment.
Optionally, when the first classification result includes a first confidence that the big end of the egg faces upward or a second confidence that the small end faces upward, the acquiring a second image to be detected includes:
when the eggs with the first confidence degrees smaller than a first preset threshold value or the second confidence degrees smaller than a second preset threshold value are picked up by the picking device, the picking device illuminates the eggs with lamps, and the eggs are shot to obtain a second egg image;
and obtaining a second image to be detected according to the second egg image.
In a second aspect, the present application provides a method for detecting an avian egg, comprising:
acquiring an image to be detected, wherein the image to be detected comprises eggs which are picked up by a pickup device and are illuminated;
classifying the poultry eggs in the image to be detected according to whether air chambers are detected or not according to a pre-trained classification model to generate a classification result;
and generating a classification label corresponding to the poultry egg according to the classification result.
Optionally, the method further includes:
when the pickup device picks up eggs, the lamps in the pickup equipment illuminate the eggs, and sample images of the eggs are respectively acquired when the pickup equipment picks up the large heads of the eggs and when the pickup equipment picks up the small heads of the eggs;
adding a label with an upward big end to the sample image when the picking device picks the big end of the egg, and adding a label with an upward small end to the sample image when the picking device picks the small end of the egg;
and performing binary classification training on the sample images based on a preset convolutional neural network, wherein an air chamber is determined to be detected when an overexposed region exists where the suction nozzle contacts the poultry egg in the sample image, in which case the big end of the poultry egg is judged to face upward, to generate the classification model.
Optionally, the picking device comprises a suction nozzle through which the eggs are sucked up; a lamp is arranged in the suction nozzle, and the lamp is lit after the poultry egg is sucked up by the suction nozzle.
In a third aspect, the present application provides an egg detection device comprising:
the acquisition module is used for acquiring an image to be detected, wherein the image to be detected comprises eggs picked by a pickup device;
the classification module is used for classifying the poultry eggs in the image to be detected by big-end/small-end direction according to their contours using a pre-trained classification model, to generate a classification result;
and the generation module is used for generating the classification label corresponding to the eggs according to the classification result.
In a fourth aspect, the present application provides an egg detection device comprising:
the device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring an image to be detected, and the image to be detected comprises eggs which are picked up by a pickup device and are illuminated;
the classification module is used for classifying the poultry eggs in the image to be detected according to whether air chambers are detected or not according to a pre-trained classification model to generate a classification result;
and the generation module is used for generating the classification label corresponding to the eggs according to the classification result.
In a fifth aspect, the present application provides an egg detection system comprising: the poultry egg detection device comprises a first poultry egg detection device, a pickup device and a shooting device;
the picking device comprises a suction nozzle, and the poultry eggs are sucked up through the suction nozzle;
the shooting equipment shoots the eggs sucked up by the suction nozzle to obtain a first egg image;
the first egg detection device is used for obtaining a first image to be detected according to the first egg image, and classifying the eggs in the first image to be detected by big-end/small-end direction according to their contours using a pre-trained first classification model, to generate a first classification result; and generating a first classification label corresponding to the poultry egg according to the first classification result.
Optionally, the system further comprises: sorting equipment;
the first egg detection device is used for generating a first sorting instruction according to the first classification result, and the first sorting instruction is used for controlling sorting equipment to perform corresponding sorting operation on the eggs; sending the first sorting instruction to the sorting equipment;
the sorting equipment is used for executing corresponding sorting operation according to the first sorting instruction.
Optionally, the system further comprises: a second egg detection device for detecting the eggs,
the first classification result generated by the first egg detection device further comprises: the first confidence coefficient that the big end of the egg faces upwards or the second confidence coefficient that the small end faces upwards;
a suction nozzle of the picking device sucks the poultry egg with the first confidence coefficient smaller than a first preset threshold value or the second confidence coefficient smaller than a second preset threshold value, and a lamp arranged in the suction nozzle is turned on;
the shooting device shoots the illuminated eggs to obtain a second egg image;
the second egg detection device is used for obtaining a second image to be detected according to the second egg image, classifying the eggs in the second image to be detected in the big-small direction according to whether air chambers are detected or not according to a pre-trained second classification model, and generating a second classification result; and generating a second classification label corresponding to the poultry egg according to the second classification result.
In a sixth aspect, the present application provides an electronic device, comprising: a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the above method steps when executing the computer program.
In a seventh aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the above-mentioned method steps.
Compared with the prior art, the technical scheme provided by the embodiments of the application has the following advantages: by recognizing the big and small ends with a pre-trained classification model according to the egg contour in the captured egg image, the big and small ends can be identified accurately, even for eggs whose shape is not visually distinct. Eggs placed small end up can then be turned over in time so that their big ends face upward. This reduces the breakage rate of eggs during transport and improves the hatchability and hatching rate during incubation. In addition, because the detection is contact-free, the eggs are protected from collision damage.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
FIG. 1 is a flow chart of a method for detecting eggs according to an embodiment of the present disclosure;
FIG. 2 is a schematic view of a suction nozzle provided in an embodiment of the present application for sucking up eggs;
FIG. 3 is a flow chart of a method for detecting eggs according to another embodiment of the present disclosure;
fig. 4 is a schematic view of an avian egg with an upwardly-facing large end according to an embodiment of the present application;
fig. 5 is a schematic view of an avian egg with the small head facing upward as provided in an embodiment of the present application;
fig. 6 is a flow chart of an egg detection method according to another embodiment of the present application;
fig. 7 is a schematic view of a binarized mask with an egg having a large end facing upward according to an embodiment of the present application;
fig. 8 is a schematic view of a binarized mask with an egg small head facing upward according to an embodiment of the present application;
FIG. 9 is a flow chart of a method for detecting eggs according to another embodiment of the present application;
FIG. 10 is a schematic view of an avian egg as provided by an embodiment of the present application, after being illuminated with its large end facing upwards;
FIG. 11 is a schematic view of an avian egg as provided in another embodiment of the present application, after illumination with the large end facing upward;
FIG. 12 is a flow chart of a method for detecting eggs according to another embodiment of the present application;
FIG. 13 is a flow chart of a method for detecting eggs according to another embodiment of the present application;
FIG. 14 is a flow chart of a method for detecting eggs according to another embodiment of the present application;
fig. 15 is a block diagram of an egg detection device according to an embodiment of the present disclosure;
FIG. 16 is a block diagram of an egg detection device according to another embodiment of the present application;
FIG. 17 is a block diagram of an egg detection system according to an embodiment of the present application;
fig. 18 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The method and the device realize judgment of the big and small ends of the eggs in a non-contact computer vision mode, not only can accurately identify the big and small ends of the eggs, but also can accurately judge the eggs with unobvious appearance, and do not need to make the eggs perform mechanical motion, so that the eggs are prevented from being collided and damaged.
The application relates to two methods for judging the big and small ends of eggs, namely judging through the contours of the eggs and judging through the positions of air chambers of the eggs, wherein the two methods can be used respectively or in combination.
First, a method for detecting eggs based on egg profiles provided by embodiments of the present application is described.
Fig. 1 is a flowchart of an egg detection method according to an embodiment of the present disclosure. As shown in fig. 1, the method comprises the steps of:
step S11, a first image to be detected is acquired, the first image to be detected including an egg picked up by the pickup device.
And step S12, classifying the poultry eggs in the first image to be detected by big-end/small-end direction according to their contours using the pre-trained first classification model, and generating a first classification result.
And step S13, generating a first classification label corresponding to the poultry egg according to the first classification result.
In this embodiment, by recognizing the big and small ends with the pre-trained classification model according to the egg contours in the captured egg images, the big and small ends can be identified accurately, even for eggs whose shape is not visually distinct. Eggs placed small end up can then be turned over in time so that their big ends face upward. This reduces the breakage rate of eggs during transport and improves the hatchability and hatching rate during incubation. In addition, because the detection is contact-free, the eggs are protected from collision damage.
The pickup device comprises a suction nozzle, and the eggs are sucked up through the suction nozzle. The pickup device may comprise a plurality of pickup units, each consisting of a suction nozzle, a piston rod, a cylinder, a vacuum valve, and the like. The suction nozzle is connected to the cylinder through the piston rod; by switching the vacuum valve, the cylinder is evacuated or filled with air, so that the piston rod drives the suction nozzle to suck up or put down an egg. Step S11 includes: after the eggs are sucked up by a suction nozzle of the pickup device, shooting the eggs to obtain a first egg image; and obtaining a first image to be detected according to the first egg image.
The pick-up device comprises 6 pick-up units, namely 6 suction nozzles for sucking up 6 eggs at the same time. During shooting, 3 eggs can be shot simultaneously, and the top end of each egg is partially shielded by the suction nozzle. Fig. 2 is a schematic view of a suction nozzle provided in the present application for sucking up an avian egg. As shown in FIG. 2, the first egg image includes 3 eggs.
Fig. 3 is a flowchart of an egg detection method according to another embodiment of the present disclosure. As shown in fig. 3, when the first egg image includes at least two eggs, each egg in the image needs to be extracted, and the first image to be detected is obtained from the first egg image through the following steps (a code sketch of these steps is given after step S34):
step S31, the first egg image is segmented to obtain a first egg image comprising a single egg.
Step S32, identifying contour information of the egg in the first egg image.
And step S33, determining the minimum circumscribed rectangle frame of the poultry egg according to the contour information.
And step S34, cutting the first egg image according to the minimum circumscribed rectangle frame to obtain a first image to be detected.
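The steps above map onto common image-processing operations. The following is a minimal Python/OpenCV sketch of steps S31 to S34 only, assuming a simple Otsu-threshold segmentation and an axis-aligned bounding rectangle; the function name crop_single_egg and the thresholding choice are illustrative and not specified by the application.

```python
import cv2

def crop_single_egg(egg_image_bgr):
    """Illustrative sketch (steps S31-S34): segment one egg, find its contour,
    take the minimum circumscribed rectangle and crop the image to it."""
    gray = cv2.cvtColor(egg_image_bgr, cv2.COLOR_BGR2GRAY)
    # Steps S31/S32: separate the egg from the background and extract its contour.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    egg_contour = max(contours, key=cv2.contourArea)  # assume the largest blob is the egg
    # Step S33: minimum circumscribed (here axis-aligned) rectangle of the contour.
    x, y, w, h = cv2.boundingRect(egg_contour)
    # Step S34: crop the original image to that rectangle -> first image to be detected.
    return egg_image_bgr[y:y + h, x:x + w]
```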
Fig. 4 is a schematic view of an avian egg with the big end facing upward according to an embodiment of the present application. Fig. 5 is a schematic view of an avian egg with the small end facing upward according to an embodiment of the present application. As shown in figs. 4 and 5, an image containing several eggs is segmented to obtain an image to be detected for each egg, so that the big-end/small-end direction can be accurately identified by the classification model.
Fig. 6 is a flow chart of an egg detection method according to another embodiment of the present application. As shown in fig. 6, the step S12 includes:
in step S41, a binarization process is performed on the pixels of the first image to be detected, and a binarized mask image is generated.
And step S42, inputting the mask image into a first classification model, wherein the first classification model is a binary classification model for judging whether the big end or the small end of the egg faces upward.
And step S43, classifying the eggs according to the first classification model.
And step S44, outputting a first classification result, wherein the first classification result is used for identifying whether the big end of the egg is upward or the small end of the egg is upward.
Fig. 7 is a schematic view of a binarized mask of an egg with the big end facing upward according to an embodiment of the present application. Fig. 8 is a schematic view of a binarized mask of an egg with the small end facing upward according to an embodiment of the present application. As shown in figs. 7 and 8, the contours in the binarized mask images are clearer.
In the embodiment, the image to be detected is converted into the binary mask image, and the mask image can better reflect the real outline information of the poultry egg, so that the big-end and small-end classification through the classification model is more accurate.
In addition, the eggs may be lit during shooting to improve image quality, which causes the region where the air chamber is located to be overexposed, so the egg contour in the captured image becomes unclear, as shown in fig. 4. As a result, classification based on contour curvature variation may be inaccurate. Converting the egg image into a binarized mask image removes the influence of the air chamber on classification and improves classification accuracy.
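As a concrete illustration of steps S41 to S44, the sketch below binarizes the cropped egg image into a mask and passes it to a binary classifier. The PyTorch model object, the 128x128 input size and the label names are assumptions made for the example, not values specified in the application.

```python
import cv2
import numpy as np
import torch

def classify_big_small_end(first_image_to_detect, first_classification_model):
    """Sketch of steps S41-S44: binarize the image, run the binary classifier,
    and return the predicted orientation with its confidence."""
    gray = cv2.cvtColor(first_image_to_detect, cv2.COLOR_BGR2GRAY)
    # Step S41: binarize the pixels so only the egg silhouette (contour) remains.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    mask = cv2.resize(mask, (128, 128)).astype(np.float32) / 255.0  # assumed input size
    x = torch.from_numpy(mask).unsqueeze(0).unsqueeze(0)            # shape (1, 1, 128, 128)
    # Steps S42/S43: the pre-trained binary classification model scores the mask.
    with torch.no_grad():
        probs = torch.softmax(first_classification_model(x), dim=1)[0]
    # Step S44: output the classification result and its confidence.
    label = "big_end_up" if probs[0] >= probs[1] else "small_end_up"
    return label, float(probs.max())
```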
In another embodiment, the method further comprises: generating a first sorting instruction according to the first classification result, wherein the first sorting instruction is used for controlling sorting equipment to execute corresponding sorting operation; the first sort instructions are sent to the sorting equipment.
Wherein the sorting operation of the sorting apparatus may include:
(1) based on the classification result, the eggs are sorted to different egg trays according to the direction of the big head and the small head, so that the eggs with the small heads upwards can be turned over manually or automatically.
(2) The eggs with small heads facing upwards are directly overturned and then are sorted to egg trays for placing the eggs with large heads facing upwards.
In this embodiment, the big and small ends are first recognized with the pre-trained classification model according to the egg contours in the captured egg images, which identifies the big and small ends accurately even for eggs whose shape is not visually distinct, and the sorting operation of the sorting equipment then turns over eggs placed small end up in time so that their big ends face upward. This reduces the breakage rate of eggs during transport and improves the hatchability and hatching rate during incubation. In addition, because the detection is contact-free, the eggs are protected from collision damage.
In another embodiment, the first classification result generated in step S12 further includes: the first confidence coefficient that the big end of the egg faces upwards or the second confidence coefficient that the small end faces upwards. For example, for one egg, the confidence is 90% with the large end facing upward, while for another egg, the confidence is 60% with the small end facing upward.
If the confidence is low, the classification of the egg may be deemed to be actually uncertain, and the subsequent detection of the big and small ends of the egg may be further required. In sorting eggs, eggs may be sorted to different egg flats based on confidence in the classification. A first preset threshold of the confidence that the big end of the egg faces upwards and a second preset threshold of the confidence that the small end faces upwards can be preset, for example, the first preset threshold is 70% and the second preset threshold is 60%.
Thus, generating a first sort instruction according to the first sort result includes:
when the first confidence coefficient is larger than or equal to a first preset threshold value, generating a first sorting instruction for sorting the poultry eggs to a first egg tray; when the second confidence coefficient is larger than or equal to a second preset threshold value, generating a first sorting instruction for sorting the poultry eggs to a second egg tray; when the first confidence coefficient is smaller than a first preset threshold value, generating a first sorting instruction for sorting the poultry eggs to a third egg tray; and when the second confidence coefficient is smaller than a second preset threshold value, generating a first sorting instruction for sorting the poultry eggs to a fourth egg tray.
Wherein, the third egg tray and the fourth egg tray can be the same. For example, eggs with a confidence level of less than 70% for big heads and less than 60% for small heads facing upwards may be sorted to the same flat, and then further detection may be performed on the eggs in the flat.
For eggs in the second egg tray, the sorting device can directly perform the turning operation, so that the big ends of the eggs in the egg tray are upward.
And for the eggs in the third egg tray and the fourth egg tray, the sorting equipment can be controlled to turn over the eggs, or the eggs can not be turned over temporarily, and after the subsequent second-time large and small head detection and classification, whether turning over is needed or not is determined according to the classification result.
In this embodiment, the classification model outputs a confidence for the big-end/small-end direction, so eggs can be sorted more precisely: eggs with high confidence can be placed on the corresponding egg trays and handled accordingly. For example, trays whose eggs reached the big-end-up confidence threshold can be placed directly into the incubation hall; eggs in trays that reached the small-end-up confidence threshold can be placed into the incubation hall after being turned over, and neither group needs further big-end/small-end detection during subsequent incubation. Eggs in trays whose confidence did not reach the threshold have their orientation detected again when they are later sorted by embryo development. This improves the accuracy of detecting the big-end/small-end direction and thus the hatchability and hatching rate of the eggs.
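The tray-assignment rule described above reduces to a small decision function. The sketch below only illustrates the threshold logic; the tray identifiers are hypothetical, and the 70% and 60% defaults reuse the example thresholds given earlier rather than values mandated by the application.

```python
def first_sorting_instruction(label, confidence,
                              big_threshold=0.70, small_threshold=0.60):
    """Map a first classification result (label + confidence) to a target egg tray."""
    if label == "big_end_up":
        # Confident big-end-up eggs go to tray 1; uncertain ones to tray 3.
        return "tray_1" if confidence >= big_threshold else "tray_3"
    # Confident small-end-up eggs go to tray 2 (to be turned over);
    # uncertain ones go to tray 4 (re-checked later via air-chamber detection).
    return "tray_2" if confidence >= small_threshold else "tray_4"
```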
The first classification model used in this embodiment may be obtained by acquiring a large number of sample images of poultry eggs in advance and training the sample images.
When an egg is sucked up by the suction nozzle, its top is partially occluded by the nozzle, so the pre-collected sample images are partially occluded egg images; the first classification model obtained by training can still distinguish the large and small ends from the contour features of the exposed portion.
Fig. 9 is a flowchart of an egg detection method according to another embodiment of the present disclosure. As shown in fig. 9, the method further includes a training process of the first classification model, and the specific steps are as follows:
and step S51, respectively acquiring sample images of the eggs when the pickup equipment picks up the big ends of the eggs and when the pickup equipment picks up the small ends of the eggs.
The sample images may be acquired as follows. N paper egg trays of 5 rows by 6 columns are prepared, holding hatching eggs that are still infertile after 7 days of incubation; the big end of every hatching egg in each tray is manually verified to face upward, and images are captured by sucking up the six eggs of one row at a time. Once all trays have been collected, these images are defined as big-end-up sample images. The hatching eggs are then repositioned with the small ends facing upward and collected in the same way as small-end-up sample images. The data labels are thus obtained during acquisition, saving the workload and uncertainty of subsequent manual annotation.
And step S52, adding a label with an upward big end to the sample image when the picking device picks the big end of the egg, and adding a label with an upward small end to the sample image when the picking device picks the small end of the egg.
In step S53, a binarization process is performed on the pixels of the sample image to generate a binarized mask sample image.
Because the binarized mask image can better reflect the real contour information of the poultry egg, the training is carried out based on the binarized mask image, so that the learning effect of the classification model on the curvature change characteristics of the poultry egg contour is better.
And step S54, performing binary classification training on the mask sample images based on a preset convolutional neural network, distinguishing the big-end and small-end directions by the curvature variation characteristics of the egg contour, to generate the first classification model.
The preset convolutional neural network may be a network structure such as VGG16, GoogLeNet or MobileNetV2.
In this embodiment, binary classification training is performed in advance on egg sample images with incomplete contours, and the curvature variation characteristics of the egg contour are learned, so that the big and small ends of eggs can be accurately identified during actual egg detection.
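Steps S51 to S54 amount to ordinary supervised training of a two-class CNN on binarized mask images. The sketch below uses torchvision's MobileNetV2, one of the backbones named above, adapted to single-channel masks; the data loader, label convention and hyperparameters are assumptions made for illustration only.

```python
import torch
import torch.nn as nn
from torchvision import models

def build_first_classification_model():
    """Binary classifier (big end up vs. small end up) over binarized mask images."""
    model = models.mobilenet_v2(weights=None)
    # Masks are single-channel, so replace the first convolution accordingly.
    model.features[0][0] = nn.Conv2d(1, 32, kernel_size=3, stride=2, padding=1, bias=False)
    model.classifier[1] = nn.Linear(model.last_channel, 2)  # two classes
    return model

def train(model, mask_loader, epochs=10, lr=1e-3):
    """mask_loader yields (mask_tensor, label) pairs; label 0 = big end up, 1 = small end up."""
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for masks, labels in mask_loader:
            optimizer.zero_grad()
            loss = criterion(model(masks), labels)
            loss.backward()
            optimizer.step()
    return model
```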
The method for detecting eggs based on the location of the egg air chambers provided by the embodiments of the present application is described below.
In hatching eggs that remain infertile after 7 days of incubation, the air chamber has developed noticeably, occupying roughly 2/5 of the egg volume and generally located at the big end of the egg. An illumination lamp is arranged inside the suction nozzle of the pickup device; when an egg is sucked up by the nozzle it is lit, and because the air chamber transmits light, the air chamber region in the image becomes overexposed.
Fig. 10 is a schematic view of an avian egg as it is illuminated with its large end facing upward according to embodiments of the present application. Fig. 11 is a schematic view of an avian egg as provided in another embodiment of the present application, after being illuminated with its large end facing upward. As shown in FIGS. 10 and 11, if the big end of the egg is upward, an overexposed area is generated at the contact position of the egg and the suction nozzle in the shot image, and the boundary of the air chamber is displayed clearly, as marked by a black frame in the image. And when the small ends of the eggs face upwards, no overexposure area is generated in the shot image.
According to the imaging characteristics of the air chamber, the big and small ends of the poultry egg can be classified according to whether the air chamber is detected in the image or not according to a pre-trained second classification model.
Fig. 12 is a flowchart of an egg detection method according to another embodiment of the present disclosure. As shown in fig. 12, the method includes the steps of:
step S61, an image to be detected is acquired, the image to be detected including the eggs picked up by the pickup device and illuminated.
And step S62, classifying the poultry eggs in the image to be detected according to the pre-trained classification model according to whether air chambers are detected, and generating a classification result.
And step S63, generating classification labels corresponding to the eggs according to the classification results.
In this embodiment, by judging with the pre-trained classification model whether an air chamber is detected in the captured egg image, the big and small ends of the egg can be identified accurately. Eggs placed small end up can then be turned over in time so that their big ends face upward. This reduces the breakage rate of eggs during transport and improves the hatchability and hatching rate during incubation. In addition, because the detection is contact-free, the eggs are protected from collision damage.
In another embodiment, the method further comprises: generating a sorting instruction according to the classification result, wherein the sorting instruction is used for controlling sorting equipment to execute a corresponding sorting operation; and sending the sorting instruction to the sorting equipment.
In this embodiment, by judging with the pre-trained classification model whether an air chamber is detected in the captured egg image, the big and small ends of the egg can be identified accurately. The sorting operation of the sorting equipment then turns over eggs placed small end up in time so that their big ends face upward. This reduces the breakage rate of eggs during transport and improves the hatchability and hatching rate during incubation. In addition, because the detection is contact-free, the eggs are protected from collision damage.
Fig. 13 is a flowchart of an egg detection method according to another embodiment of the present disclosure. As shown in fig. 13, the method further includes a training process of the second classification model, and the specific steps are as follows:
and step S71, when the pickup device picks up the eggs, the lamps in the pickup equipment illuminate the eggs, and sample images of the eggs are respectively acquired when the pickup equipment picks up the big ends of the eggs and when the pickup equipment picks up the small ends of the eggs.
And step S72, adding labels with the big ends facing upwards to the sample image when the picking device picks the big ends of the eggs, and adding labels with the small ends facing upwards to the sample image when the picking device picks the small ends of the eggs.
And step S73, performing binary classification training on the sample images based on a preset convolutional neural network, wherein an air chamber is determined to be detected when an overexposed region exists in the sample image, in which case the big end of the egg is judged to face upward, to generate the classification model.
The preset convolutional neural network may be a network structure such as YOLOv2, YOLOv3 or SSD.
In the embodiment, classification training is performed on the egg sample images in advance, and the characteristics of the exposure area in the egg image are learned according to the light transmittance of the air chamber, so that the big and small ends of the egg can be accurately identified in actual egg detection.
In this embodiment, the pickup device comprises a suction nozzle through which eggs are sucked up. A lamp is arranged in the suction nozzle, and when an egg is sucked up by the nozzle, the lamp is lit to illuminate the egg. As a result, an overexposed region appears in the captured image where the egg contacts the suction nozzle, and the boundary of the air chamber is displayed clearly.
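The application trains a detection-style model (such as the YOLO or SSD structures named above) to recognize the overexposed air-chamber region. As a much simpler illustration of the underlying cue only, the heuristic below merely checks whether a sufficiently large overexposed area appears in the upper part of the lit egg image; the 240 intensity cutoff and the 2% area ratio are arbitrary assumed values, not parameters from the application.

```python
import cv2
import numpy as np

def air_chamber_visible(lit_egg_image_bgr, overexposure_level=240, min_area_ratio=0.02):
    """Heuristic sketch: when the big end (air chamber) is up and lit through the nozzle,
    an overexposed bright region appears near the nozzle contact area at the top."""
    gray = cv2.cvtColor(lit_egg_image_bgr, cv2.COLOR_BGR2GRAY)
    top = gray[: gray.shape[0] // 3, :]              # region just below the suction nozzle
    overexposed = np.count_nonzero(top >= overexposure_level)
    return overexposed / top.size >= min_area_ratio  # True -> big end judged to face upward
```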
In an egg hatching scene, the two egg detection methods can be combined for use, so that the accuracy of judging the big and small ends of the eggs is improved.
In the early stage of hatching, the big-end/small-end orientation of fresh eggs is analyzed according to the egg contour. Eggs that are hard to distinguish from their contours, i.e., eggs whose confidence does not meet the requirement, can first enter the incubation hall, because early embryo development is not sensitive to the air-chamber orientation. When the eggs need to be sorted by embryo development condition at day 7, big-end/small-end judgment is added again, this time analyzing the orientation by detecting the egg's air chamber.
After the step S13, the method further includes:
acquiring a second image to be detected, wherein the second image to be detected comprises the poultry eggs which are picked up by the pickup equipment and are illuminated;
classifying the poultry eggs in the second image to be detected according to whether air chambers are detected or not according to a pre-trained second classification model to generate a second classification result;
and generating a second classification label corresponding to the poultry egg according to the second classification result.
In another embodiment, the method further comprises:
generating a second sorting instruction according to the second sorting result, wherein the second sorting instruction is used for controlling sorting equipment to execute corresponding sorting operation;
the second sort instructions are sent to the sorting equipment.
In another embodiment, when the first classification result includes a first confidence that the big end of the egg is upward or a second confidence that the small end of the egg is upward, acquiring a second image to be detected, including:
when the eggs with the first confidence degrees smaller than a first preset threshold value or the second confidence degrees smaller than a second preset threshold value are picked up by the picking device, the inner lamp of the picking device illuminates the eggs, and the eggs are shot to obtain a second egg image;
and obtaining a second image to be detected according to the second egg image.
The combination of the above two egg detection methods is described in detail below with a specific example.
Fig. 14 is a flowchart of an egg detection method according to another embodiment of the present disclosure. As shown in fig. 14, the specific flow is as follows:
step S81, the pickup device sucks up the eggs and takes pictures of the eggs;
step S82, acquiring a first image to be detected including a single egg;
step S83, inputting the first image to be detected into a first classification model, classifying the egg by big-end/small-end direction according to its contour, and outputting the confidence of that direction; the confidence threshold for big end up or small end up may be set to 8%;
step S84, judging whether the confidence coefficient is larger than or equal to 8%, if yes, executing step S88, and if no, executing step S85;
step S85, after the eggs have been incubated in the incubation hall for 7 days, the pickup device sucks up the eggs, the lamps in the suction nozzles are lit to illuminate the eggs, and the eggs are photographed;
step S86, acquiring a second image to be detected of each egg in a second egg tray;
step S87, inputting the second image to be detected into a second classification model, and classifying the poultry eggs according to the direction of big and small heads of the poultry eggs according to whether air chambers are detected or not; step S88 is executed;
step S88, recording the directions of big and small ends of the eggs;
and step S89, generating a corresponding sorting instruction according to the direction of the big end and the small end of the eggs, and sending the sorting instruction to sorting equipment.
In this embodiment, based on the egg contour and the imaging characteristics of the air chamber, the big-end/small-end direction of each egg is judged twice in the early stage of incubation, so the big and small ends can be distinguished more accurately; the sorting operation of the sorting equipment then turns over eggs placed small end up in time, ensuring that the big end (air chamber) faces upward during incubation and improving the hatchability and hatching rate. In addition, because the detection is contact-free, the eggs are protected from collision damage.
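Tying the two stages together, the flow of steps S81 to S89 can be sketched as follows. The helper functions are the illustrative ones from the earlier sketches in this description, not APIs defined by the application, and the confidence threshold is passed in rather than fixed.

```python
def two_stage_big_small_end(first_image, get_lit_image, first_model, confidence_threshold):
    """Sketch of steps S81-S89: contour-based judgment first; if the confidence is below
    the threshold, fall back to air-chamber detection on a lit image after 7 days."""
    # Stage 1 (steps S82-S84): contour-based binary classification on the mask image.
    label, confidence = classify_big_small_end(first_image, first_model)
    if confidence < confidence_threshold:
        # Stage 2 (steps S85-S87): light the egg through the suction nozzle, re-photograph
        # it, and judge the orientation from whether the air chamber is visible.
        lit_image = get_lit_image()
        label = "big_end_up" if air_chamber_visible(lit_image) else "small_end_up"
    # Steps S88-S89: the recorded orientation then drives the sorting instruction.
    return label
```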
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods.
Fig. 15 is a block diagram of an egg detection device according to an embodiment of the present disclosure, which may be implemented as part or all of an electronic device through software, hardware or a combination of the two. As shown in fig. 15, the egg detection apparatus includes:
the acquiring module 91 is used for acquiring an image to be detected, wherein the image to be detected comprises eggs picked up by the picking-up device;
the classification module 92 is used for classifying the poultry eggs in the image to be detected by big-end/small-end direction according to their contours using a pre-trained classification model, to generate a classification result;
and the generating module 93 is configured to generate a classification label corresponding to the egg according to the classification result.
Fig. 16 is a block diagram of an egg detection apparatus according to another embodiment of the present application, which may be implemented as part or all of an electronic device through software, hardware or a combination of the two. As shown in fig. 16, the egg detection apparatus includes:
the acquiring module 101 is used for acquiring an image to be detected, wherein the image to be detected comprises eggs which are picked up by a pickup device and are illuminated;
the classification module 102 is configured to classify poultry eggs in the image to be detected according to whether air chambers are detected or not according to a pre-trained classification model, and generate a classification result;
and the generating module 103 is used for generating classification labels corresponding to the eggs according to the classification result.
Fig. 17 is a block diagram of an egg detection system according to an embodiment of the present application, which further includes: a first egg detection device 111, a pickup apparatus 112, and a capture apparatus 113.
The pick-up device 112 comprises a suction nozzle through which eggs are sucked up;
the shooting device 113 shoots the eggs sucked up by the suction nozzle to obtain a first egg image;
the first egg detection device 111 is used for obtaining a first image to be detected according to the first egg image, and classifying the eggs in the first image to be detected by big-end/small-end direction according to their contours using a pre-trained first classification model, to generate a first classification result; generating a first sorting instruction according to the first classification result; and sending the first sorting instruction to the sorting equipment 114.
The system further comprises: sorting equipment 114. The first egg detection device 111 is used for generating a first sorting instruction according to the first classification result, and the first sorting instruction is used for controlling the sorting equipment to perform corresponding sorting operation on the eggs; sending the first sorting instruction to sorting equipment; the sorting device 114 is used for executing corresponding sorting operation according to the first sorting instruction.
In another embodiment, the system further comprises: a second egg detection device 115 for detecting a second egg,
the first classification result generated by the first egg detection device 111 further includes: a first confidence coefficient that the big end of the egg faces upwards or a second confidence coefficient that the small end faces upwards;
a suction nozzle of the pickup device 112 sucks up the poultry egg with the first confidence coefficient smaller than a first preset threshold value or the second confidence coefficient smaller than a second preset threshold value, and a lamp arranged in the suction nozzle is turned on;
the shooting device 113 shoots the illuminated eggs to obtain a second egg image;
the second egg detection device 115 is used for obtaining a second image to be detected according to the second egg image, classifying the eggs in the second image to be detected according to whether air chambers are detected or not according to a pre-trained second classification model, and generating a second classification result; and generating a second classification label corresponding to the poultry egg according to the second classification result.
In addition, the second egg detection device 115 is further configured to generate a second sorting instruction according to a second sorting result; the second sort instructions are sent to sorting equipment 114. And the sorting equipment 114 is used for executing corresponding sorting operation according to the second sorting instruction.
An embodiment of the present application further provides an electronic device. As shown in fig. 18, the electronic device may include a processor 1501, a communication interface 1502, a memory 1503 and a communication bus 1504, where the processor 1501, the communication interface 1502 and the memory 1503 communicate with each other through the communication bus 1504.
A memory 1503 for storing a computer program;
the processor 1501, when executing the computer program stored in the memory 1503, implements the steps of the method embodiments described below.
The communication bus mentioned for the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method embodiments described above.
It should be noted that, for the above-mentioned apparatus, electronic device and computer-readable storage medium embodiments, since they are basically similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
It is further noted that, herein, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (20)

1. An egg detection method, comprising:
acquiring a first image to be detected, wherein the first image to be detected comprises eggs picked up by a pickup device;
classifying the poultry eggs in the first image to be detected according to the contours of the poultry eggs by using a pre-trained first classification model to generate a first classification result;
and generating a first classification label corresponding to the poultry egg according to the first classification result.
2. The method of claim 1, wherein
the acquiring a first image to be detected comprises:
after the poultry eggs are picked up by the pickup device, shooting the poultry eggs to obtain a first egg image;
obtaining a first image to be detected according to the first egg image;
and the classifying the poultry eggs in the first image to be detected according to the contours of the poultry eggs by using the pre-trained first classification model to generate the first classification result comprises:
carrying out binarization processing on pixels of the first image to be detected to generate a binarized mask image;
inputting the mask image into the first classification model, wherein the first classification model is a binary classification model for judging whether the big end or the small end of the poultry egg is upward;
classifying the poultry eggs according to the first classification model;
and outputting a first classification result, wherein the first classification result is used for identifying whether the big end of the poultry egg faces upwards or the small end of the poultry egg faces upwards.
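
By way of illustration only (not part of the claims), the binarization and binary classification described in claim 2 above could be sketched as below, assuming OpenCV for the image processing; the Otsu threshold and the callable classifier are assumptions of this sketch.

```python
import cv2
import numpy as np


def classify_orientation(first_image_bgr: np.ndarray, binary_classifier):
    """Binarize the first image to be detected into a mask and classify it as
    big-end-up or small-end-up; binary_classifier is a placeholder for the
    trained two-class model (an assumption of this sketch)."""
    gray = cv2.cvtColor(first_image_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding separates the bright egg from the darker background.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # The model sees only the silhouette, so it must rely on contour shape
    # rather than shell colour or texture.
    p_big_end_up = float(binary_classifier(mask))
    label = "big_end_up" if p_big_end_up >= 0.5 else "small_end_up"
    return label, p_big_end_up
```
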
3. The method of claim 2, wherein when the first egg image includes at least two eggs, said obtaining the first image to be detected from the first egg image comprises:
segmenting the first egg image to obtain first egg images each comprising a single egg;
identifying contour information of the egg in each first egg image;
determining a minimum circumscribed rectangular frame of the poultry egg according to the contour information;
and cutting the first egg image according to the minimum circumscribed rectangle frame to obtain the first image to be detected.
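
An illustrative sketch of the contour extraction and minimum-circumscribed-rectangle cropping of claim 3 above, assuming OpenCV 4 and an axis-aligned bounding rectangle; the thresholding step and the helper name are assumptions of this example, not details fixed by the disclosure.

```python
import cv2
import numpy as np


def crop_to_min_bounding_rect(single_egg_image_bgr: np.ndarray) -> np.ndarray:
    """Locate the egg contour, determine its minimum circumscribed rectangle
    and crop the image to that rectangle (axis-aligned here by assumption)."""
    gray = cv2.cvtColor(single_egg_image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        raise ValueError("no egg contour found")
    egg_contour = max(contours, key=cv2.contourArea)  # largest blob assumed to be the egg
    x, y, w, h = cv2.boundingRect(egg_contour)        # minimum circumscribed rectangle
    return single_egg_image_bgr[y:y + h, x:x + w]
```
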
4. The method of claim 1, further comprising:
generating a first sorting instruction according to the first classification result, wherein the first sorting instruction is used for controlling sorting equipment to perform a corresponding sorting operation on the poultry eggs;
sending the first sorting instruction to the sorting equipment.
5. The method of claim 4, wherein the first classification result further comprises a first confidence that the egg is big end up or a second confidence that the egg is small end up;
the generating the first sorting instruction according to the first classification result comprises:
when the first confidence is greater than or equal to a first preset threshold, generating a first sorting instruction for sorting the poultry eggs to a first egg tray;
when the second confidence is greater than or equal to a second preset threshold, generating a first sorting instruction for sorting the poultry eggs to a second egg tray;
when the first confidence is smaller than the first preset threshold, generating a first sorting instruction for sorting the poultry eggs to a third egg tray;
and when the second confidence is smaller than the second preset threshold, generating a first sorting instruction for sorting the poultry eggs to a fourth egg tray.
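
The tray selection of claim 5 above amounts to a simple threshold comparison; a hedged sketch follows, with threshold defaults and tray identifiers chosen arbitrarily for illustration.

```python
def first_sorting_instruction(label: str,
                              confidence: float,
                              first_threshold: float = 0.9,
                              second_threshold: float = 0.9) -> str:
    """Map the first classification result onto one of the four egg trays.
    Default thresholds and tray identifiers are illustrative assumptions."""
    if label == "big_end_up":
        return "tray_1" if confidence >= first_threshold else "tray_3"
    if label == "small_end_up":
        return "tray_2" if confidence >= second_threshold else "tray_4"
    raise ValueError(f"unexpected label: {label}")
```
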
6. The method of claim 1, further comprising:
respectively acquiring sample images when the pickup device picks up the big end of the poultry egg and sample images when the pickup device picks up the small end;
adding a big-end-up label to the sample images acquired when the pickup device picks up the big end of the egg, and adding a small-end-up label to the sample images acquired when the pickup device picks up the small end of the egg;
carrying out binarization processing on pixels of the sample image to generate a binarized mask sample image;
and performing binary classification training on the mask sample images based on a preset convolutional neural network, distinguishing the big-end and small-end orientations by using the curvature variation characteristics of the egg contours, to generate the first classification model.
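
A minimal training sketch for the first classification model of claim 6 above, assuming PyTorch; the network architecture, optimizer and hyperparameters are arbitrary choices of this example and are not specified by the application.

```python
import torch
import torch.nn as nn


class ContourNet(nn.Module):
    """Small CNN that sees only the binarized mask, so it can only learn the
    contour-curvature cues distinguishing big end from small end. The layer
    sizes are arbitrary example choices."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # two classes: big end up / small end up

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


def train(model, loader, epochs=10, lr=1e-3):
    """loader yields (mask, label): mask is a float tensor of shape (N, 1, H, W)
    in [0, 1], label is 0 (small end up) or 1 (big end up)."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for masks, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(masks), labels)
            loss.backward()
            optimizer.step()
    return model
```
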
7. The method according to any one of claims 1 to 6, wherein the pickup device comprises a suction nozzle, and the poultry eggs are sucked up through the suction nozzle.
8. The method according to claim 1, wherein after the generating a first classification label corresponding to the poultry egg according to the first classification result, the method further comprises:
acquiring a second image to be detected, wherein the second image to be detected comprises the poultry eggs which are picked up by the pickup device and are illuminated;
classifying, by using a pre-trained second classification model, the poultry eggs in the second image to be detected according to whether an air chamber is detected, to generate a second classification result;
and generating a second classification label corresponding to the poultry egg according to the second classification result.
9. The method of claim 8, further comprising:
generating a second sorting instruction according to the second classification result, wherein the second sorting instruction is used for controlling sorting equipment to execute a corresponding sorting operation;
sending the second sorting instruction to the sorting equipment.
10. The method according to claim 8, wherein when the first classification result comprises a first confidence that the big end of the egg is upward or a second confidence that the small end of the egg is upward, the acquiring a second image to be detected comprises:
when a poultry egg whose first confidence is smaller than a first preset threshold or whose second confidence is smaller than a second preset threshold is picked up by the pickup device, illuminating the egg with the lamp of the pickup device, and shooting the egg to obtain a second egg image;
and obtaining a second image to be detected according to the second egg image.
11. An egg detection method, comprising:
acquiring an image to be detected, wherein the image to be detected comprises eggs which are picked up by a pickup device and are illuminated;
classifying, by using a pre-trained classification model, the poultry eggs in the image to be detected according to whether an air chamber is detected, to generate a classification result;
and generating a classification label corresponding to the poultry egg according to the classification result.
12. The method of claim 11, further comprising:
when the pickup device picks up a poultry egg, illuminating the egg with the lamp in the pickup device, and respectively acquiring sample images when the pickup device picks up the big end of the egg and when the pickup device picks up the small end of the egg;
adding a big-end-up label to the sample images acquired when the pickup device picks up the big end of the egg, and adding a small-end-up label to the sample images acquired when the pickup device picks up the small end of the egg;
and performing binary classification training on the sample images based on a preset convolutional neural network, determining that an air chamber is detected according to the presence of an overexposed area in a sample image and thereby judging that the big end of the poultry egg is upward, to generate the classification model.
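
The application trains a convolutional neural network for this second-stage decision; the following is only a simple rule-based stand-in, assuming OpenCV, that illustrates the overexposed-area cue named in claim 12 above. The brightness threshold and area ratio are assumed values.

```python
import cv2
import numpy as np


def air_chamber_detected(candled_image_bgr: np.ndarray,
                         brightness_threshold: int = 250,
                         min_area_ratio: float = 0.02) -> bool:
    """Return True when a sufficiently large overexposed region is present in
    the candled image, taken here as evidence that the air chamber (and hence
    the big end) is at the top. Threshold and ratio are assumed values."""
    gray = cv2.cvtColor(candled_image_bgr, cv2.COLOR_BGR2GRAY)
    overexposed_ratio = float((gray >= brightness_threshold).mean())
    return overexposed_ratio >= min_area_ratio
```
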
13. The method according to claim 11 or 12, wherein the pickup device comprises a suction nozzle, and the poultry eggs are sucked up through the suction nozzle; a lamp is arranged in the suction nozzle, and the lamp is turned on after the poultry egg is sucked up by the suction nozzle.
14. An egg detection device, comprising:
the acquisition module is used for acquiring an image to be detected, wherein the image to be detected comprises eggs picked by a pickup device;
the classification module is used for classifying, by using a pre-trained classification model, the poultry eggs in the image to be detected by big-end or small-end orientation according to the contours of the poultry eggs, to generate a classification result;
and the generation module is used for generating the classification label corresponding to the eggs according to the classification result.
15. An egg detection device, comprising:
the device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring an image to be detected, and the image to be detected comprises eggs which are picked up by a pickup device and are illuminated;
the classification module is used for classifying, by using a pre-trained classification model, the poultry eggs in the image to be detected according to whether an air chamber is detected, to generate a classification result;
and the generation module is used for generating the classification label corresponding to the eggs according to the classification result.
16. An egg detection system, comprising: a first egg detection device, a pickup device and a shooting device, wherein:
the pickup device comprises a suction nozzle, and the poultry eggs are sucked up through the suction nozzle;
the shooting device shoots the eggs sucked up by the suction nozzle to obtain a first egg image;
the first egg detection device is used for obtaining a first image to be detected according to the first egg image, classifying, by using a pre-trained first classification model, the eggs in the first image to be detected by big-end or small-end orientation according to the contours of the eggs, and generating a first classification result; and generating a first classification label corresponding to the poultry egg according to the first classification result.
17. The system of claim 16, further comprising: sorting equipment;
the first egg detection device is used for generating a first sorting instruction according to the first classification result, wherein the first sorting instruction is used for controlling the sorting equipment to perform a corresponding sorting operation on the eggs, and for sending the first sorting instruction to the sorting equipment;
the sorting equipment is used for executing corresponding sorting operation according to the first sorting instruction.
18. The system of claim 16, further comprising: a second egg detection device, wherein:
the first classification result generated by the first egg detection device further comprises a first confidence that the big end of the egg faces upwards or a second confidence that the small end faces upwards;
when the first confidence is smaller than a first preset threshold or the second confidence is smaller than a second preset threshold, the suction nozzle of the pickup device sucks up the poultry egg and a lamp arranged in the suction nozzle is turned on;
the shooting device shoots the illuminated eggs to obtain a second egg image;
the second egg detection device is used for obtaining a second image to be detected according to the second egg image, classifying, by using a pre-trained second classification model, the eggs in the second image to be detected according to whether an air chamber is detected, and generating a second classification result; and generating a second classification label corresponding to the poultry egg according to the second classification result.
19. An electronic device, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the method steps of any one of claims 1 to 13 when executing the computer program.
20. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the method steps of any one of claims 1 to 13.
CN201911002044.7A 2019-10-21 2019-10-21 Poultry egg detection method, device and system, electronic equipment and storage medium Pending CN110929755A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911002044.7A CN110929755A (en) 2019-10-21 2019-10-21 Poultry egg detection method, device and system, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN110929755A true CN110929755A (en) 2020-03-27

Family

ID=69849421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911002044.7A Pending CN110929755A (en) 2019-10-21 2019-10-21 Poultry egg detection method, device and system, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110929755A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111866400A (en) * 2020-07-02 2020-10-30 北京海益同展信息科技有限公司 Image processing method and device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1449655A (en) * 2002-03-27 2003-10-22 株式会社堀内 Method and apparatus for determining the sex of fertilized egg
US20040040515A1 (en) * 2002-08-30 2004-03-04 Kabusiki Kaisya Horiuchi Method and apparatus for determining the sex of a fertilized egg
CN102628724A (en) * 2012-04-19 2012-08-08 江苏大学 Fowl egg mass center detecting method based on image
CN106872467A (en) * 2017-01-04 2017-06-20 北京天诚智能科技有限公司 Chicken embryo fertility detection method and apparatus
CN108051449A (en) * 2018-01-30 2018-05-18 华中农业大学 The online visible detection method of Salted duck egg face crack based on morphologic edge detection
CN109191461A (en) * 2018-10-22 2019-01-11 广东工业大学 A kind of Countryside Egg recognition methods and identification device based on machine vision technique

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yu Fei et al.: "Research Progress on Key Technologies for Storage and Management of Breeder Eggs", 《上海畜牧兽医通讯》 *
Ma Xiulian et al.: "Recognition System for Infertile Eggs before Incubation Based on Embedded System and Machine Vision", 《农业机械学报》 *

Similar Documents

Publication Publication Date Title
CN109409365A (en) It is a kind of that method is identified and positioned to fruit-picking based on depth targets detection
CN108686978B (en) ARM-based fruit category and color sorting method and system
Aquino et al. vitisBerry: An Android-smartphone application to early evaluate the number of grapevine berries by means of image analysis
CN107084991A (en) The detection of quartz pushrod bubble and quality grading method based on machine vision
CN109829914B (en) Method and device for detecting product defects
CN105651776A (en) Device and method for automatically grading beef carcass meat yield based on computer vision
WO2018232518A1 (en) Determining positions and orientations of objects
CN108830293B (en) Animal weight identification method and device
CN111178197A (en) Mass R-CNN and Soft-NMS fusion based group-fed adherent pig example segmentation method
CN111239142A (en) Paste appearance defect detection device and method
CN110929755A (en) Poultry egg detection method, device and system, electronic equipment and storage medium
CN108764159A (en) Animal face recognition methods under condition of small sample and system
CN108874910B (en) Vision-based small target recognition system
CN111860652B (en) Method, device, equipment and medium for measuring animal body weight based on image detection
CN110991220A (en) Egg detection method, egg image processing method, egg detection device, egg image processing device, electronic equipment and storage medium
Chao et al. Design of a dual-camera system for poultry carcasses inspection
CN113793385A (en) Method and device for positioning fish head and fish tail
CN111161265A (en) Animal counting and image processing method and device
García-Manso et al. Towards selective and automatic harvesting of broccoli for agri-food industry
CN109191461A (en) A kind of Countryside Egg recognition methods and identification device based on machine vision technique
Jurado et al. Electronic system for the detection of chicken eggs suitable for incubation through image processing
CN113221704A (en) Animal posture recognition method and system based on deep learning and storage medium
WO2021088729A1 (en) Image processing method and apparatus, electronic device and storage medium
CN113191394A (en) Machine vision-based conveyor belt deviation diagnosis method and system
CN110930360A (en) Egg detection method, egg image processing method, egg detection device, image processing device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176
Applicant after: Jingdong Technology Information Technology Co.,Ltd.
Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176
Applicant before: Jingdong Shuke Haiyi Information Technology Co., Ltd

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176
Applicant after: Jingdong Shuke Haiyi Information Technology Co., Ltd
Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Beijing Economic and Technological Development Zone, Beijing 100176
Applicant before: BEIJING HAIYI TONGZHAN INFORMATION TECHNOLOGY Co.,Ltd.