CN117745796A - Live pig weight estimation method based on fish eye lens - Google Patents


Info

Publication number
CN117745796A
Authority
CN
China
Prior art keywords
tangential
radial
live
reference line
far
Prior art date
Legal status
Pending
Application number
CN202311768283.XA
Other languages
Chinese (zh)
Inventor
胡秋
蔡毅
Current Assignee
Chengdu Hand Sight Information Technology Co ltd
Original Assignee
Chengdu Hand Sight Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Hand Sight Information Technology Co ltd filed Critical Chengdu Hand Sight Information Technology Co ltd
Priority to CN202311768283.XA
Publication of CN117745796A
Legal status: Pending

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a live pig weight estimation method based on a fisheye lens, which comprises the following steps. S1: collect a detection image of a pigsty area through a fisheye lens. S2: calculate an area compensation factor for the live pigs in the detection image. S3: train a separation algorithm to identify the live pigs in the detection image and obtain the region where each pig is located and its identification area. S4: calculate the predicted area of each live pig from its area compensation factor and identification area. S5: calculate the final area of each live pig from the relation between its distance to the center point of the detection image and the radius of the detection image. S6: set a type coefficient according to the nutritional status of the live pig, and calculate the weight of the live pig from the type coefficient. The method can capture images covering a larger pigsty range through the fisheye lens and, by compensating the area distortion of the live pigs under the fisheye lens, estimates their area with high precision, thereby realizing high-precision monitoring of live pig weight.

Description

Live pig weight estimation method based on fish eye lens
Technical Field
The invention relates to the technical field of animal weight measurement, and in particular to a live pig weight estimation method based on a fisheye lens.
Background
The weight of live pigs is an important basis for producers to measure pig quality, an important index for managing the growth, health and marketing of live pigs, and a necessary condition for making scientific feeding decisions. In particular, estimating the weight of fattening pigs in time is important for profitable decisions, because it allows the pork producer to make correct marketing decisions for a group of pigs while reducing labor and feed costs.
With the rapid development and application of computer technology, innovative technologies such as artificial intelligence, big data, the Internet of Things and machine vision are being deeply fused with traditional animal husbandry. Machine vision in particular provides a new approach to pig body-size measurement and weight estimation and can better solve pain points in the breeding process. Currently, computer vision technology is successfully applied in the live pig industry, and non-contact weight estimation of live pigs based on 2D or 3D images is one of its typical applications. Such methods mainly collect images through sensors installed in the pig farm with manual intervention via conventional video AI technology, denoise and segment the collected images to extract body-scale indexes (including body length, height, belly width, chest width and back area of the live pig), construct an automatic live pig weight model through nonlinear methods such as convolutional neural network or PointNet++ models, and finally estimate the weight of the live pigs from the modeled data.
The prior art has the following defects:
(1) The preparation workload is large: a large amount of training data must be prepared.
(2) The applicability is poor: image algorithms are highly sensitive to the environment, and accuracy may degrade when the environment changes.
(3) The hardware requirements are high: running the algorithm places severe demands on server performance.
(4) Pig farms in China are mainly small and medium-sized; their hardware facilities are old and poorly configured, and retrofitting is expensive. The workload of early training or the hardware cost of later use makes the overall cost of the scheme high and commercialization difficult.
Disclosure of Invention
The invention provides a live pig weight estimation method based on a fisheye lens, which aims to improve the universality of live pig weight estimation, reduce its hardware cost, and improve its convenience and efficiency.
In order to achieve the above object, the present invention provides a live pig weight estimation method based on fish eye lens, the method comprising:
s1: collecting a detection image of a pigsty area through a fish eye lens, wherein the detection image is circular;
s2: calculating an area compensation factor of the live pigs in the detection image;
s3: training a separation algorithm to identify the live pigs in the detection image and obtain the region where the live pigs are located and their identification area;
s4: calculating the prediction area of the live pigs according to the area compensation factors and the identification areas of the live pigs;
s5: calculating the final area of the live pig according to the relation between the distance from the live pig to the center point in the detection image and the radius of the detection image;
s6: setting a type coefficient of the live pigs according to the nutritional status of the live pigs, and calculating the weight of the live pigs according to the type coefficient;
weight of live pig = final area of live pig x type coefficient.
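The sequence S1 to S6 above can be sketched end-to-end as follows; the function and parameter names are illustrative assumptions, and the S5 correction step is stubbed out because its formulas are not reproduced in this text:

```python
def estimate_pig_weight(identified_area_px, area_compensation_factor,
                        dist_to_center, image_radius, type_coefficient):
    """Sketch of S4-S6: combine the identification area (from S3) with
    the area compensation factor (from S2) and the type coefficient (S6)."""
    # S4: predicted area = identification area x compensation factor x 2.5
    predicted_area = identified_area_px * area_compensation_factor * 2.5
    # S5: a distance-dependent correction applies here, split at 45% of
    # the image radius; its exact formulas are not reproduced in this
    # text, so the predicted area passes through unchanged in this sketch.
    final_area = predicted_area
    # S6: weight = final area x type coefficient
    return final_area * type_coefficient
```

The S5 stub would be replaced by the two branch formulas of the method once known; everything else follows the formulas stated in the claims.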
Further, the calculation of S2 includes the following steps:
s21: marking the center point of the detected image;
s22: selecting an over-center reference target, a radial reference target and a tangential reference target in the pigsty area, and marking an over-center reference line, a radial far reference line and a tangential far reference line in the detection image according to the over-center reference target, the radial reference target and the tangential reference target;
s23: according to the actual lengths of the over-center reference target, the radial reference target and the tangential reference target, converting the pixel lengths of the over-center reference line, the radial far reference line and the tangential far reference line into standard reference lengths corresponding to 100cm in equal length, wherein the standard reference lengths are respectively an over-center reference line standard length, a radial far reference line standard length and a tangential far reference line standard length;
s24: calculating radial far reference linear variables and tangential far reference linear variables according to the standard length of the over-center reference line, the standard length of the radial far reference line and the standard length of the tangential far reference line;
obtaining the distance between the over-center reference line and the center point of the radial far reference line to obtain the radial distance;
obtaining the distance between the over-center reference line and the center point of the tangential far reference line to obtain a tangential distance;
s25: converting the pixel length of the over-center reference line into a standard pixel length according to the actual length of the over-center reference target;
standard pixel length = pixel length of over-center reference line/actual length of over-center reference target;
s26: calculating radial parameters according to the radial far reference linear variable and the radial distance according to a normal function, and calculating tangential parameters according to the tangential far reference linear variable and the tangential distance, wherein the normal function is as follows:
wherein: σ=0.3989422804, d is radial distance or tangential distance;
when d is the radial distance, f (d) is a radial far-reference linear variable, and the calculated k value is a radial parameter;
when d is tangential distance, f (d) is tangential far reference linear variable, and the calculated k value is tangential parameter;
s27: and calculating the area compensation factor of the live pig according to the normal function, the radial parameter, the tangential parameter and the distance from the live pig to the central point in the detection image.
Further, in S22:
marking an over-center reference line by taking the length of an over-center reference target in the detection image as a standard at a position close to the center point in the detection image;
marking a radial far reference line by taking the length of a radial reference target in the detection image as a benchmark in the radial direction far from the over-center reference line in the detection image;
and marking a tangential far reference line by taking the length of a tangential reference target in the detected image as a benchmark in the tangential direction far from the over-center reference line in the detected image.
Further, the actual lengths of the over-center reference target, the radial reference target and the tangential reference target are consistent.
Further, the over-center reference target, the radial reference target and the tangential reference target are excrement leakage plate holes arranged in the pigsty area.
Further, in S23:
the calculation formula of the standard length of the over-center reference line is as follows: lcs=100×lc/lcr;
the calculation formula of the standard length of the radial far reference line is as follows: le1 s=100×le1/le1r;
the calculation formula of the standard length of the tangential far reference line is as follows: le2 s=100×le2/le2r;
wherein: lcs, le1s and le2s are respectively the standard length of the over-center reference line, the standard length of the radial far reference line and the standard length of the tangential far reference line; lc, le1, le2 are the pixel length of the over-center reference line, the pixel length of the radial far reference line, the pixel length of the tangential far reference line, respectively; lcr, le1r, le2r are the actual length of the over-center reference target, the actual length of the radial reference target, the actual length of the tangential reference target, respectively.
Further, in S24:
the calculation formula of the radial far reference linear variable is: R1 = le1s/lcs;
the calculation formula of the tangential far reference linear variable is: R2 = le2s/lcs;
wherein: R1 and R2 are the radial far reference linear variable and the tangential far reference linear variable, respectively; lcs, le1s and le2s are the over-center reference line standard length, the radial far reference line standard length and the tangential far reference line standard length, respectively.
Further, the step of calculating S27 includes:
the first step: calculating the distance between the live pigs and the central point;
and a second step of: calculating the deformation of the live pigs in radial and tangential directions:
substituting the radial parameter and the distance from the live pig to the center point into the normal function yields the radial far reference linear variable of the live pig;
substituting the tangential parameter and the distance from the live pig to the center point into the normal function yields the tangential far reference linear variable of the live pig;
and a third step of: calculating an area compensation factor of the live pig according to the radial deformation and tangential deformation of the live pig;
the calculation formula of the area compensation factor of the live pigs is as follows:
wherein: dp is the distance from the live pig to the center point; f1(dp) is the tangential far reference linear variable of the live pig; f2(dp) is the radial far reference linear variable of the live pig; ls is the standard pixel length; Sc is the area compensation factor of the live pig.
Further, in S4, the formula for calculating the predicted area of the live pig is:
Sr = Si × Sc × 2.5;
wherein: Sr is the predicted area of the live pig; Si is the identification area of the live pig; Sc is the area compensation factor of the live pig.
Further, in S5:
if the distance from the live pig to the center point in the detection image is less than 45% of the radius of the detection image:
if the distance from the live pig to the center point in the detection image is more than or equal to 45% of the radius of the detection image:
wherein: s is S p Is the final area of the live pigs; s is S r Is the prediction area of the live pigs; d, d p Distance from the live pigs to the central point; r is the radius of the detected image.
Further, the type coefficient is T:
when live pigs are well nourished, t=142;
when live pigs are normal in nutrition, t=156;
when live pigs are poorly nourished, t=162.
The invention has the beneficial effects that:
(1) According to the live pig weight estimation method based on the fisheye lens, images covering a larger pigsty range can be acquired through the fisheye lens, and a small amount of learning and training with a conventional separation algorithm identifies the live pigs in the images and circles the regions and areas where they are located. Because the images acquired by the fisheye lens have large distortion, the scheme compensates the area distortion of the live pigs under the fisheye lens to estimate their area with high precision, realizing high-precision monitoring of live pig weight.
(2) The method has low requirements on the overall performance of the hardware system and mainly adopts conventional calculation methods. It does not need a large amount of preliminary preparation in the process of estimating the weight of the live pigs, the training module requires only a small number of training samples, and only the relevant auxiliary line configuration is needed, which reduces the retrofitting cost of an intelligent weight-measuring module for the pig farm.
(3) Non-contact measurement is adopted without other sensors, so the applicability is strong, the dependence on the overall environment is low, and the method can be used quickly in different environments. It realizes on-line monitoring of live pig weight in a non-contact, unattended manner, reduces manpower, reduces the losses from the stress reactions that traditional weighing methods cause in pig breeding, and is applicable to different types of pig farms.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a live pig weight estimation method based on fish eye lenses.
Fig. 2 is a schematic diagram of a detected image collected by a fisheye lens.
Fig. 3 is a flowchart of S2 calculating the area compensation factor of the live pig in the detected image.
Fig. 4 is a schematic diagram of an over-center reference line, a radial far reference line, and a tangential far reference line of the detected image annotation in S22.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention; it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments, and that all other embodiments obtained by persons of ordinary skill in the art without making creative efforts based on the embodiments in the present invention are within the protection scope of the present invention.
The embodiment of the application provides a live pig weight estimation method based on a fisheye lens, specifically, referring to fig. 1, the method includes:
s1: collecting a detection image of a pigsty area through a fish eye lens, wherein the detection image is circular;
in order to realize weight estimation of live pigs in a large pigsty and reduce the cost of configuring multiple lenses, this embodiment uses a fisheye lens to photograph the pigsty area; the detection image obtained in this embodiment is shown in fig. 2. The fisheye lens provides a wide-angle view and, compared with other lens types, covers a larger area; the detection image obtained through the fisheye lens is circular.
S2: and calculating an area compensation factor of the live pigs in the detection image.
The detection image of the pigsty area is acquired in step S1. Because the detection image acquired by the fisheye lens has large distortion, the distortion must be compensated in order to accurately estimate the weight of the live pigs in the detection image; in this embodiment, this compensation is realized by obtaining the area compensation factor of the live pigs.
S3: learning and training are carried out through a separation algorithm, so that the identification of the live pigs in the detection image is completed, and the area and the identification area of the live pigs are obtained;
in S3 of this embodiment, the pig recognition module does not need a large amount of preliminary preparation; the training module needs only a small number of training samples, and a small amount of learning and training with a conventional separation algorithm completes the recognition of the pigs in the image.
S4: calculating the prediction area of the live pigs according to the area compensation factors and the identification areas of the live pigs;
s5: calculating the final area of the live pig according to the relation between the distance from the live pig to the center point in the detection image and the radius of the detection image;
s6: setting a type coefficient of the live pigs according to the nutritional status of the live pigs, and calculating the weight of the live pigs according to the type coefficient;
weight of live pig = final area of live pig x type coefficient.
In this embodiment, images covering a larger pigsty range can be acquired through the fisheye lens, and a small amount of learning and training with a conventional separation algorithm completes the identification of the live pigs in the image and circles the regions and areas where they are located. Because the images acquired by the fisheye lens have large distortion, this scheme compensates the area distortion of the live pigs under the fisheye lens to estimate their area with high precision, realizing high-precision monitoring of live pig weight.
Meanwhile, the scheme of this embodiment needs no other sensors and adopts non-contact measurement, so the applicability is strong, the dependence on the overall environment is low, and it can be used quickly in different environments. The method realizes on-line monitoring of live pig weight in a non-contact, unattended manner, reduces manpower, reduces the losses from stress reactions caused by traditional weighing methods, and is applicable to different types of pig farms.
In order to calculate the distortion under the fisheye lens with high precision, the area compensation factor of the live pig is calculated in S2. In this embodiment, the calculation of S2 includes the following steps; the calculation flow is shown in fig. 3:
s21: marking a center point of the detection image;
after the detection image acquired by the fisheye lens is obtained, the center point of the fisheye image must first be confirmed and marked (it may not be the center point of the picture; if it cannot be confirmed directly from the picture, the position must be entered by dotting). Only when the fisheye lens faces straight down does the lens center point coincide with the image center point, so the lens center point cannot simply be taken as the image center point; when the gap is large, the image center can only be located by manual marking.
S22: selecting an over-center reference target, a radial reference target and a tangential reference target in a pigsty area, and marking an over-center reference line, a radial far reference line and a tangential far reference line in a detection image according to the over-center reference target, the radial reference target and the tangential reference target;
as shown in fig. 4, in order to correct the distortion of the detection image, in this embodiment an over-center reference target, a radial reference target and a tangential reference target are selected, according to the layout of the pigsty, at the over-center point and near the edge of the frame, and the over-center reference line, radial far reference line and tangential far reference line are marked according to the three reference targets. The tangential far reference line is perpendicular to the radial direction, and the radial far reference line coincides with the radial direction, preferably at the far-from-center position of the detection area. The reference targets must be selected carefully, because the accuracy of all subsequent calculations depends on the accurate selection of the reference points.
S23: according to the actual lengths of the over-center reference target, the radial reference target and the tangential reference target, the pixel lengths of the over-center reference line, the radial far-reference line and the tangential far-reference line are converted into standard reference lengths corresponding to 100cm in equal length, and the standard lengths are respectively the standard length of the over-center reference line, the standard length of the radial far-reference line and the standard length of the tangential far-reference line.
S24: according to the standard length of the over-center reference line, the standard length of the radial far reference line and the standard length of the tangential far reference line, calculating radial far reference linear variables and tangential far reference linear variables;
obtaining the distance between the center points of the over-center reference line and the radial far reference line gives the radial distance; as shown in fig. 4, d1 is the radial distance;
obtaining the distance between the center points of the over-center reference line and the tangential far reference line gives the tangential distance; as shown in fig. 4, d2 is the tangential distance.
S25: converting the pixel length of the over-center reference line into a standard pixel length according to the actual length of the over-center reference target;
standard pixel length = pixel length of over-center reference line/actual length of over-center reference target;
in this embodiment, in S23 to S25, based on the over-center reference line, the radial far reference line and the tangential far reference line, several parameters are calculated, including the radial far reference linear variable, the tangential far reference linear variable, the radial distance, the tangential distance and the standard pixel length.
S26: according to a normal function, calculating radial parameters according to radial far reference linear variables and radial distances, and calculating tangential parameters according to tangential far reference linear variables and tangential distances, wherein the normal function is as follows:
wherein: σ=0.3989422804, d is radial distance or tangential distance;
when d is the radial distance, f (d) is a radial far-reference linear variable, and the calculated k value is a radial parameter;
when d is tangential distance, f (d) is tangential far reference linear variable, and the calculated k value is tangential parameter;
in this embodiment, the distortion of the image consists of radial distortion and tangential distortion, and in S26 the radial parameter of the radial distortion and the tangential parameter of the tangential distortion are calculated based on the normal function.
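The text gives σ = 0.3989422804 (that is, 1/√(2π)) but the normal function's formula itself is not reproduced in this extraction; only the roles of d, f(d) and k are stated. Whatever its exact form, the parameter k can be recovered from a measured pair (d, f(d)) by a one-dimensional root solve on a branch where the function is monotonic in k. The Gaussian form below, a normal density with standard deviation k, is purely an assumed stand-in for illustration:

```python
import math

SIGMA = 0.3989422804  # given in the text; equals 1/sqrt(2*pi)

def normal_fn(d, k):
    # Assumed stand-in form (NOT reproduced from the patent): a normal
    # density with standard deviation k, f(d) = (SIGMA/k) * exp(-d^2 / (2k^2)).
    return SIGMA / k * math.exp(-d * d / (2.0 * k * k))

def solve_k(d, f_measured, lo=1e-6, hi=1e6, iters=200):
    """Bisection for k. On k >= d the assumed form decreases
    monotonically in k, so bisection recovers k from the measured f(d)."""
    lo = max(lo, d)  # restrict to the decreasing branch k >= d
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if normal_fn(d, mid) > f_measured:
            lo = mid  # f(d, mid) too large -> need larger k
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With the patent's actual normal function substituted for `normal_fn`, the same bisection would compute the radial parameter from (radial distance, radial far reference linear variable) and the tangential parameter from (tangential distance, tangential far reference linear variable).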
S27: and calculating the area compensation factor of the live pig according to the normal function, the radial parameter, the tangential parameter and the distance from the live pig to the central point in the detection image.
In S27, according to the distance from the live pig to the center point, the tangential and radial distortion amounts of the live pig in the detection image are obtained, and the area compensation factor of the live pig is then calculated.
The area compensation factor of the live pig can be obtained through the above calculation, wherein the accuracy of the area compensation factor of the live pig is related to the selection of the over-center reference target, the radial reference target, and the tangential reference target in step S22. Specifically, in this embodiment, the labeling process of the reference line is:
marking an over-center reference line by taking the length of an over-center reference target in the detected image as a standard at a position close to the center point in the detected image;
marking a radial far reference line by taking the length of a radial reference target in the detected image as a benchmark in the radial direction far from the over-center reference line in the detected image;
and marking a tangential far reference line by taking the length of a tangential reference target in the detected image as a benchmark in the tangential direction away from the over-center reference line in the detected image.
The over-center reference line, the radial far reference line and the tangential far reference line are marked in the detection image with the lengths of the over-center reference target, the radial reference target and the tangential reference target in the detection image as their respective references.
In order to simplify the calculation and reduce conversions during estimation, the actual lengths of the over-center reference target, the radial reference target and the tangential reference target are equal in this embodiment. The actual length is the real physical length, not the length displayed in the detection image; since the detection image is distorted, the length of each reference target in the detection image does not coincide with its actual length.
As shown in fig. 2 and fig. 4, a plurality of excrement leakage plate holes are uniformly arranged within the pigsty, and their actual lengths are identical. Combining this with the actual conditions of the pigsty, in this embodiment the over-center reference target, the radial reference target and the tangential reference target are excrement leakage plate holes arranged in the pigsty area.
S23 converts the over-center reference line standard length, the radial far reference line standard length and the tangential far reference line standard length; the calculation formulas in S23 are as follows:
the calculation formula of the standard length of the over-center reference line is as follows: lcs=100×lc/lcr;
the calculation formula of the standard length of the radial far reference line is as follows: le1 s=100×le1/le1r;
the calculation formula of the standard length of the tangential far reference line is as follows: le2 s=100×le2/le2r;
wherein: lcs, le1s and le2s are respectively the standard length of the over-center reference line, the standard length of the radial far reference line and the standard length of the tangential far reference line; lc, le1, le2 are the pixel length of the over-center reference line, the pixel length of the radial far reference line, the pixel length of the tangential far reference line, respectively; lcr, le1r, le2r are the actual length of the over-center reference target, the actual length of the radial reference target, the actual length of the tangential reference target, respectively.
S24, calculating deformation amounts of reference lines in the radial direction and the tangential direction, wherein in S24:
the calculation formula of the radial far reference linear variable is: R1 = le1s/lcs;
the calculation formula of the tangential far reference linear variable is: R2 = le2s/lcs;
wherein: R1 and R2 are the radial far reference linear variable and the tangential far reference linear variable, respectively; lcs, le1s and le2s are the over-center reference line standard length, the radial far reference line standard length and the tangential far reference line standard length, respectively.
S27, calculating an area compensation factor of the live pigs, wherein the step of calculating S27 comprises the following steps:
the first step: calculating the distance dp between the live pig and the center point;
the second step: calculating the deformation of the live pig in the radial and tangential directions:
substituting the radial parameter and the distance from the live pig to the center point into the normal function yields the radial far reference linear variable f2(dp) of the live pig;
substituting the tangential parameter and the distance from the live pig to the center point into the normal function yields the tangential far reference linear variable f1(dp) of the live pig;
the third step: calculating the area compensation factor of the live pig according to its radial and tangential deformations;
the calculation formula of the area compensation factor of the live pig is as follows:
wherein: dp is the distance from the live pig to the center point; f1(dp) is the tangential far reference linear variable of the live pig; f2(dp) is the radial far reference linear variable of the live pig; ls is the standard pixel length; Sc is the area compensation factor of the live pig.
In this embodiment, the distortion is compensated in S4 by the live pig's area compensation factor to obtain the predicted area of the live pig. The chest girth is preliminarily estimated as 2.5 times the projected width, and the predicted area of the live pig is calculated as:
Sr = Si × Sc × 2.5;
wherein: Sr is the predicted area of the live pig; Si is the identified area of the live pig; Sc is the area compensation factor of the live pig.
S5, calculating the final area of the live pig according to the relation between its distance from the center point and the radius of the detection image; the farther from the center point of the detection image, the larger the distortion. In this embodiment, in S5:
if the distance from the live pig to the center point in the detection image is less than 45% of the radius of the detection image:
if the distance from the live pig to the center point in the detection image is more than or equal to 45% of the radius of the detection image:
wherein: s is S p Is the final area of the live pigs; s is S r Is the prediction area of the live pigs; d, d p Distance from the live pigs to the central point; r is the radius of the detected image.
The 45% distortion is smaller, the distortion after 45% is larger, the correction is needed by different formulas, and 45% is selected and obtained mainly through reverse deduction according to practically collected data.
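The two piecewise correction formulas of S5 are missing from the translated text; the sketch below shows only the 45%-of-radius dispatch structure, with placeholder bodies (identity near the center, a purely hypothetical linear attenuation farther out):

```python
def final_area(sr, dp, radius):
    # Dispatch on the 45%-of-radius threshold from S5. The two correction
    # formulas did not survive translation, so both branch bodies are
    # PLACEHOLDERS, not the patent's formulas.
    if dp < 0.45 * radius:
        return sr  # near-center: distortion treated as negligible
    return sr * (1.0 - (dp / radius - 0.45))  # hypothetical attenuation
```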
Step S6, after obtaining the final area of the live pig, estimates the weight of the live pig according to its type coefficient T:
when the live pig is well nourished, T = 142;
when the live pig is normally nourished, T = 156;
when the live pig is poorly nourished, T = 162.
To further improve calculation accuracy, this embodiment may collect multiple detection images and estimate the weight of each live pig several times, then output the average as the final estimated weight of the measured live pig.
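The S6 lookup and the averaging step above can be sketched as follows; the nutrition labels are hypothetical keys, and the area's units and scale are left as in the patent:

```python
def type_coefficient(nutrition):
    # Type coefficient T from S6; the string keys are hypothetical labels.
    return {"good": 142, "normal": 156, "poor": 162}[nutrition]

def estimate_weight(final_areas, nutrition):
    # weight = final area x T, averaged over several detection images as
    # the embodiment suggests, to smooth per-frame estimation error.
    t = type_coefficient(nutrition)
    return sum(a * t for a in final_areas) / len(final_areas)
```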
In the live pig weight estimation method based on the fisheye lens of this embodiment, images covering a larger pigsty range can be acquired through the fisheye lens, and a small amount of learning and training with a conventional separation algorithm identifies the live pigs in the images and outlines the region and area where each live pig is located. Because images acquired by a fisheye lens are heavily distorted, the live pig's area is compensated for distortion, realizing high-precision monitoring of live pig weight.
The method of this embodiment places low demands on the overall performance of the hardware system and mainly uses conventional calculation methods. The live pig weight estimation process requires no large amount of preliminary preparation, the training module needs few training samples, and only the relevant auxiliary lines need to be configured, which reduces the retrofit cost of an intelligent pig farm's weight measurement module.
The method of this embodiment needs no other sensors, uses non-contact measurement, is highly applicable, depends little on the overall environment, and can be deployed quickly in different environments. It realizes online monitoring of live pig weight in a non-contact, unattended manner, reduces labor, lessens the stress reactions that traditional weighing methods inflict on live pig breeding, and is applicable to different types of pig farms.
It will be appreciated by those skilled in the art that all or part of the flow of the method of the above embodiment may be implemented by a program to instruct related hardware, and the program may be stored in a computer readable storage medium, and the program may include the flow of the embodiment of the above methods when executed. The storage medium may be a magnetic disk, an optical disk, a Read-only memory (ROM), a random access memory (Random Access Memory, RAM), or the like.
The foregoing is merely a preferred embodiment of the invention, and although the invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that modifications may be made to the embodiments described in the foregoing embodiments, or equivalents may be substituted for some of the features thereof. Modifications, equivalents, and alternatives falling within the spirit and principles of the invention are intended to be included within the scope of the invention.

Claims (10)

1. A live pig weight estimation method based on fish eye lenses, which is characterized by comprising the following steps:
s1: collecting a detection image of a pigsty area through a fish eye lens, wherein the detection image is circular;
s2: calculating an area compensation factor of the live pigs in the detection image;
s3: learning and training are carried out through a separation algorithm, so that the identification of the live pigs in the detection image is completed, and the area and the identification area of the live pigs are obtained;
s4: calculating the prediction area of the live pigs according to the area compensation factors and the identification areas of the live pigs;
s5: calculating the final area of the live pig according to the relation between the distance from the live pig to the center point in the detection image and the radius of the detection image;
s6: setting a type coefficient of the live pigs according to the nutritional status of the live pigs, and calculating the weight of the live pigs according to the type coefficient;
weight of live pig = final area of live pig × type coefficient.
2. The method for estimating the weight of a live pig based on a fisheye lens according to claim 1, wherein the step of calculating S2 comprises:
s21: marking the center point of the detected image;
s22: selecting an over-center reference target, a radial reference target and a tangential reference target in the pigsty area, and marking an over-center reference line, a radial far reference line and a tangential far reference line in the detection image according to the over-center reference target, the radial reference target and the tangential reference target;
s23: according to the actual lengths of the over-center reference target, the radial reference target and the tangential reference target, converting the pixel lengths of the over-center reference line, the radial far reference line and the tangential far reference line into standard reference lengths corresponding to 100cm in equal length, wherein the standard reference lengths are respectively an over-center reference line standard length, a radial far reference line standard length and a tangential far reference line standard length;
s24: calculating the radial far reference line deformation and the tangential far reference line deformation according to the over-center reference line standard length, the radial far reference line standard length and the tangential far reference line standard length;
obtaining the distance between the over-center reference line and the center point of the radial far reference line to obtain the radial distance;
obtaining the distance between the over-center reference line and the center point of the tangential far reference line to obtain a tangential distance;
s25: converting the pixel length of the over-center reference line into a standard pixel length according to the actual length of the over-center reference target;
standard pixel length = pixel length of over-center reference line/actual length of over-center reference target;
s26: calculating a radial parameter from the radial far reference line deformation and the radial distance, and a tangential parameter from the tangential far reference line deformation and the tangential distance, according to a normal function, wherein the normal function is as follows:
wherein: σ = 0.3989422804, and d is the radial distance or the tangential distance;
when d is the radial distance, f(d) is the radial far reference line deformation, and the calculated k value is the radial parameter;
when d is the tangential distance, f(d) is the tangential far reference line deformation, and the calculated k value is the tangential parameter;
s27: and calculating the area compensation factor of the live pig according to the normal function, the radial parameter, the tangential parameter and the distance from the live pig to the central point in the detection image.
3. The method for estimating the weight of a live pig based on a fisheye lens according to claim 1, wherein in S4, the formula for calculating the predicted area of the live pig is:
Sr = Si × Sc × 2.5;
wherein: Sr is the predicted area of the live pig; Si is the identified area of the live pig; Sc is the area compensation factor of the live pig.
4. The method for estimating the weight of a live pig based on a fish eye lens according to claim 1, wherein in S5:
if the distance from the live pig to the center point in the detection image is less than 45% of the radius of the detection image:
if the distance from the live pig to the center point in the detection image is more than or equal to 45% of the radius of the detection image:
wherein: s is S p Is the final area of the live pigs; s is S r Is the prediction area of the live pigs; d, d p Distance from the live pigs to the central point; r is the radius of the detected image.
5. The method for estimating the weight of a live pig based on a fish eye lens according to claim 2, wherein in S22:
marking an over-center reference line by taking the length of an over-center reference target in the detection image as a standard at a position close to the center point in the detection image;
marking a radial far reference line by taking the length of a radial reference target in the detection image as a benchmark in the radial direction far from the over-center reference line in the detection image;
and marking a tangential far reference line by taking the length of a tangential reference target in the detected image as a benchmark in the tangential direction far from the over-center reference line in the detected image.
6. The live pig weight estimation method based on the fisheye lens according to claim 2, wherein the actual lengths of the over-center reference target, the radial reference target and the tangential reference target are identical.
7. The method for estimating the weight of a live pig based on a fish eye lens according to claim 6, wherein the over-center reference target, the radial reference target and the tangential reference target are manure leaking plate holes arranged in a pigsty area.
8. The method for estimating the weight of live pigs based on fish-eye lenses according to claim 2, wherein in S23:
the calculation formula of the standard length of the over-center reference line is: lcs = 100×lc/lcr;
the calculation formula of the standard length of the radial far reference line is: le1s = 100×le1/le1r;
the calculation formula of the standard length of the tangential far reference line is: le2s = 100×le2/le2r;
wherein: lcs, le1s and le2s are respectively the over-center reference line standard length, the radial far reference line standard length and the tangential far reference line standard length; lc, le1 and le2 are respectively the pixel lengths of the over-center reference line, the radial far reference line and the tangential far reference line; lcr, le1r and le2r are respectively the actual lengths of the over-center reference target, the radial reference target and the tangential reference target.
9. The method for estimating the weight of a live pig based on a fish eye lens according to claim 2, wherein in S24:
the radial far reference line deformation is calculated as: R1 = le1s/lcs;
the tangential far reference line deformation is calculated as: R2 = le2s/lcs;
wherein: R1 and R2 are the radial far reference line deformation and the tangential far reference line deformation, respectively; lcs, le1s and le2s are the over-center reference line standard length, the radial far reference line standard length and the tangential far reference line standard length, respectively.
10. The method for estimating the weight of a live pig based on a fisheye lens according to claim 2, wherein the step of calculating S27 comprises:
the first step: calculating the distance from the live pig to the center point;
the second step: calculating the deformation of the live pig in the radial and tangential directions:
substituting the radial parameter and the distance from the live pig to the center point into the normal function gives the radial far reference line deformation of the live pig;
substituting the tangential parameter and the distance from the live pig to the center point into the normal function gives the tangential far reference line deformation of the live pig;
the third step: calculating the area compensation factor of the live pig according to its radial and tangential deformations;
the calculation formula of the area compensation factor of the live pigs is as follows:
wherein: dp is the distance from the live pig to the center point; f1(dp) is the tangential far reference line deformation of the live pig; f2(dp) is the radial far reference line deformation of the live pig; ls is the standard pixel length; Sc is the area compensation factor of the live pig.
CN202311768283.XA 2023-12-21 2023-12-21 Live pig weight estimation method based on fish eye lens Pending CN117745796A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311768283.XA CN117745796A (en) 2023-12-21 2023-12-21 Live pig weight estimation method based on fish eye lens


Publications (1)

Publication Number Publication Date
CN117745796A true CN117745796A (en) 2024-03-22

Family

ID=90281010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311768283.XA Pending CN117745796A (en) 2023-12-21 2023-12-21 Live pig weight estimation method based on fish eye lens

Country Status (1)

Country Link
CN (1) CN117745796A (en)

Similar Documents

Publication Publication Date Title
AU2019101786A4 (en) Intelligent pig group rearing weighing method and apparatus, electronic device and storage medium
US20230129428A1 (en) System for high performance, ai-based dairy herd management and disease detection
CN114255357A (en) Group-breeding pig identity identification and health monitoring method based on computer vision
CN110991222B (en) Object state monitoring and sow oestrus monitoring method, device and system
Lu et al. An automatic splitting method for the adhesive piglets’ gray scale image based on the ellipse shape feature
CN115861721B (en) Livestock and poultry breeding spraying equipment state identification method based on image data
CN112184791A (en) Yak weight prediction method based on CNN-LSTM neural network
CN114049577A (en) Fish specification measuring method and system
CN115115830A (en) Improved Transformer-based livestock image instance segmentation method
CN110287902B (en) Livestock and poultry survival detection method, device, equipment and computer program product
CN114627554A (en) Automatic aquaculture feeding centralized management method and system for aquatic products
CN111797831A (en) BIM and artificial intelligence based parallel abnormality detection method for poultry feeding
Tonachella et al. An affordable and easy-to-use tool for automatic fish length and weight estimation in mariculture
CN114898405A (en) Portable broiler chicken abnormity monitoring system based on edge calculation
CN116091786B (en) Holographic body ruler self-coding method, system, equipment and storage medium for pig weight estimation
CN117745796A (en) Live pig weight estimation method based on fish eye lens
CN112906510A (en) Fishery resource statistical method and system
CN115914560A (en) Intelligent accurate feeding method and device for sows, electronic equipment and storage medium
CN116295022A (en) Pig body ruler measurement method based on deep learning multi-parameter fusion
Wang et al. Research on application of smart agriculture in cotton production management
CN114022831A (en) Binocular vision-based livestock body condition monitoring method and system
CN111507432A (en) Intelligent weighing method and system for agricultural insurance claims, electronic equipment and storage medium
AU2021100096A4 (en) Ai-based crop recommendation system for smart farming towards agriculture 5.0
CN116452597B (en) Sow backfat high-precision determination method, system, equipment and storage medium
CN116576782B (en) Underwater fish body length measurement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination