CN117974749A - Chicken breast angle measurement method, system and medium based on visible light and infrared image fusion - Google Patents

Chicken breast angle measurement method, system and medium based on visible light and infrared image fusion Download PDF

Info

Publication number
CN117974749A
CN117974749A (application CN202410023426.2A)
Authority
CN
China
Prior art keywords
image
visible light
chicken breast
infrared
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410023426.2A
Other languages
Chinese (zh)
Inventor
张铁民
马闯
陈锐填
郑海坤
杨继康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Agricultural University
Original Assignee
South China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Agricultural University filed Critical South China Agricultural University
Priority to CN202410023426.2A priority Critical patent/CN117974749A/en
Publication of CN117974749A publication Critical patent/CN117974749A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a chicken breast angle measurement method, system and medium based on visible light and infrared image fusion. The method comprises the following steps: acquiring live chicken breast images captured by a registered visible light camera and infrared camera, wherein the live chicken breast images comprise a first visible light image and a first infrared image; fusing the first visible light image and the first infrared image through an image information fusion algorithm to obtain a fused image; performing image preprocessing on the generated fused image; and inputting the preprocessed fused image into a trained neural network model for calculation and outputting the result to obtain the breast angle of the live chicken. The invention realizes accurate, efficient, safe and reliable measurement of chicken breast angle parameters, improves on the current manual measurement of phenotype parameters, provides reference standards for breeding and production, and promotes the development of precision livestock farming in the poultry industry.

Description

Chicken breast angle measurement method, system and medium based on visible light and infrared image fusion
Technical Field
The invention relates to a chicken breast angle measurement method, a system and a medium based on visible light and infrared image fusion, and belongs to the technical field of intelligent poultry cultivation.
Background
The market demand for poultry products such as poultry meat and eggs is continuously increasing, and quality requirements are gradually rising. In poultry farming, the phenotypic parameters of poultry are important reference indicators for monitoring growth and development and for breeding. Accurate acquisition of poultry phenotype parameters can effectively guide breeding and production, improve the quality of poultry products and create higher economic benefits. However, poultry phenotype parameters are currently obtained mainly by manual measurement: measurement tools and methods are not uniform, a single person can hardly complete the measurement work, considerable human resources are consumed, and subjective human factors leave measurement standards inconsistent, making it difficult to meet the development requirements of precision livestock farming. Moreover, the birds are easily frightened and injured during measurement, which affects their health and production performance and fails to meet animal welfare requirements.
Disclosure of Invention
In view of the above, the invention provides a chicken breast angle measuring method, a device, a system, a computer device and a storage medium based on visible light and infrared image fusion, which can realize accurate, efficient, safe and reliable measurement of chicken breast angle parameters, improve the measurement mode of current phenotype parameters, provide reference standards for breeding and production, and promote the development of accurate livestock breeding in the poultry industry.
The first aim of the invention is to provide a chicken breast angle measurement method based on fusion of visible light and infrared images.
The second aim of the invention is to provide a chicken breast angle measuring device based on fusion of visible light and infrared images.
The third object of the invention is to provide a chicken breast angle measurement system based on fusion of visible light and infrared images.
A fourth object of the present invention is to provide a computer device.
A fifth object of the present invention is to provide a storage medium.
The first object of the present invention can be achieved by adopting the following technical scheme:
a chicken breast angle measurement method based on visible light and infrared image fusion, the method comprising:
acquiring live chicken breast images acquired by a registered visible light camera and an infrared camera, wherein the live chicken breast images comprise a first visible light image and a first infrared image;
fusing the first visible light image and the first infrared image through an image information fusion algorithm to obtain a fused image;
Performing image preprocessing on the generated fusion image;
inputting the fusion image subjected to image preprocessing into a trained neural network model for calculation, and outputting a calculation result to obtain the chicken breast angle of the live chicken.
Further, before the live chicken breast images acquired by the registered visible light camera and the registered infrared camera are acquired, the method further comprises:
the method comprises the steps of obtaining solid circle array calibration plate images collected by a visible light camera and an infrared camera, wherein the calibration plate images comprise a second visible light image and a second infrared image, and the placement position of the calibration plate is consistent with the position of a chicken breast in actual measurement;
traversing the circle centers in the second visible light image and the second infrared image respectively, wherein the obtained circle center coordinates are matching point coordinates;
Calculating the mapping relation from any point P0(x0, y0) in the second infrared image to the point P'(x', y') in the second visible light image according to the coordinates of the same matching points in the two images, wherein the mapping relation is a translation:

x' = x0 + dx, y' = y0 + dy

with the offset (dx, dy) obtained by minimizing the residual sum of squares over the matching points:

min RSS = min Σi [(x'i − xi − dx)² + (y'i − yi − dy)²]

wherein (xi, yi) is the pixel coordinate of the i-th matching point in the second infrared image, (x'i, y'i) is the corresponding pixel coordinate in the second visible light image, and dx and dy represent the offsets along the x axis and the y axis respectively; taking the partial derivatives of RSS with respect to dx and dy and setting them to 0 so that RSS is minimized gives the optimal dx and dy, which form the offset (dx, dy);
And according to the obtained mapping relation, taking a pixel coordinate system of the second visible light image as a reference, aligning pixel point coordinates of the second visible light image and the second infrared image through affine transformation, and realizing registration of the second visible light and the second infrared image.
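The least-squares offset described in the registration steps above has a closed-form solution: setting the partial derivatives of RSS with respect to dx and dy to 0 gives the offsets as the mean coordinate differences of the matching points. A minimal NumPy sketch (function names are illustrative):

```python
import numpy as np

def estimate_offset(ir_pts, vis_pts):
    """Least-squares translation between matched circle centers.

    Minimizing RSS = sum_i (x'_i - x_i - dx)^2 + (y'_i - y_i - dy)^2
    over dx and dy yields the mean coordinate difference in closed form.
    """
    ir_pts = np.asarray(ir_pts, dtype=float)
    vis_pts = np.asarray(vis_pts, dtype=float)
    dx, dy = (vis_pts - ir_pts).mean(axis=0)
    return dx, dy

def map_to_visible(pt, dx, dy):
    """Map a point P0(x0, y0) in the infrared image to P'(x', y')."""
    x0, y0 = pt
    return x0 + dx, y0 + dy
```

With the offset in hand, the infrared image can be aligned to the visible-light pixel grid by a pure-translation affine transform.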
Further, the fusing the first visible light image and the first infrared image by the image information fusion algorithm to obtain a fused image specifically includes:
Extracting a texture feature g from the first visible light image through a Gabor filter, and calculating the detail weight parameter w by rescaling g into the range [min_w, max_w]:

w = min_w + (g − min_g) / (max_g − min_g) × (max_w − min_w)

wherein max_g and min_g are the maximum and minimum values in texture feature g, respectively; max_w and min_w are the maximum and minimum values of the range of the detail weight parameter w, respectively;
respectively applying a Haar wavelet basis to the R, G, B color channels of the first visible light image and the first infrared image to obtain the approximation coefficients, horizontal detail coefficients, vertical detail coefficients and diagonal detail coefficients of each channel in each image, and calculating the wavelet coefficients of the fused image as the weighted combination:

cA_m = w · cA_vis + (1 − w) · cA_inf
cH_m = w · cH_vis + (1 − w) · cH_inf
cV_m = w · cV_vis + (1 − w) · cV_inf
cD_m = w · cD_vis + (1 − w) · cD_inf

wherein w is the detail weight parameter; the wavelet coefficients of the visible light image are denoted cA_vis, cH_vis, cV_vis and cD_vis; the wavelet coefficients of the infrared image are denoted cA_inf, cH_inf, cV_inf and cD_inf; and the wavelet coefficients of the fused image are denoted cA_m, cH_m, cV_m and cD_m;
carrying out noise reduction on the wavelet coefficients of the fused image through a mean filtering algorithm, and obtaining the three R, G, B color channels of the fused image through inverse wavelet transformation;
And merging the three color channels of the fusion image, carrying out normalization processing, and normalizing the numerical value of each channel to be within the range of 0-255, thereby obtaining the fusion image.
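The fusion steps above can be sketched for a single color channel as follows. The per-band weighted combination c_m = w·c_vis + (1 − w)·c_inf is an assumption (the patent's formula image is not reproduced in this text), and w is taken here as a scalar, whereas in the method it is derived from the Gabor texture feature:

```python
import numpy as np

def haar_dwt2(x):
    """One-level 2-D Haar transform: approximation cA and details cH, cV, cD."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    cA = (a + b + c + d) / 4
    cH = (a + b - c - d) / 4
    cV = (a - b + c - d) / 4
    cD = (a - b - c + d) / 4
    return cA, cH, cV, cD

def haar_idwt2(cA, cH, cV, cD):
    """Exact inverse of haar_dwt2 for even-sized inputs."""
    H, W = cA.shape
    x = np.empty((2 * H, 2 * W))
    x[0::2, 0::2] = cA + cH + cV + cD
    x[0::2, 1::2] = cA + cH - cV - cD
    x[1::2, 0::2] = cA - cH + cV - cD
    x[1::2, 1::2] = cA - cH - cV + cD
    return x

def fuse_channel(vis, inf, w):
    """Fuse one color channel as a w-weighted combination of Haar coefficients."""
    fused = [w * cv + (1 - w) * ci
             for cv, ci in zip(haar_dwt2(vis), haar_dwt2(inf))]
    return haar_idwt2(*fused)
```

In the full method, the fused coefficients would additionally be mean-filtered before the inverse transform, and the three reconstructed channels merged and normalized to 0-255.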
Furthermore, the neural network model is a regression network model formed by introducing an SE module and an SPP layer into a ResNet-50 backbone network;
The SE module is arranged behind the last BN layer of each residual block and sequentially comprises a self-adaptive pooling layer, a first linear layer, a first ReLU activation function, a second linear layer and a second ReLU activation function;
The SPP layer is arranged after the last convolution layer of the ResNet-50 backbone network and sequentially comprises a first convolution layer, a spatial pyramid pooling layer and a second convolution layer.
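A minimal NumPy sketch of the SE module layer order given above (adaptive pooling, linear, ReLU, linear, ReLU). The weight shapes and reduction ratio are illustrative; note that the standard squeeze-and-excitation block ends in a sigmoid rather than the second ReLU specified here:

```python
import numpy as np

def se_block(x, w1, b1, w2, b2):
    """Squeeze-and-excitation over a (C, H, W) feature map.

    Layer order follows the patent text: adaptive (global) average pooling,
    first linear + ReLU, second linear + ReLU, then channel-wise scaling.
    """
    s = x.mean(axis=(1, 2))                 # squeeze: (C,)
    z = np.maximum(w1 @ s + b1, 0.0)        # first linear layer + first ReLU
    scale = np.maximum(w2 @ z + b2, 0.0)    # second linear layer + second ReLU
    return x * scale[:, None, None]         # excitation: recalibrate channels
```

In the full model this block would sit after the last BN layer of each ResNet-50 residual block, operating on batched tensors.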
The second object of the invention can be achieved by adopting the following technical scheme:
A chicken breast angle measurement device based on visible light and infrared image fusion, the device comprising:
the acquisition module is used for acquiring live chicken breast images acquired by the registered visible light camera and the registered infrared camera, wherein the live chicken breast images comprise a first visible light image and a first infrared image;
the fusion module is used for fusing the first visible light image and the first infrared image through an image information fusion algorithm to obtain a fused image;
the preprocessing module is used for preprocessing the generated fusion image;
The measuring and calculating module is used for inputting the fusion image subjected to image preprocessing into a trained neural network model for measuring and calculating, and outputting a measuring and calculating result to obtain the chicken breast angle of the live chicken.
The third object of the present invention can be achieved by adopting the following technical scheme:
A chicken breast angle measurement system based on visible light and infrared image fusion comprises a chicken breast fixed acquisition device and a computer;
The chicken breast fixing and collecting device comprises a flexible clamp holder, an LED lamp, a frame, a visible light camera and an infrared camera. The flexible clamp holder, the LED lamp, the visible light camera and the infrared camera are arranged in the frame; the flexible clamp holder and the LED lamp are positioned parallel to each other; the visible light camera and the infrared camera are arranged in parallel with their shooting lenses facing the flexible clamp holder; and the visible light camera and the infrared camera are each connected to a computer;
the computer is used for executing the chicken breast angle measuring method.
Further, the infrared camera and the visible light camera are arranged at the top of the frame, with the shooting lenses aimed at the chest of the chicken vertically or downward at a certain angle, so as to photograph the keel process of the chicken and the breast on both sides;
or the infrared camera and the visible light camera are arranged at the bottom of the frame, with the shooting lenses aimed at the chest of the chicken vertically or upward at a certain angle, so as to photograph the keel process of the chicken and the breast on both sides.
Further, the flexible clamp holder is a rotary clamping part and comprises a rotating shaft, a rotating rod and a torsion spring, wherein the rotating shaft is arranged at the bottom of the rod piece, one end of the rotating rod is connected with the rotating shaft, the other end of the rotating rod is connected with the other end of the opening end of the clamping groove, one end of the torsion spring is arranged on the rod piece, and the other end of the torsion spring is arranged at the joint of the rotating shaft and the rotating rod;
Or the flexible clamp holder is a sliding block type clamping part and comprises a sliding block and a spring, one end of the spring is arranged on the rod piece, the other end of the spring is arranged on one side of the sliding block, and the other side of the sliding block is connected with one end of the opening end of the clamping groove.
Further, the system also comprises an identity recognition device which is connected with the computer and used for reading the identity information of the chicken and matching it with the measured chicken breast angle parameters.
The fourth object of the present invention can be achieved by adopting the following technical scheme:
The computer equipment comprises a processor and a memory for storing a program executable by the processor, and is characterized in that the chicken breast angle measuring method is realized when the processor executes the program stored by the memory.
The fifth object of the present invention can be achieved by adopting the following technical scheme:
A storage medium storing a program which, when executed by a processor, implements the chicken breast angle measurement method described above.
Compared with the prior art, the invention has the following beneficial effects:
The invention improves measurement efficiency while effectively reducing injury to the chickens; the fused-image chicken breast angle measurement algorithm offers high measurement speed, high precision and high stability, realizing accurate, reliable and standardized measurement of the live chicken breast angle.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to the structures shown in these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a front view of the chicken breast fixing and collecting device in embodiment 1 of the present invention.
Fig. 2 is a diagram showing the axial structure of a chicken breast fixing and collecting device according to embodiment 1 of the present invention.
Fig. 3 is a front view block diagram of one form of flexible holder of embodiment 1 of the present invention.
Fig. 4 is a diagram of the axial side structure of one form of the flexible holder of embodiment 1 of the present invention.
Fig. 5 is a front view of another form of flexible holder of embodiment 1 of the invention.
Fig. 6 is a diagram of an alternate form of axial side structure of the flexible holder of embodiment 1 of the present invention.
Fig. 7 is a flowchart of a chicken breast angle measurement method based on fusion of visible light and infrared images in embodiment 1 of the present invention.
Fig. 8 is a flowchart of the registration operation of the method for measuring chicken breast angle based on fusion of visible light and infrared images in embodiment 1 of the present invention.
Fig. 9 is an image fusion flow chart of a chicken breast angle measurement method based on fusion of visible light and infrared images in embodiment 1 of the present invention.
Fig. 10 is a block diagram of a chicken breast angle measuring device based on fusion of visible light and infrared images according to embodiment 2 of the present invention.
Fig. 11 is a block diagram showing the structure of a computer device according to embodiment 3 of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments, and all other embodiments obtained by persons of ordinary skill in the art without making any inventive effort based on the embodiments of the present invention are within the scope of protection of the present invention.
Example 1:
This embodiment provides a chicken breast angle measurement system based on visible light and infrared image fusion. The system comprises a chicken breast fixing and collecting device and a computer. As shown in fig. 1 and 2, the chicken breast fixing and collecting device comprises a flexible clamp holder 1, an LED lamp 2, a frame 3, universal wheels 4, a visible light camera 5 and an infrared camera 6. The frame 3 can be made of aluminum alloy or similar materials. The flexible clamp holder 1, the LED lamp 2, the visible light camera 5 and the infrared camera 6 are all arranged in the frame 3; the flexible clamp holder 1 and the LED lamp 2 are positioned parallel to each other; the visible light camera 5 and the infrared camera 6 are arranged in parallel with their shooting lenses facing the flexible clamp holder 1, so that chicken breast images of the chicken to be measured can be conveniently captured; and the visible light camera 5 and the infrared camera 6 are each connected to the computer. The universal wheels 4 are arranged at the bottom of the frame 3 to facilitate moving the measuring frame.
As shown in fig. 3 to 6, the flexible holder 1 of this embodiment is used for fixing the chicken and comprises a base 7, a frame 8, a rod 9, a plate 10, a holding strip, an adjusting component and a feather pressing plate 14. The upper end of the rod 9 is mounted on the base 7, the frame 8 is arranged in the middle of the base 7, the plate 10 is arranged at the bottom end of the frame 8, and the feather pressing plate 14 is arranged at the bottom end of the plate 10. The holding strip is made of a flexible material and is bent to form a clamping groove 11. The adjusting component is arranged at the bottom end of the rod 9 and comes in two forms: pull-out and rotary.
When the adjusting component is a pull-out adjusting component, as shown in figs. 3 and 4, it comprises a sliding block 12 and a spring 13. One end of the spring 13 is mounted on the rod 9 and the other end on one side of the sliding block 12; the other side of the sliding block 12 is connected with one end of the open end of the clamping groove 11. In use, moving the sliding block 12 horizontally drives the spring 13, and the clamping groove 11 is widened under the action of the sliding block 12. After the opening of the clamping groove 11 is adjusted, the feet of the chicken to be measured are placed in; the sliding block 12 is then released and moves back horizontally under the elastic force of the spring 13, so that the clamping groove 11 contracts and the clamping operation is completed.
When the adjusting component is a rotary adjusting component, as shown in figs. 5 and 6, it comprises a rotating shaft 15, a rotating rod 16 and a torsion spring 17. The rotating shaft 15 is arranged at the bottom of the rod 9; one end of the rotating rod 16 is connected with the rotating shaft 15 and the other end with the other end of the open end of the clamping groove 11; one end of the torsion spring 17 is mounted on the rod 9 and the other end at the joint of the rotating shaft 15 and the rotating rod 16. In use, the rotating rod 16 is held away from the frame 8 by the torsion spring 17, keeping the clamping groove 11 open. When the feet of the chicken to be measured are placed in the clamping groove 11, the weight of the chicken drives the rotating rod 16 to overcome the elastic force of the torsion spring 17: the rotating rod 16 rotates toward the frame 8 and completes the clamping operation, and the rotating shaft 15 limits the rotation angle of the rotating rod to 90 degrees.
Further, the visible light camera 5 and the infrared camera 6 of this embodiment are arranged at the top of the frame 3, with the shooting lenses aimed at the chest of the chicken vertically or downward at a certain angle, so as to photograph the keel process of the chicken and the breast on both sides. It will be appreciated that the visible light camera 5 and the infrared camera 6 may also be disposed at the bottom of the frame 3, with the shooting lenses aimed at the chest of the chicken vertically or upward at a certain angle, so as to photograph the keel process of the chicken and the breast on both sides.
Furthermore, the chicken breast angle measurement system of this embodiment may further comprise an identity recognition device. Chicken identity information can be stored in advance in devices such as wing tags and leg rings; the identity recognition device is connected with the computer and can read the chicken identity information by means of radio frequency identification, a barcode scanner, keyboard input and the like, and match it with the measured chicken breast angle parameters.
As shown in fig. 7, the embodiment provides a chicken breast angle measurement method based on fusion of visible light and infrared images, which can be realized by the computer and comprises the following steps:
S701, acquiring live chicken breast images acquired by a visible light camera and an infrared camera after registration.
Before step S701, the visible light camera and the infrared camera need to be registered, as shown in fig. 8, specifically including:
s801, acquiring solid circle array calibration plate images acquired by a visible light camera and an infrared camera.
The solid circle array calibration plate of this embodiment uses a 5×6 solid circle array pattern. The patterned face of the calibration plate directly faces the visible light camera and the infrared camera, and the placement position of the calibration plate is consistent with the position of the chicken breast during actual measurement, i.e., the distance from the calibration plate to the camera lenses matches the distance of the chicken breast during actual measurement. Both cameras photograph the plate simultaneously, and the obtained calibration plate images include a second visible light image and a second infrared image.
S802, traversing circle centers in the second visible light image and the second infrared image respectively, wherein the obtained circle center coordinates are matching point coordinates.
S803, calculating the mapping relation from any point P0(x0, y0) in the second infrared image to the point P'(x', y') in the second visible light image according to the coordinates of the same matching points in the two images, wherein the mapping relation is a translation:

x' = x0 + dx, y' = y0 + dy

with the offset (dx, dy) obtained by minimizing the residual sum of squares over the matching points:

min RSS = min Σi [(x'i − xi − dx)² + (y'i − yi − dy)²]

wherein (xi, yi) is the pixel coordinate of the i-th matching point in the second infrared image, (x'i, y'i) is the corresponding pixel coordinate in the second visible light image, and dx and dy represent the offsets along the x axis and the y axis respectively; taking the partial derivatives of RSS with respect to dx and dy and setting them to 0 so that RSS is minimized gives the optimal dx and dy, which form the offset (dx, dy).
S804, according to the obtained mapping relation, the pixel coordinate system of the second visible light image is taken as a reference, and the pixel point coordinates of the second visible light image and the pixel point coordinates of the second infrared image are aligned through affine transformation, so that the registration of the second visible light and the second infrared image is realized.
After the registration of the visible light camera and the infrared camera is completed, the flexible clamp holder is used to fix the live chicken. When acquiring breast angle images of the live chicken, the LED lamp providing the light source is set to 738.2 cd with a color temperature of 6000 K; the visible light camera is set to a focal length of 6 mm, a focusing distance of 300-450 mm, an aperture of F2.8, an exposure time of 50000 μs and continuous automatic gain; and the focusing distance of the infrared camera is set to 300-450 mm. The infrared camera and the visible light camera are fixed at the top of the measuring frame with the camera lenses pointing vertically downward at the breast of the chicken, and the collected live chicken breast images comprise a first visible light image and a first infrared image.
S702, fusing the first visible light image and the first infrared image through an image information fusion algorithm to obtain a fused image.
Further, as shown in fig. 9, the image information fusion algorithm of the present embodiment includes the steps of:
S901, extracting a texture feature g from the first visible light image through a Gabor filter of size (7, 7), and calculating the detail weight parameter w by rescaling g into the range [min_w, max_w]:

w = min_w + (g − min_g) / (max_g − min_g) × (max_w − min_w)

wherein max_g and min_g are the maximum and minimum values in texture feature g, respectively; max_w and min_w are the maximum and minimum values, respectively, of the range of the detail weight parameter w.
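The rescaling of the texture feature into the weight range can be sketched as follows; the bounds min_w = 0.4 and max_w = 0.8 are illustrative defaults, not values taken from the patent:

```python
import numpy as np

def detail_weight(g, min_w=0.4, max_w=0.8):
    """Rescale a Gabor texture response g into the range [min_w, max_w].

    min_w and max_w are illustrative values, not specified in the patent.
    """
    g = np.asarray(g, dtype=float)
    return min_w + (g - g.min()) / (g.max() - g.min()) * (max_w - min_w)
```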
S902, respectively applying a Haar wavelet basis to the R, G, B color channels of the first visible light image and the first infrared image to obtain the approximation coefficients, horizontal detail coefficients, vertical detail coefficients and diagonal detail coefficients of each channel in each image, and calculating the wavelet coefficients of the fused image as the weighted combination:

cA_m = w · cA_vis + (1 − w) · cA_inf
cH_m = w · cH_vis + (1 − w) · cH_inf
cV_m = w · cV_vis + (1 − w) · cV_inf
cD_m = w · cD_vis + (1 − w) · cD_inf

wherein w is the detail weight parameter; the wavelet coefficients of the visible light image are denoted cA_vis, cH_vis, cV_vis and cD_vis; the wavelet coefficients of the infrared image are denoted cA_inf, cH_inf, cV_inf and cD_inf; and the wavelet coefficients of the fused image are denoted cA_m, cH_m, cV_m and cD_m.
S903, carrying out noise reduction on the wavelet coefficients of the fused image through a mean filtering algorithm (kernel size 3), and obtaining the three R, G, B color channels of the fused image through inverse wavelet transformation.
S904, combining the three color channels of the fusion image, carrying out normalization processing, and normalizing the numerical value of each channel to be in the range of 0-255, thereby obtaining the fusion image.
S703, performing image preprocessing on the generated fusion image.
Specifically, taking the position of the chicken breast keel as the center, the fused image is scaled and then cropped to a uniform size of 224 × 224, and finally denoised to improve image quality.
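The preprocessing step can be sketched as follows, assuming a square crop centered on the keel followed by nearest-neighbour scaling to 224 × 224 (the patent does not state the interpolation method, and the denoising step is omitted here):

```python
import numpy as np

def preprocess(img, keel_center, out_size=224):
    """Crop a square region centered on the keel and resize to out_size."""
    cy, cx = keel_center
    # largest square crop that stays inside the image
    half = min(cy, cx, img.shape[0] - cy, img.shape[1] - cx)
    crop = img[cy - half:cy + half, cx - half:cx + half]
    # nearest-neighbour index mapping from the crop to the output grid
    idx = (np.arange(out_size) * crop.shape[0] / out_size).astype(int)
    return crop[np.ix_(idx, idx)]
```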
S704, inputting the fusion image subjected to image preprocessing into a trained neural network model for calculation, and outputting a calculation result to obtain the chicken breast angle of the live chicken.
The neural network model of this embodiment is a regression network model formed by introducing an SE module and an SPP layer into a ResNet-50 backbone network. The SE module is arranged after the last BN layer of each residual block and sequentially comprises an adaptive pooling layer, a first linear layer, a first ReLU activation function, a second linear layer and a second ReLU activation function. The SPP layer is arranged after the last convolution layer of the ResNet-50 backbone network and sequentially comprises a first convolution layer, a spatial pyramid pooling layer and a second convolution layer, wherein the spatial pyramid pooling layer connects the features obtained by the first convolution layer with the features obtained by average-pooling them in grids of 5 × 5, 9 × 9 and 13 × 13, respectively.
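The spatial pyramid pooling described above can be sketched as follows; interpreting "connecting" the pooled grids as flattening and concatenating them is an assumption, since the patent does not detail the connection:

```python
import numpy as np

def adaptive_avg_pool(x, n):
    """Average-pool a (C, H, W) feature map into an n x n grid."""
    C, H, W = x.shape
    out = np.empty((C, n, n))
    for i in range(n):
        h0, h1 = (i * H) // n, -((-(i + 1) * H) // n)   # floor / ceil bin edges
        for j in range(n):
            w0, w1 = (j * W) // n, -((-(j + 1) * W) // n)
            out[:, i, j] = x[:, h0:h1, w0:w1].mean(axis=(1, 2))
    return out

def spp_features(x, grids=(5, 9, 13)):
    """Concatenate flattened multi-scale average-pooled features."""
    return np.concatenate([adaptive_avg_pool(x, n).ravel() for n in grids])
```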
The neural network model of this embodiment is trained as follows: after the shooting angle is determined, a large number of images taken at the same shooting angle are collected as a data set; the breast angle of each photographed chicken is measured manually and used as the label of the corresponding image; and the neural network model is trained using the images in the data set and their corresponding labels.
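The training procedure above amounts to supervised regression against manually measured breast angles. As a stand-in sketch, the following fits a linear model with gradient descent on an MSE loss over synthetic data; the patent itself trains the ResNet-50-based network described earlier, and all names and data here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic stand-in "features" and manually measured angle "labels"
X = rng.normal(size=(200, 16))
true_w = rng.normal(size=16)
y = X @ true_w + rng.normal(scale=0.01, size=200)

w = np.zeros(16)
lr = 0.05
for _ in range(500):
    grad = 2 / len(X) * X.T @ (X @ w - y)   # gradient of the MSE loss
    w -= lr * grad

mse = np.mean((X @ w - y) ** 2)
```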
It should be noted that while the method operations of the above embodiments are described in a particular order, this does not require or imply that the operations must be performed in that particular order or that all of the illustrated operations be performed in order to achieve desirable results. Rather, the depicted steps may change the order of execution. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform.
Example 2:
As shown in fig. 10, this embodiment provides a chicken breast angle measurement device based on fusion of visible light and infrared images, which includes an acquisition module 1001, a fusion module 1002, a preprocessing module 1003 and a measurement module 1004, and specific functions of each module are as follows:
An acquisition module 1001, configured to acquire live chicken breast images acquired by the registered visible light camera and the registered infrared camera, where the live chicken breast images include a first visible light image and a first infrared image;
the fusion module 1002 is configured to fuse the first visible light image with the first infrared image through an image information fusion algorithm, so as to obtain a fused image;
a preprocessing module 1003, configured to perform image preprocessing on the generated fusion image;
the calculation module 1004 is configured to input the preprocessed fused image into a trained neural network model for calculation, and to output the calculation result to obtain the chicken breast angle of the live chicken.
For the specific implementation of each module described above, refer to Embodiment 1. It should be noted that the apparatus provided in this embodiment is only illustrated by the division of the above functional modules; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure may be divided into different functional modules to perform all or part of the functions described above.
Example 3:
This embodiment provides a computer device, as shown in fig. 11, which includes a processor 1102, a memory, an input device 1103, a display device 1104 and a network interface 1105 connected through a system bus 1101. The processor provides computing and control capabilities. The memory includes a nonvolatile storage medium 1106 and an internal memory 1107; the nonvolatile storage medium 1106 stores an operating system, a computer program and a database, while the internal memory 1107 provides an environment for running the operating system and the computer program in the nonvolatile storage medium. When the processor 1102 executes the computer program stored in the memory, the chicken breast angle measurement method of the above Embodiment 1 is implemented as follows:
acquiring live chicken breast images acquired by a registered visible light camera and an infrared camera, wherein the live chicken breast images comprise a first visible light image and a first infrared image;
fusing the first visible light image and the first infrared image through an image information fusion algorithm to obtain a fused image;
Performing image preprocessing on the generated fusion image;
inputting the fusion image subjected to image preprocessing into a trained neural network model for calculation, and outputting a calculation result to obtain the chicken breast angle of the live chicken.
Example 4:
This embodiment provides a storage medium, which is a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the chicken breast angle measurement method of the above Embodiment 1 is implemented as follows:
acquiring live chicken breast images acquired by a registered visible light camera and an infrared camera, wherein the live chicken breast images comprise a first visible light image and a first infrared image;
fusing the first visible light image and the first infrared image through an image information fusion algorithm to obtain a fused image;
Performing image preprocessing on the generated fusion image;
inputting the fusion image subjected to image preprocessing into a trained neural network model for calculation, and outputting a calculation result to obtain the chicken breast angle of the live chicken.
The computer-readable storage medium of this embodiment may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In this embodiment, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer-readable signal medium, by contrast, may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic or optical forms, or any suitable combination thereof. A computer-readable signal medium may also be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable storage medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber-optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
Computer program code for carrying out the operations of the present embodiments may be written in one or more programming languages, or combinations thereof, including object-oriented programming languages such as Java, Python and C++, as well as conventional procedural programming languages such as the C language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
In conclusion, the present invention improves measurement efficiency while effectively reducing injury to the chickens; the chicken breast angle measurement algorithm based on the fused image offers high measurement speed, high precision and high stability, realizing accurate, reliable and standardized measurement of the live chicken breast angle.
The above description covers only the preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto; any equivalent substitution or modification made by a person skilled in the art according to the technical solution and inventive concept of the present invention, within the scope disclosed herein, shall fall within the protection scope of the present invention.

Claims (10)

1. A chicken breast angle measurement method based on visible light and infrared image fusion, characterized in that the method comprises the following steps:
acquiring live chicken breast images acquired by a registered visible light camera and an infrared camera, wherein the live chicken breast images comprise a first visible light image and a first infrared image;
fusing the first visible light image and the first infrared image through an image information fusion algorithm to obtain a fused image;
Performing image preprocessing on the generated fusion image;
inputting the fusion image subjected to image preprocessing into a trained neural network model for calculation, and outputting a calculation result to obtain the chicken breast angle of the live chicken.
2. The chicken breast angle measurement method of claim 1, wherein prior to acquiring live chicken breast images acquired by the registered visible and infrared cameras, further comprising:
acquiring images of a solid-circle-array calibration plate collected by the visible light camera and the infrared camera, wherein the calibration plate images comprise a second visible light image and a second infrared image, and the placement position of the calibration plate is consistent with the position of the chicken breast in actual measurement;
traversing the circle centers in the second visible light image and the second infrared image respectively, wherein the obtained circle center coordinates are matching point coordinates;
calculating the mapping relation from any point P0(x0, y0) in the second infrared image to the point P'(x', y') in the second visible light image according to the coordinates of the same matching points in the two images, the mapping relation being:
P'(x', y') = P0(x0 + dx, y0 + dy), with (dx, dy) = argmin RSS, RSS = Σ_i [(x_i + dx − x'_i)² + (y_i + dy − y'_i)²];
wherein (x_i, y_i) are the pixel coordinates of the i-th matching point in the second infrared image and (x'_i, y'_i) the corresponding pixel coordinates in the second visible light image; dx and dy denote the offsets along the x-axis and y-axis, respectively; RSS is the residual sum of squares; and minimizing RSS means that the optimal dx and dy are obtained by taking the partial derivatives of RSS with respect to dx and dy and setting them to 0, the resulting optimal values being the offset (dx, dy);
and according to the obtained mapping relation, taking the pixel coordinate system of the second visible light image as a reference, aligning the pixel coordinates of the second visible light image and the second infrared image through affine transformation, thereby realizing registration of the second visible light image and the second infrared image.
3. The chicken breast angle measurement method according to claim 1, wherein the fusing of the first visible light image and the first infrared image by the image information fusion algorithm to obtain a fused image specifically comprises:
extracting texture features g from the first visible light image through a Gabor filter, and calculating the detail weight parameter w by the following formula:
w = min_w + (g − min_g) / (max_g − min_g) · (max_w − min_w)
wherein max_g and min_g are the maximum and minimum values of the texture feature g, respectively, and max_w and min_w are the maximum and minimum values of the range of the detail weight parameter w, respectively;
respectively applying a Haar wavelet basis to the R, G and B color channels of the first visible light image and the first infrared image, obtaining the approximation coefficients, horizontal detail coefficients, vertical detail coefficients and diagonal detail coefficients of every channel of each image, and calculating the wavelet coefficients of the fused image from them;
wherein w is the detail weight parameter; the wavelet coefficients of the visible light image are denoted cA_vis, cH_vis, cV_vis and cD_vis; the wavelet coefficients of the infrared image are denoted cA_inf, cH_inf, cV_inf and cD_inf; and the wavelet coefficients of the fused image are denoted cA_m, cH_m, cV_m and cD_m;
denoising the wavelet coefficients of the fused image with a mean filtering algorithm, and obtaining the three R, G and B color channels of the fused image through the inverse wavelet transform;
and merging the three color channels of the fused image and performing normalization, scaling the values of each channel into the range 0-255, thereby obtaining the fused image.
4. The chicken breast angle measurement method of claim 1, wherein the neural network model is a regression network model formed by introducing an SE module and an SPP layer based on ResNet-50 backbone networks;
The SE module is arranged behind the last BN layer of each residual block and sequentially comprises a self-adaptive pooling layer, a first linear layer, a first ReLU activation function, a second linear layer and a second ReLU activation function;
the SPP layer is arranged after the last convolutional layer of the ResNet-50 backbone network and sequentially comprises a first convolution layer, a spatial pyramid pooling layer and a second convolution layer.
5. A chicken breast angle measurement system based on the fusion of visible light and infrared images, characterized by comprising a chicken breast fixing and acquisition device and a computer;
the chicken breast fixing and acquisition device comprises a flexible clamp holder, an LED lamp, a frame, a visible light camera and an infrared camera; the flexible clamp holder, the LED lamp, the visible light camera and the infrared camera are arranged in the frame, the flexible clamp holder and the LED lamp being arranged parallel to each other; the visible light camera and the infrared camera are arranged side by side with their lenses facing the flexible clamp holder, and each is connected to the computer;
and the computer is configured to perform the chicken breast angle measurement method of any one of claims 1-4.
6. The chicken breast angle measurement system of claim 5, wherein the infrared camera and the visible light camera are arranged at the top of the frame, so that their lenses point vertically, or downward at an angle, at the chicken breast to photograph the breastbone process and the breast muscles on both sides;
or the infrared camera and the visible light camera are arranged at the bottom of the frame, so that their lenses point vertically, or upward at an angle, at the chicken breast to photograph the breastbone process and the breast muscles on both sides.
7. The chicken breast angle measurement system of claim 5, wherein the flexible clamp holder is a rotary clamping component comprising a rotating shaft, a rotating rod and a torsion spring, the rotating shaft being arranged at the bottom of the rod piece, one end of the rotating rod being connected with the rotating shaft and the other end being connected with the other end of the clamping groove, and one end of the torsion spring being mounted on the rod piece while the other end is mounted at the joint of the rotating shaft and the rotating rod;
or the flexible clamp holder is a slider-type clamping component comprising a slider and a spring, one end of the spring being mounted on the rod piece and the other end on one side of the slider, the other side of the slider being connected with one end of the open end of the clamping groove.
8. The chicken breast angle measurement system of claim 5, further comprising an identification device connected to the computer for reading chicken identity information and matching it with the measured chicken breast angle parameters.
9. A computer device comprising a processor and a memory for storing a program executable by the processor, characterized in that the processor, when executing the program stored in the memory, implements the chicken breast angle measurement method of any one of claims 1-4.
10. A storage medium storing a program, wherein the program, when executed by a processor, implements the chicken breast angle measurement method of any one of claims 1-4.
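The offset estimation described in claim 2 has a closed-form least-squares solution: setting the partial derivatives of RSS with respect to dx and dy to zero yields dx = mean(x' − x) and dy = mean(y' − y). A minimal sketch follows; the matched circle-centre coordinates are hypothetical illustration values, not measurements.

```python
import numpy as np

def estimate_offset(pts_ir, pts_vis):
    """Least-squares translation (dx, dy) between matched points.
    Minimising RSS = sum((x_i + dx - x'_i)^2 + (y_i + dy - y'_i)^2)
    by zeroing the partial derivatives gives the closed form
    dx = mean(x' - x), dy = mean(y' - y)."""
    pts_ir = np.asarray(pts_ir, dtype=float)
    pts_vis = np.asarray(pts_vis, dtype=float)
    d = (pts_vis - pts_ir).mean(axis=0)
    return float(d[0]), float(d[1])

# Matched circle centres (hypothetical values for illustration).
ir = [(10, 20), (40, 22), (70, 24)]
vis = [(13, 25), (43, 27), (73, 29)]
dx, dy = estimate_offset(ir, vis)
print(dx, dy)  # 3.0 5.0
```

The resulting (dx, dy) would then parameterize the affine (pure-translation) alignment of the infrared image onto the visible-light pixel grid.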
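The Haar-based fusion of claim 3 can be sketched with a hand-rolled single-level 2-D Haar transform. The exact coefficient-weighting formula is not reproduced in this translation, so the combination w·visible + (1 − w)·infrared used below is an assumption for illustration; `fuse_channel` would be applied to each of the R, G and B channels before the mean-filter denoising and 0-255 normalization steps.

```python
import numpy as np

def haar2(img):
    """Single-level 2-D Haar transform of an even-sized 2-D array."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    cA = (a + b + c + d) / 2    # approximation coefficients
    cH = (a + b - c - d) / 2    # horizontal detail
    cV = (a - b + c - d) / 2    # vertical detail
    cD = (a - b - c + d) / 2    # diagonal detail
    return cA, cH, cV, cD

def ihaar2(cA, cH, cV, cD):
    """Inverse of haar2."""
    h, w = cA.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = (cA + cH + cV + cD) / 2
    out[0::2, 1::2] = (cA + cH - cV - cD) / 2
    out[1::2, 0::2] = (cA - cH + cV - cD) / 2
    out[1::2, 1::2] = (cA - cH - cV + cD) / 2
    return out

def fuse_channel(vis, ir, w):
    """Fuse one colour channel: assumed weighted combination of the
    Haar coefficients, w * visible + (1 - w) * infrared."""
    coeffs = [w * v + (1 - w) * i for v, i in zip(haar2(vis), haar2(ir))]
    return ihaar2(*coeffs)

w = 0.7                          # detail weight from the Gabor step
vis = np.full((4, 4), 200.0)     # toy visible-light channel
ir = np.full((4, 4), 100.0)      # toy infrared channel
fused = fuse_channel(vis, ir, w)
print(fused[0, 0])  # 170.0
```

Because the transform is linear, fusing constant channels reduces to the pixel-wise weighted average, which makes the sketch easy to check.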
CN202410023426.2A 2024-01-08 2024-01-08 Chicken breast angle measurement method, system and medium based on visible light and infrared image fusion Pending CN117974749A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410023426.2A CN117974749A (en) 2024-01-08 2024-01-08 Chicken breast angle measurement method, system and medium based on visible light and infrared image fusion

Publications (1)

Publication Number Publication Date
CN117974749A true CN117974749A (en) 2024-05-03

Family

ID=90865410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410023426.2A Pending CN117974749A (en) 2024-01-08 2024-01-08 Chicken breast angle measurement method, system and medium based on visible light and infrared image fusion

Country Status (1)

Country Link
CN (1) CN117974749A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination