CN111626985A - Poultry body temperature detection method based on image fusion and poultry house inspection system - Google Patents
Poultry body temperature detection method based on image fusion and poultry house inspection system
- Publication number
- CN111626985A (application number CN202010314114.9A)
- Authority
- CN
- China
- Prior art keywords
- image
- poultry
- fusion
- body temperature
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0012 — Biomedical image inspection
- G01J5/0025 — Radiation pyrometry for sensing the radiation of moving living bodies
- G06T5/10 — Image enhancement or restoration using non-spatial domain filtering
- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G01J2005/0077 — Imaging
- G06T2207/10048 — Infrared image
- G06T2207/20064 — Wavelet transform [DWT]
- G06T2207/20081 — Training; Learning
- G06T2207/20084 — Artificial neural networks [ANN]
- G06T2207/20221 — Image fusion; Image merging
Abstract
The invention relates to the technical field of poultry breeding and discloses a poultry body temperature detection method based on image fusion, together with a poultry house inspection system. The detection method comprises the following steps: acquiring a thermal imaging image and a visible light image of the poultry; fusing the thermal imaging image with the visible light image to obtain a fused image; establishing a deep neural network model to perform feature recognition on the key parts of the poultry in the fused image, reading the temperature information of the key parts from the fused image, and converting that temperature information into the body temperature information of the poultry. By fusing thermal imaging with visible light images and combining deep-neural-network feature recognition with the corresponding temperature conversion, the method achieves intelligent detection of poultry body temperature, ensures the accuracy of the detection result, meets the inspection requirements of poultry houses under various breeding environments, greatly improves inspection efficiency, and reduces poultry breeding cost.
Description
Technical Field
The invention relates to the technical field of poultry breeding, in particular to a poultry body temperature detection method based on image fusion and a poultry house inspection system.
Background
In the poultry breeding process, inspection of the poultry house is basic work. Daily inspection makes it possible to follow the health condition of the poultry at any time and to discover problems in the breeding process, so that precautionary measures can be taken in time and breeding returns improved. Inspection mainly covers the mental state and motion state of the poultry, the temperature and humidity of the house, ventilation, drinking and feeding conditions, illumination, excrement, and so on. Poultry house inspection places high demands on the inspectors, who are required to have a certain degree of professional knowledge.
At present, poultry houses are mainly inspected manually, which suffers from high labor cost, low efficiency, a harsh working environment, and a risk of cross infection through biological contact. Replacing manual inspection with equipment or robots is therefore an urgent need, and the key problem is how to identify behavioral and physiological abnormalities of the poultry in the house.
In poultry farming, laying hens have already been raised on a large scale in poultry houses, and the inspection robots and inspection schemes described below have been proposed:
Among existing inspection robots, one type is a ground-walking robot that carries an artificial intelligence program and 2D/3D electronic image recorders, walks along a basic structure inside the poultry house, and can distinguish chickens, equipment, eggs on the floor, and dead chickens. Another type is ceiling-mounted and moves on a simple rail inside the house to monitor management problems of the poultry house, such as health problems of the laying hens and the location of dead birds.
Among existing inspection schemes, one determines, through the low-level image algorithms of a laying-hen health-behavior monitoring robot, a method for quickly identifying dead chickens that suits the modern laying-hen breeding process: with the laying hen as the research object, the behavior of a single hen is recognized by machine vision, the collected images are preprocessed by image segmentation and filtering, and the hen's behaviors are recognized and classified. A method for automatically detecting sick chickens based on machine-vision recognition of comb color has also been proposed: an Otsu segmentation method handles the uneven lighting inside the house, dilation and erosion reduce noise interference, a support vector machine classifier is then trained on the processed image information, and suspected sick chickens are screened by the classifier threshold, located, and reported by alarm.
However, body temperature is an important index of the health state of laying hens, and existing inspection schemes simply collect it with a thermal imaging camera. Because of the characteristics of poultry house breeding, the house usually needs few light sources and is relatively dark. Although thermal imaging is only slightly affected by weather and external light sources, it is sensitive to changes in ambient temperature and humidity, and infrared thermal imaging is typically used only to detect an abnormality at the hottest point of a whole frame, so it cannot monitor the body temperature of every chicken in the frame. The visible light image is unaffected by ambient temperature but is easily affected by external light sources, and the lighting of a breeding house is dim. Current body temperature inspection therefore uses only the temperature information of infrared thermal imaging and ignores the rich useful information contained in the visible light image; it neither combines the advantages of infrared thermal imaging with the corresponding visible light information nor accounts for the influence of ambient temperature and humidity on thermal imaging. The body temperature data collected for the hens are consequently inaccurate, which impairs accurate judgment of their health state.
Meanwhile, although some fusion algorithms have been studied preliminarily, they do not consider weighting factors and cannot analyze the content of the source images, so the fused image usually loses or distorts information. As a result, infrared thermal imaging and visible light images cannot in practice be fused by the corresponding technical means, images meeting the inspection requirements cannot be obtained, and the body temperature information of laying hens cannot be acquired more accurately.
Disclosure of Invention
The embodiment of the invention provides a poultry body temperature detection method based on image fusion, which solves the problem that poultry body temperature is difficult to obtain accurately from thermal imaging alone during current poultry house inspection.
The embodiment of the invention further provides a poultry house inspection system that uses the image-fusion-based poultry body temperature detection method.
To solve the above technical problem, one aspect of the embodiments of the present invention provides a poultry body temperature detection method based on image fusion, comprising: S1, acquiring a thermal imaging image and a visible light image of the poultry; S2, fusing the thermal imaging image with the visible light image to obtain a fused image containing the heat-source characteristic information and the visible light information of the poultry; and S3, establishing a deep neural network model, performing feature recognition on the key parts of the poultry in the fused image, reading the temperature information of the key parts from the fused image, and converting that temperature information into the body temperature information of the poultry, wherein the deep neural network model is trained with the fused images acquired in S2 as samples and the corresponding fused images annotated with the key parts of the poultry as labels.
The method further comprises: S4, comparing the body temperature information of the poultry with the normal body temperature threshold of poultry in the corresponding growth period, and identifying poultry with abnormal body temperature.
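Step S4 amounts to a threshold comparison. A minimal sketch in Python, where the bird ids, the dictionary format, and the example threshold band are illustrative assumptions (the patent does not fix numeric values):

```python
def flag_abnormal(body_temps, lo_c, hi_c):
    """Return ids of birds whose converted body temperature falls outside
    the normal band [lo_c, hi_c] for their growth period."""
    return sorted(bid for bid, t in body_temps.items()
                  if not (lo_c <= t <= hi_c))

# Example: bird 'h2' is flagged as abnormal against a 40.5-42.0 deg C band.
print(flag_abnormal({'h1': 41.2, 'h2': 43.1, 'h3': 40.9}, 40.5, 42.0))
```

In a full system the thresholds would be looked up per growth period, as the claim specifies.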
The image fusion of the thermal imaging image and the visible light image in S2 further comprises: S21, preprocessing the images to obtain a thermal imaging image and a visible light image of the same size; S22, registering the images with the visible light image as the base image and the thermal imaging image as the input image; and S23, decomposing the base image and the input image with a binary-tree discrete wavelet transform, fusing the wavelet coefficients of the decomposed images, and obtaining the fused image by the inverse binary-tree discrete wavelet transform.
S23 comprises fusing the wavelet coefficients of the decomposed images by the following formula:

F(x, y) = w1 · M(x, y) + w2 · N(x, y)

where F(x, y) is the fused wavelet coefficient, M(x, y) is the wavelet coefficient of the preprocessed visible image, N(x, y) is the wavelet coefficient of the preprocessed thermographic image, w1 is the weight of M(x, y), and w2 is the weight of N(x, y); the values of w1 and w2 are determined by a self-correcting particle swarm optimization algorithm.
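As an illustration of S21-S23 and the weighted coefficient fusion above, the sketch below uses a single-level 2-D Haar wavelet implemented with NumPy. This is a simpler stand-in for the binary-tree discrete wavelet transform named in the claim, and the weights w1, w2 are taken as given here:

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2-D Haar decomposition into LL and (LH, HL, HH) sub-bands."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    return (a + b + c + d) / 4, ((a + b - c - d) / 4,
                                 (a - b + c - d) / 4,
                                 (a - b - c + d) / 4)

def haar_idwt2(ll, highs):
    """Inverse of haar_dwt2: reassemble the image from its sub-bands."""
    lh, hl, hh = highs
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll + lh - hl - hh
    out[1::2, 0::2] = ll - lh + hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out

def fuse(vis, thermal, w1, w2):
    """Per-sub-band weighted fusion F = w1*M + w2*N, then inverse transform."""
    ll_v, hi_v = haar_dwt2(vis)
    ll_t, hi_t = haar_dwt2(thermal)
    fused_ll = w1 * ll_v + w2 * ll_t
    fused_hi = tuple(w1 * v + w2 * t for v, t in zip(hi_v, hi_t))
    return haar_idwt2(fused_ll, fused_hi)
```

Because the transform is linear, fusing an image with itself under any weights summing to 1 returns the image unchanged, which is a convenient sanity check on the implementation.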
The method further comprises: optimizing the wavelet fusion coefficients with the self-correcting particle swarm optimization algorithm to obtain the relation between w1 and w2 shown in the following formula, and, when the fused image is determined to have reached the preset matching degree and converged, acquiring the optimal set of weight values w1 and w2 from the weight matrix:

w = max{ Entropy(w1i · M + w2i · N) }, 1 ≤ i ≤ Y

where w is the maximum entropy of the fused image, w1i is the i-th weight value of w1, w2i is the i-th weight value of w2, and Y is a natural number greater than 1.
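The self-correcting variant of the particle swarm is not detailed in this text; a plain global-best particle swarm over w1 (with w2 = 1 - w1), maximizing the grey-level entropy of a weighted pixel blend as a stand-in fitness for the wavelet-domain objective, can be sketched as follows. All parameter values are illustrative assumptions:

```python
import numpy as np

def entropy(img, bins=64):
    """Shannon entropy (bits) of the image's grey-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def pso_fusion_weights(vis, thermal, n=20, iters=40, seed=0):
    """Search w1 in [0, 1] (w2 = 1 - w1) that maximizes fused-image entropy."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, 1, n)          # particle positions: candidate w1 values
    vel = np.zeros(n)
    fit = lambda w1: entropy(np.clip(w1 * vis + (1 - w1) * thermal, 0, 1))
    pbest = pos.copy()
    pbest_f = np.array([fit(p) for p in pos])
    gbest = pbest[pbest_f.argmax()]
    for _ in range(iters):
        r1, r2 = rng.uniform(size=(2, n))
        # inertia 0.7, cognitive/social factors 1.5: common illustrative choices
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        f = np.array([fit(p) for p in pos])
        better = f > pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmax()]
    return float(gbest), float(1.0 - gbest)
```

The "self-correcting" behavior of the patent's algorithm (adapting its own parameters during the search) would replace the fixed inertia and acceleration factors used here.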
The deep neural network model is established based on the YOLO image recognition algorithm, and the key parts of the poultry comprise the poultry head.
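Once the detector returns a bounding box for a head, reading its temperature is an index into the radiometric temperature matrix aligned with the fused image. A minimal sketch, where the matrix alignment, the (x1, y1, x2, y2) box format, and the example values are assumptions:

```python
import numpy as np

def box_temperature(temp_matrix, box):
    """Maximum temperature (deg C) inside a detected head box.

    temp_matrix: per-pixel radiometric temperatures aligned with the fused
    image; box: (x1, y1, x2, y2) in pixel coordinates, exclusive on x2/y2.
    """
    x1, y1, x2, y2 = box
    return float(temp_matrix[y1:y2, x1:x2].max())

# Example: a 4x4 temperature grid, head box covering the centre 2x2 patch.
grid = np.array([[38.0, 38.5, 38.2, 37.9],
                 [38.1, 40.8, 41.3, 38.0],
                 [38.3, 41.1, 40.9, 38.4],
                 [37.8, 38.0, 38.2, 38.1]])
print(box_temperature(grid, (1, 1, 3, 3)))   # -> 41.3
```

Taking the maximum suits the chicken head, which the description identifies as the hottest body-surface region; a mean over the box is an alternative.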
The method further comprises: acquiring in S2 a fused image containing a plurality of target birds; acquiring in S3 the key-part temperature map information of each target bird in the fused image; averaging the key-part temperature readings of each target bird; and determining the corresponding body temperatures of the plurality of target birds with reference to the current ambient temperature.
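The averaging and ambient-referenced conversion described above can be sketched as follows; the linear correction coefficient and reference ambient temperature are purely illustrative assumptions, not values from the patent:

```python
def body_temperatures(part_temps, ambient_c, coeff=0.05, ref_ambient_c=25.0):
    """Average each bird's key-part temperature readings, then apply a
    hypothetical linear ambient correction to estimate body temperature.

    part_temps: {bird_id: [key-part temperature readings in deg C]}.
    """
    return {bird: sum(ts) / len(ts) + coeff * (ref_ambient_c - ambient_c)
            for bird, ts in part_temps.items()}

# Example: two birds at the reference ambient of 25 deg C (no correction).
print(body_temperatures({'h1': [40.6, 41.0], 'h2': [41.8]}, ambient_c=25.0))
```

A deployed system would calibrate the correction against reference measurements rather than assume a fixed coefficient.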
An embodiment of the invention further provides a poultry house inspection system comprising an inspection robot on which a first sensing assembly and a second sensing assembly are arranged. The inspection robot inspects the poultry house by the image-fusion-based poultry body temperature detection method described above to obtain inspection information. The first sensing assembly collects the thermal imaging images and visible light images of the poultry in the house, the second sensing assembly collects the current environmental parameter information of the house, and the inspection information comprises the body temperature information of the poultry and the environmental parameter information.
The inspection robot sends the inspection information to a cloud server through a wireless network.
One or more technical solutions in the embodiments of the present invention have at least one of the following technical effects:
The poultry body temperature detection method based on image fusion comprehensively considers the physical characteristics of the thermal imaging image and the visible light image. Image fusion yields a fused image that comprehensively represents the heat-source characteristic information and visible light information of the poultry; the deep neural network model recognizes the key parts of the poultry and obtains their temperature map information, from which the body temperature information of the poultry can be converted.
Based on the fusion of thermal imaging and visible light images, the invention realizes intelligent detection of poultry body temperature through deep-neural-network feature recognition and the corresponding temperature conversion, ensures the accuracy of the detection result, and meets the inspection requirements of poultry houses in various breeding environments.
Because the poultry house inspection system of the embodiment adopts the image-fusion-based poultry body temperature detection method, it greatly improves inspection efficiency while ensuring the accuracy of the inspection information. Compared with manual inspection, it effectively reduces the labor intensity of inspectors, avoids the risk of cross infection that arises when one inspector visits several houses, eases the current shortage of inspection personnel, saves poultry breeding cost, and improves the economic benefit of poultry breeding.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a flow chart of a poultry body temperature detection method based on image fusion according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating the identification of key parts of poultry according to an embodiment of the present invention;
FIG. 3 is a flowchart of optimizing image fusion by a self-correcting particle swarm optimization algorithm according to an embodiment of the present invention;
FIG. 4 is a flow chart of temperature monitoring of chickens according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of the inspection robot according to the embodiment of the present invention.
In the figure: 1. a mobile platform; 2. a lifting mechanism; 3. a first sensing component; 4. a second sensing component; 5. and a sensing acquisition terminal.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: for example, as a fixed connection, a removable connection, or an integral connection; as a mechanical or electrical connection; as a direct connection or an indirect connection through intervening media; or as internal communication between two elements. Those skilled in the art can understand the specific meanings of the above terms in the present invention on a case-by-case basis.
Referring to fig. 1, the present embodiment provides a poultry body temperature detection method based on image fusion, comprising: S1, acquiring a thermal imaging image and a visible light image of the poultry; S2, fusing the thermal imaging image with the visible light image to obtain a fused image containing the heat-source characteristic information and the visible light information of the poultry; and S3, establishing a deep neural network model, performing feature recognition on the key parts of the poultry in the fused image, reading the temperature information of the key parts from the fused image, and converting that temperature information into the body temperature information of the poultry, wherein the deep neural network model is trained with the fused images acquired in S2 as samples and the corresponding fused images annotated with the key parts of the poultry as labels.
Specifically, the poultry body temperature detection method of this embodiment comprehensively considers the physical characteristics of the thermal imaging image and the visible light image. Image fusion yields a fused image that comprehensively characterizes the heat-source characteristic information and visible light information of the poultry; the deep neural network model recognizes the key parts of the poultry and obtains their temperature map information on the fused image, from which the poultry body temperature information can be converted.
Based on the fusion of thermal imaging and visible light images, this embodiment realizes intelligent detection of poultry body temperature through deep-neural-network feature recognition and the corresponding temperature conversion, ensures the accuracy of the detection result, and meets the inspection requirements of poultry houses in various breeding environments.
It should be noted that the poultry of this embodiment includes chickens, ducks, geese, and the like; since laying hens are usually raised in coops, the method is particularly suitable for detecting the body temperature of laying hens.
Meanwhile, the key parts of the poultry are parts of the body with characteristic shapes that are easy to recognize and photograph, such as the head and the wings. The head is easier to recognize and photograph than the wings, and for chickens in particular the head is the part with the highest body-surface temperature and is thus more easily recognized in the fused image. When feature recognition is performed on the fused image by the deep neural network model, the key part of this embodiment is therefore preferably the image region corresponding to the poultry head in the fused image.
When the body temperature information of the poultry is converted, the conversion can be carried out based on the mapping relation between the key-part temperature information and the poultry body temperature information. To further ensure the accuracy of the conversion result, the result can be corrected with reference to the current ambient temperature of the poultry house.
In addition, when the inspection robot of the following embodiments inspects the poultry house, a thermal imaging camera and a visible light camera are respectively mounted on the robot: the thermal imaging camera collects the thermal imaging images of the poultry in the house, and the visible light camera collects their visible light images.
The thermal imaging camera is preferably an uncooled thermal imaging camera; since it does not rely on cryogenic cooling, a relatively simple temperature controller can stabilize it at or close to the ambient temperature.
The thermal imaging camera uses a germanium window to protect the lens. Infrared thermometers and thermal imagers operate with mid- and far-infrared filters, generally in the 2-13 μm band, where germanium transmits well, whereas ordinary optical glass has extremely low transmittance in this band. Coating the germanium window with an optical film greatly increases its transmittance and reduces the surface reflectivity of the germanium glass. The germanium window is opaque in the visible band. Selecting a germanium window as the protective lens of the thermal imaging camera therefore prevents dust in the poultry house from directly contaminating the thermal lens and degrading monitoring accuracy, and the window is easy to wipe clean.
The temperature monitoring range of the thermal imaging camera covers -20 °C to 60 °C. To inspect both sides of the aisle in the poultry house, the camera's field of view is designed to be greater than 30°; within a certain range, a wider field of view is better. The thermal image of each tested bird's head should occupy at least 2 pixels, and preferably more than 10 pixels. A frame rate adjustable to above 1 Hz is sufficient to meet the requirements. Considering the cost of thermal imaging, this embodiment uses a single thermal imaging camera; several cameras could be used if cost were no concern. The camera's base is also required to be adjustable, so that its viewing angle can be aligned with the visible light camera for fusion; that is, the thermal imaging camera is mounted on the inspection robot via an adjustable base. In addition, the thermal imaging camera must be able to monitor the poultry on both sides of the aisle as well as on a single side, and its monitoring range must coincide with that of the visible light camera, so that dual-mode fusion of the thermal image and the visible light image can be performed.
The visible light camera is one suited to low-visibility conditions, and its near-infrared illuminator must not light up, so as not to stress the poultry. The lens focal length is required to be less than 4 mm so that poultry can be photographed at close range. The camera uses a low-distortion (non-spherical-distortion) lens, and its field of view must cover the poultry on one side of the aisle. Two visible light cameras are installed, one monitoring each side of the aisle. The visible light cameras are also used to monitor the overall condition of the poultry house, such as the feeding, droppings, and drinking of the poultry.
Preferably, this embodiment further includes: and S4, comparing the body temperature information of the poultry with the normal body temperature threshold value of the poultry in the corresponding growth period, and identifying the poultry with abnormal body temperature.
Specifically, the normal body temperature of poultry differs across growth periods. A normal body temperature threshold for each growth period can therefore be established from historical statistics of normal body temperatures at different growth stages. Comparing the body temperature obtained in step S3 with the threshold for the corresponding growth period identifies poultry with abnormal body temperature, allowing farm staff to discover and investigate problems in the rearing process in time, take precautionary measures promptly, and improve breeding returns.
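The comparison in step S4 reduces to a per-growth-period range check. A minimal sketch, with placeholder threshold values (the patent does not publish actual thresholds):

```python
# Hypothetical illustration of step S4: flag birds whose converted body
# temperature falls outside the normal band for their growth period.
# The growth periods and band values below are placeholders, not patent data.
NORMAL_RANGES_C = {            # growth period -> (low, high) body temperature, deg C
    "brooding": (40.6, 41.7),
    "growing":  (40.5, 41.5),
    "laying":   (40.4, 41.2),
}

def flag_abnormal(birds, period):
    """Return the ids of birds whose body temperature lies outside the band."""
    low, high = NORMAL_RANGES_C[period]
    return [bird_id for bird_id, temp in birds if not (low <= temp <= high)]

flagged = flag_abnormal([("c1", 41.0), ("c2", 42.3), ("c3", 39.8)], "growing")
```

The lookup keyed by growth period mirrors the patent's point that one fixed threshold cannot serve all stages.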
Preferably, as shown in fig. 2, fusing the thermal imaging image and the visible light image in S2 in this embodiment further includes: S21, preprocessing the images to obtain a thermal imaging image and a visible light image of the same size; S22, registering the images, taking the visible light image as the base image and the thermal imaging image as the input image; and S23, decomposing the base image and the input image with the dual-tree discrete wavelet transform, fusing the wavelet coefficients of the decomposed images, and obtaining the fused image via the inverse dual-tree discrete wavelet transform.
Specifically, the step S21 is to ensure that the two types of images have the same size by preprocessing the thermal imaging image and the visible light image, so as to satisfy the basic requirements of the image registration of the step S22 and the image fusion of the step S23.
In step S22, the thermographic image is registered to the visible light image by affine transformation.
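Affine registration as in step S22 amounts to resampling the thermal image through a 2×3 affine matrix so that it aligns with the visible image. A minimal nearest-neighbour warp in NumPy illustrates the mapping; in practice a library routine such as OpenCV's `warpAffine`, with the matrix estimated from matched features, would be used:

```python
import numpy as np

def warp_affine(image, A, out_shape):
    """Nearest-neighbour affine warp: maps each output pixel (r, c) back
    through the inverse of the 2x3 affine matrix A to sample the input."""
    A33 = np.vstack([A, [0.0, 0.0, 1.0]])        # lift to 3x3 homogeneous form
    inv = np.linalg.inv(A33)
    out = np.zeros(out_shape, dtype=image.dtype)
    rows, cols = np.indices(out_shape)
    coords = np.stack([cols.ravel(), rows.ravel(), np.ones(rows.size)])
    x, y, _ = inv @ coords                        # source coordinates
    xi, yi = np.round(x).astype(int), np.round(y).astype(int)
    ok = (0 <= xi) & (xi < image.shape[1]) & (0 <= yi) & (yi < image.shape[0])
    out[rows.ravel()[ok], cols.ravel()[ok]] = image[yi[ok], xi[ok]]
    return out

img = np.array([[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]])
A = np.array([[1., 0., 1.], [0., 1., 0.]])        # shift one pixel to the right
shifted = warp_affine(img, A, (3, 3))
```

Pixels that map outside the source image are left at zero; real registration would also estimate A, which this sketch takes as given.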
The dual-tree discrete wavelet transform performs well on large images, provides phase information, and offers direction selectivity and shift invariance; it is therefore used for image decomposition in step S23, decomposing each registered image into high-frequency and low-frequency subbands.
The dual-tree discrete wavelet transform of an image Q may be denoted f, which can be written across scales as:

f = {f_1, f_2, f_3, ..., f_J, Q_J}

where Q_J is the approximate (low-frequency) subband at the final decomposition level J and f_j is the set of high-frequency subbands at level j. Each f_j consists of 12 directional subbands, 6 real and 6 imaginary, one real/imaginary pair per orientation (in the standard dual-tree transform, the orientations are 15°, 45°, 75°, 105°, 135°, and 165°):

f_j = { f_j^{Re,θ}, f_j^{Im,θ} : θ = 15°, 45°, 75°, 105°, 135°, 165° }

where f_j^{Re,θ} and f_j^{Im,θ} denote the real and imaginary directional subbands at orientation θ; the remaining five real and five imaginary subbands of the group are obtained analogously.
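The subband decomposition can be illustrated with a single-level Haar DWT standing in for the dual-tree transform (the real dual-tree transform yields 12 directional subbands and is available in dedicated libraries; this sketch only shows the low/high-frequency subband principle and perfect reconstruction):

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2-D Haar DWT: returns the low-frequency subband LL and
    the three high-frequency subbands LH, HL, HH. A simplified stand-in for
    the dual-tree transform used in the patent."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    LL = (a + b + c + d) / 2.0
    LH = (a + b - c - d) / 2.0
    HL = (a - b + c - d) / 2.0
    HH = (a - b - c + d) / 2.0
    return LL, LH, HL, HH

def haar_idwt2(LL, LH, HL, HH):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    h, w = LL.shape
    out = np.zeros((2 * h, 2 * w))
    out[0::2, 0::2] = (LL + LH + HL + HH) / 2.0
    out[0::2, 1::2] = (LL + LH - HL - HH) / 2.0
    out[1::2, 0::2] = (LL - LH + HL - HH) / 2.0
    out[1::2, 1::2] = (LL - LH - HL + HH) / 2.0
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
LL, LH, HL, HH = haar_dwt2(x)
rec = haar_idwt2(LL, LH, HL, HH)
```

The inverse transform recovering the original image is what step S23 relies on when mapping fused coefficients back to a fused image.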
Meanwhile, step S23 fuses the wavelet coefficients of the decomposed images with the following formula:

F(x, y) = w1 · M(x, y) + w2 · N(x, y)

where F(x, y) is the fused wavelet coefficient, M(x, y) is the preprocessed visible light image, N(x, y) is the preprocessed thermal imaging image, w1 is the weight of M(x, y), and w2 is the weight of N(x, y); the values of w1 and w2 are determined by a self-correcting particle swarm optimization algorithm. Because w1 and w2 set the proportions of the visible light and thermal imaging coefficients in the fused image, choosing them well reduces information loss and changes to the image's spectral characteristics.
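The fusion rule above is a per-coefficient linear combination applied subband by subband; a minimal sketch:

```python
import numpy as np

def fuse_coeffs(M, N, w1, w2):
    """Weighted wavelet-coefficient fusion F(x, y) = w1*M(x, y) + w2*N(x, y),
    applied elementwise to the visible (M) and thermal (N) coefficients."""
    return w1 * np.asarray(M, dtype=float) + w2 * np.asarray(N, dtype=float)

F = fuse_coeffs([[10.0, 20.0]], [[30.0, 40.0]], 0.6, 0.4)
```

The same call is applied to every subband pair produced by the decomposition before the inverse transform is taken.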
Further, the self-correcting particle swarm optimization algorithm determines the weight values of w1 and w2 so as to increase the entropy and reduce the root mean square error of the wavelet coefficient fusion model. In this embodiment, the wavelet fusion coefficients are optimized by the self-correcting particle swarm optimization algorithm; when the fused image is determined to have reached the preset matching degree and converged, the optimal pair of weight values of w1 and w2 is taken from the weight matrix:

w = max{ H(F(w1i, w2i)) : 1 ≤ i ≤ Y }

where w is the maximum entropy of the fused image over the candidate weight pairs, w1i is the i-th weight value of w1, w2i is the i-th weight value of w2, 1 ≤ i ≤ Y, and Y is a natural number greater than 1.
Thus, when fusing the wavelet coefficients, an optimal pair of weight values of w1 and w2 can be obtained, retaining the image information efficiently while reducing loss.
Specifically, fig. 3 shows the flow of optimizing image fusion with the self-correcting particle swarm optimization algorithm. When image fusion begins, w1 and w2 are first initialized, and the corresponding frequency subbands of the decomposed visible light and thermal images are fed into the wavelet coefficient fusion model. The self-correcting particle swarm optimization algorithm optimizes the model and yields a pair of weight values w1 and w2; the fused image is then obtained via the inverse dual-tree discrete wavelet transform, its matching degree is checked, and convergence is evaluated. If the result has not converged, the computation is discarded, the algorithm is run again to re-optimize the fusion model, and a new pair of weight values is computed; this loop repeats until the result converges, at which point the optimal pair of weight values w1 and w2 is obtained and stored.
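The loop of fig. 3 can be sketched as a plain particle swarm search over w1 (with w2 = 1 − w1) maximizing the entropy of the fused result. The "self-correcting" refinement of the patent's variant is not specified here, so a standard PSO velocity update is used as an assumption:

```python
import numpy as np

def entropy(img, bins=64):
    """Shannon entropy of the image's grey-level histogram - the fitness
    the optimization maximizes for the fused result (values in [0, 1])."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def pso_weights(M, N, n_particles=12, iters=30, seed=0):
    """Minimal PSO searching w1 in [0, 1] (w2 = 1 - w1) to maximize
    fused-image entropy. A sketch of fig. 3, not the patent's exact variant."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(0.0, 1.0, n_particles)        # particle positions (w1)
    v = np.zeros(n_particles)                     # particle velocities
    pbest = w.copy()
    pfit = np.array([entropy(x * M + (1 - x) * N) for x in w])
    for _ in range(iters):
        g = pbest[pfit.argmax()]                  # global best position
        v = (0.7 * v + 1.5 * rng.random(n_particles) * (pbest - w)
                     + 1.5 * rng.random(n_particles) * (g - w))
        w = np.clip(w + v, 0.0, 1.0)
        fit = np.array([entropy(x * M + (1 - x) * N) for x in w])
        better = fit > pfit                       # update personal bests
        pbest[better] = w[better]
        pfit[better] = fit[better]
    w1 = float(pbest[pfit.argmax()])
    return w1, 1.0 - w1

demo = np.random.default_rng(1)
M = demo.random((32, 32)); N = demo.random((32, 32))
w1, w2 = pso_weights(M, N)
```

Convergence checking against a "preset matching degree", as in the patent, would wrap this search in an outer accept/retry loop.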
In this embodiment, after the optimal pair of weight values w1 and w2 is obtained, the fused image is produced by the inverse dual-tree discrete wavelet transform:

P = T⁻¹(F)

where P is the fused image, T is the dual-tree discrete wavelet transform, and F denotes the fused wavelet coefficients.
Preferably, the establishing a deep neural network model in S3 in this embodiment further includes:
s31, dividing the fusion images of the poultry into a training set and a testing set;
s32, inputting the training set into the deep neural network model for training to extract the feature information of the poultry key parts, and generating the network parameters of the model once the key-part positions obtained from the feature information satisfy the set number of iterations;
and S33, testing the network parameters of the deep neural network model with the test set, and taking the model with the optimal network parameters when the tested average precision and recall meet the set thresholds.
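Steps S31 and S33 can be sketched as a seeded train/test split plus an acceptance gate on the test metrics; the split fraction and threshold values below are illustrative, not taken from the patent:

```python
import random

def split_dataset(samples, train_frac=0.8, seed=42):
    """S31: shuffle the labelled fusion images and split into train/test."""
    items = list(samples)
    random.Random(seed).shuffle(items)
    k = int(len(items) * train_frac)
    return items[:k], items[k:]

def accept_model(mean_ap, recall, ap_thresh=0.9, recall_thresh=0.9):
    """S33 acceptance gate: keep the network parameters only when both the
    average precision and the recall reach the configured thresholds."""
    return mean_ap >= ap_thresh and recall >= recall_thresh

train, test = split_dataset(range(100))
```

Training itself (S32) runs inside a framework-specific loop and is omitted; the gate decides when the resulting parameters become the "optimal" model of S33.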
Preferably, the deep neural network model in the embodiment is established based on a YOLO image recognition algorithm, and the key parts of the poultry for recognition comprise the heads of the poultry.
Specifically, considering the real-time requirements of detection and recognition on the inspection robot and the computational budget of the algorithm, the YOLO image recognition algorithm is chosen as the core of the target recognition stage. YOLO is a deep convolutional neural network framework that simplifies the nomination of target candidate boxes by recasting target detection as a regression problem; the network consists of a region-division input layer, convolutional layers, pooling layers, fully connected layers, and an output layer.
The YOLO image recognition algorithm first divides the target picture into a number of grid cells, which enter the input layer and propagate forward through the network. If the center of an object falls within a cell, that cell is responsible for detecting the object in its region. Each cell produces a confidence score reflecting both whether the cell contains a target object and how accurate the predicted position is; whether a target object (such as a poultry head) is present in a cell is decided from that score, completing the localization and identification of the target.
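The per-cell confidence decision can be sketched as a simple threshold over the grid of scores (bounding-box decoding and non-maximum suppression, which a full YOLO head also performs, are omitted):

```python
import numpy as np

def detect_cells(conf_grid, threshold=0.5):
    """Return the (row, col) indices of grid cells whose confidence score
    exceeds the threshold - the cells deemed to contain a target such as a
    poultry head. Illustrative of the YOLO grid decision only."""
    rows, cols = np.where(np.asarray(conf_grid) > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

cells = detect_cells([[0.1, 0.8, 0.2],
                      [0.6, 0.3, 0.9]])
```

Each returned cell corresponds to one candidate head region whose temperature is then read from the fused image.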
Preferably, this embodiment further includes: acquiring a fused image containing multiple target birds in S2; in S3, acquiring the temperature of the key part of each target bird in the fused image, averaging the temperatures over each key-part region, and obtaining each bird's body temperature after correcting with the current ambient temperature.
Specifically, poultry raised in a poultry house are usually densely distributed, so in practice each thermal imaging image and visible light image may contain image information for several target birds at once. The image fusion of step S2 therefore yields a fused image containing multiple target birds, and the deep neural network model can recognize the key parts of all of them in a single pass over the fused image. Based on the mapping between key-part (head) temperature and body temperature, corrected by the current ambient temperature, the body temperatures of all target birds in the fused image can be converted correspondingly. This greatly improves the efficiency of body temperature detection in the poultry house while preserving the accuracy of the results, and averaging the temperatures over each bird's key-part region further ensures the accuracy of the acquired key-part temperature information.
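The head-region averaging and ambient correction can be sketched as below. The linear mapping coefficients a, b, k and the reference temperature t_ref_c are assumptions for illustration only, since the patent states merely that a head-to-body mapping and an ambient correction exist:

```python
import numpy as np

def body_temperature(head_rois, ambient_c, a=1.05, b=0.0, t_ref_c=20.0, k=0.02):
    """For each detected head region (a 2-D patch of per-pixel temperatures
    from the fused image), average the pixel temperatures, then map head
    temperature to body temperature with an assumed linear model corrected
    for the ambient temperature. Coefficients are illustrative placeholders."""
    out = []
    for roi in head_rois:
        t_head = float(np.mean(roi))                      # average over the region
        out.append(a * t_head + b + k * (t_ref_c - ambient_c))
    return out

# One bird whose head region reads a uniform 40.0 deg C at 20 deg C ambient.
temps = body_temperature([np.full((4, 4), 40.0)], ambient_c=20.0)
```

Because the fused image may contain several birds, `head_rois` holds one patch per detected head, giving one body temperature per bird in a single pass.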
As shown in fig. 4, this embodiment is described again in detail using the inspection of the body temperature of chickens, the most common poultry, as an example. The detection process is as follows:
after the detection is started, acquiring a thermal imaging image and a visible light image of the chicken; then, carrying out image preprocessing and registration, and carrying out image fusion on the thermal imaging image and the visible light image obtained by registration to obtain a fusion image containing heat source characteristic information and visible light information of the chicken; then introducing the fused image into a deep neural network model to perform chicken head target identification, after the identification is completed, calculating the average value of the chicken head area temperature, and converting each chicken head temperature and body temperature in a target image to obtain the body temperature information of the chicken; then, comparing the body temperature information of the chickens with the normal body temperature threshold value of the corresponding growth period to identify the chickens with abnormal body temperature; and finally, recording the position of the chicken with abnormal body temperature in the picture and the position of the inspection robot, recording infrared thermal imaging, finishing abnormal identification and ending.
Preferably, as shown in fig. 5, this embodiment further provides a poultry house inspection system comprising an inspection robot fitted with a first sensing assembly 3 and a second sensing assembly 4. The inspection robot inspects the poultry house using the image-fusion-based poultry body temperature detection method described above to obtain inspection information. The first sensing assembly collects the thermal imaging images and visible light images of the poultry in the house, and the second sensing assembly collects the current environmental parameters of the house; the inspection information comprises the poultry's body temperature information and the environmental parameter information. The inspection robot can further identify poultry with abnormal body temperature from the body temperature information, and determine the position of each abnormal bird from its position in the fused image.
Specifically, because the poultry house inspection system of this embodiment adopts the image-fusion-based poultry body temperature detection method above, it greatly improves inspection efficiency while keeping the inspection information accurate. Compared with manual inspection, it effectively reduces the labor intensity of inspection staff, avoids the risk of cross-infection when staff move between multiple poultry houses, alleviates the current shortage of inspection personnel, saves poultry breeding costs, and improves the economic return of poultry farming.
It should be noted that in the present embodiment, the first sensing assembly 3 includes the thermal imaging camera and the visible light camera shown in the above embodiments, so as to respectively acquire the thermal imaging image and the visible light image information of the poultry in the poultry house.
Meanwhile, the current environmental parameter information of the poultry house collected by the second sensing assembly 4 includes temperature, humidity, harmful gas, and sound. The corresponding sensing devices can be, in order, a temperature sensor, a humidity sensor, a harmful-gas sensor, and a sound sensor as known in the art. Each sensing device of the second sensing assembly 4 communicates with a sensing acquisition terminal 5, which includes a data transmission unit known in the art for converting the data detected by each device. Harmful gases in the poultry house include ammonia, hydrogen sulfide, and carbon dioxide, each collected by the corresponding type of harmful-gas sensor. The sound sensor functions as a microphone receiving sound waves, as known in the art; by collecting the sounds of the poultry, it supports further abnormal-sound analysis to judge the birds' feeding state.
As shown in fig. 5, the inspection robot in this embodiment includes a mobile platform 1 and a lifting mechanism 2. The first sensing assembly 3 and the second sensing assembly 4 are mounted separately on the lifting mechanism 2 so that each can be driven to its corresponding preset height, better meeting the requirements of poultry health inspection at different heights and positions in the poultry house. The lifting mechanism 2 is preferably a lead-screw lifting mechanism known in the art, comprising a stepping motor, a lead screw, two slide rails, and two lifting slide tables. The stepping motor drives the lead screw, which is arranged vertically together with the slide rails; the two lifting slide tables carry the first sensing assembly 3 and the second sensing assembly 4 respectively, slide along the rails, and engage the lead screw by thread fit.
Meanwhile, a depth camera can be mounted on the inspection robot to estimate the weight of the poultry. The robot also provides autonomous path planning, automatic obstacle avoidance, and automatic charging, meeting the requirements of automatic inspection.
In the embodiment, the inspection robot sends the inspection information to the cloud server through the wireless network.
Specifically, the inspection robot of this embodiment further includes a controller and a wireless communication module. The controller communicates with the first sensing assembly 3 and the second sensing assembly 4 respectively; it can connect to the second sensing assembly 4 via the sensing acquisition terminal 5 described above, and connects to the cloud server through the wireless communication module, transmitting the inspection information over a wireless network such as WiFi, 3G, 4G, or 5G. Each inspection robot has an independent, unique address for convenient registration with the cloud server; an account is created on the server, and each registered robot is bound to its poultry house. Inspection staff can then access the cloud server from a computer or handheld terminal and remotely view the robot's current position in the poultry house, pictures of poultry with abnormal body temperature, and other information.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (9)
1. A poultry body temperature detection method based on image fusion is characterized by comprising the following steps:
s1, acquiring a thermal imaging image and a visible light image of the poultry;
s2, carrying out image fusion on the thermal imaging image and the visible light image to obtain a fusion image containing heat source characteristic information and visible light information of the poultry;
s3, establishing a deep neural network model, carrying out feature recognition on the poultry key parts in the fusion image, acquiring the temperature information of the poultry key parts on the fusion image, and converting according to the temperature information of the poultry key parts to obtain the body temperature information of the poultry;
the deep neural network model is obtained by training using the fusion image obtained in S2 as a sample and using the fusion image labeled with the poultry key part corresponding to the fusion image as a label.
2. The poultry body temperature detection method based on image fusion as claimed in claim 1, further comprising: and S4, comparing the body temperature information of the poultry with the normal body temperature threshold value of the poultry in the corresponding growth period, and identifying the poultry with abnormal body temperature.
3. The method for detecting body temperature of poultry based on image fusion as claimed in claim 1, wherein said image fusion of said thermographic image and said visible light image in S2 further comprises:
s21, performing image preprocessing to obtain the thermal imaging image and the visible light image with the same size;
s22, taking the visible light image as a basic image and taking the thermal imaging image as an input image to register the image;
and S23, performing image decomposition on the basic image and the input image by adopting the dual-tree discrete wavelet transform, performing wavelet coefficient fusion on the decomposed images, and obtaining the fused image by utilizing the inverse dual-tree discrete wavelet transform.
4. The method for detecting body temperature of poultry based on image fusion as claimed in claim 1, wherein S23 includes wavelet coefficient fusion of the decomposed images using the following formula:

F(x, y) = w1 · M(x, y) + w2 · N(x, y)

wherein F(x, y) is the fused wavelet coefficient, M(x, y) is the preprocessed visible light image, N(x, y) is the preprocessed thermal imaging image, and w1 and w2 are the weights of M(x, y) and N(x, y) respectively.
5. The poultry body temperature detection method based on image fusion according to claim 4, characterized by further comprising: optimizing the wavelet fusion coefficients by adopting the self-correcting particle swarm optimization algorithm to obtain w1 and w2; and, when the fused image is determined to have reached the preset matching degree and converged, acquiring the optimal pair of weight values of w1 and w2 from the weight matrix:

w = max{ H(F(w1i, w2i)) : 1 ≤ i ≤ Y }

where w is the maximum entropy of the fused image, w1i is the i-th weight value of w1, w2i is the i-th weight value of w2, 1 ≤ i ≤ Y, and Y is a natural number greater than 1.
6. The poultry body temperature detection method based on image fusion according to claim 1, wherein the deep neural network model is established based on a YOLO image recognition algorithm, wherein the poultry key parts comprise poultry heads.
7. The poultry body temperature detection method based on image fusion as claimed in claim 1, further comprising: acquiring a fused image of a plurality of target birds in S2; acquiring, in S3, the temperature of the key part of each target bird in the fused image; averaging the temperatures over each key-part region; and determining the corresponding body temperatures of the plurality of target birds based on the current ambient temperature.
8. A poultry house inspection system comprises an inspection robot and is characterized in that,
the inspection robot is provided with a first sensing assembly and a second sensing assembly;
the inspection robot inspects the poultry house by adopting the poultry body temperature detection method based on image fusion according to any one of claims 1 to 7 to obtain inspection information;
wherein the first sensing assembly is used for collecting the thermal imaging images and visible light images of the poultry in the poultry house, the second sensing assembly is used for collecting the current environmental parameter information of the poultry house, and the inspection information comprises the body temperature information of the poultry and the environmental parameter information.
9. The poultry house inspection system according to claim 8,
and the inspection robot sends the inspection information to a cloud server through a wireless network.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010314114.9A CN111626985A (en) | 2020-04-20 | 2020-04-20 | Poultry body temperature detection method based on image fusion and poultry house inspection system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111626985A true CN111626985A (en) | 2020-09-04 |
Family
ID=72258982
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010314114.9A Pending CN111626985A (en) | 2020-04-20 | 2020-04-20 | Poultry body temperature detection method based on image fusion and poultry house inspection system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111626985A (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112927816A (en) * | 2021-01-27 | 2021-06-08 | 广东顺德鲁棒智能技术有限公司 | Body temperature detection system, body temperature trend prediction method and system |
CN112945395A (en) * | 2021-03-17 | 2021-06-11 | 西藏新好科技有限公司 | Livestock and poultry animal body temperature evaluation method based on target detection |
CN113223033A (en) * | 2021-05-10 | 2021-08-06 | 广州朗国电子科技有限公司 | Poultry body temperature detection method, device and medium based on image fusion |
CN113223035A (en) * | 2021-06-07 | 2021-08-06 | 南京农业大学 | Intelligent inspection system for cage-rearing chickens |
CN113396840A (en) * | 2021-06-16 | 2021-09-17 | 福州木鸡郎智能科技有限公司 | Robot is patrolled and examined in chicken of dying of illness judgement |
CN113947555A (en) * | 2021-09-26 | 2022-01-18 | 国网陕西省电力公司西咸新区供电公司 | Infrared and visible light fused visual system and method based on deep neural network |
CN114407051A (en) * | 2022-03-07 | 2022-04-29 | 烟台艾睿光电科技有限公司 | Livestock and poultry farm inspection method and livestock and poultry farm robot |
WO2022120564A1 (en) * | 2020-12-08 | 2022-06-16 | 深圳迈瑞生物医疗电子股份有限公司 | Method and apparatus for temperature measurement, and storage medium |
CN114743224A (en) * | 2022-06-13 | 2022-07-12 | 金乡县康华乳业有限公司 | Animal husbandry livestock body temperature monitoring method and system based on computer vision |
CN114898405A (en) * | 2022-05-27 | 2022-08-12 | 南京农业大学 | Portable broiler chicken abnormity monitoring system based on edge calculation |
CN114980011A (en) * | 2022-07-01 | 2022-08-30 | 浙江大学 | Livestock and poultry body temperature monitoring system and method with cooperation of wearable sensor and infrared camera |
CN115119766A (en) * | 2022-06-16 | 2022-09-30 | 天津农学院 | Sow oestrus detection method based on deep learning and infrared thermal imaging |
CN115937791A (en) * | 2023-01-10 | 2023-04-07 | 华南农业大学 | Poultry counting method and device suitable for multiple breeding modes |
WO2023130804A1 (en) * | 2022-01-06 | 2023-07-13 | 华南农业大学 | Meat duck physiological growth information inspection method and system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107491781A (en) * | 2017-07-21 | 2017-12-19 | 国家电网公司 | A kind of crusing robot visible ray and infrared sensor data fusion method |
CN108428224A (en) * | 2018-01-09 | 2018-08-21 | 中国农业大学 | Animal body surface temperature checking method and device based on convolutional Neural net |
CN109060143A (en) * | 2018-08-16 | 2018-12-21 | 南京农业大学 | Boar Thermometer System and method based on thermal infrared technology |
WO2019061293A1 (en) * | 2017-09-29 | 2019-04-04 | 深圳市大疆创新科技有限公司 | Object detection method, object detection terminal, and computer readable medium |
CN110200598A (en) * | 2019-06-12 | 2019-09-06 | 天津大学 | A kind of large-scale plant that raises sign exception birds detection system and detection method |
2020-04-20: Application CN202010314114.9A filed in China (published as CN111626985A); legal status: Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107491781A (en) * | 2017-07-21 | 2017-12-19 | 国家电网公司 | A kind of crusing robot visible ray and infrared sensor data fusion method |
WO2019061293A1 (en) * | 2017-09-29 | 2019-04-04 | 深圳市大疆创新科技有限公司 | Object detection method, object detection terminal, and computer readable medium |
CN108428224A (en) * | 2018-01-09 | 2018-08-21 | 中国农业大学 | Animal body surface temperature checking method and device based on convolutional Neural net |
CN109060143A (en) * | 2018-08-16 | 2018-12-21 | 南京农业大学 | Boar Thermometer System and method based on thermal infrared technology |
CN110200598A (en) * | 2019-06-12 | 2019-09-06 | 天津大学 | A kind of large-scale plant that raises sign exception birds detection system and detection method |
Non-Patent Citations (1)
Title |
---|
Liu Bo: "Research on methods for extracting body surface temperature and gait features of live pigs based on multi-source images" *
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022120564A1 (en) * | 2020-12-08 | 2022-06-16 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Method and apparatus for temperature measurement, and storage medium |
CN112927816A (en) * | 2021-01-27 | 2021-06-08 | Guangdong Shunde Robust Intelligent Technology Co., Ltd. | Body temperature detection system, body temperature trend prediction method and system |
CN112927816B (en) * | 2021-01-27 | 2021-11-23 | Guangdong Shunde Robust Intelligent Technology Co., Ltd. | Body temperature detection system, body temperature trend prediction method and system |
CN112945395A (en) * | 2021-03-17 | 2021-06-11 | Tibet Xinhao Technology Co., Ltd. | Livestock and poultry body temperature evaluation method based on target detection |
CN113223033A (en) * | 2021-05-10 | 2021-08-06 | Guangzhou Lango Electronic Technology Co., Ltd. | Poultry body temperature detection method, device and medium based on image fusion |
CN113223035A (en) * | 2021-06-07 | 2021-08-06 | Nanjing Agricultural University | Intelligent inspection system for cage-reared chickens |
CN113223035B (en) * | 2021-06-07 | 2023-08-22 | Nanjing Agricultural University | Intelligent inspection system for cage-reared chickens |
CN113396840A (en) * | 2021-06-16 | 2021-09-17 | Fuzhou Mujilang Intelligent Technology Co., Ltd. | Poultry house inspection robot for identifying sick and dead chickens |
CN113947555A (en) * | 2021-09-26 | 2022-01-18 | State Grid Shaanxi Electric Power Company Xixian New Area Power Supply Company | Infrared and visible light fusion vision system and method based on a deep neural network |
WO2023130804A1 (en) * | 2022-01-06 | 2023-07-13 | South China Agricultural University | Meat duck physiological growth information inspection method and system |
CN114407051A (en) * | 2022-03-07 | 2022-04-29 | Yantai iRay Technology Co., Ltd. | Livestock and poultry farm inspection method and livestock and poultry farm robot |
CN114898405A (en) * | 2022-05-27 | 2022-08-12 | Nanjing Agricultural University | Portable broiler chicken anomaly monitoring system based on edge computing |
CN114898405B (en) * | 2022-05-27 | 2023-08-25 | Nanjing Agricultural University | Portable broiler chicken anomaly monitoring system based on edge computing |
CN114743224A (en) * | 2022-06-13 | 2022-07-12 | Jinxiang County Kanghua Dairy Co., Ltd. | Livestock body temperature monitoring method and system based on computer vision |
CN115119766A (en) * | 2022-06-16 | 2022-09-30 | Tianjin Agricultural University | Sow oestrus detection method based on deep learning and infrared thermal imaging |
CN115119766B (en) * | 2022-06-16 | 2023-08-18 | Tianjin Agricultural University | Sow oestrus detection method based on deep learning and infrared thermal imaging |
CN114980011A (en) * | 2022-07-01 | 2022-08-30 | Zhejiang University | Livestock and poultry body temperature monitoring system and method combining wearable sensors and infrared cameras |
CN114980011B (en) * | 2022-07-01 | 2023-04-07 | Zhejiang University | Livestock and poultry body temperature monitoring method combining wearable sensors and infrared cameras |
WO2024001791A1 (en) * | 2022-07-01 | 2024-01-04 | Zhejiang University | Livestock and poultry body temperature monitoring system and method with wearable sensors and infrared cameras |
CN115937791A (en) * | 2023-01-10 | 2023-04-07 | South China Agricultural University | Poultry counting method and device suitable for multiple breeding modes |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111626985A (en) | Poultry body temperature detection method based on image fusion and poultry house inspection system | |
Virlet et al. | Field Scanalyzer: An automated robotic field phenotyping platform for detailed crop monitoring | |
CN109977813A (en) | Inspection robot target localization method based on a deep learning framework | |
CN110567964B (en) | Method and device for detecting defects of power transformation equipment and storage medium | |
CN102081039A (en) | Environment-controllable hyperspectral image detecting device for crop nutrition and moisture | |
CN114037552B (en) | Method and system for polling physiological growth information of meat ducks | |
CN103489006A (en) | Computer vision-based rice disease, pest and weed diagnostic method | |
CN114898238B (en) | Wild animal remote sensing identification method and device | |
US20220307971A1 (en) | Systems and methods for phenotyping | |
CN111696139A (en) | System and method for estimating group weight of white feather breeding hens based on RGB image | |
CN114847196B (en) | Intelligent beehive and bee identification tracking counting system based on deep learning | |
CN111028378A (en) | Unmanned aerial vehicle inspection system and inspection method for fishery-photovoltaic complementary power stations | |
CN114898405B (en) | Portable broiler chicken anomaly monitoring system based on edge computing | |
CN211746249U (en) | Device for identifying chickens in cage | |
CN114460080A (en) | Rice disease and pest intelligent monitoring system | |
CN118470550B (en) | Natural resource asset data acquisition method and platform | |
CN116593404A (en) | Hyperspectral remote sensing monitoring data processing system based on surface water pollution factors | |
CN116646082A (en) | Disease early-warning and traceability system based on abnormal chicken manure in stacked cage-rearing mode | |
CN116189076A (en) | Observation and identification system and method for bird observation station | |
CN115656202A (en) | Multiband optical detection device for surface state of insulator | |
KR20220129706A (en) | System for counting fruit quantity and method thereof | |
CN111096252A (en) | Device and method for identifying chickens in cage | |
CN115015495B (en) | Dynamic close-range miniature intelligent sensor for quality of growing spherical fruits and vegetables | |
KR20240111344A (en) | System and method for managing crops pest | |
KR102597253B1 (en) | Smart Farm System Through Video-based Crop Cultivation Growth Platform Construction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |