WO2021191976A1 - Weight estimation device, weight estimation method, and program - Google Patents
Weight estimation device, weight estimation method, and program
- Publication number
- WO2021191976A1 (Application PCT/JP2020/012751)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- estimation
- weight
- point cloud data
- pig
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G—PHYSICS
- G01—MEASURING; TESTING
- G01G—WEIGHING
- G01G9/00—Methods of, or apparatus for, the determination of weight, not provided for in groups G01G1/00 - G01G7/00
Definitions
- the present invention relates to a weight estimation device, a weight estimation method and a program.
- Livestock farmers have traditionally needed to keep track of the weight of the livestock they raise. This is because, for example, in domestic animals such as pigs, the value as meat may decrease when the body weight exceeds a certain level.
- One embodiment of the present invention has been made in view of the above points, and an object of the present invention is to estimate the body weight with high accuracy.
- The weight estimation device includes an acquisition means for acquiring three-dimensional point cloud data representing a depth value at each point of the weight estimation target, and an estimation means for estimating the weight of the estimation target from the point cloud data acquired by the acquisition means.
- the weight can be estimated with high accuracy.
- In the present embodiment, a weight estimation system 1 capable of estimating the weight of a pig with high accuracy will be described.
- In the present embodiment, a pig is assumed as an example of livestock whose weight is estimated; however, the livestock is not limited to pigs and may be, for example, various livestock such as cows, horses, sheep, and chickens.
- Further, the body weight estimation target is not limited to livestock, and may be, for example, companion animals (so-called pets), wild animals, and the like.
- FIG. 1 is a diagram showing an example of the overall configuration of the weight estimation system 1 according to the present embodiment.
- the body weight estimation system 1 includes a body weight estimation device 10 for estimating body weight and a camera device 20 for photographing a pig P, which is a body weight estimation target.
- The weight estimation device 10 and the camera device 20 are communicably connected to each other, for example, wirelessly, by wire, or both.
- the camera device 20 is a digital camera, a smartphone, a tablet terminal, or the like used by a photographer who shoots a pig P. The photographer can obtain imaging data by photographing the pig P from above the pig P using the camera device 20.
- the camera device 20 may be a camera fixedly installed at a predetermined position.
- The camera device 20 is provided with a depth sensor, measures the depth of each point within the shooting range, and generates shooting data indicating the height of each of these points. Therefore, the shooting data generated by the camera device 20 is represented by a point cloud of (x, y, z), where each position in the shooting range is (x, y) and the depth value at each of these positions is z.
- the shooting data represented by the point cloud (x, y, z) will be referred to as “point cloud data”.
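- As a concrete illustration, point cloud data of this form can be held as an array of (x, y, z) triples. The following is a minimal sketch in Python; the file name and loading format are hypothetical and only stand in for whatever the camera device 20 actually outputs.

```python
import numpy as np

# Minimal sketch: point cloud data as an (N, 3) array of (x, y, z) values,
# where (x, y) is a position in the shooting range and z is the depth value.
# "pig_point_cloud.csv" is a hypothetical file name; the real format depends
# on the camera device 20.
point_cloud = np.loadtxt("pig_point_cloud.csv", delimiter=",")  # shape (N, 3)

x, y, z = point_cloud[:, 0], point_cloud[:, 1], point_cloud[:, 2]
print(f"{len(point_cloud)} points, depth range {z.min():.2f} to {z.max():.2f}")
```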
- the body weight estimation device 10 is a computer or computer system that estimates the body weight of the pig P from the point cloud data generated by the camera device 20.
- the body weight estimation device 10 estimates the body weight of the pig P by a regression equation calculated using the learning data prepared in advance.
- The training data is data for calculating the regression equation, and is represented, for example, by a set of point cloud data generated by photographing a pig P whose body weight is known and a correct weight indicating the weight of that pig P.
- the regression equation may be called a regression model or the like, and is an example of an estimation model for estimating the weight of the estimation target.
- the weight estimation device 10 includes a weight estimation processing unit 100 and a storage unit 110.
- the weight estimation processing unit 100 is realized by processing one or more programs installed in the weight estimation device 10 to be executed by a processor such as a CPU (Central Processing Unit).
- the storage unit 110 can be realized by using an auxiliary storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive), for example.
- The body weight estimation device 10 executes, by the weight estimation processing unit 100, a "learning process" of calculating a regression equation for estimating the weight of the pig P using the training data prepared in advance, and a "weight estimation process" of estimating the weight of the pig P by this regression equation. Further, the storage unit 110 stores information necessary for executing the learning process and the weight estimation process, as well as various processing results (for example, the training data, the regression equation, the point cloud data of the pig P whose weight is to be estimated, the weight estimation result, and the like).
- the overall configuration of the weight estimation system 1 shown in FIG. 1 is an example, and may be another configuration.
- the weight estimation system 1 may include a plurality of camera devices 20.
- a part or all of the weight estimation processing unit 100 may be realized by a function provided by a cloud service or the like.
- the shooting conditions when the photographer shoots the pig P using the camera device 20 will be described.
- the photographing screen 1000 shown in FIG. 2 is displayed on the display or the like of the camera device 20.
- a guide 1100 for adjusting the shooting position of the pig P is displayed on the shooting screen 1000.
- the guide 1100 includes a guide 1110 indicating the center of the photographing range.
- Here, the shooting direction is the positive direction of the z-axis, the downward direction of the shooting screen 1000 (that is, the head direction of the pig P) is the positive direction of the x-axis, and the right direction of the shooting screen 1000 (that is, the left side direction of the pig P) is the positive direction of the y-axis.
- The photographer moves the camera device 20 above the pig P, adjusts the position so that the contour of the pig P matches the frame indicated by the guide 1100, and then presses the shooting button 1200 to shoot the pig P.
- the point cloud data 2000 generated by photographing a certain pig P with the camera device 20 will be described.
- the point cloud data 2000 is a set of points (x, y, z) in the xyz space.
- In the point cloud data 2000, the head direction of the pig P is the positive direction of the x-axis, the left side direction of the pig P is the positive direction of the y-axis, and the shooting direction of the camera device 20 is the positive direction of the z-axis.
- FIG. 4 is a diagram showing an example of the functional configuration of the body weight estimation processing unit 100 according to the present embodiment.
- the acquisition unit 101 acquires (reads) the learning data stored in the storage unit 110 in the learning process.
- As described above, the learning data is data represented by a set of point cloud data generated by photographing a pig P whose body weight is known and a correct weight indicating the weight of that pig P.
- the acquisition unit 101 acquires the point cloud data generated by photographing the pig P to be weight-estimated from the storage unit 110 (or the camera device 20) in the weight estimation process.
- the pre-processing unit 102 performs a predetermined pre-processing on the point cloud data acquired by the acquisition unit 101 in the learning process and the weight estimation process.
- Examples of the preprocessing include processing of removing points other than the pig P portion when such points are included in the point cloud data, and resampling processing of resampling the point cloud of the pig P portion to convert it to a predetermined resolution.
- The preprocessing is processing for improving the accuracy of estimating the body weight of the pig, and does not necessarily have to be performed.
- the image creation unit 103 creates a projected image in which the point cloud data after the preprocessing is projected onto the xy plane in the learning process and the weight estimation process.
- The processing unit 104 performs processing for removing the head portion and the tail portion of the pig P from the point cloud data and the projected image after the preprocessing. This is because the head and the tail of the pig move during shooting, so that the head and tail portions may or may not be captured in the point cloud data, which adversely affects the weight estimation accuracy.
- the index value calculation unit 105 calculates a predetermined index value by using the point cloud data after processing and the projected image after processing in the learning process and the weight estimation process.
- In the present embodiment, the following four index values are calculated: the corrected projected area, the posture correction value, the shooting height correction value, and the ratio of the body length to the body width.
- the regression equation calculation unit 106 calculates the regression equation using the index value calculated by the index value calculation unit 105 and the correct body weight in the learning process.
- the regression equation is represented by the following equation (1).
- Estimated weight = a1 × (corrected projected area) + a2 × (posture correction value) + a3 × (shooting height correction value) + a4 × (body length) / (body width) + b ... (1)
- Here, a1, a2, a3, a4, and b are parameters of the regression equation.
- The regression equation calculation unit 106 calculates the regression equation by estimating these parameters a1, a2, a3, a4, and b by a known method (for example, the least squares method).
- When some of the index values are not used, the parameter corresponding to each unused index value (that is, the parameter by which that index value is multiplied) may be set to 0.
- the body weight estimation unit 107 estimates the weight of the pig P by the regression equation shown in the above equation (1) using the index value calculated by the index value calculation unit 105.
- The parameters a1, a2, a3, a4, and b estimated in the learning process are used.
- the preprocessing unit 102 performs a predetermined preprocessing on the point cloud data included in the learning data acquired in step S101 above (step S102).
- As the preprocessing for the point cloud data, it is assumed here that points other than the pig P portion are removed and that the point cloud of the pig P portion is resampled to convert it to a predetermined resolution.
- Here, the conversion to a predetermined resolution by resampling means resampling the points included in the point cloud data to obtain a three-dimensional point cloud having a predetermined interval between adjacent points (for example, a point cloud in which the interval between adjacent points is 1.0 cm).
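- The patent does not specify the resampling algorithm; the following is a minimal sketch of one common way to obtain a roughly uniform point spacing (for example, 1.0 cm), assuming a simple voxel-grid downsampling in which the points falling into each voxel are averaged.

```python
import numpy as np

def resample_to_grid(points: np.ndarray, spacing: float = 1.0) -> np.ndarray:
    """Downsample an (N, 3) point cloud so that adjacent points are roughly
    `spacing` apart (e.g., 1.0 cm), by averaging the points within each voxel.
    This voxel-grid approach is one possible realization of the resampling
    described above, not necessarily the one used in the embodiment."""
    voxel_idx = np.floor(points / spacing).astype(np.int64)
    # Group points by voxel and take the mean of each group.
    _, inverse = np.unique(voxel_idx, axis=0, return_inverse=True)
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(inverse.max() + 1)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]
```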
- Next, the image creation unit 103 creates a projected image by projecting onto the xy plane the point cloud data preprocessed in step S102 above (or, if no preprocessing is performed, the point cloud data included in the training data acquired in step S101 above) (step S103).
- An example of the projected image is shown in FIG. 6.
- the projected image 3000 shown in FIG. 6 is an image obtained by projecting the point cloud data 2000 onto a certain xy plane, and includes the pig region 3100 on which the point cloud of the pig P portion of the point cloud data 2000 is projected.
- Note that the point 3110 is the origin of the xy plane, onto which the point at the position indicated by the guide 1110 in FIG. 2 is projected.
- the image creation unit 103 may calculate a dividing line 3120 that divides the pig region 3100 in the projected image 3000 into left and right (left and right with respect to the traveling direction when the pig walks).
- The dividing line 3120 can be obtained, for example, as follows: when the x-coordinate values included in the pig region 3100 are x1, x2, ..., xM, a point on the dividing line is determined for each x-coordinate value xi (1 ≤ i ≤ M) from the y-coordinate values of the pig region 3100 at that xi.
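- The following sketch illustrates projecting the point cloud onto the xy plane as a binary image and deriving a dividing line from it. The midpoint rule used for the dividing line is an assumption made here for illustration; the embodiment only requires that the line divide the pig region 3100 into left and right.

```python
import numpy as np

def project_to_xy(points: np.ndarray, resolution: float = 1.0) -> np.ndarray:
    """Project an (N, 3) point cloud onto the xy plane as a binary image
    (True where at least one point falls in the cell)."""
    ij = np.floor(points[:, :2] / resolution).astype(np.int64)
    ij -= ij.min(axis=0)                      # shift so indices start at 0
    image = np.zeros(ij.max(axis=0) + 1, dtype=bool)
    image[ij[:, 0], ij[:, 1]] = True
    return image

def dividing_line(region: np.ndarray) -> list[tuple[int, float]]:
    """For each x (row) that intersects the region, return the midpoint of the
    occupied y range. Using the midpoint is an assumption for illustration;
    the text only states that the dividing line splits the pig region into
    left and right."""
    line = []
    for xi in range(region.shape[0]):
        ys = np.flatnonzero(region[xi])
        if ys.size:
            line.append((xi, (ys.min() + ys.max()) / 2.0))
    return line
```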
- Next, the processing unit 104 performs processing to remove the head portion and the tail portion of the pig P using the point cloud data preprocessed in step S102 above (or, if no preprocessing is performed, the point cloud data included in the learning data acquired in step S101 above) and the projected image created in step S103 above (step S104).
- the processing unit 104 performs the processing according to the following Steps 1 to 7. In the following, as an example, a case where the point cloud data 2000 shown in FIG. 3 and the projected image 3000 shown in FIG. 6 are processed will be described.
- Step 2: Next, the processing unit 104 estimates a circle S11 that fits the points that are on the boundary of the pig region 3100 and are included in the search region R11 (that is, a circle that approximates this point group). Similarly, the processing unit 104 estimates a circle S21 that fits the points that are on the boundary of the pig region 3100 and are included in the search region R21. Hereinafter, these circles will be referred to as "fit circle candidates".
- the fit circle candidate can be estimated by the least squares method. For example, refer to "https://myenigma.hatenablog.com/entry/2015/1507/214600".
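- For reference, a least-squares circle fit can be written compactly by solving the linear system of the algebraic circle equation; the sketch below is one such formulation and is not necessarily identical to the method at the linked page.

```python
import numpy as np

def fit_circle(points_xy: np.ndarray):
    """Least-squares circle fit to an (N, 2) set of boundary points, as one
    possible realization of the fit circle candidate estimation.
    Solves x^2 + y^2 + a*x + b*y + c = 0 for (a, b, c)."""
    x, y = points_xy[:, 0], points_xy[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x ** 2 + y ** 2)
    (a, b, c), residuals, _, _ = np.linalg.lstsq(A, rhs, rcond=None)
    center = (-a / 2.0, -b / 2.0)
    radius = np.sqrt(center[0] ** 2 + center[1] ** 2 - c)
    error = residuals[0] if residuals.size else 0.0  # sum-of-squares fitting error
    return center, radius, error
```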
- Step 3: Next, the processing unit 104 slides the search regions in the x-axis direction. Here, R1j := {(x, y) | x1j ≤ x ≤ x1j + L} and R2j := {(x, y) | x2j − L ≤ x ≤ x2j}, and d1, d2 > 0 are preset values.
- That is, the processing unit 104 slides the search region R1(j−1) by d1 in the x-axis direction to obtain the search region R1j, and slides the search region R2(j−1) by d2 in the x-axis direction to obtain the search region R2j.
- Step 4: Next, the processing unit 104 estimates a circle S1j that fits the points that are on the boundary of the pig region 3100 and are included in the search region R1j. Similarly, the processing unit 104 estimates a circle S2j that fits the points that are on the boundary of the pig region 3100 and are included in the search region R2j.
- Steps 3 to 4 described above are repeated, for example, until x2j − L ≤ X1 and X2 ≤ x1j + L are satisfied, where X1 ≤ x ≤ X2 is the range of values that the x-coordinate can take in the projected image 3000. As a result, sets of fit circle candidates {S1j} and {S2j} are obtained.
- Step 5: Then, the processing unit 104 selects, from the set of fit circle candidates {S1j}, the fit circle S1 for removing the head portion, and selects, from the set of fit circle candidates {S2j}, the fit circle S2 for removing the tail portion.
- A method of selecting the fit circle S1 will be described. For example, among the fit circle candidates {S1j} whose fitting error takes a minimum value, the candidate with the largest x-coordinate (that is, the rightmost such candidate) may be selected as the fit circle S1.
- Similarly, when selecting the fit circle S2, among the fit circle candidates {S2j} whose fitting error takes a minimum value (or whose change in error is smallest), the candidate with the smallest x-coordinate (that is, the leftmost such candidate) is selected as the fit circle S2.
- Step 6: Then, the processing unit 104 removes the head portion and the tail portion of the pig region 3100 by using the fit circles S1 and S2 obtained in Step 5 above, respectively.
- That is, the processing unit 104 sets, as T1, the straight line that is parallel to the y-axis and tangent to the fit circle S1 and has the larger x-coordinate value, and deletes the region of the pig region 3100 whose x-coordinate values are larger than that of the straight line T1. As a result, the head portion of the pig region 3100 is removed.
- Similarly, the processing unit 104 sets, as T2, the straight line that is parallel to the y-axis and tangent to the fit circle S2 and has the smaller x-coordinate value, and deletes the region of the pig region 3100 whose x-coordinate values are smaller than that of the straight line T2. As a result, the tail portion of the pig region 3100 is removed.
- Alternatively, for example, instead of using the straight line T1, the region whose x-coordinate values are larger than those of the points on the circumference of the fit circle S1 having x-coordinate values larger than the center of the fit circle S1 may be deleted. Similarly, instead of using the straight line T2, the region whose x-coordinate values are smaller than those of the points on the circumference of the fit circle S2 having x-coordinate values smaller than the center of the fit circle S2 may be deleted.
- In the above description, both the head portion and the tail portion of the pig region 3100 are removed; however, instead of removing both, only one of them may be removed. Further, when the area of the region to be removed (deleted) is equal to or less than a predetermined threshold value, that region may not be removed.
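- A minimal sketch of the removal in Step 6, assuming the fit circles S1 and S2 are given as center/radius pairs: the tangent line T1 parallel to the y-axis with the larger x value is x = cx1 + r1, and T2 is x = cx2 − r2, so the removal reduces to thresholding on the x-coordinate.

```python
import numpy as np

def remove_head_and_tail(points: np.ndarray,
                         head_circle: tuple, tail_circle: tuple) -> np.ndarray:
    """Remove the head and tail portions from an (N, 3) point cloud using the
    fit circles S1 (head side) and S2 (tail side). Each circle is given as
    ((cx, cy), radius). The tangent line T1 parallel to the y-axis with the
    larger x value is x = cx1 + r1; T2 is x = cx2 - r2."""
    (cx1, _), r1 = head_circle
    (cx2, _), r2 = tail_circle
    t1 = cx1 + r1   # straight line T1 (head side)
    t2 = cx2 - r2   # straight line T2 (tail side)
    keep = (points[:, 0] <= t1) & (points[:, 0] >= t2)
    return points[keep]
```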
- Step 7: Finally, the processing unit 104 removes from the point cloud data 2000 the point cloud corresponding to the portions (the head portion and the tail portion) removed in Step 6 above, that is, the points that were projected onto the head portion and the tail portion when the projected image 3000 was created.
- the index value calculation unit 105 calculates the corrected projected area, the posture correction value, the shooting height correction value, and the body length and body width as index values (step S105). The calculation method of each index value will be described below.
- For example, the sides in the x-axis direction of the unit regions V1 to V5 are corrected by 1.0 / sin θ (that is, multiplied by 1.0 / sin θ, for example about 1.12), and the corrected unit regions are denoted W1 to W5.
- Then, the area of the pig region 3100 is calculated from the areas of the corrected unit regions W1 to W5. This area is the corrected projected area.
- In the above description, the sides in the x-axis direction of the unit regions are corrected; however, the sides in the y-axis direction may be corrected instead, or both the sides in the x-axis direction and the sides in the y-axis direction may be corrected.
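- In code, once each unit region's area and inclination angle θ are known, the corrected projected area reduces to a weighted sum. The sketch below treats the per-region angles as given inputs, since the way θ is determined follows the figures of the embodiment, which are not reproduced here.

```python
import numpy as np

def corrected_projected_area(unit_areas: np.ndarray, thetas: np.ndarray) -> float:
    """Corrected projected area: the x-axis side of each unit region V_k is
    multiplied by 1.0 / sin(theta_k), so each unit area is scaled by the same
    factor, and the scaled areas are summed. How theta is obtained for each
    unit region is treated here as a given input."""
    return float(np.sum(unit_areas / np.sin(thetas)))
```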
- the posture correction value is calculated as an index value.
- Let f be a polynomial function that approximates the points (x, y, z) corresponding to the points (x, y) on the dividing line 3120 described above. That is, let f be a polynomial function such that z ≈ f(x, y), where (x, y) is a point on the dividing line 3120.
- a function f can be obtained by, for example, a library such as polyfit.
- a predetermined coefficient of the polynomial function f (for example, a quadratic coefficient of x, a quadratic coefficient of y, a coefficient of xy, etc.) is used as the posture correction value.
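- A minimal sketch of computing a posture correction value: fit z ≈ f(x, y) with a quadratic polynomial over the dividing-line points by least squares and take the coefficient of x². The full quadratic basis used here is an assumption; the text only states that f is a polynomial function and that a predetermined coefficient is used.

```python
import numpy as np

def posture_correction_value(points_xyz: np.ndarray) -> float:
    """Fit z ~ f(x, y) with a quadratic polynomial over the points (x, y, z)
    corresponding to the dividing line, and return the quadratic coefficient
    of x as the posture correction value. Other coefficients (y^2, xy) could
    be used instead, as the text notes."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    # Basis: [x^2, y^2, x*y, x, y, 1]
    A = np.column_stack([x ** 2, y ** 2, x * y, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return float(coeffs[0])  # quadratic coefficient of x
```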
- The distance from the camera device 20 to the pig P varies depending on the height of the photographer, the length of the photographer's arm, and the like, which may affect the accuracy of weight estimation. Therefore, the shooting height correction value is calculated as an index value.
- the body length and width are calculated as index values.
- For example, a straight line 3130 that passes through the point 3110 (that is, the origin of the xy plane), is orthogonal to the dividing line 3120, and whose ends are points on the boundary of the pig region 3100 is calculated, and the length of this straight line 3130 may be used as the body width.
- Further, for example, the length of the curve 2110 represented by the polynomial function f that approximates the points (x, y, z) corresponding to the points (x, y) on the dividing line 3120 may be used as the body length.
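- As a sketch, the body length can be approximated as the arc length of the polyline through the 3D points along the dividing line; this polyline approximates the curve 2110 that the fitted polynomial f represents.

```python
import numpy as np

def body_length(curve_points_xyz: np.ndarray) -> float:
    """Approximate the body length as the arc length of the polyline through
    the 3D points corresponding to the dividing line (a discrete stand-in for
    measuring the length of the fitted curve 2110)."""
    diffs = np.diff(curve_points_xyz, axis=0)
    return float(np.sum(np.linalg.norm(diffs, axis=1)))
```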
- Next, the regression equation calculation unit 106 calculates the regression equation using the index values and the correct body weight corresponding to each piece of learning data (step S106). That is, the regression equation calculation unit 106 initializes the parameters a1, a2, a3, a4, and b to appropriate initial values, and then, for each piece of learning data, calculates the estimated body weight by equation (1) above using the index values corresponding to that learning data. Then, the regression equation calculation unit 106 calculates the difference between the estimated body weight and the correct body weight corresponding to the learning data, and estimates the parameters a1, a2, a3, a4, and b so as to minimize the sum of the squares of the differences calculated for the respective pieces of learning data. As a result, the regression equation for estimating the body weight of the pig P is calculated.
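- Minimizing the sum of squared differences in step S106 is an ordinary least-squares problem, so the parameters can be obtained in closed form. The sketch below assumes the four index values are stacked into an (N, 4) array, with the fourth column already being (body length) / (body width).

```python
import numpy as np

def fit_regression(index_values: np.ndarray, correct_weights: np.ndarray) -> np.ndarray:
    """Estimate the parameters a1..a4, b of equation (1) by least squares.
    `index_values` is an (N, 4) array whose columns are the corrected projected
    area, posture correction value, shooting height correction value, and
    (body length) / (body width) for each training sample."""
    A = np.column_stack([index_values, np.ones(len(index_values))])
    params, *_ = np.linalg.lstsq(A, correct_weights, rcond=None)
    return params  # [a1, a2, a3, a4, b]

def estimate_weight(params: np.ndarray, index_values: np.ndarray) -> np.ndarray:
    """Apply equation (1): estimated weight = a1*v1 + a2*v2 + a3*v3 + a4*v4 + b."""
    return index_values @ params[:4] + params[4]
```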
- FIG. 13 is a flowchart showing an example of the weight estimation process according to the present embodiment.
- As the parameters a1, a2, a3, a4, and b of the regression equation shown in equation (1) above, those estimated by the learning process described above are used.
- the acquisition unit 101 acquires the point cloud data generated by photographing the pig P whose body weight is to be estimated from the storage unit 110 (or the camera device 20) (step S201).
- the preprocessing unit 102 performs a predetermined preprocessing on the point cloud data acquired in the above step S201 in the same manner as in step S102 of FIG. 5 (step S202).
- Next, similarly to step S103 of FIG. 5, the image creation unit 103 creates a projected image by projecting onto the xy plane the point cloud data preprocessed in step S202 above (or, if no preprocessing is performed, the point cloud data acquired in step S201 above) (step S203).
- Next, similarly to step S104 of FIG. 5, the processing unit 104 performs processing to remove the head portion and the tail portion of the pig P using the point cloud data preprocessed in step S202 above (or, if no preprocessing is performed, the point cloud data acquired in step S201 above) and the projected image created in step S203 above (step S204).
- the index value calculation unit 105 calculates the corrected projected area, the posture correction value, the shooting height correction value, and the body length and body width as index values (step S205).
- Then, the body weight estimation unit 107 estimates the weight of the pig P by the regression equation shown in equation (1) above, using the index values calculated in step S205 above (step S206). The weight of the pig P is thereby estimated.
- 1 Weight estimation system, 10 Weight estimation device, 20 Camera device, 100 Weight estimation processing unit, 101 Acquisition unit, 102 Preprocessing unit, 103 Image creation unit, 104 Processing unit, 105 Index value calculation unit, 106 Regression equation calculation unit, 107 Weight estimation unit, 110 Storage unit
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/012751 WO2021191976A1 (ja) | 2020-03-23 | 2020-03-23 | Weight estimation device, weight estimation method, and program |
JP2022509787A JPWO2021191976A1 | 2020-03-23 | 2020-03-23 | |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/012751 WO2021191976A1 (ja) | 2020-03-23 | 2020-03-23 | Weight estimation device, weight estimation method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021191976A1 true WO2021191976A1 (ja) | 2021-09-30 |
Family
ID=77891082
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/012751 WO2021191976A1 (ja) | 2020-03-23 | 2020-03-23 | 体重推定装置、体重推定方法及びプログラム |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2021191976A1 |
WO (1) | WO2021191976A1 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114001810A (zh) * | 2021-11-08 | 2022-02-01 | 厦门熵基科技有限公司 | Weight calculation method and device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012519277A (ja) * | 2009-02-27 | 2012-08-23 | Body Surface Translations, Inc. | Estimation of physical parameters using three-dimensional representations |
US20130064432A1 (en) * | 2010-05-19 | 2013-03-14 | Thomas Banhazi | Image analysis for making animal measurements |
JP2019045478A (ja) * | 2017-09-06 | 2019-03-22 | University of Miyazaki | Livestock weight estimation device and livestock weight estimation method |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61114108A (ja) * | 1984-11-09 | 1986-05-31 | Ishida Scales Mfg Co Ltd | Weight determination method |
JP2002286421A (ja) * | 2001-03-28 | 2002-10-03 | Hideo Minagawa | Animal weight measuring device |
WO2010127277A2 (en) * | 2009-05-01 | 2010-11-04 | Texas Tech University System | Remote contactless stereoscopic mass estimation system |
US8755570B2 (en) * | 2011-04-27 | 2014-06-17 | Steve Gomas | Apparatus and method for estimation of livestock weight |
JP6083638B2 (ja) * | 2012-08-24 | 2017-02-22 | University of Miyazaki | Animal body weight estimation device and weight estimation method |
JP6559197B2 (ja) * | 2017-09-01 | 2019-08-14 | NTT TechnoCross Corporation | Weight output device, weight output method, and program |
JP7284500B2 (ja) * | 2018-04-25 | 2023-05-31 | 株式会社ノア | Weight estimation device |
JP7057971B2 (ja) * | 2018-06-06 | 2022-04-21 | National Federation of Agricultural Cooperative Associations | Animal body weight estimation device and weight estimation method |
2020
- 2020-03-23 JP JP2022509787A patent/JPWO2021191976A1/ja active Pending
- 2020-03-23 WO PCT/JP2020/012751 patent/WO2021191976A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2021191976A1 | 2021-09-30 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20926782; Country of ref document: EP; Kind code of ref document: A1
 | ENP | Entry into the national phase | Ref document number: 2022509787; Country of ref document: JP; Kind code of ref document: A
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 20926782; Country of ref document: EP; Kind code of ref document: A1