WO2022202793A1 - Growth condition evaluation system - Google Patents


Publication number
WO2022202793A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
animal
unit
point cloud
evaluation system
Prior art date
Application number
PCT/JP2022/013100
Other languages
English (en)
Japanese (ja)
Inventor
裕 竹村
彩乃 矢羽田
俊和 阿出川
宗央 横山
Original Assignee
学校法人東京理科大学
株式会社トプコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 学校法人東京理科大学 and 株式会社トプコン
Publication of WO2022202793A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Definitions

  • This disclosure relates to a growing condition evaluation system.
  • A BCS (Body Condition Score) is used to evaluate the condition of an animal. A health condition estimating device is known that can determine the BCS automatically, without touching the animal (see Patent Document 1, for example).
  • The health condition estimating device obtains a three-dimensional coordinate group indicating the three-dimensional shape of the animal using a range image sensor, derives from that coordinate group feature values indicating the width of the animal's body and the position of its spine, and calculates the BCS based on those feature values. The conventional health condition estimating device can therefore obtain a BCS-based evaluation with reduced variation while reducing the burden on the animal.
  • However, because a range image sensor is used to acquire the group of three-dimensional coordinates, the acquired coordinate group reproduces the three-dimensional shape of the animal only to a limited degree, and it is difficult for the conventional health condition estimating device to obtain an appropriate evaluation.
  • The present disclosure has been made in view of the above circumstances, and aims to provide a growing condition evaluation system that can appropriately obtain an evaluation of the growing condition of an animal.
  • The growing condition evaluation system of the present disclosure includes: a laser measuring device that acquires point cloud data representing the outline of an animal in three-dimensional coordinates by receiving the reflected light of an emitted laser beam from the animal; a surface data generation unit that generates three-dimensional surface data of the animal based on the point cloud data; an approximate curve calculation unit that calculates an approximate curve that fits the three-dimensional surface data; and a growing condition evaluation unit that generates evaluation data indicating an evaluation of the animal based on the approximate curve.
  • According to the growing condition evaluation system of the present disclosure, it is possible to appropriately obtain an evaluation of the animal's growing condition.
  • FIG. 1 is an explanatory diagram showing the overall configuration of the growing condition evaluation system of Example 1 as an example of a growing condition evaluation system according to the present disclosure.
  • FIG. 2 is a block diagram showing the configuration of the control system in the growing condition evaluation system.
  • FIG. 3 is an explanatory diagram showing how the imaging device in the growing condition evaluation system is attached.
  • FIG. 4 is an explanatory diagram showing an image of the dairy cow captured by the imaging device.
  • FIG. 5 is an explanatory diagram showing the laser measuring device in the growing condition evaluation system.
  • FIG. 6 is a block diagram showing the configuration of the control system in the laser measuring device.
  • FIG. 7 is an explanatory diagram showing the point cloud data acquired by one (the first) laser measuring device.
  • FIG. 8 is an explanatory diagram showing the point cloud data acquired by the other (the second) laser measuring device.
  • FIG. 9 is an explanatory diagram showing synthesized point cloud data obtained by synthesizing both point cloud data.
  • FIG. 10 is an explanatory diagram showing animal point cloud data representing the dairy cow.
  • FIG. 11 is an explanatory diagram showing three-dimensional surface data obtained by meshing the animal point cloud data.
  • FIG. 12 is an explanatory diagram showing buttock-vicinity surface data obtained by cutting out the vicinity of the buttocks from the three-dimensional surface data.
  • FIG. 13 is an explanatory diagram showing the slice positions for cutting out buttock-vicinity cross-sectional data from the buttock-vicinity surface data.
  • FIG. 14 is an explanatory diagram showing buttock-vicinity cross-sectional data, in which the data corresponding to the slice positions in FIG. 13 are arranged in order from the left.
  • FIG. 15 is an explanatory diagram showing how an approximate curve of the contour of the buttock-vicinity cross-sectional data is obtained.
  • FIG. 16 is an explanatory diagram showing an example of buttock-vicinity cross-sectional data (its contour) and an approximated curve.
  • FIG. 17 is a flowchart showing the growing condition evaluation process (growing condition evaluation processing method) performed by the control mechanism of the growing condition evaluation system.
  • Example 1 of the growing condition evaluation system 10 as one embodiment of the growing condition evaluation system according to the present disclosure will be described below with reference to FIGS. 1 to 17 .
  • FIGS. 1 and 3 schematically show the vicinity of the drinking fountain 52 (data acquisition area 14) in the barn 50 and do not necessarily match the actual state of the barn 50.
  • the growth status evaluation system 10 automatically evaluates the growth status of animals.
  • The growing condition evaluation system 10 of Example 1 evaluates the growing condition of a Holstein cow (hereinafter, dairy cow 51) as an example of an animal.
  • This growing condition evaluation system 10 is installed in a cow barn 50 as shown in FIGS.
  • a plurality of dairy cows 51 are kept in the cowshed 50, and each dairy cow 51 can be moved.
  • a cowshed 50 is provided with a drinking fountain 52 for a dairy cow 51 .
  • the drinking fountain 52 is constructed by placing a water tub 53 in a space in which a plurality of dairy cows 51 can enter.
  • the water tub 53 is elongated and arranged at one corner of the drinking fountain 52 .
  • a dairy cow 51 periodically visits the drinking fountain 52 of its own volition and stops in front of the water tub 53 . Therefore, in the growing condition evaluation system 10 , the drinking fountain 52 is set as the data acquisition area 14 .
  • The control mechanism 11 comprehensively controls the operation of the growing condition evaluation system 10 by loading a program stored in the storage unit 18 or the built-in internal memory 11a into a RAM (Random Access Memory), for example, and executing it.
  • the internal memory 11a is composed of a RAM or the like
  • the storage unit 18 is composed of a ROM (Read Only Memory), an EEPROM (Electrically Erasable Programmable ROM), or the like.
  • The growing condition evaluation system 10 is also provided, as appropriate, with a printer that prints the measurement results in response to a measurement completion signal or an instruction from the measurer, an output unit that outputs the measurement results to an external memory or a server, and an audio output unit that announces the operating status and the like.
  • The control mechanism 11 is connected to a camera 12 and two laser measuring devices 13 (designated first and second in FIG. 2), controls them as appropriate, and can receive signals (data) from them. This connection may be wired or wireless as long as signals can be exchanged with the camera 12 and each laser measuring device 13.
  • the control mechanism 11 may be provided at a position different from the cowshed 50 or may be provided inside the cowshed 50 .
  • An operation unit 15 , a display unit 16 , a communication unit 17 and a storage unit 18 are connected to the control mechanism 11 .
  • the operation unit 15 is for operating various settings for evaluating the growth situation, operations and settings of the camera 12 and each laser measuring device 13, and the like.
  • the operation unit 15 may be composed of an input device such as a keyboard and a mouse, or may be composed of software keys displayed on the display screen of the display unit 16 of a touch panel type.
  • The display unit 16 displays the image I acquired by the camera 12 (a still image or a moving image; see FIG. 4), the point cloud data D1 acquired by each laser measuring device 13 (see FIGS. 7 and 8), various other data (see FIGS. 9 to 14, etc.), the evaluation data Ev indicating the evaluation of the animal (dairy cow 51), and the like.
  • the display unit 16 is configured by a liquid crystal display device (LCD monitor) as an example, and is provided in the control mechanism 11 together with the operation unit 15 .
  • the operation unit 15 and the display unit 16 may be composed of a mobile terminal such as a smart phone or a tablet, and are not limited to the configuration of the first embodiment.
  • The communication unit 17 communicates with the camera 12, each laser measuring device 13 (its communication unit 45), and external devices, enabling the camera 12 and each laser measuring device 13 to be driven, the image I to be received from the camera 12, and the point cloud data D1 to be received from each laser measuring device 13.
  • the control mechanism 11 can be configured by a tablet terminal, and in that case, the operation unit 15 and the display unit 16 can be configured integrally with the control mechanism 11 .
  • The camera 12 is capable of photographing the entire data acquisition area 14; in Example 1, a 4K camera (with a resolution of 3840 × 2160 pixels) is used. As shown in FIGS. 1 and 3, the camera 12 is attached to an installation plate 54 above the drinking fountain 52, which is the data acquisition area 14, so that it can photograph the drinking fountain 52 without disturbing the dairy cows 51. The installation plate 54 is provided on a post 55 of the barn 50 for installation of the camera 12.
  • The camera 12 continuously captures the drinking fountain 52 while the growing condition evaluation system 10 is in operation, and outputs the captured image I (its data; see FIG. 4) to the control mechanism 11.
  • The two laser measuring devices 13 are installed at known points. Each projects a pulsed laser beam toward a measurement point, receives the reflected light (pulse reflected light) of the pulsed laser beam from the measurement point, performs distance measurement for each pulse, and averages the distance measurement results to achieve highly accurate distance measurement.
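The pulse time-of-flight principle described here (a distance from each pulse's round-trip time, averaged over many pulses) can be sketched as follows. The function names and the simple arithmetic mean are illustrative assumptions, not the instrument's actual firmware.

```python
# Sketch of pulsed time-of-flight ranging with averaging, as described above.
# All names are illustrative; the instrument's real processing is not disclosed.

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(delta_t_s):
    """Distance to the reflection point from one pulse's round-trip time (s)."""
    return C * delta_t_s / 2.0

def averaged_distance(round_trip_times_s):
    """Average the per-pulse distances to reduce noise, as the text describes."""
    distances = [distance_from_round_trip(t) for t in round_trip_times_s]
    return sum(distances) / len(distances)
```

For example, a 20 ns round trip corresponds to roughly 3 m of range.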
  • Each laser measuring device 13 scans the set measurement range and sets measurement points evenly over the entire range, thereby measuring the surface shape of the objects in the measurement range three-dimensionally and obtaining a collection of three-dimensional position data expressed in three-dimensional coordinates (hereinafter, point cloud data D1).
  • each laser measuring device 13 can acquire the point cloud data D1 representing the surface shape of the object existing within the set measurement range.
  • The two laser measuring devices 13 are placed in positions that form a pair across the drinking fountain 52, which is the data acquisition area 14, and at a height that does not interfere with the movement of the dairy cow 51. The positional relationship of both laser measuring devices 13 with respect to the drinking fountain 52 is set so that they can acquire point cloud data D1 of at least half of the body (left or right) in the vicinity of the buttocks 51a of the dairy cow 51, regardless of the posture of the cow 51 in the drinking fountain 52 (data acquisition area 14).
  • These two laser measuring instruments 13 have the same configuration, except that they are provided at different positions.
  • the laser measuring device 13 may adopt a phase difference measuring method using a light beam modulated at a predetermined frequency, or may adopt another method, and is not limited to the first embodiment.
  • the laser measuring instrument 13 includes a base 31, a body portion 32, and a vertical rotating portion 33, as shown in FIGS.
  • the pedestal 31 is a portion that is attached to the installation base 34 .
  • the installation base 34 is attached to a post 55 of the barn 50 for installation of the laser measuring device 13 .
  • the body portion 32 is provided on the base 31 so as to be rotatable about the vertical axis with respect to the base 31 .
  • The body portion 32 has a U shape as a whole, and the vertical rotating portion 33 is provided in the recess between its two arms.
  • a measuring instrument display section 35 and a measuring instrument operating section 36 are provided in the main body section 32 .
  • the measuring instrument display section 35 is a section that displays various operation icons, settings, etc. for measurement under the control of the measuring instrument control section 43, which will be described later.
  • the measuring device operation unit 36 is a place where operations for using and setting various functions of the laser measuring device 13 are performed, and information that has been input and operated is output to the measuring device control unit 43 .
  • various operation icons displayed on the measuring instrument display section 35 function as the measuring instrument operating section 36 .
  • By giving the functions of the measuring device display unit 35 and the measuring device operation unit 36 to the operation unit 15 and the display unit 16 of the growing condition evaluation system 10, the laser measuring device 13 can be operated remotely while remaining installed on the installation base 34 on the post 55. As will be described later, as long as the laser measuring device 13 can scan the data acquisition area 14 (and an animal such as the dairy cow 51 there), its location and installation method may be set as appropriate, and they are not limited to the configuration of the first embodiment.
  • the vertical rotation part 33 is provided on the main body part 32 so as to be rotatable around a rotation axis extending in the horizontal direction.
  • a range finding optical unit 37 is incorporated in the vertical rotation unit 33 .
  • the distance measuring optical unit 37 projects a pulsed laser beam as distance measuring light and receives reflected light (pulse reflected light) from the measuring point to measure the light wave distance to the measuring point.
  • A horizontal rotation drive section 38 and a horizontal angle detection section 39 are provided in the body section 32, which supports the vertical rotation section 33 so that it can rotate around the horizontal axis.
  • the horizontal rotation driving section 38 rotates the body section 32 with respect to the pedestal 31 around the vertical axis, that is, in the horizontal direction.
  • the horizontal angle detection unit 39 detects (measures) the horizontal angle in the collimation direction by detecting the horizontal rotation angle of the main body 32 with respect to the pedestal 31 .
  • The horizontal rotation drive section 38 can be configured with a motor, and the horizontal angle detection section 39 can be configured with an encoder.
  • the body portion 32 is provided with a vertical rotation drive portion 41 and a vertical angle detection portion 42 .
  • the vertical rotation drive section 41 rotates the vertical rotation section 33 with respect to the body section 32 around the horizontal axis, that is, in the vertical direction.
  • the vertical angle detection unit 42 detects (measures) the vertical angle of the collimation direction by detecting the vertical angle of the vertical rotation unit 33 with respect to the main unit 32 .
  • The vertical rotation drive unit 41 can be configured with a motor, and the vertical angle detection unit 42 can be configured with an encoder.
  • the main body part 32 incorporates a measuring device control part 43 .
  • the measuring device control unit 43 controls the operation of the laser measuring device 13 by a program stored in the connected storage unit 44 .
  • The storage unit 44 is configured by a semiconductor memory or various storage media, stores programs such as the calculation program necessary for measurement and the data transmission program for generating and transmitting information, and also appropriately stores the acquired point cloud data D1. The information and data are appropriately transmitted to the control mechanism 11 via the communication section 45 described later and the communication section 17 described above (see FIG. 2).
  • To the measuring instrument control section 43, the measuring instrument display section 35, the measuring instrument operation section 36, the distance measuring optical section 37, the horizontal rotation driving section 38, the horizontal angle detecting section 39, the vertical rotation driving section 41, the vertical angle detecting section 42, the storage unit 44, and the communication unit 45 are connected.
  • The communication unit 45 enables communication between the control mechanism 11 (see FIG. 2) and the measuring device control unit 43 via the communication unit 17, and transmits the data and information stored in the storage unit 44 under the control of the measuring device control unit 43.
  • the communication unit 45 enables exchange of data and the like with the control mechanism 11 (communication unit 17).
  • the communication unit 45 may perform wired communication with the communication unit 17 via a laid LAN cable, or may perform wireless communication with the communication unit 17 .
  • Output values for measurement from the distance measurement optical unit 37, the horizontal angle detection unit 39, and the vertical angle detection unit 42 are input to the measuring device control unit 43. Based on these output values, the measuring instrument control unit 43 measures the distance to the measurement point (reflection point) from the arrival time difference or phase difference between the reference light propagating through the reference optical path provided in the main unit 32 and the reflected light acquired via the distance measuring optical unit 37. In addition, the measuring device control unit 43 measures (calculates) the elevation angle and horizontal angle at the time of distance measurement. The measuring device control section 43 then stores these measurement results in the storage section 44 and appropriately transmits them to the control mechanism 11 (communication section 17) via the communication section 45.
  • the measuring device control unit 43 controls the driving of the horizontal rotation driving unit 38 and the vertical rotation driving unit 41 to appropriately rotate the main body unit 32 and the vertical rotation unit 33 (see FIG. 1), so that the vertical rotation unit 33 can be directed in a predetermined direction and can scan a predetermined range.
  • The measuring device control unit 43 of the first embodiment scans the drinking fountain 52, which is the data acquisition area 14, and takes each position of the drinking fountain 52, including the dairy cow 51 there, as a measurement point.
  • the positional relationship between the two laser measuring devices 13 and the drinking fountain 52 is known in advance and is constant (does not change).
  • the measuring device control unit 43 of the first embodiment sets measurement points at intervals of 12.5 mm on the scanning plane.
  • the scanning plane can be appropriately set within the drinking fountain 52, and in Example 1, it is set at a position where the vicinity 51a of the buttocks of the dairy cow 51 is assumed to be located.
  • The measuring device control unit 43 controls the distance measuring optical unit 37 to perform distance measurement for each set measurement point while scanning the drinking fountain 52. At this time, the measuring device control unit 43 measures (calculates) the elevation angle and horizontal angle in the sighting direction, thereby measuring the three-dimensional coordinate position of each measurement point of the drinking fountain 52. The measuring device control unit 43 then generates the point cloud data D1 (one example is shown in FIG. 7 and the other in FIG. 8) and appropriately transmits it to the control mechanism 11 via the communication unit 45 and the communication unit 17.
  • the point cloud data D1 indicates the surface shape of the drinking fountain 52, and if the dairy cow 51 is present at the drinking fountain 52, the surface shape of the dairy cow 51 is also included.
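Converting each measurement (a distance plus horizontal and elevation angles) into a three-dimensional coordinate position is a standard polar-to-Cartesian transform. The sketch below assumes a z-up frame with the elevation angle measured from the horizontal plane; the actual coordinate convention of the laser measuring device 13 is not specified in the embodiment.

```python
import math

def measurement_to_xyz(distance, horizontal_angle_rad, elevation_rad):
    """Convert one (distance, horizontal angle, elevation angle) measurement
    into a 3D coordinate in the instrument's frame (z-up, assumed convention)."""
    horizontal_range = distance * math.cos(elevation_rad)
    x = horizontal_range * math.cos(horizontal_angle_rad)
    y = horizontal_range * math.sin(horizontal_angle_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

# A scan accumulates one such point per measurement into point cloud data D1:
point_cloud_d1 = [measurement_to_xyz(5.0, math.radians(30), math.radians(10))]
```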
  • For evaluating the growing condition of the dairy cow 51, the control mechanism 11 includes an animal detection unit 61, a data acquisition unit 62, a data synthesizing unit 63, an animal extraction unit 64, a surface data generation unit 65, a specific part cut-out unit 66, a part volume calculation unit 67, a cut surface extraction unit 68, an approximate curve calculation unit 69, and a growing condition evaluation unit 71.
  • the animal detection unit 61 detects from the image I acquired by the camera 12 that there is an animal (in the first embodiment, the dairy cow 51 (see FIG. 4)) at the drinking fountain 52 defined as the data acquisition area 14 .
  • the animal detection unit 61 recognizes various shapes in the image I based on the contrast and the like, and distinguishes between the facilities such as the water tub 53 in the drinking fountain 52 and the dairy cow 51 based on the recognized shapes and the like.
  • Since the camera 12 is arranged at a predetermined position, the animal detection unit 61 knows in advance the appearance of the facilities such as the water trough 53 at the drinking fountain 52, and can use that knowledge to detect the dairy cow 51 in the image.
  • the animal detection section 61 outputs a signal to the data acquisition section 62 indicating that the dairy cow 51 has been detected at the drinking fountain 52 . Therefore, the animal detection unit 61 functions as an animal detection mechanism that detects the presence of an animal in the data acquisition area 14 in cooperation with the camera 12 .
  • the animal detection unit 61 of Example 1 individually identifies the detected animal (dairy cow 51) and generates individual identification data D2 indicating the identified information.
  • the animal detection unit 61 identifies which dairy cow 51 among the dairy cows kept in the barn 50 is shown, based on the image I in which the dairy cow 51 is shown.
  • The animal detection unit 61 recognizes the shape and position of the black-and-white pattern on the cow 51 based on the contrast and the like in the area of the image I where the cow 51 appears, and identifies the dairy cow 51 by comparing the recognized black-and-white pattern with the black-and-white patterns of the dairy cows registered in advance.
  • the animal detection unit 61 generates individual identification data D2 by identifying the dairy cow 51 for each individual according to the identification, and outputs the individual identification data D2 to the data acquisition unit 62 .
  • As long as the animal detection unit 61 identifies which of the cows kept in the barn 50 is the cow 51 shown in the image I (that is, the cow 51 at the drinking fountain 52 at that time) and generates the individual identification data D2 identifying the dairy cow 51 for each individual, other methods, such as using tags, may be used; the configuration is not limited to that of the first embodiment.
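The embodiment does not specify how a recognized black-and-white pattern is compared with the registered patterns. One minimal illustration is an intersection-over-union score between binary masks, as sketched below; the scoring method and all names are assumptions.

```python
def iou(mask_a, mask_b):
    """Intersection-over-union of two same-sized binary masks (rows of 0/1)."""
    inter = union = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for a, b in zip(row_a, row_b):
            inter += a & b
            union += a | b
    return inter / union if union else 0.0

def identify_cow(observed_mask, registered):
    """Return the id of the registered cow whose pattern best matches."""
    return max(registered, key=lambda cow_id: iou(observed_mask, registered[cow_id]))

# Hypothetical registered patterns (tiny masks for illustration only):
registered = {
    "cow_07": [[1, 1, 0], [0, 1, 0]],
    "cow_12": [[0, 0, 1], [1, 0, 1]],
}
```

A real system would first normalize the observed pattern for pose and scale; this sketch only shows the comparison step.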
  • When the data acquisition unit 62 receives the signal that the dairy cow 51 has been detected from the animal detection unit 61, it drives the two laser measuring devices 13 to scan the drinking fountain 52 (data acquisition area 14). Each laser measuring device 13 acquires the point cloud data D1 of the drinking fountain 52 (see FIGS. 7 and 8), and the data acquisition unit 62 associates the point cloud data D1 with the individual identification data D2 and outputs them to the data synthesizing unit 63.
  • Since the dairy cow 51 is present at the drinking fountain 52 when the data acquisition unit 62 is driven as described above, the point cloud data D1 includes the dairy cow 51 indicated by the individual identification data D2.
  • the data synthesizing unit 63 synthesizes the point cloud data D1 acquired by the two laser measuring instruments 13 to generate synthesized point cloud data D3 (see FIG. 9).
  • the two laser measuring devices 13 are provided at positions that form a pair with the drinking fountain 52 interposed therebetween as described above, they measure the same drinking fountain 52 from mutually different directions.
  • By synthesizing the respective point cloud data D1 (so-called point cloud synthesis), a collection of three-dimensional position data including the surface shapes of different sides of the drinking fountain 52 (for example, one the right side and the other the left side of the dairy cow 51 there) can be obtained.
  • The data synthesizing unit 63 synthesizes the point cloud data D1 using known techniques such as connecting overlapping portions of the point clouds (so-called point cloud matching) or using targets that serve as landmarks (the so-called tie point method), thereby generating the synthesized point cloud data D3.
  • Since the drinking fountain 52 is set as the data acquisition area 14 and the positional relationship of the laser measuring devices 13 with respect to the drinking fountain 52 is known in advance, the data synthesizing unit 63 can appropriately generate the synthesized point cloud data D3.
  • the data synthesizing unit 63 associates the generated synthetic point cloud data D3 with the individual identification data D2 and outputs the data to the animal extracting unit 64 .
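Because the pose of each laser measuring device 13 relative to the drinking fountain 52 is known and fixed, one way to synthesize the two point clouds is to apply each device's known rigid transform and concatenate the results. This is only a minimal sketch under that assumption; the embodiment itself mentions point cloud matching and the tie-point method as the techniques used.

```python
import math

def transform(points, yaw_rad, translation):
    """Apply a known rigid transform (rotation about the vertical z axis,
    then translation) to every point of one cloud."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    tx, ty, tz = translation
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]

def synthesize(cloud_1, pose_1, cloud_2, pose_2):
    """Merge two clouds (each D1) into one synthesized cloud (D3) by
    transforming each into the common fountain frame and concatenating."""
    return transform(cloud_1, *pose_1) + transform(cloud_2, *pose_2)
```

The poses here are hypothetical (yaw, translation) pairs; a full rigid transform would also include roll and pitch.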
  • The animal extraction unit 64 extracts only the three-dimensional coordinate positions (coordinate data) corresponding to the dairy cow 51 from the synthesized point cloud data D3 generated by the data synthesizing unit 63, thereby generating animal point cloud data D4 (see FIG. 10) representing the surface shape of the dairy cow 51.
  • As described above, the drinking fountain 52 is set as the data acquisition area 14 and the positional relationship of the laser measuring devices 13 with respect to the drinking fountain 52 is known in advance, so the point cloud data of the drinking fountain 52 itself is also known in advance. The animal extraction unit 64 therefore generates the animal point cloud data D4 by taking the difference between the synthesized point cloud data D3 and point cloud data representing the drinking fountain 52 without the dairy cow 51.
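Taking the difference between the synthesized point cloud data D3 and a pre-recorded cloud of the empty drinking fountain can be sketched as keeping every point whose nearest background point is farther away than a threshold. The brute-force nearest-neighbor search and the 5 cm threshold below are illustrative assumptions; a practical system would use a spatial index.

```python
def extract_animal(synth_cloud, background_cloud, threshold=0.05):
    """Keep points of the synthesized cloud (D3) farther than `threshold`
    (meters, assumed) from every background point; the survivors
    approximate the animal point cloud (D4)."""
    def sq_dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    t2 = threshold ** 2
    return [p for p in synth_cloud
            if all(sq_dist(p, q) > t2 for q in background_cloud)]
```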
  • Regardless of the posture of the cow 51 in the drinking fountain 52, this animal point cloud data D4 always includes at least half of the body in the vicinity of the buttocks 51a.
  • the animal extraction unit 64 associates the generated animal point cloud data D4 with the individual identification data D2 and outputs the generated animal point cloud data D4 to the surface data generation unit 65 .
  • the animal extraction unit 64 may have another configuration as long as it generates the animal point cloud data D4, and is not limited to the configuration of the first embodiment.
  • For example, the animal extraction unit 64 may pre-register three-dimensional coordinate positions (coordinate data) representing dairy cows 51 of various angles, sizes, and types, and extract the animal point cloud data D4 by comparison with them.
  • Alternatively, the animal extraction unit 64 may extract clean planes such as the floor and walls of the drinking fountain 52 from the synthesized point cloud data D3, use those planes as a reference to apply threshold processing, and thereby extract the region where the dairy cow 51 is likely to be. The animal extraction unit 64 may then divide the point cloud data near the extracted region into a plurality of clusters by clustering processing and extract the largest cluster among them as the animal point cloud data D4, taking it to be the dairy cow 51.
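The clustering step described above can be illustrated with a simple connected-component grouping: points within a linking radius belong to the same cluster, and the largest cluster is taken to be the cow. The radius value and the union-find implementation are assumptions for illustration, not the embodiment's specified algorithm.

```python
def largest_cluster(points, radius=0.1):
    """Group points whose pairwise distance is within `radius` (meters,
    assumed) into clusters and return the biggest one (taken as the cow)."""
    parent = list(range(len(points)))

    def find(i):
        # Union-find root lookup with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    r2 = radius ** 2
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if sum((a - b) ** 2 for a, b in zip(points[i], points[j])) <= r2:
                parent[find(i)] = find(j)

    clusters = {}
    for i in range(len(points)):
        clusters.setdefault(find(i), []).append(points[i])
    return max(clusters.values(), key=len)
```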
  • the surface data generation unit 65 generates three-dimensional surface data D5 (see FIG. 11) showing the dairy cow 51 using a known technique based on the animal point cloud data D4.
  • The surface data generation unit 65 of the first embodiment generates the three-dimensional surface data D5 (mesh data) of the dairy cow 51 by pasting a plurality of triangular meshes using a triangular mesh generation algorithm and Poisson reconstruction.
  • the surface data generating unit 65 may have another configuration as long as it generates the three-dimensional surface data D5, and is not limited to the configuration of the first embodiment.
  • the surface data generation unit 65 associates the generated three-dimensional surface data D5 with the individual identification data D2 and outputs the data to the specific part extraction unit 66 .
  • From the three-dimensional surface data D5 representing the dairy cow 51, the specific part cut-out unit 66 generates buttock-vicinity surface data D6 (see FIGS. 12 and 13) of the buttock vicinity 51a.
  • Since the vicinity 51a of the buttocks of the cow 51 is considered important in evaluating the growing condition of the dairy cow 51, the specific part cut-out unit 66 generates the buttock-vicinity surface data D6.
  • The specific part cut-out unit 66 of Example 1 extracts characteristic bone portions in the vicinity 51a of the buttocks from the three-dimensional surface data D5 and sets the cut-out planes based on those characteristic portions.
  • In the front-back direction of the cow 51, the specific part cut-out unit 66 takes the length from the middle position between the pin bone (see symbol Bp in FIG. 13) and the tail head (see symbol Bt in FIG. 13) to the hook bone (see symbol Bh in FIG. 13) as a reference (referred to as reference length Lr), and cuts along a cut-out plane Scv set on the basis of this reference length Lr.
  • The specific part cut-out unit 66 also cuts along a cut-out plane Sch perpendicular to the vertical direction so as to exclude the lower abdomen (from the belly to the udder) in the height direction of the dairy cow 51. This is because the lower abdomen tends to change from day to day and is not suitable for judging the growing condition.
  • The buttock-vicinity surface data D6 is not limited to the configuration of the first embodiment, as long as it consists of three-dimensional coordinate positions (coordinate data) representing the buttock vicinity 51a; its extent (the position of each cut plane) may be set as appropriate.
  • the specific part extracting section 66 associates the individual identification data D2 with the generated buttock neighboring plane data D6 and outputs the data to the part volume calculating section 67 .
• The part volume calculation unit 67 uses a known technique to calculate the volume (cut-out part volume Vc) of the cut-out part represented by the buttock vicinity surface data D6 generated by the specific part extracting unit 66. For example, the part volume calculation unit 67 calculates, for every triangular mesh, the volume of a triangular pyramid having that triangular mesh of the three-dimensional surface data D5 as its base and a reference point in the buttock vicinity surface data D6 as its apex, and sums these volumes to obtain the cut-out part volume Vc.
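As an illustration of the tetrahedron-summation approach described above, the following minimal Python sketch computes a mesh volume by summing signed tetrahedron volumes whose bases are the mesh triangles and whose common apex is a reference point. The function and variable names are illustrative assumptions, not taken from the patent; a closed, consistently oriented triangular mesh is assumed.

```python
def signed_tetra_volume(a, b, c, apex):
    """Signed volume of the tetrahedron spanned by triangle (a, b, c) and apex."""
    ax, ay, az = (a[i] - apex[i] for i in range(3))
    bx, by, bz = (b[i] - apex[i] for i in range(3))
    cx, cy, cz = (c[i] - apex[i] for i in range(3))
    det = (ax * (by * cz - bz * cy)
           - ay * (bx * cz - bz * cx)
           + az * (bx * cy - by * cx))
    return det / 6.0

def cutout_volume(vertices, triangles, apex=(0.0, 0.0, 0.0)):
    """Sum signed tetrahedron volumes over all mesh faces.

    For a closed, consistently oriented mesh, the absolute value of the sum
    is the enclosed volume (the cut-out part volume Vc in the text above).
    """
    return abs(sum(signed_tetra_volume(vertices[i], vertices[j],
                                       vertices[k], apex)
                   for i, j, k in triangles))
```

For a watertight mesh the signed sum equals the enclosed volume regardless of where the reference point (apex) is placed, which is why a single reference point in the buttock vicinity surface data can serve as the common apex.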
• The part volume calculation unit 67 is not limited to the configuration of the first embodiment as long as it calculates the cut-out part volume Vc, which is the volume of the buttock vicinity surface data D6.
• The part volume calculation unit 67 associates the individual identification data D2 with each cut-out part volume Vc and outputs it to the growing status evaluation unit 71.
  • the cut surface extraction unit 68 generates, from the buttock vicinity surface data D6, the buttock vicinity cross-sectional data D7 (see FIG. 14), which is a cross section obtained by cutting the buttock vicinity 51a of the dairy cow 51 along a predetermined plane.
• The cut surface extraction unit 68 of the first embodiment generates the buttock vicinity cross-sectional data D7 by cutting the buttock vicinity surface data D6 along a predetermined plane perpendicular to the front-back direction of the dairy cow 51, at a slice position Sp set at an arbitrary position in that direction.
• This slice position Sp may be set as appropriate; in Example 1, it is set at the following five positions.
  • the slice position Sp1 is set at the same position as the cut-out plane Scv when the buttock vicinity plane data D6 is cut out.
  • the slice position Sp3 is the position of the hook bone.
  • the slice position Sp2 is an intermediate position between the slice positions Sp1 and Sp3.
  • the slice position Sp5 is an intermediate position between tail head (see symbol Bt) and pin bone (see symbol Bp).
  • the slice position Sp4 is an intermediate position between the slice positions Sp3 and Sp5.
  • the cut surface extraction unit 68 may generate a cross section cut along a plane perpendicular to the width direction of the dairy cow 51 as the buttock neighborhood cross-section data D7.
• As an example, a slice position Sp6 is shown; this slice position Sp6 is parallel to the spine at an intermediate position between the spine and the hook bone (see symbol Bh).
• An example of the buttock vicinity cross-sectional data D7 is shown in FIG. 14. In FIG. 14, the cross sections are arranged from the left as viewed from the front, in correspondence with the number at the end of each slice position Sp, and the same numbers 1 to 6 are appended to the end of the data name. That is, the buttock vicinity cross-sectional data D71 corresponds to the slice position Sp1, the data D72 to the slice position Sp2, the data D73 to the slice position Sp3, the data D74 to the slice position Sp4, the data D75 to the slice position Sp5, and the data D76 to the slice position Sp6.
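The slice-position definitions above (Sp1 at the cut-out plane Scv, Sp3 at the hook bone, Sp5 midway between tail head and pin bone, with Sp2 and Sp4 at the midpoints of their neighbours) and the extraction of a cross section at a slice position can be sketched as follows. The landmark coordinates, the point format, and the tolerance value are assumptions for illustration only; the patent does not specify an implementation.

```python
def slice_positions(x_cut, x_hook, x_tail, x_pin):
    """Slice positions Sp1-Sp5 along the fore-and-aft axis, as in the text:
    Sp1 = cut-out plane Scv, Sp3 = hook bone, Sp5 = midpoint of tail head
    and pin bone, Sp2 and Sp4 = midpoints of their neighbours."""
    sp1 = x_cut
    sp3 = x_hook
    sp5 = (x_tail + x_pin) / 2.0
    sp2 = (sp1 + sp3) / 2.0
    sp4 = (sp3 + sp5) / 2.0
    return [sp1, sp2, sp3, sp4, sp5]

def cross_section(points, slice_x, tol=0.005):
    """Return (y, z) pairs of points lying within `tol` of the plane x = slice_x,
    i.e. a cross section projected onto the (width, height) plane."""
    return [(y, z) for (x, y, z) in points if abs(x - slice_x) <= tol]
```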
  • the cut surface extraction unit 68 associates the generated buttock neighborhood cross-section data D7 with the individual identification data D2 and outputs the data to the approximated curve calculation unit 69 .
  • the approximate curve calculation unit 69 calculates an approximate curve Ac (see FIGS. 15 and 16) that fits the contour OL (see FIG. 14) on the outer surface side in the cross-sectional data D7 near the buttocks generated by the cut surface extraction unit 68.
• The approximate curve calculation unit 69 can calculate the approximate curve Ac that fits the contour OL by using a known technique; in the first embodiment, a Bezier curve is used so that the approximate curve Ac becomes smooth. This concept will be described with reference to FIGS. 15 and 16.
• FIG. 15 shows a part of the contour OL and two approximate curves Ac calculated by fitting it (hereinafter, when distinguished, the left one is referred to as the first approximate curve Ac1 and the right one as the second approximate curve Ac2).
• In FIG. 15, the contour OL is indicated by a thick line, and the first approximate curve Ac1 and the second approximate curve Ac2 are indicated by thin dashed lines.
• Suppose that the approximate curve calculation unit 69 calculates two approximate curves Ac (Ac1 and Ac2) for the portion of the contour OL shown in FIG. 15.
  • the approximation curve calculation unit 69 sets passing points P at the start point and the end point on the contour OL in order to calculate the first approximation curve Ac1 that fits from the vicinity of the left end to the vicinity of the center of the contour OL.
  • the starting point side is defined as a passing point Ps
  • the end point side is defined as a passing point Pe.
• The approximate curve calculation unit 69 sets a plurality of control points (two, C1 and C2, in the example of FIG. 15).
  • the approximated curve calculator 69 adjusts the position of each control point (C1, C2) so that the curve from the passing point Ps to the passing point Pe overlaps the contour OL.
  • the approximated curve calculator 69 can obtain the first approximated curve Ac1 that extends from the passing point Ps to the passing point Pe while overlapping the contour OL.
• Similarly, in order to calculate the second approximate curve Ac2, the approximate curve calculation unit 69 sets a passing point Ps' near the center of the contour OL and a passing point Pe' near the right end, and sets a plurality of control points (C1', C2').
  • the passing point Ps' has the same coordinates as the passing point Pe of the first approximated curve Ac1 in order to show the contour OL by the approximated curve without a break.
  • the approximated curve calculator 69 adjusts the positions of the respective control points (C1', C2') so that the curve from the passing point Ps' to the passing point Pe' overlaps the contour OL.
• Thereby, the approximate curve calculation unit 69 can obtain the second approximate curve Ac2 that extends from the passing point Ps' to the passing point Pe' while overlapping the contour OL.
• The approximate curve calculation unit 69 sets each passing point P and each control point C in consideration of the above, and can thereby obtain the two approximate curves (Ac1, Ac2) that continue smoothly while overlapping the contour OL.
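The Bezier construction described above can be illustrated with a cubic Bezier curve whose endpoints are the passing points and whose two intermediate control points are adjusted so that the curve overlaps the contour. This sketch shows only the curve evaluation; the position-adjustment (fitting) step is omitted, and the function and variable names are illustrative assumptions.

```python
def bezier_point(ps, c1, c2, pe, t):
    """Point on the cubic Bezier (Ps, C1, C2, Pe) at parameter t in [0, 1]."""
    u = 1.0 - t
    bx = u**3 * ps[0] + 3 * u**2 * t * c1[0] + 3 * u * t**2 * c2[0] + t**3 * pe[0]
    by = u**3 * ps[1] + 3 * u**2 * t * c1[1] + 3 * u * t**2 * c2[1] + t**3 * pe[1]
    return (bx, by)
```

Because the curve always passes through Ps at t = 0 and Pe at t = 1, giving the second segment a start point Ps' with the same coordinates as the first segment's end point Pe guarantees that the two approximate curves join without a break, as described above.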
• The approximate curve calculation unit 69 of the first embodiment standardizes the method of obtaining the approximate curves Ac. That is, the approximate curve calculation unit 69 divides the contour OL into a plurality of ranges based on characteristic points of the dairy cow 51 in the contour OL (for example, characteristic portions of various bones appearing in the contour OL), and obtains an approximate curve Ac for each of the divided ranges.
• The approximate curve calculation unit 69 sets the characteristic points of the dairy cow 51 as common points regardless of individual differences among the dairy cows 51, so that the contour OL can be represented by the same number of approximate curves Ac for all the dairy cows 51.
  • the approximated curve calculator 69 sets passing points P at the start and end points of each region, and predefines the number of control points C to be set in each region.
• Thereby, the approximate curve calculation unit 69 can represent the contour OL with an equal number of approximate curves Ac regardless of individual differences among the dairy cows 51, while the coefficients of the approximate curves Ac and the positions of each passing point P and each control point C change in accordance with those individual differences.
  • the approximated curve calculation unit 69 can represent the contour OL with a plurality of approximated curves Ac by obtaining a plurality of approximated curves Ac that match the contour OL of the buttock vicinity cross-sectional data D7 as described above.
  • FIG. 16 shows a contour OL represented by a plurality of approximation curves Ac.
• The example of FIG. 16 shows approximate curves fitted to the contour OL of the buttock vicinity cross-sectional data D73, and the contour OL is represented by a plurality of approximate curves Ac.
• In addition, seven control points (referred to as symmetrical control points Cs in FIG. 16 for distinction) are provided.
• The rearing condition evaluation unit 71 generates evaluation data Ev indicating the evaluation of the rearing condition of the dairy cow 51 at the drinking fountain 52, based on the cut-out part volume Vc from the part volume calculation unit 67 and the approximate curves Ac from the approximate curve calculation unit 69.
  • the rearing condition evaluation unit 71 can obtain the conventionally used BCS by using the cut-out part volume Vc, for example, and can use this BCS as the evaluation data Ev of the dairy cow 51 .
• The rearing condition evaluation unit 71 can also obtain each coefficient of each approximate curve Ac and the positions (coordinates or positions relative to a reference) of each passing point P and each control point C (for example, the amount of change from a reference value, the distribution of each coefficient, and the like), and can use these as the evaluation data Ev of the dairy cow 51.
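As one assumed illustration of turning the standardized curve parameters into evaluation values, the deviation of each passing point and control point from a reference position (for example, a baseline measurement of the same cow) could be computed; the patent does not fix a specific formula, so this is only a sketch with illustrative names.

```python
import math

def point_deviations(points, reference_points):
    """Euclidean distance of each curve parameter point (passing point or
    control point) from its reference counterpart; the resulting list is one
    possible numerical basis for evaluation data Ev."""
    return [math.dist(p, r) for p, r in zip(points, reference_points)]
```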
  • the breeding status evaluation unit 71 associates the evaluation data Ev of the dairy cow 51 with the individual identification data D2 and stores them in the storage unit 18 as appropriate.
  • the breeding status evaluation unit 71 may generate the evaluation data Ev of the dairy cow 51 by using other numerical values based on the cut-out part volume Vc and each approximate curve Ac.
  • the breeding status evaluation unit 71 may generate the evaluation data Ev based on the cut-out part volume Vc of the target dairy cow 51 and changes over time in each approximate curve Ac.
  • the control mechanism 11 appropriately displays the evaluation data Ev stored in the storage unit 18 on the display unit 16 or outputs it to an external device via the communication unit 17 as appropriate.
• Since the evaluation data Ev is associated with the individual identification data D2, it is possible to easily grasp which dairy cow 51 each evaluation belongs to. Thereby, the control mechanism 11 can report the evaluation data Ev of each dairy cow 51.
• Next, with reference to FIG. 17, a description will be given of a rearing status evaluation process (rearing status evaluation control method) as an example of evaluating the rearing status of the dairy cow 51 using the growing condition evaluation system 10. This rearing status evaluation process is executed by the control mechanism 11 based on a program stored in the storage unit 18 or the internal memory 11a.
  • Each step (each process) of the flow chart of FIG. 17 will be described below.
  • the flow chart of FIG. 17 is started when the growing condition evaluation system 10 is activated, a browser or an application is launched, the camera 12 is driven, and both laser measuring instruments 13 are placed in a standby state.
  • step S1 it is determined whether or not the dairy cow 51 exists in the drinking fountain 52 (data acquisition area 14). If YES, proceed to step S2, and if NO, step S1 is repeated.
• In step S1, the animal detection unit 61 analyzes the image I acquired by the camera 12 to determine whether or not the dairy cow 51 is present at the drinking fountain 52, and outputs the result to the data acquisition unit 62. At this time, the animal detection unit 61 identifies the individual dairy cow 51 to generate the individual identification data D2, and outputs the individual identification data D2 to the data acquisition unit 62.
  • step S2 the point cloud data D1 of the drinking fountain 52 is acquired, and the process proceeds to step S3.
• In step S2, when the data acquisition unit 62 receives a signal from the animal detection unit 61 indicating that the dairy cow 51 has been detected, it drives the two laser measuring devices 13 to scan the drinking fountain 52 (data acquisition area 14) and obtain the point cloud data D1 of the drinking fountain 52.
• In step S2, when the data acquisition unit 62 receives the point cloud data D1 from both laser measuring devices 13, it associates the individual identification data D2 with each point cloud data D1 and outputs them to the data synthesizing unit 63.
  • step S3 synthetic point cloud data D3 is generated, and the process proceeds to step S4.
  • the data synthesizing unit 63 synthesizes the point cloud data D1 acquired by both laser measuring devices 13 to generate synthesized point cloud data D3, and adds the individual identification data D2 to the synthesized point cloud data D3 thus generated. It is output to the animal extractor 64 in association with it.
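Step S3's synthesis of the two scanners' point clouds might look roughly like the following, assuming each laser measuring device 13 has a known pose (yaw angle and translation) in a common site frame; the registration details are not disclosed in the patent, so the function names and pose representation are assumptions.

```python
import math

def to_common_frame(points, yaw, tx, ty, tz):
    """Rotate points about the vertical axis by `yaw` and translate into the
    common site frame assumed for the data acquisition area."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for (x, y, z) in points]

def synthesize(cloud_a, pose_a, cloud_b, pose_b):
    """Synthesized point cloud D3 from the two scanners' point clouds D1."""
    return to_common_frame(cloud_a, *pose_a) + to_common_frame(cloud_b, *pose_b)
```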
  • step S4 animal point cloud data D4 is generated, and the process proceeds to step S5.
  • the animal extraction unit 64 extracts only the three-dimensional coordinate position (coordinate data) corresponding to the dairy cow 51 from the synthesized point cloud data D3 generated by the data synthesis unit 63 to generate animal point cloud data D4,
  • the individual identification data D2 are associated with the generated animal point group data D4 and output to the plane data generation unit 65 .
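Step S4's extraction of the animal's points from the synthesized point cloud is likewise left open by the text; one simple assumed rule, sketched below, is to keep only points that fall inside the data acquisition area 14 and above the ground height, discarding floor and fixture returns. The names and thresholds are illustrative.

```python
def extract_animal(cloud, x_range, y_range, z_min):
    """Animal point cloud D4: points inside the region and above z_min."""
    (x0, x1), (y0, y1) = x_range, y_range
    return [(x, y, z) for (x, y, z) in cloud
            if x0 <= x <= x1 and y0 <= y <= y1 and z > z_min]
```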
  • step S5 three-dimensional surface data D5 is generated, and the process proceeds to step S6.
  • the surface data generation unit 65 generates three-dimensional surface data D5 of the dairy cow 51 from the animal point cloud data D4 generated by the animal extraction unit 64, and adds the individual identification data D2 to the generated three-dimensional surface data D5. It is output to the specific part extraction unit 66 in association with it.
  • step S6 buttock vicinity surface data D6 is generated, and the process proceeds to step S7.
  • the specific part extracting unit 66 generates buttock vicinity surface data D6 of the buttock vicinity 51a of the cow 51 from the three-dimensional surface data D5 generated by the surface data generation unit 65, and generates the buttock vicinity surface data D6. is associated with individual identification data D2 and output to part volume calculation unit 67 .
  • step S7 the excision part volume Vc is calculated, and the process proceeds to step S8.
• In step S7, the part volume calculation unit 67 calculates the cut-out part volume Vc, which is the volume of the buttock vicinity surface data D6 (cut-out part) generated by the specific part extracting unit 66, associates the individual identification data D2 with the calculated cut-out part volume Vc, and outputs them to the growing condition evaluation unit 71.
  • step S8 cross-sectional data D7 near the buttocks is generated, and the process proceeds to step S9.
• In step S8, the cut surface extraction unit 68 generates the buttock vicinity cross-sectional data D7 by cutting the buttock vicinity surface data D6 (cut-out part) generated by the specific part extracting unit 66 along a predetermined plane, associates the individual identification data D2 with the generated buttock vicinity cross-sectional data D7, and outputs them to the approximate curve calculation unit 69.
  • step S9 an approximate curve Ac is calculated, and the process proceeds to step S10.
• In step S9, the approximate curve calculation unit 69 calculates the approximate curves Ac that fit the contour OL on the outer surface side of the buttock vicinity cross-sectional data D7 generated by the cut surface extraction unit 68, associates each passing point P and each control point C of the approximate curves Ac with the individual identification data D2, and outputs them to the rearing condition evaluation unit 71.
  • step S10 the evaluation data Ev is generated, and the process proceeds to step S11.
• In step S10, the rearing condition evaluation unit 71 generates evaluation data Ev indicating the evaluation of the rearing condition of the dairy cow 51 at the drinking fountain 52, based on the cut-out part volume Vc from the part volume calculation unit 67 and the approximate curves Ac from the approximate curve calculation unit 69, associates the individual identification data D2 with the evaluation data Ev, and stores them in the storage unit 18 as appropriate.
• In step S11, it is determined whether or not the rearing status evaluation process has ended. If YES, the rearing status evaluation process ends; if NO, the process returns to step S1. In step S11, when the growing condition evaluation system 10 is stopped or when the operating unit 15 is operated to end the process, it is determined that the rearing status evaluation process has ended.
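The overall flow of steps S1 to S10 can be condensed into the following sketch, in which each processing unit is represented by a placeholder callable; the real system drives the camera and the two laser scanners, which are stubbed out here, and all names are illustrative rather than taken from the patent.

```python
def evaluate_once(detect_animal, acquire, synthesize, extract,
                  make_surface, cut_part, part_volume, sections,
                  fit_curves, evaluate):
    """One pass of the rearing-status evaluation (steps S1-S10)."""
    animal_id = detect_animal()            # S1: camera-based detection + individual ID
    if animal_id is None:
        return None                        # no cow at the drinking fountain
    d1_pair = acquire()                    # S2: point clouds from both scanners
    d3 = synthesize(d1_pair)               # S3: synthesized point cloud
    d4 = extract(d3)                       # S4: animal point cloud
    d5 = make_surface(d4)                  # S5: three-dimensional surface data
    d6 = cut_part(d5)                      # S6: buttock vicinity surface data
    vc = part_volume(d6)                   # S7: cut-out part volume
    d7 = sections(d6)                      # S8: buttock vicinity cross sections
    ac = fit_curves(d7)                    # S9: approximate curves
    return animal_id, evaluate(vc, ac)     # S10: evaluation data Ev
```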
• In this way, when the growing condition evaluation system 10 detects the presence of the dairy cow 51 at the drinking fountain 52 via the camera 12, it acquires the point cloud data D1 of the drinking fountain 52 using both laser measuring devices 13 (steps S1 and S2). Thereafter, based on the point cloud data D1, the growing condition evaluation system 10 calculates the volume of the buttock vicinity 51a of the dairy cow 51 (cut-out part volume Vc) (steps S3 to S7) and calculates the approximate curves Ac that fit the contour OL on the outer surface side of the buttock vicinity 51a (steps S3 to S6, S8, and S9).
  • the growing condition evaluation system 10 generates evaluation data Ev of the dairy cow 51 using the cut-out part volume Vc and the approximation curve Ac, and appropriately stores them in the storage unit 18 (step S10).
  • the growing condition evaluation system 10 can notify the evaluation data Ev by appropriately displaying the evaluation data Ev on the display unit 16 or outputting the evaluation data Ev to an external device via the communication unit 17 as appropriate.
• Here, the conventional health condition estimation device acquires the three-dimensional coordinate group of the animal using a range image sensor. Since the range image sensor has limitations in increasing the accuracy and resolution of the acquired three-dimensional coordinate group, it is difficult for the three-dimensional coordinate group to appropriately express the contour of the actual animal. For this reason, with the conventional health condition estimation device, it is difficult to obtain an appropriate evaluation by the BCS even if the three-dimensional coordinate group is used.
• Moreover, since the conventional health condition estimation device uses a range image sensor, the contour of the animal indicated by the three-dimensional coordinate group differs from that of an actual animal, for example showing jagged unevenness, and becomes difficult to represent with a mathematical expression (approximate curve). For this reason, with the conventional health condition estimation device, it is difficult to obtain a mathematical expression that appropriately represents the contour shape, and it is difficult to appropriately evaluate the rearing situation using the contour.
  • the growing condition evaluation system 10 uses the laser measuring device 13 to acquire the three-dimensional coordinate group of the animal (the dairy cow 51 in Example 1).
• The laser measuring instrument 13 can acquire a three-dimensional coordinate group (point cloud data D1) with accuracy at the level used for precise surveying and also has an extremely high resolution (12.5 mm intervals in Example 1), so that the point cloud data D1 can represent the contour of the actual dairy cow 51 very faithfully.
  • the laser measuring device 13 of Example 1 can set the accuracy of the reaching distance of the pulsed laser beam to an error of 3.5 mm or less, and set the accuracy of the scanning plane to an error of 2.0 mm or less. It is possible to reduce the accuracy of angle measurement to an error of 6 seconds (angle) or less for both vertical and horizontal.
• The laser measuring device 13 of Example 1 can also be set to a low-output mode; even in that case, the accuracy of the reaching distance of the pulsed laser beam merely becomes an error of 4.0 mm or less, and the other accuracies remain as described above.
  • the growing condition evaluation system 10 can acquire a three-dimensional coordinate group with extremely high accuracy and can also achieve extremely high resolution by using the laser measuring device 13 . Therefore, the growing condition evaluation system 10 can appropriately obtain an evaluation of the growing condition by using the point cloud data D1 from the laser measuring device 13 .
• In addition, since the contour OL of the dairy cow 51 indicated by the point cloud data D1 represents the actual contour of the dairy cow 51 very faithfully, it can be approximated by a smooth curve, and an approximate curve Ac that appropriately represents the actual contour of the dairy cow 51 can be obtained. Therefore, in the growing condition evaluation system 10, the contour OL can be represented by a standardized formula, and an evaluation of the rearing condition can be obtained without variation.
• Moreover, since the growing condition evaluation system 10 obtains the approximate curves Ac by approximating the contour OL based on the point cloud data D1, the body shape of the dairy cow 51 can be represented by mathematical formulas and the rearing condition can be evaluated by calculation, making the evaluation objective and appropriate.
• Furthermore, since the growing condition evaluation system 10 standardizes the method of obtaining the approximate curves Ac, individual differences in the contour OL can be shown by the coefficients of the approximate curves Ac and the positions of the passing points P and control points C. Therefore, the growing condition evaluation system 10 can generate the evaluation data Ev based on changes in each coefficient and in the positions of each passing point P and each control point C, making it possible to judge the rearing condition more uniformly.
• In addition, since the growing condition evaluation system 10 generates the evaluation data Ev using the approximate curves Ac obtained from the point cloud data D1 acquired by the laser measuring devices 13, it can report the evaluation data Ev as an evaluation of the rearing condition. Therefore, the growing condition evaluation system 10 can prevent individual differences and fluctuations in human judgment from being reflected in the evaluation of the rearing condition, and can eliminate differences caused by different places, facilities, times, and the like. As a result, the growing condition evaluation system 10 can generate evaluation data Ev based on a unified standard and can evaluate the dairy cow 51 (animal) with quantitative numerical values.
• Only when the dairy cow 51 is detected at the drinking fountain 52 via the camera 12 does the growing condition evaluation system 10 acquire the point cloud data D1 of the drinking fountain 52 using the two laser measuring devices 13 and generate the evaluation data Ev based on it. Therefore, the growing condition evaluation system 10 can obtain the evaluation data Ev without touching the dairy cow 51, taking advantage of the fact that the dairy cow 51 comes to the drinking fountain 52 of its own will. Accordingly, the growing condition evaluation system 10 can appropriately judge the rearing condition without subjecting the dairy cow 51 to stress such as being touched or being led to an unintended place for generating the evaluation data Ev.
  • the growth condition evaluation system 10 automatically obtains the evaluation data Ev by using the natural behavior of the dairy cow 51, it is possible to eliminate the time limit and manage the system all day (24 hours). Furthermore, the growing condition evaluation system 10 does not acquire the point cloud data D1 by the two laser measuring devices 13 when the dairy cow 51 is not present at the drinking fountain 52, so acquisition and accumulation of unnecessary data can be prevented. and can be operated efficiently.
  • the growing condition evaluation system 10 of Example 1 of the growing condition evaluation system can obtain the following effects.
• The growing condition evaluation system 10 includes the laser measuring instrument 13, which acquires point cloud data D1 representing the contour of the animal (dairy cow 51) in three-dimensional coordinates by receiving the reflected light of the emitted laser beam (pulsed laser beam) from the animal.
• The growing condition evaluation system 10 also includes the surface data generation unit 65, which generates three-dimensional surface data D5 of the animal based on the point cloud data D1, the approximate curve calculation unit 69, which calculates approximate curves Ac fitting the three-dimensional surface data D5, and the rearing condition evaluation unit 71, which generates evaluation data Ev indicating an evaluation of the animal based on the approximate curves Ac.
• Thereby, the growing condition evaluation system 10 can calculate the approximate curves Ac representing the contour OL with a standardized formula based on the point cloud data D1 that very faithfully represents the contour of the actual animal, and can generate the evaluation data Ev from them, so that an appropriate evaluation of the rearing status can be obtained without variation.
• In the growing condition evaluation system 10, the approximate curve calculation unit 69 calculates the approximate curves Ac by fitting the contour OL of the cut surface (buttock vicinity cross-sectional data D7) obtained by cutting the three-dimensional surface data D5 along the cutting plane. For this reason, by determining the position of the cut surface based on characteristic portions such as bones, the growing condition evaluation system 10 can suppress the volume of the generated evaluation data Ev, facilitate comparison between animals, and appropriately evaluate the rearing condition.
• The growing condition evaluation system 10 sets the data acquisition area 14 for acquiring the point cloud data D1 of the animal (dairy cow 51), and includes the animal detection mechanism (the camera 12 and the animal detection unit 61) that detects the presence of the animal in the data acquisition area 14.
• When the animal detection mechanism detects the presence of the animal, the growing condition evaluation system 10 acquires the point cloud data D1 of the data acquisition area 14. Therefore, since the growing condition evaluation system 10 drives the laser measuring devices 13 only when the animal is in the data acquisition area 14, the large-capacity point cloud data D1 is not acquired in the absence of the animal, and the evaluation data Ev can be generated efficiently.
  • the growing condition evaluation system 10 can generate the evaluation data Ev without giving stress to the animal such as touching it or leading it to an unintended place, etc., and can prevent hindrance to the breeding of the animal.
• In addition, the evaluation data Ev can be generated by utilizing the time when the animal stops of its own accord while acting naturally, which eliminates the animal's stress associated with its generation.
  • the growing condition evaluation system 10 has a pair of laser measuring instruments 13 with a data acquisition area 14 interposed therebetween. Therefore, the growth condition evaluation system 10 can acquire the point cloud data D1 of the animal in the data acquisition area 14 regardless of the animal's position, posture, facing direction, etc., and can provide an appropriate image of the animal while reducing stress. Evaluation data Ev can be generated.
• In the growing condition evaluation system 10, the positional relationship of both laser measuring devices 13 with respect to the drinking fountain 52 serving as the data acquisition area 14 is set so that point cloud data D1 of at least half of the animal's body (either the left or the right half) can be acquired regardless of the animal's posture at the drinking fountain 52. Therefore, the growing condition evaluation system 10 can greatly increase the possibility of obtaining the point cloud data D1 necessary for generating the evaluation data Ev while eliminating the animal's stress caused by its generation.
• The growing condition evaluation system 10 further includes the data synthesizing unit 63, which synthesizes the point cloud data D1 acquired by each laser measuring device 13 to generate the synthesized point cloud data D3, and the animal extraction unit 64, which generates the animal point cloud data D4 indicating the animal (dairy cow 51) from the synthesized point cloud data D3. Then, in the growing condition evaluation system 10, the surface data generation unit 65 generates the three-dimensional surface data D5 from the animal point cloud data D4. For this reason, by synthesizing the two point cloud data D1 into the synthesized point cloud data D3, the growing condition evaluation system 10 can reliably obtain the data necessary for evaluating the rearing condition of the animal (the buttock vicinity 51a in the first embodiment).
• In addition, since the growing condition evaluation system 10 generates the data indicating the animal required for evaluation (the animal point cloud data D4) from the synthesized point cloud data D3, generates the three-dimensional surface data D5 from it, and then generates the evaluation data Ev, the volume of the three-dimensional surface data D5 and the evaluation data Ev and the amount of work for generating them can be suppressed, and the evaluation data Ev can be generated efficiently.
• In the growing condition evaluation system 10, the animal detection mechanism (the camera 12 and the animal detection unit 61) generates the individual identification data D2 identifying the detected animal (dairy cow 51), and the rearing condition evaluation unit 71 associates the individual identification data D2 with the evaluation data Ev. Therefore, the growing condition evaluation system 10 can appropriately evaluate the rearing condition of each animal even in a situation where a plurality of animals visit the data acquisition area 14 at random times or in any order.
• Accordingly, with the growing condition evaluation system 10 as an example of the growing condition evaluation system according to the present disclosure, it is possible to appropriately obtain an evaluation of the rearing condition of the animal (dairy cow 51).
• Although the growing condition evaluation system of the present disclosure has been described based on Example 1, the specific configuration is not limited to Example 1, and design changes and additions are permitted as long as they do not deviate from the gist of the invention according to each claim.
  • the drinking fountain 52 is set as the data acquisition area 14 .
  • the data acquisition area 14 may be set as appropriate, and is not limited to the configuration of the first embodiment.
• For the data acquisition area 14, by setting a place that the animal regularly visits of its own will, such as the drinking fountain 52 or a feeding place, an evaluation of the rearing condition can be obtained appropriately without stressing the animal.
• In that case, the animal's stress can be further reduced, and an evaluation of the rearing condition can be obtained appropriately.
• The above required time depends on the scanning speed of each laser measuring device 13 and can be shortened by, for example, increasing the number of pulsed laser beams emitted at one time; even when shortened, the point cloud data D1 can be acquired appropriately.
  • Example 1 the dairy cow 51 is targeted as an example of an animal, and the evaluation data Ev indicating the evaluation of the breeding status is generated.
  • an animal for which evaluation data Ev is generated is not limited to the configuration of the first embodiment as long as the animal is required to evaluate the raising state.
  • In Example 1, the approximated curve calculation unit 69 calculates the approximated curve Ac that fits the contour OL on the outer surface side of the cross-sectional data D7 near the buttocks generated by the cut surface extraction unit 68.
  • Because the locations used for evaluating the growth status are essentially left-right symmetrical, the approximated curve Ac may instead be calculated for only half of the body (evaluating one half), and the calculation is not limited to the configuration of Example 1.
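Extracting a cross-sectional contour from point cloud data can be sketched by collecting the points inside a thin slab around the cutting plane. This is an illustrative simplification, not the patent's cut surface extraction unit 68; the function name, slab thickness, and axis convention are assumptions.

```python
import numpy as np

def extract_cross_section(cloud, x0, thickness=0.01):
    """Hypothetical slab-based cross-section extraction.

    cloud: (N, 3) point cloud with the body axis along x.
    Returns the (M, 2) y-z coordinates of points lying within
    `thickness` of the cutting plane x = x0; projected onto the
    y-z plane, they approximate the cross-sectional contour.
    """
    mask = np.abs(cloud[:, 0] - x0) < thickness / 2
    return cloud[mask][:, 1:]
```

For a left-right-symmetric body, the returned contour could then be restricted to one half (e.g. points with y >= 0) before curve fitting.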
  • In Example 1, the animal detection unit 61 functions as an animal detection mechanism that detects the presence of an animal in the data acquisition area 14 based on the image from the camera 12.
  • However, the animal detection mechanism is not limited to the configuration of Example 1 as long as it detects the presence of an animal in the data acquisition area 14.
  • For example, a device using infrared rays, such as an infrared scanner or infrared thermography, can be used. In this case, animals can be detected more reliably even in dark conditions.
  • Alternatively, a pressure sensor may be provided at the faucet of the drinking fountain 52, a weight detection device may be provided in the data acquisition area 14, or a sensor may be provided at the entrance of the data acquisition area 14.
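Any of these sensor-based alternatives reduces to deciding, from a stream of readings, when an animal is actually present. A minimal sketch, assuming a simple threshold with debouncing (the class name, threshold, and `min_hits` count are illustrative, not from the patent):

```python
class PresenceDetector:
    """Hypothetical debounced presence trigger for a pressure sensor,
    weight detection device, or entrance sensor: requires `min_hits`
    consecutive readings above `threshold` before reporting presence,
    suppressing one-off false positives."""

    def __init__(self, threshold, min_hits=3):
        self.threshold = threshold
        self.min_hits = min_hits
        self._hits = 0

    def update(self, reading):
        """Feed one sensor reading; returns True once presence is confirmed."""
        self._hits = self._hits + 1 if reading > self.threshold else 0
        return self._hits >= self.min_hits
```

The confirmed-presence signal would then start point cloud acquisition, and a symmetric release condition could stop it.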
  • In Example 1, the laser measuring instruments 13 are provided in a pair with the data acquisition area 14 interposed between them. However, as long as the laser measuring devices 13 can acquire the point cloud data D1 of an animal present in the data acquisition area 14, a single device may be provided, or three or more may be provided; the configuration is not limited to Example 1.
  • In Example 1, the specific part extraction unit 66 generates the buttock-vicinity surface data D6 of the buttock vicinity 51a of the dairy cow 51.
  • However, the specified part may be set as appropriate and is not limited to the configuration of Example 1.
  • In Example 1, the specific portion extraction unit 66, the cut surface extraction unit 68, and the approximated curve calculation unit 69 operate based on characteristic portions of the bones.
  • However, each of these units may operate based on other parts, as long as those parts are detectable when setting the parts of the animal and are common regardless of individual differences between animals; the configuration is not limited to Example 1.
  • In Example 1, the approximate curve calculation unit 69 calculates the approximate curve Ac using a Bezier curve. However, the method of calculating the approximate curve may be set as appropriate and is not limited to Example 1, as long as the approximated curve calculation unit 69 calculates an approximate curve that fits the contour of the specified location (in Example 1, the contour OL of the cross-sectional data D7 near the buttocks) and can be represented by an equation in a predetermined format.
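Fitting a Bezier curve to an ordered 2-D contour can be done by linear least squares on the Bernstein basis. The patent only states that a Bezier curve is used; the degree, uniform parametrization, and solver below are illustrative choices, not the approximated curve calculation unit 69's actual method.

```python
import numpy as np
from math import comb

def bernstein_matrix(t, degree):
    """Rows are Bernstein basis values B_j(t_i), j = 0..degree."""
    t = np.asarray(t)
    return np.stack([comb(degree, j) * t**j * (1 - t)**(degree - j)
                     for j in range(degree + 1)], axis=1)

def fit_bezier(points, degree=3):
    """Least-squares Bezier fit to ordered contour samples (N, 2).

    Returns the (degree + 1, 2) control points of the fitted curve.
    """
    t = np.linspace(0.0, 1.0, len(points))  # uniform parametrization
    B = bernstein_matrix(t, degree)
    ctrl, *_ = np.linalg.lstsq(B, np.asarray(points), rcond=None)
    return ctrl

def eval_bezier(ctrl, t):
    """Evaluate the Bezier curve with control points `ctrl` at parameters t."""
    return bernstein_matrix(t, len(ctrl) - 1) @ ctrl
```

Because the fitted curve is fully described by its control points, it satisfies the requirement above of being representable by an equation in a predetermined format, and downstream evaluation can operate on the control points alone.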

Abstract

The present invention relates to a growing condition evaluation system that can appropriately evaluate the development condition of an animal. The growing condition evaluation system (10) comprises: a laser measuring device (13) that acquires point cloud data (D1) indicating the appearance of an animal (cow 51) in three-dimensional coordinates by receiving laser light emitted toward and reflected by the animal (cow 51); a surface data generation unit (65) that generates three-dimensional surface data (D5) of the animal (cow 51) based on the point cloud data (D1); an approximate curve calculation unit (69) that calculates an approximate curve (Ac) fitted to the three-dimensional surface data (D5); and a development condition evaluation unit (71) that generates, based on the approximate curve (Ac), evaluation data (Ev) indicating an evaluation of the animal (cow 51).
PCT/JP2022/013100 2021-03-25 2022-03-22 Système d'évaluation de condition de croissance WO2022202793A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-052422 2021-03-25
JP2021052422A JP2022150026A (ja) 2021-03-25 2021-03-25 生育状況評価システム

Publications (1)

Publication Number Publication Date
WO2022202793A1 true WO2022202793A1 (fr) 2022-09-29

Family

ID=83395847

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/013100 WO2022202793A1 (fr) 2021-03-25 2022-03-22 Système d'évaluation de condition de croissance

Country Status (2)

Country Link
JP (1) JP2022150026A (fr)
WO (1) WO2022202793A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10500207A (ja) * 1994-04-14 1998-01-06 フェノ・イメージング、インク 動物の三次元表現型測定装置
US20100289879A1 (en) * 2009-05-01 2010-11-18 Texas Tech University System Remote Contactless Stereoscopic Mass Estimation System
JP2014044078A (ja) * 2012-08-24 2014-03-13 Univ Of Miyazaki 動物体の体重推定装置、及び体重推定方法
JP2019045478A (ja) * 2017-09-06 2019-03-22 国立大学法人 宮崎大学 家畜の体重推定装置及び家畜の体重推定方法
JP2019187277A (ja) * 2018-04-24 2019-10-31 国立大学法人 宮崎大学 牛のボディコンディションスコアの評価装置、評価方法及び評価プログラム
JP2019211364A (ja) * 2018-06-06 2019-12-12 全国農業協同組合連合会 動物体の体重推定装置及び体重推定方法
WO2021166894A1 (fr) * 2020-02-18 2021-08-26 国立大学法人宮崎大学 Dispositif d'estimation de poids et programme

Also Published As

Publication number Publication date
JP2022150026A (ja) 2022-10-07

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22775576

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22775576

Country of ref document: EP

Kind code of ref document: A1