WO2022202793A1 - Growing condition evaluation system

Info

Publication number
WO2022202793A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
animal
unit
point cloud
evaluation system
Application number
PCT/JP2022/013100
Other languages
French (fr)
Japanese (ja)
Inventor
裕 竹村
彩乃 矢羽田
俊和 阿出川
宗央 横山
Original Assignee
学校法人東京理科大学
株式会社トプコン
Application filed by 学校法人東京理科大学 (Tokyo University of Science) and 株式会社トプコン (Topcon Corporation)
Publication of WO2022202793A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Description

  • This disclosure relates to a growing condition evaluation system.
  • The BCS (Body Condition Score) is known as an index for evaluating the growing condition of an animal.
  • A health condition estimating device has been proposed that can automatically determine the BCS without touching the animal (see Patent Document 1, for example).
  • The health condition estimating device obtains a three-dimensional coordinate group indicating the three-dimensional shape of the animal using a range image sensor, obtains from it feature values indicating the width of the animal's body and the position of the spine, and calculates the BCS based on these values. The conventional health condition estimating apparatus can therefore obtain a BCS-based evaluation with reduced variation while reducing the burden on the animal.
  • However, in the conventional device a range image sensor is used to acquire the three-dimensional coordinate group of the animal. The acquired three-dimensional coordinate group therefore reproduces the three-dimensional shape of the animal only to a limited extent, and it is difficult to obtain an appropriate evaluation.
  • The present disclosure has been made in view of the above circumstances, and aims to provide a growing condition evaluation system that can appropriately obtain an evaluation of the growing condition of animals.
  • The growing condition evaluation system of the present disclosure includes: a laser measuring device that acquires point cloud data representing the contour of an animal as three-dimensional coordinates by receiving reflected light of an emitted laser beam from the animal; a surface data generation unit that generates three-dimensional surface data of the animal based on the point cloud data; an approximate curve calculation unit that calculates an approximate curve that fits the three-dimensional surface data; and a growing condition evaluation unit that generates evaluation data indicating an evaluation of the animal based on the approximate curve.
  • According to the growing condition evaluation system of the present disclosure, it is possible to appropriately obtain an evaluation of the animal's growing condition.
  • FIG. 1 is an explanatory diagram showing the overall configuration of a growing condition evaluation system of Example 1 as an example of a growing condition evaluation system according to the present disclosure.
  • FIG. 2 is a block diagram showing the configuration of the control system in the growing condition evaluation system.
  • FIG. 3 is an explanatory diagram showing how the imaging device in the growing condition evaluation system is attached.
  • FIG. 4 is an explanatory diagram showing an image of the dairy cow captured by the imaging device.
  • FIG. 5 is an explanatory diagram showing the laser measuring device in the growing condition evaluation system.
  • FIG. 6 is a block diagram showing the configuration of the control system in the laser measuring device.
  • FIG. 7 is an explanatory diagram showing the point cloud data acquired by one laser measuring device (the first).
  • FIG. 8 is an explanatory diagram showing the point cloud data acquired by the other laser measuring device (the second).
  • FIG. 9 is an explanatory diagram showing synthesized point cloud data obtained by synthesizing both point cloud data.
  • FIG. 10 is an explanatory diagram showing animal point cloud data representing the dairy cow.
  • FIG. 11 is an explanatory diagram showing three-dimensional surface data obtained by meshing the animal point cloud data.
  • FIG. 12 is an explanatory diagram showing buttock vicinity surface data obtained by cutting out the vicinity of the buttocks from the three-dimensional surface data.
  • FIG. 13 is an explanatory diagram showing the slice positions for cutting out buttock vicinity cross-sectional data from the buttock vicinity surface data.
  • FIG. 14 is an explanatory diagram showing buttock vicinity cross-sectional data, in which the data corresponding to the slice positions in FIG. 13 are arranged in order from the left.
  • FIG. 15 is an explanatory diagram showing how an approximate curve of the contour of the buttock vicinity cross-sectional data is obtained.
  • FIG. 16 is an explanatory diagram showing, as an example, buttock vicinity cross-sectional data (its contour) and an approximate curve.
  • FIG. 17 is a flowchart showing the growing condition evaluation process (growing condition evaluation processing method) performed by the control mechanism of the growing condition evaluation system.
  • Example 1 of the growing condition evaluation system 10 as one embodiment of the growing condition evaluation system according to the present disclosure will be described below with reference to FIGS. 1 to 17 .
  • FIGS. 1 and 3 schematically show the vicinity of the drinking fountain 52 (data acquisition area 14) in the barn 50 and do not necessarily match the actual state of the barn 50.
  • The growing condition evaluation system 10 automatically evaluates the growing condition of animals.
  • The growing condition evaluation system 10 of Example 1 evaluates the growing condition of a Holstein dairy cow (hereinafter referred to as the dairy cow 51) as an example of an animal.
  • This growing condition evaluation system 10 is installed in a cow barn 50 as shown in FIGS.
  • A plurality of dairy cows 51 are kept in the cowshed 50, and each dairy cow 51 can move about freely.
  • a cowshed 50 is provided with a drinking fountain 52 for a dairy cow 51 .
  • the drinking fountain 52 is constructed by placing a water tub 53 in a space in which a plurality of dairy cows 51 can enter.
  • the water tub 53 is elongated and arranged at one corner of the drinking fountain 52 .
  • a dairy cow 51 periodically visits the drinking fountain 52 of its own volition and stops in front of the water tub 53 . Therefore, in the growing condition evaluation system 10 , the drinking fountain 52 is set as the data acquisition area 14 .
  • The control mechanism 11 comprehensively controls the operation of the growing condition evaluation system 10 by loading a program stored in the storage unit 18 or in the built-in internal memory 11a onto, for example, a RAM (Random Access Memory) and executing it.
  • the internal memory 11a is composed of a RAM or the like
  • the storage unit 18 is composed of a ROM (Read Only Memory), an EEPROM (Electrically Erasable Programmable ROM), or the like.
  • The growing condition evaluation system 10 is also provided, as appropriate, with a printer that prints the measurement results in response to a measurement completion signal or an instruction from the measurer, an output unit that outputs the measurement results to an external memory or a server, and an audio output unit for notifying the operating status and the like.
  • The control mechanism 11 is connected to a camera 12 and two laser measuring devices 13 (one is designated as the first and the other as the second in FIG. 2), appropriately controls them, and can receive signals (data) from them. This connection may be wired or wireless as long as signals can be exchanged with the camera 12 and each laser measuring device 13.
  • the control mechanism 11 may be provided at a position different from the cowshed 50 or may be provided inside the cowshed 50 .
  • An operation unit 15 , a display unit 16 , a communication unit 17 and a storage unit 18 are connected to the control mechanism 11 .
  • the operation unit 15 is for operating various settings for evaluating the growth situation, operations and settings of the camera 12 and each laser measuring device 13, and the like.
  • the operation unit 15 may be composed of an input device such as a keyboard and a mouse, or may be composed of software keys displayed on the display screen of the display unit 16 of a touch panel type.
  • The display unit 16 displays the image I acquired by the camera 12 (either a still image or a moving image (see FIG. 4)), the point cloud data D1 acquired by each laser measuring device 13 (see FIGS. 7 and 8), various other data (see FIGS. 9 to 14, etc.), the evaluation data Ev indicating the evaluation of the animal (dairy cow 51), and so on.
  • the display unit 16 is configured by a liquid crystal display device (LCD monitor) as an example, and is provided in the control mechanism 11 together with the operation unit 15 .
  • the operation unit 15 and the display unit 16 may be composed of a mobile terminal such as a smart phone or a tablet, and are not limited to the configuration of the first embodiment.
  • The communication unit 17 communicates with the camera 12, with each laser measuring device 13 (its communication unit 45), and with external devices; it enables driving of the camera 12 and each laser measuring device 13, reception of the image I from the camera 12, and reception of the point cloud data D1 from each laser measuring device 13.
  • the control mechanism 11 can be configured by a tablet terminal, and in that case, the operation unit 15 and the display unit 16 can be configured integrally with the control mechanism 11 .
  • The camera 12 is capable of photographing the entire data acquisition area 14; in Example 1, a 4K camera (with a resolution of 3840 × 2160 pixels) is used. As shown in FIGS. 1 and 3, the camera 12 is attached to the installation plate 54 above the drinking fountain 52, which is the data acquisition area 14, so that it can photograph the drinking fountain 52 without disturbing the dairy cows 51. The installation plate 54 is provided on a post 55 of the barn 50 for installation of the camera 12.
  • The camera 12 always captures the drinking fountain 52 while the growing condition evaluation system 10 is in operation, and outputs the captured image I (its data (see FIG. 4)) to the control mechanism 11.
  • The two laser measuring devices 13 are installed at known points; each projects a pulsed laser beam toward a measurement point, receives the reflected light (pulse reflected light) of the pulsed laser beam from the measurement point, performs distance measurement, and averages the distance measurement results to achieve highly accurate distance measurement.
  • Each laser measuring device 13 scans the set measurement range and sets measurement points evenly over the entire range, so that the surface shape of the objects existing within the measurement range is measured three-dimensionally and a collection of three-dimensional position data expressed in three-dimensional coordinates (hereinafter referred to as point cloud data D1) can be obtained.
  • In other words, each laser measuring device 13 can acquire the point cloud data D1 representing the surface shape of the objects existing within the set measurement range.
  • The two laser measuring devices 13 are placed so as to form a pair across the drinking fountain 52, which is the data acquisition area 14, at positions and at a height that do not interfere with the movement of the dairy cow 51. Their positional relationship with respect to the drinking fountain 52 is set so that, regardless of the posture of the dairy cow 51 in the drinking fountain 52 (data acquisition area 14), both laser measuring devices 13 together acquire point cloud data D1 of at least half of the body (either the left or the right side) near the buttocks 51a of the dairy cow 51.
  • These two laser measuring instruments 13 have the same configuration, except that they are provided at different positions.
  • the laser measuring device 13 may adopt a phase difference measuring method using a light beam modulated at a predetermined frequency, or may adopt another method, and is not limited to the first embodiment.
  • The laser measuring device 13 includes a base 31, a body portion 32, and a vertical rotating portion 33, as shown in FIGS. 5 and 6.
  • the pedestal 31 is a portion that is attached to the installation base 34 .
  • the installation base 34 is attached to a post 55 of the barn 50 for installation of the laser measuring device 13 .
  • the body portion 32 is provided on the base 31 so as to be rotatable about the vertical axis with respect to the base 31 .
  • the body portion 32 has a U shape as a whole, and a vertical rotating portion 33 is provided in the portion therebetween.
  • a measuring instrument display section 35 and a measuring instrument operating section 36 are provided in the main body section 32 .
  • the measuring instrument display section 35 is a section that displays various operation icons, settings, etc. for measurement under the control of the measuring instrument control section 43, which will be described later.
  • the measuring device operation unit 36 is a place where operations for using and setting various functions of the laser measuring device 13 are performed, and information that has been input and operated is output to the measuring device control unit 43 .
  • various operation icons displayed on the measuring instrument display section 35 function as the measuring instrument operating section 36 .
  • In the growing condition evaluation system 10, the functions of the measuring device display unit 35 and the measuring device operation unit 36 can also be given to the display unit 16 and the operation unit 15, so that the laser measuring device 13 can be operated remotely while it remains installed on the installation base 34 of the post 55. As will be described later, as long as the laser measuring device 13 can scan the data acquisition area 14 (and an animal such as the dairy cow 51 there), its location and installation method may be set as appropriate and are not limited to the configuration of the first embodiment.
  • the vertical rotation part 33 is provided on the main body part 32 so as to be rotatable around a rotation axis extending in the horizontal direction.
  • a range finding optical unit 37 is incorporated in the vertical rotation unit 33 .
  • the distance measuring optical unit 37 projects a pulsed laser beam as distance measuring light and receives reflected light (pulse reflected light) from the measuring point to measure the light wave distance to the measuring point.
  • a horizontal rotation drive section 38 and a horizontal angle detection section 39 are provided in the body section 32 that allows the vertical rotation section 33 to rotate around the horizontal axis.
  • the horizontal rotation driving section 38 rotates the body section 32 with respect to the pedestal 31 around the vertical axis, that is, in the horizontal direction.
  • the horizontal angle detection unit 39 detects (measures) the horizontal angle in the collimation direction by detecting the horizontal rotation angle of the main body 32 with respect to the pedestal 31 .
  • the horizontal rotation drive section 38 can be configured with a motor
  • the horizontal angle detection section 39 can be configured with an encoder.
  • the body portion 32 is provided with a vertical rotation drive portion 41 and a vertical angle detection portion 42 .
  • the vertical rotation drive section 41 rotates the vertical rotation section 33 with respect to the body section 32 around the horizontal axis, that is, in the vertical direction.
  • the vertical angle detection unit 42 detects (measures) the vertical angle of the collimation direction by detecting the vertical angle of the vertical rotation unit 33 with respect to the main unit 32 .
  • the vertical rotation drive unit 41 can be configured with a motor
  • the vertical angle detection unit 42 can be configured with an encoder.
  • the main body part 32 incorporates a measuring device control part 43 .
  • the measuring device control unit 43 controls the operation of the laser measuring device 13 by a program stored in the connected storage unit 44 .
  • The storage unit 44 is configured by a semiconductor memory or various storage media, stores programs such as a calculation program necessary for measurement and a data transmission program for generating and transmitting information to be transmitted, and also appropriately stores the acquired point cloud data D1. The information and data are appropriately transmitted to the control mechanism 11 via the communication unit 45 described later and the communication unit 17 (see FIG. 2) described above.
  • To the measuring device control unit 43, the measuring device display unit 35, the measuring device operation unit 36, the distance measuring optical unit 37, the horizontal rotation driving unit 38, the horizontal angle detecting unit 39, the vertical rotation driving unit 41, the vertical angle detecting unit 42, the storage unit 44, and the communication unit 45 are connected.
  • The communication unit 45 enables communication between the control mechanism 11 (see FIG. 2) and the measuring device control unit 43 via the communication unit 17, and transmits the data and information stored in the storage unit 44 under the control of the measuring device control unit 43 as appropriate.
  • the communication unit 45 enables exchange of data and the like with the control mechanism 11 (communication unit 17).
  • the communication unit 45 may perform wired communication with the communication unit 17 via a laid LAN cable, or may perform wireless communication with the communication unit 17 .
  • Output values for measurement from the distance measuring optical unit 37, the horizontal angle detection unit 39, and the vertical angle detection unit 42 are input to the measuring device control unit 43. Based on these output values, the measuring device control unit 43 calculates the distance to the measurement point (reflection point) from the arrival time difference or phase difference between the reference light propagating through the reference optical path provided in the main unit 32 and the reflected light acquired via the distance measuring optical unit 37. In addition, the measuring device control unit 43 measures (calculates) the elevation angle and horizontal angle at the time of the distance measurement. The measuring device control unit 43 then stores these measurement results in the storage unit 44 and transmits them to the control mechanism 11 (communication unit 17) via the communication unit 45 as appropriate.
  • the measuring device control unit 43 controls the driving of the horizontal rotation driving unit 38 and the vertical rotation driving unit 41 to appropriately rotate the main body unit 32 and the vertical rotation unit 33 (see FIG. 1), so that the vertical rotation unit 33 can be directed in a predetermined direction and can scan a predetermined range.
  • the measuring device control unit 43 of the first embodiment scans the water fountain 52 which is the data acquisition area 14, and each position of the water fountain 52 including the dairy cow 51 there is taken as a measurement point.
  • the positional relationship between the two laser measuring devices 13 and the drinking fountain 52 is known in advance and is constant (does not change).
  • the measuring device control unit 43 of the first embodiment sets measurement points at intervals of 12.5 mm on the scanning plane.
  • the scanning plane can be appropriately set within the drinking fountain 52, and in Example 1, it is set at a position where the vicinity 51a of the buttocks of the dairy cow 51 is assumed to be located.
  • The measuring device control unit 43 controls the distance measuring optical unit 37 to perform distance measurement for each set measurement point while scanning the drinking fountain 52. At this time, the measuring device control unit 43 measures (calculates) the elevation angle and horizontal angle in the sighting direction, thereby measuring the three-dimensional coordinate position of each measurement point of the drinking fountain 52. The measuring device control unit 43 then generates the point cloud data D1 (one example is shown in FIG. 7 and the other in FIG. 8) and transmits it to the control mechanism 11 via the communication unit 45 and the communication unit 17 as appropriate.
  • the point cloud data D1 indicates the surface shape of the drinking fountain 52, and if the dairy cow 51 is present at the drinking fountain 52, the surface shape of the dairy cow 51 is also included.
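  • As a rough illustration of how each measurement point can be turned into a three-dimensional coordinate from the measured distance and the horizontal and vertical angles, the following Python sketch converts a pulse round-trip time and the two measured angles into an (x, y, z) position in the instrument's own coordinate frame. The function names, the simple averaging of repeated pulses, and the axis convention are illustrative assumptions, not the instrument's actual firmware.

```python
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def distance_from_tof(round_trip_times_s):
    """Average several pulse round-trip times and convert to a one-way distance."""
    t = np.mean(np.asarray(round_trip_times_s))
    return C * t / 2.0

def measurement_point_xyz(distance_m, horizontal_angle_rad, vertical_angle_rad):
    """Convert a measured distance plus the horizontal/vertical angles of the
    sighting direction into a 3D coordinate in the instrument frame
    (x forward at 0 deg, y to the left, z up; an assumed convention)."""
    d_horiz = distance_m * np.cos(vertical_angle_rad)
    x = d_horiz * np.cos(horizontal_angle_rad)
    y = d_horiz * np.sin(horizontal_angle_rad)
    z = distance_m * np.sin(vertical_angle_rad)
    return np.array([x, y, z])

# Example: a point at roughly 10 m, 30 deg horizontally, 5 deg below horizontal
d = distance_from_tof([66.7e-9, 66.8e-9])  # two averaged pulses
p = measurement_point_xyz(d, np.deg2rad(30.0), np.deg2rad(-5.0))
```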
  • For evaluating the growing condition of the dairy cow 51, the control mechanism 11 includes an animal detection unit 61, a data acquisition unit 62, a data synthesizing unit 63, an animal extraction unit 64, a surface data generation unit 65, a specific part cutout unit 66, a part volume calculation unit 67, a cut surface extraction unit 68, an approximate curve calculation unit 69, and a growing condition evaluation unit 71.
  • the animal detection unit 61 detects from the image I acquired by the camera 12 that there is an animal (in the first embodiment, the dairy cow 51 (see FIG. 4)) at the drinking fountain 52 defined as the data acquisition area 14 .
  • the animal detection unit 61 recognizes various shapes in the image I based on the contrast and the like, and distinguishes between the facilities such as the water tub 53 in the drinking fountain 52 and the dairy cow 51 based on the recognized shapes and the like.
  • Since the camera 12 is arranged at a predetermined position, the animal detection unit 61 knows in advance the image of the facilities such as the water tub 53 at the drinking fountain 52, and can therefore detect the image of the dairy cow 51 by using that known image.
  • the animal detection section 61 outputs a signal to the data acquisition section 62 indicating that the dairy cow 51 has been detected at the drinking fountain 52 . Therefore, the animal detection unit 61 functions as an animal detection mechanism that detects the presence of an animal in the data acquisition area 14 in cooperation with the camera 12 .
  • the animal detection unit 61 of Example 1 individually identifies the detected animal (dairy cow 51) and generates individual identification data D2 indicating the identified information.
  • the animal detection unit 61 identifies which dairy cow 51 among the dairy cows kept in the barn 50 is shown, based on the image I in which the dairy cow 51 is shown.
  • The animal detection unit 61 recognizes the shape and position of the black-and-white pattern of the dairy cow 51, based on contrast and the like, from the area of the image I in which the dairy cow 51 appears, and identifies the dairy cow 51 by comparing the recognized black-and-white pattern with the black-and-white patterns of the dairy cows registered in advance.
  • the animal detection unit 61 generates individual identification data D2 by identifying the dairy cow 51 for each individual according to the identification, and outputs the individual identification data D2 to the data acquisition unit 62 .
  • As long as the animal detection unit 61 identifies which of the dairy cows kept in the barn 50 the dairy cow 51 shown in the image I (that is, the dairy cow 51 at the drinking fountain 52 at that time) is, and generates the individual identification data D2 identifying the dairy cow 51 for each individual on that basis, other methods such as using tags may be used, and the configuration is not limited to that of the first embodiment.
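  • A minimal sketch of the kind of image comparison the animal detection unit 61 could perform is shown below: it flags a cow as present when the current camera frame differs strongly from a stored frame of the empty drinking fountain, and compares the differing region with pre-registered black-and-white patterns. The thresholds, the normalized-correlation matcher, and all function names are assumptions for illustration, not the patented implementation.

```python
import numpy as np

def cow_present(frame_gray, empty_fountain_gray, diff_thresh=30, area_thresh=0.02):
    """Return (True, changed_mask) if enough pixels differ from the empty-fountain image."""
    diff = np.abs(frame_gray.astype(np.int16) - empty_fountain_gray.astype(np.int16))
    changed = diff > diff_thresh
    return changed.mean() > area_thresh, changed

def identify_individual(frame_gray, changed_mask, registered_patterns):
    """Pick the registered black-and-white pattern most similar to the changed region.

    registered_patterns: dict {cow_id: 2D template array of the same shape as the frame}.
    """
    region = np.where(changed_mask, frame_gray, 0).astype(np.float64)
    region -= region.mean()
    best_id, best_score = None, -np.inf
    for cow_id, template in registered_patterns.items():
        t = np.where(changed_mask, template, 0).astype(np.float64)
        t -= t.mean()
        denom = np.linalg.norm(region) * np.linalg.norm(t)
        score = float(region.ravel() @ t.ravel() / denom) if denom > 0 else -np.inf
        if score > best_score:
            best_id, best_score = cow_id, score
    return best_id  # used to build the individual identification data D2
```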
  • When the data acquisition unit 62 receives from the animal detection unit 61 a signal indicating that the dairy cow 51 has been detected, it drives the two laser measuring devices 13 to scan the drinking fountain 52 (data acquisition area 14). The data acquisition unit 62 acquires the point cloud data D1 of the drinking fountain 52 from each laser measuring device 13 (see FIGS. 7 and 8), associates each point cloud data D1 with the individual identification data D2, and outputs them to the data synthesizing unit 63.
  • Since the dairy cow 51 is present at the drinking fountain 52 when the data acquisition unit 62 is driven in this way, the point cloud data D1 includes the dairy cow 51 indicated by the individual identification data D2.
  • the data synthesizing unit 63 synthesizes the point cloud data D1 acquired by the two laser measuring instruments 13 to generate synthesized point cloud data D3 (see FIG. 9).
  • Since the two laser measuring devices 13 are provided at positions that form a pair with the drinking fountain 52 interposed between them as described above, they measure the same drinking fountain 52 from mutually different directions.
  • By synthesizing the respective point cloud data D1 (so-called point cloud synthesis), a collection of three-dimensional position data can be obtained that includes the surface shapes of different sides of the drinking fountain 52 (for example, one covering the right side and the other the left side of the dairy cow 51 there).
  • The data synthesizing unit 63 synthesizes the point cloud data D1 to generate the synthesized point cloud data D3 by connecting overlapping portions of the point clouds (so-called point cloud matching), by using targets that serve as landmarks (the so-called tie point method), or by using other known techniques.
  • In addition, the drinking fountain 52 is set as the data acquisition area 14, and the positional relationship of each laser measuring device 13 with respect to the drinking fountain 52 is known in advance.
  • Therefore, the data synthesizing unit 63 can appropriately generate the synthesized point cloud data D3.
  • the data synthesizing unit 63 associates the generated synthetic point cloud data D3 with the individual identification data D2 and outputs the data to the animal extracting unit 64 .
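  • Because the pose of each laser measuring device 13 relative to the drinking fountain 52 is known in advance, point cloud synthesis can, in the simplest case, amount to transforming each scanner's points into a common drinking-fountain frame and concatenating them. The sketch below assumes each scanner's pose is given as a rotation matrix and translation vector; refinement by point cloud matching or tie points, as mentioned above, would follow the same pattern with an estimated transform instead.

```python
import numpy as np

def to_common_frame(points_xyz, rotation_3x3, translation_3):
    """Transform an (N, 3) point cloud from a scanner frame into the common frame."""
    return points_xyz @ rotation_3x3.T + translation_3

def synthesize_point_clouds(cloud_a, pose_a, cloud_b, pose_b):
    """Merge the point clouds D1 of the first and second laser measuring devices
    into synthesized point cloud data D3, given each device's known pose
    (rotation, translation) with respect to the drinking fountain."""
    a_world = to_common_frame(cloud_a, *pose_a)
    b_world = to_common_frame(cloud_b, *pose_b)
    return np.vstack([a_world, b_world])
```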
  • The animal extraction unit 64 extracts only the three-dimensional coordinate positions (coordinate data) corresponding to the dairy cow 51 from the synthesized point cloud data D3 generated by the data synthesizing unit 63, thereby generating the animal point cloud data D4 (see FIG. 10) representing the surface shape of the dairy cow 51.
  • the drinking fountain 52 is set as the data acquisition area 14, and the positional relationship of the laser measuring device 13 with respect to the drinking fountain 52 is known in advance.
  • the point cloud data of the drinking fountain 52 to be acquired is also known in advance. For this reason, the animal extraction unit 64 generates animal point cloud data D4 by taking the difference between the synthetic point cloud data D3 and the point cloud data representing the drinking fountain 52 without the dairy cow 51 .
  • This animal point cloud data D4 always includes at least half of the body near the buttocks 51a regardless of the posture of the dairy cow 51 in the drinking fountain 52.
  • the animal extraction unit 64 associates the generated animal point cloud data D4 with the individual identification data D2 and outputs the generated animal point cloud data D4 to the surface data generation unit 65 .
  • the animal extraction unit 64 may have another configuration as long as it generates the animal point cloud data D4, and is not limited to the configuration of the first embodiment.
  • For example, the animal extraction unit 64 may pre-register three-dimensional coordinate positions (coordinate data) of dairy cows 51 of various angles, sizes, and types, and extract the animal point cloud data D4 by comparing against them.
  • Alternatively, the animal extraction unit 64 may extract clean planes such as the floor and walls of the drinking fountain 52 from the synthesized point cloud data D3 and, using such a plane as a reference, perform threshold processing to extract the region where the dairy cow 51 is likely to be.
  • The animal extraction unit 64 may then divide the point cloud data close to the extracted region into a plurality of clusters by clustering processing and extract the largest cluster among them as the animal point cloud data D4, assuming it to be the dairy cow 51.
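  • One way to realize the two extraction strategies described above (differencing against a scan of the empty drinking fountain, or plane removal followed by clustering) is sketched below. It assumes SciPy and scikit-learn are available and that the distance threshold and DBSCAN parameters have been tuned to the scene; it is an illustration, not the claimed method.

```python
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN

def subtract_background(synth_cloud, empty_fountain_cloud, min_dist=0.05):
    """Keep only points farther than min_dist (m) from the empty-fountain scan."""
    tree = cKDTree(empty_fountain_cloud)
    dists, _ = tree.query(synth_cloud, k=1)
    return synth_cloud[dists > min_dist]

def largest_cluster(points, eps=0.10, min_samples=30):
    """Cluster the remaining points and return the largest cluster,
    taken to be the animal point cloud data D4."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    valid = labels[labels >= 0]
    if valid.size == 0:
        return points[:0]
    best = np.bincount(valid).argmax()
    return points[labels == best]

# animal_d4 = largest_cluster(subtract_background(synth_d3, empty_fountain_scan))
```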
  • the surface data generation unit 65 generates three-dimensional surface data D5 (see FIG. 11) showing the dairy cow 51 using a known technique based on the animal point cloud data D4.
  • the surface data generation unit 65 of the first embodiment generates three-dimensional surface data D5 (mesh data) of the dairy cow 51 by pasting a plurality of meshes using an algorithm for generating triangular meshes and using Poisson reconstruction.
  • the surface data generating unit 65 may have another configuration as long as it generates the three-dimensional surface data D5, and is not limited to the configuration of the first embodiment.
  • the surface data generation unit 65 associates the generated three-dimensional surface data D5 with the individual identification data D2 and outputs the data to the specific part extraction unit 66 .
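  • Triangular meshing followed by Poisson reconstruction, as used by the surface data generation unit 65, is available in off-the-shelf libraries. The sketch below uses Open3D as one possible implementation (an assumption, since the patent does not name a library), with normal-estimation parameters that would need tuning to the actual scan density.

```python
import numpy as np
import open3d as o3d

def reconstruct_surface(animal_points_xyz, depth=9):
    """Build three-dimensional surface data D5 (a triangle mesh) from the
    animal point cloud data D4 using Poisson surface reconstruction."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(animal_points_xyz)
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))
    pcd.orient_normals_consistent_tangent_plane(30)
    mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=depth)
    return mesh
```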
  • The specific part cutout unit 66 generates, from the three-dimensional surface data D5 representing the dairy cow 51, buttock vicinity surface data D6 (see FIGS. 12 and 13) of the buttock vicinity 51a.
  • The specific part cutout unit 66 generates the buttock vicinity surface data D6 because the buttock vicinity 51a of the dairy cow 51 is considered to be important in evaluating the growing condition of the dairy cow 51.
  • The specific part cutout unit 66 of Example 1 extracts characteristic portions of the bones in the buttock vicinity 51a from the three-dimensional surface data D5 and sets the cutout planes based on these characteristic portions.
  • In the front-rear direction of the dairy cow 51, the specific part cutout unit 66 uses, as a reference (referred to as the reference length Lr), the length from the middle position between the pin bone (see symbol Bp in FIG. 13) and the tail head (see symbol Bt in FIG. 13) to the hook bone (see symbol Bh in FIG. 13), and cuts along a cutout plane Scv set using this reference length.
  • In the height direction of the dairy cow 51, the specific part cutout unit 66 cuts along a cutout plane Sch perpendicular to the vertical direction so as to exclude the lower abdomen (from the belly to the udder). This is because the lower abdomen is a part that tends to change from day to day and is not suitable for judging the growing condition.
  • As long as the buttock vicinity surface data D6 is a set of three-dimensional coordinate positions (coordinate data) indicating the buttock vicinity 51a, its extent (the position of each cut plane) may be set as appropriate and is not limited to the configuration of the first embodiment.
  • the specific part extracting section 66 associates the individual identification data D2 with the generated buttock neighboring plane data D6 and outputs the data to the part volume calculating section 67 .
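  • Once the landmark positions (pin bone Bp, tail head Bt, hook bone Bh) have been located in the mesh, the cutout itself reduces to keeping only vertices on one side of the cutout planes Scv and Sch. The following sketch operates on vertex coordinates with the x axis taken along the cow's front-rear direction and z vertical; the axis convention, the exact placement of Scv, and the landmark inputs are assumptions for illustration.

```python
import numpy as np

def cut_out_buttock_region(vertices_xyz, x_pin, x_tail, x_hook, z_lower_abdomen):
    """Keep vertices behind the vertical cutout plane Scv (placed using the
    reference length Lr derived from the landmarks) and above the horizontal
    cutout plane Sch that excludes the lower abdomen."""
    x_mid = 0.5 * (x_pin + x_tail)        # midpoint between pin bone and tail head
    lr = abs(x_hook - x_mid)              # reference length Lr
    forward = np.sign(x_hook - x_mid)     # +1 if +x points toward the head (assumed)
    x_cut = x_mid + forward * lr          # cutout plane Scv placed using Lr (assumed)
    behind_scv = forward * (vertices_xyz[:, 0] - x_cut) <= 0
    above_sch = vertices_xyz[:, 2] >= z_lower_abdomen
    return vertices_xyz[behind_scv & above_sch]
```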
  • The part volume calculation unit 67 uses a known technique to calculate the volume (cutout part volume Vc) of the cutout part indicated by the buttock vicinity surface data D6 generated by the specific part cutout unit 66. For example, the part volume calculation unit 67 calculates, for every triangular mesh, the volume of a triangular pyramid whose base is the triangular mesh in the three-dimensional surface data D5 and whose apex is a reference point set for the buttock vicinity surface data D6, and sums these volumes to obtain the cutout part volume Vc.
  • part volume calculator 67 is not limited to the configuration of the first embodiment as long as it calculates the cut-out part volume Vc, which is the volume of the buttock vicinity plane data D6.
  • Part volume calculation section 67 associates individual identification data D2 with each cut-out part volume Vc and outputs it to growth status evaluation section 71 .
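  • The triangular-pyramid summation described above is the standard signed-tetrahedron formula for mesh volume. The sketch below assumes the cutout mesh is (or has been made) closed, with `vertices` as an (N, 3) array and `triangles` as an (M, 3) array of vertex indices; the reference point is taken as the vertex centroid, which is an implementation choice rather than something the patent specifies.

```python
import numpy as np

def cutout_part_volume(vertices, triangles):
    """Sum signed tetrahedron volumes (triangle as base, reference point as apex)
    to obtain the cutout part volume Vc of a closed triangle mesh."""
    ref = vertices.mean(axis=0)                      # common apex (reference point)
    a = vertices[triangles[:, 0]] - ref
    b = vertices[triangles[:, 1]] - ref
    c = vertices[triangles[:, 2]] - ref
    signed = np.einsum('ij,ij->i', a, np.cross(b, c)) / 6.0
    return abs(signed.sum())
```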
  • the cut surface extraction unit 68 generates, from the buttock vicinity surface data D6, the buttock vicinity cross-sectional data D7 (see FIG. 14), which is a cross section obtained by cutting the buttock vicinity 51a of the dairy cow 51 along a predetermined plane.
  • the cut plane extracting unit 68 of the first embodiment cuts the buttock vicinity plane data D6 along a slice position Sp set at an arbitrary position in the fore-and-aft direction on a predetermined plane perpendicular to the front-and-rear direction of the cow 51. By doing so, cross-sectional data D7 near the buttocks is generated.
  • This slice position Sp may be set as appropriate, but in Example 1, it is set at five positions shown in FIG.
  • the slice position Sp1 is set at the same position as the cut-out plane Scv when the buttock vicinity plane data D6 is cut out.
  • the slice position Sp3 is the position of the hook bone.
  • the slice position Sp2 is an intermediate position between the slice positions Sp1 and Sp3.
  • the slice position Sp5 is an intermediate position between tail head (see symbol Bt) and pin bone (see symbol Bp).
  • the slice position Sp4 is an intermediate position between the slice positions Sp3 and Sp5.
  • the cut surface extraction unit 68 may generate a cross section cut along a plane perpendicular to the width direction of the dairy cow 51 as the buttock neighborhood cross-section data D7.
  • As an example of this, the slice position Sp6 is also shown in FIG. 13; this slice position Sp6 is parallel to the spine at an intermediate position between the spine and the hook bone (see symbol Bh).
  • An example of the buttock vicinity cross-sectional data D7 is shown in FIG. 14. In FIG. 14, the data are arranged in order from the left as viewed from the front, corresponding to the number at the end of each slice position Sp, and the numbers 1 to 6 are appended to their reference signs. That is, the buttock vicinity cross-sectional data D71 corresponds to the slice position Sp1, D72 to the slice position Sp2, D73 to the slice position Sp3, D74 to the slice position Sp4, D75 to the slice position Sp5, and D76 to the slice position Sp6.
  • the cut surface extraction unit 68 associates the generated buttock neighborhood cross-section data D7 with the individual identification data D2 and outputs the data to the approximated curve calculation unit 69 .
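  • A simple way to obtain the buttock vicinity cross-sectional data D7 at a slice position Sp is to collect the mesh vertices that lie within a thin slab around the cutting plane and project them onto that plane. A production version would instead intersect the mesh edges with the plane; the slab thickness, the axis convention, and the angular ordering below are assumptions.

```python
import numpy as np

def cross_section(vertices_xyz, slice_x, half_thickness=0.005):
    """Return 2D (y, z) contour samples of the vertices lying within a thin slab
    |x - slice_x| < half_thickness, i.e. a cross section perpendicular to the
    cow's front-rear (x) direction, ordered by angle around the centroid."""
    mask = np.abs(vertices_xyz[:, 0] - slice_x) < half_thickness
    section = vertices_xyz[mask][:, 1:3]                 # (y, z) coordinates
    order = np.argsort(np.arctan2(section[:, 1] - section[:, 1].mean(),
                                  section[:, 0] - section[:, 0].mean()))
    return section[order]

# sections_d7 = [cross_section(d6_vertices, sp) for sp in (sp1, sp2, sp3, sp4, sp5)]
```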
  • the approximate curve calculation unit 69 calculates an approximate curve Ac (see FIGS. 15 and 16) that fits the contour OL (see FIG. 14) on the outer surface side in the cross-sectional data D7 near the buttocks generated by the cut surface extraction unit 68.
  • The approximate curve calculation unit 69 can calculate the approximate curve Ac that fits the contour OL by using a known technique; in the first embodiment, a Bezier curve is used so that the approximate curve Ac becomes smooth. This concept will be described with reference to FIGS. 15 and 16.
  • FIG. 15 shows a part of the contour OL and two approximation curves Ac calculated by fitting them (hereinafter, when shown separately, the left side is referred to as the first approximation curve Ac1 and the right side is referred to as the second approximation curve Ac2). ) and .
  • the outline OL is indicated by a thick line
  • the first approximate curve Ac1 is indicated by a thin dashed line
  • the second approximate curve Ac2 is indicated by a thin dashed line.
  • Assume that the approximate curve calculation unit 69 calculates two approximate curves Ac (Ac1 and Ac2) for the portion of the contour OL shown in FIG. 15.
  • the approximation curve calculation unit 69 sets passing points P at the start point and the end point on the contour OL in order to calculate the first approximation curve Ac1 that fits from the vicinity of the left end to the vicinity of the center of the contour OL.
  • the starting point side is defined as a passing point Ps
  • the end point side is defined as a passing point Pe.
  • The approximate curve calculation unit 69 sets a plurality of control points (two, C1 and C2, in the example of FIG. 15).
  • the approximated curve calculator 69 adjusts the position of each control point (C1, C2) so that the curve from the passing point Ps to the passing point Pe overlaps the contour OL.
  • the approximated curve calculator 69 can obtain the first approximated curve Ac1 that extends from the passing point Ps to the passing point Pe while overlapping the contour OL.
  • Similarly, in order to calculate the second approximate curve Ac2 that fits from near the center to near the right end of the contour OL, the approximate curve calculation unit 69 sets a passing point Ps' near the center of the contour OL and a passing point Pe' near the right end, and sets a plurality of control points (C1', C2').
  • the passing point Ps' has the same coordinates as the passing point Pe of the first approximated curve Ac1 in order to show the contour OL by the approximated curve without a break.
  • the approximated curve calculator 69 adjusts the positions of the respective control points (C1', C2') so that the curve from the passing point Ps' to the passing point Pe' overlaps the contour OL.
  • The approximate curve calculation unit 69 can thereby obtain the second approximate curve Ac2 that extends from the passing point Ps' to the passing point Pe' while overlapping the contour OL.
  • The approximate curve calculation unit 69 sets each passing point P and each control point C with the above in mind, and can thereby obtain both approximate curves (Ac1, Ac2) so that they continue smoothly and overlap the contour OL.
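  • The fitting procedure described above (passing points fixed at the start and end of a range, control points adjusted until the curve overlaps the contour) can be implemented as a least-squares fit of a cubic Bezier segment. The sketch below fixes the passing points Ps and Pe at the first and last contour samples and solves for the two control points C1 and C2; the chord-length parameterization is an implementation assumption, not something the patent prescribes.

```python
import numpy as np

def fit_cubic_bezier(contour_xy):
    """Fit one cubic Bezier segment to an ordered (N, 2) run of contour points.
    Returns (Ps, C1, C2, Pe): fixed passing points and fitted control points."""
    ps, pe = contour_xy[0], contour_xy[-1]
    # chord-length parameter t in [0, 1] for each sample
    seg = np.linalg.norm(np.diff(contour_xy, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)]) / seg.sum()
    b0, b1 = (1 - t) ** 3, 3 * (1 - t) ** 2 * t
    b2, b3 = 3 * (1 - t) * t ** 2, t ** 3
    # residual after removing the fixed endpoint terms
    rhs = contour_xy - np.outer(b0, ps) - np.outer(b3, pe)
    A = np.column_stack([b1, b2])                       # (N, 2) design matrix
    ctrl, *_ = np.linalg.lstsq(A, rhs, rcond=None)      # rows: C1, C2
    return ps, ctrl[0], ctrl[1], pe

def bezier_points(ps, c1, c2, pe, n=100):
    """Evaluate the fitted approximate curve Ac for plotting or comparison."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * ps + 3 * (1 - t) ** 2 * t * c1
            + 3 * (1 - t) * t ** 2 * c2 + t ** 3 * pe)
```

  • Fitting one such segment per contour range, with the end passing point of one range reused as the start passing point of the next, reproduces the seamless chain of curves Ac1, Ac2, ... described above.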
  • The approximate curve calculation unit 69 of the first embodiment standardizes the method of obtaining the approximate curves Ac. That is, the approximate curve calculation unit 69 divides the contour OL into a plurality of ranges based on characteristic points of the dairy cow 51 in the contour OL (for example, characteristic portions of various bones appearing in the contour OL) and obtains an approximate curve Ac for each of the divided ranges.
  • By setting the characteristic points of the dairy cow 51 to points that are common regardless of individual differences, the approximate curve calculation unit 69 can represent the contour OL with the same number of approximate curves Ac for all dairy cows 51 regardless of individual differences.
  • The approximate curve calculation unit 69 sets passing points P at the start and end points of each range, and the number of control points C to be set in each range is defined in advance.
  • In this way, the approximate curve calculation unit 69 can represent the contour OL with an equal number of approximate curves Ac regardless of individual differences among the dairy cows 51, while the coefficients of the approximate curves Ac and the positions of each passing point P and each control point C change according to those individual differences.
  • the approximated curve calculation unit 69 can represent the contour OL with a plurality of approximated curves Ac by obtaining a plurality of approximated curves Ac that match the contour OL of the buttock vicinity cross-sectional data D7 as described above.
  • FIG. 16 shows a contour OL represented by a plurality of approximation curves Ac.
  • The example of FIG. 16 shows the contour OL of the buttock vicinity cross-sectional data D73 of FIG. 14 represented by approximate curves Ac.
  • In this example, seven control points (referred to as symmetric control points Cs in FIG. 16 for distinction) are provided.
  • The growing condition evaluation unit 71 generates evaluation data Ev indicating an evaluation of the growing condition of the dairy cow 51 at the drinking fountain 52 based on the cutout part volume Vc from the part volume calculation unit 67 and the approximate curves Ac from the approximate curve calculation unit 69.
  • For example, the growing condition evaluation unit 71 can obtain the conventionally used BCS by using the cutout part volume Vc and can use this BCS as the evaluation data Ev of the dairy cow 51.
  • The growing condition evaluation unit 71 can also obtain the coefficients of each approximate curve Ac and the positions (coordinates or positions relative to a reference) of each passing point P and each control point C (for example, the amount of change from a reference value, the distribution of each coefficient, and so on), and can use these as the evaluation data Ev of the dairy cow 51.
  • the breeding status evaluation unit 71 associates the evaluation data Ev of the dairy cow 51 with the individual identification data D2 and stores them in the storage unit 18 as appropriate.
  • the breeding status evaluation unit 71 may generate the evaluation data Ev of the dairy cow 51 by using other numerical values based on the cut-out part volume Vc and each approximate curve Ac.
  • the breeding status evaluation unit 71 may generate the evaluation data Ev based on the cut-out part volume Vc of the target dairy cow 51 and changes over time in each approximate curve Ac.
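  • The evaluation data Ev could, for instance, be assembled from the cutout part volume Vc and the fitted curve parameters as deviations from reference values for the same individual or for the herd. The feature names and the reference handling in the sketch below are hypothetical; the patent leaves the exact numerical form of Ev open.

```python
import numpy as np

def evaluation_data(vc, curve_params, reference_params, reference_vc):
    """Build a simple evaluation vector Ev: the volume change plus the displacement
    of every passing point P and control point C from its reference position.

    curve_params / reference_params: lists of (Ps, C1, C2, Pe) tuples, one per contour range.
    """
    features = {"volume_change": vc - reference_vc}
    for i, (cur, ref) in enumerate(zip(curve_params, reference_params)):
        for name, p, r in zip(("Ps", "C1", "C2", "Pe"), cur, ref):
            shift = float(np.linalg.norm(np.asarray(p) - np.asarray(r)))
            features[f"range{i}_{name}_shift"] = shift
    return features
```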
  • the control mechanism 11 appropriately displays the evaluation data Ev stored in the storage unit 18 on the display unit 16 or outputs it to an external device via the communication unit 17 as appropriate.
  • Since the evaluation data Ev is associated with the individual identification data D2, it is easy to grasp which dairy cow 51 the evaluation refers to. The control mechanism 11 can thereby report the evaluation data Ev of each dairy cow 51.
  • Next, referring to FIG. 17, a growing condition evaluation process (growing condition evaluation processing method) will be described as an example of evaluating the growing condition of the dairy cow 51 using the growing condition evaluation system 10.
  • This growing condition evaluation process is executed by the control mechanism 11 based on a program stored in the storage unit 18 or the internal memory 11a.
  • Each step (each process) of the flow chart of FIG. 17 will be described below.
  • the flow chart of FIG. 17 is started when the growing condition evaluation system 10 is activated, a browser or an application is launched, the camera 12 is driven, and both laser measuring instruments 13 are placed in a standby state.
  • step S1 it is determined whether or not the dairy cow 51 exists in the drinking fountain 52 (data acquisition area 14). If YES, proceed to step S2, and if NO, step S1 is repeated.
  • In step S1, the animal detection unit 61 analyzes the image I acquired by the camera 12 to determine whether or not the dairy cow 51 is present at the drinking fountain 52, and outputs a signal indicating the detection to the data acquisition unit 62.
  • When the dairy cow 51 is detected, the animal detection unit 61 also identifies the individual dairy cow 51, generates the individual identification data D2, and outputs the individual identification data D2 to the data acquisition unit 62.
  • step S2 the point cloud data D1 of the drinking fountain 52 is acquired, and the process proceeds to step S3.
  • step S2 when the data acquisition unit 62 receives a signal that the dairy cow 51 is detected from the animal detection unit 61, the two laser measuring devices 13 are driven to scan the drinking fountain 52 (data acquisition area 14). , to obtain the point cloud data D1 of the drinking fountain 52 .
  • In step S2, when the data acquisition unit 62 receives the point cloud data D1 from both laser measuring devices 13, it associates the individual identification data D2 with each point cloud data D1 and outputs them to the data synthesizing unit 63.
  • step S3 synthetic point cloud data D3 is generated, and the process proceeds to step S4.
  • the data synthesizing unit 63 synthesizes the point cloud data D1 acquired by both laser measuring devices 13 to generate synthesized point cloud data D3, and adds the individual identification data D2 to the synthesized point cloud data D3 thus generated. It is output to the animal extractor 64 in association with it.
  • step S4 animal point cloud data D4 is generated, and the process proceeds to step S5.
  • In step S4, the animal extraction unit 64 extracts only the three-dimensional coordinate positions (coordinate data) corresponding to the dairy cow 51 from the synthesized point cloud data D3 generated by the data synthesizing unit 63 to generate the animal point cloud data D4, associates the individual identification data D2 with the generated animal point cloud data D4, and outputs them to the surface data generation unit 65.
  • step S5 three-dimensional surface data D5 is generated, and the process proceeds to step S6.
  • the surface data generation unit 65 generates three-dimensional surface data D5 of the dairy cow 51 from the animal point cloud data D4 generated by the animal extraction unit 64, and adds the individual identification data D2 to the generated three-dimensional surface data D5. It is output to the specific part extraction unit 66 in association with it.
  • step S6 buttock vicinity surface data D6 is generated, and the process proceeds to step S7.
  • the specific part extracting unit 66 generates buttock vicinity surface data D6 of the buttock vicinity 51a of the cow 51 from the three-dimensional surface data D5 generated by the surface data generation unit 65, and generates the buttock vicinity surface data D6. is associated with individual identification data D2 and output to part volume calculation unit 67 .
  • step S7 the excision part volume Vc is calculated, and the process proceeds to step S8.
  • In step S7, the part volume calculation unit 67 calculates the cutout part volume Vc from the buttock vicinity surface data D6 (cutout part) generated by the specific part cutout unit 66, associates the individual identification data D2 with the calculated cutout part volume Vc, and outputs them to the growing condition evaluation unit 71.
  • step S8 cross-sectional data D7 near the buttocks is generated, and the process proceeds to step S9.
  • In step S8, the cut surface extraction unit 68 generates the buttock vicinity cross-sectional data D7 by cutting the buttock vicinity 51a of the dairy cow 51, from the buttock vicinity surface data D6 (cutout part) generated by the specific part cutout unit 66, along the predetermined planes, associates the individual identification data D2 with the generated buttock vicinity cross-sectional data D7, and outputs them to the approximate curve calculation unit 69.
  • step S9 an approximate curve Ac is calculated, and the process proceeds to step S10.
  • In step S9, the approximate curve calculation unit 69 calculates the approximate curves Ac that fit the contour OL on the outer surface side of the buttock vicinity cross-sectional data D7 generated by the cut surface extraction unit 68, associates the calculated approximate curves Ac (their passing points P and control points C) with the individual identification data D2, and outputs them to the growing condition evaluation unit 71.
  • step S10 the evaluation data Ev is generated, and the process proceeds to step S11.
  • In step S10, the growing condition evaluation unit 71 generates the evaluation data Ev indicating an evaluation of the growing condition of the dairy cow 51 at the drinking fountain 52 based on the cutout part volume Vc from the part volume calculation unit 67 and the approximate curves Ac from the approximate curve calculation unit 69, associates the individual identification data D2 with the evaluation data Ev, and stores them in the storage unit 18 as appropriate.
  • In step S11, it is determined whether the growing condition evaluation process should end. If YES, the growing condition evaluation process ends; if NO, the process returns to step S1. In step S11, it is determined that the growing condition evaluation process should end when the growing condition evaluation system 10 is stopped or when the operation unit 15 is operated to end the growing condition evaluation process.
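  • Putting steps S1 to S11 together, the control loop of FIG. 17 could be organized as below. Every helper called here corresponds to one of the units described above but is a hypothetical name used only to show how the data (D1 through D7, Vc, Ac, Ev) flow; this is not the patented program.

```python
def growing_condition_evaluation_loop(system):
    """One run of the S1-S11 flow: detect a cow, scan it, and store its evaluation."""
    while not system.stop_requested():                          # S11
        detected, cow_id = system.detect_cow_in_area()           # S1 (camera 12 / unit 61)
        if not detected:
            continue
        d1_first, d1_second = system.scan_with_both_lasers()      # S2 (data acquisition)
        d3 = system.synthesize(d1_first, d1_second)               # S3 (synthesized cloud)
        d4 = system.extract_animal(d3)                            # S4 (animal point cloud)
        d5 = system.reconstruct_surface(d4)                       # S5 (3D surface data)
        d6 = system.cut_out_buttock_region(d5)                    # S6 (buttock vicinity)
        vc = system.cutout_volume(d6)                             # S7 (cutout part volume)
        d7 = system.cross_sections(d6)                            # S8 (cross-section data)
        curves = [system.fit_approximate_curves(sec) for sec in d7]  # S9 (curves Ac)
        ev = system.evaluate(vc, curves)                          # S10 (evaluation data Ev)
        system.store(cow_id, ev)
```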
  • When the growing condition evaluation system 10 detects the presence of the dairy cow 51 at the drinking fountain 52 via the camera 12, it acquires the point cloud data D1 of the drinking fountain 52 using both laser measuring devices 13 (steps S1 and S2). Thereafter, based on the point cloud data D1, the growing condition evaluation system 10 calculates the volume near the buttocks 51a of the dairy cow 51 (the cutout part volume Vc) (steps S3 to S7) and calculates the approximate curves Ac that fit the contour OL on the outer surface side of the buttock vicinity 51a (steps S3 to S6, S8, and S9).
  • the growing condition evaluation system 10 generates evaluation data Ev of the dairy cow 51 using the cut-out part volume Vc and the approximation curve Ac, and appropriately stores them in the storage unit 18 (step S10).
  • the growing condition evaluation system 10 can notify the evaluation data Ev by appropriately displaying the evaluation data Ev on the display unit 16 or outputting the evaluation data Ev to an external device via the communication unit 17 as appropriate.
  • the conventional growing condition evaluation system acquires the three-dimensional coordinate group of the animal using a range image sensor. Since the range image sensor has limitations in increasing the accuracy and resolution of the acquired three-dimensional coordinate group, it is difficult for the three-dimensional coordinate group to appropriately express the outline of the actual animal. For this reason, with the health condition estimation device of the prior art, it is difficult to obtain an appropriate evaluation by the BCS even if the three-dimensional coordinate group is used.
  • Moreover, since the conventional health condition estimation device uses a range image sensor, the contour of the animal indicated by the three-dimensional coordinate group differs from that of the actual animal, for example showing jagged unevenness, and becomes difficult to represent with a formula (approximate curve). For this reason, with the conventional health condition estimation device, it is difficult to obtain a mathematical expression that appropriately represents the contour shape, and it is difficult to appropriately evaluate the growing condition using the contour.
  • the growing condition evaluation system 10 uses the laser measuring device 13 to acquire the three-dimensional coordinate group of the animal (the dairy cow 51 in Example 1).
  • The laser measuring device 13 can acquire a three-dimensional coordinate group (point cloud data D1) with accuracy at the level used for precision surveying and with extremely high resolution (12.5 mm intervals in Example 1), so the point cloud data D1 can represent the contour of the actual dairy cow 51 very faithfully.
  • the laser measuring device 13 of Example 1 can set the accuracy of the reaching distance of the pulsed laser beam to an error of 3.5 mm or less, and set the accuracy of the scanning plane to an error of 2.0 mm or less. It is possible to reduce the accuracy of angle measurement to an error of 6 seconds (angle) or less for both vertical and horizontal.
  • The laser measuring device 13 of Example 1 can also be set to a low-output mode; even in that case, the accuracy of the reaching distance of the pulsed laser beam becomes an error of 4.0 mm or less, while the other accuracies remain as described above.
  • the growing condition evaluation system 10 can acquire a three-dimensional coordinate group with extremely high accuracy and can also achieve extremely high resolution by using the laser measuring device 13 . Therefore, the growing condition evaluation system 10 can appropriately obtain an evaluation of the growing condition by using the point cloud data D1 from the laser measuring device 13 .
  • Since the contour OL of the dairy cow 51 indicated by the point cloud data D1 represents the actual contour of the dairy cow 51 very faithfully, it is possible to obtain approximate curves Ac that approximate it with smooth curves and appropriately represent the contour of the actual dairy cow 51. Therefore, in the growing condition evaluation system 10, the approximate curves Ac representing the contour of the actual dairy cow 51 can be obtained with great fidelity, the contour OL can be represented by a standardized formula, and an evaluation of the growing condition can be obtained without variation.
  • the growing condition evaluation system 10 approximates the contour OL based on the point cloud data D1 to find the approximated curve Ac, so that the body shape of the dairy cow 51 can be represented by a mathematical formula, and the growing condition can be evaluated. Since it can be calculated by calculation, the evaluation can be made objective and appropriate.
  • In addition, since the growing condition evaluation system 10 standardizes the method of obtaining the approximate curves Ac, individual differences in the contour OL can be shown by the coefficients of the approximate curves Ac and by the positions of the passing points P and the control points C. Therefore, the growing condition evaluation system 10 can generate the evaluation data Ev based on changes in each coefficient and in the positions of each passing point P and each control point C, which makes it possible to judge the growing condition more uniformly.
  • Since the growing condition evaluation system 10 generates the evaluation data Ev using the approximate curves Ac obtained from the point cloud data D1 acquired by the laser measuring devices 13, it can report the evaluation data Ev as the evaluation of the growing condition. Therefore, the growing condition evaluation system 10 can prevent individual differences and fluctuations in judgment from being reflected in the evaluation of the growing condition, and can eliminate differences caused by different places, facilities, times, and so on. As a result, the growing condition evaluation system 10 can generate evaluation data Ev based on a unified standard and can evaluate the dairy cow 51 (animal) as a quantitative numerical value.
  • Only when the dairy cow 51 is detected at the drinking fountain 52 via the camera 12 does the growing condition evaluation system 10 acquire the point cloud data D1 of the drinking fountain 52 using the two laser measuring devices 13 and generate the evaluation data Ev based on it. Therefore, the growing condition evaluation system 10 can obtain the evaluation data Ev without touching the dairy cow 51, making use of the fact that the dairy cow 51 comes to the drinking fountain 52 of its own will. The growing condition evaluation system 10 can thus determine an appropriate growing condition without applying stress to the dairy cow 51, such as touching it or guiding it to an unintended place, in order to generate the evaluation data Ev.
  • the growth condition evaluation system 10 automatically obtains the evaluation data Ev by using the natural behavior of the dairy cow 51, it is possible to eliminate the time limit and manage the system all day (24 hours). Furthermore, the growing condition evaluation system 10 does not acquire the point cloud data D1 by the two laser measuring devices 13 when the dairy cow 51 is not present at the drinking fountain 52, so acquisition and accumulation of unnecessary data can be prevented. and can be operated efficiently.
  • the growing condition evaluation system 10 of Example 1 of the growing condition evaluation system can obtain the following effects.
  • The growing condition evaluation system 10 includes a laser measuring device 13 that acquires point cloud data D1 representing the outline of the animal (dairy cow 51) in three-dimensional coordinates by receiving the reflected light of an emitted laser beam (pulsed laser beam) from the animal. It also includes a surface data generation unit 65 that generates three-dimensional surface data D5 of the animal based on the point cloud data D1, an approximate curve calculation unit 69 that calculates an approximated curve Ac that fits the three-dimensional surface data D5, and a growing condition evaluation unit 71 that generates evaluation data Ev indicating an evaluation of the animal based on the approximated curve Ac.
  • The growing condition evaluation system 10 can therefore calculate the approximated curve Ac, which represents the contour OL with a standardized formula, from the point cloud data D1 that very faithfully represents the contour of the actual animal, and can generate the evaluation data Ev from that curve, so an appropriate evaluation of the growing condition can be obtained without variation.
  • The approximate curve calculation unit 69 calculates the approximated curve Ac by fitting the contour OL of the cut surface (buttock-vicinity cross-sectional data D7) obtained by cutting the three-dimensional surface data D5 along a cutting plane. Because the growing condition evaluation system 10 determines the position of the cutting plane based on characteristic portions such as bones, it keeps the volume of the generated evaluation data Ev small, facilitates comparison between animals, and evaluates the growing condition appropriately.
  • The growing condition evaluation system 10 sets a data acquisition area 14 for acquiring the point cloud data D1 of an animal (dairy cow 51), and includes an animal detection mechanism (camera 12, animal detection unit 61) that detects that an animal is present in the data acquisition area 14. When the animal detection mechanism detects an animal, the growing condition evaluation system 10 acquires the point cloud data D1 of the data acquisition area 14. Therefore, since the system drives the laser measuring devices 13 only when an animal is in the data acquisition area 14, the large-volume point cloud data D1 is not acquired in the absence of an animal, and the evaluation data Ev can be generated efficiently.
  • The growing condition evaluation system 10 can generate the evaluation data Ev without stressing the animal by touching it or leading it to an unintended place, and thus does not hinder the raising of the animal.
  • Moreover, since the evaluation data Ev can be generated by using the moments when the animal stops of its own accord while behaving naturally, the stress on the animal associated with generating the evaluation data Ev can be eliminated.
  • The growing condition evaluation system 10 has a pair of laser measuring devices 13 arranged with the data acquisition area 14 between them. The system can therefore acquire the point cloud data D1 of the animal in the data acquisition area 14 regardless of the animal's position, posture, or facing direction, and can generate appropriate evaluation data Ev for the animal while reducing stress.
  • The positional relationship of both laser measuring devices 13 with respect to the drinking fountain 52 is set so that they can acquire point cloud data D1 of at least half of the animal's body (either the left or the right side) regardless of the animal's posture in the drinking fountain 52 serving as the data acquisition area 14. The growing condition evaluation system 10 can therefore greatly increase the likelihood of obtaining the point cloud data D1 needed to generate the evaluation data Ev while eliminating the stress on the animal caused by generating it.
  • The growing condition evaluation system 10 further includes a data synthesizing unit 63 that synthesizes the point cloud data D1 acquired by each laser measuring device 13 to generate synthesized point cloud data D3, and an animal extraction unit 64 that generates, from the synthesized point cloud data D3, animal point cloud data D4 indicating the animal (dairy cow 51). The surface data generation unit 65 then generates the three-dimensional surface data D5 from the animal point cloud data D4. By synthesizing the two sets of point cloud data D1 into the synthesized point cloud data D3, the growing condition evaluation system 10 can obtain the data necessary for evaluating the growing condition of the animal (in Example 1, the buttock vicinity 51a). Because the system generates the data indicating the animal required for the evaluation (animal point cloud data D4) from the synthesized point cloud data D3, and from it the three-dimensional surface data D5 and then the evaluation data Ev, the volume of the three-dimensional surface data D5 and the evaluation data Ev, and the amount of work needed to generate them, can be kept small, and the evaluation data Ev can be generated efficiently.
  • In the growing condition evaluation system 10, the animal detection mechanism (camera 12, animal detection unit 61) generates individual identification data D2 that individually identifies the detected animal (dairy cow 51), and the growing condition evaluation unit 71 associates the individual identification data D2 with the generated evaluation data Ev. The system can therefore appropriately evaluate the growing condition of each animal even when a plurality of animals visit the data acquisition area 14 at random times or in any order.
  • Accordingly, the growing condition evaluation system 10, as an example of the growing condition evaluation system according to the present disclosure, can appropriately obtain an evaluation of the growing condition of the animal (dairy cow 51).
  • The growing condition evaluation system of the present disclosure has been described above based on Example 1, but the specific configuration is not limited to Example 1; design changes and additions are permitted as long as they do not depart from the gist of the invention according to each claim.
  • In Example 1, the drinking fountain 52 is set as the data acquisition area 14. However, the data acquisition area 14 may be set as appropriate and is not limited to the configuration of Example 1. By setting the data acquisition area 14 to a place that the animal visits regularly of its own will, like the drinking fountain 52, for example a feeding place, the stress on the animal is kept low, or even further reduced, and an evaluation of the growing condition can still be obtained appropriately.
  • The time required for scanning depends on the scanning speed of each laser measuring device 13 and can be shortened, for example, by increasing the number of pulsed laser beams emitted at one time; even when it is shortened, the point cloud data D1 can be acquired appropriately.
  • In Example 1, the dairy cow 51 is targeted as an example of an animal, and the evaluation data Ev indicating the evaluation of its growing condition is generated. However, the animal for which the evaluation data Ev is generated may be any animal whose growing condition needs to be evaluated, and is not limited to the configuration of Example 1.
  • In Example 1, the approximate curve calculation unit 69 calculates the approximated curve Ac that fits the contour OL on the outer surface side of the buttock-vicinity cross-sectional data D7 generated by the cut surface extraction unit 68. However, since the locations used for evaluating the growing condition are basically left-right symmetrical, the approximated curve Ac may be calculated for only half of the body (evaluation on one half of the body), and the configuration is not limited to that of Example 1.
  • In Example 1, the animal detection unit 61 functions as an animal detection mechanism that detects the presence of an animal in the data acquisition area 14 based on the image from the camera 12. However, the animal detection mechanism is not limited to the configuration of Example 1 as long as it detects the presence of an animal in the data acquisition area 14. For example, a device using infrared rays, such as an infrared scanner or infrared thermography, can be used; in that case, animals can be detected more reliably even in dark conditions. Alternatively, a pressure sensor may be provided at the faucet of the drinking fountain 52, a weight detection device may be provided in the data acquisition area 14, or a sensor may be provided at the entrance of the data acquisition area 14.
  • In Example 1, the laser measuring devices 13 are provided as a pair with the data acquisition area 14 between them. However, as long as the laser measuring device 13 acquires the point cloud data D1 of the animal present in the data acquisition area 14, a single laser measuring device 13 may be provided, or three or more may be provided; the configuration is not limited to Example 1.
  • In Example 1, the specific part extraction unit 66 generates the buttock-vicinity surface data D6 of the buttock vicinity 51a of the dairy cow 51. However, the specific part may be set as appropriate and is not limited to the configuration of Example 1.
  • In Example 1, the specific part extraction unit 66, the cut surface extraction unit 68, and the approximate curve calculation unit 69 use characteristic portions of the bones as references. However, these units may use other portions as references, as long as those portions are detectable when setting the relevant parts of the animal and are common to all animals regardless of individual differences; the configuration is not limited to Example 1.
  • In Example 1, the approximate curve calculation unit 69 calculates the approximated curve Ac using a Bezier curve. However, as long as the approximate curve calculation unit 69 calculates an approximated curve that fits the contour of the specified location (in Example 1, the contour OL of the buttock-vicinity cross-sectional data D7) and can be represented by an equation in a predetermined format, the method of calculating the approximated curve may be set as appropriate and is not limited to the configuration of Example 1.

Abstract

Provided is a growing condition evaluation system which can suitably evaluate the growing condition of an animal. The growing condition evaluation system (10) comprises: a laser measurement apparatus (13) which acquires point cloud data (D1) indicating the outline of an animal (dairy cow 51) in three-dimensional coordinates by receiving laser light that has been emitted toward and reflected from the animal; a surface data generation unit (65) which generates three-dimensional surface data (D5) of the animal on the basis of the point cloud data (D1); an approximate curve calculation unit (69) which calculates an approximated curve (Ac) that fits the three-dimensional surface data (D5); and a growing condition evaluation unit (71) which generates, on the basis of the approximated curve (Ac), evaluation data (Ev) that indicates an evaluation of the animal.

Description

Growing condition evaluation system

This disclosure relates to a growing condition evaluation system.

Conventionally, it is known to use the BCS (Body Condition Score) and the like to judge the growth status of animals such as cattle and pigs.

A health condition estimation device has been proposed that can determine the BCS automatically without touching the animal (see Patent Document 1, for example). The health condition estimation device obtains a three-dimensional coordinate group indicating the three-dimensional shape of the animal by using a range image sensor, obtains from that coordinate group feature values indicating the width of the animal's body and feature values indicating the position of the spine, and calculates the BCS based on them. The conventional health condition estimation device can therefore obtain an evaluation based on the BCS with reduced variation while reducing the burden on the animal.

Japanese Patent No. 6777948

However, the conventional health condition estimation device uses a range image sensor to acquire the three-dimensional coordinate group of the animal. Since there is a limit to how well the acquired three-dimensional coordinate group can reproduce the three-dimensional shape of the animal, it is difficult to obtain an appropriate evaluation of the growing condition, such as the BCS, based on that coordinate group.

The present disclosure has been made in view of the above circumstances, and aims to provide a growing condition evaluation system that can appropriately obtain an evaluation of the growing condition of animals.

In order to solve the above problem, the growing condition evaluation system of the present disclosure includes: a laser measuring device that acquires point cloud data representing the outline of an animal in three-dimensional coordinates by receiving the reflected light of an emitted laser beam from the animal; a surface data generation unit that generates three-dimensional surface data of the animal based on the point cloud data; an approximate curve calculation unit that calculates an approximated curve that fits the three-dimensional surface data; and a growing condition evaluation unit that generates evaluation data indicating an evaluation of the animal based on the approximated curve.

According to the growing condition evaluation system of the present disclosure, it is possible to appropriately obtain an evaluation of the growing condition of an animal.
FIG. 1 is an explanatory diagram showing the overall configuration of the growing condition evaluation system of Example 1 as an example of the growing condition evaluation system according to the present disclosure.
FIG. 2 is a block diagram showing the configuration of the control system in the growing condition evaluation system.
FIG. 3 is an explanatory diagram showing how the imaging device of the growing condition evaluation system is attached.
FIG. 4 is an explanatory diagram showing an image of a dairy cow captured by the imaging device.
FIG. 5 is an explanatory diagram showing a laser measuring device of the growing condition evaluation system.
FIG. 6 is a block diagram showing the configuration of the control system in the laser measuring device.
FIG. 7 is an explanatory diagram showing point cloud data acquired by one (first) laser measuring device.
FIG. 8 is an explanatory diagram showing point cloud data acquired by the other (second) laser measuring device.
FIG. 9 is an explanatory diagram showing synthesized point cloud data obtained by combining both sets of point cloud data.
FIG. 10 is an explanatory diagram showing animal point cloud data representing a dairy cow.
FIG. 11 is an explanatory diagram showing three-dimensional surface data obtained by meshing the animal point cloud data.
FIG. 12 is an explanatory diagram showing buttock-vicinity surface data obtained by cutting out the vicinity of the buttocks from the three-dimensional surface data.
FIG. 13 is an explanatory diagram showing the slice positions at which buttock-vicinity cross-sectional data is cut out of the buttock-vicinity surface data.
FIG. 14 is an explanatory diagram showing buttock-vicinity cross-sectional data, arranged from the left in the order of the slice positions in FIG. 13.
FIG. 15 is an explanatory diagram showing how an approximated curve of the contour of the buttock-vicinity cross-sectional data is obtained.
FIG. 16 is an explanatory diagram showing an example of buttock-vicinity cross-sectional data (its contour) and an approximated curve.
FIG. 17 is a flowchart showing the growing condition evaluation processing (growing condition evaluation processing method) executed by the control mechanism of the growing condition evaluation system.
Example 1 of the growing condition evaluation system 10, as one embodiment of the growing condition evaluation system according to the present disclosure, will be described below with reference to FIGS. 1 to 17. Note that FIGS. 1 and 3 schematically show the vicinity of the drinking fountain 52 (data acquisition area 14) in the barn 50 and do not necessarily match the actual state of the barn 50.
The growing condition evaluation system 10 according to the present disclosure automatically evaluates the growing condition of animals. The growing condition evaluation system 10 of Example 1 evaluates the growing condition of a Holstein cow (hereinafter referred to as the dairy cow 51) as an example of an animal. As shown in FIGS. 1 to 3, the growing condition evaluation system 10 is installed in a barn 50 and includes a control mechanism 11, a camera 12, and two laser measuring devices 13.
A plurality of dairy cows 51 are kept in the barn 50, and each dairy cow 51 is free to move around. The barn 50 is provided with a drinking fountain 52 for the dairy cows 51. The drinking fountain 52 is formed by placing a water trough 53 in a space that a plurality of dairy cows 51 can enter. The water trough 53 is elongated and arranged at one corner of the drinking fountain 52. The dairy cows 51 periodically visit the drinking fountain 52 of their own will and stop in front of the water trough 53. For this reason, the growing condition evaluation system 10 sets the drinking fountain 52 as the data acquisition area 14.
As shown in FIG. 2, the control mechanism 11 comprehensively controls the operation of the growing condition evaluation system 10 by loading a program stored in the storage unit 18 or in its built-in internal memory 11a onto, for example, a RAM (Random Access Memory). In Example 1, the internal memory 11a is composed of a RAM or the like, and the storage unit 18 is composed of a ROM (Read Only Memory), an EEPROM (Electrically Erasable Programmable ROM), or the like. In addition to the configuration described above, the growing condition evaluation system 10 is provided, as appropriate, with a printer that prints the measurement results in response to a measurement completion signal or an instruction from the operator, an output unit that outputs the measurement results to an external memory or a server, and an audio output unit that reports the operating status and the like.
The control mechanism 11 is connected to the camera 12 and the two laser measuring devices 13 (in FIG. 2, one is labeled the first and the other the second), controls them as appropriate, and can receive signals (data) from them. This connection may be wired or wireless as long as it allows signals to be exchanged with the camera 12 and each laser measuring device 13. The control mechanism 11 may be provided at a location separate from the barn 50 or inside the barn 50. An operation unit 15, a display unit 16, a communication unit 17, and a storage unit 18 are connected to the control mechanism 11.
The operation unit 15 is used to make various settings for evaluating the growing condition and to operate and configure the camera 12 and each laser measuring device 13. The operation unit 15 may be composed of input devices such as a keyboard and a mouse, or the display screen of the display unit 16 may be a touch panel and the operation unit 15 may be composed of software keys displayed on it.
The display unit 16 displays the image I acquired by the camera 12 (either a still image or a moving image; see FIG. 4), the point cloud data D1 acquired by each laser measuring device 13 (see FIGS. 7 and 8), various data based on them (see FIGS. 9 to 14 and elsewhere), the evaluation data Ev indicating the evaluation of the animal (dairy cow 51), and the like. As an example, the display unit 16 is a liquid crystal display device (LCD monitor) and is provided in the control mechanism 11 together with the operation unit 15. The operation unit 15 and the display unit 16 may also be implemented on a mobile terminal such as a smartphone or a tablet, and are not limited to the configuration of Example 1.
The communication unit 17 communicates with the camera 12, each laser measuring device 13 (its communication unit 45), and external devices, and enables driving of the camera 12 and each laser measuring device 13 as well as reception of the image I from the camera 12 and the point cloud data D1 from each laser measuring device 13. The control mechanism 11 may also be configured as a tablet terminal, in which case the operation unit 15 and the display unit 16 can be integrated with the control mechanism 11.
The camera 12 can photograph the entire data acquisition area 14; in Example 1, a 4K camera (with a resolution of 3840 x 2160 pixels) is used. As shown in FIGS. 1 and 3, the camera 12 is attached to an installation plate 54 above the drinking fountain 52 serving as the data acquisition area 14, and can photograph the drinking fountain 52 without disturbing the dairy cows 51. The installation plate 54 is provided on a post 55 of the barn 50 for installing the camera 12. While the growing condition evaluation system 10 is in operation, the camera 12 continuously photographs the drinking fountain 52 and outputs the captured image I (its data; see FIG. 4) to the control mechanism 11.
Each of the two laser measuring devices 13 is installed at a known point, projects a pulsed laser beam toward a measurement point, receives the reflected light (pulsed reflected light) of the pulsed laser beam from that measurement point, measures the distance for each pulse, and averages the results to perform highly accurate distance measurement. Each laser measuring device 13 then scans a set measurement range and sets measurement points evenly over the entire range, thereby acquiring a collection of three-dimensional position data (hereinafter referred to as point cloud data D1) that represents, in three-dimensional coordinates, the surface shape of the objects present throughout the measurement range. In this way, each laser measuring device 13 can acquire point cloud data D1 representing the surface shape of whatever is present within the set measurement range.
As shown in FIG. 1, the two laser measuring devices 13 are arranged as a pair with the drinking fountain 52 (data acquisition area 14) between them, at a height that does not obstruct the movement of the dairy cows 51. The positional relationship of both laser measuring devices 13 with respect to the drinking fountain 52 is set so that, regardless of the posture of a dairy cow 51 in the drinking fountain 52 (data acquisition area 14), they can acquire point cloud data D1 of at least half (either the left or the right side) of the buttock vicinity 51a of that dairy cow 51. The two laser measuring devices 13 have the same configuration except for being installed at different positions. The laser measuring device 13 may also adopt a phase-difference measurement method using a light beam modulated at a predetermined frequency, or another method, and is not limited to Example 1. As shown in FIGS. 5 and 6, the laser measuring device 13 includes a pedestal 31, a main body 32, and a vertical rotation unit 33.
The pedestal 31 is the portion attached to an installation base 34. The installation base 34 is attached to a post 55 of the barn 50 for installing the laser measuring device 13. The main body 32 is mounted on the pedestal 31 so as to be rotatable about a vertical axis relative to the pedestal 31. The main body 32 is U-shaped as a whole, and the vertical rotation unit 33 is provided in the space between its arms. The main body 32 is provided with a measuring device display unit 35 and a measuring device operation unit 36. The measuring device display unit 35 displays various operation icons, settings, and the like for measurement under the control of a measuring device control unit 43, which is described later. The measuring device operation unit 36 is where operations for using and configuring the various functions of the laser measuring device 13 are performed, and it outputs the input information to the measuring device control unit 43. In the main body 32 of Example 1, the various operation icons displayed on the measuring device display unit 35 function as the measuring device operation unit 36. By giving the functions of the measuring device display unit 35 and the measuring device operation unit 36 to the operation unit 15 and the display unit 16 of the growing condition evaluation system 10, remote operation becomes possible while the laser measuring device 13 remains installed on the installation base 34 on the post 55. As described later, the location and installation method of the laser measuring device 13 may be set as appropriate as long as it can scan the data acquisition area 14 (and an animal such as the dairy cow 51 present there), and they are not limited to the configuration of Example 1.
The vertical rotation unit 33 is mounted on the main body 32 so as to be rotatable about a rotation axis extending in the horizontal direction. A distance-measuring optical unit 37 is built into the vertical rotation unit 33. The distance-measuring optical unit 37 projects a pulsed laser beam as distance-measuring light, receives the reflected light (pulsed reflected light) from the measurement point, and performs electro-optical distance measurement to the measurement point.
The main body 32, which allows the vertical rotation unit 33 to rotate about the horizontal axis, is provided with a horizontal rotation drive unit 38 and a horizontal angle detection unit 39. The horizontal rotation drive unit 38 rotates the main body 32 about the vertical axis, that is, in the horizontal direction, relative to the pedestal 31. The horizontal angle detection unit 39 detects (measures) the horizontal angle of the collimation direction by detecting the horizontal rotation angle of the main body 32 relative to the pedestal 31. For example, the horizontal rotation drive unit 38 can be a motor and the horizontal angle detection unit 39 can be an encoder.
The main body 32 is also provided with a vertical rotation drive unit 41 and a vertical angle detection unit 42. The vertical rotation drive unit 41 rotates the vertical rotation unit 33 about the horizontal axis, that is, in the vertical direction, relative to the main body 32. The vertical angle detection unit 42 detects (measures) the vertical angle of the collimation direction by detecting the vertical angle of the vertical rotation unit 33 relative to the main body 32. For example, the vertical rotation drive unit 41 can be a motor and the vertical angle detection unit 42 can be an encoder.
The main body 32 further incorporates a measuring device control unit 43. The measuring device control unit 43 comprehensively controls the operation of the laser measuring device 13 according to programs stored in the connected storage unit 44. The storage unit 44 is composed of semiconductor memory or various storage media, and stores programs such as the calculation program required for measurement and a data transmission program for generating and transmitting information, as well as setting data and the point cloud data D1 as appropriate. This information and data are transmitted to the control mechanism 11 as appropriate via the communication unit 45 described later and the communication unit 17 described above (see FIG. 2). The measuring device display unit 35, the measuring device operation unit 36, the distance-measuring optical unit 37, the horizontal rotation drive unit 38, the horizontal angle detection unit 39, the vertical rotation drive unit 41, the vertical angle detection unit 42, the storage unit 44, and the communication unit 45 are connected to the measuring device control unit 43.
The communication unit 45 enables communication between the control mechanism 11 (see FIG. 2) and the measuring device control unit 43 via the communication unit 17, and transmits the information stored in the storage unit 44 as appropriate under the control of the measuring device control unit 43. The communication unit 45 thus allows data and other information to be exchanged with the control mechanism 11 (communication unit 17). The communication unit 45 may communicate with the communication unit 17 by wire via a laid LAN cable, or wirelessly.
Output values for measurement from the distance-measuring optical unit 37, the horizontal angle detection unit 39, and the vertical angle detection unit 42 are input to the measuring device control unit 43. Based on these output values, the measuring device control unit 43 calculates the distance to the measurement point (reflection point) from the arrival time difference or phase difference between the reference light propagated through a reference optical path provided in the main body 32 and the reflected light acquired via the distance-measuring optical unit 37. The measuring device control unit 43 also measures (calculates) the elevation angle and horizontal angle at the time of that distance measurement. The measuring device control unit 43 then stores these measurement results in the storage unit 44 and transmits them as appropriate to the control mechanism 11 (communication unit 17) via the communication unit 45.
The measuring device control unit 43 controls the driving of the horizontal rotation drive unit 38 and the vertical rotation drive unit 41 to rotate the main body 32 and the vertical rotation unit 33 (see FIG. 1) as appropriate, so that the vertical rotation unit 33 can be pointed in a given direction and a given range can be scanned. In Example 1, the measuring device control unit 43 scans the drinking fountain 52, which is the data acquisition area 14, and treats each position of the drinking fountain 52, including any dairy cow 51 present there, as a measurement point. Since the positional relationship between the two laser measuring devices 13 and the drinking fountain 52 is known in advance and fixed (does not change), the scanning range can be matched appropriately to the entire drinking fountain 52. When scanning the drinking fountain 52, the measuring device control unit 43 of Example 1 sets measurement points at intervals of 12.5 mm on the scanning plane. The scanning plane can be set as appropriate within the drinking fountain 52; in Example 1, it is set at a position where the buttock vicinity 51a of a dairy cow 51 is expected to be located.
The measuring device control unit 43 controls the distance-measuring optical unit 37 to measure the distance to each measurement point set while scanning the drinking fountain 52. At the same time, the measuring device control unit 43 measures (calculates) the elevation angle and horizontal angle of the collimation direction, thereby measuring the three-dimensional coordinate position of each measurement point of the drinking fountain 52. The measuring device control unit 43 then generates point cloud data D1, the collection of the measured three-dimensional coordinate positions (coordinate data) of the measurement points of the drinking fountain 52 (one example is shown in FIG. 7 and the other in FIG. 8), and transmits it to the control mechanism 11 via the communication unit 45 and the communication unit 17 as appropriate. The point cloud data D1 represents the surface shape of the drinking fountain 52, and when a dairy cow 51 is present at the drinking fountain 52, the surface shape of that dairy cow 51 is also included.
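For illustration only, the conversion of one measured distance plus the detected horizontal and vertical angles into a three-dimensional coordinate can be sketched as below. This is a minimal Python sketch, not the patent's implementation; the angle conventions (vertical angle measured from the horizontal plane, horizontal angle from the instrument's X axis) and the function name are assumptions.

```python
import math

def polar_to_xyz(distance_m, horizontal_angle_rad, vertical_angle_rad):
    """Convert one range/angle measurement into a 3D point (assumed conventions).

    The vertical angle is assumed to be measured from the horizontal plane and
    the horizontal angle from the instrument's X axis; a real instrument may
    use different conventions.
    """
    horizontal_range = distance_m * math.cos(vertical_angle_rad)
    x = horizontal_range * math.cos(horizontal_angle_rad)
    y = horizontal_range * math.sin(horizontal_angle_rad)
    z = distance_m * math.sin(vertical_angle_rad)
    return (x, y, z)

# Collecting such points over the whole scan yields the point cloud data D1
# in the instrument's own coordinate frame.
point_cloud_d1 = [polar_to_xyz(d, h, v)
                  for d, h, v in [(3.2, 0.10, -0.05), (3.1, 0.11, -0.05)]]
```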
As shown in FIG. 2, the control mechanism 11 includes, for evaluating the growing condition of the dairy cow 51, an animal detection unit 61, a data acquisition unit 62, a data synthesizing unit 63, an animal extraction unit 64, a surface data generation unit 65, a specific part extraction unit 66, a part volume calculation unit 67, a cut surface extraction unit 68, an approximate curve calculation unit 69, and a growing condition evaluation unit 71.
The animal detection unit 61 detects, from the image I acquired by the camera 12, that an animal (in Example 1, a dairy cow 51; see FIG. 4) is present at the drinking fountain 52 serving as the data acquisition area 14. The animal detection unit 61 recognizes various shapes in the image I based on contrast and the like, and distinguishes between a dairy cow 51 and the equipment of the drinking fountain 52, such as the water trough 53, based on the recognized shapes. Since the camera 12 is installed at a fixed position, the image of the equipment of the drinking fountain 52, such as the water trough 53, is known in advance; using this makes it easy to identify the dairy cow 51 and also to obtain data indicating the region in which the dairy cow 51 appears. When a dairy cow 51 appears in the image I, the animal detection unit 61 outputs to the data acquisition unit 62 a signal indicating that a dairy cow 51 has been detected at the drinking fountain 52. In this way, the animal detection unit 61 functions, in cooperation with the camera 12, as an animal detection mechanism that detects the presence of an animal in the data acquisition area 14.
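Because the camera is fixed and the appearance of the empty drinking fountain is known in advance, this kind of detection can be approximated by a simple background comparison. The following Python/OpenCV sketch is purely illustrative and not the disclosed configuration; the threshold, the minimum area, and the function name are assumptions.

```python
import cv2
import numpy as np

def detect_cow(frame_bgr, background_bgr, min_area_px=50_000):
    """Return (found, mask) by comparing the current frame against an image of
    the empty drinking fountain; all thresholds are illustrative only."""
    diff = cv2.absdiff(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(background_bgr, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    # Remove small speckles before looking for a large connected region.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    found = any(cv2.contourArea(c) >= min_area_px for c in contours)
    return found, mask
```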
In addition, the animal detection unit 61 of Example 1 identifies the detected animal (dairy cow 51) as an individual and generates individual identification data D2 indicating that identification. First, based on the image I in which the dairy cow 51 appears, the animal detection unit 61 identifies which of the dairy cows kept in the barn 50 is shown. For example, the animal detection unit 61 recognizes, from the region of the image I in which the dairy cow 51 appears, the shape and position of the black-and-white pattern of the dairy cow 51 based on contrast and the like, and identifies the dairy cow 51 by comparing the recognized black-and-white pattern with the pre-registered patterns of the individual dairy cows. In accordance with that identification, the animal detection unit 61 generates individual identification data D2 that identifies the dairy cow 51 as an individual, and outputs the individual identification data D2 to the data acquisition unit 62. Any method may be used, such as using tags, as long as the animal detection unit 61 identifies which of the dairy cows kept in the barn 50 the dairy cow 51 shown in the image I (that is, the dairy cow 51 at the drinking fountain 52 at that time) is, and generates the individual identification data D2 accordingly; the method is not limited to the configuration of Example 1.
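One very simple way to compare a detected coat pattern against registered patterns is to binarize both and measure how many pixels agree, as in the hypothetical sketch below. The identifiers, the fixed 128 x 128 size, and the Otsu thresholding are illustrative assumptions, not the matching method claimed in the patent.

```python
import cv2
import numpy as np

def identify_cow(cow_roi_bgr, registered_patterns, size=(128, 128)):
    """Match the black-and-white coat pattern of the detected cow against
    pre-registered patterns ({cow_id: binary uint8 image}); illustrative only."""
    gray = cv2.cvtColor(cow_roi_bgr, cv2.COLOR_BGR2GRAY)
    _, pattern = cv2.threshold(cv2.resize(gray, size), 0, 255,
                               cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    best_id, best_score = None, -1.0
    for cow_id, template in registered_patterns.items():
        template = cv2.resize(template, size)
        # Fraction of pixels whose black/white label agrees with the template.
        score = float(np.mean(pattern == template))
        if score > best_score:
            best_id, best_score = cow_id, score
    return best_id  # plays the role of the individual identification data D2
```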
When the data acquisition unit 62 receives the signal that a dairy cow 51 has been detected from the animal detection unit 61, it drives the two laser measuring devices 13 to scan the drinking fountain 52 (data acquisition area 14). Each laser measuring device 13 acquires point cloud data D1 of the drinking fountain 52 (see FIGS. 7 and 8), and the point cloud data D1 is associated with the individual identification data D2 and output to the data synthesizing unit 63. Since the dairy cow 51 is at the drinking fountain 52 when the data acquisition unit 62 is driven as described above, the point cloud data D1 includes the dairy cow 51 indicated by the individual identification data D2.
The data synthesizing unit 63 synthesizes the point cloud data D1 acquired by the two laser measuring devices 13 to generate synthesized point cloud data D3 (see FIG. 9). Because the two laser measuring devices 13 are arranged as a pair with the drinking fountain 52 between them, as described above, they measure the same drinking fountain 52 from different directions. Synthesizing the two sets of point cloud data D1 (so-called point cloud registration) therefore yields a collection of three-dimensional position data that includes the surface shapes of different sides of the drinking fountain 52 (for example, the right side of the dairy cow 51 from one device and the left side from the other). The data synthesizing unit 63 synthesizes the point cloud data D1 into the synthesized point cloud data D3 by joining the overlapping portions of the point clouds (so-called point cloud matching), by using targets that serve as landmarks (the so-called tie-point method), or by other known techniques. In the growing condition evaluation system 10, the drinking fountain 52 is set as the data acquisition area 14 and the positional relationship of each laser measuring device 13 with respect to the drinking fountain 52 is known in advance, so the overlapping portions of the two sets of point cloud data D1 are easy to determine and the data synthesizing unit 63 can generate the synthesized point cloud data D3 appropriately. The data synthesizing unit 63 associates the generated synthesized point cloud data D3 with the individual identification data D2 and outputs it to the animal extraction unit 64.
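When the rigid poses of the two scanners relative to the drinking fountain are known in advance, merging the two clouds can be as simple as transforming both into a common frame and concatenating them, as in the following sketch. The 4x4 pose matrices and the function name are assumptions for illustration; refinement such as ICP-based point cloud matching is omitted.

```python
import numpy as np

def merge_point_clouds(cloud_1, cloud_2, pose_1, pose_2):
    """Merge the clouds from the two laser measuring devices into one set of
    coordinates (the synthesized point cloud data D3).

    cloud_N: (M, 3) arrays in each instrument's own frame.
    pose_N:  4x4 homogeneous transforms from each instrument frame to a common
             drinking-fountain frame, assumed known from the fixed installation.
    """
    def apply(pose, cloud):
        homogeneous = np.hstack([cloud, np.ones((len(cloud), 1))])
        return (homogeneous @ pose.T)[:, :3]

    return np.vstack([apply(pose_1, cloud_1), apply(pose_2, cloud_2)])
```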
The animal extraction unit 64 extracts, from the synthesized point cloud data D3 generated by the data synthesizing unit 63, only the three-dimensional coordinate positions (coordinate data) corresponding to the dairy cow 51, thereby generating animal point cloud data D4 representing the surface shape of the dairy cow 51 (see FIG. 10). In the growing condition evaluation system 10, the drinking fountain 52 is set as the data acquisition area 14 and the positional relationship of each laser measuring device 13 with respect to the drinking fountain 52 is known in advance, so the point cloud data of the drinking fountain 52 that each laser measuring device 13 acquires is also known in advance. The animal extraction unit 64 therefore generates the animal point cloud data D4 by taking the difference between the synthesized point cloud data D3 and the point cloud data representing the drinking fountain 52 without a dairy cow 51. Because the two laser measuring devices 13 are arranged as described above, the animal point cloud data D4 always includes at least half of the buttock vicinity 51a, regardless of the posture of the dairy cow 51 in the drinking fountain 52. The animal extraction unit 64 associates the generated animal point cloud data D4 with the individual identification data D2 and outputs it to the surface data generation unit 65.
The animal extraction unit 64 may have any other configuration as long as it generates the animal point cloud data D4, and is not limited to the configuration of Example 1. For example, the animal extraction unit 64 may register in advance three-dimensional coordinate positions (coordinate data) representing dairy cows 51 of various angles, sizes, and types, and extract the animal point cloud data D4 by comparison with them. Alternatively, the animal extraction unit 64 may extract clean planes such as the floor and walls of the drinking fountain 52 from the synthesized point cloud data D3, apply threshold processing relative to those planes to extract locations where a dairy cow 51 is likely to be, divide the nearby point cloud data into clusters by clustering, and take the largest cluster as the dairy cow 51 to extract the animal point cloud data D4.
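As an illustration of the background-difference plus largest-cluster idea described above, a minimal sketch might look like the following. The 5 cm background distance, the DBSCAN parameters, and the helper name are assumptions chosen for illustration, not values from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN

def extract_animal(merged_cloud, empty_scene_cloud, bg_dist=0.05, eps=0.10):
    """Keep points far from the pre-scanned empty drinking fountain, then take
    the largest cluster as the cow (animal point cloud data D4).
    Distances are in metres and purely illustrative."""
    dist, _ = cKDTree(empty_scene_cloud).query(merged_cloud, k=1)
    foreground = merged_cloud[dist > bg_dist]          # differs from background
    labels = DBSCAN(eps=eps, min_samples=20).fit_predict(foreground)
    valid = labels[labels >= 0]
    if valid.size == 0:
        return np.empty((0, 3))
    largest = np.bincount(valid).argmax()              # biggest blob = the cow
    return foreground[labels == largest]
```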
The surface data generation unit 65 generates, from the animal point cloud data D4 and using known techniques, three-dimensional surface data D5 that represents the dairy cow 51 as surfaces (see FIG. 11). The surface data generation unit 65 of Example 1 generates the three-dimensional surface data D5 (mesh data) of the dairy cow 51 by attaching a plurality of meshes using an algorithm that generates triangular meshes and by using Poisson reconstruction. The surface data generation unit 65 may have any other configuration as long as it generates the three-dimensional surface data D5, and is not limited to the configuration of Example 1. The surface data generation unit 65 associates the generated three-dimensional surface data D5 with the individual identification data D2 and outputs it to the specific part extraction unit 66.
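Poisson surface reconstruction of this kind is available in common point cloud libraries; the sketch below uses Open3D as one possible (assumed) choice, with an illustrative normal-estimation radius and octree depth. It is a stand-in for the meshing step, not the patent's own implementation.

```python
import numpy as np
import open3d as o3d

def mesh_from_points(animal_points_xyz, depth=8):
    """Turn the animal point cloud D4 into a triangular mesh playing the role
    of the three-dimensional surface data D5, via Poisson reconstruction."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(np.asarray(animal_points_xyz))
    # Normals are required by Poisson reconstruction; the radius is illustrative.
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))
    mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=depth)
    return mesh
```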
The specific part extraction unit 66 generates, from the three-dimensional surface data D5 representing the dairy cow 51, buttock-vicinity surface data D6 of the buttock vicinity 51a (see FIGS. 12 and 13). In the present disclosure, the buttock vicinity 51a of the dairy cow 51 is considered important for evaluating its growing condition, so the specific part extraction unit 66 generates the buttock-vicinity surface data D6. The specific part extraction unit 66 of Example 1 extracts characteristic bone features in the buttock vicinity 51a from the three-dimensional surface data D5 and sets the cut planes based on those features. As an example, in the front-rear direction of the dairy cow 51, the specific part extraction unit 66 takes as a reference (reference length Lr) the length from the middle position between the pin bone (see symbol Bp in FIG. 13) and the tail head (see symbol Bt in FIG. 13) to the hook bone (see symbol Bh in FIG. 13), and cuts at the position corresponding to twice the reference length Lr from the pin bone toward the head, along a cut plane Scv orthogonal to the front-rear direction. Also, as an example, in the height direction of the dairy cow 51, the specific part extraction unit 66 cuts along a cut plane Sch orthogonal to the vertical direction so as to exclude the lower abdomen (from the underbelly to around the udder). This is because the lower abdomen tends to change from day to day and is therefore considered unsuitable for judging the growing condition. The size of the buttock-vicinity surface data D6 (the positions of the cut planes) may be set as appropriate as long as it consists of three-dimensional coordinate positions (coordinate data) representing the buttock vicinity 51a, and is not limited to the configuration of Example 1. The specific part extraction unit 66 associates the generated buttock-vicinity surface data D6 with the individual identification data D2 and outputs it to the part volume calculation unit 67.
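Once the landmark coordinates along the body axis have been located, the crop itself reduces to keeping vertices between the cut planes. The Python sketch below assumes the body axis is x (head toward +x) and height is z, with hypothetical landmark arguments; it is only an illustration of the rule above, not the patent's implementation.

```python
import numpy as np

def crop_buttock_region(vertices, pin_bone_x, tail_head_x, hook_bone_x, belly_z):
    """Crop the buttock-vicinity region out of mesh vertices (N, 3).

    Assumes the body axis is x (head in the +x direction) and height is z;
    the landmark coordinates are taken to have been located beforehand.
    """
    mid_x = 0.5 * (pin_bone_x + tail_head_x)
    lr = abs(hook_bone_x - mid_x)                 # reference length Lr
    front_cut_x = pin_bone_x + 2.0 * lr           # cut plane Scv toward the head
    # Keep everything behind the cut plane Scv and above the cut plane Sch.
    keep = (vertices[:, 0] <= front_cut_x) & (vertices[:, 2] >= belly_z)
    return vertices[keep]
```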
Based on the buttock-vicinity surface data D6 generated by the specific part extraction unit 66, the part volume calculation unit 67 calculates, using known techniques, the volume of the cut-out part represented by the buttock-vicinity surface data D6 (the cut-out part volume Vc). For example, the part volume calculation unit 67 calculates the cut-out part volume Vc by computing, for every triangular mesh, the volume of the triangular pyramid whose base is a triangular mesh of the three-dimensional surface data D5 and whose apex is a reference point within the buttock-vicinity surface data D6. The part volume calculation unit 67 is not limited to the configuration of Example 1 as long as it calculates the cut-out part volume Vc, that is, the volume of the buttock-vicinity surface data D6. The part volume calculation unit 67 associates each cut-out part volume Vc with the individual identification data D2 and outputs it to the growing condition evaluation unit 71.
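Summing tetrahedron volumes over a triangle mesh is a standard way to estimate such a volume; a compact, assumed-for-illustration version of that formula is shown below.

```python
import numpy as np

def mesh_volume(vertices, triangles, reference_point):
    """Approximate the cut-out part volume Vc by summing, over every triangle,
    the volume of the tetrahedron formed with a reference point inside the
    buttock-vicinity data (a common mesh-volume formula, given as a sketch)."""
    v0 = vertices[triangles[:, 0]] - reference_point
    v1 = vertices[triangles[:, 1]] - reference_point
    v2 = vertices[triangles[:, 2]] - reference_point
    # Signed tetrahedron volume: (v0 . (v1 x v2)) / 6 per triangle.
    signed = np.einsum('ij,ij->i', v0, np.cross(v1, v2)) / 6.0
    return float(abs(signed.sum()))
```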
The cut surface extraction unit 68 generates, from the buttock-vicinity surface data D6, buttock-vicinity cross-sectional data D7 (see FIG. 14), which is the cross section obtained by cutting the buttock vicinity 51a of the dairy cow 51 along a predetermined plane. The cut surface extraction unit 68 of Example 1 uses planes orthogonal to the front-rear direction of the dairy cow 51 and generates the buttock-vicinity cross-sectional data D7 by cutting the buttock-vicinity surface data D6 at slice positions Sp set at arbitrary positions in the front-rear direction. The slice positions Sp may be set as appropriate; in Example 1 they are set at the five positions shown in FIG. 13, and when referred to individually they are numbered 1 to 5 in order from the head side. Slice position Sp1 is the same position as the cut plane Scv used when cutting out the buttock-vicinity surface data D6. Slice position Sp3 is at the hook bone. Slice position Sp2 is midway between slice positions Sp1 and Sp3. Slice position Sp5 is midway between the tail head (see symbol Bt) and the pin bone (see symbol Bp). Slice position Sp4 is midway between slice positions Sp3 and Sp5.
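The five slice positions follow directly from the landmark coordinates; a tiny sketch of that bookkeeping, with hypothetical arguments along an assumed body axis x, is:

```python
def slice_positions(scv_x, hook_bone_x, tail_head_x, pin_bone_x):
    """Compute the five slice positions Sp1..Sp5 along the body axis from the
    bone landmarks, following the midpoint rules described above (illustrative)."""
    sp1 = scv_x                               # same as the cut plane Scv
    sp3 = hook_bone_x                         # at the hook bone
    sp2 = 0.5 * (sp1 + sp3)                   # midway between Sp1 and Sp3
    sp5 = 0.5 * (tail_head_x + pin_bone_x)    # midway between tail head and pin bone
    sp4 = 0.5 * (sp3 + sp5)                   # midway between Sp3 and Sp5
    return sp1, sp2, sp3, sp4, sp5
```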
The cut surface extraction unit 68 may also generate, as buttock-vicinity cross-sectional data D7, a cross section cut along a plane orthogonal to the width direction of the dairy cow 51. Slice position Sp6 is shown as an example of this; it is parallel to the spine at a position midway between the spine and the hook bone (see symbol Bh).
 An example of the buttock vicinity cross-section data D7 is shown in FIG. 14. In FIG. 14, the cross sections are arranged from the left in front view in the order of the number at the end of each slice position Sp, and are numbered 1 to 6 at the end accordingly. That is, buttock vicinity cross-section data D71 corresponds to slice position Sp1, D72 to Sp2, D73 to Sp3, D74 to Sp4, D75 to Sp5, and D76 to Sp6. The cut surface extraction unit 68 associates the individual identification data D2 with the generated buttock vicinity cross-section data D7 and outputs it to the approximate curve calculation unit 69.
 The approximate curve calculation unit 69 calculates an approximate curve Ac (see FIGS. 15 and 16) that fits the contour OL (see FIG. 14) on the outer-surface side of the buttock vicinity cross-section data D7 generated by the cut surface extraction unit 68. The approximate curve calculation unit 69 can calculate an approximate curve Ac fitting the contour OL using a known technique; in Embodiment 1, Bezier curves are used so that the approximate curve Ac is smooth. This concept is described with reference to FIGS. 15 and 16.
 FIG. 15 shows a part of the contour OL and two approximate curves Ac calculated by fitting to it (hereinafter, when referred to individually, the left one is the first approximate curve Ac1 and the right one is the second approximate curve Ac2). In FIG. 15, because the lines overlap and would otherwise be hard to distinguish, the contour OL is drawn as a thick solid line, the first approximate curve Ac1 as a thin broken line, and the second approximate curve Ac2 as a thin dash-dot line.
 Assume that the approximate curve calculation unit 69 calculates two approximate curves Ac (Ac1 and Ac2) for the part of the contour OL shown in FIG. 15. First, to calculate the first approximate curve Ac1 that fits the contour OL from near its left end to near its center, the approximate curve calculation unit 69 sets passing points P at a start point and an end point on the contour OL. Hereinafter, when referred to individually, the start-point side is the passing point Ps and the end-point side is the passing point Pe. The approximate curve calculation unit 69 also sets a plurality of control points (two, C1 and C2, in the example of FIG. 15) for adjusting how the curve bends from the passing point Ps to the passing point Pe. The approximate curve calculation unit 69 then adjusts the positions of the control points (C1, C2) so that the curve from the passing point Ps to the passing point Pe overlaps the contour OL. In this way, the approximate curve calculation unit 69 can obtain the first approximate curve Ac1 that runs from the passing point Ps to the passing point Pe while overlapping the contour OL.
 Next, to calculate the second approximate curve Ac2 that fits the contour OL from near its center to near its right end, the approximate curve calculation unit 69 similarly sets a passing point Ps' near the center of the contour OL and a passing point Pe' near its right end, and sets a plurality of control points (C1', C2'). The passing point Ps' is given the same coordinates as the passing point Pe of the first approximate curve Ac1 so that the contour OL is represented by approximate curves without a break. The approximate curve calculation unit 69 then adjusts the positions of the control points (C1', C2') so that the curve from the passing point Ps' to the passing point Pe' overlaps the contour OL. In this way, the approximate curve calculation unit 69 can obtain the second approximate curve Ac2 that runs from the passing point Ps' to the passing point Pe' while overlapping the contour OL.
 Here, in order to make the first approximate curve Ac1 and the second approximate curve Ac2 continue smoothly, the approximate curve calculation unit 69 places the segment from the control point C2 to the passing point Pe of the first approximate curve Ac1 and the segment from the passing point Ps' to the control point C1' of the second approximate curve Ac2 on the same straight line. In Embodiment 1, the control point C1' is placed at a position point-symmetric to the control point C2 with respect to the shared passing point (Pe, Ps'), which keeps the segments collinear while effectively allowing the setting of the control point C1' itself to be omitted. By setting the passing points P and the control points C with the above in mind, the approximate curve calculation unit 69 can obtain the two approximate curves (Ac1, Ac2) that are smoothly continuous and overlap the contour OL.
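A minimal sketch of this kind of piecewise cubic Bezier fitting is given below. It fits one cubic segment to a run of ordered contour points by least squares with the endpoints fixed as the passing points, and derives the first control point of the next segment by point symmetry so that the join is smooth. The fitting method (least squares with chord-length parameterisation) is an assumption made for illustration; the embodiment only specifies that Bezier curves are used and how the control points relate at the join.

```python
import numpy as np

def fit_cubic_bezier(points: np.ndarray) -> np.ndarray:
    """Fit one cubic Bezier segment to ordered 2D contour points.

    The first and last points are treated as the passing points Ps and Pe;
    the two interior control points (C1, C2) are found by least squares
    using a chord-length parameterisation. Returns the 4 control points.
    """
    ps, pe = points[0], points[-1]
    d = np.cumsum(np.r_[0.0, np.linalg.norm(np.diff(points, axis=0), axis=1)])
    t = d / d[-1]                                  # parameter in [0, 1]
    b0, b1 = (1 - t) ** 3, 3 * t * (1 - t) ** 2
    b2, b3 = 3 * t ** 2 * (1 - t), t ** 3
    # Solve for C1, C2 in: B(t) = b0*Ps + b1*C1 + b2*C2 + b3*Pe
    rhs = points - np.outer(b0, ps) - np.outer(b3, pe)
    A = np.column_stack([b1, b2])
    (c1, c2), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return np.array([ps, c1, c2, pe])

def next_first_control_point(prev_segment: np.ndarray) -> np.ndarray:
    """C1' of the following segment: point-symmetric to the previous C2 about
    the shared passing point Pe (= Ps'), which makes the join smooth."""
    ps_shared, c2_prev = prev_segment[3], prev_segment[2]
    return 2.0 * ps_shared - c2_prev

def bezier_point(segment: np.ndarray, t: float) -> np.ndarray:
    """Evaluate a cubic Bezier segment at parameter t."""
    p0, p1, p2, p3 = segment
    return ((1 - t) ** 3 * p0 + 3 * t * (1 - t) ** 2 * p1
            + 3 * t ** 2 * (1 - t) * p2 + t ** 3 * p3)
```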
 In Embodiment 1, the approximate curve calculation unit 69 standardizes the way the approximate curves Ac are obtained. That is, the approximate curve calculation unit 69 divides the contour OL into a plurality of ranges with reference to feature points of the dairy cow 51 on the contour OL (for example, characteristic bone features appearing on the contour OL), and obtains an approximate curve Ac for each range. Here, by choosing as the feature points of the dairy cow 51 locations that are common regardless of individual differences among dairy cows 51, the approximate curve calculation unit 69 can represent the contour OL with the same number of approximate curves Ac for every dairy cow 51 regardless of individual differences. The approximate curve calculation unit 69 then sets a passing point P at the start point and the end point of each range, and predefines the number of control points C to be set in each range. As a result, the approximate curve calculation unit 69 can represent the contour OL with an equal number of approximate curves Ac regardless of individual differences among dairy cows 51, while the coefficients of each approximate curve Ac and the positions of the passing points P and the control points C vary according to the individual differences among the dairy cows 51.
 Then, by obtaining a plurality of approximate curves Ac that fit the contour OL of the buttock vicinity cross-section data D7 as described above, the approximate curve calculation unit 69 can represent the contour OL as a plurality of approximate curves Ac. As an example, FIG. 16 shows the contour OL represented by a plurality of approximate curves Ac. The example of FIG. 16 shows approximate curves fitted to the contour OL of the buttock vicinity cross-section data D73 of FIG. 14: nine passing points P and nine control points C are set, and the contour is represented by eight approximate curves Ac. In the example of FIG. 16, in addition to the nine control points C, seven further control points (called symmetric control points Cs in FIG. 16 for distinction) are provided. These seven symmetric control points Cs are set at positions point-symmetric to a control point C with respect to the passing point P between them, so that two adjacent approximate curves Ac continue smoothly as described above. Of the nine control points C, the control points C at both ends have no adjacent approximate curve Ac, so no point-symmetric symmetric control point Cs is drawn for them. For this reason, the example of FIG. 16 effectively sets nine passing points P and nine control points C and represents the contour with eight approximate curves Ac. The approximate curve calculation unit 69 associates the individual identification data D2 with the calculated plurality of approximate curves Ac (their data), including the passing points P and the control points C, and outputs them to the rearing condition evaluation unit 71.
 The rearing condition evaluation unit 71 generates evaluation data Ev indicating the evaluation of the rearing condition of the dairy cow 51 that was at the drinking fountain 52, based on the cut-out part volume Vc from the part volume calculation unit 67 and the approximate curves Ac from the approximate curve calculation unit 69. For example, by using the cut-out part volume Vc, the rearing condition evaluation unit 71 can obtain the conventionally used BCS, and this BCS can serve as the evaluation data Ev of the dairy cow 51. The rearing condition evaluation unit 71 can also obtain the coefficients of each approximate curve Ac (for example, the amount of change from a reference value, the distribution of each coefficient, and the like) and the positions of the passing points P and the control points C (their coordinates or displacements from reference positions, and the like), and these can serve as the evaluation data Ev of the dairy cow 51. The rearing condition evaluation unit 71 associates the individual identification data D2 with the evaluation data Ev of the dairy cow 51 and stores them in the storage unit 18 as appropriate.
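As an illustration of how such evaluation data might be assembled, the sketch below builds a simple feature vector from the cut-out part volume and the fitted passing-point and control-point coordinates and maps it to a single score with a linear model. The use of a linear model and this particular choice of features are assumptions made for illustration, since the embodiment leaves the concrete evaluation method open.

```python
import numpy as np

def build_feature_vector(cut_volume: float, segments: list) -> np.ndarray:
    """Concatenate the cut-out part volume Vc with the passing-point and
    control-point coordinates of all fitted Bezier segments (each a (4, 2) array)."""
    coords = np.concatenate([np.asarray(seg).ravel() for seg in segments])
    return np.concatenate([[cut_volume], coords])

def score_from_features(features: np.ndarray, weights: np.ndarray,
                        bias: float) -> float:
    """Map the feature vector to a single evaluation value (e.g. a BCS-like score).

    The linear form and the weights are placeholders; in practice they would
    have to be calibrated against reference assessments.
    """
    return float(features @ weights + bias)
```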
 Note that the rearing condition evaluation unit 71 may generate the evaluation data Ev of the dairy cow 51 by using other numerical values based on the cut-out part volume Vc and the approximate curves Ac. For example, the rearing condition evaluation unit 71 may generate the evaluation data Ev based on changes over time in the cut-out part volume Vc and the approximate curves Ac of the target dairy cow 51.
 The control mechanism 11 displays the evaluation data Ev stored in the storage unit 18 on the display unit 16 as appropriate, or outputs it to an external device via the communication unit 17 as appropriate. Since the individual identification data D2 is associated with the evaluation data Ev, it is easy to grasp which dairy cow 51 the rearing condition data refers to. In this way, the control mechanism 11 can report the evaluation data Ev of the dairy cow 51.
 Next, a rearing condition evaluation process (rearing condition evaluation control method), as an example of evaluating the rearing condition of the dairy cow 51 using the growing condition evaluation system 10, is described with reference to FIG. 17. This rearing condition evaluation process is executed by the control mechanism 11 based on a program stored in the storage unit 18 or the internal memory 11a. Each step (each process) of the flowchart of FIG. 17 is described below. The flowchart of FIG. 17 starts when the growing condition evaluation system 10 is activated, the browser or application is launched, the camera 12 is driven, and both laser measuring devices 13 are placed in a standby state.
 In step S1, it is determined whether or not a dairy cow 51 is present at the drinking fountain 52 (data acquisition area 14); if YES, the process proceeds to step S2, and if NO, step S1 is repeated. In step S1, the animal detection unit 61 analyzes the image I acquired by the camera 12 to determine whether or not a dairy cow 51 is at the drinking fountain 52, and if a dairy cow 51 is present, outputs a signal to that effect to the data acquisition unit 62. In addition, in step S1 of Embodiment 1, when a dairy cow 51 is detected, individual identification data D2 identifying that dairy cow 51 as an individual is generated and output to the data acquisition unit 62.
 In step S2, the point cloud data D1 of the drinking fountain 52 is acquired, and the process proceeds to step S3. In step S2, when the data acquisition unit 62 receives from the animal detection unit 61 the signal indicating that a dairy cow 51 has been detected, it drives the two laser measuring devices 13 to scan the drinking fountain 52 (data acquisition area 14) and acquire the point cloud data D1 of the drinking fountain 52. Then, in step S2, when the data acquisition unit 62 receives the point cloud data D1 from both laser measuring devices 13, it associates the individual identification data D2 with each point cloud data D1 and outputs them to the data synthesizing unit 63.
 In step S3, the synthesized point cloud data D3 is generated, and the process proceeds to step S4. In step S3, the data synthesizing unit 63 combines the point cloud data D1 acquired by both laser measuring devices 13 to generate the synthesized point cloud data D3, associates the individual identification data D2 with the generated synthesized point cloud data D3, and outputs it to the animal extraction unit 64.
 In step S4, the animal point cloud data D4 is generated, and the process proceeds to step S5. In step S4, the animal extraction unit 64 extracts only the three-dimensional coordinate positions (coordinate data) corresponding to the dairy cow 51 from the synthesized point cloud data D3 generated by the data synthesizing unit 63 to generate the animal point cloud data D4, associates the individual identification data D2 with the generated animal point cloud data D4, and outputs it to the surface data generation unit 65.
 In step S5, the three-dimensional surface data D5 is generated, and the process proceeds to step S6. In step S5, the surface data generation unit 65 generates the three-dimensional surface data D5 of the dairy cow 51 from the animal point cloud data D4 generated by the animal extraction unit 64, associates the individual identification data D2 with the generated three-dimensional surface data D5, and outputs it to the specific part extraction unit 66.
 In step S6, the buttock vicinity surface data D6 is generated, and the process proceeds to step S7. In step S6, the specific part extraction unit 66 generates the buttock vicinity surface data D6 of the buttock vicinity 51a of the dairy cow 51 from the three-dimensional surface data D5 generated by the surface data generation unit 65, associates the individual identification data D2 with the generated buttock vicinity surface data D6, and outputs it to the part volume calculation unit 67.
 In step S7, the cut-out part volume Vc is calculated, and the process proceeds to step S8. In step S7, the part volume calculation unit 67 calculates the cut-out part volume Vc, which is the volume of the buttock vicinity surface data D6 (the cut-out part) generated by the specific part extraction unit 66, associates the individual identification data D2 with the calculated cut-out part volume Vc, and outputs it to the rearing condition evaluation unit 71.
 In step S8, the buttock vicinity cross-section data D7 is generated, and the process proceeds to step S9. In step S8, the cut surface extraction unit 68 generates, from the buttock vicinity surface data D6 (the cut-out part) generated by the specific part extraction unit 66, the buttock vicinity cross-section data D7 that is a cross section obtained by cutting the buttock vicinity 51a of the dairy cow 51 along a predetermined plane, associates the individual identification data D2 with the generated buttock vicinity cross-section data D7, and outputs it to the approximate curve calculation unit 69.
 In step S9, the approximate curve Ac is calculated, and the process proceeds to step S10. In step S9, the approximate curve calculation unit 69 calculates the approximate curve Ac that fits the contour OL on the outer-surface side of the buttock vicinity cross-section data D7 generated by the cut surface extraction unit 68, associates the individual identification data D2 with the approximate curve Ac together with its passing points P and control points C, and outputs them to the rearing condition evaluation unit 71.
 In step S10, the evaluation data Ev is generated, and the process proceeds to step S11. In step S10, the rearing condition evaluation unit 71 generates the evaluation data Ev indicating the evaluation of the rearing condition of the dairy cow 51 that was at the drinking fountain 52, based on the cut-out part volume Vc from the part volume calculation unit 67 and the approximate curves Ac from the approximate curve calculation unit 69, associates the individual identification data D2 with the evaluation data Ev, and stores them in the storage unit 18 as appropriate.
 In step S11, it is determined whether or not the rearing condition evaluation process has been ended; if YES, the rearing condition evaluation process ends, and if NO, the process returns to step S1. In step S11, the rearing condition evaluation process is judged to have ended when the growing condition evaluation system 10 is stopped or when an operation to end the rearing condition evaluation process is performed on the operation unit 15.
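The following Python sketch outlines the loop of steps S1 to S11 described above. Every callable it uses is a hypothetical placeholder standing in for the corresponding unit (61 to 71); none of these names are actual APIs of the system, and the structure is only an illustrative reading of the flowchart.

```python
import time
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class PipelineSteps:
    """Callables standing in for the processing units 61-71 (placeholders only)."""
    detect_cow: Callable[[], Optional[str]]          # S1: returns an individual ID or None
    acquire_point_clouds: Callable[[], list]         # S2: both laser measuring devices
    merge_point_clouds: Callable[[list], object]     # S3
    extract_animal: Callable[[object], object]       # S4
    build_surface: Callable[[object], object]        # S5
    cut_out_rump: Callable[[object], object]         # S6
    part_volume: Callable[[object], float]           # S7
    cross_sections: Callable[[object], List[object]] # S8
    fit_curves: Callable[[List[object]], list]       # S9
    evaluate: Callable[[float, list], dict]          # S10
    store: Callable[[str, dict], None]

def run_evaluation_loop(steps: PipelineSteps, stop_flag: Callable[[], bool],
                        poll_interval: float = 1.0) -> None:
    """Rough outline of the loop S1-S11 of FIG. 17."""
    while not stop_flag():                            # S11
        cow_id = steps.detect_cow()                   # S1
        if cow_id is None:
            time.sleep(poll_interval)
            continue
        clouds = steps.acquire_point_clouds()         # S2
        merged = steps.merge_point_clouds(clouds)     # S3
        animal = steps.extract_animal(merged)         # S4
        surface = steps.build_surface(animal)         # S5
        rump = steps.cut_out_rump(surface)            # S6
        vc = steps.part_volume(rump)                  # S7
        sections = steps.cross_sections(rump)         # S8
        curves = steps.fit_curves(sections)           # S9
        ev = steps.evaluate(vc, curves)               # S10
        steps.store(cow_id, ev)
```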
 Next, how the rearing condition evaluation is performed using the growing condition evaluation system 10 is described.
 When the growing condition evaluation system 10 detects, via the camera 12, that a dairy cow 51 is at the drinking fountain 52, it acquires the point cloud data D1 of the drinking fountain 52 with both laser measuring devices 13 (steps S1, S2). The growing condition evaluation system 10 then calculates, based on the point cloud data D1, the volume of the buttock vicinity 51a of the dairy cow 51 (the cut-out part volume Vc) (steps S3 to S7), and calculates the approximate curves Ac that fit the contour OL on the outer-surface side of the buttock vicinity 51a (steps S3 to S6, S8 and S9). The growing condition evaluation system 10 then generates the evaluation data Ev of the dairy cow 51 using the cut-out part volume Vc and the approximate curves Ac, and stores it in the storage unit 18 as appropriate (step S10). The growing condition evaluation system 10 can report the evaluation data Ev by displaying it on the display unit 16 as appropriate or by outputting it to an external device via the communication unit 17 as appropriate.
 Here, the conventional growing condition evaluation system acquires the three-dimensional coordinate group of an animal using a range image sensor. Because there are limits to how far the accuracy and resolution of the three-dimensional coordinate group acquired by such a range image sensor can be increased, it is difficult for that coordinate group to appropriately represent the actual contour of the animal. For this reason, with the conventional health condition estimating device, it is difficult to obtain an appropriate evaluation from the BCS even when the three-dimensional coordinate group is used.
 Furthermore, because the conventional health condition estimating device uses a range image sensor, the contour of the animal indicated by the three-dimensional coordinate group differs from the actual animal, for example becoming jagged, and it is difficult to express an appropriate contour shape with a mathematical expression (approximate curve). For this reason, with the conventional health condition estimating device, it is difficult to obtain a mathematical expression that appropriately represents the contour shape, and hence difficult to appropriately evaluate the rearing condition using the contour.
 Conventionally, a veterinarian or the like generally determines the BCS by grasping the shape of the dairy cow by touch and the like. With the BCS, the person determining it (a veterinarian or the like) judges, according to how things appear to them, where on the scale the shape based on bones, flesh cover and the like falls, so variation arises between assessors and it is difficult to obtain an appropriate evaluation. That is, with the conventional BCS it is difficult to improve the accuracy of the evaluation, because of individual differences between assessors and fluctuations in judgment even by the same assessor. In addition, with the BCS, being touched so that its shape can be assessed is itself stressful for the dairy cow.
 In contrast, the growing condition evaluation system 10 acquires the three-dimensional coordinate group of the animal (the dairy cow 51 in Embodiment 1) using the laser measuring devices 13. The laser measuring device 13 can acquire a three-dimensional coordinate group (point cloud data D1) with an accuracy at the level used for precise surveying, and its resolution can also be made extremely high (a 12.5 mm spacing in Embodiment 1), so the point cloud data D1 can represent the actual contour of the dairy cow 51 extremely faithfully. As one example, the laser measuring device 13 of Embodiment 1 can achieve a ranging accuracy of the pulsed laser beam within an error of 3.5 mm, a scanning-plane accuracy within an error of 2.0 mm, and an angle-measurement accuracy within an error of 6 arc-seconds both vertically and horizontally. The laser measuring device 13 of Embodiment 1 can also be set to a low-power mode; even in that case the above accuracies are maintained, except that the ranging accuracy of the pulsed laser beam becomes an error within 4.0 mm. In this way, by using the laser measuring devices 13, the growing condition evaluation system 10 can acquire a three-dimensional coordinate group with extremely high accuracy and at an extremely high resolution. Therefore, by using the point cloud data D1 from the laser measuring devices 13, the growing condition evaluation system 10 can appropriately obtain an evaluation of the rearing condition.
 In addition, since the growing condition evaluation system 10 uses the laser measuring devices 13, the contour OL of the dairy cow 51 indicated by the point cloud data D1 can represent the actual contour of the dairy cow 51 extremely faithfully, so the contour OL can be approximated by smooth curves and approximate curves Ac that appropriately represent the actual contour of the dairy cow 51 can be obtained. Because the growing condition evaluation system 10 can obtain approximate curves Ac that represent the actual contour of the dairy cow 51 extremely faithfully, the contour OL can be expressed by standardized mathematical expressions, and an evaluation of the rearing condition free of variation can be obtained. That is, by approximating the contour OL based on the point cloud data D1 to obtain the approximate curves Ac, the growing condition evaluation system 10 can express the body shape of the dairy cow 51 mathematically and compute the evaluation of the rearing condition by calculation, so that the evaluation becomes objective and appropriate.
 Furthermore, by standardizing the way the approximate curves Ac are obtained, the growing condition evaluation system 10 can express individual differences in the contour OL as the coefficients of the approximate curves Ac and the positions of the passing points P and the control points C and the like. Therefore, the growing condition evaluation system 10 can generate the evaluation data Ev from changes in those coefficients and in the positions of the passing points P and the control points C and the like, enabling the rearing condition to be judged with even less variation.
 In addition, since the growing condition evaluation system 10 generates the evaluation data Ev using the approximate curves Ac obtained from the point cloud data D1 acquired by the laser measuring devices 13, it can report that evaluation data Ev as the evaluation of the rearing condition. The growing condition evaluation system 10 can therefore prevent individual differences between assessors and fluctuations in their judgment from being reflected in the evaluation of the rearing condition, and can eliminate differences caused by differing places, facilities, times and the like. As a result, the growing condition evaluation system 10 can generate the evaluation data Ev on a unified standard and can evaluate the dairy cow 51 (animal) as a quantitative numerical value.
 The growing condition evaluation system 10 acquires the point cloud data D1 of the drinking fountain 52 with both laser measuring devices 13 only when it detects, via the camera 12, that a dairy cow 51 is at the drinking fountain 52, and can generate the evaluation data Ev based on that data. The growing condition evaluation system 10 can thus obtain the evaluation data Ev without touching the dairy cow 51, making use of the fact that the dairy cow 51 comes to the drinking fountain 52 of its own will. Accordingly, the growing condition evaluation system 10 enables an appropriate judgment of the rearing condition without subjecting the dairy cow 51 to stress such as being touched or led to a place against its will in order to generate the evaluation data Ev. Moreover, since the growing condition evaluation system 10 automatically obtains the evaluation data Ev by making use of the natural behavior of the dairy cow 51, it is not restricted to particular times and can manage the cows all day (24 hours). Furthermore, when no dairy cow 51 is at the drinking fountain 52, the growing condition evaluation system 10 does not acquire the point cloud data D1 with the two laser measuring devices 13, so acquisition and accumulation of unnecessary data can be prevented and the system can be operated efficiently.
 The growing condition evaluation system 10 of Embodiment 1 of the growing condition evaluation system according to the present disclosure can obtain the following effects.
 The growing condition evaluation system 10 includes the laser measuring devices 13, which acquire the point cloud data D1 representing the outline of the animal (dairy cow 51) in three-dimensional coordinates by receiving the reflected light, from the animal, of the emitted laser light (pulsed laser beam). The growing condition evaluation system 10 also includes the surface data generation unit 65, which generates the three-dimensional surface data D5 of the animal based on the point cloud data D1, the approximate curve calculation unit 69, which calculates the approximate curves Ac that fit the three-dimensional surface data D5, and the rearing condition evaluation unit 71, which generates the evaluation data Ev indicating the evaluation of the animal based on the approximate curves Ac. Therefore, based on the point cloud data D1, which represents the actual contour of the animal extremely faithfully, the growing condition evaluation system 10 can calculate the approximate curves Ac expressing the contour OL as standardized mathematical expressions, and since the evaluation data Ev can be generated from those approximate curves Ac, an appropriate, variation-free evaluation of the rearing condition can be obtained.
 In the growing condition evaluation system 10, the approximate curve calculation unit 69 calculates the approximate curve Ac by fitting it to the contour OL of the cut surface (the buttock vicinity cross-section data D7) obtained by cutting the three-dimensional surface data D5 along a cutting plane. Therefore, by determining the position of the cut surface based on feature portions such as bones, the growing condition evaluation system 10 can keep the volume of the generated evaluation data Ev small while making comparison between animals easy, so that the rearing condition can be evaluated appropriately.
 Furthermore, in the growing condition evaluation system 10, the data acquisition area 14 for acquiring the point cloud data D1 of the animal (dairy cow 51) is set, and the system includes the animal detection mechanism (the camera 12 and the animal detection unit 61) that detects the presence of the animal in the data acquisition area 14. In the growing condition evaluation system 10, when the animal detection mechanism (the camera 12 and the animal detection unit 61) detects the animal, the laser measuring devices 13 acquire the point cloud data D1 of the data acquisition area 14. Because the growing condition evaluation system 10 drives the laser measuring devices 13 only while the animal is in the data acquisition area 14, the large-volume point cloud data D1 is not acquired when no animal is present, and the evaluation data Ev can be generated efficiently. In addition, the growing condition evaluation system 10 can generate the evaluation data Ev without subjecting the animal to stress such as being touched or led to a place against its will, and can thus avoid hindering the animal's rearing. In particular, since the growing condition evaluation system 10 of Embodiment 1 uses the drinking fountain 52 as the data acquisition area 14, the evaluation data Ev can be generated at moments when the animal stops of its own accord in the course of its natural behavior, so the stress on the animal associated with generating the data can be eliminated.
 In the growing condition evaluation system 10, the laser measuring devices 13 are provided as a pair with the data acquisition area 14 between them. The growing condition evaluation system 10 can therefore acquire the point cloud data D1 of the animal in the data acquisition area 14 regardless of the animal's position, posture, facing direction and the like, and can generate appropriate evaluation data Ev of the animal while reducing stress. In particular, in Embodiment 1, the positional relationship of the two laser measuring devices 13 to the drinking fountain 52 is set so that, regardless of the posture of the animal at the drinking fountain 52 serving as the data acquisition area 14, the point cloud data D1 of at least half of the animal's body (either the left or the right side) can be acquired. The growing condition evaluation system 10 can thus greatly increase the likelihood of acquiring the point cloud data D1 needed to generate the evaluation data Ev while eliminating the stress on the animal that generating the evaluation data Ev would otherwise cause.
 The growing condition evaluation system 10 further includes the data synthesizing unit 63, which combines the point cloud data D1 acquired by the respective laser measuring devices 13 to generate the synthesized point cloud data D3, and the animal extraction unit 64, which generates, from the synthesized point cloud data D3, the animal point cloud data D4 representing the animal (dairy cow 51). In the growing condition evaluation system 10, the surface data generation unit 65 generates the three-dimensional surface data D5 from the animal point cloud data D4. By combining the two sets of point cloud data D1 into the synthesized point cloud data D3, the growing condition evaluation system 10 can greatly increase the likelihood of acquiring the data needed to evaluate the animal's rearing condition (in Embodiment 1, at least half of the buttock vicinity 51a). In addition, the growing condition evaluation system 10 generates, from the synthesized point cloud data D3, only the data representing the animal that is needed for evaluating the rearing condition (the animal point cloud data D4), and from that generates the three-dimensional surface data D5 and the evaluation data Ev, so the volume of the three-dimensional surface data D5 and the evaluation data Ev and the amount of work for generating them can be kept small, and the evaluation data Ev can be generated efficiently.
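A minimal sketch of this kind of two-scanner merging is shown below. It assumes that each scanner's pose relative to a common reference frame is known as a rotation and translation obtained from a prior calibration; that calibration step, and the specific data layout, are assumptions for illustration rather than details given in the embodiment.

```python
import numpy as np

def to_common_frame(points: np.ndarray, rotation: np.ndarray,
                    translation: np.ndarray) -> np.ndarray:
    """Transform one scanner's (N, 3) points into the common reference frame."""
    return points @ rotation.T + translation

def merge_point_clouds(cloud_a: np.ndarray, pose_a: tuple,
                       cloud_b: np.ndarray, pose_b: tuple) -> np.ndarray:
    """Combine the point clouds from the two laser measuring devices.

    pose_a / pose_b are (rotation, translation) pairs obtained beforehand by
    calibrating each device against the common frame (an assumption here).
    """
    return np.vstack([
        to_common_frame(cloud_a, *pose_a),
        to_common_frame(cloud_b, *pose_b),
    ])
```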
 In the growing condition evaluation system 10, the animal detection mechanism (the camera 12 and the animal detection unit 61) generates the individual identification data D2 identifying each detected animal (dairy cow 51) as an individual, and the rearing condition evaluation unit 71 associates the individual identification data D2 with the generated evaluation data Ev. Therefore, even in a situation where a plurality of animals visit the data acquisition area 14 at no fixed time and in no fixed order, the growing condition evaluation system 10 can appropriately obtain an evaluation of the rearing condition of each animal.
 Accordingly, the growing condition evaluation system 10 as one embodiment of the growing condition evaluation system according to the present disclosure can appropriately obtain an evaluation of the rearing condition of the animal (dairy cow 51).
 Although the growing condition evaluation system of the present disclosure has been described above based on Embodiment 1, the specific configuration is not limited to Embodiment 1, and design changes, additions and the like are permitted without departing from the gist of the invention according to each claim.
 For example, in Embodiment 1 the drinking fountain 52 is set as the data acquisition area 14. However, the data acquisition area 14 may be set as appropriate and is not limited to the configuration of Embodiment 1. Here, by setting the data acquisition area 14 to a place that the animal regularly visits of its own will, such as a feeding station, in the same way as the drinking fountain 52, an appropriate evaluation of the rearing condition can be obtained without stressing the animal. In addition, by setting the data acquisition area 14 to a place where the animal stops of its own will for at least the time needed for each laser measuring device 13 to complete its scan, an appropriate evaluation of the rearing condition can be obtained while further reducing the stress on the animal. The required time depends on the scanning speed of each laser measuring device 13; it can be shortened, for example, by increasing the number of pulsed laser beams emitted at one time, so that the point cloud data D1 can be acquired appropriately even if the animal stops for only a short time.
 In Embodiment 1, the dairy cow 51 is used as an example of the animal, and the evaluation data Ev indicating the evaluation of its rearing condition is generated. However, the subject for which the evaluation data Ev is generated (the subject whose rearing condition is evaluated) may be any animal for which an evaluation of the rearing condition is required, and is not limited to the configuration of Embodiment 1.
 Furthermore, in Embodiment 1, the approximate curve calculation unit 69 calculates the approximate curves Ac that fit the contour OL on the outer-surface side of the buttock vicinity cross-section data D7 generated by the cut surface extraction unit 68. However, in the buttock vicinity 51a, that is, in the buttock vicinity cross-section data D7, the locations used for evaluating the rearing condition are basically left-right symmetric, so the approximate curves Ac may instead be calculated for either the left or the right half of the buttock vicinity cross-section data D7 (the buttock vicinity 51a), that is, the evaluation may be made on a half body, and the configuration is not limited to that of Embodiment 1.
 In Embodiment 1, the animal detection unit 61 functions as the animal detection mechanism that detects the presence of an animal in the data acquisition area 14 based on the image from the camera 12. However, the animal detection mechanism may be anything that detects the presence of an animal in the data acquisition area 14, and is not limited to the configuration of Embodiment 1. As one example, a device using infrared light, such as an infrared scanner or infrared thermography, can be used instead of the camera 12; in that case, the animal can be detected more reliably even in dark conditions. As other examples, a pressure sensor may be provided on the faucet of the drinking fountain 52, a weight-detecting device may be provided in the data acquisition area 14, or a sensor may be provided at the entrance of the data acquisition area 14.
 In Embodiment 1, the laser measuring devices 13 are provided as a pair with the data acquisition area 14 between them. However, as long as the laser measuring devices 13 acquire the point cloud data D1 of the animal present in the data acquisition area 14, a single device may be used, or three or more may be provided, and the configuration is not limited to that of Embodiment 1.
 In Embodiment 1, the specific part extraction unit 66 generates the buttock vicinity surface data D6 of the buttock vicinity 51a of the dairy cow 51. However, as long as the specific part extraction unit 66 cuts out, as the specific part, a location suitable for evaluating the animal's rearing condition, the part to be specified may be set as appropriate, and is not limited to the configuration of Embodiment 1.
 In Embodiment 1, the specific part extraction unit 66, the cut surface extraction unit 68, and the approximate curve calculation unit 69 use bone feature portions as references. However, these units (66, 68, 69) may use other portions as references as long as those portions can be detected when setting the animal's part and are common regardless of individual differences between animals, and the configuration is not limited to that of Embodiment 1.
 In Embodiment 1, the approximate curve calculation unit 69 calculates the approximate curves Ac using Bezier curves. However, as long as the approximate curve calculation unit 69 calculates approximate curves that fit the contour of the specified location (in Embodiment 1, the contour OL of the buttock vicinity cross-section data D7) and can be expressed by equations of a predetermined form, the method of calculating the approximate curves may be set as appropriate, and is not limited to the configuration of Embodiment 1.
Cross-Reference to Related Applications
 This application claims priority based on Japanese Patent Application No. 2021-052422, filed with the Japan Patent Office on March 25, 2021, the entire disclosure of which is incorporated herein by reference.

Claims (6)

  1.  A growing condition evaluation system comprising:
     a laser measuring device that acquires point cloud data representing the outline of an animal in three-dimensional coordinates by receiving reflected light, from the animal, of emitted laser light;
     a surface data generation unit that generates three-dimensional surface data of the animal based on the point cloud data;
     an approximate curve calculation unit that calculates an approximate curve that fits the three-dimensional surface data; and
     a rearing condition evaluation unit that generates, based on the approximate curve, evaluation data indicating an evaluation of the animal.
  2.  The growing condition evaluation system according to claim 1, wherein the approximate curve calculation unit calculates the approximate curve by fitting it to a contour of a cut surface obtained by cutting the three-dimensional surface data along a cutting plane.
  3.  The growing condition evaluation system according to claim 1 or claim 2, wherein
     a data acquisition area for acquiring the point cloud data of the animal is set,
     the system further comprises an animal detection mechanism that detects that the animal is in the data acquisition area, and
     the laser measuring device acquires the point cloud data of the data acquisition area when the animal detection mechanism detects the animal.
  4.  The growing condition evaluation system according to claim 3, wherein the laser measuring devices are provided as a pair with the data acquisition area interposed between them.
  5.  The growing condition evaluation system according to claim 4, further comprising:
     a data synthesizing unit that combines the point cloud data acquired by each of the laser measuring devices to generate synthesized point cloud data; and
     an animal extraction unit that generates, from the synthesized point cloud data, animal point cloud data representing the animal,
     wherein the surface data generation unit generates the three-dimensional surface data from the animal point cloud data.
  6.  The growing condition evaluation system according to any one of claims 3 to 5, wherein the animal detection mechanism generates individual identification data that identifies each detected animal as an individual, and the rearing condition evaluation unit associates the individual identification data with the generated evaluation data.

