WO2020022309A1 - Measurement device, measurement system, measurement method, and program storage medium - Google Patents

Measurement device, measurement system, measurement method, and program storage medium Download PDF

Info

Publication number
WO2020022309A1
WO2020022309A1 (international application PCT/JP2019/028768)
Authority
WO
WIPO (PCT)
Prior art keywords
thickness
measurement
unit
fish
reference plane
Prior art date
Application number
PCT/JP2019/028768
Other languages
French (fr)
Japanese (ja)
Inventor
準 小林
真美子 麓
Original Assignee
日本電気株式会社 (NEC Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 (NEC Corporation)
Publication of WO2020022309A1 publication Critical patent/WO2020022309A1/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness

Definitions

  • the present invention relates to a technology for measuring an object.
  • Patent Document 1 discloses a configuration in which a passing object carried on a moving lane is photographed by a plurality of photographing devices provided above the moving lane, and the height of the passing object is calculated using the photographed images of the passing object.
  • Patent Literature 2 discloses a configuration in which a fish body placed on a conveyor is irradiated with near-infrared light and white light, and the type of the fish body is discriminated using a monochromatic image of the fish body obtained with the near-infrared light and a color image of the fish body obtained with the white light.
  • In Patent Document 1, it is assumed that the object whose height is to be measured has a rectangular parallelepiped shape and a flat upper surface. The edge of the upper surface of the object is detected, and the height of that edge is detected as the height of the passing object. Therefore, if, for example, the passing object in Patent Document 1 were replaced with a fish and the height (thickness) of the fish were measured using the technique of Patent Document 1, the height of the outline portion of the fish would be detected as the thickness of the fish.
  • However, the thickness of a fish is the thickness of its most inflated central portion, not the height of its contour portion above the conveyor, so the fish thickness measured by the method of Patent Document 1 deviates from the actual thickness of the fish. In other words, it is difficult to accurately detect the thickness, which is the size of the fish related to the selling price and the like, with the method of Patent Document 1.
  • a main object of the present invention is to provide a technique capable of increasing the certainty of at least the measured value of the thickness of an object even when the surface of the object is not flat.
  • One embodiment of the measuring device includes: a receiving unit that receives information output from a detection device that is arranged at a position facing a reference plane and outputs information related to spatial coordinates in a detection area including the reference plane; and a calculating unit that, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, detects the portion of the object that is thickest in the direction from the reference plane toward the detection device, and calculates the thickness of the detected portion as the thickness of the object.
  • One aspect of the measurement system includes: a detection device that outputs information related to spatial coordinates in a detection area; and a measuring device including a receiving unit that receives the information output from the detection device, which is arranged at a position facing a reference plane and outputs information related to spatial coordinates in a detection area including the reference plane, and a calculating unit that, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, detects the portion of the object that is thickest in the direction from the reference plane toward the detection device and calculates the thickness of the detected portion as the thickness of the object.
  • One aspect of the measurement method includes: receiving information output from a detection device that is arranged at a position facing a reference plane and outputs information related to spatial coordinates in a detection area including the reference plane; using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, detecting the portion of the object that is thickest in the direction from the reference plane toward the detection device; and calculating the thickness of the detected portion as the thickness of the object.
  • One embodiment of the program storage medium stores a computer program that causes a computer to execute: a process of receiving information output from a detection device that is arranged at a position facing a reference plane and outputs information related to spatial coordinates in a detection area including the reference plane; a process of detecting, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, the portion of the object that is thickest in the direction from the reference plane toward the detection device; and a process of calculating the thickness of the detected portion as the thickness of the object.
  • FIG. 1 is a model diagram schematically illustrating the configuration of a measurement system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating, in simplified form, the configuration of the measurement system according to the first embodiment.
  • FIG. 3 is a diagram explaining the fork length of a fish body.
  • FIG. 4 is a flowchart illustrating an example of the calculation operation in the measurement device according to the first embodiment.
  • FIG. 5 is a block diagram explaining a modification of the measurement device of the first embodiment.
  • FIG. 6 is a diagram explaining the configuration of a measurement system according to a second embodiment of the present invention.
  • FIG. 7 is a block diagram illustrating, in simplified form, the configuration of a measurement device according to another embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating, in simplified form, the configuration of a measurement system according to another embodiment of the present invention.
  • FIG. 1 shows a simplified configuration of a measurement system including a measurement device according to a first embodiment of the present invention.
  • the measurement system 1 according to the first embodiment is a system that measures the size of a fish body 5 being conveyed by a belt conveyor 3 that is a conveyance device, and includes a camera 7 and a measurement device 8.
  • FIG. 2 is a simplified block diagram showing the configurations of the camera 7 and the measuring device 8.
  • the camera 7 is a camera with a depth sensor that functions as a detection device, and is disposed above the belt conveyor 3. As shown in FIG. 2, the camera 7 includes a photographing unit 27 and a sensor unit 28.
  • the photographing unit 27 has a function of photographing an object in a detection area (imaging range) as indicated by a dotted line K in FIG. 1 and outputting a photographed image of visible light (hereinafter also referred to as a color image).
  • the size of the detection region K is set to a size corresponding to the size of the fish 5 to be measured.
  • the sensor unit 28 has a function of outputting information relating to the distance (depth) in the depth direction from the camera 7 with respect to the object in the detection area K.
  • the sensor unit 28 has a function of calculating information relating to the distance from the camera 7 to the object for each pixel by a TOF (Time of Flight) method using infrared rays.
  • the sensor unit 28 also has a function of outputting a two-dimensional image (hereinafter also referred to as a sensor image) that includes, for each pixel, information on the distance (depth) to the object portion corresponding to that pixel.
  • the sensor image output from the sensor unit 28 includes, for each pixel, information relating to the spatial coordinates of the object portion corresponding to the pixel.
  • the x component, the y component, and the z component that constitute the spatial coordinates are represented based on an orthogonal coordinate system defined so that the Z axis runs along the direction from the belt conveyor 3 to the camera 7.
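  • As an illustration of the kind of per-pixel spatial coordinates described above, the following sketch converts a depth map into (x, y, z) values in a coordinate system whose Z axis points from the belt conveyor toward the camera. It assumes a pinhole camera model with intrinsics fx, fy, cx, cy and a known camera height above the conveyor; these names are illustrative assumptions, and the conversion actually performed inside the camera 7 is not specified in this document.

        import numpy as np

        def depth_to_xyz(depth, fx, fy, cx, cy, camera_height):
            """Convert a depth map (metres from the camera) into per-pixel (x, y, z)
            coordinates with the Z axis pointing from the belt conveyor toward the camera.
            fx, fy, cx, cy and camera_height are assumed calibration values."""
            h, w = depth.shape
            u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel column / row indices
            x = (u - cx) * depth / fx                        # lateral position
            y = (v - cy) * depth / fy                        # position along the conveying direction
            z = camera_height - depth                        # height above the belt conveyor (reference plane)
            return np.stack([x, y, z], axis=-1)              # shape (h, w, 3): one coordinate triple per pixel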
  • The camera 7 having such a configuration is connected to the measuring device 8 by wired or wireless communication. The height position of the camera 7 with respect to the belt conveyor 3 (in other words, the reference plane) is set appropriately, taking into account the assumed size of the fish body 5 to be measured, so that the entire fish body 5 to be measured falls within the detection area K of the camera 7.
  • The color image output by the image capturing unit 27 of the camera 7 and the sensor image output by the sensor unit 28 each include information indicating the time at which they were captured, which allows the measuring device 8 to synchronize the color image from the image capturing unit 27 with the sensor image from the sensor unit 28.
  • In addition, information on the positional relationship between the color image and the sensor image output from the camera 7 is provided to the measuring device 8 so that the depth information of the object portion corresponding to each pixel of the color image from the imaging unit 27 can be obtained from the sensor image from the sensor unit 28.
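  • The time-stamp information mentioned above could, for example, be used to pair each color frame with the sensor frame captured closest in time. The sketch below is only an illustration of that idea; the frame representation and the max_skew tolerance are assumptions, not part of this document.

        def pair_frames(color_frames, sensor_frames, max_skew=0.02):
            """Pair each colour frame with the sensor frame whose capture time is closest.
            Frames are (timestamp_seconds, image) tuples; pairs further apart than
            max_skew seconds are discarded."""
            pairs = []
            sensor_sorted = sorted(sensor_frames, key=lambda f: f[0])
            for t_color, color in color_frames:
                t_sensor, sensor = min(sensor_sorted, key=lambda f: abs(f[0] - t_color))
                if abs(t_sensor - t_color) <= max_skew:
                    pairs.append((color, sensor))
            return pairs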
  • the measuring device 8 is, for example, a personal computer, and includes a control device 10 and a storage device 20, as shown in FIG.
  • the measuring device 8 is connected to, for example, an input device (for example, a keyboard or a mouse) 24 for inputting information to the measuring device 8 by a measurer's operation, and a display device 25 for displaying information.
  • the measuring device 8 may be connected to an external storage device 21 separate from the measuring device 8.
  • the storage device 20 has a function of storing various data and computer programs (hereinafter, also referred to as programs), and is realized by a storage medium such as a hard disk device or a semiconductor memory.
  • the number of storage devices provided in the measurement device 8 is not limited to one, and a plurality of types of storage devices may be provided in the measurement device 8. In this case, the plurality of storage devices are collectively referred to as the storage device 20.
  • the storage device 21 also has a function of storing various data and computer programs, like the storage device 20, and is realized by a storage medium such as a hard disk device or a semiconductor memory.
  • When the measuring device 8 is connected to the storage device 21, appropriate information is stored in the storage device 21. In this case, the measuring device 8 writes information to and reads information from the storage device 21 as appropriate, but the storage device 21 is omitted from the following description.
  • the control device 10 includes, for example, a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit).
  • the control device 10 can have the following functions, for example, when the CPU executes a computer program stored in the storage device 20. That is, the control device 10 includes a receiving unit 11, a calculating unit 12, and a display control unit 13 as functional units.
  • The receiving unit 11 has a function of receiving the information output from the camera 7 (that is, the color image from the image capturing unit 27 and the sensor image from the sensor unit 28) and outputting the received information to the display control unit 13. The receiving unit 11 also has a function of storing the received information in the storage device 20.
  • the display control unit 13 has a function of controlling the display operation of the display device 25.
  • the display control unit 13 causes the display device 25 to display an image based on the reception information received from the camera 7 via the reception unit 11, for example, in real time.
  • a display mode for displaying an image based on the received information on the display device 25 may be a display mode that is appropriately set. For example, it is assumed that a display mode for displaying a color image by the imaging unit 27, a display mode for displaying a sensor image by the sensor unit 28, and a display mode for displaying a color image and a sensor image side by side are set in advance.
  • The display control unit 13 detects, based on the operation information of the input device 24 by the measurer, the display mode selected by the measurer from among such a plurality of display modes, and controls the display operation of the display device 25 so that the display device 25 operates in the detected display mode.
  • The display control unit 13 may also read one or both of the past color images and sensor images from the camera 7 out of the storage device 20, based on the operation information of the input device 24 by the measurer, and perform control for displaying the read images on the display device 25.
  • the calculation unit 12 includes a thickness measurement unit 15 and a length measurement unit 16.
  • the thickness measuring unit 15 has a function of measuring the thickness (body width) of the fish body 5 being conveyed by the belt conveyor 3 using the information of the image output from the camera 7 as follows.
  • the thickness measuring unit 15 determines whether or not the entire fish 5 to be measured is in the detection area K using the color image obtained by the photographing unit 27.
  • As an example of this determination process, reference data for fish body detection generated by machine learning is stored in the storage device 20, and using this reference data it is determined whether or not the entire fish body 5 to be measured appears in the detection area K of the color image from the photographing unit 27.
  • In this machine learning, the fish body of the type to be measured is learned using a large number of images of fish bodies of that type as teacher data.
  • the teacher data also includes data of fish bodies that are variously inclined with respect to the traveling direction of the belt conveyor 3.
  • By using the fish body detection reference data learned from such teacher data, the thickness measuring unit 15 can detect a fish body even when it is inclined with respect to the traveling direction of the belt conveyor 3, and can also detect that the entire inclined fish body is within the detection area K. That is, when a fish body is inclined with respect to the traveling direction of the belt conveyor 3, the situation in which the thickness measurement unit 15 fails to detect the fish body even though the entire fish body is within the detection area K can be suppressed.
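  • One simple way the learned fish detector could be used to decide that the entire fish body lies inside the detection area K is to require that the detected bounding box does not touch the image border. The following sketch illustrates this idea; the bounding-box output and the margin parameter are assumptions for illustration, since the document only states that machine-learned reference data are used.

        def whole_fish_in_detection_area(bbox, image_width, image_height, margin=5):
            """Return True when a detected fish bounding box lies entirely inside the
            detection area K, i.e. it stays at least `margin` pixels away from the
            image border.  `bbox` is (x_min, y_min, x_max, y_max) in pixel coordinates
            as produced by the (assumed) fish detector."""
            x_min, y_min, x_max, y_max = bbox
            return (x_min >= margin and y_min >= margin and
                    x_max <= image_width - margin and y_max <= image_height - margin)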
  • When the thickness measurement unit 15 detects that the entire fish body 5 to be measured is within the detection area K, it refers to the sensor image captured by the sensor unit 28 at the time of the detection. The thickness measuring unit 15 then compares the z components of the spatial coordinates of the pixels in the referenced sensor image with one another and detects the pixel whose height position above the belt conveyor 3 is highest, that is, the pixel with the largest z component (hereinafter also referred to as the vertex pixel). It is assumed that the z component of the spatial coordinates representing the position of the belt conveyor 3 in the sensor image has been acquired in advance.
  • Further, the thickness measurement unit 15 calculates the average value of the z components of the spatial coordinates of the pixels within a set range including the detected vertex pixel (for example, the pixels within a 5 pixel × 5 pixel rectangular range centered on the vertex pixel). The thickness measuring unit 15 then calculates the difference between this average z component and the z component of the spatial coordinates representing the position of the belt conveyor 3 as the thickness of the fish body 5 to be measured. Alternatively, instead of averaging the z components of the pixels within the set range including the vertex pixel, the thickness measuring unit 15 may calculate the difference between the z component of the vertex pixel itself and the z component corresponding to the position of the belt conveyor 3 as the thickness of the fish body 5 to be measured.
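  • The thickness calculation just described can be summarized in code as follows. This is a minimal sketch assuming the z components of the spatial coordinates from the sensor image are available as a 2D array in which the fish body is the highest object (in practice, pixels outside the fish would be masked out first), and that the conveyor z value is known in advance, as stated above.

        import numpy as np

        def measure_thickness(z_map, conveyor_z, window=5):
            """Estimate the fish thickness (body width) from a per-pixel z map.
            The vertex pixel with the largest z is found, the z values in a
            window x window neighbourhood around it are averaged, and the z value
            of the belt conveyor surface is subtracted."""
            iy, ix = np.unravel_index(np.argmax(z_map), z_map.shape)   # vertex pixel
            half = window // 2
            y0, y1 = max(iy - half, 0), min(iy + half + 1, z_map.shape[0])
            x0, x1 = max(ix - half, 0), min(ix + half + 1, z_map.shape[1])
            mean_z = float(z_map[y0:y1, x0:x1].mean())                 # average over the set range
            return mean_z - conveyor_z                                  # thickness above the reference plane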
  • When the thickness measurement unit 15 detects that the entire fish body 5 to be measured is within the detection area K, the length measurement unit 16 has a function of calculating the fork length of the fish body 5 to be measured using the color image from the imaging unit 27 and the sensor image from the sensor unit 28.
  • the fork length here indicates the length L from the mouth M of the fish body 5 to the forked part T of the tail as shown in FIG.
  • This fork length is the length measurement portion of the fish body 5, and for measuring the fork length, the mouth tip M of the fish body 5 and the forked portion T of the tail are detected as the two end points of the length measurement.
  • the length measuring unit 16 detects, for example, the mouth M of the fish 5 used for length measurement and the forked part T of the tail from the color image obtained by the photographing unit 27.
  • In this detection processing, for example, reference data for detecting measurement points stored in advance in the storage device 20 is referred to, and the mouth tip M of the fish body 5 and the forked portion T of the tail are detected from the color image in which the entire fish body 5 to be measured was detected within the detection area K.
  • The reference data for detecting measurement points is generated, for example, by machine learning using, as teacher data, image data of entire fish bodies annotated with the information of the mouth tip M and the forked portion T of the tail as shown in FIG. 3.
  • After detecting the mouth tip M of the fish body 5 and the forked portion T of the tail, the length measuring unit 16 extracts the spatial coordinate information corresponding to the detected mouth tip M and forked portion T from the sensor image obtained by the sensor unit 28. The length measuring unit 16 then calculates the length between the mouth tip M and the forked portion T of the tail as the fork length L, using the spatial coordinates of those two points.
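  • Given the spatial coordinates of the mouth tip M and the forked portion T extracted from the sensor image, the fork length L can then be obtained as the distance between the two points. The sketch below uses the straight-line Euclidean distance, which is one natural reading of the description above; the numeric coordinates in the usage example are illustrative only.

        import numpy as np

        def fork_length(mouth_xyz, tail_fork_xyz):
            """Compute the fork length L as the Euclidean distance between the spatial
            coordinates of the mouth tip M and the forked part T of the tail."""
            return float(np.linalg.norm(np.asarray(mouth_xyz) - np.asarray(tail_fork_xyz)))

        # example usage with coordinates in metres (illustrative values only)
        L = fork_length((0.10, 0.02, 0.05), (0.52, 0.04, 0.06))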
  • The calculated values of the thickness (body width) and the fork length of the measurement target fish body 5, obtained by the calculation unit 12 including the thickness measurement unit 15 and the length measurement unit 16, are stored in, for example, the storage device 20. At this time, the stored values of the thickness (body width) and the fork length are stored in the storage device 20 in association with predetermined information such as the frame numbers of the color image and the sensor image on which the calculation is based and the calculation date and time.
  • The calculated thickness (body width) and fork length of the fish body 5 to be measured are also output to the display control unit 13. Under the control of the display control unit 13, the calculated values of the thickness (body width) and the fork length are displayed on the display device 25 together with, for example, the color image of the fish body 5 on which the calculation is based. The size information such as the thickness and the fork length of the fish body 5 displayed on the display device 25 is used, for example, for sorting the fish bodies 5 to be shipped by size.
  • In the measurement operation shown in FIG. 4, the receiving unit 11 in the control device 10 of the measuring device 8 successively receives the color image from the imaging unit 27 and the sensor image from the sensor unit 28 of the camera 7 (step S101).
  • The thickness measurement unit 15 of the calculation unit 12 determines, using the received color image, whether or not the entire fish body 5 to be measured is within the detection area K (step S102). When it determines that the entire fish body is not within the detection area, the thickness measurement unit 15 repeats the same determination using the color images of subsequent frames. When it determines that the entire fish body 5 to be measured is within the detection area K, the thickness measuring unit 15 refers to the sensor image from the sensor unit 28 captured at the time of the detection and detects the vertex pixel (step S103). Further, the thickness measurement unit 15 calculates the thickness of the fish body 5 to be measured in the detection area K using the spatial coordinates of the vertex pixel (step S104).
  • The length measurement unit 16 detects the mouth tip M of the fish body 5 and the forked portion T of the tail, which are the measurement points used for length measurement, from the color image obtained by the imaging unit 27 (step S105). The length measuring unit 16 then extracts the spatial coordinates of the detected mouth tip M and forked portion T from the sensor image, and calculates the fork length of the fish body 5 to be measured using the extracted spatial coordinates (step S106).
  • The control device 10 determines whether or not a predetermined measurement end condition (for example, the condition that a measurement end instruction has been received through the measurer operating the input device 24) has been satisfied (step S107). When the condition is not satisfied, the operations from step S101 onward are repeated. When the measurement end condition is satisfied, the control device 10 ends the measurement operation.
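  • The overall flow of steps S101 to S107 can be outlined as follows. The object and method names (camera.receive, whole_fish_in_area, and so on) are hypothetical placeholders that mirror the functional units described above; they are not names used in this document.

        def measurement_loop(camera, thickness_unit, length_unit, end_condition):
            """Outline of the measurement operation of FIG. 4 (steps S101 to S107)."""
            results = []
            while not end_condition():                                      # S107: end condition satisfied?
                color, sensor = camera.receive()                            # S101: receive colour and sensor images
                if not thickness_unit.whole_fish_in_area(color):            # S102: entire fish in detection area K?
                    continue                                                # not yet: examine later frames
                vertex = thickness_unit.detect_vertex_pixel(sensor)         # S103: detect the vertex pixel
                thickness = thickness_unit.calc_thickness(vertex, sensor)   # S104: thickness (body width)
                mouth, tail = length_unit.detect_measurement_points(color)  # S105: mouth tip M and tail fork T
                fork_len = length_unit.calc_fork_length(mouth, tail, sensor)  # S106: fork length
                results.append((thickness, fork_len))
            return results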
  • As described above, the measurement system 1 and the measurement device 8 of the first embodiment include a configuration that detects the portion of the fish body 5 being conveyed by the belt conveyor 3 whose height position above the belt conveyor 3 is highest. Because the measurement system 1 and the measurement device 8 calculate the thickness (body width) of the fish body 5 using the detected portion, the certainty of the calculated value of the thickness (body width) can be increased even though the fish body 5 has a bulging, non-flat surface.
  • Further, when the measurement system 1 and the measurement device 8 of the first embodiment detect, using the color image from the photographing unit 27, that the entire fish body 5 is within the detection area K, they calculate the thickness (body width) and fork length of the detected fish body 5. Therefore, for example, when no fish body 5 is entirely within the detection area K and the tail portion of one fish body 5 and the mouth portion of a different fish body 5 are in the detection area K, the problem of the fork length being calculated from the forked tail portion T and the mouth tip M of different fish bodies 5 is prevented. This also contributes to increasing the certainty of the calculated value of the fork length of the fish body 5 obtained by the measurement system 1 and the measurement device 8.
  • Furthermore, the measurement system 1 and the measurement device 8 of the first embodiment calculate the thickness (body width) and fork length of the fish body 5 using both the color image obtained by the imaging unit 27 and the sensor image obtained by the sensor unit 28.
  • the sensor image obtained by the sensor unit 28 includes spatial coordinate information, but is an image in which the clarity of the contour of the object is poor.
  • the color image obtained by the photographing unit 27 does not include information on spatial coordinates, but the image of the contour of the object is clear.
  • the measurement system 1 and the measurement device 8 can reduce the labor and time required for a person to measure the size of the fish 5.
  • In the first embodiment, the thickness measurement unit 15 detects that the entire fish body 5 has entered the detection area K; alternatively, the length measurement unit 16 may detect that the entire fish body 5 has entered the detection area K.
  • In the first embodiment, the fork length of the fish body 5 is calculated by the length measuring unit 16 after the thickness of the fish body 5 is calculated by the thickness measuring unit 15. Instead, the thickness of the fish body 5 may be calculated by the thickness measuring unit 15 after the fork length is calculated by the length measuring unit 16, or the calculation of the thickness by the thickness measurement unit 15 and the calculation of the fork length by the length measurement unit 16 may be performed in parallel.
  • the calculating unit 12 may include a weight calculating unit 17 as shown in FIG. 5 in addition to the thickness measuring unit 15 and the length measuring unit 16.
  • data for calculating the weight is stored in the storage device 20 in advance.
  • The data for calculating the body weight is, for example, data for calculating the weight of the fish body 5 from its thickness (body width) and fork length, and is generated using actually measured values of the thickness, fork length, and body weight of fish bodies.
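  • The document does not specify the form of the weight-calculation data, so the following sketch shows one simple possibility: fitting a linear model to actually measured thickness, fork length, and weight samples and then evaluating it for a new fish. The linear form and the function names are assumptions for illustration.

        import numpy as np

        def fit_weight_model(thickness, fork_length, weight):
            """Fit a simple linear model  weight ~ a*thickness + b*fork_length + c
            from actually measured samples (arrays of equal length)."""
            X = np.column_stack([thickness, fork_length, np.ones(len(weight))])
            coeffs, *_ = np.linalg.lstsq(X, np.asarray(weight), rcond=None)
            return coeffs                          # (a, b, c)

        def estimate_weight(coeffs, thickness, fork_length):
            """Estimate the weight of a new fish from its thickness and fork length."""
            a, b, c = coeffs
            return a * thickness + b * fork_length + c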
  • the length measurement unit 16 of the calculation unit 12 may measure the body height of the fish body 5 by the same process as the process of measuring the fork length in addition to the fork length of the fish body 5.
  • the body height is the length between the most bulging portion on the dorsal side (for example, the root portion on the head side of the dorsal fin) and the most bulging portion on the ventral side in the fish body 5.
  • In measuring the body height, the most bulging portion on the dorsal side and the most bulging portion on the ventral side of the fish body 5 are detected by the length measuring unit 16 as the two end points (measurement points) of the length measurement.
  • In this case, the reference data for detecting measurement points stored in the storage device 20 is, for example, data for detecting each measurement point used for measuring the fork length and the body height of the fish body 5.
  • That is, the reference data for detecting measurement points is generated by machine learning using, as teacher data, image data of entire fish bodies annotated not only with the mouth tip M and the forked portion T of the tail used for measuring the fork length, but also with the most bulging portion on the dorsal side and the most bulging portion on the ventral side used for measuring the body height.
  • By using such reference data, the length measurement unit 16 can detect, in the color image obtained by the imaging unit 27, both the two measurement points used for measuring the fork length and the two measurement points used for measuring the body height of the same fish body 5.
  • In this case, the weight calculating unit 17 may calculate the weight taking into account not only the thickness (body width) and fork length of the fish body 5 but also the body height. Furthermore, the weight calculating unit 17 may calculate the weight of the fish body 5 using the fork length and body height without using the thickness of the fish body 5.
  • the example has been described in which the sensor unit 28 acquires information related to the distance from the camera 7 to the object by the TOF method using infrared rays.
  • the sensor unit 28 may employ a three-dimensional measurement technique such as a pattern irradiation method or LIDAR (Laser Imaging Detection and Ranging) instead of the TOF method.
  • In the second embodiment, the measurement system 1 includes a display device 31 and a communication unit 32 in addition to the configuration of the first embodiment, and the measurement device 8 further includes a communication unit 18.
  • The display device 31 and the communication unit 32 are provided in a personal computer which is the information processing device 30.
  • the information processing device 30 is disposed, for example, near the belt conveyor 3 (in other words, near the measurer).
  • the communication unit 32 has a function of relaying the color image and the sensor image output from the camera 7 to the measuring device 8 through the information communication network 34.
  • the communication unit 32 has a function of receiving information transmitted from the measuring device 8 via the information communication network 34.
  • the communication unit 18 of the measuring device 8 has a function of receiving the color image and the sensor image from the camera 7 via the information communication network 34 and providing the color image and the sensor image to the receiving unit 11.
  • the measurement device 8 calculates the thickness and the fork length of the fish body 5 using the color image and the sensor image of the camera 7 received via the communication unit 18 as in the first embodiment.
  • the communication unit 18 further has a function of outputting the calculated values of the calculated thickness and fork length of the fish body 5 to the information processing device 30 through the information communication network 34.
  • the display device 31 has a function of displaying the calculated values of the thickness and the fork length of the fish body 5 received by the communication unit 32.
  • Since the measurement system 1 and the measurement device 8 of the second embodiment have a configuration for calculating the thickness and fork length of the fish body 5 as in the first embodiment, the same effect as in the first embodiment, namely increasing the certainty of the calculated values relating to the size of the fish body 5, can be obtained.
  • In the second embodiment, the measuring device 8 may be realized by, for example, a server, and the measuring device 8 may be connected via the information communication network 34 to a plurality of information processing devices 30 and, through those information processing devices 30, to the cameras 7.
  • the present invention is not limited to the first and second embodiments, but can adopt various embodiments.
  • For example, in the first and second embodiments, whether or not the entire fish body 5 has entered the detection area K is determined using the color image obtained by the imaging unit 27. Instead, the sensor image obtained by the sensor unit 28 may be used to determine whether or not the entire fish body 5 is within the detection area K. In this case, for example, the z component of the spatial coordinates at a preset height measurement position is acquired moment by moment, the change tendency of the z component is obtained, and whether or not the entire fish body 5 is within the detection area K is determined using this tendency.
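  • As a rough sketch of the alternative just described, the z component at a preset measurement position could be sampled moment by moment and its change tendency used to judge that a fish body has passed completely under that position. The interpretation below (a rise above the conveyor level followed by a return to it) is only one possible reading, and the threshold value is an assumption.

        def fish_passed_measurement_position(z_series, conveyor_z, threshold=0.01):
            """Very rough sketch: z values at a preset measurement position are sampled
            moment by moment; a rise above the conveyor level followed by a return to
            it is taken to mean that a fish body has passed completely under the position."""
            above = [z - conveyor_z > threshold for z in z_series]
            rose = False
            for a in above:
                if a:
                    rose = True
                elif rose:
                    return True        # z went up and came back down: the body has passed
            return False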
  • the object to be measured is a fish.
  • the present invention can be applied to at least the calculation of the thickness of an object other than a fish.
  • the fish 5 to be measured is placed on the belt conveyor 3, and the surface of the belt conveyor 3 is a reference surface.
  • the object to be measured may be placed on a reference surface other than the belt conveyor 3.
  • the reference plane may be a fixed plane.
  • the measurement system 1 may be provided with a mechanism for adjusting the height position of the camera 7 with respect to the belt conveyor 3.
  • FIG. 7 is a simplified block diagram showing a configuration of a measuring apparatus according to another embodiment of the present invention.
  • The measurement device 40 illustrated in FIG. 7 forms a measurement system 45 together with the detection device 46 illustrated in FIG. 8.
  • the detection device 46 has a function of outputting information relating to spatial coordinates in the detection area.
  • the measuring device 40 includes a receiving unit 41 and a calculating unit 42, as shown in FIG.
  • the receiving unit 41 has a function of receiving information output from the detecting device 46 that is disposed at a position facing the reference plane and outputs information relating to spatial coordinates in a detection area including the reference plane.
  • The calculating unit 42 has a function of detecting, using the information relating to the spatial coordinates on the surface of the object placed on the reference plane within the detection area, the portion of the object having the largest thickness in the direction from the reference plane toward the detection device 46, and calculating the thickness of the detected portion as the thickness of the object.
  • the measuring device 40 and the measuring system 45 having the above-described configurations can also obtain the same effects as those of the first and second embodiments.
  • the present invention has been described above using the above-described embodiment as a typical example. However, the invention is not limited to the embodiments described above. That is, the present invention can apply various aspects that can be understood by those skilled in the art within the scope of the present invention.
  • This application claims priority based on Japanese Patent Application No. 2018-138548 filed on July 24, 2018, the disclosure of which is incorporated herein in its entirety.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

With a view to providing technology with which it is possible to increase the likelihood of accuracy of measurement of at least the thickness of an object even when the object has an uneven surface, this measurement device 40 is provided with a receiving unit 41 and a calculation unit 42. The receiving unit 41 receives information outputted from a detection device which is disposed at a position facing a reference surface and which is for outputting information relating to space coordinates within a detection region that encompasses the reference surface. The calculation unit 42, by utilizing the information relating to the space coordinates on the surface of an object placed on the reference surface within the detection region, detects a portion of the object that has the greatest thickness in a direction approaching the detection device from the reference surface, and then performs calculation to determine the thickness of the detected portion as the thickness of the object.

Description

計測装置、計測システム、計測方法およびプログラム記憶媒体Measuring device, measuring system, measuring method and program storage medium
 本発明は、物体を計測する技術に関する。 The present invention relates to a technology for measuring an object.
 水揚げされた魚を出荷する前に、販売価格等に関わる魚のサイズを測定する作業が行われる。この魚のサイズを測定する作業は、例えば、ベルトコンベアで魚を搬送している途中において、人により物差しを利用して行われる。 作業 Before shipping the landed fish, work to measure the size of the fish in relation to the selling price etc. is performed. The work of measuring the size of the fish is performed by a person using a ruler, for example, while the fish is being conveyed on a belt conveyor.
 特許文献1には、移動レーンに搭載されている通過物体を、移動レーンの上方側に設けられた複数の撮影装置によって撮影し、撮影された通過物体の撮影画像を利用して通過物体の高さを算出する構成が開示されている。 Patent Document 1 discloses a configuration in which a passing object carried on a moving lane is photographed by a plurality of photographing devices provided above the moving lane, and the height of the passing object is calculated using the photographed images of the passing object.
 特許文献2には、コンベアに載せられている魚体に近赤外光と白色光を照射し、近赤外光による魚体の単色画像と、白色光による魚体のカラー画像とを利用して、魚体の種類を判別する構成が開示されている。 Patent Document 2 discloses a configuration in which a fish body placed on a conveyor is irradiated with near-infrared light and white light, and the type of the fish body is discriminated using a monochromatic image of the fish body obtained with the near-infrared light and a color image of the fish body obtained with the white light.
特開2004-061382号公報JP 2004-061382 A 国際公開第2016/092646号WO 2016/092646
 人が魚のサイズを手作業により測定する場合には、人の手間が掛かる上に、測定した人によって魚の測定値がばらつく虞がある。 (4) When a person manually measures the size of a fish, it takes time and effort for the person, and the measured value of the fish may vary depending on the person who measured the size.
 また、特許文献1では、高さを測定する対象の物体は直方体状であり、上面が平坦であることを想定している。これにより、物体の上面のエッジが検知され、当該エッジの高さが通過物体の高さとして検知されている。このため、例えば、仮に、特許文献1における通過物体を魚に置き換え特許文献1における手法を利用して魚の高さ(厚み)を計測しようとすると、魚の輪郭部分の高さが魚の厚みとして検知される。 In Patent Document 1, it is assumed that the object whose height is to be measured has a rectangular parallelepiped shape and a flat upper surface. The edge of the upper surface of the object is detected, and the height of that edge is detected as the height of the passing object. Therefore, if, for example, the passing object in Patent Document 1 were replaced with a fish and the height (thickness) of the fish were measured using the technique of Patent Document 1, the height of the outline portion of the fish would be detected as the thickness of the fish.
 しかしながら、魚の厚みとは、最も膨らんでいる中央部分の厚みであり、コンベアからの輪郭部分の高さではないので、特許文献1の手法により計測される魚の厚みは、実際の魚の厚みからずれた値となる。つまり、特許文献1の手法では、販売価格等に関わる魚のサイズである厚みを精度良く検知することは難しい。 However, the thickness of a fish is the thickness of its most inflated central portion, not the height of its contour portion above the conveyor, so the fish thickness measured by the method of Patent Document 1 deviates from the actual thickness of the fish. In other words, it is difficult to accurately detect the thickness, which is the size of the fish related to the selling price and the like, with the method of Patent Document 1.
 本発明は上記課題を解決するために考え出された。すなわち、本発明の主な目的は、物体の表面が平坦でなくとも、物体の少なくとも厚みの計測値の確からしさを高めることができる技術を提供することにある。 The present invention has been devised to solve the above problems. That is, a main object of the present invention is to provide a technique capable of increasing the certainty of at least the measured value of the thickness of an object even when the surface of the object is not flat.
 上記目的を達成するために、本発明に係る計測装置の一態様は、
 基準面に向き合う位置に配設され前記基準面を含む検知領域内の空間座標に関わる情報を出力する検知装置から出力された前記情報を受信する受信部と、
 前記基準面に裁置された前記検知領域内の物体の表面における前記空間座標に関わる情報を利用して、前記物体において前記基準面から前記検知装置に向かう方向の厚みが最も厚い部分を検知し、当該検知した部分の厚みを前記物体の厚みとして算出する算出部とを備える。
In order to achieve the above object, one embodiment of the measuring device according to the present invention includes:
a receiving unit that receives information output from a detection device that is arranged at a position facing a reference plane and outputs information related to spatial coordinates in a detection area including the reference plane; and
a calculating unit that, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, detects the portion of the object that is thickest in the direction from the reference plane toward the detection device, and calculates the thickness of the detected portion as the thickness of the object.
 本発明に係る計測システムの一態様は、
 検知領域内の空間座標に関わる情報を出力する検知装置と、
 基準面に向き合う位置に配設され前記基準面を含む検知領域内の空間座標に関わる情報を出力する検知装置から出力された前記情報を受信する受信部と、前記基準面に裁置された前記検知領域内の物体の表面における前記空間座標に関わる情報を利用して、前記物体において前記基準面から前記検知装置に向かう方向の厚みが最も厚い部分を検知し、当該検知した部分の厚みを前記物体の厚みとして算出する算出部とを備える計測装置とを含む。
One aspect of the measurement system according to the present invention includes:
a detection device that outputs information related to spatial coordinates in a detection area; and
a measuring device including a receiving unit that receives the information output from the detection device, which is arranged at a position facing a reference plane and outputs information related to spatial coordinates in a detection area including the reference plane, and a calculating unit that, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, detects the portion of the object that is thickest in the direction from the reference plane toward the detection device and calculates the thickness of the detected portion as the thickness of the object.
 本発明に係る計測方法の一態様は、
 基準面に向き合う位置に配設され前記基準面を含む検知領域内の空間座標に関わる情報を出力する検知装置から出力された前記情報を受信し、
 前記基準面に裁置された前記検知領域内の物体の表面における前記空間座標に関わる情報を利用して、前記物体において前記基準面から前記検知装置に向かう方向の厚みが最も厚い部分を検知し、
 検知した部分の厚みを前記物体の厚みとして算出する。
One aspect of the measurement method according to the present invention includes:
receiving information output from a detection device that is arranged at a position facing a reference plane and outputs information related to spatial coordinates in a detection area including the reference plane;
using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, detecting the portion of the object that is thickest in the direction from the reference plane toward the detection device; and
calculating the thickness of the detected portion as the thickness of the object.
 本発明に係るプログラム記憶媒体の一態様は、
 基準面に向き合う位置に配設され前記基準面を含む検知領域内の空間座標に関わる情報を出力する検知装置から出力された前記情報を受信する処理と、
 前記基準面に裁置された前記検知領域内の物体の表面における前記空間座標に関わる情報を利用して、前記物体において前記基準面から前記検知装置に向かう方向の厚みが最も厚い部分を検知する処理と、
 検知した部分の厚みを前記物体の厚みとして算出する処理とをコンピュータに実行させるコンピュータプログラムを記憶する。
One embodiment of the program storage medium according to the present invention stores a computer program that causes a computer to execute:
a process of receiving information output from a detection device that is arranged at a position facing a reference plane and outputs information related to spatial coordinates in a detection area including the reference plane;
a process of detecting, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, the portion of the object that is thickest in the direction from the reference plane toward the detection device; and
a process of calculating the thickness of the detected portion as the thickness of the object.
 本発明によれば、物体の表面が平坦でなくとも、物体の少なくとも厚みの計測値の確からしさを高めることができる技術を提供できる。 According to the present invention, it is possible to provide a technique capable of increasing the certainty of at least the measured value of the thickness of an object even when the surface of the object is not flat.
本発明に係る第1実施形態の計測システムの構成を模式的に表すモデル図である。FIG. 1 is a model diagram schematically illustrating a configuration of a measurement system according to a first embodiment of the present invention. 本発明に係る第1実施形態の計測システムの構成を簡略化して表すブロック図である。FIG. 1 is a block diagram illustrating a simplified configuration of a measurement system according to a first embodiment of the present invention. 魚体の尾叉長を説明する図である。It is a figure explaining the fork length of a fish. 第1実施形態の計測装置における算出動作の一例を説明するフローチャートである。5 is a flowchart illustrating an example of a calculation operation in the measurement device according to the first embodiment. 第1実施形態の計測装置における変形例を説明するブロック図である。It is a block diagram explaining a modification in the measuring device of a 1st embodiment. 本発明に係る第2実施形態の計測システムの構成を説明する図である。It is a figure explaining composition of a measuring system of a 2nd embodiment concerning the present invention. 本発明に係るその他の実施形態の計測装置の構成を簡略化して表すブロック図である。It is a block diagram which represents the structure of the measuring device of other embodiment which concerns on this invention in a simplified form. 本発明に係るその他の実施形態の計測システムの構成を簡略化して表すブロック図である。FIG. 9 is a block diagram illustrating a simplified configuration of a measurement system according to another embodiment of the present invention.
 以下に、本発明に係る実施形態を図面を参照しつつ説明する。 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
 <第1実施形態>
 図1には、本発明に係る第1実施形態の計測装置を含む計測システムの構成が簡略化して表されている。第1実施形態の計測システム1は、搬送装置であるベルトコンベア3により搬送されている魚体5のサイズを計測するシステムであり、カメラ7と、計測装置8とを備えている。図2は、カメラ7と計測装置8の構成を簡略化して表すブロック図である。
<First embodiment>
FIG. 1 shows a simplified configuration of a measurement system including a measurement device according to a first embodiment of the present invention. The measurement system 1 according to the first embodiment is a system that measures the size of a fish body 5 being conveyed by a belt conveyor 3 that is a conveyance device, and includes a camera 7 and a measurement device 8. FIG. 2 is a simplified block diagram showing the configurations of the camera 7 and the measuring device 8.
 カメラ7は、検知装置として機能する深度センサ付きカメラであり、ベルトコンベア3の上方側に配設される。このカメラ7は、図2に表されるように、撮影部27と、センサ部28とを備えている。撮影部27は、図1の点線Kにより表されるような検知領域内(撮影範囲)の物体を撮影して可視光の撮影画像(以下、カラー画像とも記す)を出力する機能を備えている。ここでは、検知領域Kの大きさは、計測対象の魚体5の大きさに応じた大きさとする。 The camera 7 is a camera with a depth sensor that functions as a detection device, and is disposed above the belt conveyor 3. As shown in FIG. 2, the camera 7 includes a photographing unit 27 and a sensor unit 28. The photographing unit 27 has a function of photographing an object in a detection area (imaging range) as indicated by a dotted line K in FIG. 1 and outputting a photographed image of visible light (hereinafter also referred to as a color image). . Here, the size of the detection region K is set to a size corresponding to the size of the fish 5 to be measured.
 センサ部28は、検知領域Kにおける物体に関し、カメラ7からの奥行き方向の距離(深度)に関わる情報を出力する機能を備えている。例えば、センサ部28は、赤外線を利用したTOF(Time of Flight)方式によってカメラ7から物体までの距離に関わる情報を画素毎に算出する機能を備えている。さらに、センサ部28は、例えば画素毎に当該画素に対応する物体部分までの距離(深度)に関わる情報が含まれている二次元画像(以下、センサ画像とも記す)を出力する機能を備えている。換言すれば、センサ部28から出力されるセンサ画像は、画素毎に当該画素に対応する物体部分の空間座標に関わる情報を含んでいる。ここでは、空間座標を構成するx成分、y成分、z成分は、Z軸がベルトコンベア3からカメラ7に向かう方向に沿うように定められた直交座標系に基づいて表されることとする。 The sensor unit 28 has a function of outputting information relating to the distance (depth) in the depth direction from the camera 7 with respect to the object in the detection area K. For example, the sensor unit 28 has a function of calculating information relating to the distance from the camera 7 to the object for each pixel by a TOF (Time @ of @ Flight) method using infrared rays. Further, the sensor unit 28 has a function of outputting, for example, a two-dimensional image (hereinafter, also referred to as a sensor image) including information on a distance (depth) to an object portion corresponding to the pixel for each pixel. I have. In other words, the sensor image output from the sensor unit 28 includes, for each pixel, information relating to the spatial coordinates of the object portion corresponding to the pixel. Here, it is assumed that the x component, the y component, and the z component that constitute the spatial coordinates are represented based on an orthogonal coordinate system defined so that the Z axis runs along the direction from the belt conveyor 3 to the camera 7.
 このような構成を備えるカメラ7は計測装置8に有線通信あるいは無線通信により接続されている。また、ベルトコンベア3(換言すれば、基準面)に対するカメラ7の高さ位置は、計測対象の魚体5の想定される大きさを考慮し、計測対象の魚体5の全体がカメラ7の検知領域Kに入るように、適宜設定される。 The camera 7 having such a configuration is connected to the measuring device 8 by wire communication or wireless communication. Further, the height position of the camera 7 with respect to the belt conveyor 3 (in other words, the reference plane) is determined in consideration of the assumed size of the fish body 5 to be measured, and the entire fish body 5 to be measured is detected by the camera 7. It is set appropriately so as to enter K.
 なお、カメラ7から出力される撮影部27によるカラー画像とセンサ部28によるセンサ画像には、撮影時点を表す情報が含まれており、計測装置8において、撮影部27によるカラー画像と、センサ部28によるセンサ画像との同期合わせが可能となっている。また、ここでは、撮影部27によるカラー画像を構成する各画素に対応する物体部分の深度の情報をセンサ部28によるセンサ画像から取得できるように、カメラ7から出力されるカラー画像とセンサ画像との位置関係の情報が計測装置8に与えられている。 The color image output from the camera 7 by the image capturing unit 27 and the sensor image output by the sensor unit 28 include information indicating the time point at which the image was captured. 28 allows synchronization with the sensor image. Here, the color image output from the camera 7 and the sensor image are output so that the depth information of the object portion corresponding to each pixel constituting the color image by the imaging unit 27 can be obtained from the sensor image by the sensor unit 28. Is provided to the measuring device 8.
 計測装置8は、例えばパーソナルコンピュータであり、図2に表されるように、制御装置10と記憶装置20を備えている。また、計測装置8は、例えば計測者の操作により情報を計測装置8に入力する入力装置(例えば、キーボードやマウス)24と、情報を表示する表示装置25とに接続されている。さらに、計測装置8は、当該計測装置8とは別体の外付けの記憶装置21に接続されていてもよい。 The measuring device 8 is, for example, a personal computer, and includes a control device 10 and a storage device 20, as shown in FIG. The measuring device 8 is connected to, for example, an input device (for example, a keyboard or a mouse) 24 for inputting information to the measuring device 8 by a measurer's operation, and a display device 25 for displaying information. Furthermore, the measuring device 8 may be connected to an external storage device 21 separate from the measuring device 8.
 記憶装置20は、各種データやコンピュータプログラム(以下、プログラムとも記す)を記憶する機能を有し、例えば、ハードディスク装置や半導体メモリ等の記憶媒体により実現される。計測装置8に備えられる記憶装置は一つに限定されず、複数種の記憶装置が計測装置8に備えられていてもよく、この場合には、複数の記憶装置を総称して記憶装置20と記す。また、記憶装置21も、記憶装置20と同様に、各種データやコンピュータプログラムを記憶する機能を有し、例えば、ハードディスク装置や半導体メモリ等の記憶媒体により実現される。なお、計測装置8が記憶装置21に接続されている場合には、記憶装置21には適宜な情報が格納される。また、この場合には、計測装置8は、適宜、記憶装置21に情報を書き込む処理および読み出す処理を実行するが、以下の説明では、記憶装置21に関する説明を省略する。 The storage device 20 has a function of storing various data and computer programs (hereinafter, also referred to as programs), and is realized by a storage medium such as a hard disk device or a semiconductor memory. The number of storage devices provided in the measurement device 8 is not limited to one, and a plurality of types of storage devices may be provided in the measurement device 8. In this case, the plurality of storage devices are collectively referred to as the storage device 20. Write. The storage device 21 also has a function of storing various data and computer programs, like the storage device 20, and is realized by a storage medium such as a hard disk device or a semiconductor memory. When the measuring device 8 is connected to the storage device 21, the storage device 21 stores appropriate information. In this case, the measuring device 8 executes a process of writing and reading information to and from the storage device 21 as appropriate, but a description of the storage device 21 is omitted in the following description.
 制御装置10は、例えば、CPU(Central Processing Unit)やGPU(Graphics Processing Unit)などのプロセッサにより構成される。制御装置10は、例えばCPUが記憶装置20に格納されているコンピュータプログラムを実行することにより、次のような機能を有することができる。すなわち、制御装置10は、機能部として、受信部11と、算出部12と、表示制御部13とを備えている。 The control device 10 includes, for example, a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). The control device 10 can have the following functions, for example, when the CPU executes a computer program stored in the storage device 20. That is, the control device 10 includes a receiving unit 11, a calculating unit 12, and a display control unit 13 as functional units.
 受信部11は、カメラ7から出力された情報(つまり、撮影部27によるカラー画像や、センサ部28によるセンサ画像の情報)を受信し、受信した情報(受信情報)を表示制御部13に出力する機能を備えている。また、受信部11は、受信情報を記憶装置20に格納する機能を備えている。 The receiving unit 11 receives information output from the camera 7 (that is, information of a color image obtained by the image capturing unit 27 and information of a sensor image obtained by the sensor unit 28), and outputs the received information (received information) to the display control unit 13. It has the function to do. Further, the receiving unit 11 has a function of storing the received information in the storage device 20.
 表示制御部13は、表示装置25の表示動作を制御する機能を備えている。例えば、表示制御部13は、カメラ7から受信部11を介して受け取った受信情報に基づいた画像を例えばリアルタイムで表示装置25に表示させる。なお、受信情報に基づいた画像を表示装置25に表示させる表示形態は、適宜に設定された表示形態を採り得る。例えば、撮影部27によるカラー画像を表示する表示モードと、センサ部28によるセンサ画像を表示する表示モードと、カラー画像とセンサ画像を並べて表示する表示モードとが予め設定されているとする。表示制御部13は、計測者による入力装置24の操作情報に基づいて、そのような複数の表示モードの中から計測者により選択された表示モードを検知し、検知した表示モードでもって表示装置25が動作するように表示装置25の表示動作を制御する。また、表示制御部13は、計測者による入力装置24の操作情報に基づいて、記憶装置20からカメラ7による過去のカラー画像とセンサ画像の一方又は両方の情報を読み出し、読み出した画像を表示装置25に表示させる制御を行ってもよい。 The display control unit 13 has a function of controlling the display operation of the display device 25. For example, the display control unit 13 causes the display device 25 to display an image based on the reception information received from the camera 7 via the reception unit 11, for example, in real time. Note that a display mode for displaying an image based on the received information on the display device 25 may be a display mode that is appropriately set. For example, it is assumed that a display mode for displaying a color image by the imaging unit 27, a display mode for displaying a sensor image by the sensor unit 28, and a display mode for displaying a color image and a sensor image side by side are set in advance. The display control unit 13 detects a display mode selected by the measurer from among the plurality of display modes based on operation information of the input device 24 by the measurer, and displays the display device 25 based on the detected display mode. The display operation of the display device 25 is controlled so that the. The display control unit 13 reads out one or both of the past color image and the sensor image by the camera 7 from the storage device 20 based on the operation information of the input device 24 by the measurer, and displays the read image on the display device. Control for displaying the information on the display 25 may be performed.
 算出部12は、厚み計測部15と、長さ計測部16とを備えている。厚み計測部15は、カメラ7から出力された画像の情報を利用してベルトコンベア3にて搬送されている魚体5の厚み(体幅)を次のように計測する機能を備えている。 The calculation unit 12 includes a thickness measurement unit 15 and a length measurement unit 16. The thickness measuring unit 15 has a function of measuring the thickness (body width) of the fish body 5 being conveyed by the belt conveyor 3 using the information of the image output from the camera 7 as follows.
 例えば、厚み計測部15は、まず、撮影部27によるカラー画像を利用して検知領域Kに計測対象の魚体5の全体が入っているか否かを判断する。この判断処理の一例を挙げると、機械学習により生成された魚体検知用の参考データを記憶装置20に格納しておき、当該魚体検知用の参考データを利用して、撮影部27によるカラー画像の検知領域Kに計測対象の魚体5の全体が映っているか否かが判断される。その機械学習では、計測対象の種類の魚における魚体の多数の画像を教師データとして計測対象の種類の魚体が学習される。教師データには、ベルトコンベア3の進行方向に対して様々に傾いている魚体のデータも含まれる。このような教師データにより学習された魚体検知用の参考データを利用することにより、厚み計測部15は、ベルトコンベア3の進行方向に対して傾いている魚体であっても魚体を検知できるため、傾いている魚体全体が検知領域Kに入っていることをも検知できる。つまり、ベルトコンベア3の進行方向に対して魚体が傾いている場合に、当該魚体の全体が検知領域Kに入っているのにも拘わらず、厚み計測部15がその魚体を検知できない事態を抑制できる。 {For example, first, the thickness measuring unit 15 determines whether or not the entire fish 5 to be measured is in the detection area K using the color image obtained by the photographing unit 27. As an example of this determination processing, reference data for fish detection generated by machine learning is stored in the storage device 20, and the color image of the color image by the photographing unit 27 is stored using the reference data for fish detection. It is determined whether or not the entire measurement target fish 5 is reflected in the detection area K. In the machine learning, the fish of the type to be measured is learned using a large number of images of the fish of the fish of the type to be measured as teacher data. The teacher data also includes data of fish bodies that are variously inclined with respect to the traveling direction of the belt conveyor 3. By using the reference data for detecting the fish learned by such teacher data, the thickness measuring unit 15 can detect the fish even if the fish is inclined with respect to the traveling direction of the belt conveyor 3, It can also be detected that the entire inclined fish body is in the detection area K. That is, when the fish body is inclined with respect to the traveling direction of the belt conveyor 3, it is possible to suppress a situation in which the thickness measurement unit 15 cannot detect the fish body even though the entire fish body is in the detection area K. it can.
 厚み計測部15は、検知領域Kに計測対象の魚体5の全体が入っていることを検知した場合には、当該検知した時点に撮影されたセンサ部28によるセンサ画像を参照する。そして、厚み計測部15は、参照しているセンサ画像において、各画素における空間座標のz成分を互いに比較し、ベルトコンベア3からの高さ位置が最も高いz成分を持つ画素(以下、頂点画素とも記す)を検知する。なお、センサ画像におけるベルトコンベア3の位置を表す空間座標のz成分の情報は予め取得されているとする。 When the thickness measurement unit 15 detects that the entire fish body 5 to be measured is within the detection area K, the thickness measurement unit 15 refers to the sensor image captured by the sensor unit 28 at the time of the detection. Then, the thickness measuring unit 15 compares the z component of the spatial coordinates of each pixel in the referenced sensor image with each other, and determines the pixel having the highest z component at the height position from the belt conveyor 3 (hereinafter, a vertex pixel). Is also detected). It is assumed that the information of the z component of the spatial coordinates representing the position of the belt conveyor 3 in the sensor image has been acquired in advance.
 Further, the thickness measurement unit 15 calculates the average of the z components of the spatial coordinates of the pixels within a set range including the detected vertex pixel (for example, the pixels within a rectangular range of 5 pixels × 5 pixels centered on the vertex pixel). The thickness measurement unit 15 then calculates the difference between the calculated average z component and the z component of the spatial coordinates representing the position of the belt conveyor 3 as the thickness of the fish body 5 to be measured. Alternatively, instead of averaging the z components within the set range including the vertex pixel, the thickness measurement unit 15 may calculate the difference between the z component of the vertex pixel itself and the z component corresponding to the position of the belt conveyor 3 as the thickness of the fish body 5 to be measured.
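 A minimal sketch of this thickness calculation, assuming the sensor image is available as a two-dimensional array of z values in which larger values mean a higher position above the belt conveyor, and that the conveyor's own z value has been calibrated in advance as described:

import numpy as np

def measure_thickness(z_map: np.ndarray, belt_z: float, half_window: int = 2) -> float:
    """Thickness = mean z in a (2*half_window+1)-square window around the vertex pixel, minus belt_z."""
    # Vertex pixel: the pixel whose z component is highest above the belt conveyor.
    vy, vx = np.unravel_index(np.argmax(z_map), z_map.shape)
    y0, y1 = max(vy - half_window, 0), min(vy + half_window + 1, z_map.shape[0])
    x0, x1 = max(vx - half_window, 0), min(vx + half_window + 1, z_map.shape[1])
    window_mean = float(z_map[y0:y1, x0:x1].mean())  # default half_window=2 gives the 5 x 5 range from the text
    return window_mean - belt_z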
 When the thickness measurement unit 15 detects that the entire fish body 5 to be measured is within the detection area K, the length measurement unit 16 has a function of calculating the fork length of the fish body 5 to be measured using the color image from the imaging unit 27 and the sensor image from the sensor unit 28. The fork length here is the length L from the snout M of the fish body 5 to the forked part T of the tail, as shown in FIG. 3. This fork length is the length-measurement portion of the fish body 5, and for its measurement the snout M of the fish body 5 and the forked part T of the tail are detected as the two end points of the length measurement.
 That is, the length measurement unit 16 detects, for example from the color image of the imaging unit 27, the snout M of the fish body 5 and the forked part T of the tail that are used for the length measurement. In this detection process, for example, reference data for detecting measurement-use parts, stored in advance in the storage device 20, is referred to, and the snout M of the fish body 5 and the forked part T of the tail are detected from the color image in which the entire fish body 5 to be measured has been detected within the detection area K. The reference data for detecting measurement-use parts is generated, for example, by machine learning in which image data of whole fish bodies annotated with the snout M and the forked part T of the tail, as shown in FIG. 3, is used as teacher data.
 Further, after detecting the snout M and the forked part T of the tail of the fish body 5, the length measurement unit 16 extracts, from the sensor image of the sensor unit 28, the spatial coordinate information corresponding to the detected snout M and forked part T. The length measurement unit 16 then calculates the length between the snout M and the forked part T of the tail as the fork length L using their spatial coordinates.
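 A sketch of the fork-length computation, assuming the sensor image can be indexed as a per-pixel map of (x, y, z) spatial coordinates in a common metric frame; the name xyz_map and the pixel-coordinate convention are assumptions for illustration:

import numpy as np

def fork_length(xyz_map: np.ndarray, snout_px: tuple, tail_fork_px: tuple) -> float:
    """Fork length L = Euclidean distance between the snout M and the tail fork T."""
    m = np.asarray(xyz_map[snout_px[0], snout_px[1]], dtype=float)          # (x, y, z) at the snout pixel
    t = np.asarray(xyz_map[tail_fork_px[0], tail_fork_px[1]], dtype=float)  # (x, y, z) at the tail-fork pixel
    return float(np.linalg.norm(m - t))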
 As described above, the calculated values of the thickness (body width) and fork length of the fish body 5 to be measured, calculated by the calculation unit 12 including the thickness measurement unit 15 and the length measurement unit 16, are stored, for example, in the storage device 20. At this time, predetermined information, such as the frame numbers of the color image and sensor image on which the calculation was based and the date and time of the calculation, is associated with the stored thickness (body width) and fork-length values in the storage device 20.
 The calculated thickness (body width) and fork length of the fish body 5 to be measured are also output to the display control unit 13. Under the control of the display control unit 13, the calculated thickness (body width) and fork length are displayed on the display device 25 together with, for example, the color image of the fish body 5 on which the calculation was based. Size information such as the thickness and fork length of the fish body 5 displayed on the display device 25 is used, for example, in the work of sorting the fish bodies 5 to be shipped by size.
 An example of the operation of the measurement device 8 for calculating the thickness and length of the fish body 5 is described below with reference to the flowchart of FIG. 4.
 For example, the receiving unit 11 in the control device 10 of the measurement device 8 continually receives the color image from the imaging unit 27 and the sensor image from the sensor unit 28 of the camera 7 (step S101). The thickness measurement unit 15 of the calculation unit 12 determines, using the received color image, whether the entire fish body 5 to be measured is within the detection area K (step S102). If it determines that it is not, the thickness measurement unit 15 repeats the same determination using the color images of frames subsequent to the one already judged. If it determines that the entire fish body 5 to be measured is within the detection area K, the thickness measurement unit 15 refers to the sensor image captured by the sensor unit 28 at the time of that detection and detects the vertex pixel in that sensor image (step S103). The thickness measurement unit 15 then calculates the thickness of the fish body 5 to be measured in the detection area K using the spatial coordinates of the vertex pixel (step S104).
 The length measurement unit 16 then detects, from the color image of the imaging unit 27, the snout M of the fish body 5 and the forked part T of the tail, which are the measurement-use parts used for the length measurement (step S105). The length measurement unit 16 extracts the spatial coordinate information of the detected snout M and forked part T from the sensor image, and calculates the fork length of the fish body 5 to be measured using the extracted spatial coordinates (step S106).
 Thereafter, the control device 10 determines whether a predetermined measurement end condition has been satisfied (for example, the condition that an instruction to end the measurement has been received through operation of the input device 24 by the measurer) (step S107). If the condition is not satisfied, the operations from step S101 onward are repeated. If the measurement end condition is satisfied, the control device 10 ends the measurement operation.
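 The sketch below strings steps S101 to S107 together, reusing the illustrative helpers sketched earlier (whole_fish_in_detection_area, measure_thickness, fork_length). The camera.read(), detect_keypoints, store_result, and stop_requested interfaces are assumptions, not APIs defined by this disclosure.

def measurement_loop(camera, fish_detector, detect_keypoints, store_result,
                     belt_z: float, stop_requested) -> None:
    while not stop_requested():                          # step S107
        color_image, z_map, xyz_map = camera.read()      # step S101 (assumed interface)
        if not whole_fish_in_detection_area(color_image, fish_detector):  # step S102
            continue
        thickness = measure_thickness(z_map, belt_z)                      # steps S103-S104
        snout_px, tail_fork_px = detect_keypoints(color_image)            # step S105
        length = fork_length(xyz_map, snout_px, tail_fork_px)             # step S106
        store_result(thickness=thickness, fork_length=length)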
 As described above, the measurement system 1 and the measurement device 8 of the first embodiment include a configuration that detects, in the fish body 5 being conveyed by the belt conveyor 3, the part whose height above the belt conveyor 3 is greatest. By calculating the thickness (body width) of the fish body 5 using the detected part, the measurement system 1 and the measurement device 8 can increase the certainty of the calculated thickness (body width) even for a fish body 5 whose surface bulges.
 Furthermore, the measurement system 1 and the measurement device 8 of the first embodiment calculate the thickness (body width) and fork length of the fish body 5 only after detecting, using the color image from the imaging unit 27, that the entire fish body 5 is within the detection area K. This prevents, for example, the problem in which, when no fish body 5 lies entirely within the detection area K but the tail of one fish body 5 and the snout of another are both in the detection area K, a fork length is calculated from a tail part T and a snout M belonging to different fish bodies. This also contributes to increasing the certainty of the fork length calculated by the measurement system 1 and the measurement device 8.
 Furthermore, the measurement system 1 and the measurement device 8 of the first embodiment calculate the thickness (body width) and fork length of the fish body 5 using both the color image from the imaging unit 27 and the sensor image from the sensor unit 28. The sensor image from the sensor unit 28 contains spatial coordinate information but is inferior in the clarity of object contours and the like. The color image from the imaging unit 27, by contrast, contains no spatial coordinate information but renders object contours and the like clearly. By using both the sensor image and the color image, the measurement system 1 and the measurement device 8 can achieve higher accuracy for the calculated thickness (body width) and fork length than when only one of them is used.
 Furthermore, the measurement system 1 and the measurement device 8 can reduce the labor and time required of a person to measure the size of the fish body 5.
 In the above description, the thickness measurement unit 15 detects that the entire fish body 5 has entered the detection area K; instead, the length measurement unit 16 may detect that the entire fish body 5 has entered the detection area K. Also, in the above description, the fork length of the fish body 5 is calculated by the length measurement unit 16 after the thickness of the fish body 5 is calculated by the thickness measurement unit 15. Alternatively, the thickness of the fish body 5 may be calculated by the thickness measurement unit 15 after the fork length of the fish body 5 is calculated by the length measurement unit 16. The calculation of the thickness of the fish body 5 by the thickness measurement unit 15 and the calculation of the fork length of the fish body 5 by the length measurement unit 16 may also be processed in parallel.
 Furthermore, in addition to the thickness measurement unit 15 and the length measurement unit 16, the calculation unit 12 may include a weight calculation unit 17 as shown in FIG. 5. In this case, data for weight calculation is stored in advance in the storage device 20. The data for weight calculation is, for example, data for calculating the weight of the fish body 5 from its thickness (body width) and fork length, and is generated using actually measured values of the thickness, fork length, and weight of fish bodies.
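 The disclosure does not fix the form of the weight-calculation data; as one illustrative assumption, the sketch below fits an ordinary least-squares model of weight against thickness and fork length from measured samples and then applies it.

import numpy as np

def fit_weight_model(thickness: np.ndarray, fork_len: np.ndarray, weight: np.ndarray) -> np.ndarray:
    """Fit weight ~ a*thickness + b*fork_length + c by least squares; returns (a, b, c)."""
    features = np.column_stack([thickness, fork_len, np.ones_like(thickness)])
    coeffs, *_ = np.linalg.lstsq(features, weight, rcond=None)
    return coeffs

def estimate_weight(coeffs: np.ndarray, thickness: float, fork_len: float) -> float:
    a, b, c = coeffs
    return a * thickness + b * fork_len + c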
 Furthermore, in addition to the fork length of the fish body 5, the length measurement unit 16 of the calculation unit 12 may measure the body height of the fish body 5 by a process similar to the one used to measure the fork length. The body height here is the length in the fish body 5 between the most bulging part on the dorsal side (for example, the base of the dorsal fin on the head side) and the most bulging part on the ventral side. When the body height is measured, the most bulging part on the dorsal side and the most bulging part on the ventral side of the fish body 5 are detected by the length measurement unit 16 as the two end points (measurement-use parts) of the length measurement. In this case, the reference data for detecting measurement-use parts stored in the storage device 20 is data for detecting each of the measurement-use parts used to measure the fork length and body height of the fish body 5. This reference data is generated by machine learning in which image data of whole fish bodies annotated not only with the snout M and the forked part T of the tail used for the fork-length measurement, but also with the most bulging dorsal part and the most bulging ventral part used for the body-height measurement, is used as teacher data. By using such reference data, the length measurement unit 16 can detect, in the color image from the imaging unit 27, the two measurement-use parts used for measuring the fork length and the two measurement-use parts used for measuring the body height of the same fish body 5.
 Furthermore, when the length measurement unit 16 also measures the body height and the calculation unit 12 includes the weight calculation unit 17, the weight calculation unit 17 may calculate the weight taking into account not only the thickness (body width) and fork length of the fish body 5 but also the body height. The weight calculation unit 17 may also calculate the weight of the fish body 5 using the fork length and body height without using the thickness of the fish body 5.
 In addition, the example was described in which the sensor unit 28 acquires information related to the distance from the camera 7 to an object by a TOF method using infrared light. In this regard, instead of the TOF method, the sensor unit 28 may adopt another three-dimensional measurement technique such as a pattern projection method or LIDAR (Laser Imaging Detection and Ranging).
 <Second Embodiment>
 A second embodiment of the present invention is described below. In the description of the second embodiment, parts having the same names as the components of the measurement system and measurement device of the first embodiment are given the same reference signs, and duplicate explanation of the common parts is omitted.
 In the second embodiment, as shown in FIG. 6, the measurement system 1 includes a display device 31 and a communication unit 32 in addition to the configuration of the first embodiment, and the measurement device 8 further includes a communication unit 18.
 The display device 31 and the communication unit 32 are provided in an information processing device 30, which is a personal computer. The information processing device 30 is disposed, for example, near the belt conveyor 3 (in other words, near the measurer).
 The communication unit 32 has a function of relaying the color image and the sensor image output from the camera 7 to the measurement device 8 through an information communication network 34. The communication unit 32 also has a function of receiving information transmitted from the measurement device 8 via the information communication network 34.
 The communication unit 18 of the measurement device 8 has a function of receiving the color image and the sensor image from the camera 7 via the information communication network 34 and providing them to the receiving unit 11. Using the color image and sensor image from the camera 7 received via the communication unit 18, the measurement device 8 calculates the thickness and fork length of the fish body 5 in the same manner as in the first embodiment. The communication unit 18 further has a function of outputting the calculated thickness and fork length of the fish body 5 toward the information processing device 30 through the information communication network 34.
 The display device 31 has a function of displaying the calculated thickness and fork length of the fish body 5 received by the communication unit 32.
 Since the measurement system 1 and the measurement device 8 of the second embodiment have a configuration for calculating the thickness and fork length of the fish body 5 in the same manner as in the first embodiment, they obtain the same effect as the first embodiment, namely, increasing the certainty of the calculated values related to the size of the fish body 5.
 The measurement device 8 is realized by, for example, a server, and a plurality of information processing devices 30, and the cameras 7 through those information processing devices 30, may be connected to the measurement device 8 via the information communication network 34.
 <Other Embodiments>
 The present invention is not limited to the first and second embodiments and can take various forms. For example, in the first and second embodiments, whether the entire fish body 5 has entered the detection area K is determined using the color image from the imaging unit 27. Alternatively, for example, when the fish body 5 is placed on the belt conveyor 3 so that the direction from its snout toward its tail (the direction along the backbone) roughly matches the traveling direction of the belt conveyor 3, whether the entire fish body 5 is within the detection area K may be determined using the sensor image from the sensor unit 28. In this case, for example, the z component of the spatial coordinates at a preset height-measurement position is acquired from the sensor image moment by moment and the change trend of the z component is calculated, and whether the entire fish body 5 is within the detection area K is determined using not only the sensor image but also the change trend of the z component.
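 The disclosure does not detail how the change trend of the z component is evaluated; the sketch below assumes, purely for illustration, that the fish is judged to be fully inside once the z value at the preset measurement position has risen above the conveyor level and then stayed nearly constant for several consecutive frames.

from collections import deque

class ZTrendChecker:
    def __init__(self, belt_z: float, rise_eps: float = 5.0,
                 settle_eps: float = 1.0, settle_frames: int = 3):
        self.belt_z = belt_z
        self.rise_eps = rise_eps      # assumed minimum height above the belt to count as "fish present"
        self.settle_eps = settle_eps  # assumed maximum frame-to-frame change to count as "settled"
        self.history = deque(maxlen=settle_frames + 1)

    def update(self, z_value: float) -> bool:
        """Feed the latest z value at the measurement position; True once the fish is judged fully inside."""
        self.history.append(z_value)
        if len(self.history) < self.history.maxlen:
            return False
        values = list(self.history)
        above_belt = all(z - self.belt_z > self.rise_eps for z in values)
        settled = all(abs(b - a) < self.settle_eps for a, b in zip(values, values[1:]))
        return above_belt and settled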
 Also, in the first and second embodiments, the object to be measured is a fish body, but the present invention can be applied to calculating at least the thickness of objects other than fish bodies. In the first and second embodiments, the fish body 5 to be measured is placed on the belt conveyor 3, and the surface of the belt conveyor 3 serves as the reference plane. Instead, the object to be measured may be placed on a reference plane other than the belt conveyor 3. The reference plane may also be a fixed surface.
 Furthermore, in addition to the configurations of the first and second embodiments, the measurement system 1 may be provided with a mechanism that adjusts the height position of the camera 7 with respect to the belt conveyor 3.
 FIG. 7 is a simplified block diagram showing the configuration of a measurement device according to another embodiment of the present invention. The measurement device 40 shown in FIG. 7 forms a measurement system 45 together with the detection device 46 shown in FIG. 8. The detection device 46 has a function of outputting information related to spatial coordinates within a detection area.
 As shown in FIG. 7, the measurement device 40 includes a receiving unit 41 and a calculation unit 42. The receiving unit 41 has a function of receiving information output from the detection device 46, which is disposed at a position facing a reference plane and outputs information related to spatial coordinates within a detection area including the reference plane. The calculation unit 42 has a function of detecting, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, the part of the object that is thickest in the direction from the reference plane toward the detection device 46, and of calculating the thickness of the detected part as the thickness of the object.
 The measurement device 40 and the measurement system 45 having the above configurations can also obtain the same effects as those of the first and second embodiments.
 The present invention has been described above using the above embodiments as exemplary examples. However, the present invention is not limited to the above embodiments. That is, various aspects that can be understood by those skilled in the art can be applied to the present invention within its scope.
 This application claims priority based on Japanese Patent Application No. 2018-138548 filed on July 24, 2018, the disclosure of which is incorporated herein in its entirety.
 1, 45  Measurement system
 5  Fish body
 7  Camera
 8, 40  Measurement device
 11, 41  Receiving unit
 12, 42  Calculation unit
 46  Detection device

Claims (7)

  1.  A measurement device comprising:
      receiving means for receiving information output from a detection device that is disposed at a position facing a reference plane and outputs information related to spatial coordinates within a detection area including the reference plane; and
      calculation means for detecting, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, the portion of the object that is thickest in the direction from the reference plane toward the detection device, and for calculating the thickness of the detected portion as the thickness of the object.
  2.  The measurement device according to claim 1, wherein
      the receiving means further has a function of receiving information of a captured image of the detection area captured by an imaging function provided in the detection device, and
      the calculation means further detects both end parts of a preset length-measurement portion of the object in the received captured image, acquires spatial coordinates corresponding to the detected end parts using the information from the detection device, and calculates the length of the measurement target of the object using the acquired spatial coordinates.
  3.  The measurement device according to claim 1 or 2, wherein the calculation means executes a calculation operation of calculating at least the thickness when it detects, using the information received from the detection device, that the entire object is within the detection area.
  4.  The measurement device according to any one of claims 1 to 3, wherein the reference plane is a moving surface, of a conveyance device that conveys the object, on which the object is placed.
  5.  A measurement system comprising:
      a detection device that outputs information related to spatial coordinates within a detection area; and
      the measurement device according to any one of claims 1 to 4.
  6.  A measurement method comprising:
      receiving information output from a detection device that is disposed at a position facing a reference plane and outputs information related to spatial coordinates within a detection area including the reference plane;
      detecting, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, the portion of the object that is thickest in the direction from the reference plane toward the detection device; and
      calculating the thickness of the detected portion as the thickness of the object.
  7.  A program storage medium storing a computer program that causes a computer to execute:
      a process of receiving information output from a detection device that is disposed at a position facing a reference plane and outputs information related to spatial coordinates within a detection area including the reference plane;
      a process of detecting, using the information related to the spatial coordinates on the surface of an object placed on the reference plane within the detection area, the portion of the object that is thickest in the direction from the reference plane toward the detection device; and
      a process of calculating the thickness of the detected portion as the thickness of the object.
PCT/JP2019/028768 2018-07-24 2019-07-23 Measurement device, measurement system, measurement method, and program storage medium WO2020022309A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018138548A JP6702370B2 (en) 2018-07-24 2018-07-24 Measuring device, measuring system, measuring method and computer program
JP2018-138548 2018-07-24

Publications (1)

Publication Number Publication Date
WO2020022309A1 true WO2020022309A1 (en) 2020-01-30

Family

ID=69181541

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/028768 WO2020022309A1 (en) 2018-07-24 2019-07-23 Measurement device, measurement system, measurement method, and program storage medium

Country Status (2)

Country Link
JP (1) JP6702370B2 (en)
WO (1) WO2020022309A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7155375B1 (en) * 2021-10-13 2022-10-18 マルハニチロ株式会社 Measurement system, information processing device, information processing method and program


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3483039B2 (en) * 1993-09-30 2004-01-06 博文 松尾 Identification device based on image signals
JP4462952B2 (en) * 2004-02-13 2010-05-12 勝三 川西 Fat percentage measuring device
KR101656635B1 (en) * 2014-12-10 2016-09-09 가부시기가이샤니레꼬 Fish species determination device and fish species determination method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60247109A (en) * 1984-05-22 1985-12-06 Nippon Suisan Kaisha Ltd Method and apparatus for measuring and processing thickness of fish body
JPH0510735A (en) * 1990-01-18 1993-01-19 Nordischer Mas Rud Baader Gmbh & Co Kg Method and device for measuring body during movement in three-dimension
US5184733A (en) * 1991-02-19 1993-02-09 Marel H.F. Apparatus and method for determining the volume, form and weight of objects
WO2018061925A1 (en) * 2016-09-30 2018-04-05 日本電気株式会社 Information processing device, length measurement system, length measurement method, and program storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7237321B1 (en) 2022-07-29 2023-03-13 株式会社MizLinx Fish length measuring device
JP7398764B1 (en) 2022-07-29 2023-12-15 株式会社MizLinx Fish length measurement method
JP2024018218A (en) * 2022-07-29 2024-02-08 株式会社MizLinx Fish body length measurement device

Also Published As

Publication number Publication date
JP2020016501A (en) 2020-01-30
JP6702370B2 (en) 2020-06-03

Similar Documents

Publication Publication Date Title
JP7124865B2 (en) Information processing device, object measuring system, object measuring method, computer program and information providing system
US20200230821A1 (en) Information processing apparatus, information processing method, and storage medium
JP6635690B2 (en) Information processing apparatus, information processing method and program
JP5567908B2 (en) Three-dimensional measuring apparatus, measuring method and program
JP6377863B2 (en) Enhancement of depth map representation by reflection map representation
JP5587137B2 (en) Measuring apparatus and measuring method
JP3624353B2 (en) Three-dimensional shape measuring method and apparatus
US10269139B2 (en) Computer program, head-mounted display device, and calibration method
US20030210407A1 (en) Image processing method, image processing system and image processing apparatus
KR101616176B1 (en) Full body high speed three dimesional scanning apparatus
US11328439B2 (en) Information processing device, object measurement system, object measurement method, and program storage medium
WO2006054425A1 (en) Three-dimensional measuring instrument, three-dimensional measuring method, and three-dimensional measuring program
WO2020022309A1 (en) Measurement device, measurement system, measurement method, and program storage medium
JP6981531B2 (en) Object identification device, object identification system, object identification method and computer program
JP6590153B2 (en) Projection instruction apparatus, package sorting system, and projection instruction method
JP2014202567A (en) Position attitude measurement device, control method thereof, and program
CN114375177A (en) Body measurement device and control method thereof
CN111742352B (en) Method for modeling three-dimensional object and electronic equipment
CN115135973A (en) Weight estimation device and program
JP6412372B2 (en) Information processing apparatus, information processing system, information processing apparatus control method, and program
JP2015232771A (en) Face detection method, face detection system and face detection program
KR102325970B1 (en) Method and apparatus for calculating fat mass using images of head and neck
US20210133427A1 (en) Image processing device, image printing device, imaging device, and non-transitory medium
JP7028814B2 (en) External shape recognition device, external shape recognition system and external shape recognition method
DK2972071T3 (en) Device for measuring a carcass body object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19839845

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19839845

Country of ref document: EP

Kind code of ref document: A1