US11328439B2 - Information processing device, object measurement system, object measurement method, and program storage medium
- Publication number
- US11328439B2 (application US16/971,078)
- Authority
- US
- United States
- Prior art keywords
- measurement
- baseline
- length
- fish
- measured
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K61/00—Culture of aquatic animals
- A01K61/90—Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
- A01K61/95—Sorting, grading, counting or marking live aquatic animals, e.g. sex determination specially adapted for fish
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/04—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness specially adapted for measuring length or width of objects while moving
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
Definitions
- the present invention relates to a technology for measuring, from a captured image in which an object to be measured is captured, a length and the like of the object.
- PTL 1 discloses a technology relating to observation of fish.
- In PTL 1, the shapes and sizes of regions of a fish, such as the head, body, and caudal fin, are estimated region by region, based on captured images of the back side (or abdomen side) of the fish captured from above (or below) and from the side of an aquarium, and on a captured image of the head side captured from the front.
- the estimation of the shape and size of each region of the fish is performed by use of a plurality of template images provided for each region. Specifically, captured images of each region are individually compared with template images of each region, and, based on known information of sizes and the like of regions of a fish in template images that match the captured images, the size and the like of each region of the fish are estimated.
- PTL 2 discloses a technology of capturing images of fish in water by use of a video camera and a still-image camera and detecting a fish shadow, based on captured video and still images. PTL 2 also describes a configuration for estimating a size of a fish, based on a size of an image (the number of pixels).
- In PTL 1, the size of a region of a fish is thus estimated based on known size information on the corresponding region of a fish in a template image.
- Because the technology in PTL 1 merely adopts the size of a region of a fish in a matching template image as the size of the corresponding region of the fish to be measured, rather than actually measuring that region, it is difficult to increase the accuracy of the detected size.
- A principal object of the present invention is to provide a technology capable of increasing the accuracy of measurement values acquired by measuring the length and the like of an object to be measured, based on a captured image.
- An information processing device as one example embodiment according to the present invention includes:
- a detection unit that, when an object image is divided by a baseline, detects a predetermined portion of an object to be measured as a measurement-use point in each of the divided areas on both sides of the baseline in the object image, the object image being an image of the object in a captured image in which the object is captured, the baseline being set on the object image, the predetermined portion being a portion to be used for length measurement; and
- a calculation unit that calculates, in each of the divided areas, the length of the line segment between the measurement-use point and the intersection point, with the baseline, of the perpendicular that passes through the measurement-use point and is perpendicular to the baseline, and calculates a length to be measured on the object by adding the lengths of the line segments each calculated in one of the divided areas.
- An object measurement system as one example embodiment according to the present invention includes an image capturing device and an information processing device that calculates, by use of a captured image captured by the image capturing device, a length to be measured on an object to be measured.
- the information processing device includes:
- a detection unit that, when an object image is divided by a baseline, detects a predetermined portion of an object to be measured as a measurement-use point in each of the divided areas on both sides of the baseline in the object image, the object image being an image of the object in a captured image in which the object is captured, the baseline being set on the object image, the predetermined portion being a portion to be used for length measurement; and
- a calculation unit that calculates, in each of the divided areas, the length of the line segment between the measurement-use point and the intersection point, with the baseline, of the perpendicular that passes through the measurement-use point and is perpendicular to the baseline, and calculates a length to be measured on the object by adding the lengths of the line segments each calculated in one of the divided areas.
- when an object image is divided by a baseline, detecting a predetermined portion of an object to be measured as a measurement-use point in each of the divided areas on both sides of the baseline in the object image, the object image being an image of the object in a captured image in which the object is captured, the baseline being set on the object image, the predetermined portion being a portion to be used for length measurement;
- The present invention enables the accuracy of measurement values acquired by measuring the length and the like of an object to be measured, based on a captured image, to be increased.
- FIG. 1 is a block diagram illustrating a configuration of an information processing device of a first example embodiment according to the present invention in a simplified manner
- FIG. 2A is a diagram describing a configuration of an image capturing device that provides the information processing device of the first example embodiment with captured images;
- FIG. 2B is a perspective view illustrating the image capturing device that provides the information processing device of the first example embodiment with captured images
- FIG. 3 is a diagram describing a mode in which the image capturing device captures images of fishes that are objects to be measured in the first example embodiment
- FIG. 4 is a diagram describing an example of a form in which captured images in which fishes, which are objects to be measured, are captured are displayed on a display device;
- FIG. 7 is a diagram illustrating an example of fish bodies that are not employed as training data used when fish bodies to be measured are learned through machine learning;
- FIG. 8 is a diagram describing positions on a fish body that are detected as measurement-use points in the first example embodiment
- FIG. 10 is a diagram illustrating an example of reference data for detecting measurement-use points, the reference data being generated by use of machine learning;
- FIG. 11 is a diagram describing the fork length of a fish body that is calculated in the first example embodiment
- FIG. 12 is a diagram describing a method by which a body depth of a fish body is calculated in the first example embodiment
- FIG. 13 is a flowchart describing an example of operation relating to measurement of a fish body performed by the information processing device in the first example embodiment
- FIG. 15 is a diagram describing a function of a correction unit that the information processing device of the second example embodiment includes;
- FIG. 16 is a block diagram extracting and illustrating characteristic constituent components of an information processing device of a third example embodiment according to the present invention.
- FIG. 17 is a block diagram illustrating a configuration of an information processing device of another example embodiment according to the present invention in a simplified manner.
- FIG. 18 is a block diagram illustrating a configuration of an object measurement system of another example embodiment according to the present invention in a simplified manner.
- FIG. 1 is a block diagram illustrating a configuration of an information processing device of a first example embodiment according to the present invention in a simplified manner.
- An information processing device 10 of the first example embodiment has a function of calculating the lengths (fork length and body depth) of a fish, the object to be measured, from captured images of the fish captured by two cameras 40A and 40B as illustrated in FIG. 2A, and further estimating the weight of the fish.
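The weight-estimation step itself is not detailed in this passage. As an illustrative sketch only, weight is often regressed against fork length and body depth in aquaculture practice; the function below, including its coefficients `a` and `b`, is a hypothetical placeholder rather than the patent's method:

```python
def estimate_weight_g(fork_length_cm, body_depth_cm, a=0.0085, b=2.0):
    """Illustrative allometric estimate: weight is assumed to scale with
    fork length times body depth raised to a power. The coefficients are
    placeholders that would be fitted to real catch data per species."""
    return a * fork_length_cm * body_depth_cm ** b
```

In practice the coefficients would be fitted per fish species and season from measured fork-length, body-depth, and weight samples.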
- the information processing device 10 constitutes, in conjunction with the cameras 40 A and 40 B, a fish measurement system that is an object measurement system.
- the cameras 40 A and 40 B are image capturing devices having a function of capturing a video
- an image capturing device that, instead of having a video capturing function, for example, intermittently captures still images at each preset time interval may be employed as the cameras 40 A and 40 B.
- the cameras 40 A and 40 B capture images of fishes while being placed side by side with an interval interposed therebetween, as illustrated in FIG. 2B , by being supported by and fixed to a support member 42 as illustrated in FIG. 2A .
- The support member 42 includes an extensible rod 43, an attachment rod 44, and attachment fixtures 45A and 45B.
- the extensible rod 43 is a freely extensible and retractable rod member and further includes a structure that enables the length thereof to be fixed at a length appropriate for use within a length range in which the extensible rod 43 is extensible and retractable.
- the attachment rod 44 is made of a metallic material, such as aluminum, and is joined to the extensible rod 43 in such a way as to be orthogonal to the extensible rod 43 .
- The attachment fixtures 45A and 45B are fixed to the attachment rod 44 at sites symmetrically located with respect to its joint with the extensible rod 43.
- the attachment fixtures 45 A and 45 B include mounting surfaces 46 A and 46 B and have a structure that enables the cameras 40 A and 40 B mounted on the mounting surfaces 46 A and 46 B to be fixed to the mounting surfaces 46 A and 46 B by means of, for example, screws without backlash, respectively.
- the cameras 40 A and 40 B are capable of maintaining a state of being placed side by side with a preset interval interposed therebetween by being fixed to the support member 42 having a structure as described above.
- the cameras 40 A and 40 B are fixed to the support member 42 in such a way that lenses disposed to the cameras 40 A and 40 B face the same direction and the optical axes of the lenses are set to be parallel with each other.
- the support member supporting and fixing the cameras 40 A and 40 B is not limited to the support member 42 illustrated in FIG. 2A and the like.
- the support member supporting and fixing the cameras 40 A and 40 B may have, in place of the extensible rod 43 in the support member 42 , a structure in which one or a plurality of ropes are used and the attachment rod 44 and the attachment fixtures 45 A and 45 B are suspended by the ropes.
- The cameras 40A and 40B, while fixed to the support member 42, are lowered into, for example, a fish preserve 48 in which fish are cultivated, as illustrated in FIG. 3, and arranged at a water depth and with a lens direction determined to be appropriate for observing the fish (in other words, for capturing images of the fish to be measured).
- Various methods are conceivable for arranging and fixing the support member 42 (and thus the cameras 40A and 40B), once it has entered the fish preserve 48, at an appropriate depth in the water and with an appropriate lens direction; any such method can be employed here, and a description of the method is omitted.
- Calibration of the cameras 40 A and 40 B is performed using an appropriate calibration method selected in consideration of the environment of the fish preserve 48 and the types of fishes to be measured. A description of the calibration method will be omitted herein.
- an appropriate method selected in consideration of the performance of the cameras 40 A and 40 B, the environment of the fish preserve 48 , and the like is employed.
- For example, an observer (measurer) of the fish manually starts image capturing before the cameras 40A and 40B enter the fish preserve 48 and manually stops it after the cameras 40A and 40B leave the fish preserve 48.
- an operation device that is capable of transmitting information for controlling image-capturing start and image-capturing stop is connected to the cameras 40 A and 40 B.
- the image-capturing start and the image-capturing stop may be controlled by the measurer operating the operation device.
- A monitor device capable of receiving, by means of wired or wireless communication, the images that either or both of the cameras 40A and 40B are capturing may also be used.
- the measurer becomes able to see, through the monitor device, images being captured.
- This configuration for example, enables the measurer to change the image-capturing direction or the depth in the water of the cameras 40 A and 40 B while seeing images being captured.
- a mobile terminal provided with a monitoring function may be used as the monitor device.
- the information processing device 10 uses, in the processing of calculating lengths (herein, fork length and body depth) of a fish, a captured image from the camera 40 A and a captured image from the camera 40 B that were captured at the same time.
- the above-described images captured by the cameras 40 A and 40 B may be taken into the information processing device 10 by means of wired communication or wireless communication or may, after having been stored in a portable storage medium (for example, a secure digital (SD) card), be taken into the information processing device 10 .
- The information processing device 10, in outline, includes a control device 22 and a storage device 23, as illustrated in FIG. 1.
- the information processing device 10 is connected to an input device (for example, a keyboard or a mouse) 25 for inputting information to the information processing device 10 through, for example, operation by the measurer and a display device 26 for displaying information.
- the information processing device 10 may be connected to an external storage device 24 , which is a separate entity from the information processing device 10 .
- the storage device 23 has a function of storing various types of data and computer programs (hereinafter, also referred to as programs) and is achieved by a storage medium, such as a hard disk device and a semiconductor memory.
- the number of storage devices with which the information processing device 10 is provided is not limited to one and the information processing device 10 may be provided with a plurality of types of storage devices, and, in this case, the plurality of storage devices are collectively referred to as storage devices 23 .
- the storage device 24 also has, as with the storage device 23 , a function of storing various types of data and computer programs and is achieved by a storage medium, such as a hard disk device and a semiconductor memory.
- When the information processing device 10 is connected to the storage device 24, appropriate information is stored in the storage device 24.
- Although the information processing device 10 writes and reads information to and from the storage device 24 as appropriate, a description of the storage device 24 is omitted in the following.
- images captured by the cameras 40 A and 40 B are stored in the storage device 23 in association with identification information for identifying a camera that captured each image and information relating to an image-capturing situation, such as information of a capture time.
- The control device 22 is constituted by a processor, such as a central processing unit (CPU) or a graphics processing unit (GPU).
- the control device 22 is capable of having functions as follows by, for example, the CPU executing computer programs stored in the storage device 23 . That is, the control device 22 includes, as functional units, a detection unit 30 , a specification unit 31 , a calculation unit 32 , an analysis unit 33 , and a display control unit 34 .
- the display control unit 34 has a function of controlling display operation of the display device 26 .
- When the display control unit 34 receives, from the input device 25, a request to reproduce captured images captured by the cameras 40A and 40B, the display control unit 34 reads the requested captured images from the storage device 23 and displays them on the display device 26.
- FIG. 4 is a diagram illustrating a display example of captured images captured by the cameras 40 A and 40 B on the display device 26 . In the example in FIG. 4 , a captured image 41 A captured by the camera 40 A and a captured image 41 B captured by the camera 40 B are displayed side by side by means of dual screen display.
- the display control unit 34 has a function capable of synchronizing the captured images 41 A and 41 B with each other in such a way that the image-capturing time points of the captured images 41 A and 41 B, which are displayed on the display device 26 at the same time, coincide with each other.
- The display control unit 34 also has a function enabling the measurer to adjust each pair of reproduced frames of the captured images 41A and 41B by use of time-alignment marks that were captured simultaneously by the cameras 40A and 40B.
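The synchronization described above can be sketched as a nearest-timestamp pairing of the two cameras' frames. The function below is a minimal illustration, assuming per-frame capture timestamps are available; the names and tolerance value are illustrative, not from the patent:

```python
def pair_frames(times_a, times_b, tol=0.02):
    """Pair frame timestamps (seconds, sorted ascending) from two cameras
    by nearest match within `tol` seconds; returns (index_a, index_b) pairs."""
    pairs = []
    j = 0
    for i, ta in enumerate(times_a):
        # advance j while the next timestamp in B is at least as close to ta
        while j + 1 < len(times_b) and abs(times_b[j + 1] - ta) <= abs(times_b[j] - ta):
            j += 1
        if j < len(times_b) and abs(times_b[j] - ta) <= tol:
            pairs.append((i, j))
    return pairs
```

Frames with no counterpart within the tolerance are simply skipped, which matches the device's need for pairs captured at the same time.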
- the detection unit 30 has a function of detecting a fish to be measured and a function of detecting measurement points on the detected fish to be measured in the captured images 41 A and 41 B, which are displayed (reproduced) on the display device 26 .
- the detection unit 30 detects a fish to be measured in the following way. For example, the detection unit 30 detects, in a pair of frames specified by the measurer or for every preset number of pairs of frames in the captured images 41 A and 41 B that are displayed (reproduced) on the display device 26 , a fish body to be measured by use of reference data for fish body detection, which are stored in the storage device 23 .
- the reference data for fish body detection is generated through, for example, machine learning. In the machine learning, fish bodies of a type to be measured are learned by use of, as training data, a large number of images of fish bodies as illustrated in FIG. 6 captured with respect to the type of fish to be measured.
- Images of a fish that bends sharply, a fish whose inclination is large, or a fish a portion of whose body is not captured, as illustrated in FIG. 7, are excluded from detection targets and are not learned as fish bodies to be measured. Since such fish-body images not learned through machine learning are not reflected in the reference data for fish body detection, the detection unit 30 does not detect fish bodies as illustrated in FIG. 7 as fish to be measured. Various machine learning methods exist, and an appropriate one is employed here.
- the detection unit 30 may, instead of using the above-described method using machine learning, have a function of detecting a fish to be measured based on input information input by the measurer.
- the detection unit 30 displays, by use of the display control unit 34 , a message such as “Specify (select) a fish to be measured” on the display device 26 on which the captured images 41 A and 41 B are displayed as illustrated in FIG. 4 .
- The measurer, for example, operates the input device 25 to enclose a fish to be measured with borders 50 and 51, as illustrated in FIG. 5, in the captured images 41A and 41B, respectively, thereby specifying the fish to be measured.
- the detection unit 30 detects a fish to be measured based on display position information of the borders 50 and 51 .
- The borders 50 and 51 are formed into, for example, rectangular shapes (including squares), and their sizes and aspect ratios can be varied by the measurer.
- While the measurer is specifying a fish to be measured with the borders 50 and 51, the captured images 41A and 41B are temporarily paused and stationary.
- the detection unit 30 may detect a fish to be measured in the following way.
- a screen area in which one of the captured images 41 A and 41 B is displayed (for example, the left side screen area in FIGS. 4 and 5 ) is set as an operation screen, and a screen area in which the other is displayed (for example, the right side screen area in FIGS. 4 and 5 ) is set as a reference screen.
- the detection unit 30 prompts, through message display or the like, the measurer to specify a fish to be measured with the border 50 in the operation screen and acquires information of the position at which the border 50 is displayed through operation by the prompted measurer.
- the detection unit 30 calculates, based on interval information representing the interval between the cameras 40 A and 40 B, a display position of the border 51 in the captured image 41 A in the reference screen, the border 51 representing the same region as a region specified by the border 50 in the captured image 41 B. Based on the information of the display positions of the borders 50 and 51 acquired in this manner, the detection unit 30 detects a fish to be measured.
- the detection unit 30 has a function of, while the position and size of the border 50 are adjusted by the measurer in the captured image 41 B, varying the position and size of the border 51 in the captured image 41 A, following the position and size of the border 50 .
- the detection unit 30 may have a function of, after the position and size of the border 50 have been fixed in the captured image 41 B, causing the border 51 to be displayed in the captured image 41 A.
- the detection unit 30 may have both the function of varying the position and size of the border 51 , following the adjustment of the position and size of the border 50 and the function of, after the position and size of the border 50 have been fixed, displaying the border 51 and perform, for example, the function alternatively selected by the measurer.
- the detection unit 30 further has a function of detecting measurement-use points on a fish detected as a measurement target in the captured images 41 A and 41 B, the measurement-use points having predetermined features.
- A bifurcating portion Pt of the tail and the mouth Pm of a fish, as illustrated in FIG. 8, are detected as the measurement-use points used for measuring the fork length of the fish.
- A top portion Pb on the back side and a joint portion Ps of the pelvic fin, the most bulging portion on the abdomen side, as illustrated in FIG. 8, are also detected as measurement-use points, used for measuring the body depth of the fish.
- The detection unit 30 detects the measurement-use points by use of an appropriate method selected in consideration of the needs of the measurer and the performance of the control device 22; examples of detection methods are described below.
- the detection unit 30 prompts, by means of message display or the like, the measurer to specify (point) measurement-use points in an image of a fish to be measured by use of the input device 25 and detects, based on operation by the measurer who has received the message or the like, the measurement-use points in the image of the fish to be measured.
- The measurement-use points specified by the measurer are clearly indicated on the display device 26 by the display control unit 34, so that the measurer can confirm the positions of the measurement-use points that the measurer has specified.
- The information processing device 10 is also provided with a configuration for receiving corrections to the measurement-use points after the measurer has specified them.
- the detection unit 30 may, for example, detect the measurement-use points Pt, Pm, Pb, and Ps, based on reference data for detection of measurement-use points, which are generated through machine learning.
- the reference data for detection of measurement-use points are generated through machine learning using, as training data, image data of whole fish bodies provided with measurement-use points Pt, Pm, Pb, and Ps as illustrated in FIG. 9 and are stored in the storage device 23 .
- The same fish has different degrees of bulging of the abdomen between a full-stomach state (for example, immediately after feeding), an empty-stomach state (for example, before feeding), and a standard state (for example, at a time point midway between preset feeding timings).
- Depending on the degree of bulging of the abdomen, the way the pelvic fin and dorsal fin are used while swimming, and the like, the appearance of a fish's pelvic fin in captured images differs. Therefore, the reference data for detection of measurement-use points are generated through machine learning using training data selected in consideration of such various states.
- the reference data for detection of measurement-use points that the detection unit 30 uses may be, instead of reference data of whole fish bodies, reference data of each fish body part as illustrated in FIG. 10 .
- the reference data of each fish body part are generated through machine learning using, as training data, image data of each fish body part provided with one of the measurement-use points Pt, Pm, Pb, and Ps.
- images that are extracted in such a way that the center of the image data of each fish body part coincides with one of the measurement-use points Pt, Pm, Pb, and Ps are used as training data.
- Reference data for detection of measurement-use points learned for each fish body part from such training data thus carry the property that the center position of a matched part image represents one of the measurement-use points Pt, Pm, Pb, and Ps.
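The part-based scheme above, in which the center of a matched part image marks a measurement-use point, can be sketched with simple normalized cross-correlation template matching. This is a hedged, NumPy-only illustration with a brute-force search, not the patent's machine-learning implementation:

```python
import numpy as np

def detect_point_by_template(image, template):
    """Slide `template` over `image` (2-D grayscale arrays) and return the
    (row, col) of the best-matching patch *center*, which, per the part-based
    scheme, is taken as the measurement-use point."""
    th, tw = template.shape
    t = template - template.mean()          # zero-mean template
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()        # zero-mean patch
            denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return (best_pos[0] + th // 2, best_pos[1] + tw // 2)
```

A learned detector would replace the correlation score with a model response, but the "patch center = measurement point" convention is the same.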
- the detection unit 30 further has a function of, by use of the display control unit 34 , clearly indicating the positions of detected measurement-use points Pt, Pm, Pb, and Ps on the display device 26 , using, for example, marks, points, or the like.
- the measurement-use points Pt, Pm, Pb, and Ps may be clearly indicated in both the captured images 41 A and 41 B on the display device 26 or in one of the captured images 41 A and 41 B (for example, the captured image 41 B, which is the operation screen).
- the specification unit 31 has a function of specifying coordinates representing the positions in a coordinate space of measurement-use points Pt, Pm, Pb, and Ps on a fish to be measured, which are detected by the detection unit 30 .
- the specification unit 31 receives, from the detection unit 30 , display position information representing display positions at which the measurement-use points Pt, Pm, Pb, and Ps on the fish to be measured, which were detected by the detection unit 30 , are displayed in the captured images 41 A and 41 B.
- the specification unit 31 reads the interval information representing the interval between the cameras 40 A and 40 B (that is, image-capturing positions) from the storage device 23 .
- the specification unit 31 specifies (calculates) the coordinates in the coordinate space of the measurement-use points Pt, Pm, Pb, and Ps on the fish to be measured, by use of a triangulation method.
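The triangulation step can be sketched as follows, under the assumption of rectified, parallel cameras with a known focal length f (in pixels) and camera interval (baseline); the function name and parameters are illustrative, not from the source:

```python
def triangulate(xl, yl, xr, f, baseline):
    # (xl, yl): pixel position of a measurement-use point in the left
    # image; xr: its horizontal pixel position in the right image.
    # Assumes rectified, parallel cameras: depth follows from the
    # horizontal disparity between the two captured images.
    disparity = xl - xr
    Z = f * baseline / disparity   # nearer points have larger disparity
    X = Z * xl / f
    Y = Z * yl / f
    return (X, Y, Z)
```

In a stereo rig such as the cameras 40A and 40B, the interval information read from the storage device would supply the baseline value.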
- the calculation unit 32 has a function of calculating, by use of the spatial coordinates of the measurement-use points Pm and Pt of the mouth and the tail on a fish to be measured, which were specified by the specification unit 31 , an interval L between the measurement-use points Pm and Pt as illustrated in FIG. 11 as the fork length of the fish to be measured (length of the measurement target).
- the calculation unit 32 also has a function of calculating a body depth of the fish to be measured (length of the measurement target) in the following manner. That is, the calculation unit 32 sets a straight line S connecting the measurement-use points Pm and Pt of the mouth and the tail as illustrated in FIG. 12 as a baseline. The calculation unit 32 calculates, on a perpendicular I 1 that is perpendicular to the baseline S and passes through the measurement-use point Ps, a length h 1 of a line segment PsPss between the measurement-use point Ps and an intersection point Pss of the perpendicular I 1 and the baseline S. The calculation unit 32 similarly calculates a length h 2 of a line segment PbPbs between the measurement-use point Pb and the baseline S, and calculates the sum of the lengths h 1 and h 2 as the body depth H.
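Assuming the specified spatial coordinates are given as 3D tuples, the fork length L and the perpendicular lengths to the baseline S (h 1 from Ps, and the corresponding back-side length h 2 from Pb used in step S 104) can be sketched as follows; the function names are illustrative:

```python
def fork_length(Pm, Pt):
    # interval L between the mouth point Pm and the tail point Pt
    return sum((Pm[i] - Pt[i]) ** 2 for i in range(3)) ** 0.5

def dist_to_baseline(p, Pm, Pt):
    # length of the perpendicular line segment from point p to the
    # baseline S through Pm and Pt, via |v x w| / |v| in 3D
    v = [Pt[i] - Pm[i] for i in range(3)]
    w = [p[i] - Pm[i] for i in range(3)]
    cross = [v[1] * w[2] - v[2] * w[1],
             v[2] * w[0] - v[0] * w[2],
             v[0] * w[1] - v[1] * w[0]]
    return sum(c * c for c in cross) ** 0.5 / sum(c * c for c in v) ** 0.5

def body_depth(Pm, Pt, Pb, Ps):
    # H is the sum of the back-side length h2 (from Pb) and the
    # abdomen-side length h1 (from Ps), each measured to the baseline
    return dist_to_baseline(Pb, Pm, Pt) + dist_to_baseline(Ps, Pm, Pt)
```

Summing the two one-sided perpendicular lengths, rather than taking the segment PbPs directly, is what handles a fish body that is not line-symmetric about the baseline.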
- the fork length L and the body depth H of a fish to be measured calculated by the calculation unit 32 as described above are stored in the storage device 23 in association with predetermined information, such as observation date and time.
- the analysis unit 33 has a function of performing predetermined analysis by use of the fork lengths L and body depths H of a plurality of fishes, and the information associated with them, which are stored in the storage device 23 . For example, the analysis unit 33 calculates an average of the fork lengths L of a plurality of fishes in the fish preserve 48 at the observation date. Alternatively, the analysis unit 33 calculates an average of the fork lengths L of a specific fish that is set as an analysis target. In this case, the analysis unit 33 calculates the average of a plurality of fork lengths L of the fish to be analyzed, each calculated from an image of that fish in one of a plurality of frames of a video captured over a short period of time, such as one second.
- the analysis unit 33 may calculate a relationship between the fork lengths L of fishes in the fish preserve 48 and the number of the fishes (fish body number distribution with respect to the fork lengths L of fishes). Further, the analysis unit 33 may calculate temporal change in the fork length L of a fish, which represents a growth of the fish.
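The averaging and the fish-number distribution described above can be sketched in pure Python as follows; the bin width is a hypothetical choice, not taken from the source:

```python
def average_fork_length(lengths):
    # average of the fork lengths L of a plurality of fishes
    return sum(lengths) / len(lengths)

def fork_length_distribution(lengths, bin_width=5.0):
    # number of fish per fork-length bin (bin_width is an assumption);
    # keys are the lower edge of each bin
    dist = {}
    for L in lengths:
        lo = int(L // bin_width) * bin_width
        dist[lo] = dist.get(lo, 0) + 1
    return dist
```

Tracking the same statistics at successive observation dates would give the temporal change (growth) mentioned above.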
- the analysis unit 33 also has a function of calculating a weight of a fish to be measured by use of data for weight calculation that are stored in the storage device 23 in advance and the calculated fork length L and body depth H.
- the data for weight calculation are data for calculating a weight of a fish, based on the fork length L and body depth H of the fish and are, for example, provided in a form of mathematical formula.
- the data for weight calculation are data generated based on a relationship among fork length, body depth, and weight that is acquired from actually measured fork lengths, body depths, and weights of fishes.
- the analysis unit 33 calculates a weight of the fish to be measured, based on data for weight calculation according to the age in month or age in year of the fish to be measured and the calculated fork length L and body depth H of the fish to be measured.
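The source does not disclose the concrete form of the data for weight calculation; purely as an illustration, one could store a fitted coefficient per age and apply an allometric-style formula. Both the formula W = a * L * H**2 and the coefficient values below are assumptions, not the patent's actual data:

```python
# hypothetical coefficients, fitted per age in years from measured fish
COEFF_BY_AGE = {1: 0.020, 2: 0.023}

def estimate_weight(fork_length, body_depth, age_years):
    # illustrative allometric-style model: W = a * L * H**2
    a = COEFF_BY_AGE[age_years]
    return a * fork_length * body_depth ** 2
```

Selecting the coefficient by age mirrors the text's use of weight-calculation data according to the age in months or years of the fish.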
- the weight of the fish to be measured which is calculated by the analysis unit 33 , and the fork length L and body depth H of the fish to be measured, which are calculated by the calculation unit 32 , are stored in the storage device 23 in association with each other and also in association with predetermined information (for example, image-capturing date and time).
- the display control unit 34 may have a function of, when, for example, the measurer inputs, by use of the input device 25 , an instruction to make the display device 26 display the calculated values, receiving the instruction, reading information to be displayed from the storage device 23 , and displaying the information on the display device 26 .
- FIG. 13 is a flowchart illustrating processing steps relating to calculation (measurement) of a fork length L, a body depth H, and a weight of a fish performed by the information processing device 10 .
- the detection unit 30 of the information processing device 10 detects a fish to be measured in captured images 41 A and 41 B (step S 101 ).
- the detection operation of the fish to be measured is detection operation based on reference data for fish body detection learned through machine learning.
- alternatively, the detection operation of the fish to be measured is detection operation based on input information that is input by a measurer by use of the input device 25 .
- the detection unit 30 may detect a fish to be measured through either of the two detection operations, namely the detection operation based on the reference data for fish body detection learned through machine learning and the detection operation based on the input information manually input by the measurer, the operation being selected based on selection information input by the measurer by use of the input device 25 .
- after having detected the fish to be measured, the detection unit 30 detects measurement-use points Pt, Pm, Pb, and Ps on the fish to be measured (step S 102 ).
- the detection operation of the measurement-use points Pt, Pm, Pb, and Ps is detection operation based on reference data for detection of measurement-use points learned through machine learning. Alternatively, the detection operation of the measurement-use points Pt, Pm, Pb, and Ps is detection operation based on input information that is manually input by the measurer.
- the detection unit 30 may detect the measurement-use points Pt, Pm, Pb, and Ps through either of the two detection operations, namely the detection operation based on the reference data for detection of measurement-use points learned through machine learning and the detection operation based on the input information manually input by the measurer, the operation being selected based on selection information input by the measurer by use of the input device 25 .
- the specification unit 31 specifies the coordinates in the coordinate space of the detected measurement-use points Pt, Pm, Pb, and Ps by use of, for example, interval information between the cameras 40 A and 40 B (image-capturing positions) or the like and a triangulation method (step S 103 ).
- the calculation unit 32 calculates, based on the specified coordinates, an interval L between the measurement-use points Pm and Pt (the mouth and the tail) as the fork length of the fish to be measured.
- the calculation unit 32 sets a straight line S connecting the measurement-use points Pm and Pt as a baseline and calculates a length h 1 of a line segment PsPss perpendicular to the baseline S and a length h 2 of a line segment PbPbs perpendicular to the baseline S. Further, the calculation unit 32 calculates an added value of the calculated lengths h 1 and h 2 as the body depth H of the fish to be measured (step S 104 ).
- the analysis unit 33 calculates, by use of the calculated fork length L and body depth H of the fish to be measured and data for weight calculation that are stored in the storage device 23 , a weight of the fish to be measured (step S 105 ).
- the fork length L, body depth H, and weight of the fish to be measured which are calculated as described above, are stored in the storage device 23 in association with each other and also in association with predetermined information (for example, image-capturing date and time).
- the control device 22 of the information processing device 10 determines whether an instruction instructing the ending of the measurement operation of a fish has been input by, for example, the measurer through operation of the input device 25 (step S 106 ). When the instruction of the ending has not been input, the control device 22 repeats the operation in step S 101 and subsequent steps. When the instruction of the ending has been input, the control device 22 terminates the measurement operation of a fish.
- the information processing device 10 of the first example embodiment has a function of the detection unit 30 detecting the measurement-use points Pt, Pm, Pb, and Ps on a fish to be measured in captured images 41 A and 41 B captured by the cameras 40 A and 40 B.
- the information processing device 10 has a function of the specification unit 31 specifying coordinates in the coordinate space representing the positions of the detected measurement-use points Pt, Pm, Pb, and Ps.
- the information processing device 10 has a function of the calculation unit 32 calculating, by use of the measurement-use points Pt and Pm, a fork length L of the fish to be measured.
- the information processing device 10 has a function of the calculation unit 32 calculating a body depth H, based on a baseline S determined based on the measurement-use points Pt and Pm, and on the measurement-use points Pb and Ps.
- the body depth H of the fish to be measured is calculated by adding the length h 1 of a line segment PsPss perpendicular to the baseline S and the length h 2 of a line segment PbPbs perpendicular to the baseline S. Calculating the body depth H in this manner enables the information processing device 10 to increase the accuracy of the body depth H to be calculated.
- the shape of a fish to be measured is not line-symmetric with respect to the baseline S determined based on the measurement-use points Pt and Pm. Therefore, the most bulging portion (the measurement-use point Pb) on the back side of the baseline S and the most bulging portion (a neighboring area of the measurement-use point Ps) on the abdomen side of the baseline S are not necessarily located on the same perpendicular to the baseline S. Because of this, there is a possibility that calculating the length of a line segment simply connecting the measurement-use points Pb and Ps as a body depth yields a value deviating by a large amount from the actual body depth of the fish.
- the calculation unit 32 separately calculates a length h 1 between the measurement-use point Ps and the baseline S and a length h 2 between the measurement-use point Pb and the baseline S, and calculates the added value of the calculated lengths h 1 and h 2 as the body depth H.
- this configuration enables the accuracy of the calculated value of the body depth to be increased.
- since the analysis unit 33 calculates a weight of the fish to be measured by use of the body depth H and the fork length L, the accuracy of which can be increased in this way, the accuracy of the calculated value of the weight can also be increased.
- the information processing device 10 of the first example embodiment is able to increase reliability for calculated values of the body depth H and weight of a fish to be measured.
- the information processing device 10 of the first example embodiment may have a configuration in which the detection unit 30 detects a fish body to be measured and measurement-use points Pt, Pm, Pb, and Ps, based on reference data for fish body detection and reference data for detection of measurement-use points learned through machine learning. Since, in this case, the measurer does not have to perform an operation of specifying a fish to be measured and an operation of specifying measurement-use points Pt, Pm, Pb, and Ps, the information processing device 10 is able to reduce the effort of the measurer. The information processing device 10 is also able to achieve speed-up of processing of measuring the fork length L, the weight, and the like of a fish.
- a second example embodiment according to the present invention will be described below.
- the same signs are assigned to constituent components having the same names as those of constituent components constituting the information processing device and the object measurement system (fish measurement system) of the first example embodiment, and redundant descriptions of the common components will be omitted.
- An information processing device 10 and a fish measurement system of the second example embodiment include a correction unit 55 illustrated in FIG. 14 in addition to the constitution of the first example embodiment.
- since the information processing device 10 of the second example embodiment includes, as with the first example embodiment, a specification unit 31 , a calculation unit 32 , an analysis unit 33 , and a display control unit 34 , illustration of these constituent components is omitted in FIG. 14 .
- illustration of a storage device 24 , an input device 25 , and a display device 26 is also omitted.
- a joint portion of the pelvic fin that is assumed to be a most bulging portion on the abdomen side of a fish is detected as a measurement-use point.
- the degree of bulging of the abdomen of a fish differs among a standard state (see a solid line A), a full stomach state (see a dotted line B), and an empty stomach state (see a dotted line C), and, in the full stomach state, a most bulging portion on the abdomen side of the fish is not a joint portion of the pelvic fin but a portion indicated by a point Ps 3 .
- when the detection unit 30 detects a joint portion of the pelvic fin on a fish in the full stomach state as a measurement-use point Ps and the calculation unit 32 calculates a body depth H by use of that measurement-use point Ps, the calculated body depth H becomes a value deviating from the actual body depth of the fish in the full stomach state. There is a concern that, when a detected fish body is in the full stomach state, the weight calculated by the analysis unit 33 becomes a value lighter than the actual weight.
- the joint portion of the pelvic fin in the standard state is a portion indicated by a point Ps 2
- the joint portion of the pelvic fin in the empty stomach state is a portion indicated by a point Ps 1 .
- the information processing device 10 of the second example embodiment has a configuration that takes the above-described situation into consideration.
- the correction unit 55 has a function of, when the shape (bulging) of a fish to be measured deviates from that in the standard state, correcting the measurement-use point Ps on the abdomen side detected by the detection unit 30 in the following manner. That is, reference data of fish bodies in the standard state, which are generated through machine learning, are stored in a storage device 23 .
- when the correction unit 55 determines, with reference to the reference data of fish bodies in the standard state, that the shape (bulging) of the abdomen of the fish body to be measured deviates from the standard state in excess of an allowable range, the correction unit 55 determines that the measurement-use point Ps is required to be corrected.
- the correction unit 55 corrects the measurement-use point Ps on the abdomen side in accordance with data for correction provided in advance.
- the data for correction are data used to correct the measurement-use point Ps according to the shape (bulging) of the abdomen of a fish to be measured.
- the data for correction include data as follows.
- One type of data included in the data for correction are data for, in the case of a state in which the shape of the abdomen bulges more than in the standard state (the full stomach state), correcting the measurement-use point Ps to a most bulging portion of the abdomen.
- Another type of data included in the data for correction are data for, in the case of a state in which the shape of the abdomen is more depressed than in the standard state (the empty stomach state), correcting the measurement-use point Ps to a most depressed portion of the abdomen.
- Such data for correction are generated through, for example, machine learning.
- the correction unit 55 corrects the measurement-use point Ps to a most bulging portion of the abdomen when the detected fish body is in the full stomach state and corrects the measurement-use point Ps to a most depressed portion of the abdomen when the detected fish body is in the empty stomach state.
- the data for correction include data as follows.
- One type of data included in the data for correction are data for, in the case of a state in which the shape of the abdomen bulges more than in the standard state (the full stomach state), correcting the measurement-use point Ps to a position (for example, a middle portion) that is appropriately set between the joint portion of the pelvic fin and a most bulging portion of the abdomen.
- Another type of data included in the data for correction are data for, in the case of a state in which the shape of the abdomen is more depressed than in the standard state (the empty stomach state), correcting the measurement-use point Ps to a position (for example, a middle portion) that is appropriately set between the joint portion of the pelvic fin and a most depressed portion of the abdomen.
- Such data for correction are generated through, for example, machine learning.
- the correction unit 55 corrects the measurement-use point Ps to a position (for example, a middle portion) that is appropriately set between the joint portion of the pelvic fin and a most bulging portion of the abdomen when the detected fish body is in the full stomach state.
- the correction unit 55 corrects the measurement-use point Ps to a position (for example, a middle portion) that is appropriately set between the joint portion of the pelvic fin and a most depressed portion of the abdomen when the detected fish body is in the empty stomach state.
- the calculation unit 32 adds the length h 1 of a line segment PsPss based on the corrected measurement-use point Ps, the line segment PsPss being perpendicular to a baseline S, and the length h 2 of a line segment PbPbs based on a measurement-use point Pb, the line segment PbPbs being perpendicular to the baseline S, and calculates the added value as the body depth H.
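The two correction variants above can be sketched as follows, assuming the stomach state and the abdomen's extreme point (most bulging or most depressed) have already been determined from the reference data; the function name, the state labels, and the blend parameter are all illustrative:

```python
def correct_measurement_point(ps_joint, ps_extreme, state, blend=1.0):
    """Return the corrected abdomen-side point Ps.

    blend=1.0 moves Ps all the way to the abdomen's most bulging /
    most depressed point (first variant); blend=0.5 uses a middle
    position between the pelvic-fin joint and that point (second
    variant)."""
    if state == "standard":
        return ps_joint  # no correction needed in the standard state
    return tuple(ps_joint[i] + blend * (ps_extreme[i] - ps_joint[i])
                 for i in range(3))
```

The corrected point would then replace Ps in the body-depth computation of step S 104.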
- constituent components of the information processing device 10 of the second example embodiment other than the above-described constituent components are the same as those in the first example embodiment.
- the information processing device 10 of the second example embodiment is capable of achieving the same advantageous effects as those of the first example embodiment. Further, the information processing device 10 has a configuration to correct the measurement-use point Ps according to the shape of the abdomen of a fish. This configuration enables the information processing device 10 to calculate a body depth H and a weight of a fish in accordance with the actual state of the fish.
- a third example embodiment according to the present invention will be described below.
- the same signs are assigned to constituent components having the same names as those of constituent components constituting the information processing device and the object measurement system (fish measurement system) of the first example embodiment, and redundant descriptions of the common components will be omitted.
- An information processing device 10 and a fish measurement system of the third example embodiment include an estimation unit 56 illustrated in FIG. 16 in addition to the constitution of the first or second example embodiment.
- since the information processing device 10 of the third example embodiment includes, as with the first or second example embodiment, a specification unit 31 , a calculation unit 32 , an analysis unit 33 , and a display control unit 34 , illustration of these constituent components is omitted in FIG. 16 .
- illustration of a storage device 24 , an input device 25 , and a display device 26 is also omitted.
- the information processing device 10 of the third example embodiment has a configuration that takes into consideration such a situation in which the measurement-use point Ps is difficult to detect.
- a detection unit 30 sometimes determines that the measurement-use point Ps cannot be detected because the pelvic fin of a fish to be measured is indistinct.
- the estimation unit 56 has a function of detecting the measurement-use point Ps by estimating the position of the joint portion of the pelvic fin in accordance with a predetermined rule.
- when there exists a certain positional relationship between the position of the pectoral fin and the position of the joint portion of the pelvic fin, a rule in accordance with which the position of the joint portion of the pelvic fin is estimated based on the position of the pectoral fin is acquired by use of the relationship and stored in the storage device 23 .
- the estimation unit 56 estimates, by use of such a rule and the position of the pectoral fin, the measurement-use point Ps.
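One way such a stored rule could be encoded, as an assumption-laden sketch: treat the pelvic-fin joint as lying at a fixed offset from the pectoral fin, expressed as a fraction of the fork length so the rule scales with fish size. The offset values and function name are hypothetical, not from the source:

```python
def estimate_ps(pectoral, Pm, Pt, rel_offset):
    # rel_offset: learned (here: assumed) displacement from the
    # pectoral fin to the pelvic-fin joint, per unit of fork length
    L = sum((Pm[i] - Pt[i]) ** 2 for i in range(3)) ** 0.5  # fork length
    return tuple(pectoral[i] + rel_offset[i] * L for i in range(3))
```

A rule based on the mouth position instead of the pectoral fin, as mentioned below, would have the same shape with a different anchor point.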
- the calculation unit 32 calculates, by use of the estimated measurement-use point Ps, a length h 1 of a line segment PsPss perpendicular to a baseline S and calculates, by use of the length h 1 , a body depth H of the fish to be measured.
- the information processing device 10 of the third example embodiment is capable of achieving the same advantageous effects as those of the first or second example embodiment by including the same configuration as that of the first or second example embodiment. Further, by including the estimation unit 56 , the information processing device 10 of the third example embodiment is capable of reducing situations in which the body depth H of a fish to be measured cannot be calculated, and the weight therefore also cannot be calculated, because the measurement-use point Ps cannot be detected; this facilitates an increase in the number of measurements.
- the rule used when the measurement-use point Ps is estimated is not limited to the afore-described rule based on a positional relationship between the pelvic fin and the pectoral fin; when there exists a certain relationship between the position of the mouth and the position of the joint portion of the pelvic fin, the rule may be a rule acquired by use of that relationship.
- information of the position of the pectoral fin may be provided to the information processing device 10 by manual input by a measurer or may, for example, be detected by the detection unit 30 by use of reference data learned through machine learning.
- the present invention may, without being limited to the first to third example embodiments, employ various example embodiments.
- although the information processing device 10 includes the analysis unit 33 , the analysis of information of a fork length L and a body depth H calculated by the calculation unit 32 may be performed by an information processing device separate from the information processing device 10 , and, in this case, the analysis unit 33 may be omitted.
- although the baseline S is a straight line between the measurement-use points Pm and Pt at the mouth and the tail, the baseline S may be, for example, a curved line by which the measurement-use points Pm and Pt are connected to each other in consideration of the bulging of the fish body.
- the information processing device 10 may perform image processing to reduce turbidity of water in captured images and image processing to correct distortion of fish bodies in captured images due to trembling of water at an appropriate timing, such as a point of time before the start of detection processing performed by the detection unit 30 .
- the information processing device 10 may perform image processing to correct captured images in consideration of image-capturing conditions, such as depth in the water at which fishes are present and the brightness of water. As described above, the information processing device 10 performing image processing (image correction) on captured images in consideration of an image-capturing environment enables reliability for detection processing performed by the detection unit 30 to be increased.
- the information processing device 10 having the constitution described in the first to third example embodiments is applicable to measurement of other objects.
- the information processing device 10 having the constitution described in the first to third example embodiments is capable of, in the case where an object to be measured does not have a line-symmetric shape with respect to a baseline set to the object, appropriately calculating, in the direction orthogonal to the baseline, the length to be measured on the object.
- in FIG. 17 , a constitution of an information processing device of another example embodiment according to the present invention is illustrated in a simplified manner.
- An information processing device 60 in FIG. 17 includes, as functional units, a detection unit 61 and a calculation unit 62 .
- the detection unit 61 detects, as a measurement-use point, a predetermined portion of the object to be used for length measurement in each of the divided areas on both sides of a baseline set in the image of the object.
- the calculation unit 62 calculates, in each of the divided areas, the length of a line segment between the measurement-use point and the intersection point of the baseline and a perpendicular that passes through the measurement-use point and is perpendicular to the baseline. Further, the calculation unit 62 calculates a length to be measured on the object by adding the lengths of the line segments each of which is calculated in one of the divided areas of the object to be measured.
- the information processing device 60 is, by including the constitution as described above, capable of increasing the accuracy of measurement values acquired by measuring the length and the like of an object to be measured, based on a captured image.
- the information processing device 60 as described above constitutes an object measurement system 70 in conjunction with an image capturing device 71 that captures an image of an object to be measured, as illustrated in FIG. 18 .
Abstract
Description
- [PTL 1] Japanese Unexamined Patent Application Publication No. 2003-250382
- [PTL 2] Japanese Unexamined Patent Application Publication No. 2013-201714
- 10, 60 Information processing device
- 30, 61 Detection unit
- 32, 62 Calculation unit
- 33 Analysis unit
- 55 Correction unit
- 56 Estimation unit
Claims (10)
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018043236 | 2018-03-09 | ||
| JP2018-043236 | 2018-03-09 | ||
| PCT/JP2019/009045 WO2019172363A1 (en) | 2018-03-09 | 2019-03-07 | Information processing device, object measurement system, object measurement method, and program storage medium |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20200394811A1 (en) | 2020-12-17 |
| US11328439B2 true US11328439B2 (en) | 2022-05-10 |
Family
ID=67845743
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/971,078 Active US11328439B2 (en) | 2018-03-09 | 2019-03-07 | Information processing device, object measurement system, object measurement method, and program storage medium |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US11328439B2 (en) |
| JP (1) | JP7001145B2 (en) |
| WO (1) | WO2019172363A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230380399A1 (en) * | 2022-05-31 | 2023-11-30 | Navico Holding As | Livewell operation and control for a watercraft |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7310925B2 (en) * | 2019-12-25 | 2023-07-19 | 日本電気株式会社 | Movement speed estimation device, feeding determination device, movement speed estimation method, and movement speed estimation program |
| JP7480984B2 (en) * | 2020-03-24 | 2024-05-10 | 国立大学法人愛媛大学 | INDIVIDUAL DETECTION SYSTEM, IMAGING UNIT, INDIVIDUAL DETECTION METHOD, AND COMPUTER PROGRAM |
| CN111696114A (en) * | 2020-04-13 | 2020-09-22 | 浙江大学 | Method and device for identifying hunger degree of penaeus vannamei based on underwater imaging analysis |
| CN111754527B (en) * | 2020-06-24 | 2022-01-04 | 西藏自治区农牧科学院水产科学研究所 | Fish phenotype automatic extraction method based on three-dimensional scanning model |
| NO347281B1 (en) * | 2020-10-05 | 2023-08-21 | Fishency Innovation As | Generating three dimensional skeleton representations of aquatic animals using machine learning |
| JP7155375B1 (en) | 2021-10-13 | 2022-10-18 | マルハニチロ株式会社 | Measurement system, information processing device, information processing method and program |
| WO2024166354A1 (en) * | 2023-02-10 | 2024-08-15 | 日本電気株式会社 | Image analysis device, imaging system, image analysis method, and recording medium |
| NO348408B1 (en) * | 2023-05-16 | 2025-01-13 | Fiskher As | Procedure for analyzing fish |
- 2019-03-07 US US16/971,078 patent/US11328439B2/en active Active
- 2019-03-07 JP JP2020505103A patent/JP7001145B2/en active Active
- 2019-03-07 WO PCT/JP2019/009045 patent/WO2019172363A1/en not_active Ceased
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2000125319A (en) * | 1998-10-15 | 2000-04-28 | Matsushita Electric Ind Co Ltd | Digital signal processing circuit |
| JP2000215319A (en) | 1999-01-22 | 2000-08-04 | Canon Inc | Image extraction method and apparatus, and storage medium |
| US6650778B1 (en) | 1999-01-22 | 2003-11-18 | Canon Kabushiki Kaisha | Image processing method and apparatus, and storage medium |
| JP2003250382A (en) | 2002-02-25 | 2003-09-09 | Matsushita Electric Works Ltd | Method and apparatus for monitoring the growth state of aquatic organisms |
| WO2009008733A1 (en) | 2007-07-09 | 2009-01-15 | Feed Control Norway As | Means and method for average weight determination and appetite feeding |
| JP2013201714A (en) | 2012-03-26 | 2013-10-03 | Central Research Institute Of Electric Power Industry | Moving-object image discrimination device and moving-object image discrimination method |
| WO2016092646A1 (en) | 2014-12-10 | 2016-06-16 | Nireco Corporation | Fish type determination device and fish type determination method |
| WO2017204660A1 (en) | 2016-05-24 | 2017-11-30 | Itecsolutions Systems & Services As | Arrangement and method for measuring the biological mass of fish, and use of the arrangement |
| US20190244346A1 (en) * | 2018-02-07 | 2019-08-08 | Analogic Corporation | Visual augmentation of regions within images |
Non-Patent Citations (3)
| Title |
|---|
| English translation of Written opinion for PCT Application No. PCT/JP2019/009045, dated May 21, 2019. |
| International Search Report for PCT Application No. PCT/JP2019/009045, dated May 21, 2019. |
| Japanese Office Communication for JP Application No. 2020-505103 dated Nov. 24, 2021 with English Translation. |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230380399A1 (en) * | 2022-05-31 | 2023-11-30 | Navico Holding As | Livewell operation and control for a watercraft |
Also Published As
| Publication number | Publication date |
|---|---|
| JP7001145B2 (en) | 2022-01-19 |
| WO2019172363A1 (en) | 2019-09-12 |
| US20200394811A1 (en) | 2020-12-17 |
| JPWO2019172363A1 (en) | 2021-02-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11328439B2 (en) | | Information processing device, object measurement system, object measurement method, and program storage medium |
| JP7188527B2 (en) | | Fish length measurement system, fish length measurement method and fish length measurement program |
| JP6981531B2 (en) | | Object identification device, object identification system, object identification method and computer program |
| US20210072017A1 (en) | | Information processing device, object measuring system, object measuring method, and program storing medium |
| US20200027231A1 (en) | | Information processing device, information processing method, and program storage medium |
| JP6694039B1 (en) | | Fish size calculator |
| US10499808B2 (en) | | Pupil detection system, gaze detection system, pupil detection method, and pupil detection program |
| US20210227144A1 (en) | | Target tracking method and device, movable platform, and storage medium |
| EP3651457B1 (en) | | Pupillary distance measurement method, wearable eye equipment and storage medium |
| US10058237B2 (en) | | Image processing device, image processing method, and program |
| WO2019045091A1 (en) | | Information processing device, counter system, counting method, and program storage medium |
| JP2019211364A (en) | | Device and method for estimating weight of body of animal |
| JP6879375B2 (en) | | Information processing equipment, length measurement system, length measurement method and computer program |
| ES3044908T3 (en) | | Weight estimation device and program |
| JP6288770B2 (en) | | Face detection method, face detection system, and face detection program |
| WO2020022309A1 (en) | | Measurement device, measurement system, measurement method, and program storage medium |
| CN115244360A (en) | | Calculation method |
| JP2019045989A (en) | | Information processing apparatus, information processing method and computer program |
| JPWO2018061928A1 (en) | | Information processing apparatus, counting system, counting method, and computer program |
| JP2017072945A (en) | | Image processing apparatus and image processing method |
| US9501840B2 (en) | | Information processing apparatus and clothes proposing method |
| JP7343237B2 (en) | | Tracking method |
| EP4130690A1 (en) | | Imaging device, imaging method, and program |
| CN110836715A (en) | | Moving body weight measurement system and moving body weight measurement method |
Legal Events
| Code | Title | Description |
|---|---|---|
| AS | Assignment | Owner name: NEC CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KITAGAWA, TAKEHARU;PIAO, JUN;REEL/FRAME:053545/0110. Effective date: 20200701 |
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | PATENTED CASE |
| MAFP | Maintenance fee payment | PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |