WO2018061928A1 - Information processing device, counter system, counting method, and program storage medium - Google Patents
- Publication number
- WO2018061928A1 (PCT/JP2017/033885)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- measured
- camera
- captured
- objects
- information processing
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06M—COUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
- G06M11/00—Counting of objects distributed at random, e.g. on a surface
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
Definitions
- the present invention relates to a technique for measuring the number of objects using a photographing apparatus.
- Patent Document 1 discloses a technique related to fish observation.
- The technique in Patent Document 1 estimates the shape and size of each part of a fish, such as the head, trunk, and tail fin, based on photographed images of the back side (or belly side) of the fish taken from above (or below) and from the side of the aquarium, together with a photographed image of the front of the head.
- The shape and size of each part of the fish are estimated using a plurality of template images provided for each part. That is, the captured image of each part is collated with the template images for that part, and the size and other attributes of each part of the fish are estimated based on known information, such as the size of the fish part in the template image that matches the captured image.
- Patent Document 2 discloses a technique for capturing fish underwater with a video camera and a still image camera and detecting fish shadows based on the captured video and still images. Patent Document 2 also shows a configuration in which the number of appearances in a captured moving image is counted for each fish type, with each fish shadow trajectory obtained from the moving image counted as one appearance.
- Patent Document 2 shows a configuration for counting fish shadow trajectories.
- However, a fish shadow trajectory may be temporarily interrupted, for example because the fish passes behind another fish. In that case a single fish produces two trajectories, and its number of appearances is counted as two.
- In other words, the number counted by the configuration shown in Patent Document 2 is the number of fish shadow trajectories and may differ from the number of fish. With the configuration of Patent Document 2, it is therefore difficult to improve the accuracy of counting the number of fish.
- a main object of the present invention is to provide a technique capable of increasing the accuracy of counting objects when the number of objects is measured using images captured by a photographing device.
- To achieve this object, an information processing apparatus of the present invention includes: a detection unit that detects a characteristic part having a predetermined characteristic of the measurement target object from a captured image in which the measurement target object is captured; and a counting unit that measures the number of characteristic parts detected in the captured image as the number of objects to be measured.
- A counting system of the present invention includes: a photographing device that photographs a photographing space in which objects to be measured exist; and an information processing device that measures the number of objects to be measured in the photographing space based on images captured by the photographing device.
- The information processing device includes: a detection unit that detects a characteristic part having a predetermined characteristic of the measurement target object from a captured image in which the measurement target object is captured; and a counting unit that measures the number of characteristic parts detected in the captured image as the number of objects to be measured.
- A counting method of the present invention includes: detecting a characteristic part having a predetermined characteristic of the measurement target object from a captured image in which the measurement target object is captured; and measuring the number of characteristic parts detected in the captured image as the number of objects to be measured.
- A program storage medium of the present invention stores a computer program that causes a computer to execute: a process of detecting a characteristic part having a predetermined characteristic of the measurement target object from a captured image in which the measurement target object is captured; and a process of measuring the number of characteristic parts detected in the captured image as the number of objects to be measured.
- The main object of the present invention is also achieved by the counting method of the present invention corresponding to the information processing apparatus of the present invention, by the computer program corresponding to the information processing apparatus and the counting method, and by the program storage medium storing that computer program.
- According to the present invention, the accuracy of object counting can be increased when the number of objects is measured using images captured by an imaging device.
- FIG. 1 is a block diagram showing a simplified configuration of an information processing apparatus according to the first embodiment of the present invention.
- FIG. 2 is a block diagram showing a simplified configuration of the counting system including the information processing apparatus of the first embodiment.
- FIG. 3 is a block diagram showing a simplified configuration of the counting system of the second embodiment according to the present invention.
- FIG. 4 is a diagram explaining the arrangement of the camera in the second embodiment.
- FIG. 9 is a diagram illustrating an example of reference data used in the detection process by the detection unit according to the second embodiment, following FIG. 8.
- FIG. 10 is a diagram explaining the detection process of the detection unit in the second embodiment.
- FIG. 11 is a flowchart showing an operation example related to counting by the control device in the second embodiment.
- FIG. 12 is a diagram explaining the arrangement of the camera in the third embodiment.
- FIG. 15 is a diagram illustrating an example of reference data used in the detection process by the detection unit according to the third embodiment, following FIG. 14.
- FIG. 16 is a diagram showing an example of other reference data.
- FIG. 17 is a diagram illustrating an example of other reference data, following FIG. 16.
- FIG. 18 is a diagram explaining the arrangement of the cameras in the fourth embodiment.
- FIG. 27 is a diagram illustrating still another example of reference data, following FIG. 26.
- FIG. 1 is a block diagram showing a simplified configuration of the information processing apparatus according to the first embodiment of the present invention.
- the information processing apparatus 1 according to the first embodiment constitutes a counting system 5 together with the photographing apparatus 6.
- the photographing device 6 is arranged in a state of photographing a photographing space where an object to be measured exists.
- the information processing apparatus 1 includes a detection unit 2 and a counting unit 3 as functional units.
- the detection unit 2 has a function of detecting a characteristic part having a predetermined characteristic in the measurement target object from a captured image in which the measurement target object is captured.
- the counting unit 3 has a function of measuring the number of feature parts detected by the detection unit 2 in the captured image as the number of objects to be measured.
- Since the information processing apparatus 1 according to the first embodiment detects a characteristic part of each measurement target object in a captured image and measures the number of detected characteristic parts as the number of measurement target objects, it can increase the accuracy of counting objects with a simple configuration.
- FIG. 3 is a block diagram showing a simplified configuration of the counting system according to the second embodiment of the present invention.
- The counting system 10 in the second embodiment is a system that measures the number of fish, which are the objects to be measured, in a fish preserve.
- the counting system 10 includes a camera 11 that is a photographing device and an information processing device 12.
- The camera 11 has a waterproof function and is arranged in a fish preserve 25 where fish 26 are cultivated, as shown in FIG.
- The camera 11 is disposed at a position close to the water bottom, substantially in the center of the cross section of the fish preserve 25 parallel to the water surface. The lens of the camera 11 arranged at this position faces the water surface (upward).
- The camera 11 has a shooting range (field of view) as shown in FIG. That is, the shooting range of the camera 11 is a range that covers the inside of the fish preserve 25, in consideration of the size of the fish preserve 25.
- Each camera 11 is supported and fixed on a metal plate 20, which is a support member, as shown in FIG. 4, so that its lens faces upward (toward the upper side of the substrate surface of the metal plate 20).
- The metal plate 20 on which the camera 11 is supported and fixed is connected to a buoy 22, which is a floating body, by a plurality of ropes 21 (four in this example), which are linear members. The ends of the ropes 21 opposite the buoy 22 are connected to a weight 23.
- When the camera arrangement structure configured in this way is put into the water (the fish preserve 25), the buoy 22 floats on the water surface and the weight 23 sinks to the water bottom, so that the metal plate 20 connected by the ropes 21 is suspended in the water. Moreover, the weight 23 prevents the position of the metal plate 20 (in other words, of the camera 11) from changing greatly.
- This method of disposing the camera 11 in the water using the metal plate 20, the ropes 21, the buoy 22, and the weight 23 has a simple structure and can easily be made small and light, which makes it easy to move the camera 11 to another fish preserve 25.
- The camera 11 disposed in the water in this way photographs the fish 26 in the fish preserve 25 from the ventral side.
- The fish 26 in the image captured by the camera 11 appear as shown in the image diagram of FIG.
- The camera 11 is a photographing device having a function of capturing moving images. However, a photographing device without a moving-image function, for example one that intermittently captures still images at set time intervals, may be adopted as the camera 11.
- The calibration of the camera 11 is performed by an appropriate calibration method in consideration of the environment of the fish preserve 25, the type of fish to be measured, and so on; a description of the calibration method is omitted here.
- As the methods for starting and stopping shooting by the camera 11, appropriate methods are employed in consideration of the performance of the camera 11 and the environment of the fish preserve 25.
- For example, the observer manually starts shooting before the camera 11 enters the fish preserve 25 and manually stops shooting after the camera 11 leaves the fish preserve 25.
- Alternatively, when the camera 11 has a wireless or wired communication function, it may be connected to an operation device that can transmit information for controlling the start and stop of shooting, and the observer may control the start and stop of shooting by the underwater camera 11 by operating that device.
- A photographed image captured by the camera 11 as described above may be taken into the information processing apparatus 12 by wired or wireless communication, or may first be stored in a portable storage medium and then loaded into the information processing apparatus 12 from that medium.
- the information processing apparatus 12 includes a control device 13 and a storage device 14.
- The information processing apparatus 12 is connected to, for example, an input device 16 (for example, a keyboard or a mouse) through which an observer (measurer) inputs information to the information processing apparatus 12, and a display device 17 that displays information. Further, the information processing apparatus 12 may be connected to an external storage device 15 that is separate from the information processing apparatus 12.
- the storage device 14 has a function of storing various data and computer programs (hereinafter also referred to as programs), and is realized by a storage medium such as a hard disk device or a semiconductor memory.
- The number of storage devices 14 provided in the information processing device 12 is not limited to one; a plurality of types of storage devices may be provided, in which case the plurality of storage devices are collectively referred to as the storage device 14.
- the storage device 15 has a function of storing various data and computer programs, and is realized by a storage medium such as a hard disk device or a semiconductor memory.
- When the information processing device 12 is connected to the storage device 15, appropriate information is stored in the storage device 15. In this case, the information processing apparatus 12 executes processes of writing information to and reading information from the storage device 15 as appropriate, but descriptions relating to the storage device 15 are omitted below.
- Each image captured by the camera 11 is stored in the storage device 14 in association with information on the shooting conditions, such as information identifying the camera that captured the image and the shooting time.
- the control device 13 is composed of, for example, a CPU (Central Processing Unit).
- The control device 13 can provide the following functions by having its CPU execute a computer program stored in the storage device 14. That is, the control device 13 includes a detection unit 30, a counting unit 31, and a display control unit 32 as functional units.
- The display control unit 32 has a function of controlling the display operation of the display device 17. For example, when the display control unit 32 receives a request from the input device 16 to reproduce a captured image of the camera 11, it reads the requested captured image from the storage device 14 and causes the display device 17 to display it.
- The detection unit 30 executes the following process when it detects that a request for measuring the number of fish has been input through the observer's operation of the input device 16 during reproduction of a captured image of the camera 11. That is, the detection unit 30 has a function of detecting a characteristic part having a predetermined characteristic of the fish, which are the objects to be measured, in the image captured by the camera 11. In the second embodiment, the head of the fish is set as the characteristic part. There are various methods for detecting the fish head, which is the characteristic part, from the image captured by the camera 11; an appropriate method is used in consideration of the processing capability of the information processing apparatus 12. For example, the following method is available.
- For the head of the type of fish to be measured, a plurality of pieces of reference data that differ in fish orientation, distance from the camera 11, and so on, as shown in FIGS. 8 and 9, are stored in the storage device 14.
- The reference data in FIGS. 8 and 9 are images of the fish head viewed from the belly side of the fish.
- These reference data are created by machine learning, using as teacher data (teacher images) regions containing the characteristic head part that are extracted from a large number of photographed images of the type of fish to be measured.
- The detection unit 30 detects the characteristic part in the captured image as follows, using the reference data of the characteristic part (fish head) read from the storage device 14. For example, the detection unit 30 moves a survey image range having a predetermined shape and size from the upper left end of the photographed image 46 toward the upper right end at set intervals, as shown in FIG. 10, and compares the image within the survey image range at each position with the reference data. The detection unit 30 then determines the degree of matching (similarity) between the image within the survey image range and the reference data, for example by a method used in template matching. When the degree of matching is equal to or higher than a threshold value (for example, 90%), the detection unit 30 determines that the image represents a characteristic part.
- After the survey image range reaches the right end, the degree of matching is judged at each set interval in the same manner while the survey image range is moved to the right from a position one step below the upper left end of the captured image 46.
- In this way, the detection unit 30 detects the characteristic parts (heads) of the objects to be measured (fish) in the image captured by the camera 11 by determining the degree of matching between the reference data and the image within the survey image range while scanning that range across the captured image.
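As a rough sketch of the scanning and matching described above: the patent does not fix a particular similarity measure, so normalized cross-correlation stands in for the template-matching score here, and all function names, the stride, and the threshold are illustrative assumptions.

```python
import numpy as np

def match_score(window: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between a window and a template (1.0 = identical)."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    if denom == 0.0:
        return 0.0  # flat window (e.g. empty water) cannot match
    return float((w * t).sum() / denom)

def detect_heads(image: np.ndarray, template: np.ndarray,
                 stride: int = 4, threshold: float = 0.9) -> list:
    """Move the survey window over the image left-to-right, top-to-bottom,
    and report top-left corners (x, y) whose match degree meets the threshold."""
    th, tw = template.shape
    hits = []
    for y in range(0, image.shape[0] - th + 1, stride):
        for x in range(0, image.shape[1] - tw + 1, stride):
            if match_score(image[y:y + th, x:x + tw], template) >= threshold:
                hits.append((x, y))
    return hits
```

In the actual system, one such scan would be run per piece of reference data (per orientation and distance), and the resulting hit rectangles correspond to the frames 48 shown in FIG. 10.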
- the characteristic part detected in the captured image 46 is represented by a frame 48.
- the detection result by such a detection unit 30 is displayed on the display device 17 by the display control unit 32.
- The counting unit 31 has a function of measuring the number of characteristic parts detected by the detection unit 30 in the image captured by the camera 11 as the number of objects (fish) to be measured. The counting unit 31 also has a function of storing the information on the number of objects to be measured in the storage device 14 in association with information on the measured captured image (for example, identification information of the captured image).
- The counting unit 31 may have a function of correcting the count value after counting the number of characteristic parts. For example, some fish may not be counted by the counting unit 31 because they are hidden behind other fish and do not appear in the captured image. The relationship between the number of such uncounted fish and the number of fish counted by the counting unit 31 is obtained beforehand, for example by observation or simulation, and the relationship data is stored in the storage device 14 as correction data. After counting the number of characteristic parts, the counting unit 31 may correct the count value with the correction data and output the corrected value as the number of objects to be measured.
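The correction described above can be sketched as follows, under the simplifying assumption that the correction data reduce to a single ratio of hidden (uncounted) fish to counted fish; the function name and the rounding are illustrative, not specified in the patent.

```python
def corrected_count(raw_count: int, hidden_ratio: float) -> int:
    """Adjust a raw head count upward for fish assumed hidden behind others.

    hidden_ratio: ratio of uncounted (occluded) fish to counted fish,
    obtained beforehand by observation or simulation.
    """
    return round(raw_count * (1.0 + hidden_ratio))
```

For example, if observation suggests that one additional fish goes uncounted for every four counted, `corrected_count(80, 0.25)` yields 100.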
- The counting system 10 of the second embodiment is configured as described above. Next, an example of the counting process of the information processing apparatus 12 in the second embodiment is described based on the flowchart of FIG. 11.
- When the detection unit 30 of the information processing device 12 detects that a request for counting the objects to be measured has been input through the observer's operation of the input device 16 (step S101), it acquires the image captured by the camera 11 from the storage device 14. The detection unit 30 then detects the characteristic parts of the objects to be measured in the captured image.
- As described above, the detection unit 30 and the counting unit 31 of the information processing device 12 have functions of detecting the characteristic parts of the objects to be measured in the captured image and measuring the number of characteristic parts as the number of objects to be measured. The information processing apparatus 12 can thereby improve the accuracy of counting objects with a simple configuration.
- The counting system 10 in the third embodiment differs from that of the second embodiment in the arrangement of the camera 11, which is the photographing apparatus, and otherwise has the same configuration as the counting system 10 of the second embodiment.
- FIG. 12 is a perspective view schematically showing the arrangement of the camera 11 in the third embodiment. That is, the camera 11 is fixed sideways on the metal plate 20 such that its lens is parallel to the substrate surface of the metal plate 20, which is a support member. In other words, when put into the water (the fish preserve 25), the camera 11 faces a horizontal direction parallel to the water surface. The depth position (height position) of the camera 11 in the water is roughly midway between the water bottom and the water surface, and the camera 11 is disposed at the peripheral edge of the fish preserve 25.
- FIG. 13 is a diagram illustrating the relationship between the shooting range of the camera 11 and the fish preserve 25 when the fish preserve 25 is viewed from above.
- The reference data used in the detection process by the detection unit 30 in the information processing apparatus 12 are sample images of a fish head viewed from the side, as shown in FIGS. 14 and 15.
- In the third embodiment as well, the characteristic parts of the objects to be measured are detected from the captured image and their number is calculated as the number of objects to be measured, so the same effect as in the second embodiment can be obtained.
- In the second and third embodiments, the head of a fish is taken as an example of the characteristic part of the object to be measured. However, the characteristic part of the object to be measured may instead be the tail of a fish. In this case, the reference data used by the detection unit 30 in the detection process are, for example, sample images of a fish tail viewed from the side, as shown in FIGS. 16 and 17.
- The counting system 10 according to the fourth embodiment differs from those of the second and third embodiments in the arrangement of the cameras, which are the photographing devices, and otherwise has substantially the same configuration as the counting system 10 of the second or third embodiment.
- FIG. 18 is a diagram schematically showing the arrangement of cameras in the fourth embodiment.
- The counting system 10 of the fourth embodiment includes a plurality of imaging devices (two in the example of FIG. 18), namely the cameras 11 and 40.
- The cameras 11 and 40 are disposed at an interval in the depth direction (height direction) in a state where they are put into the fish preserve 25 where the fish 26 are cultured.
- the camera 11 is disposed at a position close to the water bottom, and the camera 40 is disposed at a substantially intermediate portion between the water bottom and the water surface.
- The positions of the cameras 11 and 40 in the cross section of the fish preserve 25 parallel to the water surface are substantially in the center.
- the direction of the lenses of the cameras 11 and 40 arranged at such positions is the direction (upward) facing the water surface.
- The shooting ranges (fields of view) of the cameras 11 and 40 are ranges that cover the inside of the fish preserve 25, in consideration of the size of the fish preserve 25.
- The cameras 11 and 40 arranged in this way photograph the fish 26 in the fish preserve 25 from the ventral side, as in the second embodiment.
- As the method for arranging the cameras 11 and 40 in the water, the same method as described in the second embodiment can be adopted; that is, the cameras 11 and 40 are disposed in the water using the metal plate 20, the ropes 21, the buoy 22, and the weight 23. Note that the method of disposing the cameras 11 and 40 in the water is not limited to one using the ropes 21; for example, rod-shaped members may be used instead of the ropes 21.
- Since a plurality of cameras 11 and 40 are provided in the fourth embodiment, a plurality of captured images, one from each of the cameras 11 and 40, are obtained.
- the detection unit 30 and the counting unit 31 in the information processing apparatus 12 execute processing for each of the obtained plurality of captured images.
- The counting unit 31 sums the numbers of objects measured from the images captured by the cameras 11 and 40 and calculates the total as the number of objects to be measured.
- The information processing apparatus 12 uses a photographed image of the camera 11 and a photographed image of the camera 40 that were captured at the same time. In consideration of this, in order to make it easy to obtain simultaneously captured images, it is preferable to have the cameras 11 and 40 capture in common a mark used for time adjustment (synchronization) during shooting. For example, as such a mark, light emitted for a short time under automatic control or manually by the observer may be captured by both cameras 11 and 40; alternatively, the cameras 11 and 40 may capture a sound together with the images as the synchronization mark.
- Alternatively, the built-in clocks of the cameras 11 and 40 may be synchronized, and each camera may associate the time information of its built-in clock with its captured images.
- Because information usable for synchronization is included in or associated with the images captured by the cameras 11 and 40, the captured images of the cameras 11 and 40 can easily be synchronized.
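One simple way to pair simultaneously captured frames, assuming each frame carries a timestamp on the common synchronized clock, is a nearest-neighbor match within a tolerance. This is an illustrative sketch, not a procedure from the patent; the tolerance value and names are assumptions.

```python
from bisect import bisect_left

def pair_frames(times_a: list, times_b: list, tolerance: float = 0.05) -> list:
    """For each frame time of camera A, find the closest frame time of camera B
    (both lists sorted, in seconds on the common synchronized clock).
    Pairs farther apart than `tolerance` are dropped."""
    pairs = []
    for ta in times_a:
        i = bisect_left(times_b, ta)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times_b)]
        if not candidates:
            continue
        j = min(candidates, key=lambda j: abs(times_b[j] - ta))
        if abs(times_b[j] - ta) <= tolerance:
            pairs.append((ta, times_b[j]))
    return pairs
```

A frame of camera 11 with no counterpart from camera 40 within the tolerance (for example, a dropped frame) is simply excluded from paired processing.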
- The cameras 11 and 40 are arranged such that their shooting ranges partially overlap.
- The correction unit 34 has a function of correcting the number of objects counted by the counting unit 31, based on the number of objects to be measured that are assumed to have been counted twice in the overlapping region of the shooting ranges of the cameras 11 and 40.
- The ratio obtained from such data is stored in the storage device 14 as a thinning rate (for example, 0.5 (50%)).
- Alternatively, the observer may determine the thinning rate based on experiments and the observer's experience.
- The thinning rate may also be calculated by machine learning that uses images previously captured by the cameras 11 and 40 (for example, moving images of fish movement) as teacher data.
- In addition, a thinning rate suited to the shooting environment may be calculated. There are thus various methods for obtaining the thinning rate, and an appropriate method is used.
- The correction unit 34 calculates a correction number by multiplying the number of objects counted by the counting unit 31 in the image captured by the camera 11 by the thinning rate stored in the storage device 14. The correction unit 34 then corrects the count by subtracting the calculated correction number from the number that the counting unit 31 counted as the number of objects to be measured.
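The subtraction described above amounts to the following arithmetic (the function name is hypothetical, and whether the correction number is rounded is not specified in the patent):

```python
def correct_for_overlap(count_cam11: int, count_cam40: int,
                        thinning_rate: float) -> int:
    """Sum the per-camera counts, then subtract the portion of camera 11's
    count assumed to have been counted again by camera 40 in the
    overlapping shooting range."""
    total = count_cam11 + count_cam40            # counting unit's total
    correction_number = round(count_cam11 * thinning_rate)
    return total - correction_number
```

With a thinning rate of 0.5, counts of 60 (camera 11) and 50 (camera 40) give 110 - 30 = 80.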
- The correction unit 34 also has a function of storing the corrected information on the number of objects to be measured in the storage device 14.
- the correction unit 34 further has a function of causing the display control unit 32 to display the corrected number of objects to be measured on the display device 17.
- In the counting system 10 of the fourth embodiment, the plurality of cameras 11 and 40 are arranged in the depth direction, which provides the following effect. For example, suppose an object appears unclear in the image captured by the camera 11 because it is far from the camera 11. In this case, since the object is also photographed by the camera 40, which is closer to it, the object appears more clearly in the image captured by the camera 40 than in the image captured by the camera 11. Because the objects to be measured can be imaged clearly in this way, the information processing apparatus 12 can increase the measurement accuracy of the counting function that uses the captured images.
- In addition, in the information processing apparatus 12, the number of objects counted redundantly in the overlapping shooting ranges is corrected by the correction unit 34. The information processing apparatus 12 can thereby prevent problems caused by using the captured images of the plurality of cameras 11 and 40.
- The counting system 10 according to the fifth embodiment differs from that of the fourth embodiment in the arrangement of the cameras, which are the photographing devices, and otherwise has substantially the same configuration as the counting system 10 of the fourth embodiment.
- FIG. 21 is a perspective view schematically showing an arrangement form of the camera 50 in the fifth embodiment.
- FIG. 22 is a model diagram showing the arrangement of the cameras 50 in FIG. 21 as viewed from above.
- The cameras 50 are disposed in the water (the fish preserve 25) at a depth position Dt close to the water bottom, at a depth position Db close to the water surface, and at depth positions that divide the interval between the positions Db and Dt substantially equally.
- Four cameras 50 are disposed at each of these depth positions (height positions), facing four different directions along the water surface.
- In the fifth embodiment, the plurality of cameras 50 arranged as shown in FIG. 21 are disposed at the center of the fish preserve 25, as in the fourth embodiment.
- the detection unit 30 detects the characteristic part (head) of the measurement target object (fish) in the captured image of each camera 50.
- The reference data that the detection unit 30 uses in the characteristic-part detection process are, for example, the sample images of a fish head viewed from the side shown in FIGS. 14 and 15.
- the counting unit 31 measures the number of characteristic parts detected in the captured image of each camera 50, sums the measured number in each captured image, and calculates the total value as the number of objects to be measured.
- an overlapping shooting range in the cameras 50 adjacent in the horizontal direction is, for example, a region W in FIG.
- the correction unit 34 corrects the number of objects to be measured calculated by the counting unit 31 in consideration of the number of objects to be measured that are redundantly measured in such overlapping imaging range regions.
- the counting system 10 of the fifth embodiment can further increase the counting accuracy of the object to be measured by increasing the number of cameras 50 arranged as compared to the fourth embodiment.
- In the fifth embodiment, the plurality of cameras 50 are arranged at the center of the fish preserve 25; however, they may instead be arranged at the peripheral edge of the fish preserve 25 (at a corner in the example of FIG. 24).
- one camera 50 is disposed at each different depth position.
- FIG. 25 is a model diagram, viewed from the water surface side, of the relationship between the cameras 50 and the fish preserve 25 in FIG. 24.
- as shown in FIG. 25, the lenses of the cameras 50 at each depth position all face the same horizontal direction, toward the center of the fish preserve 25.
- the cameras 50 may also be arranged in other ways.
- as in the fifth embodiment, the images captured by the cameras 50 are taken from the side of the fish 26 in the fish preserve 25.
- in the fourth embodiment the cameras 11 and 40 are arranged in two stages as shown in FIG. 18, and in the fifth embodiment the cameras 50 are arranged in four stages as shown in FIG. 21.
- when cameras are arranged in a plurality of stages, the number of stages may be three, or five or more, depending on the depth range over which the number of fish is counted.
- image processing for correcting the distortion of the fish body due to the fluctuation of water may be performed.
- image processing may be performed in which a captured image is corrected in consideration of imaging conditions such as the water depth and brightness of the object.
- the information processing apparatus 12 performs image processing (image correction) on the captured image in consideration of the imaging environment, so that the counting accuracy of the object to be measured can be further increased.
- by using captured images that have undergone such image correction, the information processing apparatus 12 can also obtain the effect of reducing the number of reference data items required.
- the detection unit 30 in the information processing apparatus 12 detects a fish head as a characteristic part of an object to be measured.
- the detection unit 30 may detect a side surface portion of a fish.
- the reference data used by the detection unit 30 in the detection process is, for example, a sample image of the side surface portion of the fish as shown in FIGS.
- the reference data shown in FIGS. 14 to 17 and used by the detection unit 30 are examples, and the detection unit 30 may use a larger set of reference data.
- the reference data for the head or tail of the fish may also include data that is not itself a detection target, such as the cut-off images in FIG. 27 (that is, image data in which part of the target part lies outside the shooting range).
- data representing a bent fish tail, taking into account the bending of the fish body as shown in FIG. 27, may also be included as reference data.
- the counting system 10 with the configurations described in the second to fifth embodiments is not limited to fish; it is also applicable to counting other objects.
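The multi-camera tallying and overlap handling described in the bullets above (the counting unit 31 summing per-camera detections, and the correction unit 34 subtracting heads measured redundantly in an overlapping range such as region W) can be sketched as follows. This is an illustrative model under assumed names, not the patent's implementation.

```python
# Illustrative sketch: sum the head counts from all cameras 50, then
# subtract heads that were detected redundantly inside overlapping
# shooting ranges (e.g. region W), so each fish is counted once.
# Function and variable names are assumptions for illustration.
def total_fish_count(per_camera_head_counts, redundant_in_overlap):
    raw_total = sum(per_camera_head_counts)   # role of counting unit 31
    return raw_total - redundant_in_overlap   # role of correction unit 34

# Four cameras at one depth position; three heads seen by two cameras.
total = total_fish_count([120, 98, 110, 105], redundant_in_overlap=3)
```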
Abstract
To increase the accuracy of an object count when the number of objects is measured using an image captured by a photographing device, an information processing device 1 is provided with a detection unit 2 and a counting unit 3. The detection unit 2 detects, from a captured image in which an object to be measured is captured, a characteristic part of the object that has a predetermined characteristic. The counting unit 3 measures the number of characteristic parts detected in the captured image by the detection unit 2 as the number of objects to be measured.
Description
The present invention relates to a technique for measuring the number of objects using a photographing device.
To improve fish farming techniques, the growth of farmed fish is observed. Patent Document 1 discloses a technique related to fish observation. In the technique of Patent Document 1, the shape and size of parts such as the head, trunk, and tail fin of a fish are estimated part by part, based on captured images of the back side (or belly side) of the fish taken from above (or below) and from the side of the aquarium, and on a captured image of the front of the head. The shape and size of each part of the fish are estimated using a plurality of template images provided for each part. That is, the captured image of each part is collated with the template images for that part, and the size and other attributes of each part of the fish are estimated based on known information, such as the size of the fish part in the template image that matches the captured image.
Patent Document 2 discloses a technique in which fish in water are photographed by a video camera and a still-image camera, and fish shadows are detected based on the captured video and still images. Patent Document 2 also shows a configuration in which the number of appearances in the captured video is counted for each fish type, with one appearance counted per fish-shadow trajectory obtained from the video.
Patent Document 2 thus shows a configuration for counting fish-shadow trajectories. In this configuration, when a fish-shadow trajectory is temporarily interrupted, for example because the fish passes behind another fish, and the fish then reappears, a second trajectory is produced even though it is the same fish, so the number of appearances becomes two. In other words, the number of fish shadows counted by the configuration of Patent Document 2 is the number of trajectories, which may differ from the number of fish; with the configuration of Patent Document 2, it is therefore difficult to improve the accuracy of counting fish.
The present invention has been devised to solve the above problem. That is, a main object of the present invention is to provide a technique capable of increasing the accuracy of object counting when the number of objects is measured using images captured by a photographing device.
To achieve the above object, an information processing apparatus of the present invention comprises:
a detection unit that detects, from a captured image in which an object to be measured is captured, a characteristic part of the object having a predetermined characteristic; and
a counting unit that measures the number of characteristic parts detected in the captured image as the number of objects to be measured.
A counting system of the present invention comprises:
a photographing device that photographs a photographing space in which objects to be measured exist; and
an information processing apparatus that measures the number of objects to be measured in the photographing space based on images captured by the photographing device,
wherein the information processing apparatus comprises:
a detection unit that detects, from a captured image in which an object to be measured is captured, a characteristic part of the object having a predetermined characteristic; and
a counting unit that measures the number of characteristic parts detected in the captured image as the number of objects to be measured.
A counting method of the present invention comprises:
detecting, from a captured image in which an object to be measured is captured, a characteristic part of the object having a predetermined characteristic; and
measuring the number of detected characteristic parts in the captured image as the number of objects to be measured.
A program storage medium of the present invention stores a computer program that causes a computer to execute:
a process of detecting, from a captured image in which an object to be measured is captured, a characteristic part of the object having a predetermined characteristic; and
a process of measuring the number of detected characteristic parts in the captured image as the number of objects to be measured.
The above main object of the present invention is also achieved by the counting method of the present invention corresponding to the information processing apparatus of the present invention. It is likewise achieved by the computer program corresponding to the information processing apparatus and the counting method of the present invention, and by the program storage medium storing that computer program.
According to the present invention, the accuracy of object counting can be increased when the number of objects is measured using images captured by a photographing device.
Embodiments of the present invention will be described below with reference to the drawings.
<First Embodiment>
FIG. 1 is a block diagram showing a simplified configuration of an information processing apparatus according to a first embodiment of the present invention. As shown in FIG. 2, the information processing apparatus 1 of the first embodiment constitutes a counting system 5 together with a photographing device 6. The photographing device 6 is arranged so as to photograph a photographing space in which objects to be measured exist.
The information processing apparatus 1 includes a detection unit 2 and a counting unit 3 as functional units. The detection unit 2 has a function of detecting, from a captured image in which an object to be measured is captured, a characteristic part of the object having a predetermined characteristic. The counting unit 3 has a function of measuring the number of characteristic parts detected by the detection unit 2 in the captured image as the number of objects to be measured.
Since the information processing apparatus 1 of the first embodiment detects the characteristic parts of the objects to be measured in a captured image and measures the number of detected characteristic parts as the number of objects to be measured, it can increase the accuracy of object counting with a simple configuration.
<Second Embodiment>
A second embodiment of the present invention will be described below.
FIG. 3 is a block diagram showing a simplified configuration of a counting system according to the second embodiment of the present invention. The counting system 10 of the second embodiment is a system that measures the number of fish, the objects to be measured, in a fish preserve. The counting system 10 includes a camera 11, which is a photographing device, and an information processing apparatus 12.
The camera 11 has a waterproof function and, as shown in FIG. 5, is disposed in a fish preserve 25 in which fish 26 are farmed. In the second embodiment, the camera 11 is disposed at a position close to the water bottom. The position of the camera 11 in a cross section parallel to the water surface of the fish preserve 25 is approximately at the center. The lens of the camera 11 arranged at this position faces the water surface (upward). The camera 11 has a shooting range (field of view) as shown in FIG. 6. That is, considering the size of the fish preserve 25, the shooting range of the camera 11 is set so that the side faces of the fish preserve 25 can also be photographed.
The method for disposing the camera 11 in the water is not particularly limited; one example is described below. The camera 11 is supported and fixed on a metal plate 20, a support member as shown in FIG. 4, so that the lens faces upward (toward the upper side of the plate surface of the metal plate 20). The metal plate 20 on which the camera 11 is fixed is connected to a buoy 22, a floating body, by a plurality of (four) ropes 21, which are linear members. The end of each rope 21 opposite the buoy 22 is connected to a weight 23.
When the camera arrangement structure configured in this way is put into the water (fish preserve 25), the buoy 22 floats on the water surface and the weights 23 sink toward the bottom, so that the metal plate 20 is suspended in the water by the ropes 21. The weights 23 also prevent the position of the metal plate 20 (in other words, of the camera 11) from shifting greatly. This method of disposing the camera 11 in the water with the metal plate 20, ropes 21, buoy 22, and weights 23 is simple in structure and easy to make compact and lightweight, which makes it easy to move the camera 11 to another fish preserve 25.
The camera 11 disposed in the water in this way photographs the fish 26 in the fish preserve 25 from the ventral side. For example, the fish 26 in an image captured by the camera 11 appear as in the image diagram of FIG. 7.
The camera 11 is a photographing device having a function of capturing moving images; however, a photographing device without a moving-image function, for example one that intermittently captures still images at set time intervals, may be adopted as the camera 11. The camera 11 is calibrated by an appropriate calibration method that takes into account the environment of the fish preserve 25, the type of fish to be measured, and so on. A description of the calibration method is omitted here.
Furthermore, appropriate methods for starting and stopping shooting by the camera 11 are adopted in consideration of the performance of the camera 11, the environment of the fish preserve 25, and so on. For example, an observer of the fish manually starts shooting before the camera 11 is lowered into the fish preserve 25 and manually stops shooting after the camera 11 is withdrawn from it. When the camera 11 has a wireless or wired communication function, the camera 11 may be connected to an operation device that can transmit information for controlling the start and stop of shooting, and the start and stop of shooting by the underwater camera 11 may be controlled by the observer operating that device.
Images captured by the camera 11 as described above may be taken into the information processing apparatus 12 by wired or wireless communication, or may be stored on a portable storage medium and then taken into the information processing apparatus 12 from that medium.
As shown in FIG. 3, the information processing apparatus 12 roughly comprises a control device 13 and a storage device 14. The information processing apparatus 12 is connected to an input device (for example, a keyboard or a mouse) 16, with which an observer (measurer) inputs information to the information processing apparatus 12, and to a display device 17 that displays information. The information processing apparatus 12 may also be connected to an external storage device 15 separate from the information processing apparatus 12.
The storage device 14 has a function of storing various data and computer programs (hereinafter also referred to as programs), and is realized by a storage medium such as a hard disk device or a semiconductor memory. The storage device 14 provided in the information processing apparatus 12 is not limited to one; multiple types of storage devices may be provided, in which case they are collectively referred to as the storage device 14. Like the storage device 14, the storage device 15 has a function of storing various data and computer programs, and is realized by a storage medium such as a hard disk device or a semiconductor memory. When the information processing apparatus 12 is connected to the storage device 15, appropriate information is stored in the storage device 15. In this case, the information processing apparatus 12 executes processes of writing information to and reading information from the storage device 15 as appropriate, but description of the storage device 15 is omitted below.
In the second embodiment, images captured by the camera 11 are stored in the storage device 14 in association with information about the shooting situation, such as information identifying the camera that captured them and information about the shooting time.
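As a sketch of the association just described, a stored frame might be represented as a record carrying the camera and shooting-time metadata alongside the image data. The field names here are illustrative assumptions, not part of the patent.

```python
# Illustrative record for storage device 14: a captured image stored
# together with shooting-situation information (capturing camera and
# shooting time). Field names are assumptions.
from dataclasses import dataclass

@dataclass
class CapturedImageRecord:
    image_id: str    # identification information of the captured image
    camera_id: str   # which camera captured the frame
    shot_at: str     # shooting time, e.g. ISO 8601 text
    pixels: bytes    # raw image data

record = CapturedImageRecord("img-0001", "camera-11",
                             "2017-09-20T10:15:00", b"")
```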
The control device 13 is composed of, for example, a CPU (Central Processing Unit). The control device 13 can have the following functions, for example by the CPU executing a computer program stored in the storage device 14. That is, the control device 13 includes a detection unit 30, a counting unit 31, and a display control unit 32 as functional units.
The display control unit 32 has a function of controlling the display operation of the display device 17. For example, when the display control unit 32 receives from the input device 16 a request to reproduce an image captured by the camera 11, it reads the requested captured image from the storage device 14 and displays it on the display device 17.
When the detection unit 30 detects, for example during reproduction of an image captured by the camera 11, that a request to measure the number of fish has been input by the observer operating the input device 16, it executes the following processing. That is, the detection unit 30 has a function of detecting, in the image captured by the camera 11, a characteristic part having a predetermined characteristic of the fish, which are the objects to be measured. In the second embodiment, the head of the fish is set as the characteristic part. There are various methods for detecting the fish head, the characteristic part, from the image captured by the camera 11; here, an appropriate method is adopted in consideration of the processing capability of the information processing apparatus 12 and so on. One example is as follows.
For example, for the head of the type of fish to be measured, a plurality of reference data items (reference part images) that differ in the orientation of the fish, the distance from the camera 11, and so on, as shown in FIGS. 8 and 9, are stored in the storage device 14. The reference data in FIGS. 8 and 9 are images of the fish head viewed from the ventral side. These reference data are created by machine learning: images of the regions in which the characteristic head part appears are extracted as teacher data (teacher images) from many captured images of the type of fish to be measured, and the reference data are created by machine learning using that teacher data.
Using the reference data for the characteristic part (fish head) read from the storage device 14, the detection unit 30 detects characteristic parts in the captured image as follows. For example, the detection unit 30 moves a survey image range having a predetermined shape and size from the upper-left end of the captured image 46 shown in FIG. 10 toward the upper-right end, and at every set interval compares the image of the portion where the survey image range is located with the reference data. The detection unit 30 then determines the degree of match (similarity) between the image within the survey image range and the reference data, for example by a method used in template matching. When the degree of match is equal to or higher than a threshold value (for example, 90%), the detection unit 30 determines that the image represents a characteristic part.
When the survey image range reaches the upper-right end, the detection unit 30 lowers it by the set interval from the upper-left end of the captured image 46 and, while moving it to the right again, determines the degree of match at every set interval in the same way.
By determining the degree of match between the reference data and the image within the survey image range while scanning the survey image range over the captured image in this way, the detection unit 30 detects the characteristic parts (heads) of the objects to be measured (fish) in the image captured by the camera 11. In the example of FIG. 10, the characteristic parts detected in the captured image 46 are indicated by frames 48. The detection results of the detection unit 30 are, for example, displayed on the display device 17 by the display control unit 32.
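The scan described above can be sketched as follows, with grayscale images as plain 2D lists and a simple similarity measure standing in for the template-matching method. The 90% threshold follows the text, while the similarity formula and the names are assumptions for illustration.

```python
# Illustrative sketch of the survey-image-range scan: a window slides
# over the captured image at set intervals, each window is compared
# with a reference image, and windows whose match degree reaches the
# threshold are reported as characteristic parts.

def match_degree(window, template):
    # 1 minus the normalized mean absolute pixel difference, in [0, 1]
    diffs = [abs(w - t) for wr, tr in zip(window, template)
             for w, t in zip(wr, tr)]
    return 1.0 - sum(diffs) / (255.0 * len(diffs))

def detect_parts(image, template, step=1, threshold=0.9):
    th, tw = len(template), len(template[0])
    hits = []
    for y in range(0, len(image) - th + 1, step):          # scan rows
        for x in range(0, len(image[0]) - tw + 1, step):   # scan columns
            window = [row[x:x + tw] for row in image[y:y + th]]
            if match_degree(window, template) >= threshold:
                hits.append((x, y))   # top-left corner of a frame 48
    return hits
```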
The counting unit 31 has a function of measuring the number of characteristic parts detected by the detection unit 30 in the image captured by the camera 11 as the number of objects (fish) to be measured. The counting unit 31 also has a function of storing the counted number of objects in the storage device 14 in association with information about the measured captured image (for example, identification information of the captured image).
The counting unit 31 may also have a function of correcting the count value after counting the number of characteristic parts. For example, it is assumed that some fish are not counted by the counting unit 31 because, for instance, they were hidden behind other fish and did not appear in the captured image. The relationship between the number of such uncounted fish and the number of fish counted by the counting unit 31 is obtained, for example, by observation or simulation, and this relationship data is stored in the storage device 14 as correction data. After counting the number of characteristic parts, the counting unit 31 may correct the count value using the correction data and calculate the corrected value as the number of objects to be measured.
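A minimal sketch of such a correction, assuming the correction data reduces to a single occlusion rate obtained from observation or simulation (an assumption made here for illustration; the patent does not specify the form of the correction data):

```python
# Illustrative correction: if observation or simulation suggests that a
# fraction `occlusion_rate` of the fish is hidden and therefore not
# counted, scale the raw count of detected characteristic parts up.
def corrected_fish_count(raw_count, occlusion_rate):
    return round(raw_count / (1.0 - occlusion_rate))

estimate = corrected_fish_count(950, occlusion_rate=0.05)
```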
The counting system 10 of the second embodiment is configured as described above. Next, an example of the counting process of the information processing apparatus 12 in the second embodiment will be described based on the flowchart of FIG. 11.
For example, when the detection unit 30 of the information processing apparatus 12 detects that a request to count the objects to be measured has been input by the observer operating the input device 16 (step S101), it acquires the image captured by the camera 11 from the storage device 14. The detection unit 30 then detects the characteristic parts of the objects to be measured in the acquired image, using the reference data stored in the storage device 14 (step S102). After that, the counting unit 31 measures the number of characteristic parts detected by the detection unit 30 in the captured image (step S103). The counting unit 31 then, for example, has the display control unit 32 display the measured number of objects on the display device 17. Through this processing, the information processing apparatus 12 can measure and output the number of objects to be measured from the captured image.
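Steps S101 to S103 above can be sketched end to end as follows; the detector here is a stand-in predicate and all names are illustrative assumptions:

```python
# Illustrative pipeline for steps S101-S103: acquire the captured image
# from storage, detect characteristic parts with some detector, and
# report the number of detected parts as the number of fish.
def count_objects(storage, image_id, detect):
    image = storage[image_id]   # S101: acquire the captured image
    parts = detect(image)       # S102: detect characteristic parts
    return len(parts)           # S103: count of parts = object count

# Toy usage: each positive value in the "image" stands for one head.
storage = {"img-0001": [0, 3, 0, 7, 2]}
n = count_objects(storage, "img-0001",
                  lambda img: [v for v in img if v > 0])
```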
In the counting system 10 of the second embodiment, the detection unit 30 and the counting unit 31 of the information processing apparatus 12 detect the characteristic parts of the objects to be measured in a captured image and measure the number of characteristic parts as the number of objects to be measured. This allows the information processing apparatus 12 to improve the accuracy of object counting with a simple configuration.
<Third Embodiment>
The third embodiment of the present invention will be described below. In the description of the third embodiment, components identical to those of the counting system of the second embodiment are given the same reference numerals, and duplicate description of those common parts is omitted.
The counting system 10 of the third embodiment differs from the second embodiment in the arrangement of the camera 11 serving as the photographing device, and is otherwise configured substantially the same as the counting system 10 of the second embodiment.
FIG. 12 is a perspective view schematically showing the arrangement of the camera 11 in the third embodiment. The camera 11 is fixed to the metal plate 20 sideways, so that the lens faces a direction parallel to the plate surface of the metal plate 20 serving as a support member. In other words, when placed in the water (the fish cage 25), the camera 11 is oriented sideways, parallel to the water surface. The depth position (height position) of the camera 11 in the water is midway between the bottom and the water surface, and the camera 11 is placed at the periphery of the fish cage 25. FIG. 13 shows the relationship between the shooting range of the camera 11 and the fish cage 25 as viewed from above.
Arranged in this way, the camera 11 photographs the fish in the fish cage 25 from the side. Accordingly, the reference data used by the detection unit 30 of the information processing apparatus 12 in the detection process are sample images of a fish head viewed from the side, as shown in FIGS. 14 and 15.
In the counting system 10 of the third embodiment as well, the characteristic parts of the objects to be measured are detected from the captured image and the number of characteristic parts is calculated as the number of objects to be measured, as in the second embodiment, so the same effects as in the second embodiment can be obtained.
In the third embodiment, a fish head is given as an example of the characteristic part of the object to be measured. Alternatively, the characteristic part may be, for example, a fish tail. In that case, the reference data used by the detection unit 30 in the detection process are, for example, sample images of a fish tail viewed from the side, as shown in FIGS. 16 and 17.
<Fourth Embodiment>
The fourth embodiment of the present invention will be described below. In the description of the fourth embodiment, components identical to those of the counting systems of the second and third embodiments are given the same reference numerals, and duplicate description of those common parts is omitted.
The counting system 10 of the fourth embodiment differs from the second and third embodiments in the arrangement of the cameras serving as photographing devices, and is otherwise configured substantially the same as the counting system 10 of the second or third embodiment.
FIG. 18 schematically shows the arrangement of the cameras in the fourth embodiment. The counting system 10 of the fourth embodiment includes a plurality of cameras 11 and 40 (two in the example of FIG. 18) as photographing devices. When placed in the fish cage 25 where the fish 26 are farmed, the cameras 11 and 40 are spaced apart from each other in the depth direction (height direction): the camera 11 is placed near the bottom, and the camera 40 roughly midway between the bottom and the water surface. In a cross section of the fish cage 25 parallel to the water surface, both cameras are positioned at approximately the center. The lenses of the cameras 11 and 40 face the water surface (upward), and their shooting ranges (fields of view) are set, in consideration of the size of the fish cage 25, so that the sides of the fish cage 25 are also covered. Arranged in this way, the cameras 11 and 40 photograph the fish 26 in the fish cage 25 from the ventral side, as in the second embodiment.
The cameras 11 and 40 can be placed in the water by the same method as described in the second embodiment, that is, using the metal plate 20, the rope 21, the buoy 22, and the weight 23. The method of placing the cameras 11 and 40 in the water is not limited to one using the rope 21; for example, a rod-shaped member may be used instead of the rope 21.
In the fourth embodiment, because a plurality of cameras 11 and 40 are provided, a captured image is obtained from each of them. The detection unit 30 and the counting unit 31 of the information processing apparatus 12 therefore process each of the obtained captured images. The counting unit 31 then sums the numbers of objects measured from the images captured by the cameras 11 and 40 and calculates the total as the number of objects to be measured.
Note that the information processing apparatus 12 uses images captured by the camera 11 and the camera 40 at the same time. To make it easier to obtain simultaneously captured image pairs, it is preferable to have both cameras 11 and 40 record, during shooting, some common event that serves as a mark for time alignment (synchronization). For example, a light that flashes briefly, triggered automatically or manually by the observer, may be used as the synchronization mark and captured by both cameras 11 and 40. Alternatively, the cameras 11 and 40 may record a sound serving as the synchronization mark together with the images. Alternatively, the built-in clocks of the cameras 11 and 40 may be set to the same time in advance, and each camera may associate the time information of its built-in clock with its captured images. By including such synchronization information in, or associating it with, the captured images, the images of the cameras 11 and 40 can easily be synchronized.
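With the clock-based variant above, pairing the two cameras' frames reduces to matching timestamps. The following is a minimal sketch under that assumption; the pairing rule (nearest frame within a tolerance) and the tolerance value are illustrative choices, not specified by the source.

```python
def pair_frames(times_a, times_b, tolerance=0.05):
    """Pair captured frames of two cameras by the clock time attached to
    each image (the cameras' built-in clocks having been set to the same
    time beforehand). A frame of camera A is paired with the nearest
    frame of camera B when their timestamps differ by at most
    `tolerance` seconds; unmatched frames are dropped.
    """
    pairs = []
    for ta in times_a:
        nearest = min(times_b, key=lambda tb: abs(tb - ta), default=None)
        if nearest is not None and abs(nearest - ta) <= tolerance:
            pairs.append((ta, nearest))
    return pairs
```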
In the fourth embodiment, the cameras 11 and 40 are spaced apart in the depth direction and their lenses face the same direction (upward). The shooting range of the camera 11 therefore partly overlaps that of the camera 40, as in the hatched region W in FIG. 19. As a result, the number of objects (fish) to be measured that the counting unit 31 counts from the image captured by the camera 11 is expected to include objects (fish) that the counting unit 31 also counts in the image captured by the camera 40. In view of this, the fourth embodiment provides a function of correcting the number of objects to be measured counted by the counting unit 31. That is, as shown in FIG. 20, the control device 13 includes a correction unit 34 in addition to the components of the second and third embodiments. In FIG. 20, the detection unit 30 and the display control unit 32 of the control device 13 are omitted from the illustration.
The correction unit 34 has a function of correcting the number of objects to be measured counted by the counting unit 31, based on the number of objects to be measured that are assumed to be counted twice by the counting unit 31 in the overlapping shooting range region of the cameras 11 and 40.
For example, an observer determines in advance what fraction of the objects to be measured counted by the counting unit 31 from the image captured by the camera 11 is expected to be counted again in the image captured by the camera 40. The obtained fraction is stored in the storage device 14 as a thinning rate (for example, 0.5 (50%)). Various methods are conceivable for obtaining the thinning rate. For example, the observer may set it based on experience gained from experiments. Alternatively, the thinning rate may be calculated based on sample images (for example, videos of fish movement) obtained by machine learning that uses previously captured images from the cameras 11 and 40 as training data. Alternatively, the thinning rate may be calculated in accordance with the shooting environment. In this way, there are various methods for obtaining the thinning rate, and an appropriate one is used.
The correction unit 34 calculates a correction number by multiplying the number of objects to be measured counted by the counting unit 31 in the image captured by the camera 11 by the thinning rate stored in the storage device 14. The correction unit 34 then corrects the total counted by the counting unit 31 by subtracting the calculated correction number from it. The correction unit 34 has a function of storing the corrected number of objects to be measured in the storage device 14, and a further function of causing the display control unit 32 to display the corrected number on the display device 17.
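The arithmetic performed by the correction unit 34 can be written out directly; the function and parameter names are illustrative, and the example values below merely follow the 0.5 thinning rate mentioned above.

```python
def corrected_total(count_cam11: int, count_cam40: int, thinning_rate: float) -> int:
    """Duplicate-count correction of the correction unit 34.

    The count from the camera 11 image is multiplied by the stored
    thinning rate to estimate how many fish were counted again in the
    camera 40 image, and that correction number is subtracted from the
    summed total.
    """
    correction_number = count_cam11 * thinning_rate
    return round(count_cam11 + count_cam40 - correction_number)
```

With a thinning rate of 0.5 and counts of 40 (camera 11) and 60 (camera 40), the corrected total is 40 + 60 - 20 = 80.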
In the counting system 10 of the fourth embodiment, the cameras 11 and 40 are arranged along the depth direction, which yields the following effect. Suppose, for example, that an object is unclear in the image captured by the camera 11 because it is far from the camera 11. If that object is photographed by the camera 40, which is closer to it than the camera 11, the object appears more clearly in the image captured by the camera 40 than in the image captured by the camera 11. Because the objects to be measured can thus be imaged more clearly, the information processing apparatus 12 can improve the measurement accuracy of its image-based counting function.
In the fourth embodiment, some objects are measured twice because the captured images of the plurality of cameras 11 and 40 are used, but the information processing apparatus 12 can correct for such duplicated objects with the correction unit 34. The information processing apparatus 12 can thus prevent problems caused by using the captured images of the plurality of cameras 11 and 40.
<Fifth Embodiment>
The fifth embodiment of the present invention will be described below. In the description of the fifth embodiment, components identical to those of the counting systems of the second to fourth embodiments are given the same reference numerals, and duplicate description of those common parts is omitted.
The counting system 10 of the fifth embodiment differs from the fourth embodiment in the arrangement of the cameras serving as photographing devices, and is otherwise configured substantially the same as the counting system 10 of the fourth embodiment.
FIG. 21 is a perspective view schematically showing the arrangement of the cameras 50 in the fifth embodiment, and FIG. 22 is a model diagram of that arrangement viewed from above. In the fifth embodiment, the cameras 50 are placed in the water (the fish cage 25) at a depth position Dt near the bottom, a position Db near the water surface, and positions Dm1 and Dm2 that divide the interval between Db and Dt at approximately equal spacing. Four cameras 50 are placed at each of these depth positions (height positions); as shown in FIG. 22, the four cameras at each depth face four different directions along the water surface. The plurality of cameras 50 shown in FIG. 21 are placed at the center of the fish cage 25, as in the fourth embodiment.
In the information processing apparatus 12 of the fifth embodiment, the detection unit 30 detects the characteristic part (head) of the objects to be measured (fish) in the image captured by each camera 50. Since the cameras 50 photograph the fish from the side, the reference data used by the detection unit 30 in the detection process are, for example, sample images of a fish head viewed from the side, as shown in FIGS. 14 and 15.
The counting unit 31 counts the number of characteristic parts detected in the image captured by each camera 50, sums the counts of all the captured images, and calculates the total as the number of objects to be measured. In the fifth embodiment, horizontally adjacent cameras 50 have an overlapping shooting range, such as the region W in FIG. 22, and vertically adjacent cameras 50 likewise have overlapping shooting range regions. The correction unit 34 corrects the number of objects to be measured calculated by the counting unit 31, taking into account the number of objects measured twice in such overlapping shooting range regions.
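With many cameras, the pairwise correction of the fourth embodiment can be applied once per overlapping pair. The sketch below is one illustrative way to do this; the data layout and the per-pair thinning rates are assumptions, since the source only states that overlaps are taken into account.

```python
def corrected_grid_total(counts, overlaps):
    """Sum the per-camera counts, then subtract the duplicates estimated
    for each overlapping pair of shooting ranges (region W).

    counts   : dict mapping camera id -> count from that camera's image
    overlaps : list of (cam_a, cam_b, rate) where `rate` is the assumed
               fraction of cam_a's count that is counted again by cam_b
    """
    total = sum(counts.values())
    for cam_a, _cam_b, rate in overlaps:
        total -= counts[cam_a] * rate  # remove estimated duplicates per pair
    return round(total)
```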
Because the counting system 10 of the fifth embodiment uses more cameras 50 than the fourth embodiment, it can further improve the counting accuracy for the objects to be measured.
<Other Embodiments>
The present invention is not limited to the first to fifth embodiments, and various other embodiments can be adopted. For example, while the fourth and fifth embodiments place the plurality of cameras 50 at the center of the fish cage 25, the cameras 50 may instead be placed at the periphery of the fish cage 25 (at a corner in the example of FIG. 24), as shown in FIG. 24. In this case, for example, one camera 50 is placed at each of several different depth positions, as shown in FIG. 23. FIG. 25 is a model diagram of the relationship between the cameras 50 and the fish cage 25 in FIG. 24, viewed from the water surface side. As shown in FIG. 25, the lenses of the cameras 50 at the respective depth positions all face sideways in the same direction, toward the center of the fish cage 25. When the cameras 50 are placed at the periphery of the fish cage 25 as in FIGS. 24 and 25, more than one camera 50 may be placed at each depth position, depending on the field of view of the cameras 50.
When the cameras 50 are arranged in this way, they photograph the fish 26 in the fish cage 25 from the side, as in the fifth embodiment.
In the fourth embodiment the cameras 11 and 40 are arranged in two tiers as shown in FIG. 18, and in the fifth embodiment the cameras 50 are arranged in four tiers as shown in FIG. 21. The number of tiers in which cameras are arranged may instead be three, or five or more, depending on the depth range over which the fish are to be counted.
Furthermore, in addition to the configurations of the second to fifth embodiments, image processing may be performed at an appropriate timing, such as before the detection unit 30 of the information processing apparatus 12 starts the detection process, to reduce water turbidity in the captured image or to correct distortion of the fish bodies caused by water fluctuation. Image processing that corrects the captured image in consideration of shooting conditions, such as the water depth and brightness of the objects, may also be performed. By processing (correcting) the captured images in this way in consideration of the shooting environment, the information processing apparatus 12 can further improve the counting accuracy for the objects to be measured. Using the corrected captured images also has the effect of reducing the amount of reference data needed.
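One simple pre-detection correction for the low contrast of turbid water is a linear contrast stretch. The source does not specify which image processing is used, so the method, function name, and parameters below are assumptions for illustration only; the image is modeled as a flat list of 0-255 gray values to keep the sketch self-contained.

```python
def stretch_contrast(gray_values, out_max=255):
    """Linearly rescale gray values so the darkest pixel maps to 0 and
    the brightest to `out_max`, expanding the compressed value range
    typical of turbid-water images before detection is run.
    """
    lo, hi = min(gray_values), max(gray_values)
    span = max(hi - lo, 1)  # avoid division by zero for a flat image
    return [round((v - lo) * out_max / span) for v in gray_values]
```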
In the second and third embodiments, the detection unit 30 of the information processing apparatus 12 detects a fish head as the characteristic part of the object to be measured. Alternatively, the detection unit 30 may detect, for example, the side of a fish. In that case, the reference data used by the detection unit 30 in the detection process are, for example, sample images of the side of a fish, as shown in FIGS. 26 and 27.
The reference data shown in FIGS. 14 to 17 and used by the detection unit 30 are only examples, and more reference data may be used. For example, the reference data for a fish head or tail may include data treated as out of scope for detection, such as the cut-off data in FIG. 27 (that is, image data in which part of the detected part lies outside the shooting range). Data representing the bending of a fish tail, taking into account the bending of the fish as shown in FIG. 27, may also be included as reference data.
Furthermore, although the second to fifth embodiments have been described using fish as an example of the objects to be measured, the counting system 10 with the configurations described in the second to fifth embodiments is also applicable to counting other objects.
The present invention has been described above using the embodiments described above as exemplary examples. However, the present invention is not limited to those embodiments; various aspects that those skilled in the art can understand may be applied within the scope of the present invention.
This application claims priority based on Japanese Patent Application No. 2016-194271 filed on September 30, 2016, the entire disclosure of which is incorporated herein.
DESCRIPTION OF REFERENCE NUMERALS
1, 12 Information processing apparatus
2, 30 Detection unit
3, 31 Counting unit
5, 10 Counting system
6 Photographing device
11, 40, 50 Camera
Claims (9)
- 計測対象の物体が撮影されている撮影画像から前記計測対象の物体における予め定められた特徴を持つ特徴部位を検知する検知手段と、
前記撮影画像における検知した前記特徴部位の数を前記計測対象の物体の数として計測する計数手段と
を備える情報処理装置。 Detecting means for detecting a characteristic part having a predetermined characteristic in the measurement target object from a captured image in which the measurement target object is captured;
An information processing apparatus comprising: counting means for measuring the number of detected characteristic parts in the captured image as the number of objects to be measured. - 撮影範囲の一部が互いにオーバーラップしている複数の前記撮影画像から計測した前記特徴部位の数の合計値を前記計測対象の物体の数として計測する機能をさらに持つ前記計数手段を備えると共に、
前記複数の撮影画像から計数された計測対象の物体の数を、前記オーバーラップしている撮影範囲に基づいて補正する補正手段をさらに備えている請求項1に記載の情報処理装置。 The counting unit further includes a function of measuring the total value of the number of the characteristic parts measured from a plurality of the captured images in which a part of the imaging range overlaps each other as the number of objects to be measured,
The information processing apparatus according to claim 1, further comprising a correcting unit that corrects the number of objects to be measured counted from the plurality of captured images based on the overlapping imaging ranges. - 前記検知手段は、前記計測対象の物体における撮影条件が異なる前記特徴部位の複数の画像を利用して、前記撮影画像から前記特徴部位を検知する請求項1又は請求項2に記載の情報処理装置。 The information processing apparatus according to claim 1, wherein the detection unit detects the feature part from the photographed image using a plurality of images of the feature part having different photographing conditions in the object to be measured. .
- 計測対象の物体が存在している撮影空間を撮影する撮影装置と、
前記撮影装置による撮影画像に基づいて、前記撮影空間内における計測対象の物体の数を計測する情報処理装置と
を備え、
前記情報処理装置は、
前記計測対象の物体が撮影されている撮影画像から前記計測対象の物体における予め定めた特徴を持つ特徴部位を検知する検知手段と、
前記撮影画像における検知した前記特徴部位の数を前記計測対象の物体の数として計測する計数手段と
を備える計数システム。 A photographing device for photographing a photographing space in which an object to be measured exists;
An information processing device that measures the number of objects to be measured in the imaging space based on images captured by the imaging device;
The information processing apparatus includes:
detecting means for detecting, from a captured image in which an object to be measured is captured, a characteristic part having a predetermined characteristic of the object to be measured; and
counting means for measuring the number of the detected characteristic parts in the captured image as the number of objects to be measured.
- The counting system according to claim 4, wherein a plurality of imaging devices are provided, the imaging devices being arranged at intervals from one another in the height direction within a predetermined imaging space.
- The counting system according to claim 5, wherein a plurality of the imaging devices are arranged at the same height position in the imaging space.
- The counting system according to claim 6, wherein each of the plurality of imaging devices is arranged such that its imaging range overlaps the imaging range of another imaging device, and wherein the information processing apparatus further comprises correction means for correcting the number of objects to be measured, counted from the captured images of the imaging devices, based on the overlapping imaging range regions.
- A counting method comprising: detecting, from a captured image in which an object to be measured is captured, a characteristic part having a predetermined characteristic of the object to be measured; and measuring the number of the detected characteristic parts in the captured image as the number of objects to be measured.
- A program storage medium storing a computer program that causes a computer to execute: a process of detecting, from a captured image in which an object to be measured is captured, a characteristic part having a predetermined characteristic of the object to be measured; and a process of measuring the number of the detected characteristic parts in the captured image as the number of objects to be measured.
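The claims above describe counting objects by detecting one predetermined characteristic part per object (for example, a fish head) and, when multiple cameras have overlapping imaging ranges, correcting the raw count for that overlap. The following Python sketch is a minimal illustration of that idea, not the patented implementation: the box representation, the correction rule (subtracting half of the detections whose centers fall inside a region shared by two cameras), and all names such as `CameraView` and `corrected_count` are assumptions introduced here for illustration.

```python
# Illustrative sketch only: count detected characteristic parts per camera,
# then correct the sum for objects seen twice in an overlapping imaging range.
from dataclasses import dataclass
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2) in image coordinates


@dataclass
class CameraView:
    detections: List[Box]   # characteristic-part detections (e.g. fish heads)
    overlap_region: Box     # part of the frame also covered by another camera


def _center_in(box: Box, region: Box) -> bool:
    # A detection "belongs" to the overlap if its center lies in the region.
    cx, cy = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
    return region[0] <= cx <= region[2] and region[1] <= cy <= region[3]


def corrected_count(views: List[CameraView]) -> int:
    """Sum per-camera detection counts, then subtract half of the detections
    lying in overlap regions (each such object is counted by two cameras)."""
    raw = sum(len(v.detections) for v in views)
    in_overlap = sum(
        sum(1 for d in v.detections if _center_in(d, v.overlap_region))
        for v in views
    )
    return raw - in_overlap // 2


# Two cameras, one object visible to both inside the shared region:
views = [
    CameraView(detections=[(10, 10, 20, 20), (80, 40, 90, 50)],
               overlap_region=(70, 0, 100, 100)),
    CameraView(detections=[(5, 40, 15, 50)],
               overlap_region=(0, 0, 30, 100)),
]
print(corrected_count(views))  # 3 detections, 2 in the overlap -> 2 objects
```

The center-point rule is one simple way to assign a detection to the overlapping region; the claims themselves only state that the count is corrected "based on the overlapping imaging range regions" and leave the concrete rule open.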
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018542458A JP7007280B2 (en) | 2016-09-30 | 2017-09-20 | Information processing equipment, counting system, counting method and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016194271 | 2016-09-30 | ||
JP2016-194271 | 2016-09-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018061928A1 true WO2018061928A1 (en) | 2018-04-05 |
Family
ID=61759710
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/033885 WO2018061928A1 (en) | 2016-09-30 | 2017-09-20 | Information processing device, counter system, counting method, and program storage medium |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7007280B2 (en) |
WO (1) | WO2018061928A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6876310B1 (en) * | 2020-03-18 | 2021-05-26 | マルハニチロ株式会社 | Counting system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010063001A (en) * | 2008-09-05 | 2010-03-18 | Mitsubishi Electric Corp | Person-tracking device and person-tracking program |
JP2010121970A (en) * | 2008-11-17 | 2010-06-03 | Chugoku Electric Power Co Inc:The | Moving body recognition system and moving body recognition method |
JP2011103098A (en) * | 2009-11-12 | 2011-05-26 | Shinshu Univ | Tree counting method and tree counting apparatus |
JP2016110381A (en) * | 2014-12-05 | 2016-06-20 | 古野電気株式会社 | Number-of-objects counting device, program, and method for counting number of objects |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003250382A (en) | 2002-02-25 | 2003-09-09 | Matsushita Electric Works Ltd | Method for monitoring growing state of aquatic life, and device for the same |
JP5875917B2 (en) | 2012-03-26 | 2016-03-02 | 一般財団法人電力中央研究所 | Moving body image discrimination apparatus and moving body image discrimination method |
JP6255819B2 (en) | 2013-09-09 | 2018-01-10 | 富士通株式会社 | COMPUTER PROGRAM FOR MEASUREMENT, MEASUREMENT DEVICE AND MEASUREMENT METHOD |
2017
- 2017-09-20 JP JP2018542458A patent/JP7007280B2/en active Active
- 2017-09-20 WO PCT/JP2017/033885 patent/WO2018061928A1/en active Application Filing
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109360237A (en) * | 2018-12-07 | 2019-02-19 | 北京市水产科学研究所(国家淡水渔业工程技术研究中心) | A kind of prediction technique of total fish catches |
CN109360237B (en) * | 2018-12-07 | 2019-06-14 | 北京市水产科学研究所(国家淡水渔业工程技术研究中心) | A kind of prediction technique of total fish catches |
CN111738125A (en) * | 2020-06-16 | 2020-10-02 | 中国银行股份有限公司 | Method and device for determining number of clients |
CN111738125B (en) * | 2020-06-16 | 2023-10-27 | 中国银行股份有限公司 | Method and device for determining number of clients |
Also Published As
Publication number | Publication date |
---|---|
JPWO2018061928A1 (en) | 2019-07-25 |
JP7007280B2 (en) | 2022-01-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7188527B2 (en) | Fish length measurement system, fish length measurement method and fish length measurement program | |
JP6849081B2 (en) | Information processing equipment, counting system, counting method and computer program | |
EP3248374B1 (en) | Method and apparatus for multiple technology depth map acquisition and fusion | |
JP5875917B2 (en) | Moving body image discrimination apparatus and moving body image discrimination method | |
CN109544620B (en) | Image processing method and apparatus, computer-readable storage medium, and electronic device | |
CN103765870B (en) | Image processing apparatus, projector and projector system including image processing apparatus, image processing method | |
TWI527448B (en) | Imaging apparatus, image processing method, and recording medium for recording program thereon | |
US9124807B2 (en) | Imaging apparatus, control method therefor, and storage medium | |
WO2018061928A1 (en) | Information processing device, counter system, counting method, and program storage medium | |
JP6981531B2 (en) | Object identification device, object identification system, object identification method and computer program | |
US9380206B2 (en) | Image processing apparatus that combines images | |
JP6816773B2 (en) | Information processing equipment, information processing methods and computer programs | |
CN104243863B (en) | Filming apparatus, image pickup method | |
CN103460248B (en) | Image processing method and device | |
WO2019172363A1 (en) | Information processing device, object measurement system, object measurement method, and program storage medium | |
CN107181918A (en) | A kind of dynamic filming control method and system for catching video camera of optics | |
CN108781267A (en) | Image processing equipment and method | |
JP6583565B2 (en) | Counting system and counting method | |
JP7074185B2 (en) | Feature estimation device, feature estimation method, and program | |
JP2012015642A (en) | Imaging device | |
JP6879375B2 (en) | Information processing equipment, length measurement system, length measurement method and computer program | |
JP2016123044A (en) | Subject tracking device, and control method and program therefor | |
KR20110023762A (en) | Image processing apparatus, image processing method and computer readable-medium | |
WO2012014946A1 (en) | Image processing device and image processing program | |
EP2750391B1 (en) | Method, apparatus and computer program product for processing of images |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17855883; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2018542458; Country of ref document: JP; Kind code of ref document: A
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17855883; Country of ref document: EP; Kind code of ref document: A1