WO2023026551A1 - Measuring device, measuring method, and program - Google Patents
Measuring device, measuring method, and program
- Publication number: WO2023026551A1 (PCT/JP2022/012747)
- Authority: WO — WIPO (PCT)
- Prior art keywords: unit, target object, carbon, imaging, measuring device
Classifications
- G—PHYSICS; G01—MEASURING; TESTING; G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; spectral properties using photo-electric detection; circuits for computing concentration
- G01N21/84—Systems specially adapted for particular applications
- G01N33/18—Water
- G01N33/1826—Organic contamination in water
- G01N33/1846—Total carbon analysis
- G01N2021/1765—Method using an image detector and processing of image signal
- G01N2021/177—Detector of the video camera type
Definitions
- This technology relates to measurement devices, measurement methods, and programs, and in particular to technology for measuring the amount of carbon in water.
- A measuring device has been proposed that measures the abundance of phytoplankton by irradiating excitation light of a predetermined wavelength to excite the phytoplankton and measuring the intensity of the fluorescence they emit (see, for example, Patent Document 1).
- However, such a device can only measure phytoplankton that respond to the excitation light, and although it can measure the abundance of phytoplankton, it cannot measure the amount of carbon.
- The purpose of this technology is to estimate the amount of carbon.
- A measuring device according to this technology includes an imaging control unit that causes an imaging unit to image a predetermined imaging range in water, and a carbon amount estimation unit that estimates the amount of carbon based on the image captured by the imaging unit. The measuring device can thereby estimate the carbon amount from the captured image.
- FIG. 6 is a flowchart showing the procedure of the measurement process; FIG. 7 is a diagram explaining the rule-based distance/speed measurement process.
- FIG. 8 is a diagram explaining images that serve as teacher data; FIG. 9 is a model diagram of deep learning; FIG. 10 is a diagram explaining machine learning for the carbon amount estimation process; further figures explain the spatial carbon amount estimation process and the deposited carbon amount estimation process.
- <1. First Embodiment> [1.1 Configuration of measuring device] [1.2 Target object] [1.3 Measurement method of the first embodiment] [1.4 Measurement process] [1.5 Distance/speed measurement process] [1.6 Carbon amount estimation process]
- <2. Second Embodiment> [2.1 Configuration of measuring device] [2.2 Measurement process] [2.3 Machine learning distance/speed measurement process] [2.4 Carbon amount estimation process]
- <3. Other Configuration Examples of Measuring Device> <4. Summary of Embodiments> <5. This technology>
- The measuring device 1 is a device that estimates (measures) the amount of carbon in water by estimating the amount of carbon in target objects, for example microorganisms or fine particles present in water such as the sea.
- Target microorganisms include phytoplankton, zooplankton, and aquatic microorganisms such as the larvae of aquatic organisms.
- Fine particles serving as target objects include microplastics, dust, sand, marine snow, air bubbles, and the like.
- The target object may also be something other than these.
- FIG. 1 is a diagram for explaining the configuration of a measuring device 1 as a first embodiment.
- FIG. 2 is a diagram for explaining the imaging range 30 and measurement directions.
- The measuring device 1 includes a main body unit 2 and an illumination unit 3.
- The illumination unit 3 may be provided inside the main body unit 2.
- The main body unit 2 includes a control unit 10, a memory 11, a communication unit 12, a gravity sensor 13, an imaging unit 14, and a lens 15.
- The control unit 10 includes, for example, a microcomputer having a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory), and controls the measuring device 1 as a whole.
- In the first embodiment, the control unit 10 functions as an imaging control unit 21, a class identification unit 22, a distance/speed measurement unit 23, and a carbon amount estimation unit 24. These units will be described in detail later.
- The control unit 10 reads data stored in the memory 11, stores data in the memory 11, and transmits and receives various data to and from external devices via the communication unit 12.
- The memory 11 is composed of non-volatile memory.
- The communication unit 12 performs wired or wireless data communication with external devices.
- The gravity sensor 13 detects gravitational acceleration (the direction of gravity) and outputs the detection result to the control unit 10. Note that the measuring device 1 need not include the gravity sensor 13.
- The imaging unit 14 includes both or one of a vision sensor 14a and an imaging sensor 14b.
- The vision sensor 14a is a sensor called a DVS (Dynamic Vision Sensor) or EVS (Event-Based Vision Sensor).
- The vision sensor 14a captures a predetermined underwater imaging range 30 through the lens 15. As shown in FIG. 2, the horizontal direction of the imaging range 30 is hereinafter referred to as the X-axis direction, the vertical direction as the Y-axis direction, and the imaging direction (optical axis direction) of the imaging unit 14 as the Z-axis direction. It is also assumed that the Y-axis direction substantially coincides with the direction of gravity.
- The vision sensor 14a is an asynchronous image sensor in which a plurality of pixels having photoelectric conversion elements are arranged two-dimensionally, and a detection circuit that detects an address event in real time is provided for each pixel.
- An address event is an event that occurs, for each address assigned to each of the two-dimensionally arranged pixels, according to the amount of incident light: for example, when the value or its amount of change exceeds a certain threshold.
- The vision sensor 14a detects for each pixel whether an address event has occurred and, when one is detected, reads a pixel signal as pixel data from the pixel where the event occurred. That is, the vision sensor 14a acquires pixel data asynchronously according to the amount of light incident on each of the two-dimensionally arranged pixels.
- Since the pixel signal readout operation is executed only for pixels where an address event has been detected, the amount of data read out per frame is small.
- The measuring device 1 can therefore detect the movement of the target object more quickly by using the vision sensor 14a, while reducing both the amount of data and the power consumption.
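The address-event principle described above can be sketched as follows. This is a minimal illustrative model, not the sensor's actual circuit: the threshold and pixel values are assumptions chosen only to show that static pixels produce no readout.

```python
# Minimal sketch of the address-event principle: a pixel is read out only
# when its incident-light value changes by more than a threshold, so a
# static background produces no data. Threshold and pixel values are
# illustrative assumptions, not actual sensor parameters.

THRESHOLD = 0.15  # assumed per-pixel change threshold

def detect_events(prev_frame, curr_frame):
    """Return (x, y, polarity) for every pixel whose value changed enough."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (prev_val, curr_val) in enumerate(zip(prev_row, curr_row)):
            delta = curr_val - prev_val
            if abs(delta) > THRESHOLD:
                events.append((x, y, +1 if delta > 0 else -1))
    return events

# A 3x3 scene where only one pixel brightens: only that pixel is read out.
prev = [[0.5, 0.5, 0.5], [0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]
curr = [[0.5, 0.5, 0.5], [0.5, 0.9, 0.5], [0.5, 0.5, 0.5]]
print(detect_events(prev, curr))  # [(1, 1, 1)]
```

Because only the changed pixel is reported, the data volume per frame stays small even for a large pixel array, which is the property the vision sensor 14a exploits.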
- The imaging sensor 14b is, for example, a CCD (Charge Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) image sensor in which a plurality of pixels having photoelectric conversion elements are arranged two-dimensionally.
- The imaging sensor 14b images the predetermined imaging range 30 through the lens 15 at regular intervals according to the frame rate and generates image data.
- A zone plate, a pinhole plate, or a transparent plate can be used instead of the lens 15.
- The vision sensor 14a and the imaging sensor 14b are arranged so as to capture substantially the same imaging range 30 through the lens 15.
- For example, a half mirror (not shown) may be arranged between the lens 15 and the two sensors so that one part of the light split by the half mirror is incident on the vision sensor 14a and the other part on the imaging sensor 14b.
- The illumination unit 3 is driven under the control of the control unit 10 and irradiates the imaging range 30 of the imaging unit 14 with light.
- The illumination unit 3 can switch between lights of different wavelengths, for example irradiating light whose wavelength is stepped in 10 nm intervals.
- FIG. 3 is a diagram illustrating a target object and movement of the target object.
- In FIG. 3, an image of each target object is shown in the upper part, and its moving direction is indicated by the arrow in the lower part.
- The target objects include microorganisms, marine snow, seabed sand, smoke, and air bubbles. It is known that some microorganisms exhibit taxis when irradiated with light of a specific wavelength.
- Taxis is an innate behavior in which an organism reacts to an external stimulus such as light. When microorganisms exhibiting phototaxis are irradiated with light of a specific wavelength, they move according to that taxis.
- Marine snow consists of particles present in the sea, such as plankton excreta, carcasses, and decomposed plankton, which sink (move in the direction of gravity).
- Seabed sand consists of particles such as sand deposited on the seabed, which are swirled up by seabed currents.
- Smoke is, for example, the phenomenon in which high-temperature water heated by geothermal heat erupts from hydrothermal vents on the seafloor. The hot water blowing out of the vents can reach several hundred degrees and is rich in dissolved components such as heavy metals and hydrogen sulfide, so it reacts with the seawater to form black or white smoke that billows upward.
- Bubbles are, for example, natural gases such as methane and carbon dioxide leaking (erupting) from the seabed, or carbon dioxide leaking from reservoirs into which it was artificially injected by CCS (carbon dioxide capture and storage); they rise from the seabed.
- Thus, not only microorganisms but also some fine particles move in specific directions, and the measuring device 1 identifies these as target objects as well.
- On the premise that measurement is performed in the aphotic zone, which sunlight does not reach, the measuring device 1 irradiates the target object with light of different wavelengths, captures images of the reflected light (or light emitted by excitation), and identifies the type of target object. The measuring device 1 then estimates the carbon amount of the identified target object.
- FIG. 4 is a diagram explaining an example of measurement settings.
- FIG. 5 is a diagram illustrating an example of an operation timesheet.
- The control unit 10 performs measurement according to measurement settings specified in advance, as shown in FIG. 4.
- The measurement settings specify the measurement start condition, the operation time sheet of the illumination unit 3, the identification program, the distance/speed measurement program, the carbon amount estimation program, and the measurement end condition.
- The measurement start condition specifies a condition for starting measurement: for example, the time at which to start measurement, or reception of a measurement start command input via the communication unit 12.
- The operation time sheet specifies the schedule for operating the illumination unit 3; for example, the operation time sheet shown in FIG. 5 designates the wavelengths of light to be irradiated and their timing.
- In other words, the operation time sheet specifies what wavelength of light is to be emitted from the illumination unit 3 toward the imaging range 30, and at what timing.
- Timings at which the illumination unit 3 is turned off, that is, at which no light is emitted, are provided in order to capture light when the target object itself is emitting light by excitation.
- Since the vision sensor 14a is asynchronous, it can easily detect events separately for each wavelength.
- The identification program field specifies a program for identifying the type of target object, such as a machine learning identification program or a rule-based identification program.
- The distance/speed measurement program field specifies a program for measuring the distance to and speed of a target object, such as a machine learning distance/speed measurement program or a rule-based distance/speed measurement program.
- The carbon amount estimation program field specifies a program for estimating the amount of carbon, such as a machine learning carbon amount estimation program or a rule-based carbon amount estimation program.
- The measurement end condition specifies a condition for ending measurement: for example, the time at which to end measurement, or reception of a measurement end command input via the communication unit 12.
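An operation time sheet of the kind described above can be represented as a simple schedule of wavelength steps. The data structure and all values below are hypothetical illustrations (the patent does not specify a format); `None` stands for the off periods during which emitted light is imaged.

```python
# Hypothetical sketch of an illumination operation time sheet: a sequence
# of (wavelength_nm, duration_s) steps, where wavelength None means the
# illumination unit is off so that light emitted by an excited target
# object can be imaged. All values are illustrative.

time_sheet = [
    (400, 1.0),   # irradiate 400 nm light for 1 s
    (None, 0.5),  # lights off: capture any emitted light
    (410, 1.0),   # next wavelength, stepped by 10 nm
    (None, 0.5),
]

def run_time_sheet(sheet, set_wavelength, capture):
    """Step through the sheet, switching illumination and then imaging."""
    for wavelength, duration in sheet:
        set_wavelength(wavelength)     # None -> turn illumination off
        capture(wavelength, duration)  # image the range at this step

log = []
run_time_sheet(
    time_sheet,
    set_wavelength=lambda w: None,          # stand-in for illumination control
    capture=lambda w, d: log.append((w, d)),  # stand-in for the imaging unit
)
print(log)  # one entry per time-sheet step
```

This mirrors the measurement loop in which the imaging control unit 21 images the range each time the wavelength or on/off state is switched.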
- FIG. 6 is a flowchart showing the procedure of measurement processing.
- The control unit 10 executes the measurement process shown in FIG. 6 by executing software (including the identification program, distance/speed measurement program, and carbon amount estimation program) stored in the memory 11.
- In step S1, the control unit 10 reads external environment information, which will be described later. Then, in step S2, the control unit 10 determines whether the measurement start condition specified in the measurement settings is satisfied, and repeats steps S1 and S2 until it is.
- In step S3, the imaging control unit 21 switches the wavelength of the light emitted from the illumination unit 3 according to the operation time sheet specified in the measurement settings.
- Each time the wavelength or the on/off state of the illumination unit 3 is switched, the imaging control unit 21 causes the imaging unit 14 to image the imaging range 30 and acquires pixel data and image data.
- In step S4, the class identification unit 22 executes class identification processing.
- The class identification unit 22 identifies the type of target object based on the images (pixel data and image data) captured by the imaging unit 14.
- Specifically, the class identification unit 22 derives identification information from the captured images and compares it with definition information stored in the memory 11 to identify the type of target object.
- The definition information is provided for each target object and stored in the memory 11.
- The definition information includes the type of target object, movement information, and image information.
- The movement information is detected mainly from the images captured by the vision sensor 14a and is based on the movement of the target object, as shown in the lower part of FIG. 3.
- When the target object is a microorganism, the movement information includes the movement direction with respect to the light source (positive or negative), the trajectory, and so on.
- When the target object is a fine particle, the movement information includes the movement direction, trajectory, and so on.
- The image information is detected mainly from the images captured by the imaging sensor 14b and describes the appearance of the target object.
- The image information may also be detected from images captured by the vision sensor 14a.
- The definition information may further include the direction of gravity detected by the gravity sensor 13 and external environment information acquired via the communication unit 12.
- Conceivable external environment information includes depth, position coordinates (latitude and longitude of the measurement point, or plane rectangular coordinates), electrical conductivity, temperature, pH, gas concentrations (e.g., methane, hydrogen, helium), metal concentrations (e.g., manganese, iron), and the like.
- The class identification unit 22 detects objects existing in the imaging range 30 based on the images (pixel data) captured by the vision sensor 14a. For example, the class identification unit 22 creates one image (frame data) from the pixel data input within a predetermined period and detects, as one object, a group of pixels within a predetermined range in which motion was detected.
- The class identification unit 22 then tracks the object across multiple frames by pattern matching or the like, and derives the movement direction and trajectory of the object as identification information from the tracking result.
- The cycle in which the class identification unit 22 generates an image from the pixel data may be the same as, or shorter than, the cycle (frame rate) at which the imaging sensor 14b acquires image data.
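The grouping-and-tracking steps above can be sketched in a few lines. This is an illustrative simplification (4-connected grouping and centroid tracking stand in for the unspecified grouping rule and pattern matching); the pixel coordinates are made up for the example.

```python
# Sketch of the object-detection step: event pixels accumulated over one
# period are grouped into objects by 4-connectivity, and the movement
# direction is derived from the object centroid across two frames.
# Grouping rule and grid contents are illustrative assumptions.

def group_pixels(active):
    """Group 4-connected active pixels (a set of (x, y)) into objects."""
    remaining, objects = set(active), []
    while remaining:
        stack = [remaining.pop()]
        group = set()
        while stack:
            x, y = stack.pop()
            group.add((x, y))
            for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if n in remaining:
                    remaining.remove(n)
                    stack.append(n)
        objects.append(group)
    return objects

def centroid(group):
    xs = [p[0] for p in group]
    ys = [p[1] for p in group]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# One object moves one pixel to the right between two accumulated frames.
frame_a = {(2, 2), (3, 2), (2, 3), (3, 3)}
frame_b = {(3, 2), (4, 2), (3, 3), (4, 3)}
obj_a = group_pixels(frame_a)[0]
obj_b = group_pixels(frame_b)[0]
dx = centroid(obj_b)[0] - centroid(obj_a)[0]
dy = centroid(obj_b)[1] - centroid(obj_a)[1]
print((dx, dy))  # (1.0, 0.0): movement direction is +X
```

The per-frame centroid sequence is exactly the trajectory information that the class identification unit 22 compares against the definition information.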
- For an object whose identification information has been derived, the class identification unit 22 extracts the corresponding image portion from the image data input from the imaging sensor 14b. The class identification unit 22 then derives the object's external features as identification information by image analysis of the extracted image portion. Since well-known methods can be used for the image analysis, its description is omitted.
- According to the designated identification program, the class identification unit 22 collates the wavelength of the light emitted by the illumination unit 3 and the identification information derived for the detected object (movement direction, trajectory, external features) against the definition information, thereby identifying which target object it is. For example, if the derived identification information falls within the range indicated by the definition information of a target object, the class identification unit 22 identifies the object as being of the type indicated by that definition information.
- The definition information is stored in the memory 11 by a different method for each identification program: in a rule-based identification program it is preset by the user, while in a machine learning identification program it is generated and updated by machine learning in a learning mode.
- The class identification unit 22 stores the identification result of the detected target object and the image portion captured by the imaging sensor 14b in the memory 11, or transmits them to an external device via the communication unit 12.
- In step S5, the distance/speed measurement unit 23 executes distance/speed measurement processing, which measures the distance to and speed of the target object based on the type identified by the class identification unit 22. Details of this processing will be described later.
- In step S6, the carbon amount estimation unit 24 executes carbon amount estimation processing for estimating the amount of carbon in the water. Details of this processing will be described later.
- In step S7, the control unit 10 determines whether the measurement end condition is satisfied. The control unit 10 repeats steps S3 to S6 until the measurement end condition is satisfied (Yes in step S7), at which point the measurement process ends.
- In step S5, the distance/speed measurement unit 23 executes the distance/speed measurement process based on either a rule-based or a machine learning distance/speed measurement program. Both are described below with specific examples.
- FIG. 7 is a diagram for explaining the rule-based distance/speed measurement process.
- The focal length f of the vision sensor 14a is stored in the memory 11 as known information.
- The memory 11 also stores statistical information (an average size H) for each type of target object, registered in advance by the user as a database.
- When the type of a target object has been identified, the distance/speed measurement unit 23 reads the average size H for that type and the focal length f of the vision sensor 14a from the memory 11. The distance/speed measurement unit 23 then calculates the longitudinal length s of the image 42 of the target object formed on the imaging surface 40 of the vision sensor 14a, for example by multiplying the number of pixels occupied by the image 42 by the actual pixel length.
- The distance/speed measurement unit 23 calculates the distance D in the imaging direction (Z-axis direction) from the measuring device 1 to the target object 41 using Equation (1):
- D = fH/s (1)
- The distance/speed measurement unit 23 calculates (measures) the distance D from the measuring device 1 to the actual target object 41 each time an image based on pixel data is acquired (each time the target object is detected in an image). For a target object 41 tracked across successive images, the distance/speed measurement unit 23 also calculates its speed in the imaging direction (Z-axis direction) from the interval at which the images are acquired and the distance D in each image.
- Furthermore, from the interval at which images are acquired, the number of pixels the target object has moved between images (that is, the distance moved on the imaging surface 40), and the distance D in the imaging direction in each image, the distance/speed measurement unit 23 calculates the speeds of the target object in the X-axis and Y-axis directions. In this way, the distance/speed measurement unit 23 obtains the target object's speed along each of the three axes.
- In this way, in the rule-based process, the distance/speed measurement unit 23 measures the distance to and speed of the target object based on the statistical information (average size) registered for each target object.
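The rule-based calculation can be sketched as follows. Equation (1), D = fH/s, gives the imaging-direction distance from the known average size, and successive detections give the three-axis speeds; the focal length, average size, and measured values below are illustrative assumptions, not figures from the patent.

```python
# Sketch of the rule-based distance/speed calculation. Equation (1),
# D = f*H/s, recovers distance from the known average size H; successive
# detections then give the X/Y/Z speeds. All numbers are illustrative
# assumptions (hypothetical sensor and target).

FOCAL_MM = 8.0     # assumed focal length f of the vision sensor 14a
AVG_SIZE_MM = 2.0  # assumed average size H for the identified type

def distance_z(image_len_mm):
    """Equation (1): D = f * H / s."""
    return FOCAL_MM * AVG_SIZE_MM / image_len_mm

def speeds(d1_mm, d2_mm, moved_px, pixel_pitch_mm, interval_s):
    """Z speed from successive distances; X/Y speeds from on-sensor motion.

    The on-sensor displacement (pixels * pitch) is scaled by D/f to obtain
    the real-world lateral displacement at distance D (similar triangles).
    """
    scale = d2_mm / FOCAL_MM
    vx = moved_px[0] * pixel_pitch_mm * scale / interval_s
    vy = moved_px[1] * pixel_pitch_mm * scale / interval_s
    vz = (d2_mm - d1_mm) / interval_s
    return vx, vy, vz

d1 = distance_z(0.20)  # image 42 is s = 0.20 mm long on the sensor
d2 = distance_z(0.16)  # next image: the object appears smaller (receding)
print(d1, d2)  # 80.0 100.0
print(speeds(d1, d2, (4, 0), 0.005, 0.1))  # (vx, vy, vz) in mm/s
```

The Z speed here is exactly the "distance D in each image over the acquisition interval" computation described above, and vx/vy correspond to the on-surface pixel displacement converted to real-world units.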
- FIG. 8 is a diagram for explaining an image that serves as teacher data.
- FIG. 9 is a model diagram of deep learning.
- In machine learning, a model (architecture) for the distance/speed measurement process is generated using teacher-data images such as those shown in FIG. 8.
- Images of known target objects captured by the vision sensor 14a are prepared in advance for five patterns of distance in the imaging direction from the measuring device 1 to the target object (1 mm, 5 mm, 10 mm, 100 mm, and 200 mm) and 31 patterns of irradiated-light wavelength (varied in 10 nm steps from 400 nm to 700 nm), for a total of 155 patterns.
- For each of the prepared images, the distance/speed measurement unit 23 detects, as a target object, a group of pixels within a predetermined range in which motion was detected, and resizes that pixel group to 32 pixels by 32 pixels to generate a teacher-data image as shown in FIG. 8.
- FIG. 8 shows part of the images serving as teacher data.
- In water, the attenuation rate of light near 500 nm is low, and the attenuation rate increases for wavelengths farther from about 500 nm, on both the shorter and longer sides.
- In addition, the longer the distance from the measuring device 1 to the target object, the lower the light arrival rate.
- Therefore, the closer the target object is to the measuring device 1 and the closer the wavelength of the irradiated light is to 500 nm, the more sharply the target object is captured; conversely, the farther the target object is from the measuring device 1 and the farther the wavelength is from 500 nm, the more unclear the target object becomes, until it is not captured at all.
- The distance/speed measurement unit 23 machine-learns the teacher data consisting of these images with a deep neural network, as shown in FIG. 9.
- This model consists, for example, of five convolutional layers (Conv1 to Conv5), three pooling layers (Max Pooling), and two fully connected layers (FC). Through machine learning, a model that finally outputs a one-dimensional classification vector with five elements, from Distance 1 mm to Distance 200 mm, is generated and stored in the memory 11.
- Such deep neural network learning is performed for each target object, and a model for each target object is generated and stored in the memory 11.
- the distance/speed measurement unit 23 reads out the identified type of model from the memory 11 . Further, the distance/velocity measurement unit 23 resizes the target object portion in the image captured by the vision sensor 14a to 32 pixels ⁇ 32 pixels, and inputs the resized image to the read model. As a result, the value of a one-dimensional classification vector having five elements from Distance 1mm to Distance 200mm is output. Then, the distance/velocity measurement unit 23 outputs (measures) the element with the highest value (one of Distance 1 mm to Distance 200 mm) among the five elements as the distance in the imaging direction of the target object. In addition, the distance/velocity measurement unit 23 calculates the distance in the imaging direction (Z-axis direction) based on the interval at which the images are acquired and the distance in the imaging direction in each image for the target object being tracked between successive images. Calculate the velocity of
- Further, the distance/velocity measurement unit 23 geometrically calculates the velocities of the target object in the X-axis and Y-axis directions from the interval at which images are acquired, the number of pixels by which the target object has moved between images (that is, the distance moved on the imaging surface 40), and the distance D in the imaging direction in each image. In this way, the distance/velocity measurement unit 23 calculates the velocity of the target object in each of the three axial directions.
- In addition, the distance/velocity measurement unit 23 calculates the longitudinal size (length H) of the target object from the focal length f of the vision sensor 14a, the distance D of the target object in the imaging direction, and the longitudinal length s of the image 42 of the target object formed on the imaging surface 40.
- the distance/speed measurement unit 23 measures the size, distance, and speed of the target object based on the learning results learned in advance for each type of target object.
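The size and Z-axis-velocity calculations above follow the pinhole relation D = fH/s given elsewhere in this document (equation (1)); a minimal sketch under assumed units (millimetres and seconds) is:

```python
def object_length(focal_length: float, distance: float, image_length: float) -> float:
    """Longitudinal size H of the target object, from D = f*H/s => H = D*s/f."""
    return distance * image_length / focal_length

def z_velocity(d_prev: float, d_curr: float, interval_s: float) -> float:
    """Velocity in the imaging (Z-axis) direction between two successive images."""
    return (d_curr - d_prev) / interval_s

# Illustrative values (assumptions): f = 4 mm, D = 100 mm, s = 0.2 mm
H = object_length(4.0, 100.0, 0.2)   # 5.0 mm
vz = z_velocity(100.0, 110.0, 0.5)   # 20.0 mm/s
```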
- the carbon amount estimation unit 24 estimates the carbon amount in the water by performing the carbon amount estimation process in step S6.
- In the carbon amount estimation process, a spatial carbon amount estimation process, which estimates the amount of carbon existing within a predetermined volume at a specific timing, and a deposited carbon amount estimation process, which estimates the amount of carbon deposited on the bottom of the water during a predetermined period, are performed.
- In the carbon amount estimation process, as described above, the carbon amount is estimated by a rule-based or machine-learning carbon amount estimation program.
- Here, the machine-learning carbon amount estimation process will be described in detail.
- FIG. 10 is a diagram explaining machine learning of the carbon amount estimation process.
- In the learning mode of the carbon amount estimation process, a large amount of training data in which carbon amounts are associated with input information of target objects is prepared, and a carbon amount estimation model is generated by the computer 50.
- As the input information of the target object, for example, the image, size, and type of the target object are set.
- The computer 50, which has a CPU, generates a carbon amount estimation model by performing machine learning with a known algorithm using this training data.
- the carbon amount estimation model generated here outputs the carbon amount estimated from the input information when the input information of the target object is input.
- the generated carbon amount estimation model is pre-stored in the memory 11 of the measuring device 1 .
- the computer 50 may be the measuring device 1 .
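As a toy sketch of this learning mode, assume (hypothetically) that the carbon amount scales linearly with a single size feature; a model could then be fitted from training pairs as below. The actual system, as the text states, trains on the image, size, and type with a known machine-learning algorithm, so this is only an illustration of the pair-of-(input, carbon) training idea:

```python
def fit_carbon_model(training_pairs):
    """Fit carbon ~ k * size by least squares through the origin.

    training_pairs: list of (size, carbon_amount) tuples. Units are
    illustrative assumptions, not from the source.
    """
    num = sum(size * carbon for size, carbon in training_pairs)
    den = sum(size * size for size, _ in training_pairs)
    k = num / den
    return lambda size: k * size

# Hypothetical training data: (size, carbon amount) pairs.
model = fit_carbon_model([(10.0, 2.0), (20.0, 4.0), (40.0, 8.0)])
estimate = model(30.0)  # 6.0 with these made-up pairs
```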
- FIG. 11 is a diagram explaining the spatial carbon amount estimation process.
- In the spatial carbon amount estimation process, the carbon amount estimation unit 24 estimates the total carbon amount of all target objects identified in one imaging.
- Specifically, the carbon amount estimation unit 24 acquires, as input information, the image, size, and type of each target object identified or measured by the class identification unit 22 and the distance/speed measurement unit 23, and derives the carbon amount of each target object by inputting the acquired input information into the carbon amount estimation model.
- The carbon amount estimation unit 24 then sums the estimated carbon amounts of the target objects to calculate the total carbon amount of all the target objects identified in one imaging.
- Since the imaging conditions of the imaging unit 14, such as the imaging range and focal length, are known in advance, the volume of the imaging range that can be imaged by the imaging unit 14 is also known.
- The carbon amount estimation unit 24 therefore divides the total carbon amount of all the target objects identified in one imaging by the known volume of the imaging range, thereby calculating the instantaneous carbon amount per unit volume (ugC/L).
- Since the carbon amount calculated here is the amount of carbon held by all the target objects identified in one imaging, it is a value that indicates the amount of carbon held in the water.
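The spatial estimate above reduces to a sum divided by the known imaging volume; a short sketch (the unit names are assumptions):

```python
def spatial_carbon_ugc_per_l(carbon_per_object_ugc, imaging_volume_l):
    """Instantaneous carbon amount per litre within the imaging range."""
    return sum(carbon_per_object_ugc) / imaging_volume_l

# Hypothetical per-object estimates from one imaging, over an assumed 2 L volume.
density = spatial_carbon_ugc_per_l([1.5, 2.5, 4.0], 2.0)  # 4.0 ugC/L
```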
- FIG. 12 is a diagram for explaining the deposited carbon amount estimation process.
- In the deposited carbon amount estimation process, the carbon amount estimation unit 24 extracts, based on the moving direction of each target object, the target objects that have moved downward from among the target objects imaged during a predetermined period such as one minute.
- Here, the moving direction is expressed as an angle in a predetermined vertical plane, with the vertically downward direction taken as 0°, and target objects whose moving direction falls within a predetermined range are extracted. In other words, only target objects that are likely to be deposited on the bottom of the water are extracted.
- the carbon content estimation unit 24 acquires the image, size, and type of the extracted target object as input information, and uses the carbon content estimation model to estimate the carbon content of each target object moving downward.
- The carbon amount estimation unit 24 also calculates, based on the moving speed of each target object, the time until the target object is deposited on the bottom of the water. The distance from the position where the measuring device 1 is installed to the bottom of the water is known.
- Then, based on the carbon amount of each extracted target object and the time until it is deposited on the bottom of the water, the carbon amount estimation unit 24 calculates the total carbon amount of the target objects that reach the bottom of the water in, for example, one day as the amount of carbon deposited per unit area per day (ugC/day).
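The deposited-carbon calculation above can be sketched as follows. The 45° angular tolerance, the millimetre/second units, and the field layout are illustrative assumptions; the source only states that downward-moving targets are extracted and that the time to reach the known bottom depth is computed from the moving speed:

```python
SECONDS_PER_DAY = 86_400

def deposited_carbon_ugc_per_day(targets, depth_to_bottom_mm, max_angle_deg=45.0):
    """Sum the carbon of downward-moving targets that reach the bottom in one day.

    targets: (carbon_ugc, direction_deg_from_vertical, sink_speed_mm_s) tuples.
    """
    total = 0.0
    for carbon_ugc, direction_deg, speed_mm_s in targets:
        if abs(direction_deg) > max_angle_deg or speed_mm_s <= 0:
            continue  # not sinking toward the bottom
        if depth_to_bottom_mm / speed_mm_s <= SECONDS_PER_DAY:
            total += carbon_ugc  # reaches the bottom within one day
    return total

# Hypothetical targets over an assumed 1 m (1000 mm) water column.
daily = deposited_carbon_ugc_per_day(
    [(2.0, 0.0, 1.0), (3.0, 90.0, 1.0), (5.0, 10.0, 0.001)], 1000.0)  # 2.0
```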
- FIG. 13 is a diagram illustrating the configuration of a measuring device 100 as a second embodiment according to the present technology. As shown in FIG. 13, the measuring device 100 differs from the measuring device 1 according to the first embodiment in that the control unit 110 does not function as the class identification unit 22; the rest of the configuration is the same as that of the measuring device 1.
- The measuring device 100 measures the distance to the target object in the imaging direction and its speed based on the image captured by the vision sensor 14a, without identifying the type of the target object.
- FIG. 14 is a diagram explaining an example of measurement settings.
- the control unit 110 performs measurement according to preset measurement settings as shown in FIG.
- the measurement settings specify the measurement start conditions, the operation time sheet of the illumination section 3, the distance/speed measurement program, the carbon content estimation program, and the measurement end conditions.
- The measurement start condition specifies a condition for starting measurement: for example, the time to start measurement, or the reception of a measurement start command input via the communication unit 12.
- a time sheet for operating the lighting unit 3 is specified in the operation time sheet.
- For example, it is specified to irradiate light while changing the wavelength from 400 nm to 700 nm in 10-nm steps (400 nm, 410 nm, and so on).
- a program for measuring the distance, speed, etc. of a target object is specified in the distance/velocity measurement program.
- a distance/velocity measurement program based on machine learning, a rule-based distance/velocity measurement program, etc. are specified.
- a program for estimating the amount of carbon is specified in the carbon amount estimation program.
- a carbon amount estimation program based on machine learning, a rule-based carbon amount estimation program, etc. are specified.
- The measurement end condition specifies a condition for ending measurement: for example, the time to end measurement, or the reception of a measurement end command input via the communication unit 12.
- the measurement setting in the second embodiment differs from the measurement setting in the first embodiment in that no identification program is provided.
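The measurement-settings items listed above could be represented, for example, as a simple record; all field names and values here are assumptions for illustration (note the absence of an identification program, as stated for the second embodiment):

```python
# Illustrative measurement settings for the second embodiment.
measurement_settings = {
    "start_condition": {"start_time": "22:00"},        # or a start command
    "operation_time_sheet": {
        # 400 nm to 700 nm in 10-nm steps, as specified in the text
        "wavelengths_nm": list(range(400, 701, 10)),
    },
    "distance_speed_program": "machine_learning",      # or "rule_based"
    "carbon_estimation_program": "machine_learning",   # or "rule_based"
    "end_condition": {"end_time": "04:00"},            # or an end command
}
```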
- FIG. 15 is a flowchart showing the procedure of measurement processing.
- the control unit 110 executes the software (distance speed measurement program, carbon amount estimation program) stored in the memory 11 to execute the measurement process shown in FIG. 15 .
- In step S1, the control unit 110 reads external environment information. Then, in step S2, the control unit 110 determines whether the measurement start condition specified in the measurement settings is satisfied. The control unit 110 repeats steps S1 and S2 until the measurement start condition is satisfied.
- step S3 the imaging control unit 21 switches and irradiates light with different wavelengths from the illumination unit 3 according to the operation time sheet specified in the measurement settings.
- The imaging control unit 21 causes the imaging unit 14 to image the imaging range 30 each time the wavelength or on/off state of the light emitted from the illumination unit 3 is switched, and acquires pixel data and image data.
- In step S11, the distance/speed measurement unit 23 detects an object existing in the imaging range as a target object based on the image derived from the pixel data, and executes distance/speed measurement processing for measuring the size, distance, and speed of the target object. Details of the distance/speed measurement process in step S11 will be described later.
- step S12 the carbon amount estimation unit 24 executes a carbon amount estimation process for estimating the carbon amount in the water.
- the details of the carbon content estimation process in step S12 will be described later.
- In step S6, the control unit 110 determines whether the end condition for ending the measurement process is satisfied. The control unit 110 repeats steps S3 to S6 until the end condition is satisfied, and when it is satisfied, ends the measurement process.
- step S11 the distance/speed measurement unit 23 executes distance/speed measurement processing based on a rule-based or machine learning distance/speed measurement program.
- a specific example will be given of the distance/speed measurement processing of machine learning.
- The measuring apparatus 100 creates a deep learning model as shown in FIG. 9.
- In the first embodiment, a model is generated for each target object; in the second embodiment, by contrast, only one model, trained in advance regardless of the type of target object, is generated, without generating a model for each target object.
- Specifically, images captured by the vision sensor 14a are prepared for five patterns of distance in the imaging direction from the measuring device 1 to the target object (1 mm, 5 mm, 10 mm, 100 mm, and 200 mm) and for wavelengths of the irradiated light varied in 10-nm steps from 400 nm to 700 nm (31 patterns), for each of the different target objects; that is, a total of 153 patterns × the number of types of target objects.
- For each of the prepared images, the distance/velocity measurement unit 23 detects, as a target object, a group of pixels within a predetermined range in which motion is detected, and resizes the group of pixels to 32 pixels × 32 pixels to generate the training-data images shown in FIG. 8.
- The distance/speed measurement unit 23 performs machine learning on the training data consisting of these images with a deep neural network, as shown in FIG. 9, and the generated model is stored in the memory 11.
- In the measurement, the distance/velocity measurement unit 23 resizes the target-object portion of the image captured by the vision sensor 14a to 32 pixels × 32 pixels and inputs the resized image to the model read from the memory 11. As a result, the values of a one-dimensional classification vector having five elements, from Distance 1 mm to Distance 200 mm, are output. The distance/velocity measurement unit 23 then outputs (measures) the element with the highest value (one of Distance 1 mm to Distance 200 mm) among the five elements as the distance of the target object in the imaging direction.
- For a target object tracked across successive images, the distance/velocity measurement unit 23 calculates (measures) the velocity in the imaging direction (Z-axis direction) based on the interval at which the images are acquired and the distance in the imaging direction in each image.
- Furthermore, the distance/velocity measurement unit 23 geometrically calculates the velocities of the target object in the X-axis and Y-axis directions from the interval at which images are acquired, the number of pixels by which the target object has moved between images (that is, the distance moved on the imaging surface 40), and the distance D in the imaging direction in each image. In this way, the distance/velocity measurement unit 23 calculates the velocity of the target object in each of the three axial directions.
- In addition, the distance/velocity measurement unit 23 calculates the longitudinal size (length H) of the target object from the focal length f of the vision sensor 14a, the distance D of the target object in the imaging direction, and the longitudinal length s of the image 42 of the target object formed on the imaging surface 40.
- In this way, the distance/speed measurement unit 23 measures the size, distance, and speed of the target object based on the learning result learned in advance.
- In the second embodiment, the data volume can be reduced because the number of models is smaller than in the first embodiment. Although the distance measurement accuracy is lower in the second embodiment, the calculation time is shorter.
- FIG. 16 is a diagram explaining machine learning of the carbon amount estimation process.
- the carbon amount estimating unit 24 executes the carbon amount estimating process based on the rule-based or machine learning carbon amount estimating program.
- Here, the machine-learning carbon amount estimation process will be described with a specific example.
- the measurement device 100 creates a carbon content estimation model.
- In the first embodiment, the type, size, and image information of the target object were used as input information to generate the carbon amount estimation model; in the second embodiment, the carbon amount estimation model is generated without using the type and image information of the target object as input information.
- That is, only the size of the target object is set as the input information of the target object.
- The computer 50, which has a CPU, generates a carbon amount estimation model by performing machine learning with a known algorithm using a large amount of training data including the sizes and carbon amounts of target objects.
- the generated carbon amount estimation model is pre-stored in the memory 11 of the measuring device 1 .
- FIG. 17 is a diagram explaining the spatial carbon amount estimation process. As shown in FIG. 17, the carbon amount estimating unit 24 estimates carbon amounts for all target objects identified by one imaging in the space carbon amount estimating process.
- Specifically, the carbon amount estimation unit 24 uses the size of each target object as input information and estimates the carbon amount of each target object using the carbon amount estimation model. The carbon amount estimation unit 24 then sums the estimated carbon amounts of the target objects to calculate the total carbon amount of all the target objects identified in one imaging.
- The carbon amount estimation unit 24 divides the total carbon amount of all the target objects identified in one imaging by the known volume of the imaging range, thereby calculating the instantaneous carbon amount per unit volume (ugC/L).
- FIG. 18 is a diagram for explaining the deposited carbon amount estimation process.
- In the deposited carbon amount estimation process, the carbon amount estimation unit 24 extracts, based on the moving direction of each target object, the target objects moving downward from among the target objects imaged during a predetermined period such as one minute.
- The carbon amount estimation unit 24 uses the size of each extracted target object as input information and estimates the carbon amount of each target object using the carbon amount estimation model. The carbon amount estimation unit 24 also calculates, based on the moving speed of each target object, the time until the target object is deposited on the bottom of the water.
- Then, based on the carbon amount of each extracted target object and the time until it is deposited on the bottom of the water, the carbon amount estimation unit 24 calculates the total carbon amount of the target objects that reach the bottom of the water in, for example, one day as the amount of carbon deposited per unit area per day (ugC/day).
- In the above embodiments, the measuring device 1 is provided with one illumination unit 3; however, the number of illumination units 3 is not limited to one, and a plurality of illumination units may be provided.
- FIG. 19 is a diagram illustrating the configuration of a measuring device 200 of a modified example.
- the measuring device 200 of the modification includes one body section 2 and two lighting sections 3 .
- the two illumination units 3 are arranged so as to be able to irradiate light in mutually orthogonal directions, and can irradiate the imaging range with light of different wavelengths.
- Note that the two illumination units 3 may be arranged to irradiate light at an angle suited to the measurement, or in parallel, instead of in mutually orthogonal directions.
- In such a measuring device 200, light of different wavelengths can be emitted from the two illumination units 3, so identification information for target objects (microorganisms) that exhibit taxis in response to light of different wavelengths can be derived in a single measurement, and measurement can be performed efficiently.
- FIG. 20 is a diagram illustrating the configuration of a measuring device 300 of a modified example.
- the measurement device 300 of the modification includes two main body sections 2 and one lighting section 3 .
- the two main bodies 2 are arranged so as to be able to capture images in directions perpendicular to each other.
- Note that the two main body sections 2 may be arranged to capture images at an angle suited to the measurement, or in parallel, instead of in mutually orthogonal directions.
- In such a measuring device 300, images can be captured by the two main body sections 2 (imaging units 14), so the three-dimensional movement of the target object can be detected, and measurement can be performed more efficiently.
- the imaging unit 14 is provided with the vision sensor 14a and the imaging sensor 14b.
- the imaging unit 14 may include only one of the vision sensor 14a and the imaging sensor 14b.
- the imaging unit 14 may be provided with a SPAD (Single Photon Avalanche Diode) sensor instead of the vision sensor 14a and the imaging sensor 14b.
- The methods described in the above embodiments for identifying or measuring the type, size, moving direction, distance, velocity, and moving speed of the target object are merely examples; the type, size, moving direction, distance, velocity, and moving speed of the target object may be identified or measured by various known methods.
- one or more of the type, size, and image information of the target object are used as information regarding the target object, and the information regarding the target object is used as input information to estimate the carbon content.
- the information about the target object may include not only the type, size, and image information of the target object, but also other information such as moving speed and moving direction.
- As described above, the measuring device 1 of the embodiment includes the imaging control unit 21, which causes the imaging unit 14 to image a predetermined imaging range in water, and the carbon amount estimation unit 24, which estimates the amount of carbon in the water based on the image captured by the imaging unit 14. The measuring device 1 can thereby estimate the carbon amount based on the image captured by the imaging unit 14.
- the measuring device 1 can observe carbon transition over a long period of time.
- the imaging unit 14 may include a vision sensor 14a that asynchronously acquires pixel data according to the amount of light incident on each of the pixels arranged in a two-dimensional array. This makes it possible to read out only the pixel data of the pixel where the event has occurred, and estimate the carbon content based on the pixel data. Therefore, the measuring apparatus 1 can realize high-speed imaging, reduction of power consumption, and low calculation cost of image processing by automatic separation from the background.
- the carbon content estimation unit 24 may estimate the carbon content existing within the imaging range. This makes it possible to estimate the amount of carbon contained (dissolved) in water.
- the carbon amount estimation unit 24 may estimate the amount of carbon deposited on the bottom of the water. This makes it possible to grasp the amount of carbon fixed to the bottom of the water.
- the carbon amount estimation unit 24 may estimate the amount of carbon deposited on the bottom of the water per predetermined time. This makes it possible to grasp the change in the amount of carbon fixed to the bottom of the water.
- The measuring device 1 includes a measurement unit (the distance/velocity measurement unit 23) that measures the size of the target object imaged by the imaging unit 14, and the carbon amount estimation unit 24 may estimate the carbon amount based on the size of the target object. Since the proportion of carbon contained in target objects existing in water is known to some extent, estimating the carbon amount based on the size of the target object allows the carbon amount to be estimated by simple processing.
- The measuring device 1 according to the present technology described above includes the class identification unit 22, which identifies the type of the target object imaged by the imaging unit 14, and the carbon amount estimation unit 24 may estimate the carbon amount based on the type and size of the target object. Since the carbon ratio is known for each type of target object, estimating the carbon amount based on the type and size of the target object makes it possible to estimate the carbon amount accurately.
- In the measuring device 1 according to the present technology described above, the class identification unit 22 may extract the image portion of the target object captured by the imaging unit 14, and the carbon amount estimation unit may estimate the carbon amount based on the image portion, type, and size of the target object. By further using the image portion of the target object, the carbon amount can be estimated with even higher accuracy.
- The measuring device 1 includes a direction measurement unit (the class identification unit 22) that measures the moving direction of the target object imaged by the imaging unit 14, and the carbon amount estimation unit 24 may identify target objects deposited on the bottom of the water based on the moving direction of the target object. This makes it possible to accurately estimate the amount of carbon deposited on the bottom of the water based on the target objects deposited there.
- The measuring device 1 includes a speed measurement unit (the distance/speed measurement unit 23) that measures the moving speed of the target object imaged by the imaging unit 14, and the carbon amount estimation unit 24 may identify, per predetermined time, target objects deposited on the bottom of the water based on the moving direction and moving speed of the target object. This makes it possible to accurately estimate the amount of carbon deposited on the bottom of the water per predetermined time based on the target objects deposited there per predetermined time.
- In the measurement method according to the present technology described above, the imaging unit is caused to image a predetermined imaging range in water, and the carbon amount is estimated based on the image captured by the imaging unit.
- In the program according to the present technology described above, the information processing device is caused to execute processing in which the imaging unit images a predetermined imaging range in water and the carbon amount is estimated based on the image captured by the imaging unit.
- Such a program can be recorded in advance in a HDD as a recording medium built in equipment such as a computer device, or in a ROM or the like in a microcomputer having a CPU.
- Alternatively, such a program can be temporarily or permanently stored (recorded) in a removable recording medium such as a flexible disc, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disc, a semiconductor memory, or a memory card. Such removable recording media can be provided as so-called package software.
- In addition to being installed from a removable recording medium into a personal computer or the like, such a program can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
- the present technology can also adopt the following configuration.
- (1) A measuring device including: an imaging control unit that causes an imaging unit to image a predetermined imaging range in water; and a carbon amount estimation unit that estimates a carbon amount based on an image captured by the imaging unit.
- (2) The measuring device according to (1), in which the imaging unit includes a vision sensor that asynchronously acquires pixel data according to the amount of light incident on each of a plurality of two-dimensionally arranged pixels.
- (3) The measuring device according to (1) or (2), in which the carbon amount estimation unit estimates the amount of carbon existing within the imaging range.
- (4) The measuring device according to any one of (1) to (3), in which the carbon amount estimation unit estimates the amount of carbon deposited on the bottom of the water.
- (5) The measuring device according to (4), in which the carbon amount estimation unit estimates the amount of carbon deposited on the bottom of the water per predetermined time.
- (6) The measuring device according to any one of (1) to (5), further including a measurement unit that measures the size of a target object imaged by the imaging unit, in which the carbon amount estimation unit estimates the carbon amount based on the size of the target object.
- (7) The measuring device according to (6), further including a class identification unit that identifies the type of the target object imaged by the imaging unit, in which the carbon amount estimation unit estimates the carbon amount based on the type and size of the target object.
- (8) The measuring device according to (7), in which the class identification unit extracts an image portion of the target object imaged by the imaging unit, and the carbon amount estimation unit estimates the carbon amount based on the image portion, type, and size of the target object.
- (9) The measuring device according to (5), further including a direction measurement unit that measures the moving direction of a target object imaged by the imaging unit, in which the carbon amount estimation unit identifies target objects deposited on the bottom of the water based on the moving direction of the target object.
- (10) The measuring device according to (9), further including a speed measurement unit that measures the moving speed of the target object imaged by the imaging unit, in which the carbon amount estimation unit identifies, per predetermined time, target objects deposited on the bottom of the water based on the moving direction and moving speed of the target object.
Abstract
Description
This allows the measuring device to estimate the carbon amount based on the image captured by the imaging unit.
<1. First Embodiment>
[1.1 Configuration of the Measuring Device]
[1.2 Target Objects]
[1.3 Measurement Method of the First Embodiment]
[1.4 Measurement Process]
[1.5 Distance/Speed Measurement Process]
[1.6 Carbon Amount Estimation Process]
<2. Second Embodiment>
[2.1 Configuration of the Measuring Device]
[2.2 Measurement Process]
[2.3 Machine-Learning Distance/Speed Measurement Process]
[2.4 Carbon Amount Estimation Process]
<3. Other Configuration Examples of the Measuring Device>
<4. Summary of the Embodiments>
<5. Present Technology>
[1.1 Configuration of the Measuring Device]
First, the configuration of the measuring device 1 as a first embodiment according to the present technology will be described.
The measuring device 1 is a device that takes microorganisms or fine particles present in water, such as the sea, as target objects, and estimates (measures) the amount of carbon in the water by estimating the carbon amount of the target objects.
As shown in FIG. 1, the measuring device 1 includes a main body section 2 and an illumination unit 3. Note that the illumination unit 3 may be provided inside the main body section 2.
The control unit 10 also performs processing for reading data stored in the memory 11 and storing data in the memory 11, and exchanges various data with external devices via the communication unit 12.
FIG. 3 is a diagram explaining target objects and their movement. In FIG. 3, the upper row shows images of the target objects, and the lower row shows their moving directions with arrows.
It is known that some microorganisms exhibit taxis when irradiated with light of a specific wavelength. Here, taxis is an innate behavior in which an organism responds to light (an external stimulus). Therefore, when a microorganism having taxis is irradiated with light of a specific wavelength, the microorganism moves in accordance with its taxis.
Seabed sand consists of particles, such as sand settled on the seabed, which move in swirls due to bottom currents.
Smoke is, for example, a phenomenon in which high-temperature water heated geothermally erupts from hydrothermal vents on the seabed. The hot water blowing out of the vents can reach several hundred degrees and is rich in dissolved heavy metals and hydrogen sulfide, so it reacts with seawater and moves upward as swirling black or white smoke.
Bubbles are, for example, natural gases such as methane and carbon dioxide leaking (erupting) from the seabed, or carbon dioxide leaking from a reservoir into which it was artificially injected by CCS (carbon dioxide storage), and they move upward from the seabed.
Next, the measurement method (measurement process) for the target objects as the first embodiment will be described.
At a depth of about 150 m, the ocean becomes an aphotic zone that sunlight does not reach. The aphotic zone occupies most of the open ocean, and many of the target objects described above are present there. Meanwhile, target objects are known to reflect or emit light of a different wavelength or intensity for each wavelength of irradiated light.
FIG. 6 is a flowchart showing the procedure of the measurement process. The control unit 10 executes the measurement process shown in FIG. 6 by executing the software stored in the memory 11 (including the identification program, the distance/speed measurement program, and the carbon amount estimation program).
Next, the distance/speed measurement process will be described. As described above, in step S5 the distance/speed measurement unit 23 executes the distance/speed measurement process based on a rule-based or machine-learning distance/speed measurement program.
Here, the rule-based distance/speed measurement process and the machine-learning distance/speed measurement process will each be described with a specific example.
FIG. 7 is a diagram explaining the rule-based distance/speed measurement process. In the rule-based distance/speed measurement process, the focal length f of the vision sensor 14a is stored in the memory 11 as known information.
D = fH/s ... (1)
For a target object 41 tracked across successive images, the distance/speed measurement unit 23 calculates the velocity in the imaging direction (Z-axis direction) based on the interval at which the images are acquired and the distance D in each image.
In this way, the distance/speed measurement unit 23 calculates the velocity of the target object in each of the three axial directions.
FIG. 8 is a diagram explaining the images serving as training data. FIG. 9 is a model diagram of the deep learning.
In addition, the farther the target object is from the measuring device 1, the lower the arrival rate of the light.
For a target object tracked across successive images, the distance/speed measurement unit 23 calculates the velocity in the imaging direction (Z-axis direction) based on the interval at which the images are acquired and the distance in the imaging direction in each image.
In this way, the distance/speed measurement unit 23 calculates the velocity of the target object in each of the three axial directions.
Meanwhile, in order to reduce the greenhouse gases that cause global warming, it is required to observe over the long term the transition of carbon, the source of carbon dioxide, which is a greenhouse gas. Part of the carbon dioxide is absorbed into the sea (water). Part of the carbon dioxide absorbed into the sea is fixed as carbon by photosynthesis of phytoplankton. Furthermore, the carcasses and feces of phytoplankton, or of planktonic organisms (zooplankton, larvae, and the like) that prey on phytoplankton, are deposited on the bottom of the water as marine snow. The marine snow deposited on the bottom then becomes a huge carbon sink on a scale of 1000 years.
Note that the embodiments are not limited to the specific examples described above, and configurations as various modifications can be adopted.
In such a measuring device 200, since light of different wavelengths can be emitted from the two illumination units 3, identification information of target objects (microorganisms) that exhibit phototaxis in response to light of different wavelengths can be derived in a single measurement, enabling efficient measurement.
In such a measuring device 300, since images can be captured by the two main body units 2 (imaging units 14), three-dimensional movement of the target object can be detected, enabling even more efficient measurement.
Note that when two main body units 2 are provided, one of the main body units 2 may include only the imaging unit 14.
As described above, the measuring device 1 of the embodiment includes the imaging control unit 21 that causes the imaging unit 14 to image a predetermined imaging range in water, and the carbon amount estimation unit 24 that estimates the amount of carbon in the water on the basis of the image captured by the imaging unit 14.
This enables the measuring device 1 to estimate the amount of carbon on the basis of the image captured by the imaging unit 14.
Thus, the measuring device 1 can observe the transition of carbon over the long term.
This makes it possible to read out only the pixel data of pixels in which an event has occurred and to estimate the amount of carbon on the basis of that pixel data.
Therefore, the measuring device 1 can achieve high-speed imaging, reduced power consumption, and a lower computational cost of image processing through automatic separation from the background.
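The event-driven readout described above can be illustrated with a minimal sketch (the event tuple format and the region-of-interest filtering are assumptions for illustration, not from the specification): only pixels whose incident light changes produce data, so a moving target object yields a sparse event stream that can be processed cheaply.

```python
# Each event is (x, y, timestamp_s, polarity); only pixels whose
# incident light level changed emit events, so a static background
# produces no data at all.
def filter_events(events, min_x, max_x, min_y, max_y):
    """Keep only events inside a region of interest, e.g. around a tracked object."""
    return [e for e in events
            if min_x <= e[0] <= max_x and min_y <= e[1] <= max_y]

events = [(5, 5, 0.001, 1), (120, 80, 0.002, -1), (6, 5, 0.003, 1)]
roi_events = filter_events(events, 0, 10, 0, 10)  # events near the object only
```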
This makes it possible to estimate the amount of carbon contained (dissolved) in the water.
This makes it possible to grasp the amount of carbon fixed on the water bottom.
This makes it possible to grasp the transition of the amount of carbon fixed on the water bottom.
Since the proportion of carbon contained in target objects present in water is known to some extent, estimating the amount of carbon on the basis of the size of the target object makes it possible to estimate the amount of carbon by simple processing.
The above-described measuring device 1 according to the present technology may include the class identification unit 22 that identifies the type of the target object imaged by the imaging unit 14, and the carbon amount estimation unit 24 may estimate the amount of carbon on the basis of the type and size of the target object.
Since the proportion of carbon is known for each type of target object, estimating the amount of carbon on the basis of the type and size of the target object makes it possible to estimate the amount of carbon with high accuracy.
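A minimal rule-based sketch of this idea (the per-type carbon densities and the size-to-volume conversion below are illustrative placeholders, not values from the specification): look up a carbon fraction for the identified type and scale it by a volume estimated from the measured size.

```python
import math

# Illustrative carbon density per unit volume by object type (placeholder values).
CARBON_DENSITY_PG_PER_UM3 = {"phytoplankton": 0.2, "marine_snow": 0.05}

def estimate_carbon_pg(object_type: str, diameter_um: float) -> float:
    """Estimate carbon (picograms) from type and size, assuming a spherical object."""
    volume_um3 = (math.pi / 6.0) * diameter_um ** 3  # sphere volume from diameter
    return CARBON_DENSITY_PG_PER_UM3[object_type] * volume_um3

c = estimate_carbon_pg("phytoplankton", 10.0)  # carbon for a 10 um cell
```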
Using the image portion of the target object in addition makes it possible to estimate the amount of carbon with even higher accuracy.
This makes it possible to accurately estimate the amount of carbon deposited on the water bottom on the basis of the target objects deposited on the water bottom.
This makes it possible to accurately estimate the amount of carbon deposited on the water bottom per predetermined time on the basis of the target objects deposited on the water bottom per predetermined time.
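The per-unit-time estimate can be sketched as follows (a simplified illustration; the selection criterion and names are assumptions, not from the specification): among tracked target objects, only those whose movement direction is downward are counted as depositing, and their summed carbon divided by the observation time gives a deposition rate.

```python
def carbon_deposition_rate(objects, observation_time_s: float) -> float:
    """Sum the carbon of downward-moving objects and divide by observation time.

    Each object is (carbon_amount, vertical_speed); negative speed = sinking.
    """
    sinking_carbon = sum(c for c, v in objects if v < 0)
    return sinking_carbon / observation_time_s

# Two sinking objects (5 and 3 carbon units) and one rising object, over 10 s:
rate = carbon_deposition_rate([(5.0, -0.1), (3.0, -0.2), (2.0, 0.3)], 10.0)
```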
The program according to the present technology described above causes an information processing device to execute a process of causing an imaging unit to image a predetermined imaging range in water and estimating the amount of carbon on the basis of the image captured by the imaging unit.
Alternatively, such a program can be temporarily or permanently stored (recorded) in a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disk, a semiconductor memory, or a memory card. Such a removable recording medium can be provided as so-called package software.
In addition to being installed from a removable recording medium into a personal computer or the like, such a program can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
The present technology can also adopt the following configurations.
(1)
A measuring device including:
an imaging control unit that causes an imaging unit to image a predetermined imaging range in water; and
a carbon amount estimation unit that estimates an amount of carbon on the basis of an image captured by the imaging unit.
(2)
The measuring device according to (1), in which
the imaging unit includes a vision sensor that asynchronously acquires pixel data in accordance with the amount of light incident on each of a plurality of pixels arranged two-dimensionally.
(3)
The measuring device according to (1) or (2), in which
the carbon amount estimation unit estimates the amount of carbon present within the imaging range.
(4)
The measuring device according to any one of (1) to (3), in which
the carbon amount estimation unit estimates the amount of carbon deposited on a water bottom.
(5)
The measuring device according to (4), in which
the carbon amount estimation unit estimates the amount of carbon deposited on the water bottom per predetermined time.
(6)
The measuring device according to any one of (1) to (5), further including
a measurement unit that measures a size of a target object imaged by the imaging unit, in which
the carbon amount estimation unit estimates the amount of carbon on the basis of the size of the target object.
(7)
The measuring device according to (6), further including
a class identification unit that identifies a type of the target object imaged by the imaging unit, in which
the carbon amount estimation unit estimates the amount of carbon on the basis of the type and size of the target object.
(8)
The measuring device according to (7), in which
the class identification unit extracts an image portion of the target object imaged by the imaging unit, and
the carbon amount estimation unit estimates the amount of carbon on the basis of the image portion, type, and size of the target object.
(9)
The measuring device according to (5), further including
a direction measurement unit that measures a movement direction of a target object imaged by the imaging unit, in which
the carbon amount estimation unit identifies, on the basis of the movement direction of the target object, target objects deposited on the water bottom.
(10)
The measuring device according to (9), further including
a speed measurement unit that measures a movement speed of the target object imaged by the imaging unit, in which
the carbon amount estimation unit identifies, on the basis of the movement direction and movement speed of the target object, target objects deposited on the water bottom per predetermined time.
(11)
A measurement method including:
causing an imaging unit to image a predetermined imaging range in water; and
estimating an amount of carbon on the basis of an image captured by the imaging unit.
(12)
A program that causes a measuring device to execute a process of:
causing an imaging unit to image a predetermined imaging range in water; and
estimating an amount of carbon on the basis of an image captured by the imaging unit.
3 Illumination unit
10 Control unit
14 Imaging unit
14a Vision sensor
14b Imaging sensor
21 Imaging control unit
22 Class identification unit
23 Distance/speed measurement unit
24 Carbon amount estimation unit
Claims (12)
- A measuring device comprising:
an imaging control unit that causes an imaging unit to image a predetermined imaging range in water; and
a carbon amount estimation unit that estimates an amount of carbon on the basis of an image captured by the imaging unit. - The measuring device according to claim 1, wherein
the imaging unit includes a vision sensor that asynchronously acquires pixel data in accordance with the amount of light incident on each of a plurality of pixels arranged two-dimensionally,
the measuring device according to claim 1. - The measuring device according to claim 1, wherein
the carbon amount estimation unit estimates the amount of carbon present within the imaging range. - The measuring device according to claim 1, wherein
the carbon amount estimation unit estimates the amount of carbon deposited on a water bottom. - The measuring device according to claim 4, wherein
the carbon amount estimation unit estimates the amount of carbon deposited on the water bottom per predetermined time. - The measuring device according to claim 1, further comprising
a measurement unit that measures a size of a target object imaged by the imaging unit, wherein
the carbon amount estimation unit estimates the amount of carbon on the basis of the size of the target object. - The measuring device according to claim 6, further comprising
a class identification unit that identifies a type of the target object imaged by the imaging unit, wherein
the carbon amount estimation unit estimates the amount of carbon on the basis of the type and size of the target object. - The measuring device according to claim 7, wherein
the class identification unit extracts an image portion of the target object imaged by the imaging unit, and
the carbon amount estimation unit estimates the amount of carbon on the basis of the image portion, type, and size of the target object. - The measuring device according to claim 5, further comprising
a direction measurement unit that measures a movement direction of a target object imaged by the imaging unit, wherein
the carbon amount estimation unit identifies, on the basis of the movement direction of the target object, target objects deposited on the water bottom. - The measuring device according to claim 9, further comprising
a speed measurement unit that measures a movement speed of the target object imaged by the imaging unit, wherein
the carbon amount estimation unit identifies, on the basis of the movement direction and movement speed of the target object, target objects deposited on the water bottom per predetermined time. - A measurement method comprising:
causing an imaging unit to image a predetermined imaging range in water; and
estimating an amount of carbon on the basis of an image captured by the imaging unit. - A program that causes a measuring device to execute a process comprising:
causing an imaging unit to image a predetermined imaging range in water; and
estimating an amount of carbon on the basis of an image captured by the imaging unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22860849.3A EP4394354A1 (en) | 2021-08-26 | 2022-03-18 | Measurement device, measurement method, and program |
JP2023543668A JPWO2023026551A1 (ja) | 2021-08-26 | 2022-03-18 | |
CN202280056389.4A CN117897607A (zh) | 2021-08-26 | 2022-03-18 | 测量装置、测量方法和程序 |
KR1020247004139A KR20240047368A (ko) | 2021-08-26 | 2022-03-18 | 측정 장치, 측정 방법, 프로그램 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-138153 | 2021-08-26 | ||
JP2021138153 | 2021-08-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023026551A1 true WO2023026551A1 (ja) | 2023-03-02 |
Family
ID=85322673
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/012747 WO2023026551A1 (ja) | 2021-08-26 | 2022-03-18 | 測定装置、測定方法、プログラム |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4394354A1 (ja) |
JP (1) | JPWO2023026551A1 (ja) |
KR (1) | KR20240047368A (ja) |
CN (1) | CN117897607A (ja) |
WO (1) | WO2023026551A1 (ja) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06304546A (ja) * | 1993-04-21 | 1994-11-01 | Hitachi Ltd | 上水道プラントの運用制御装置 |
CN112362544A (zh) * | 2020-10-14 | 2021-02-12 | 南京吉泽信息科技有限公司 | 基于高光谱遥感的颗粒有机碳监测方法及系统 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7007225B2 (ja) | 2018-03-23 | 2022-01-24 | Jfeアドバンテック株式会社 | 特定種の植物プランクトンの存在量の算出方法及び算出装置、及び特定種の植物プランクトンによる赤潮発生の予兆検知方法及び予兆検知装置 |
Non-Patent Citations (1)
Title |
---|
MIYAI, HIROSHI ET AL.: "A Simple Method for the Estimation of Phytoplankton Biomass Based on Cell Morphology", Bulletin of the Plankton Society of Japan, Plankton Society of Japan, Tokyo, JP, vol. 35, no. 2, 1 January 1988, pp. 121-126, XP009543708, ISSN 0387-8961 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023026551A1 (ja) | 2023-03-02 |
CN117897607A (zh) | 2024-04-16 |
KR20240047368A (ko) | 2024-04-12 |
EP4394354A1 (en) | 2024-07-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22860849; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2023543668; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 18684180; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 202280056389.4; Country of ref document: CN |
| WWE | Wipo information: entry into national phase | Ref document number: 2022860849; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
2024-03-26 | ENP | Entry into the national phase | Ref document number: 2022860849; Country of ref document: EP; Effective date: 20240326 |