WO2022255152A1 - Measuring device, measuring method, and program - Google Patents
Measuring device, measuring method, and program
- Publication number
- WO2022255152A1 (PCT/JP2022/021150)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target object
- imaging
- unit
- measuring device
- measurement
- Prior art date
Classifications
- G06T7/70 — Image analysis: determining position or orientation of objects or cameras
- G01C3/06 — Optical rangefinders: use of electric means to obtain final indication
- G01B11/02 — Measuring length, width or thickness by the use of optical techniques
- G01B11/026 — Measuring the distance between sensor and object by optical techniques
- G06T7/20 — Image analysis: analysis of motion
- G06T7/50 — Image analysis: depth or shape recovery
- G06V10/764 — Image or video recognition using pattern recognition or machine learning: classification, e.g. of video objects
- G06V20/05 — Scene-specific elements: underwater scenes
- G06V20/693 — Microscopic objects, e.g. biological cells or cellular parts: acquisition
- G06V20/698 — Microscopic objects, e.g. biological cells or cellular parts: matching; classification
- G06T2207/10152 — Image acquisition modality: varying illumination
- G06T2207/20081 — Special algorithmic details: training; learning
- G06T2207/30241 — Subject of image: trajectory
Definitions
- This technology relates to measuring devices, measuring methods, and programs, and in particular to technology for measuring target objects in water.
- A measuring device has been proposed that measures the abundance of phytoplankton by irradiating excitation light of a predetermined wavelength to excite the phytoplankton and measuring the intensity of the fluorescence emitted from the excited phytoplankton (see, for example, Patent Document 1).
- However, such a measuring device can only measure phytoplankton that are excited by the excitation light. Moreover, although it can measure the abundance of phytoplankton, it cannot measure information about their position.
- The purpose of this technology is therefore to measure information about the position of a target object efficiently.
- A measurement apparatus according to the present technology includes an imaging control unit that causes an imaging unit to capture an image of a predetermined imaging range in water, and a measurement unit that measures information about the position of a target object in the imaging direction based on the image captured by the imaging unit.
- the measuring device can measure information about the position of the target object in the imaging direction without having a complicated configuration.
- FIG. 1 is a diagram explaining the configuration of the measuring device as a first embodiment of this technology. FIG. 2 is a diagram explaining the imaging range and measurement directions. FIG. 3 is a diagram explaining target objects and their movement. FIG. 4 is a diagram explaining an example of measurement settings. FIG. 5 is a diagram explaining an example of an operation time sheet.
- FIG. 6 is a flowchart showing the procedure of the measurement processing. FIG. 7 is a diagram explaining the rule-based distance/velocity measurement processing. FIG. 8 is a diagram explaining images serving as teacher data. FIG. 9 is a model diagram of deep learning.
- FIG. 10 is a diagram explaining the configuration of the measuring device as a second embodiment of this technology. FIG. 11 is a diagram explaining an example of measurement settings. FIG. 12 is a flowchart showing the procedure of the measurement processing. FIGS. 13 and 14 are diagrams explaining the configurations of measuring devices according to modifications.
- FIG. 15 is a diagram explaining the illumination control in Modification 1. FIG. 16 is a diagram explaining images captured by the vision sensor during the illumination control of Modification 1. FIG. 17 is a diagram explaining the illumination control in Modification 2.
- <1. First Embodiment> [1.1 Configuration of measuring device] [1.2 Target object] [1.3 Measurement method of the first embodiment] [1.4 Measurement process] [1.5 Distance/velocity measurement process]
- <2. Second Embodiment> [2.1 Configuration of measuring device] [2.2 Measurement process] [2.3 Machine-learning distance/velocity measurement process]
- <3. Other Configuration Examples of Measuring Device> <4. Summary of Embodiments> <5. This technology>
- the measuring device 1 is a device that measures information about the position of the target object in the imaging direction, for example, using microorganisms or fine particles that exist in water such as the sea as the target object.
- Target microorganisms are aquatic microorganisms existing in water, such as phytoplankton, zooplankton, and larvae of aquatic organisms.
- fine particles serving as target objects include microplastics, dust, sand, marine snow, air bubbles, and the like.
- the target object may be other than these.
- the information about the position of the target object in the imaging direction is, for example, the distance to the target object in the imaging direction of the imaging unit 14 (the Z-axis direction in FIG. 2) or the speed of the target object.
- FIG. 1 is a diagram for explaining the configuration of a measuring device 1 as a first embodiment.
- FIG. 2 is a diagram for explaining the imaging range IR and measurement directions.
- the measuring device 1 includes a main body portion 2 and an illumination portion 3.
- the lighting unit 3 may be provided inside the main body unit 2 .
- the main unit 2 includes a control unit 10, a memory 11, a communication unit 12, a gravity sensor 13, an imaging unit 14 and a lens 15.
- the control unit 10 includes, for example, a microcomputer having a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory), and controls the measuring apparatus 1 as a whole.
- the control unit 10 functions as an imaging control unit 21, a class identification unit 22, and a distance speed measurement unit 23 in the first embodiment. Details of the imaging control unit 21, the class identification unit 22, and the distance/speed measurement unit 23 will be described later.
- the control unit 10 reads data stored in the memory 11 , stores data in the memory 11 , and transmits and receives various data to and from external devices via the communication unit 12 .
- the memory 11 is composed of a non-volatile memory.
- the communication unit 12 performs wired or wireless data communication with an external device.
- the gravity sensor 13 detects gravitational acceleration (direction of gravity) and outputs the detection result to the control unit 10 . Note that the measuring device 1 does not have to include the gravity sensor 13 .
- the imaging unit 14 includes both or one of a vision sensor 14a and an imaging sensor 14b.
- the vision sensor 14a is a sensor called DVS (Dynamic Vision Sensor) or EVS (Event-Based Vision Sensor).
- the vision sensor 14 a captures an underwater predetermined imaging range IR through the lens 15 .
- In the following description, the horizontal direction of the imaging range IR is defined as the X-axis direction, the vertical direction of the imaging range IR as the Y-axis direction, and the imaging direction (optical axis direction) of the imaging unit 14 as the Z-axis direction.
- the vision sensor 14a is an asynchronous image sensor in which a plurality of pixels having photoelectric conversion elements are arranged two-dimensionally and a detection circuit for detecting an address event in real time is provided for each pixel.
- An address event is an event that occurs according to the amount of light incident at each address assigned to the two-dimensionally arranged pixels, for example the photocurrent value or its amount of change exceeding a certain threshold.
- the vision sensor 14a detects whether or not an address event has occurred for each pixel, and when the occurrence of an address event is detected, reads a pixel signal as pixel data from the pixel where the address event has occurred. That is, the vision sensor 14a acquires pixel data asynchronously according to the amount of light incident on each of the two-dimensionally arranged pixels.
- Because a pixel signal readout operation is executed only for pixels in which the occurrence of an address event has been detected, the amount of data read out per frame is small.
- the measurement device 1 can detect the movement of the target object more quickly by using the vision sensor 14a.
- the vision sensor 14a can reduce the amount of data and the power consumption.
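To make the event-driven readout concrete, the following is a minimal Python sketch of how the asynchronous pixel data described above could be accumulated into frame data for downstream processing. The (x, y, polarity, timestamp) event layout and the accumulation window are illustrative assumptions; the source does not specify a concrete data format.

```python
import numpy as np

def accumulate_events(events, width, height, t_start, t_end):
    """Accumulate asynchronous address events into one frame.

    `events` is assumed to be an iterable of (x, y, polarity, timestamp)
    tuples from a DVS/EVS-style sensor; only events whose timestamp falls
    inside [t_start, t_end) contribute to the frame.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, polarity, ts in events:
        if t_start <= ts < t_end:
            # +1 for a brightness-increase event, -1 for a decrease
            frame[y, x] += 1 if polarity else -1
    return frame
```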
- the imaging sensor 14b is, for example, a CCD (Charge Coupled Device) type or CMOS (Complementary Metal-Oxide-Semiconductor) type image sensor, in which a plurality of pixels having photoelectric conversion elements are arranged two-dimensionally.
- the imaging sensor 14b captures a predetermined imaging range IR through the lens 15 at regular intervals according to the frame rate to generate image data.
- a zone plate, a pinhole plate, or a transparent plate can be used instead of the lens 15.
- the vision sensor 14 a and the imaging sensor 14 b are arranged so as to capture substantially the same imaging range IR through the lens 15 .
- For example, a half mirror (not shown) may be arranged between the lens 15 and the two sensors so that the light passing through the lens 15 is split, with one beam incident on the vision sensor 14a and the other on the imaging sensor 14b.
- the illumination unit 3 is driven under the control of the control unit 10 and irradiates the imaging range IR of the imaging unit 14 with light.
- The illumination unit 3 can switch between and irradiate light of different wavelengths, for example light whose wavelength is varied in 10 nm steps.
- FIG. 3 is a diagram illustrating a target object and movement of the target object.
- In FIG. 3, images of the target objects are shown in the upper part, and the moving direction of each target object is indicated by an arrow in the lower part.
- The target objects include microorganisms, marine snow, seabed sand, smoke, and air bubbles. It is known that some microorganisms exhibit taxis when irradiated with light of a specific wavelength.
- Taxis is an innate behavior in which an organism responds to an external stimulus such as light. Therefore, when microorganisms having such taxis are irradiated with light of a specific wavelength, they move according to their taxis.
- Marine snow consists, for example, of particles present in the sea such as plankton excreta, carcasses, and their decomposition products, and it moves by sinking (in the direction of gravity).
- Seabed sand consists, for example, of particles such as sand deposited on the seabed, and it moves in swirls due to seabed currents.
- Smoke refers, for example, to high-temperature water heated geothermally erupting from hydrothermal vents on the seafloor. The hot water blowing out of the vents can reach several hundred degrees and is rich in dissolved components such as heavy metals and hydrogen sulfide, so it reacts with the seawater to form black or white smoke that rises in swirls.
- Bubbles are, for example, natural gases such as methane and carbon dioxide leaking (erupting) from the seabed, or carbon dioxide leaking from reservoirs artificially injected by CCS (carbon dioxide capture and storage), and they move so as to rise from the seabed.
- In this way, not only microorganisms but also fine particles serving as target objects move in directions specific to their type.
- The ocean becomes an aphotic layer, which sunlight does not reach, at a depth of roughly 150 m. The aphotic layer accounts for most of the open ocean, and many of the above target objects exist there. It is also known that a target object reflects or emits light whose wavelength or intensity differs depending on the wavelength of the light irradiating it. On the premise that measurement is performed in the aphotic layer, the measurement apparatus 1 therefore irradiates the target object with light of different wavelengths, captures images of the reflected light (or excitation light), and identifies the type of the target object. Then, the measuring device 1 measures the distance and velocity in the imaging direction for the target object whose type has been identified.
- FIG. 4 is a diagram explaining an example of measurement settings.
- FIG. 5 is a diagram illustrating an example of an operation timesheet.
- the control unit 10 performs measurement according to the measurement settings specified in advance as shown in FIG.
- the measurement settings specify the measurement start condition, the operation time sheet of the illumination unit 3, the identification program (identification method), the distance/speed measurement program (distance/speed measurement method), and the measurement end condition.
- The measurement start condition specifies the condition for starting measurement, for example the time at which measurement is to start, or reception of a measurement start command input via the communication unit 12.
- The operation time sheet specifies a schedule for operating the illumination unit 3. In the operation time sheet shown in FIG. 5, for example, the illumination unit 3 is designated to irradiate light while switching the wavelength, with periods in which the illumination is off in between.
- In other words, the operation time sheet specifies which wavelength of light is emitted from the illumination unit 3 onto the imaging range IR, and at what timing.
- The off periods of the illumination unit 3, that is, the timings at which no light is emitted, are provided in order to image the light emitted by the target object itself when it is excited.
- In addition, because the vision sensor 14a is asynchronous, it can easily detect events separately for each wavelength.
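The operation time sheet amounts to a schedule of (wavelength, duration) entries including off periods. A minimal sketch of a controller loop that follows such a schedule and triggers one capture per switch, as described later for step S3, might look like this; the illumination and imaging interfaces are hypothetical stand-ins, not APIs from the source.

```python
import time

def build_schedule(start_nm=400, stop_nm=700, step_nm=10, hold_s=0.1):
    """Build a (wavelength, duration) schedule with an off period (None)
    after each wavelength, mirroring the sweep described above."""
    schedule = []
    for wl in range(start_nm, stop_nm + step_nm, step_nm):
        schedule.append((wl, hold_s))    # illuminate at this wavelength
        schedule.append((None, hold_s))  # off period: capture emission only
    return schedule

def run_timesheet(illumination, imaging, schedule):
    frames = []
    for wavelength, hold_s in schedule:
        if wavelength is None:
            illumination.off()
        else:
            illumination.set_wavelength(wavelength)
            illumination.on()
        time.sleep(hold_s)
        # one capture per wavelength/on-off switch
        frames.append((wavelength, imaging.capture()))
    return frames
```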
- the identification program specifies a program (method) for identifying the type of target object, for example, identification by machine learning or rule-based identification.
- The distance/velocity measurement program specifies a program (method) for measuring information about the position of the target object in the imaging direction, for example measurement by machine learning or rule-based measurement.
- The measurement end condition specifies the condition for ending measurement, for example the time at which measurement is to end, or reception of a measurement end command input via the communication unit 12.
- FIG. 6 is a flowchart showing the procedure of measurement processing.
- the control unit 10 executes the software (including the identification program and the distance/speed measurement program) stored in the memory 11, thereby executing the measurement process shown in FIG.
- In step S1, the control unit 10 reads external environment information, which will be described in detail later. Then, in step S2, the control unit 10 determines whether the measurement start condition specified in the measurement settings is satisfied, and repeats steps S1 and S2 until the measurement start condition is satisfied.
- In step S3, the imaging control unit 21 causes the illumination unit 3 to irradiate light while switching wavelengths according to the operation time sheet specified in the measurement settings. Each time the wavelength of the light emitted from the illumination unit 3 or its on/off state is switched, the imaging control unit 21 causes the imaging unit 14 to image the imaging range IR and acquires pixel data and image data. After that, in step S4, the class identification unit 22 executes class identification processing.
- the class identification unit 22 identifies (identifies) the type of target object based on the image (pixel data and image data) captured by the imaging unit 14 .
- Specifically, the class identification unit 22 derives identification information from the image captured by the imaging unit 14 and compares it with the definition information stored in the memory 11 to detect and identify the target object.
- the definition information is provided for each target object and stored in the memory 11.
- the definition information includes the type of target object, movement information and image information.
- the movement information is information detected mainly based on the image captured by the vision sensor 14a, and is information based on the movement of the target object as shown in the lower part of FIG.
- When the target object is a microorganism, the movement information is information such as the movement direction relative to the light source (positive or negative taxis), speed, and trajectory.
- For other target objects, the movement information is information such as movement direction, speed, and trajectory.
- The image information is information detected mainly from the image captured by the imaging sensor 14b, and is information on the external appearance of the target object.
- the image information may be information detected based on an image captured by the vision sensor 14a.
- the definition information may also include the direction of gravity detected by the gravity sensor 13 and external environment information acquired via the communication unit 12 .
- Conceivable external environment information includes depth, position coordinates (latitude and longitude of the measurement point, or plane rectangular coordinates), electrical conductivity, temperature, pH, gas concentrations (e.g., methane, hydrogen, helium), and metal concentrations (e.g., manganese, iron).
- the class identification unit 22 detects objects existing in the imaging range IR based on the image (pixel data) captured by the vision sensor 14a. For example, the class identification unit 22 creates one image (frame data) based on pixel data input within a predetermined period, and classifies a group of pixels within a predetermined range in which motion is detected in the image into one group. Detect as an object.
- the class identification unit 22 tracks the object between a plurality of frames by pattern matching or the like. Then, the class identification unit 22 derives the movement direction, speed, and trajectory as identification information based on the tracking result of the object.
- the cycle in which the class identification unit 22 generates an image from the pixel data may be the same as or shorter than the cycle (frame rate) in which the imaging sensor 14b acquires the image data.
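As one plausible realization of the detection and tracking just described (the source only states that pixel groups in which motion is detected are treated as objects and tracked by pattern matching or the like), moving pixels can be grouped by connected-component labeling and associated across frames by nearest-centroid matching:

```python
import numpy as np
from scipy import ndimage

def detect_objects(event_frame, min_pixels=5):
    """Group active (moving) pixels of an event frame into candidate objects."""
    labels, n = ndimage.label(event_frame != 0)
    objects = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if len(xs) >= min_pixels:  # ignore isolated noise events
            objects.append({"centroid": (xs.mean(), ys.mean()),
                            "size_px": len(xs)})
    return objects

def match_tracks(prev_objects, curr_objects, max_dist=10.0):
    """Associate objects across consecutive frames by nearest centroid."""
    pairs = []
    for prev in prev_objects:
        px, py = prev["centroid"]
        best = min(curr_objects,
                   key=lambda o: (o["centroid"][0] - px) ** 2
                               + (o["centroid"][1] - py) ** 2,
                   default=None)
        if best is not None:
            dx, dy = best["centroid"][0] - px, best["centroid"][1] - py
            if dx * dx + dy * dy <= max_dist ** 2:
                pairs.append((prev, best))  # same object in both frames
    return pairs
```

The matched pairs give the movement direction, speed, and trajectory used as identification information.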
- For an object for which identification information has been derived, the class identification unit 22 extracts the image portion corresponding to that object from the image data input from the imaging sensor 14b. The class identification unit 22 then derives external features as identification information by image analysis of the extracted image portion. Since well-known methods can be used for this image analysis, description thereof is omitted.
- According to the designated identification program, the class identification unit 22 identifies which target object a detected object is by collating the wavelength of the light emitted by the illumination unit 3 and the identification information derived for the object (movement direction, trajectory, speed, external features) against the definition information. For example, if the derived identification information falls within the ranges indicated by the definition information of a target object, the class identification unit 22 identifies the detected object as being of the type indicated by that definition information.
- The definition information is stored in the memory 11 by a different method for each identification program. In a rule-based identification program, the definition information is preset by the user and stored in the memory 11. In a machine-learning identification program, the definition information is generated and updated by machine learning in a learning mode and stored in the memory 11.
- the class identification unit 22 stores the identification result of the detected target object and the image captured by the imaging sensor 14b in the memory 11 or transmits them to an external device via the communication unit 12.
- In step S5, the distance/velocity measurement unit 23 executes distance/velocity measurement processing, which measures the distance and velocity of the target object in the imaging direction (information about the position of the target object) based on the type identified by the class identification unit 22. Details of the distance/velocity measurement processing in step S5 will be described later.
- In step S6, the control unit 10 determines whether the measurement end condition is satisfied. The control unit 10 repeats steps S1 to S6 until the measurement end condition is satisfied, and when it is satisfied (Yes in step S6), the measurement process is terminated.
- As described above, in step S5 the distance/velocity measurement unit 23 executes the distance/velocity measurement process based on a rule-based or machine-learning distance/velocity measurement program.
- the rule-based distance/speed measurement process and the machine learning distance/speed measurement process will be described with specific examples.
- FIG. 7 is a diagram for explaining the rule-based distance/speed measurement process.
- In the rule-based distance/velocity measurement process, the focal length f of the vision sensor 14a is stored in the memory 11 as known information.
- The memory 11 also stores statistical information (the average size H) for each target object, registered in advance by the user as a database.
- The distance/velocity measurement unit 23 reads the average size H of the identified target object and the focal length f of the vision sensor 14a from the memory 11.
- The distance/velocity measurement unit 23 also calculates the longitudinal length s of the image 42 of the target object formed on the imaging surface 40, for example from the number of pixels over which the image 42 is captured.
- The distance/velocity measurement unit 23 then calculates (measures) the distance D from the measuring device 1 to the actual target object 41 using Equation (1), each time an image based on pixel data is acquired (each time the target object is detected from an image): D = fH/s ... (1). In addition, for a target object 41 tracked between successive images, the distance/velocity measurement unit 23 calculates (measures) the velocity in the imaging direction (Z-axis direction) based on the interval at which the images are acquired and the distance D in each image.
- the distance/speed measurement unit 23 measures information regarding the position of the target object based on statistical information (average size) for each target object.
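Equation (1) follows from similar triangles in a pinhole-camera model (s/f = H/D). A numeric sketch of the rule-based measurement, assuming the sensor's pixel pitch is known so that the image length s can be converted from a pixel count to a physical length:

```python
def distance_from_size(focal_length_mm, avg_size_mm, image_len_px, pixel_pitch_mm):
    """Rule-based range estimate D = f * H / s (Equation (1)).

    focal_length_mm : focal length f of the vision sensor optics
    avg_size_mm     : statistical average size H of the target object type
    image_len_px    : longitudinal length of the object's image in pixels
    pixel_pitch_mm  : sensor pixel pitch (an assumed calibration value)
    """
    s_mm = image_len_px * pixel_pitch_mm  # image length on the imaging surface
    return focal_length_mm * avg_size_mm / s_mm

def z_velocity(d_prev_mm, d_curr_mm, frame_interval_s):
    """Velocity along the imaging (Z-axis) direction from two successive
    distance estimates of the same tracked object."""
    return (d_curr_mm - d_prev_mm) / frame_interval_s

# Example: f = 8 mm and a target with H = 1 mm imaged over 20 pixels at a
# 5 um pitch gives s = 0.1 mm and D = 8 * 1 / 0.1 = 80 mm.
d_mm = distance_from_size(8.0, 1.0, 20, 0.005)
```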
- FIG. 8 is a diagram for explaining an image that serves as teacher data.
- FIG. 9 is a model diagram of deep learning.
- In the machine-learning distance/velocity measurement process, machine learning is performed using images serving as teacher data as shown in FIG. 8, and a model (architecture) for the distance/velocity measurement process is generated.
- Specifically, images of known target objects captured by the vision sensor 14a are prepared in advance for five patterns of distance in the imaging direction from the measuring device 1 to the target object (1 mm, 5 mm, 10 mm, 100 mm, and 200 mm) and 31 patterns of irradiated-light wavelength (400 nm to 700 nm in 10 nm steps), i.e., 155 patterns in total.
- For each of the prepared images, the distance/velocity measurement unit 23 detects a group of pixels within a predetermined range in which motion is detected as the target object, and resizes that pixel group to 32 pixels by 32 pixels to generate the teacher-data images shown in FIG. 8 (FIG. 8 shows a part of these images).
- In water, light with a wavelength of about 500 nm attenuates least; the attenuation rate increases as the wavelength moves away from about 500 nm toward either shorter or longer wavelengths. Also, the longer the distance from the measuring device 1 to the target object, the lower the rate at which the light reaches it.
- Accordingly, the closer the target object is to the measuring device 1 and the closer the wavelength of the irradiated light is to 500 nm, the more sharply the target object is captured. Conversely, the farther the target object is from the measuring device 1 and the farther the wavelength is from 500 nm, the more blurred the target object becomes, until it is not captured at all.
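The wavelength and distance dependence described here is consistent with Beer-Lambert attenuation, under which intensity decays exponentially with path length at a wavelength-dependent rate (a general optics relation, not a formula given in the source):

```latex
% I_0(\lambda): emitted intensity; c(\lambda): attenuation coefficient of
% water, minimal near \lambda \approx 500 nm; d: propagation distance.
I(\lambda, d) = I_0(\lambda)\, e^{-c(\lambda)\, d}
```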
- the distance/speed measurement unit 23 machine-learns the teacher data consisting of these images with a deep neural network, as shown in FIG.
- This model consists, for example, of five convolutional layers (Conv1 to Conv5), three max-pooling layers, and two fully connected layers (FC). Through machine learning, a model that finally outputs a one-dimensional classification vector with five elements, Distance 1 mm to Distance 200 mm, is generated and stored in the memory 11.
- Machine learning with such a deep neural network is performed for each target object, and a model for each target object is generated and stored in the memory 11.
- In the distance/velocity measurement process, the distance/velocity measurement unit 23 reads the model for the identified type from the memory 11. The distance/velocity measurement unit 23 resizes the target object portion of the image captured by the vision sensor 14a to 32 pixels by 32 pixels and inputs the resized image to the read model. As a result, the values of a one-dimensional classification vector with five elements, Distance 1 mm to Distance 200 mm, are output. The distance/velocity measurement unit 23 then outputs (measures) the element with the highest value (one of Distance 1 mm to Distance 200 mm) as the distance of the target object in the imaging direction.
- the distance/velocity measurement unit 23 calculates the distance in the imaging direction (Z-axis direction) based on the interval at which the images are acquired and the distance in the imaging direction in each image for the target object being tracked between successive images. Calculate (measure) the speed of
- the distance/velocity measurement unit 23 measures the information about the position of the target object based on the learning result of the information about the position learned in advance for each type of the target object. .
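A sketch of such a per-class model and its arg-max readout is given below in PyTorch. Only the layer counts (five convolutions, three max-poolings, two fully connected layers), the 32x32 input, and the five distance classes come from the text; the channel widths, kernel sizes, and framework are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DistanceNet(nn.Module):
    """Five conv layers, three max-pooling layers, and two fully connected
    layers ending in a 5-way classification (Distance 1 mm ... 200 mm).
    Channel widths and kernel sizes are illustrative assumptions."""
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),    # Conv1
            nn.MaxPool2d(2),                              # 32 -> 16
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),   # Conv2
            nn.MaxPool2d(2),                              # 16 -> 8
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),   # Conv3
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),   # Conv4
            nn.MaxPool2d(2),                              # 8 -> 4
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),  # Conv5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 4 * 4, 256), nn.ReLU(),       # FC1
            nn.Linear(256, n_classes),                    # FC2 -> 5 distances
        )

    def forward(self, x):  # x: (batch, 1, 32, 32) resized object patches
        return self.classifier(self.features(x))

DISTANCE_MM = [1, 5, 10, 100, 200]

def predict_distance(model, patch):
    """Arg-max readout over the five distance classes; `patch` is assumed
    to be a float tensor holding the 32x32 resized object region."""
    logits = model(patch.reshape(1, 1, 32, 32))
    return DISTANCE_MM[int(logits.argmax())]
```

In the first embodiment one such model would be trained and stored per target object type and loaded according to the identified class; the second embodiment, described next, would instead train a single model of this form across all types.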
- FIG. 10 is a diagram illustrating the configuration of a measuring device 100 as a second embodiment according to the present technology. As shown in FIG. 10, the measuring apparatus 100 differs from the measuring apparatus 1 of the first embodiment in that the control unit 110 does not function as the class identification unit 22; the rest of the configuration is the same as that of the measuring device 1.
- the measuring device 100 measures the distance and speed to the target object in the imaging direction based on the image captured by the vision sensor 14a without specifying the type of the target object.
- FIG. 11 is a diagram explaining an example of measurement settings.
- the control unit 110 performs measurement according to preset measurement settings as shown in FIG.
- the measurement settings specify the measurement start conditions, the operation time sheet of the illumination section 3, the distance/speed measurement program (measurement method), and the measurement end conditions.
- The measurement start condition specifies the condition for starting measurement, for example the time at which measurement is to start, or reception of a measurement start command input via the communication unit 12.
- The operation time sheet specifies a schedule for operating the illumination unit 3.
- For example, it is specified that light is irradiated while the wavelength is changed in 10 nm steps from 400 nm to 700 nm, i.e., 400 nm, 410 nm, ..., 700 nm.
- The distance/velocity measurement program specifies a program (method) for measuring information about the position of the target object in the imaging direction, for example measurement by machine learning or rule-based measurement.
- The measurement end condition specifies the condition for ending measurement, for example the time at which measurement is to end, or reception of a measurement end command input via the communication unit 12.
- the measurement setting in the second embodiment differs from the measurement setting in the first embodiment in that no identification program is provided.
- FIG. 12 is a flowchart showing the procedure of measurement processing.
- the control unit 110 executes the software (distance/speed measurement program) stored in the memory 11 to execute the measurement process shown in FIG.
- In step S1, the control unit 110 reads external environment information. Then, in step S2, the control unit 110 determines whether the measurement start condition specified in the measurement settings is satisfied, and repeats steps S1 and S2 until the measurement start condition is satisfied.
- In step S3, the imaging control unit 21 causes the illumination unit 3 to irradiate light while switching wavelengths according to the operation time sheet specified in the measurement settings. Each time the wavelength of the light emitted from the illumination unit 3 or its on/off state is switched, the imaging control unit 21 causes the imaging unit 14 to image the imaging range IR and acquires pixel data and image data.
- In step S11, the distance/velocity measurement unit 23 detects an object existing in the imaging range as a target object based on the image derived from the pixel data, and executes distance/velocity measurement processing that measures the distance and velocity of the target object in the imaging direction. Details of the distance/velocity measurement processing in step S11 will be described later.
- In step S6, the control unit 110 determines whether the measurement end condition is satisfied. The control unit 110 repeats steps S1 to S6 until the end condition is satisfied, and when it is satisfied (Yes in step S6), the measurement process is terminated.
- As described above, in step S11 the distance/velocity measurement unit 23 executes the distance/velocity measurement process based on a rule-based or machine-learning distance/velocity measurement program.
- Here, a specific example of the machine-learning distance/velocity measurement process will be given.
- In the machine-learning distance/velocity measurement process, the measuring apparatus 100 creates a deep learning model as shown in FIG. 9 in advance.
- In the first embodiment, a model is generated for each target object; in the second embodiment, by contrast, only one model, trained in advance irrespective of the type of target object, is generated.
- Specifically, images captured by the vision sensor 14a are prepared for the five patterns of distance in the imaging direction from the measuring device 1 to the target object (1 mm, 5 mm, 10 mm, 100 mm, and 200 mm) and the 31 patterns of irradiated-light wavelength (400 nm to 700 nm in 10 nm steps), i.e., 155 patterns in total, for each of the different types of target object, giving 155 patterns × the number of target object types in all.
- For each of the prepared images, the distance/velocity measurement unit 23 detects a group of pixels within a predetermined range in which motion is detected as the target object, and resizes that pixel group to 32 pixels by 32 pixels to generate teacher-data images like those shown in FIG. 8.
- The distance/velocity measurement unit 23 then machine-learns the teacher data consisting of these images with a deep neural network as shown in FIG. 9, and the single generated model is stored in the memory 11.
- In the distance/velocity measurement process, the distance/velocity measurement unit 23 resizes the target object portion of the image captured by the vision sensor 14a to 32 pixels by 32 pixels and inputs the resized image to the model read from the memory 11. As a result, the values of a one-dimensional classification vector with five elements, Distance 1 mm to Distance 200 mm, are output. The distance/velocity measurement unit 23 then outputs (measures) the element with the highest value (one of Distance 1 mm to Distance 200 mm) as the distance of the target object in the imaging direction.
- the distance/velocity measurement unit 23 calculates the distance in the imaging direction (Z-axis direction) based on the interval at which the images are acquired and the distance in the imaging direction in each image for the target object being tracked between successive images. Calculate (measure) the speed of
- the distance/velocity measurement unit 23 measures the information about the position of the target object based on the learning result of the information about the position learned in advance regardless of the type of the target object. do.
- In the second embodiment, the data volume can be reduced because fewer models are stored than in the first embodiment. The distance measurement accuracy is lower, but the calculation time is shorter.
- Note that the embodiments are not limited to the specific examples described above, and configurations of various modifications may be adopted. For example, in the first and second embodiments the measuring device 1 is provided with one illumination unit 3, but the number of illumination units 3 is not limited to one, and a plurality may be provided.
- FIG. 13 is a diagram illustrating the configuration of a measuring device 200 of a modified example.
- the measuring device 200 of the modification includes one body section 2 and two lighting sections 3 .
- the two illumination units 3 are arranged so as to be able to irradiate light in mutually orthogonal directions, and can irradiate the imaging range with light of different wavelengths.
- In such a measuring device 200, light of different wavelengths can be emitted from the two illumination units 3, so the identification information of a target object (microorganism) that exhibits taxis toward light of different wavelengths can be derived in a single measurement, and measurement can be performed efficiently.
- FIG. 14 is a diagram illustrating the configuration of a measuring device 300 of a modified example.
- the measuring device 300 of the modification includes two main body sections 2 and one lighting section 3 .
- the two main bodies 2 are arranged so as to be able to capture images in directions orthogonal to each other.
- In such a measuring device 300, images can be captured by the two main body units 2 (imaging units 14), so the three-dimensional movement of the target object can be detected, and measurement can be performed more efficiently. When two main body units 2 are provided, one of them may include only the imaging unit 14.
- the imaging unit 14 is provided with the vision sensor 14a and the imaging sensor 14b.
- the imaging unit 14 may include only one of the vision sensor 14a and the imaging sensor 14b as long as it is possible to capture an image capable of measuring at least information about the position of the target object in the imaging direction.
- the imaging unit 14 may be provided with a SPAD (Single Photon Avalanche Diode) sensor instead of the vision sensor 14a and the imaging sensor 14b.
- In the above example, the identification information is derived based on the pixel data acquired by the vision sensor 14a and the image data acquired by the imaging sensor 14b in order to identify the type of target object.
- However, as long as the type of target object can be identified based on at least one of the pixel data acquired by the vision sensor 14a and the image data acquired by the imaging sensor 14b, other identification methods may be used.
- machine learning is performed by deep learning.
- the machine learning method is not limited to this, and machine learning may be performed by other methods.
- a model generated by machine learning may be created by an external device instead of the measuring device 1 .
- FIG. 15 is a diagram for explaining the illumination control in Modification 1.
- FIG. 16 is a diagram for explaining images captured by the vision sensor 14a during the illumination control of Modification 1.
- In the vision sensor 14a, an address event occurs when the luminance changes and the photocurrent value changes beyond a certain threshold. Therefore, when the target object TO moves very slowly or does not move at all within the imaging range (hereinafter collectively referred to as "stopped"), no address event occurs at any pixel, and the vision sensor 14a cannot image the target object TO.
- While the target object TO is moving within the imaging range, the vision sensor 14a images the target object TO as shown in FIGS. 16(a) and 16(b).
- When the target object TO can no longer be detected within the imaging range (when it disappears from the imaging range), the imaging control unit 21 cannot tell whether the target object TO has stopped, has moved out of the imaging range at high speed, or has vanished. The imaging control unit 21 therefore temporarily stops the irradiation of light from the illumination unit 3, as shown in the middle part of FIG. 15. If the target object TO still exists in the imaging range when the irradiation stops, the luminance of the target object TO changes, so the vision sensor 14a images the target object TO.
- the imaging control unit 21 restarts the irradiation of light from the illumination unit 3 as shown in the lower part of FIG. 15 .
- When the irradiation restarts, the luminance of the target object TO changes again, so the vision sensor 14a images the target object TO as shown in FIG. 16(e).
- Alternatively, when the target object TO cannot be detected within the imaging range, the imaging control unit 21 may change the wavelength of the light emitted from the illumination unit 3. Changing the wavelength also makes it possible to image a target object TO that has stopped within the imaging range with the vision sensor 14a.
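The recovery behavior of Modification 1 can be summarized as a small control routine: when the tracked object vanishes from the event stream, toggle the illumination (or step its wavelength) so that a still-present, stopped object produces a luminance change and fires address events again. This is a hedged sketch reusing detect_objects from the tracking example above; the illumination and vision_sensor interfaces are hypothetical.

```python
def reacquire_target(illumination, vision_sensor, window_s=0.05):
    """Modification 1: force a luminance change to re-image a stopped target.

    A DVS fires only on brightness changes, so a target that stops moving
    disappears from the event stream. Toggling the illumination off and on
    changes the target's luminance and makes it produce address events again
    if it is still inside the imaging range.
    """
    illumination.off()                           # middle part of FIG. 15
    frame_off = vision_sensor.capture(window_s)
    illumination.on()                            # lower part of FIG. 15
    frame_on = vision_sensor.capture(window_s)
    if detect_objects(frame_off) or detect_objects(frame_on):
        return True   # the target was stopped and is still in range
    return False      # the target left the range or disappeared
```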
- FIG. 17 is a diagram for explaining the illumination control in Modification 2.
- In Modification 2, a plurality of illumination units 3 are provided. Here, a case where two illumination units 3 are provided will be described. The two illumination units 3 are arranged at different positions.
- While light is irradiated from one of the illumination units 3, the vision sensor 14a images the moving target object TO.
- When the target object TO can no longer be detected within the imaging range, the imaging control unit 21 stops the irradiation of light from the one illumination unit 3 and starts the irradiation of light from the other illumination unit 3, as shown in the lower part of FIG. 17.
- This changes the luminance of the target object TO, so the vision sensor 14a images the target object TO even when it is stopped. Even when a plurality of illumination units 3 are not provided, moving the single illumination unit 3 similarly makes it possible to measure a stopped target object TO.
- As described above, the measuring device 1 of the embodiments includes an imaging control unit 21 that causes the imaging unit 14 to capture an image of a predetermined imaging range in water, and a measurement unit (distance/velocity measurement unit 23) that measures information about the position of the target object in the imaging direction based on the image captured by the imaging unit 14.
- the measuring apparatus 1 can measure information about the position of the target object in the imaging direction without having a complicated configuration. For example, it is conceivable to measure information about the position of the target object in the imaging direction by providing two imaging units 14 in parallel and using them as a stereo camera. However, this method complicates the apparatus and makes the calibration of the two imaging units 14 difficult. In contrast, the measuring device 1 can efficiently measure information about the position of the target object.
- The imaging unit 14 may include a vision sensor 14a that asynchronously acquires pixel data according to the amount of light incident on each of the two-dimensionally arranged pixels. This makes it possible to read out only the pixel data of pixels where an event has occurred and to measure the target object based on that pixel data. The measuring apparatus 1 can thereby realize high-speed imaging, reduced power consumption, and low computational cost of image processing owing to automatic separation from the background.
- It is also conceivable that the measurement apparatus 1 includes the illumination unit 3, which irradiates the imaging range with light of a predetermined wavelength, and that the imaging unit 14 captures the imaging range irradiated by the illumination unit 3. As a result, only the reflected light or excitation light from the target object is imaged at depths that sunlight does not reach, so the measuring device 1 can measure the target object efficiently.
- Further, the illumination unit 3 can switch between and irradiate light of different wavelengths, and the imaging unit 14 can capture the imaging range irradiated with light of each different wavelength.
- In this way, images of reflected light or excitation light, which differ depending on the wavelength, can be captured for each type of target object. Therefore, the measuring apparatus 1 can acquire a characteristic image for each target object.
- the measuring unit measures the distance of the target object in the imaging direction. This makes it possible to measure the distance of the target object in the imaging direction with a simple configuration without using a complicated configuration such as a stereo camera.
- the measuring unit measures the velocity of the target object in the imaging direction. This makes it possible to measure the velocity of the object in the imaging direction with a simple configuration without using a complicated configuration such as a stereo camera.
- The measuring device 1 may also include an identification unit (class identification unit 22) that identifies the type of the target object based on the image captured by the imaging unit 14, with the measurement unit measuring information about the position of the target object based on the type identified by the identification unit. This makes it possible to measure positional information using a method (model) adapted to each type of target object, so the measuring device 1 can accurately measure information about the position of the target object in the imaging direction.
- the measuring unit measures information about the position of the target object based on statistical information for each type of target object. This makes it possible to measure information about the position of the target object in the imaging direction by a simple method.
- the measurement unit derives information about the position of the target object based on the learning result of the information about the position learned in advance for each type of the target object. This makes it possible to accurately measure information about the position of the target object in the imaging direction.
- the measurement unit derives information about the position of the target object based on the learning result of the information about the position learned in advance regardless of the type of the target object. As a result, it is possible to reduce the data capacity and shorten the calculation time.
- the imaging control unit 21 may temporarily stop the irradiation of light from the illumination unit 3 when the target object cannot be detected within the imaging range. As a result, it is possible to continuously measure the target object stopped within the imaging range.
- the imaging control unit 21 may change the wavelength of the light emitted from the illumination unit 3 when the target object cannot be detected within the imaging range. As a result, it is possible to continuously measure the target object stopped within the imaging range.
- the imaging control unit 21 may move the illumination unit 3 when the target object cannot be detected within the imaging range. As a result, it is possible to continuously measure the target object stopped within the imaging range.
- The imaging control unit 21 may also cause light to be emitted from a different illumination unit 3 when the target object cannot be detected within the imaging range. As a result, it is possible to continuously measure a target object stopped within the imaging range.
- In the measurement method of the embodiments, a predetermined imaging range in water is captured by the imaging unit, and information about the position of the target object in the imaging direction is measured based on the captured image.
- The program of the embodiments causes an information processing apparatus to execute processing of capturing an image of a predetermined imaging range in water with the imaging unit and measuring information about the position of the target object in the imaging direction based on the captured image.
- Such a program can be recorded in advance in an HDD as a recording medium built into equipment such as a computer device, or in a ROM or the like in a microcomputer having a CPU.
- Alternatively, it can be temporarily or permanently stored (recorded) in a removable recording medium such as a flexible disc, CD-ROM (Compact Disc Read Only Memory), MO (Magneto Optical) disc, DVD (Digital Versatile Disc), Blu-ray Disc (registered trademark), magnetic disc, semiconductor memory, or memory card.
- Such removable recording media can be provided as so-called package software.
- The program can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
- Such a program is suitable for widely providing the information processing apparatus of the embodiment.
- For example, by downloading the program to a mobile terminal device such as a smartphone or tablet, a mobile phone, a personal computer, a game device, a video device, or a PDA (Personal Digital Assistant), that device can function as the information processing apparatus of the present disclosure.
- the present technology can also adopt the following configuration.
- (1) A measuring device including: an imaging control unit that causes an imaging unit to capture an image of a predetermined imaging range in water; and a measurement unit that measures information about the position of a target object in the imaging direction based on the image captured by the imaging unit.
- (2) The measuring device according to (1), wherein the imaging unit includes a vision sensor that acquires pixel data asynchronously according to the amount of light incident on each of two-dimensionally arranged pixels.
- (3) The measuring device according to (1) or (2), further including an illumination unit that irradiates the imaging range with light of a predetermined wavelength, wherein the imaging unit captures the imaging range irradiated with the light of the predetermined wavelength by the illumination unit.
- (4) The measuring device according to (3), wherein the illumination unit can switch between and irradiate light of different wavelengths, and the imaging unit captures the imaging range irradiated with light of each different wavelength.
- (5) The measuring device according to any one of (1) to (4), wherein the measurement unit measures the distance of the target object in the imaging direction.
- (6) The measuring device according to any one of (1) to (5), wherein the measurement unit measures the velocity of the target object in the imaging direction.
- (7) The measuring device according to any one of (1) to (6), further including an identification unit that identifies the type of the target object based on the image captured by the imaging unit, wherein the measurement unit measures information about the position of the target object based on the type of the target object identified by the identification unit.
- (8) The measuring device according to (7), wherein the measurement unit measures the information about the position of the target object based on statistical information for each type of target object.
- (9) The measuring device according to any one of (1) to (6), wherein the measurement unit measures the information about the position of the target object based on a learning result of information about the position learned in advance for each type of target object.
- (10) The measuring device according to any one of (1) to (6), wherein the measurement unit measures the information about the position of the target object based on a learning result of information about the position learned in advance regardless of the type of target object.
- (11) The measuring device according to (3) or (4), wherein the imaging control unit temporarily stops the irradiation of light from the illumination unit when the target object cannot be detected within the imaging range.
- (12) The measuring device according to (3), wherein the imaging control unit changes the wavelength of the light emitted from the illumination unit when the target object cannot be detected within the imaging range.
- (13) The measuring device according to (3) or (4), wherein the imaging control unit moves the illumination unit when the target object cannot be detected within the imaging range.
- (14) The measuring device according to (3) or (4), wherein a plurality of the illumination units are provided, and the imaging control unit causes light to be emitted from a different illumination unit when the target object cannot be detected within the imaging range.
- Reference signs: 1 measuring device, 3 illumination unit, 10 control unit, 14 imaging unit, 14a vision sensor, 14b imaging sensor, 21 imaging control unit, 22 class identification unit, 23 distance/velocity measurement unit
Abstract
Description
This allows the measuring device to measure information about the position of a target object in the imaging direction without requiring a complex configuration.
<1. First Embodiment>
[1.1 Configuration of the Measuring Device]
[1.2 Target Objects]
[1.3 Measurement Method of the First Embodiment]
[1.4 Measurement Process]
[1.5 Distance/Velocity Measurement Process]
<2. Second Embodiment>
[2.1 Configuration of the Measuring Device]
[2.2 Measurement Process]
[2.3 Machine-Learning Distance/Velocity Measurement Process]
<3. Other Configuration Examples of the Measuring Device>
<4. Summary of the Embodiments>
<5. Present Technology>
[1.1 Configuration of the Measuring Device]
First, the configuration of the measuring device 1 according to the first embodiment of the present technology will be described.
The measuring device 1 is a device that treats microorganisms or fine particles present in water, such as seawater, as target objects and measures information about the position of a target object in the imaging direction.
As shown in FIG. 1, the measuring device 1 includes a main body 2 and an illumination unit 3. The illumination unit 3 may instead be provided inside the main body 2.
The control unit 10 also reads data from the memory 11, stores data in the memory 11, and exchanges various data with external devices via the communication unit 12.
FIG. 3 illustrates the target objects and their movement. In FIG. 3, the upper row shows images of the target objects and the lower row indicates their movement directions with arrows.
Some microorganisms are known to exhibit taxis when irradiated with light of a specific wavelength. Taxis is an innate behavior by which an organism responds to light (an external stimulus). Accordingly, when a microorganism exhibiting taxis is irradiated with light of the specific wavelength, it moves in a manner determined by that taxis.
Seabed sand consists of particles, such as sand settled on the seafloor, that are swirled into vortices by bottom currents.
Smoke here refers to the phenomenon in which water heated geothermally to high temperature erupts from hydrothermal vents on the seafloor. The hot water blowing out of a vent can reach several hundred degrees Celsius and is rich in dissolved heavy metals and hydrogen sulfide, so it reacts with seawater and rises as swirling black or white smoke.
Bubbles are, for example, natural gases such as methane and carbon dioxide leaking (erupting) from the seafloor, or carbon dioxide leaking from a reservoir into which it was artificially injected by CCS (carbon capture and storage); they rise from the seafloor.
Next, the measurement method (measurement process) for target objects according to the first embodiment will be described.
At a depth of about 150 m the ocean becomes an aphotic zone that sunlight does not reach. The aphotic zone accounts for most of the open ocean, and the target objects described above are abundant there. Meanwhile, target objects are known to reflect or emit light whose wavelength or intensity differs depending on the wavelength of the light irradiating them.
FIG. 6 is a flowchart showing the procedure of the measurement process. The control unit 10 executes the measurement process shown in FIG. 6 by running the software (including the identification program and the distance/velocity measurement program) stored in the memory 11.
Next, the distance/velocity measurement process will be described. As described above, in step S5 the distance/velocity measurement unit 23 executes the distance/velocity measurement process based on either the rule-based or the machine-learning distance/velocity measurement program.
Concrete examples of the rule-based and the machine-learning distance/velocity measurement processes are given below.
FIG. 7 illustrates the rule-based distance/velocity measurement process. In the rule-based process, the focal length f of the vision sensor 14a is stored in the memory 11 as known information.
D = fH/s ... (1)
The distance/velocity measurement unit 23 also calculates (measures) the velocity in the imaging direction (Z-axis direction) of a target object 41 tracked across consecutive images, based on the interval at which the images are acquired and the distance D obtained for each image.
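Equation (1) and the velocity calculation can be illustrated numerically. The short Python sketch below assumes the pinhole relation D = fH/s with a known focal length f, a known per-type object size H, and an apparent size s measured in the image; all function names and sample values are hypothetical, not taken from the patent.

```python
# Minimal sketch of the rule-based distance/velocity measurement.
# Assumes equation (1): D = f * H / s, where f is the focal length of the
# vision sensor, H the known real-world size of the target object, and
# s the apparent size of the object on the sensor.

def distance_along_z(f_mm: float, real_size_mm: float, image_size_mm: float) -> float:
    """Distance D in the imaging (Z-axis) direction, from D = f*H/s."""
    return f_mm * real_size_mm / image_size_mm

def velocity_along_z(d_prev_mm: float, d_curr_mm: float, dt_s: float) -> float:
    """Z-axis velocity from two consecutive distance estimates (m/s)."""
    return (d_curr_mm - d_prev_mm) / dt_s / 1000.0  # mm -> m

# Example: a tracked object whose apparent size grows between two images.
f = 8.0                  # focal length in mm (hypothetical)
H = 2.0                  # assumed real size of the object in mm
s1, s2 = 0.016, 0.020    # apparent sizes in two consecutive images, in mm
dt = 0.1                 # interval between the two images, in seconds

d1 = distance_along_z(f, H, s1)  # 1000.0 mm
d2 = distance_along_z(f, H, s2)  #  800.0 mm
print(d1, d2, velocity_along_z(d1, d2, dt))  # object approaching at -2.0 m/s
```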
FIG. 8 illustrates images used as teacher data, and FIG. 9 is a diagram of the deep-learning model.
In addition, the arrival rate of light decreases as the distance from the measuring device 1 to the target object increases.
The distance/velocity measurement unit 23 also calculates (measures) the velocity in the imaging direction (Z-axis direction) of a target object tracked across consecutive images, based on the interval at which the images are acquired and the distance in the imaging direction obtained for each image.
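For the machine-learning variant, a rough sketch of one possible distance regressor is given below. It is only an illustration under assumptions: a small convolutional network (not the model of FIG. 9, whose exact architecture is not reproduced here) that maps a single-channel image patch of the target object to a scalar distance in the imaging direction, trained here on random data standing in for the teacher data of FIG. 8.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a small CNN that regresses the Z-axis distance of a
# target object from a single-channel image patch (e.g., accumulated events
# from the vision sensor). Apparent size and brightness fall off with
# distance, which is the cue such a network would be expected to learn.
class DistanceRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),  # scalar distance estimate
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

model = DistanceRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy training step on random (patch, distance) pairs.
patches = torch.rand(8, 1, 64, 64)   # 8 hypothetical 64x64 patches
distances = torch.rand(8, 1) * 5.0   # ground-truth distances in meters
loss = loss_fn(model(patches), distances)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The velocity in the Z-axis direction would then follow, as in the rule-based case, by differencing the predicted distances of a tracked object across consecutive images.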
[2.1 Configuration of the Measuring Device]
FIG. 10 illustrates the configuration of a measuring device 100 according to the second embodiment of the present technology. As shown in FIG. 10, the measuring device 100 differs from the measuring device 1 of the first embodiment in that the control unit 110 does not function as the class identification unit 22; the rest of the configuration is the same as in the measuring device 1.
FIG. 12 is a flowchart showing the procedure of the measurement process. The control unit 110 executes the measurement process shown in FIG. 12 by running the software (the distance/velocity measurement program) stored in the memory 11.
As described above, in step S11 the distance/velocity measurement unit 23 executes the distance/velocity measurement process based on either the rule-based or the machine-learning distance/velocity measurement program.
A concrete example of the machine-learning distance/velocity measurement process is given below.
Whereas the first embodiment generates a model for each type of target object, the second embodiment generates only a single model trained in advance irrespective of the type of target object, without generating per-type models.
The distance/velocity measurement unit 23 also calculates (measures) the velocity in the imaging direction (Z-axis direction) of a target object tracked across consecutive images, based on the interval at which the images are acquired and the distance in the imaging direction obtained for each image.
The embodiments are not limited to the specific examples described above, and various modified configurations may be adopted.
In such a measuring device 200, the two illumination units 3 can emit light of different wavelengths, so identification information for target objects (microorganisms) that exhibit taxis toward light of different wavelengths can be derived in a single measurement, making the measurement efficient.
In such a measuring device 300, images can be captured by the two main bodies 2 (imaging units 14), so three-dimensional movement of a target object can be detected, making the measurement even more efficient.
When two main bodies 2 are provided, one of them may include only the imaging unit 14.
FIG. 15 illustrates the illumination control in Modification 1, and FIG. 16 illustrates the images captured by the vision sensor 14a during that illumination control.
If the target object TO can no longer be detected within the imaging range, the imaging control unit 21 may instead change the wavelength of the light emitted from the illumination unit 3. Changing the wavelength likewise makes it possible for the vision sensor 14a to image a target object TO that has stopped within the imaging range.
FIG. 17 illustrates the illumination control in Modification 2. In Modification 2, a plurality of illumination units 3 are provided; the case of two illumination units 3, arranged at different positions, is described here.
If a plurality of illumination units 3 are not provided, moving the single illumination unit 3 makes it possible to measure a target object TO that has stopped, just as when switching between multiple illumination units 3.
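Modifications 1 and 2 share one control pattern: when the tracked object stops producing events, alter the illumination (pause it, change its wavelength, move it, or switch to a differently placed unit) and try to reacquire the object. The following Python sketch is a hypothetical illustration of that loop; the class and function names stand in for hardware interfaces that the patent does not specify.

```python
import itertools
import time

# Hypothetical sketch of the illumination control in Modifications 1 and 2.
# detect_target() and the Illumination methods are placeholders for the real
# imaging-control and illumination interfaces.

class Illumination:
    def __init__(self, name: str, wavelengths_nm: list[int]):
        self.name = name
        self.wavelengths = itertools.cycle(wavelengths_nm)

    def set_next_wavelength(self) -> int:
        wl = next(self.wavelengths)
        print(f"{self.name}: emitting {wl} nm")
        return wl

def detect_target() -> bool:
    """Placeholder: would check the vision sensor for events from the target."""
    return False

def reacquire_target(units: list[Illumination], attempts: int = 4) -> bool:
    """When the target is lost, alternate illumination units and wavelengths."""
    for i in range(attempts):
        unit = units[i % len(units)]  # switch to a differently placed unit
        unit.set_next_wavelength()    # and/or a different wavelength
        time.sleep(0.05)              # allow events to accumulate
        if detect_target():
            return True
    return False

units = [Illumination("unit A", [450, 530]), Illumination("unit B", [450, 530])]
print("reacquired:", reacquire_target(units))
```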
As described above, the measuring device 1 of the embodiment includes the imaging control unit 21, which causes the imaging unit 14 to image a predetermined imaging range in water, and the measurement unit (distance/velocity measurement unit 23), which measures information about the position of a target object in the imaging direction based on the images captured by the imaging unit 14.
This allows the measuring device 1 to measure information about the position of a target object in the imaging direction without requiring a complex configuration.
For instance, two imaging units 14 could be arranged side by side and used as a stereo camera to measure information about the position of a target object in the imaging direction. However, this approach complicates the device and makes calibrating the two imaging units 14 laborious.
By contrast, the measuring device 1 can measure information about the position of a target object efficiently.
This makes it possible to read out pixel data only from the pixels where an event has occurred and to measure the target object based on that pixel data.
The measuring device 1 can therefore achieve high-speed imaging, reduced power consumption, and lower computational cost in image processing through automatic separation from the background.
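To make that computational advantage concrete, the sketch below accumulates a burst of asynchronous events into a sparse frame; only the pixels that fired are ever touched. The event format (x, y, polarity) and all sizes here are assumptions for illustration, not the sensor's actual interface.

```python
import numpy as np

# Hypothetical sketch: accumulate asynchronous events into a sparse frame.
# Only pixels that fired are touched, which is what keeps the bandwidth and
# processing cost low compared with reading out full frames.
H, W = 240, 320
rng = np.random.default_rng(0)
num_events = 500  # a sparse burst: 500 events vs. 76,800 full-frame pixels

xs = rng.integers(0, W, num_events)
ys = rng.integers(0, H, num_events)
polarity = rng.choice([-1, 1], num_events)

frame = np.zeros((H, W), dtype=np.int32)
np.add.at(frame, (ys, xs), polarity)  # update only the event pixels

active = np.count_nonzero(frame)
print(f"processed {num_events} events, {active} active pixels "
      f"({active / (H * W):.1%} of the frame)")
```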
This makes it possible, at water depths that sunlight does not reach, to image only the reflected or excitation light from the target object.
The measuring device 1 can therefore measure target objects efficiently.
This makes it possible to capture, for each type of target object, images of the reflected or excitation light that differ with wavelength.
The measuring device 1 can therefore acquire images characteristic of each target object.
This makes it possible to measure the distance of a target object in the imaging direction with a simple configuration, without a complex arrangement such as a stereo camera.
This makes it possible to measure the velocity of a target object in the imaging direction with a simple configuration, without a complex arrangement such as a stereo camera.
This makes it possible to measure position information using a method (model) tailored to each type of target object.
The measuring device 1 can therefore measure information about the position of a target object in the imaging direction with high accuracy.
This makes it possible to measure information about the position of a target object in the imaging direction by a simple method.
This makes it possible to measure information about the position of a target object in the imaging direction with high accuracy.
This reduces the data volume and shortens the computation time.
In each of these cases (temporarily stopping the illumination, changing its wavelength, moving the illumination unit, or switching to a differently placed illumination unit), a target object that has stopped within the imaging range can continue to be measured.
In the program according to the present technology described above, an information processing device is caused to execute a process of imaging a predetermined imaging range in water with an imaging unit and measuring, based on the captured images, information about the position of a target object in the imaging direction.
Alternatively, the program can be stored (recorded), temporarily or permanently, on a removable recording medium such as a flexible disk, CD-ROM (Compact Disc Read Only Memory), MO (Magneto Optical) disk, DVD (Digital Versatile Disc), Blu-ray Disc (registered trademark), magnetic disk, semiconductor memory, or memory card. Such removable recording media can be provided as so-called packaged software.
Besides being installed on a personal computer or the like from a removable recording medium, such a program can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
The present technology can also adopt the following configurations.
(1)
A measuring device comprising:
an imaging control unit that causes an imaging unit to image a predetermined imaging range in water; and
a measurement unit that measures information about the position of a target object in the imaging direction based on the image captured by the imaging unit.
(2)
The measuring device according to (1), wherein
the imaging unit comprises a vision sensor that acquires pixel data asynchronously according to the amount of light incident on each of a plurality of two-dimensionally arranged pixels.
(3)
The measuring device according to (1) or (2), further comprising
an illumination unit that irradiates the imaging range with light of a predetermined wavelength, wherein
the imaging unit images the imaging range irradiated with the light of the predetermined wavelength by the illumination unit.
(4)
The measuring device according to (3), wherein
the illumination unit can switch between and emit light of different wavelengths, and
the imaging unit images the imaging range under each of the different wavelengths.
(5)
The measuring device according to any one of (1) to (4), wherein
the measurement unit measures the distance of the target object in the imaging direction.
(6)
The measuring device according to any one of (1) to (5), wherein
the measurement unit measures the velocity of the target object in the imaging direction.
(7)
The measuring device according to any one of (1) to (6), further comprising
an identification unit that identifies the type of the target object based on the image captured by the imaging unit, wherein
the measurement unit measures the information about the position of the target object based on the type of the target object identified by the identification unit.
(8)
The measuring device according to (7), wherein
the measurement unit measures the information about the position of the target object based on statistical information for each type of the target object.
(9)
The measuring device according to any one of (1) to (6), wherein
the measurement unit measures the information about the position of the target object based on the result of learning, performed in advance for each type of the target object, of information about the position.
(10)
The measuring device according to any one of (1) to (6), wherein
the measurement unit measures the information about the position of the target object based on the result of learning, performed in advance irrespective of the type of the target object, of information about the position.
(11)
The measuring device according to (3) or (4), wherein
the imaging control unit temporarily stops the light emission from the illumination unit when the target object can no longer be detected within the imaging range.
(12)
The measuring device according to (3), wherein
the imaging control unit changes the wavelength of the light emitted from the illumination unit when the target object can no longer be detected within the imaging range.
(13)
The measuring device according to (3) or (4), wherein
the imaging control unit moves the illumination unit when the target object can no longer be detected within the imaging range.
(14)
The measuring device according to (3) or (4), wherein
a plurality of the illumination units are provided, and
the imaging control unit causes a different one of the illumination units to emit light when the target object can no longer be detected within the imaging range.
(15)
A measurement method comprising:
causing an imaging unit to image a predetermined imaging range in water; and
measuring information about the position of a target object in the imaging direction based on the captured image.
(16)
A program that causes a measuring device to execute a process comprising:
causing an imaging unit to image a predetermined imaging range in water; and
measuring information about the position of a target object in the imaging direction based on the captured image.
1 Measuring device
3 Illumination unit
10 Control unit
14 Imaging unit
14a Vision sensor
14b Imaging sensor
21 Imaging control unit
22 Class identification unit
23 Distance/velocity measurement unit
Claims (16)
- A measuring device comprising: an imaging control unit that causes an imaging unit to image a predetermined imaging range in water; and a measurement unit that measures information about the position of a target object in the imaging direction based on the image captured by the imaging unit.
- The measuring device according to claim 1, wherein the imaging unit comprises a vision sensor that acquires pixel data asynchronously according to the amount of light incident on each of a plurality of two-dimensionally arranged pixels.
- The measuring device according to claim 1, further comprising an illumination unit that irradiates the imaging range with light of a predetermined wavelength, wherein the imaging unit images the imaging range irradiated with the light of the predetermined wavelength by the illumination unit.
- The measuring device according to claim 3, wherein the illumination unit can switch between and emit light of different wavelengths, and the imaging unit images the imaging range under each of the different wavelengths.
- The measuring device according to claim 1, wherein the measurement unit measures the distance of the target object in the imaging direction.
- The measuring device according to claim 1, wherein the measurement unit measures the velocity of the target object in the imaging direction.
- The measuring device according to claim 1, further comprising an identification unit that identifies the type of the target object based on the image captured by the imaging unit, wherein the measurement unit measures the information about the position of the target object based on the type of the target object identified by the identification unit.
- The measuring device according to claim 7, wherein the measurement unit measures the information about the position of the target object based on statistical information for each type of the target object.
- The measuring device according to claim 1, wherein the measurement unit measures the information about the position of the target object based on the result of learning, performed in advance for each type of the target object, of information about the position.
- The measuring device according to claim 1, wherein the measurement unit measures the information about the position of the target object based on the result of learning, performed in advance irrespective of the type of the target object, of information about the position.
- The measuring device according to claim 3, wherein the imaging control unit temporarily stops the light emission from the illumination unit when the target object can no longer be detected within the imaging range.
- The measuring device according to claim 4, wherein the imaging control unit changes the wavelength of the light emitted from the illumination unit when the target object can no longer be detected within the imaging range.
- The measuring device according to claim 3, wherein the imaging control unit moves the illumination unit when the target object can no longer be detected within the imaging range.
- The measuring device according to claim 3, wherein a plurality of the illumination units are provided, and the imaging control unit causes a different one of the illumination units to emit light when the target object can no longer be detected within the imaging range.
- A measurement method comprising: causing an imaging unit to image a predetermined imaging range in water; and measuring information about the position of a target object in the imaging direction based on the captured image.
- A program that causes a measuring device to execute a process comprising: causing an imaging unit to image a predetermined imaging range in water; and measuring information about the position of a target object in the imaging direction based on the captured image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280037923.7A CN117529634A (zh) | 2021-06-03 | 2022-05-23 | Measuring device, measurement method, program |
EP22815902.6A EP4350284A4 (en) | 2021-06-03 | 2022-05-23 | MEASURING DEVICE, MEASURING METHOD, PROGRAM |
JP2023525739A JPWO2022255152A1 (ja) | 2021-06-03 | 2022-05-23 | |
US18/563,394 US20240221203A1 (en) | 2021-06-03 | 2022-05-23 | Measuring device, measurement method, program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021093774 | 2021-06-03 | ||
JP2021-093774 | 2021-06-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022255152A1 true WO2022255152A1 (ja) | 2022-12-08 |
Family
ID=84323099
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/021150 WO2022255152A1 (ja) | Measuring device, measurement method, program | 2021-06-03 | 2022-05-23 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240221203A1 (ja) |
EP (1) | EP4350284A4 (ja) |
JP (1) | JPWO2022255152A1 (ja) |
CN (1) | CN117529634A (ja) |
WO (1) | WO2022255152A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007189542A * | 2006-01-13 | 2007-07-26 | Fujifilm Corp | Imaging apparatus |
WO2014171052A1 * | 2013-04-16 | 2014-10-23 | Konica Minolta, Inc. | Image processing method, image processing device, imaging device, and image processing program |
JP2019008460A * | 2017-06-22 | 2019-01-17 | Toshiba Corporation | Object detection device, object detection method, and program |
WO2019150786A1 * | 2018-01-31 | 2019-08-08 | Sony Semiconductor Solutions Corporation | Solid-state imaging element, imaging device, and method for controlling a solid-state imaging element |
JP2019165687A | 2018-03-23 | 2019-10-03 | JFE Advantech Co., Ltd. | Method and device for calculating the abundance of a specific species of phytoplankton, and method and device for detecting signs of red tide occurrence caused by a specific species of phytoplankton |
WO2021038753A1 * | 2019-08-28 | 2021-03-04 | Umitron Pte. Ltd. | Aquatic animal detection device, information processing device, terminal device, aquatic animal detection system, aquatic animal detection method, and aquatic animal detection program |
2022
- 2022-05-23 WO PCT/JP2022/021150 patent/WO2022255152A1/ja active Application Filing
- 2022-05-23 CN CN202280037923.7A patent/CN117529634A/zh active Pending
- 2022-05-23 US US18/563,394 patent/US20240221203A1/en active Pending
- 2022-05-23 JP JP2023525739A patent/JPWO2022255152A1/ja active Pending
- 2022-05-23 EP EP22815902.6A patent/EP4350284A4/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP4350284A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4350284A1 (en) | 2024-04-10 |
JPWO2022255152A1 (ja) | 2022-12-08 |
EP4350284A4 (en) | 2024-09-25 |
US20240221203A1 (en) | 2024-07-04 |
CN117529634A (zh) | 2024-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6959614B2 (ja) | Analysis device and flow cytometer | |
US11106903B1 (en) | Object detection in image data | |
US11290643B1 (en) | Efficient digital camera image acquisition and analysis | |
CN107764271B (zh) | Visible-light visual dynamic positioning method and system based on optical flow | |
CN104598897A (zh) | Visual sensor, image processing method and device, and visual interaction device | |
CN113221688B (zh) | Disconnector state recognition method, device, and storage medium | |
CN102402283A (zh) | Information processing device, information processing method, and program | |
Zhang et al. | Prediction of keyhole TIG weld penetration based on high-dynamic range imaging | |
KR101326230B1 (ko) | Method and interface for recognizing dynamic organ gestures of a user, and electricity-using device using the same | |
KR102327395B1 (ko) | Indoor lighting control apparatus and method | |
CN115631407B (zh) | Underwater transparent organism detection based on fusion of an event camera and color frame images | |
CN111160100A (zh) | Lightweight deep-model aerial-image vehicle detection method based on sample generation | |
JP2021077350A (ja) | Method and device for generating an object classification for an object | |
WO2022255152A1 (ja) | Measuring device, measurement method, program | |
CN104038689B (zh) | Image processing device and image processing method | |
WO2023026551A1 (ja) | Measuring device, measurement method, program | |
WO2023026880A1 (ja) | Measuring device, measurement method, program | |
WO2022254942A1 (ja) | Measuring device, measurement method, program | |
US20240353386A1 (en) | Measurement device, measurement method, and program | |
Sheth et al. | Recognition of underwater starfishes using deep learning | |
CN115294647A (zh) | Smoking behavior detection method and system based on a deep neural network | |
CN207610704U (zh) | Visible-light visual dynamic positioning system based on optical flow | |
WO2022130884A1 (ja) | Measuring device, measurement method, measurement system | |
WO2023112532A1 (ja) | Measuring device, measurement method, program | |
CN113706436A (zh) | Target detection method based on background modeling with self-supervised generative adversarial learning | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22815902; Country of ref document: EP; Kind code of ref document: A1 |
 | WWE | Wipo information: entry into national phase | Ref document number: 2023525739; Country of ref document: JP |
 | WWE | Wipo information: entry into national phase | Ref document number: 18563394; Country of ref document: US |
 | WWE | Wipo information: entry into national phase | Ref document number: 202280037923.7; Country of ref document: CN |
 | WWE | Wipo information: entry into national phase | Ref document number: 2022815902; Country of ref document: EP |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | ENP | Entry into the national phase | Ref document number: 2022815902; Country of ref document: EP; Effective date: 20240103 |